Fin names former Twilio exec Evan Cummack as CEO, raises $20M

Work insights platform Fin raised $20 million in Series A funding and brought in Evan Cummack, a former Twilio executive, as its new chief executive officer.

The San Francisco-based company captures employee workflow data from across applications and turns it into productivity insights to improve the way enterprise teams work and remain engaged.

Fin was founded in 2015 by Andrew Kortina, co-founder of Venmo, and Facebook’s former VP of product and Slow Ventures partner Sam Lessin. Initially, the company was doing voice assistant technology — think Alexa but powered by humans and machine learning — and then workplace analytics software.

In 2020, the company pivoted again to become the business it is today. The new round was led by Coatue, with participation from First Round Capital, Accel and Kleiner Perkins. The original team was talented but small, so the new funding will go toward building out sales, marketing and engineering teams, Cummack said.

“At that point, the right thing was to raise money, so at the end of last year, the company raised a $20 million Series A, and it was also decided to find a leadership team that knows how to build an enterprise,” Cummack told TechCrunch. “The company had completely pivoted and removed ‘Analytics’ from our name because it was not encompassing what we do.”

Fin’s software measures productivity and provides insights on ways managers can optimize processes, coach their employees and see how teams are actually using technology to get their work done. At the same time, employees are able to manage their workflow and highlight areas where there may be bottlenecks. All combined, it leads to better operations and customer experiences, Cummack said.

Graphic showing how work is really done. Image Credits: Fin

Fin’s view is that as more automation occurs, we are looking at a “renaissance of human work.” There will be more jobs and more types of jobs, but people will be able to do them more effectively and the work will be more fulfilling, he added.

He notes that in the era before cloud computing, there was only a small number of software vendors. Now, with the average tech company using over 130 SaaS apps, there is room for far more entrepreneurs and for the adoption of best-in-breed apps, so a viable company can start with a handful of people and leverage those apps to win big customers.

“It’s different for enterprise customers, though, to understand that investment and what they are spending their money on as they use tools to get their jobs done,” Cummack added. “There is massive pressure to improve the customer experience and move quickly. Now with many people working from home, Fin enables you to look at all 130 apps as if they are one and how they are being used.”

As a result, Fin’s customers are seeing metrics like a 16% increase in team utilization and engagement, a 25% decrease in support ticket handle time and a 71% increase in policy compliance. Meanwhile, the company itself is doubling and tripling its customers and revenue each year.

Now with leadership and people in place, Cummack said the company is positioned to scale, though it already has a meaningful business as a head start.

Arielle Zuckerberg, partner at Coatue, said via email that she was part of a previous firm that invested in Fin’s seed round to build a virtual assistant. She was also a customer of Fin Assistant until it was discontinued.

When she heard the company was pivoting to enterprise, she “was excited because I thought it was a natural outgrowth of the previous business, had a lot of potential and I was already familiar with management and thought highly of them.”

She believed the “brains” of the company always revolved around understanding and measuring what assistants were doing to complete a task as a way to create opportunities for improvement or automation. The pivot to agent-facing tools made sense to Zuckerberg, but it wasn’t until the global pandemic that it clicked.

“Service teams were forced to go remote overnight, and companies had little to no visibility into what people were doing working from home,” she added. “In this remote environment, we thought that Fin’s product was incredibly well-suited to address the challenges of managing a growing remote support team, and that over time, their unique data set of how people use various apps and tools to complete tasks can help business leaders improve the future of work for their team members. We believe that contact center agents going remote was inevitable even before COVID, but COVID was a huge accelerant and created a compelling ‘why now’ moment for Fin’s solution.”

Going forward, Coatue sees Fin as “a process mining company that is focused on service teams.” By initially focusing on the customer support and contact center use case — a market large enough to support a scaled, standalone business — rather than joining competitors in going after Fortune 500 companies, where implementation cycles are long and time-to-value is slow, Zuckerberg said Fin is better able to “address the unique challenges of managing a growing remote support team with a near-immediate time-to-value.”

 

#accel, #andrew-kortina, #arielle-zuckerberg, #artificial-intelligence, #automation, #business-intelligence, #business-process-management, #cloud, #cloud-computing, #coatue, #enterprise, #fin, #first-round-capital, #funding, #groupware, #kleiner-perkins, #machine-learning, #process-mining, #recent-funding, #saas, #sam-lessin, #slow-ventures, #startups, #talent, #tc, #twilio, #workflow

Peak raises $75M for a platform that helps non-tech companies build AI applications

As artificial intelligence continues to weave its way into more enterprise applications, a startup that has built a platform to help businesses, especially non-tech organizations, build more customized AI decision-making tools for themselves has picked up some significant growth funding. Peak AI, a startup out of Manchester, England, that has built a “decision intelligence” platform, has raised $75 million, money that it will be using to continue building out its platform, expand into new markets and hire some 200 new people in the coming quarters.

The Series C is bringing a very big name investor on board. It is being led by SoftBank Vision Fund 2, with previous backers Oxx, MMC Ventures, Praetura Ventures, and Arete also participating. That group participated in Peak’s Series B of $21 million, which only closed in February of this year. The company has now raised $118 million; it is not disclosing its valuation.

(This latest funding round was rumored last week, although it was not confirmed at the time and the total amount was not accurate.)

Richard Potter, Peak’s CEO, said the rapid follow-on in funding was based on inbound interest, in part because of how the company has been doing.

Peak’s so-called Decision Intelligence platform is used by retailers, brands, manufacturers and others to monitor stock levels, build personalized customer experiences and handle other processes that benefit from some degree of automation but also require enough sophistication to weigh different factors against each other and produce more intelligent insights. Its current customer list includes the likes of Nike, Pepsico, KFC, Molson Coors, Marshalls, Asos and Speedy, and in the last 12 months revenues have more than doubled.

The opportunity that Peak is addressing goes a little like this: AI has become a cornerstone of many of the most advanced IT applications and business processes of our time, but if you are an organization — and specifically one not built around technology — your access to AI and how you might use it will come by way of applications built by others, not necessarily tailored to you, and the costs of building more tailored solutions can often be prohibitively high. Peak claims that those using its tools have seen revenues on average rise 5%; return on ad spend double; supply chain costs reduce by 5%; and inventory holdings (a big cost for companies) reduce by 12%.

Peak’s platform, I should point out, is not exactly a “no-code” approach to solving that problem — not yet, at least: it’s aimed at data scientists and engineers at those organizations, so that they can easily identify processes in their operations that might benefit from AI tools and build those out with relatively little heavy lifting.

Different market factors have also played a role. Covid-19, for example, has accelerated “digital transformation” in businesses and pushed them to make e-commerce processes more efficient to cater to rising consumer demand and more strained supply chains, all of which has left businesses more open to, and keener to invest in, tools that improve their automation intelligently.

This, combined with Peak AI’s growing revenues, is part of what interested SoftBank. The investor has been long on AI for a while, but it has been building out a section of its investment portfolio to provide strategic services to the kinds of businesses that it invests in. Those include e-commerce and other consumer-facing businesses, which make up one of the main segments of Peak’s customer base.

“In Peak we have a partner with a shared vision that the future enterprise will run on a centralized AI software platform capable of optimizing entire value chains,” Max Ohrstrand, senior investor for SoftBank Investment Advisers, said in a statement. “To realize this a new breed of platform is needed and we’re hugely impressed with what Richard and the excellent team have built at Peak. We’re delighted to be supporting them on their way to becoming the category-defining, global leader in Decision Intelligence.”

Longer term, it will be interesting to see how and if Peak evolves to extend its platform to a wider set of users at the organizations that are already its customers.

Potter said he believes that “those with technical predispositions” will be the most likely users of its products in the near and medium term. You might assume that would cut out, for example, marketing managers, although the general trend in a lot of software tools has been precisely to build versions of the same tools used by data scientists for these less technical people, so they can engage in the process of building what it is that they want to use. “I do think it’s important to democratize the ability to stream data pipelines, and to be able to optimize those to work in applications,” he added.

#ai, #articles, #artificial-intelligence, #automation, #business-process-management, #ceo, #e-commerce, #enterprise, #europe, #funding, #kfc, #manchester, #mmc-ventures, #nike, #partner, #peak, #peak-ai, #pepsico, #science-and-technology, #series-b, #softbank-group, #softbank-vision-fund, #software-platform, #tc, #united-kingdom, #vodafone

UiPath CEO Daniel Dines is coming to TC Sessions: SaaS to talk RPA and automation

UiPath came seemingly out of nowhere in the last several years, going public earlier this year in a successful IPO during which it raised over $527 million. It raised $2 billion in private money prior to that, with its final private valuation coming in at an amazing $35 billion. UiPath CEO Daniel Dines will be joining us on a panel on automation at TC Sessions: SaaS on October 27th.

The company has been able to capture all this investor attention doing something called robotic process automation (RPA), which provides a way to automate a series of highly mundane tasks. It has become quite popular, especially for bringing a level of automation to legacy systems that might not be able to handle more modern approaches involving artificial intelligence and machine learning. In 2019, Gartner found that RPA was the fastest-growing category in enterprise software.

In point of fact, UiPath didn’t actually come out of nowhere. It was founded in 2005 as a consulting company and transitioned to software over the years. The company took its first VC funding, a modest $1.5 million seed round, in 2015, according to Crunchbase data.

As RPA found its market, the startup began to take off, raising gobs of money including a $568 million round in April 2019 and $750 million in its final private raise in February 2021.

Dines will be appearing on a panel discussing the role of automation in the enterprise. Certainly, the pandemic drove home the need for increased automation as masses of office workers moved to work from home, a trend that is likely to continue even after the pandemic slows.

As CEO of the RPA market leader, Dines is uniquely positioned to discuss how this software and other similar types will evolve in the coming years, and how it could combine with related trends like no-code and process mapping. He will be joined on the panel by investor Laela Sturdy from CapitalG and ServiceNow’s Dave Wright to discuss the state of the automation market, why it’s so hot and where the next opportunities could be.

In addition to our discussion with Dines, the conference will also include Databricks’ Ali Ghodsi, Salesforce’s Kathy Baxter and Puppet’s Abby Kearns, as well as investors Casey Aylward and Sarah Guo, among others. We hope you’ll join us. It’s going to be a stimulating day.

Buy your pass now to save up to $100. We can’t wait to see you in October!

Is your company interested in sponsoring or exhibiting at TC Sessions: SaaS 2021? Contact our sponsorship sales team by filling out this form.

#abby-kearns, #ali-ghodsi, #articles, #artificial-intelligence, #automation, #business-process-automation, #business-process-management, #business-software, #casey-aylward, #ceo, #daniel-dines, #databricks, #dave-wright, #enterprise, #kathy-baxter, #laela-sturdy, #machine-learning, #robotic-process-automation, #rpa, #salesforce, #sarah-guo, #servicenow, #software, #tc, #tc-sessions-saas-2021, #technology, #uipath

Build a digital ops toolbox to streamline business processes with hyperautomation

Relying on a single technology as a lifeline is a losing battle now. When simple automation no longer does the trick, delivering end-to-end automation requires a combination of complementary technologies that can give business processes a facelift: the digital operations toolbox.

According to a McKinsey survey, enterprises that have been successful with digital transformation efforts were more likely to have adopted sophisticated technologies such as artificial intelligence, the Internet of Things or machine learning. Enterprises can achieve hyperautomation with the digital ops toolbox, the hub for your digital operations.

The toolbox is a synchronous medley of intelligent business process management (iBPM), robotic process automation (RPA), process mining, low code, artificial intelligence (AI), machine learning (ML) and a rules engine. These technologies can be optimally combined to achieve the organization’s key performance indicators (KPIs) through hyperautomation.

The hyperautomation market is burgeoning: Analysts predict that by 2025, it will reach around $860 billion. Let’s see why.

The purpose of a digital ops toolbox

The toolbox, a treasure chest of technologies, helps with three crucial aspects: process automation, orchestration and intelligence.

Process automation: A hyperautomation mindset introduces the world of “automating anything that can be,” whether that’s a process or a task. If something can be handled by bots or other technologies, it should be.

Orchestration: Hyperautomation adds an orchestration layer on top of simple automation. Technologies like intelligent business process management orchestrate the entire process.

Intelligence: Machines can automate repetitive tasks, but they lack the decision-making capabilities of humans. To reach the point where machines can “think and act,” or attain cognitive skills, we need AI. Combining AI, ML and natural language processing algorithms with analytics pushes simple automation toward cognitive automation: instead of just following if-then rules, the technologies gather insights from the data, and those insights let bots make decisions rather than merely execute steps.
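To make that contrast concrete, here is a minimal, purely illustrative sketch: the first function is the kind of hard-coded if-then rule a simple bot follows, while the second learns the same routing decision from historical data. The scikit-learn pipeline and all of the example data are assumptions made for illustration, not part of any particular hyperautomation product.

```python
# Illustrative only: a hard-coded rule versus a decision learned from data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def route_by_rule(ticket_text: str) -> str:
    # Simple automation: a fixed if-then rule.
    if "refund" in ticket_text.lower():
        return "billing"
    return "general"

def train_router(historical_texts, historical_queues):
    # "Intelligence": learn the routing decision from labeled history.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(historical_texts, historical_queues)
    return model

# Hypothetical usage with invented data:
model = train_router(
    ["I want my money back", "App crashes on login", "Refund for a duplicate charge"],
    ["billing", "technical", "billing"],
)
print(route_by_rule("Please refund me"))             # rule-based decision
print(model.predict(["The app keeps crashing"])[0])  # learned decision
```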

 

Simple automation versus hyperautomation

Here’s a story of evolving from simple automation to hyperautomation with an example: an order-to-cash process.
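As a rough, vendor-neutral illustration of that evolution, the sketch below wires a few hypothetical order-to-cash task bots into a single orchestrated flow with an exception check along the way; every step name and the exception rule are invented for the example.

```python
# Illustrative only: orchestrating individual task bots into one flow.
from dataclasses import dataclass, field

@dataclass
class Order:
    order_id: str
    amount: float
    status: str = "received"
    history: list = field(default_factory=list)

def capture_order(order: Order):     # task bot 1
    order.status = "captured"

def generate_invoice(order: Order):  # task bot 2
    order.status = "invoiced"

def collect_payment(order: Order):   # task bot 3
    order.status = "paid"

def looks_like_exception(order: Order) -> bool:
    # Stand-in for a learned anomaly check; a simple threshold here.
    return order.amount > 10_000

def run_order_to_cash(order: Order) -> Order:
    # Simple automation stops at a single bot; the orchestration layer
    # chains the steps and routes exceptions to a human instead.
    for step in (capture_order, generate_invoice, collect_payment):
        if looks_like_exception(order):
            order.history.append(f"escalated to a human before {step.__name__}")
            return order
        step(order)
        order.history.append(step.__name__)
    return order

print(run_order_to_cash(Order("A-100", 250.0)).history)
```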

#artificial-intelligence, #business-process-management, #business-software, #column, #data-mining, #ec-cloud-and-enterprise-infrastructure, #ec-column, #ec-enterprise-applications, #enterprise, #machine-learning, #minimum-viable-product, #process-mining, #robotic-process-automation, #software-development, #tc

Achieving digital transformation through RPA and process mining

Understanding what you will change is the most important step in achieving a long-lasting and successful robotic process automation (RPA) transformation. There are three pillars that will be most affected by the change: people, process and digital workers (also referred to as robots). The interaction of these three pillars executes workflows and tasks and, if they are integrated cohesively, determines the success of an enterprise-wide digital transformation.

Robots are not coming to replace us; they are coming to take over the repetitive, mundane and monotonous tasks we’ve never been fond of. They are here to transform the work we do by allowing us to focus on innovation and impactful work. RPA ties decisions and actions together. It is the skeletal structure of a digital process that carries information from point A to point B. However, the decision-making capability to understand and decide what comes next will be fueled by RPA’s integration with AI.

We are seeing software vendors adopt vertical technology capabilities and offer a wide range of tools to address the three pillars mentioned above. These include powerhouses like UiPath, which recently went public; Microsoft, with its Softomotive acquisition; and Celonis, which recently raised a $1 billion Series D round. RPA firms call it “intelligent automation,” whereas Celonis focuses on its execution management system. Both are aiming to be a one-stop shop for all things related to process.

We have seen investments in various product categories for each stage of the intelligent automation journey: process and task mining for process discovery; centralized business process repositories for CoEs and executives to manage the pipeline and measure cost versus benefit; and artificial intelligence solutions for intelligent document processing.

For your transformation journey to be successful, you need to develop a deep understanding of your goals, people and the process.

Define goals and measurements of success

From a strategic standpoint, success measures for automating, optimizing and redesigning work should not be solely centered around metrics like decreasing fully loaded costs or FTE reduction; they should put people at the center. To measure improved customer and employee experiences, pay special attention to metrics like decreases in throughput time or rework rate, and to findings such as vendors that deliver late, missed invoice payments, or loan requests from individuals who are more likely to pay back late. These provide more targeted success measures for specific business units.

The returns realized with an automation program are not limited to metrics like time or cost savings. The overall performance of an automation program can be more thoroughly measured as the sum of improvements in customer and employee experience (CX/EX) metrics across different business units. For each business process you will be redesigning, optimizing or automating, set a definitive problem statement and try to find the right solution to solve it. Do not try to fit predetermined solutions to the problems. Start with the problem and goal first.

Understand the people first

To accomplish enterprise digital transformation via RPA, executives should put people at the heart of their program. Understanding the skill sets and talents of the workforce within the company can yield better knowledge of how well each employee can contribute to the automation economy within the organization. A workforce that is continuously retrained and upskilled learns how to automate and flexibly complete tasks together with robots and is better equipped to achieve transformation at scale.

#api, #artificial-intelligence, #automation, #business-process-management, #cloud-elements, #column, #ec-column, #ec-enterprise-applications, #enterprise, #microsoft, #ml, #process-mining, #robot-process-automation, #uipath, #workflow

How to cut through the promotional haze and select a digital building platform

Everyone from investors to casual LinkedIn observers has more reasons than ever to look at buildings and wonder what’s going on inside. The property industry is known for moving slowly when it comes to adopting new technologies, but novel concepts and products are now entering this market at a dizzying pace.

However, this ever-growing array of smart-building products has made it confusing for professionals who seek to implement digital building platform (DBP) technologies in their spaces, let alone across their entire enterprise. The waters get even murkier when it comes to cloud platforms and their impact on ROI with regard to energy usage and day-to-day operations.

Facility managers, energy professionals and building operators are increasingly hit with daily requests to review the latest platform for managing and operating their buildings. Here are a few tips to help decision-makers clear through the marketing fluff and put DBP platforms to the test.

The why, how and what

Breaking down technology decisions into bite-sized pieces, starting with fundamental functions, is the most straightforward way to cut through the promotional haze. Ask two simple questions: Who on your team will use this technology and what problem will it solve for them? Answers to these questions will help you maintain your key objectives, making it easier to narrow down the hundreds of options to a handful.

Another way to prioritize problems and solutions when sourcing smart-building technology is to identify your use cases. If you don’t know why you need a technology platform for your smart building, you’ll find it difficult to tell which option is better. Further, once you have chosen one, you’ll be hard put to determine if it has been successful. We find use cases draw the most direct line from why to how and what.

For example, let’s examine the why, how and what questions for a real estate developer planning to construct or modernize a commercial office building:

  • Why will people come? — Our building will be full of amenities and technological touches that will make discerning tenants feel comfortable, safe and part of a warm community of like-minded individuals.
  • How will we do it? — Implement the latest tenant-facing technology offering services and capabilities that are not readily available at home. We will create indoor and outdoor environments that make people feel comfortable and happy.
  • What tools, products and technology will we use?

This last question is often the hardest to answer and is usually left until the last possible moment. For building systems integrators, this is where the real work begins.

Focus on desired outcomes

When various stakeholder groups begin their investigations of the technology, it is crucial to define the outcomes everyone hopes to achieve for each use case. When evaluating specific products, it helps to categorize them at high levels.

Several high-level outcomes, such as digital twin enablement, data normalization and data storage, are expected across multiple categories of systems. However, only an enterprise building management system includes the most expected outcomes. Integration platform as a service, bespoke reports and dashboarding, analytics as a service and energy-optimization platforms have various enabled and optional outcomes.

The following table breaks down a list of high-level outcomes and aligns them to a category of smart-building platforms available in the market. Expanded definitions of each item are included at the end of this article.

#augmented-reality, #building-management-systems, #business-process-management, #column, #digital-building-platform, #digital-transformation, #ec-column, #ec-how-to, #ec-real-estate-and-proptech, #enterprise, #real-estate

How to launch a successful RPA initiative

Robotic process automation (RPA) is rapidly moving beyond the early adoption phase across verticals. Automating just basic workflow processes has resulted in such tremendous efficiency improvements and cost savings that businesses are adopting automation at scale and across the enterprise.

While there is a technical component to robotic automation, RPA is not a traditional IT-driven solution. It is, however, still important to align the business and IT processes around RPA. Adopting business automation for the enterprise should be approached as a business solution that happens to require some technical support.

A strong working relationship between the CFO and CIO will go a long way in getting IT behind, and in support of, the initiative rather than in front of it.

More important to the success of a large-scale RPA initiative is support from senior business executives across all lines of business and at every step of the project, with clear communications and an advocacy plan all the way down to LOB managers and employees.

As we’ve seen in real-world examples, successful campaigns for deploying automation at scale require a systematic approach to developing a vision, gathering stakeholder and employee buy-in, identifying use cases, building a center of excellence (CoE) and establishing a governance model.

Create an overarching vision

Your strategy should include defining measurable, strategic objectives. Identify strategic areas that benefit most from automation, such as the supply chain, call centers, accounts payable or the revenue cycle, and start with obvious areas where the business sees delays due to manual workflow processes. Remember, the goal is not to replace employees; you’re aiming to speed up processes, reduce errors, increase efficiencies and let your employees focus on higher-value tasks.

#automation, #business-process-automation, #business-process-management, #column, #ec-column, #ec-enterprise-applications, #ec-how-to, #enterprise, #robotic-process-automation, #rpa, #saas, #workflow

For M&A success, tap legal early and often

While mergers and acquisitions may be the right strategic path for many businesses, organizations tend to underestimate the role in-house legal teams play in a large-scale strategic transaction until the company is firmly entrenched in a deal.

While the CEO and board might fully appreciate the counsel of the legal team, the ability of the legal team to earn the support of the business — from product and development to marketing and HR — is critical to a smooth, efficient closing and post-close integration process.

Having been on the inside of M&A transactions, I can offer a few insights that I recommend any executive team considering a major strategic transaction keep in mind when working with, and setting expectations for, the in-house legal function as the deal moves from business agreement to closing and through integration.

Is this the right transaction to move the business forward?

When you’re thinking of M&A (or any other type of strategic transaction, for that matter), it is critical to understand why you’re pursuing a deal and what the potential implications (both good and bad) may be for the business at large.

As the executive or founding team, have you agreed that doing the deal is the best way to further the overall strategic business objectives? Is the proposed deal allowing the company to scale more rapidly or efficiently? Does the transaction provide for a more diversified, complementary product offering?

After settling that the deal is the best way to achieve the overall business objective, the next focus is on execution. How will the resulting leadership bring together the two organizations? Do you have a plan on how to go from closing the transaction to successfully moving forward with the strategic purposes for doing the transaction in the first place? Is there agreement on product direction, go-to-market strategies, staffing, company culture, etc.?

These high-level discussions are important to have while evaluating a potential deal, and bringing in your internal legal leadership is critical in these early phases. You may identify during these early discussions aspects that are critical for the deal to be a success, and being sure your legal team is aware of these aspects allows the team to anticipate post-closing issues and resolve them proactively with the structure of the deal or by explicitly calling out critical obligations of each party.

Your in-house legal team should be held accountable for catching things specific to your business that outside attorneys will miss. Outside attorneys are experts in M&A or IPOs or venture financings or whatever else, but your in-house lawyers are experts in your company — that’s the true value of having an in-house team.

#advisors, #business-process-management, #column, #ec-column, #ec-how-to, #human-resource-management, #lawyer, #ma, #mergers-and-acquisitions, #tc

Investment in construction automation is essential to rebuilding US infrastructure

With the United States moving all-in on massive infrastructure investment, much of the discussion has focused on jobs and building new green industries for the 21st century. While the Biden administration’s plan will certainly expand the workforce, it also provides a massive opportunity for the adoption of automation technologies within the construction industry.

Despite the common narrative of automation taking away human jobs, the two are not nearly as much in conflict as they may seem, especially with new investments creating space for new roles and work. In fact, one of the greatest problems facing the construction industry remains a lack of labor, making automation a necessity for moving forward with these ambitious projects.

The residential construction industry alone had between 223,000 and 332,000 unfilled job vacancies in 2020, even as the unemployment rate peaked at 15%, roughly the same number of vacancies as when unemployment was only 4.1%. Between 1985 and 2015, the average age of construction workers increased from 36 to 42.5, while the share of workers aged 55 and older grew from 12% to over 20%. The 2018 Population Survey conducted by the Census Bureau found that workers under 25 comprised just 9% of the construction industry, compared to 12.3% of the overall U.S. labor force.

Productivity in the construction industry has likewise remained static since 1995, primarily driven by the aging demographic of the existing labor force, the apprenticeship nature of the job, and difficulty in attracting and retaining new workers. In short, there is insufficient labor to do the job, while existing staff are becoming less productive as skilled workers who have accumulated decades of experience in their crafts are lost to retirement.

Automation will need to be a key element of any major infrastructure push, especially if we hope to meet the ambitious goals of current proposals. That being said, not all areas of the construction industry are primed, or even viable, for this shift to automation.

The challenges of construction automation

Construction is one of the world’s largest industries but has two major challenges: market fragmentation and complex stakeholders.

The construction industry as a whole is nationally fragmented but occasionally locally concentrated. This differs depending on the segment and type of construction company, with most firms comprising fewer than 10 workers. The top 100 general contractors account for less than 20% of the total construction market. Subcontractors are even more fragmented, with the top players accounting for less than 1% of total market share. This makes sales processes and scaling very slow and highly inefficient.

#artificial-intelligence, #biden-administration, #business-process-management, #column, #construction, #ec-column, #ec-real-estate-and-proptech, #hardware, #manufacturing, #real-estate, #startups, #united-states

Vista Equity takes minority stake in Canada’s Vena with $242M investment

Vena, a Canadian company focused on the Corporate Performance Management (CPM) software space, has raised $242 million in Series C funding from Vista Equity Partners.

As part of the financing, Vista Equity is taking a minority stake in the company. The round follows $25 million in financing from CIBC Innovation Banking last September, and brings Vena’s total raised since its 2011 inception to over $363 million.

Vena declined to provide any financial metrics or the valuation at which the new capital was raised, saying only that its “consistent growth and…strong customer retention and satisfaction metrics created real demand” as it considered raising its C round.

The company was originally founded as a B2B provider of planning, budgeting and forecasting software. Over time, it’s evolved into what it describes as a “fully cloud-native, corporate performance management platform” that aims to empower finance, operations and business leaders to “Plan to Grow” their businesses. Its customers hail from a variety of industries, including banking, SaaS, manufacturing, healthcare, insurance and higher education. Among its over 900 customers are the Kansas City Chiefs, Coca-Cola Consolidated, World Vision International and ELF Cosmetics.

Vena CEO Hunter Madeley told TechCrunch the latest raise is “mostly an acceleration story for Vena, rather than charting new paths.”

The company plans to use its new funds to build out and enable its go-to-market efforts as well as invest in its product development roadmap. It’s not really looking to enter new markets, considering it’s seeing what it describes as “tremendous demand” in the markets it currently serves directly and through its partner network.

“While we support customers across the globe, we’ll stay focused on growing our North American, U.K. and European business in the near term,” Madeley said.

Vena says it leverages the “flexibility and familiarity” of an Excel interface within its “secure” Complete Planning platform. That platform, it adds, brings people, processes and systems into a single source solution to help organizations automate and streamline finance-led processes, accelerate complex business processes and “connect the dots between departments and plan with the power of unified data.”            

Early backers JMI Equity and Centana Growth Partners will remain active, partnering with Vista “to help support Vena’s continued momentum,” the company said. As part of the raise, Vista Equity Managing Director Kim Eaton and Marc Teillon, senior managing director and co-head of Vista’s Foundation Fund, will join the company’s board.

“The pandemic has emphasized the need for agile financial planning processes as companies respond to quickly-changing market conditions, and Vena is uniquely positioned to help businesses address the challenges required to scale their processes through this pandemic and beyond,” said Eaton in a written statement. 

Vena currently has more than 450 employees across the U.S., Canada and the U.K., up from 393 last year at this time.

#banking, #business-process-management, #canada, #coca-cola, #enterprise, #exit, #finance, #funding, #fundings-exits, #healthcare, #information-technology, #ma, #manufacturing, #private-equity, #recent-funding, #startups, #tc, #united-kingdom, #vista-equity-partners

5 emerging use cases for productivity infrastructure in 2021

When the world flipped upside down last year, nearly every company in every industry was forced to implement a remote workforce in just a matter of days — they had to scramble to ensure employees had the right tools in place and customers felt little to no impact. While companies initially adopted solutions for employee safety, rapid response and short-term air cover, they are now shifting their focus to long-term, strategic investments that empower growth and streamline operations.

As a result, categories that make up productivity infrastructure — cloud communications services, API platforms, low-code development tools, business process automation and AI software development kits — grew exponentially in 2020. This growth was boosted by an increasing number of companies prioritizing tools that support communication, collaboration, transparency and a seamless end-to-end workflow.

Productivity infrastructure is on the rise and will continue to be front and center as companies evaluate what their future of work entails and how to maintain productivity, rapid software development and innovation with distributed teams.

According to McKinsey & Company, the pandemic accelerated the share of digitally enabled products by seven years, and “the digitization of customer and supply-chain interactions and of internal operations by three to four years.” As demand continues to grow, companies are taking advantage of the benefits productivity infrastructure brings to their organization both internally and externally, especially as many determine the future of their work.

Automate workflows and mitigate risk

Developers rely on platforms throughout the software development process to connect data, process it, increase their go-to-market velocity and stay ahead of the competition with new and existing products. They have enormous amounts of end-user data on hand, and productivity infrastructure can remove barriers to access, integrate and leverage this data to automate the workflow.

Access to rich interaction data combined with pre-trained ML models, automated workflows and configurable front-end components enables developers to drastically shorten development cycles. Through enhanced data protection and compliance, productivity infrastructure safeguards critical data and mitigates risk while reducing time to ROI.

As the post-pandemic workplace begins to take shape, how can productivity infrastructure support enterprises where they are now and where they need to go next?

#artificial-intelligence, #business-process-management, #cloud-computing, #column, #ec-column, #ec-enterprise-applications, #ml, #productivity, #remote-work, #startups

3 steps to ease the transition to a no-code company

Gartner predicts low/no-code will represent 65% of all app development by 2024. Clearly, it’s the future, but what is it, and how can you turn your organization into a no-code company to get ahead of the trend?

No-code is changing how organizations build and maintain applications. It democratizes application development by creating “citizen developers” who can quickly build out applications that meet their business-facing needs in real time, realigning IT and business objectives by bringing them closer together than ever.

Anyone can now create and modify their own tools without complex coding skills using no-code’s easy-to-use visual interfaces and drag-and-drop functionality. This creates organizational flexibility and agility, addresses growing IT backlogs and budgets, and helps fill the IT gap caused by a shortage of skilled developers.

Despite the many benefits, adopting a no-code platform won’t suddenly turn you into a no-code company. It’s a process. Here are three steps to help your transition:

1. Future-proof your tech strategy

For a long time, the threat of digital disruption and the subsequent need for digital transformation has been driving IT strategy. The pandemic made this threat all the more acute. Most organizations were forced to rapidly rethink their tech strategy in the new digital normal.

This strategy has been effective for many organizations, but it’s also been largely reactive. Organizations have been fighting to keep up with the acceleration of digital trends. The opportunity with no-code, which is still in its early days, is to make that tech strategy more proactive.

We find that many organizations still think about tech strategy from a predominantly IT lens without considering organizational structural changes that could be around the corner. Think about it: Having a critical mass of citizen developers in five years could dramatically change how your organization allocates resources, organizes departments and even hires talent.

Don’t future-proof your tech strategy for a slightly evolved version of your current organization; future-proof it for a fundamentally more democratized environment where everyone can build their own applications for their own needs. That’s a profound change. Here are three things to consider:

#business-process-management, #citizen-developers, #column, #developer, #digital-transformation, #ec-column, #ec-enterprise-applications, #ec-how-to, #no-code-software, #saas, #software-development, #startups

DeepSee.ai raises $22.6M Series A for its AI-centric process automation platform

DeepSee.ai, a startup that helps enterprises use AI to automate line-of-business problems, today announced that it has raised a $22.6 million Series A funding round led by ForgePoint Capital. Previous investors AllegisCyber Capital and Signal Peak Ventures also participated in this round, which brings the Salt Lake City-based company’s total funding to date to $30.7 million.

The company argues that it offers enterprises a different take on process automation. The industry buzzword these days is “robotic process automation,” but DeepSee.ai argues that what it does is different. It describes its system as “knowledge process automation” (KPA) and defines this as a system that “mines unstructured data, operationalizes AI-powered insights, and automates results into real-time action for the enterprise.” The company also argues that today’s bots focus on basic task automation that doesn’t offer the kind of deeper insights that sophisticated machine learning models can bring to the table. And it stresses that it doesn’t aim to replace knowledge workers but to help them leverage AI to turn the plethora of data that businesses now collect into actionable insights.

Image Credits: DeepSee.ai

“Executives are telling me they need business outcomes and not science projects,” writes DeepSee.ai CEO Steve Shillingford. “And today, the burgeoning frustration with most AI-centric deployments in large-scale enterprises is they look great in theory but largely fail in production. We think that’s because right now the current ‘AI approach’ lacks a holistic business context relevance. It’s unthinking, rigid, and without the contextual input of subject-matter experts on the ground. We founded DeepSee to bridge the gap between powerful technology and line-of-business, with adaptable solutions that empower our customers to operationalize AI-powered automation – delivering faster, better, and cheaper results for our users.”

To help businesses get started with the platform, DeepSee.ai offers three core tools. There’s DeepSee Assembler, which ingests unstructured data and gets it ready for labeling, model review and analysis. Then, DeepSee Atlas can use this data to train AI models that can understand a company’s business processes and help subject-matter experts define templates, rules and logic for automating a company’s internal processes. The third tool, DeepSee Advisor, meanwhile focuses on using text analysis to help companies better understand and evaluate their business processes.

Currently, the company’s focus is on providing these tools for insurance companies, the public sector and capital markets. In the insurance space, use cases include fraud detection, claims prediction and processing, and using large amounts of unstructured data to identify patterns in agent audits, for example.

That’s a relatively limited number of industries for a startup to operate in, but the company says it will use its new funding to accelerate product development and expand to new verticals.

“Using KPA, line-of-business executives can bridge data science and enterprise outcomes, operationalize AI/ML-powered automation at scale, and use predictive insights in real time to grow revenue, reduce cost, and mitigate risk,” said Sean Cunningham, Managing Director of ForgePoint Capital. “As a leading cybersecurity investor, ForgePoint sees the daily security challenges around insider threat, data visibility, and compliance. This investment in DeepSee accelerates the ability to reduce risk with business automation and delivers much-needed AI transparency required by customers for implementation.”

#allegiscyber-capital, #articles, #artificial-intelligence, #automation, #automation-anywhere, #business-process-automation, #business-process-management, #business-software, #cloud, #emerging-technologies, #enterprise, #forgepoint-capital, #machine-learning, #recent-funding, #robotic-process-automation, #salt-lake-city, #signal-peak-ventures, #startups

Lang.ai snags $2M to remove technical burden of implementing AI for businesses

Lang.ai, which has developed a no-code platform for businesses, closed on a $2 million seed funding round.

The company’s SaaS platform aims to allow business users to structure any free-text data with custom categories built through a drag & drop interface, based on AI-extracted concepts.

Village Global led the financing, which included participation from new and existing backers including Acceleprise, Oceans Ventures, Alumni Ventures Group, 2.12 Angels, GTMFund, and Lorimer Ventures.

Spain-born Jorge Penalva founded Lang.ai in 2018 with the goal of giving any business user the ability “to build enterprise-ready natural language processing models in just minutes.” It was built to give non-engineers a way to automate repetitive tasks in use cases such as customer service and claims processing.

“It can be installed in our cloud or theirs,” Penalva said. 

Lang.ai saw its revenue double from the last quarter of 2020 to the first quarter of 2021 and the seed funding was motivated mainly to continue that momentum.

“We’re getting demand in the form of projects with our larger customers, so we needed the funding to be able to support that demand,” Penalva told TechCrunch.

In his previous role as CEO of Séntisis, Penalva realized that processes driven by free-text data remained a blind spot for many companies.

“Today, millions of dollars and hours are invested by companies to manually read and process textual information captured from disparate areas of their business,” he said.

His mission with Lang.ai is to “empower businesses to put AI to work for them, without the technical complexities of building and training custom algorithms.” 

Specifically, Penalva said that Lang.ai’s product analyzes a customer’s historical data “in minutes” and suggests AI-extracted concepts to build custom categories through a drag & drop interface. The custom categories are applied in real-time to automate “tedious” tasks such as the manual tagging and routing of support tickets, the processing of insurance claims and the dispatching of field engineers to incoming work order requests.
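Lang.ai does not spell out the internals of that pipeline here, but the general pattern, surfacing recurring concepts from historical text so a business user can turn them into categories, can be sketched generically. The snippet below clusters a few invented tickets and prints each cluster’s top terms as candidate “concepts”; it uses scikit-learn and is in no way Lang.ai’s actual API.

```python
# Generic illustration (not Lang.ai's API): mine candidate concepts from
# historical free text so a business user can name them as categories.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

tickets = [  # invented historical data
    "Refund my duplicate charge please",
    "I was charged twice and need a refund",
    "App crashes when I upload a photo",
    "Upload keeps failing with an error",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(tickets)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for i, center in enumerate(clusters.cluster_centers_):
    top_terms = [terms[j] for j in center.argsort()[-3:][::-1]]
    print(f"candidate concept {i}: {top_terms}")
```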

Put simply, Lang.ai’s goal is to remove the technical burden of implementing AI for a business.

Lang.ai’s community of users (called “Citizen NLP Builders”) consists of mostly non-technical business roles, ranging from customer service operations to marketers, business analysts and UX designers.

Customers include Freshly, Userzoom, Playvox, Spain’s CaixaBank, Yalo Chat and Bancolombia, among others. 

Ben Segal, director of infrastructural efficiency at Freshly, described the platform as “so nimble.”

“Out of the box, it took us two days to make automated tagging 15% more reliable than a previous platform that we had had in production for 2 years, with the added benefit that now all of our teams can tap into and exploit our support data,” Segal said. “The marketing team has built workflows to understand key customer moments. Our data and analytics team is super excited about having all these new tags in Snowflake, and it’s crazy how easy it is to use.”

Penalva is proud of the fact that Lang.ai’s engineering team is primarily based in Spain and that he has been able to grow the 10-person company outside of his native country.

“With very few resources, it took us a little over two years to build an enterprise-grade product and find the right set of early customers and investors who are aligned with our vision,” he said. “I moved to the US from Spain to build a global company and this is just the beginning…Lang has always been powered by immigrant hustle, and it has been core to our values since day 1.”

#alumni-ventures-group, #artificial-intelligence, #business-process-management, #funding, #recent-funding, #saas, #spain, #startups, #tc, #tc-include, #techcrunch-include-company

Levity is a ‘no-code’ AI tool to let anyone create workflow automations

Levity, which has been operating in stealth (until now), is the latest no-code company to throw its wares into the ring, having picked up $1.7M in pre-seed funding led by Gil Dibner’s Angular Ventures. The Berlin-based startup wants to bring AI-powered workflow automation to anyone, letting knowledge workers automate tedious, repetitive and manual parts of their job without the need to learn how to code.

Suitable for customer service, marketing, operations, HR and more, Levity has elected to be a horizontal offering from the get-go. Typical repetitive tasks that can be automated include reviewing and categorizing documents, images or text. The premise is that conventional, rule-based automation software isn’t able to automate tasks like these because they require cognitive abilities, meaning they are usually done manually. This, of course, is where machine learning comes into play.

“We want to solve the problem that people spend so much time at their jobs doing boring, repetitive stuff that can be automated to free up space and time for fun and interesting work,” says Gero Keil, co-founder and CEO. “Even though this is what AI has been promising us for decades, there are very few solutions out there, and even less for non-technical people who can’t code”.

To that end, Keil says Levity’s entire mission is to help non-technical knowledge workers automate what they couldn’t automate before. Specifically, the startup targets work processes that involve making decisions on unstructured data, such as images, text, PDFs and other documents.

“For example, if a company receives hundreds or thousands of emails from partners and customers with attachments every day, someone typically has to download the attachment, look at it and then decide what to do with it,” explains Keil. “With Levity, they can train their own custom AI on all of the historic data that they have accumulated, and once it has learned from that it seamlessly integrates with their existing tools and workflows e.g. Dropbox, Gmail, Slack etc.”
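As a purely illustrative sketch of the pattern Keil describes (train on labeled history, then classify and route new attachments), the snippet below auto-handles confident predictions and sends everything else to a human. It is not Levity’s product or API; the labels, routing targets and confidence threshold are all invented.

```python
# Illustrative sketch, not Levity's API: route confident predictions
# automatically and push low-confidence attachments to a human reviewer.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

history_text = ["invoice total due 500", "signed contract attached", "invoice for services rendered"]
history_label = ["invoice", "contract", "invoice"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(history_text, history_label)

ROUTES = {"invoice": "accounting-inbox", "contract": "legal-inbox"}  # invented targets
CONFIDENCE = 0.7  # invented threshold

def triage(attachment_text: str) -> str:
    probabilities = model.predict_proba([attachment_text])[0]
    label = model.classes_[probabilities.argmax()]
    if probabilities.max() < CONFIDENCE:
        return "human-review"   # keep a person in the loop
    return ROUTES[label]        # in practice: file it, forward it, or post a notification

print(triage("please find our invoice attached"))
```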

More broadly, he says there are many companies struggling to “productionize AI” that would really benefit from having an end-to-end platform “that enables them to build their own AI solutions and make them part of their processes”.

Keil argues that Levity’s main competitor is people doing work manually, but concedes that there is crossover with automated machine learning tools, workflow automation offerings and labeling tools.

“Instead of going deep into every domain of the ML value chain and making the lives of developers and data scientists at large corporations easier, we focus on only the most essential bits and pieces, wrap them in simple and enjoyable UX and abstract the rest away,” he says. “That makes us the best for non-developers in small and medium-sized businesses that want to automate previously non-automatable processes in the most straightforward way. The people that have the automation problem become the same people that solve the automation problem; it’s a paradigm shift just like what Wix and Squarespace did to websites”.

Adds Gil Dibner, general partner and founder at Angular Ventures, in a statement: “Levity is driving a massive shift that will affect all knowledge workers. By allowing knowledge workers to easily train AI engines, build AI-powered automations, and integrate them into their everyday workflows, Levity is radically democratizing the benefits of AI.”

Alongside Angular, Levity’s other backers include: System.One, Discovery Ventures (founders of SumUp), Martin Henk (founder of Pipedrive) and various additional unnamed angel investors.

#angular-ventures, #artificial-intelligence, #automation, #berlin, #business-process-management, #europe, #fundings-exits, #levity, #startups, #tc

Microsoft brings new robotic process automation features to its Power Platform

Earlier this year, Microsoft acquired Softomotive, a player in the low-code robotic process automation space with a focus on Windows. Today, at its Ignite conference, the company is launching Power Automate Desktop, a new application based on Softomotive’s technology that lets anyone automate desktop workflows without needing to program.

“The big idea of Power Platform is that we want to go make it so development is accessible to everybody,” Charles Lamanna, Microsoft’s corporate VP for its low-code platform, told me. “And development includes understanding and reporting on your data with Power BI, building web and mobile applications with Power Apps, automating your tasks — whether it’s through robotic process automation or workflow automation — with Power Automate, or building chatbots and chat-based experiences with Power Virtual Agent.”

Power Automate already allowed users to connect web-based applications, similar to Zapier and IFTTT, but the company also launched a browser extension late last year to help users connect native system components to Power Automate. Now, with the integration of the Softomotive technology and the launch of this new low-code Windows application, it’s taking this integration into the native Windows user interface one step further.

“Everything still runs in the cloud and still connects to the cloud, but you now have a rich desktop application to author and record your UI automations,” Lamanna explained. He likened it to an ‘ultimate connector,’ noting that the “ultimate API is just the UI.”

He also stressed that the new app feels like any other modern Office app like Outlook (which is getting a new Mac version today, by the way) or Word. And like the modern versions of those apps, Power Automate Desktop derives a lot of its power from being connected to the cloud.

It’s also worth noting that Power Automate isn’t just a platform for automating simple two- or three-step processes (like sending you a text message when your boss emails you), but also for multistep, business-critical workflows. T-Mobile, for example, is using the platform to automate some of the integration processes between its systems and Sprint.

Lamanna noted that for some large enterprises, adopting these kinds of low-code services necessitates a bit of a culture shift. IT still needs to have some insights into how these tools are used, after all, to ensure that data is kept safe, for example.

Another new feature the company announced today is an integration between the Power Platform and GitHub, which is now in public preview. The idea here is to give developers the ability to create their own software lifecycle workflows. “One of the core ideas of Power Platform is that it’s low code,” Lamanna said. “So it’s built first for business users, business analysts, not the classical developers. But pro devs are welcome. The saying I have is: we’re throwing a party for business users, but pro devs are also invited to the party.” But to get them onto the platform, the team wants to meet them where they are and let them use the tools they already use — and that’s GitHub (and Visual Studio and Visual Studio Code).

#articles, #author, #automation, #business, #business-process-automation, #business-process-management, #business-software, #economy, #ifttt, #microsoft, #microsoft-windows, #player, #softomotive, #tc, #windows, #zapier

ZenHub’s new automation tools improve developer hand-offs in GitHub

ZenHub, the popular project management solution for GitHub users, today announced the launch of its new features for automating hand-offs between teams. The idea behind Automated Workflows, as it is called, is to remove some of the manual busywork of updating multiple boards across teams when a new patch is ready to go to testing, for example (or when it fails those tests and the development team has to fix it).

As ZenHub founder and CEO Aaron Upright told me, Automated Workflows are only the first step in the company’s journey to becoming not just the most integrated service on GitHub but also the most automated.

Image Credits: ZenHub

Teams still struggle with the mechanics of agile project management, he noted. “Things like what frameworks to choose. How to organize their projects. You talk to small companies and teams, you talk to large companies — it’s a problem for everyone, where people don’t know if they should be Scrum, or Kanban or how to organize Sprint planning meetings.” What ZenHub wants to do is remove as many of these friction points as possible and automate them for teams.

It’s starting with the hand-off between teams because that’s one of the pain points its customers are struggling with all the time. And since teams tend to have their own projects and workspaces, the ZenHub team had to build a solution that worked across a company’s various boards.

The result is a new tool that is essentially a drag-and-drop service that automatically creates notifications and moves items between workspaces as they move from QA to production, for example.

“It’s a way to automate work between different workspaces,” explained Upright. “And we’re really excited about this being kind of the first step in our automation journey.”
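ZenHub hasn’t published how Automated Workflows are implemented, so the following is only a hypothetical sketch of the kind of hand-off rule the drag-and-drop builder expresses: when an issue lands in one team’s “done” pipeline, mirror it into another team’s workspace and notify them. Every workspace, pipeline and helper name here is made up.

```python
# Hypothetical sketch of a cross-workspace hand-off rule; this is not
# ZenHub's API. Workspace/pipeline names and helpers are assumptions.
from dataclasses import dataclass

@dataclass
class Issue:
    number: int
    title: str
    pipeline: str      # e.g. "In Progress", "Ready for QA"
    workspace: str     # e.g. "Engineering", "QA"

def hand_off(issue: Issue, to_workspace: str, to_pipeline: str) -> Issue:
    """Mirror an issue onto another team's board and notify that team."""
    moved = Issue(issue.number, issue.title, to_pipeline, to_workspace)
    print(f"Notify {to_workspace}: #{issue.number} '{issue.title}' "
          f"is now in '{to_pipeline}'")
    return moved

def on_pipeline_change(issue: Issue) -> Issue:
    # The rule a visual workflow builder would let a user express by dragging.
    if issue.workspace == "Engineering" and issue.pipeline == "Ready for QA":
        return hand_off(issue, to_workspace="QA", to_pipeline="To Test")
    return issue

if __name__ == "__main__":
    bug = Issue(512, "Fix login timeout", "Ready for QA", "Engineering")
    on_pipeline_change(bug)
```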

Over time, Upright expects, the team will be able to use machine learning to understand more about the connections its users are making between teams. Using that data, its systems may also be able to recommend workflows.

The next part of ZenHub’s focus on automation will be a tool for managing the Sprint planning process.

“Already today, ZenHub is capturing things like velocity. We’re measuring that on a team-by-team basis. We understand the priority of issues in our workflow. What we want to be able to do is allow teams to automatically set a Sprint schedule, say, for example, every two weeks. Then, based on the velocity that we know about your team (maybe your team can accomplish 50 story points every two weeks), we want to auto-build that Sprint for you.”
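Upright’s description maps onto a simple greedy loop: given a measured velocity and a priority-ordered backlog, keep pulling issues into the sprint until the story-point budget is spent. A minimal sketch of that idea (the backlog contents and the 50-point velocity are illustrative, and this is not ZenHub’s code):

```python
# Minimal sketch of velocity-based sprint auto-building, as described above.
# Backlog contents and the 50-point velocity are illustrative assumptions.
from typing import List, Tuple

def auto_build_sprint(backlog: List[Tuple[str, int]], velocity: int) -> List[str]:
    """Greedily fill a sprint from a priority-ordered backlog.

    backlog: (issue title, story points) pairs, highest priority first.
    velocity: story points the team historically completes per sprint.
    """
    sprint, remaining = [], velocity
    for title, points in backlog:
        if points <= remaining:
            sprint.append(title)
            remaining -= points
    return sprint

if __name__ == "__main__":
    backlog = [("Checkout bug", 13), ("New onboarding flow", 21),
               ("Search latency", 8), ("Dark mode", 13), ("Refactor auth", 5)]
    print(auto_build_sprint(backlog, velocity=50))
    # ['Checkout bug', 'New onboarding flow', 'Search latency', 'Refactor auth']
```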

#agile-software-development, #articles, #business, #business-process-management, #ceo, #developer, #economy, #github, #groupware, #kanban, #machine-learning, #scrum, #workflow, #zenhub

Layer gets $5.6M to make joint working on spreadsheets less hassle

Layer is not trying to replace Excel or Google Sheets. Instead, the Berlin-based productivity startup wants to make life easier for those whose job entails wrangling massive spreadsheets and managing data inputs from across an organization — such as for budgeting, financial reporting or HR functions — by adding a granular access-control layer on top.

The idea for a ‘SaaS to supercharge spreadsheets’ came to the co-founders as a result of their own experience of workflow process pain-points at the place they used to work, as is often the case with productivity startups.

“Constantin [Schünemann] and I met at Helpling, the marketplace for cleaning services, where I was the company’s CFO and I had to deal with spreadsheets on a daily basis,” explains co-founder Moritz ten Eikelder. “There was one particular reference case for what we’re building here — the update of the company’s financial model and business case, which was a 20MB Excel file with 30 different tabs and hundreds of rows of assumptions. It was a key steering tool for management and founders. It was also the basis for the financial reporting.

“On average it needed to be updated twice per month. And that required input by around about 20-25 people across the organization. So right then about 40 different country managers and various department heads. The problem was we could not share the entire file with [all the] people involved because it contained a lot of very sensitive information like salary data, cash burn, cash management etc.”

Sharing a Dropbox link to the file with the necessary individuals so they could update the sheet with their respective contributions would have risked breaking the master file. So instead, he says, they created individual templates and “carve outs” for different contributors. But this was still far from optimal from a productivity point of view. Hence feeling the workflow burn — and their own entrepreneurial itch.

“Once all the input was collected from the stakeholders you would start a very extensive and tedious copy-paste exercise — where you would copy from these 25 different sources and insert the data into your master file in order to create an up-to-date version,” says ten Eikelder, adding: “The pain points are pretty clear. It’s an extremely time-consuming and tedious process… And it’s extremely prone to error.”

Enter Layer: A web app that’s billed as a productivity platform for spreadsheets which augments rather than replaces them — sitting atop Microsoft Excel and Google Sheets files and bringing in a range of granular controls.

The idea is to offer a one-stop shop for managing access and data flows around multi-stakeholder spreadsheets, enabling access down to individual cell level and aiding collaboration and overall productivity around these key documents by streamlining the process of making and receiving data input requests.

“You start off by uploading an Excel file to our web application. In that web app you can start to build workflows across a feature spectrum,” says Schünemann — noting, for example, that the web viewer allows users to drag the cursor to highlight a range of cells they wish to share.

“You can do granular user provisioning on top of that where in the offline world you’d have to create manual carve outs or manual copies of that file to be able to shield away data for example,” he goes on. “On top of that you can then request input [via an email asking for a data submission].

“Your colleagues keep on working in their known environments and then once they have submitted input we’ve built something that is very similar to a track-changes functionality in Word. So you as a master user could review all changes in the Layer app — regardless of whether they’re coming through Excel or Google Sheets… And then we’ve built a consolidation feature so that you don’t need to manually copy-paste from different spreadsheets into one. So with just a couple of clicks you can accept changes and they will be taken over into your master file.”
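Layer is a hosted product rather than a library, but the manual carve-out-and-consolidate routine the founders describe can be approximated with the open-source openpyxl package, which reads and writes Excel files from Python. In the sketch below the file name, sheet name and cell range are all assumptions; it simply extracts a contributor’s editable slice from a master workbook and later copies the submitted values back, the copy-paste exercise Layer automates:

```python
# Rough approximation of the manual carve-out/consolidation workflow the
# article describes; not Layer's product or API. Requires openpyxl
# (pip install openpyxl); file, sheet and range names are assumptions.
from openpyxl import Workbook, load_workbook

MASTER = "financial_model.xlsx"
RANGE = "B2:D20"            # the cells one contributor is allowed to edit

def carve_out(contributor: str) -> str:
    """Copy only the contributor's cell range into a standalone file."""
    master = load_workbook(MASTER)["Budget"]
    out = Workbook()
    sheet = out.active
    sheet.title = "Budget"
    for row in master[RANGE]:
        for cell in row:
            sheet.cell(row=cell.row, column=cell.column, value=cell.value)
    path = f"carveout_{contributor}.xlsx"
    out.save(path)
    return path

def consolidate(contributor_file: str) -> None:
    """Paste the contributor's submitted values back into the master file."""
    master_wb = load_workbook(MASTER)
    master = master_wb["Budget"]
    submitted = load_workbook(contributor_file)["Budget"]
    for row in submitted[RANGE]:
        for cell in row:
            master.cell(row=cell.row, column=cell.column, value=cell.value)
    master_wb.save(MASTER)
```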

Layer’s initial sales focus is on the financial reporting function but the co-founders say they see this as a way of getting a toe in the door of their target mid-sized companies.

The team believes there are wider use cases for the tool, given the ubiquity of spreadsheets as a business tool. Although, for now, their target users are organizations with between 150 and 250 employees, so they’re not (yet) going after the enterprise market.

“We believe this is a pretty big [opportunity],” Schünemann tells TechCrunch. “Why? Because back in 2018 when we did our first research we initially started out with this one spreadsheet at Helpling, but after talking to 50 executives, most of them from the finance world or from the financial function of different sized companies, it’s pretty clear that the spreadsheet dependency is still to this day extremely high. And that holds true for financial use cases — 87% of all budgeting globally is still done via spreadsheets and not big ERP systems… but it also goes beyond that. If you think about it, spreadsheets are really the number one workflow platform still used to this day. It’s probably the most used user interface in any given company of a certain size.”

“Our current users we have, for example, a real estate company whereby the finance function is using Layer but also the project controller and also some parts of the HR team,” he adds. “And this is a similar pattern. You have similarly structured workflows on top of spreadsheets in almost all functions of a company. And the bigger you get, the more of them you have.

“We use the finance function as our wedge into a company — just because it’s where our domain experience lies. You also usually have a couple of selective use cases which tend to have these problems more because of the intersections between other departments… However sharing or collecting data in spreadsheets is used not only in finance functions.”

The 2019-founded startup’s productivity platform remains in private beta for now — and likely for the rest of this year — but it has just nabbed €5 million (~$5.6M) in seed funding to get the product to market, with a launch pegged for Q1 2021.

The seed round is led by Index Ventures (Max Rimpel is the lead there), with participation from earlier backers btov Partners. Angel investors also joining the seed include Ajay Vashee (CFO at Dropbox), Carlos Gonzales-Cadenaz (COO of GoCardless), Felix Jahn (founder and CEO of McMakler), Matt Robinson (founder of GoCardless and Nested) and Max Tayenthal (co-founder and CFO of N26).

Commenting in a statement, Index’s Rimpel emphasized the utility the tool offers for “large distributed organizations”, saying: “Spreadsheets are one of the most successful UI’s ever created, but they’ve been built primarily for a single user, not for large distributed organisations with many teams and departments inputting data to a single document. Just as GitHub has helped developers contribute seamlessly to a single code base, Layer is now bringing sophisticated collaboration tools to the one billion spreadsheet users across the globe.”

On the competition front, Layer said it sees its product as complementary to tech giants Google and Microsoft, given that the platform plugs directly into those spreadsheet standards. Other productivity startups, such as Airtable (a database tool for non-coders) and Smartsheet (which bills itself as a “collaboration platform”), are taking a more direct swing at the giants by gunning to assimilate the spreadsheet function itself, at least for certain use cases.

“We never want to be a new Excel and we’re also not aiming to be a new Google Sheets,” says Schünemann, discussing the differences between Layer and Airtable et al. “What Github is to code we want to be to spreadsheets.”

Given that it’s working with the prevailing spreadsheet standard, it’s a productivity play which, should it prove successful, could see tech giants copying or cloning some of its features. Given enough scale, the startup could even end up as an acquisition target for a larger productivity-focused giant wanting to enhance its own product offering. Though the team claims not to have entertained anything but the most passing thoughts of such an exit at this early stage of its business-building journey.

“Right now we are really complementary to both big platforms [Google and Microsoft],” says Schünemann. “However it would be naive for us to think that one or the other feature that we build won’t make it onto the product roadmap of either Microsoft or Google. However our value proposition goes beyond just a single feature. So we really view ourselves as being complementary now and also in the future. Because we don’t push out Excel or Google Sheets from an organization. We augment both.”

“Our biggest competitor right now is probably the ‘we’ve always done it like that’ attitude in companies,” he adds, rolling out the standard early stage startup response when asked to name major obstacles. “Because any company has hacked their processes and tools to make it work for them. Some have built little macros. Some are using Jira or Atlassian tools for their project management. Some have hired people to manage their spreadsheet ensembles for them.”

On the acquisition point, Schünemann also has this to say: “A pre-requisite for any successful exit is building a successful company beforehand and I think we believe we are in a space where there are a couple of interesting exit routes to be taken. And Microsoft and Google are obviously candidates where there would be a very obvious fit but the list goes beyond that — all the file hosting tools like Dropbox or the big CRM tools, Salesforce, could also be interesting for them because it very much integrates into the heart of any organization… But we haven’t gone beyond that simple high level thought of who could acquire us at some point.” 

#apps, #atlassian, #berlin, #business-process-management, #cloud-applications, #collaboration-tools, #crm, #europe, #fundings-exits, #github, #gocardless, #google, #google-sheets, #layer, #matt-robinson, #microsoft, #microsoft-excel, #n26, #productivity, #project-management, #real-estate, #recent-funding, #salesforce, #software-as-a-service, #spreadsheet, #startups, #tc, #web-app, #web-application, #web-applications

‘No code’ will define the next generation of software

It seems like every software funding and product announcement these days includes some sort of reference to “no code” platforms or functionality. The frequent callbacks to this buzzy term reflect a realization that we’re entering a new software era.

Similar to cloud, no code is not a category itself, but rather a shift in how users interface with software tools. In the same way that PCs democratized software usage, APIs democratized software connectivity and the cloud democratized the purchase and deployment of software, no code will usher in the next wave of enterprise innovation by democratizing technical skill sets. No code is empowering business users to take over functionality previously owned by technical users by abstracting complexity and centering around a visual workflow. This profound generational shift has the power to touch every software market and every user across the enterprise.

The average enterprise tech stack has never been more complex

In a perfect world, all enterprise applications would be properly integrated, every front end would be shiny and polished, and internal processes would be efficient and automated. Alas, in the real world, engineering and IT teams spend a disproportionate share of their time fighting fires in security, fixing internal product bugs and running vendor audits. These teams are bursting at the seams, spending an estimated 30% of their resources building and maintaining internal tools, torpedoing productivity and compounding technical debt.

Seventy-two percent of IT leaders now say project backlogs prevent them from working on strategic projects. Hiring alone can’t solve the problem. The demand for technical talent far outpaces supply, as demonstrated by the fact that six out of 10 CIOs expect skills shortages to prevent their organizations from keeping up with the pace of change.

At the same time that IT and engineering teams are struggling to maintain internal applications, business teams keep adding fragmented third-party tools to increase their own agility. In fact, the average enterprise is supporting 1,200 cloud-based applications at any given time. Lacking internal support, business users bring in external IT consultants. Cloud promised easy as-needed software adoption with seamless integration, but the realities of quickly changing business needs have led to a roaring comeback of expensive custom software.

#business-process-management, #cloud, #cloud-computing, #cloud-infrastructure, #column, #developer, #extra-crunch, #market-analysis, #no-code, #optical-character-recognition, #software, #software-as-a-service, #software-development, #startups, #tc, #work

Implement DevSecOps to transform your business to IT-as-code

Conduct an online search and you’ll find close to one million websites offering their own definition of DevSecOps.

Why is it that domain experts and practitioners alike continue to iterate on analogous definitions? Likely, it’s because they’re all correct. DevSecOps is a union between culture, practice and tools providing continuous delivery to the end user. It’s an attitude; a commitment to baking security into the engineering process. It’s a practice; one that prioritizes processes that deliver functionality and speed without sacrificing security or test rigor. Finally, it’s a combination of automation tools; correctly pieced together, they increase business agility.

The goal of DevSecOps is to reach a future state where software defines everything. To get to this state, businesses must realize the DevSecOps mindset across every tech team, implement work processes that encourage cross-organizational collaboration, and leverage automation tools, such as for infrastructure, configuration management and security. To make the process repeatable and scalable, businesses must plug their solution into CI/CD pipelines, which remove manual errors, standardize deployments and accelerate product iterations. Completing this process, everything becomes code. I refer to this destination as “IT-as-code.”
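What “plugging into CI/CD pipelines” looks like varies by stack, but a common pattern is a gate script that the pipeline runs on every commit and that fails the build when automated security checks fail. Here is a minimal sketch assuming a Python codebase with the open-source bandit static analyzer and pip-audit dependency scanner installed; the tool choice is illustrative, not prescriptive:

```python
# Minimal sketch of a CI/CD security gate, in the spirit of DevSecOps.
# Assumes a Python project with bandit and pip-audit installed
# (pip install bandit pip-audit); the specific tools are assumptions.
import subprocess
import sys

CHECKS = [
    ["bandit", "-r", "src", "-ll"],   # static analysis, medium+ severity only
    ["pip-audit"],                    # scan installed deps for known CVEs
]

def run_gate() -> int:
    """Run each check and count how many of them fail."""
    failures = 0
    for cmd in CHECKS:
        print(f"Running: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            failures += 1
    return failures

if __name__ == "__main__":
    # A non-zero exit code makes the CI stage, and therefore the build, fail.
    sys.exit(1 if run_gate() else 0)
```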

Why is DevSecOps important?

Whichever way you cut it, DevSecOps, as a culture, practice or combination of tools, is of increasing importance. Particularly these days, with more consumers and businesses leaning on digital, enterprises find themselves in the irrefutable position of delivering with speed and scale. Digital transformation that would’ve taken years, or at the very least would’ve undergone a period of premeditation, is now urgent and compressed into a matter of months.

The keys to a successful DevSecOps program

Security and operations are a part of this new shift to IT, not just software delivery: A DevSecOps program succeeds when everyone, from security, to operations, to development, is not only part of the technical team but able to share information for repeatable use. Security, often seen as a blocker, will uphold the “secure by design” principle by automating security code testing and reviews, and educating engineers on secure design best practices. Operations, typically reactive to development, can troubleshoot incongruent merges between engineering and production proactively. However, currently, businesses are only familiar with utilizing automation for software delivery. They don’t know what automation means for security or operations. Figuring out how to apply the same methodology throughout the whole program and therefore the whole business is critical for success.

#business-process-management, #column, #computing, #continuous-delivery, #developer, #extra-crunch, #github, #saas, #security, #software-development, #startups, #work

How startups can leverage elastic services for cost optimization

Due to COVID-19, business continuity has been put to the test for many companies in the manufacturing, agriculture, transport, hospitality, energy and retail sectors. Cost reduction is the primary focus of companies in these sectors due to massive losses in revenue caused by this pandemic. The other side of the crisis is, however, significantly different.

Companies in industries such as medical, government and financial services, as well as cloud-native tech startups that are providing essential services, have experienced a considerable increase in their operational demands — leading to rising operational costs. Irrespective of the industry your company belongs to, and whether your company is experiencing reduced or increased operations, cost optimization is a reality for all companies to ensure a sustained existence.

One of the most reliable measures for cost optimization at this stage is to leverage elastic services designed to grow or shrink according to demand, such as cloud and managed services. A modern product with a cloud-native architecture can scale its cloud consumption down automatically to offset lost operational demand. What may not have been obvious to startup leaders is a strategy often employed by incumbent, mature enterprises — achieving cost optimization by leveraging managed services providers (MSPs). MSPs enable organizations to repurpose full-time staff members from impacted operations to more strategic product lines or initiatives.

Why companies need cost optimization in the long run

#business-process-management, #cloud, #cloud-computing, #cloud-infrastructure, #column, #coronavirus, #covid-19, #economy, #enterprise, #extra-crunch, #finance, #financial-services, #growth-and-monetization, #managed-services, #manufacturing, #pricing, #saas, #startups, #subscription-services

Enterprise companies find MLOps critical for reliability and performance

Enterprise startups UiPath and Scale have drawn huge attention in recent years from companies looking to automate workflows, from RPA (robotic process automation) to data labeling.

What’s been overlooked in the wake of such workflow-specific tools has been the base class of products that enterprises are using to build the core of their machine learning (ML) workflows, and the shift in focus toward automating the deployment and governance aspects of the ML workflow.

That’s where MLOps comes in, and its popularity has been fueled by the rise of core ML workflow platforms such as Boston-based DataRobot. The company has raised more than $430 million and reached a $1 billion valuation this past fall serving this very need for enterprise customers. DataRobot’s vision has been simple: enabling a range of users within enterprises, from business and IT users to data scientists, to gather data and build, test and deploy ML models quickly.

Founded in 2012, the company has quietly amassed a customer base that boasts more than a third of the Fortune 50, with triple-digit yearly growth since 2015. DataRobot’s top four industries include finance, retail, healthcare and insurance; its customers have deployed over 1.7 billion models through DataRobot’s platform. The company is not alone, with competitors like H2O.ai, which raised a $72.5 million Series D led by Goldman Sachs last August, offering a similar platform.

Why the excitement? As artificial intelligence pushed into the enterprise, the first step was to go from data to a working ML model, which started with data scientists doing this manually, but today is increasingly automated and has become known as “auto ML.” An auto-ML platform like DataRobot’s can let an enterprise user quickly auto-select features based on their data and auto-generate a number of models to see which ones work best.
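DataRobot’s platform is proprietary, but the core loop described above (try several candidate models, score each and keep the best) can be sketched with scikit-learn. This is a toy illustration on a bundled demo dataset, not DataRobot’s implementation:

```python
# Toy illustration of the auto-ML loop described above: try several models,
# cross-validate each and keep the best. Not DataRobot's implementation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Score every candidate with 5-fold cross-validation and pick the winner.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(f"best model: {best} (mean CV accuracy {scores[best]:.3f})")
```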

As auto ML became more popular, improving the deployment phase of the ML workflow became critical for reliability and performance — and so enters MLOps. It’s quite similar to the way DevOps has improved the deployment of source code for applications. Companies such as DataRobot and H2O.ai, along with other startups and the major cloud providers, are intensifying their efforts to provide MLOps solutions for customers.

We sat down with DataRobot’s team to understand how their platform has been helping enterprises build auto-ML workflows, what MLOps is all about and what’s been driving customers to adopt MLOps practices now.

The rise of MLOps

#artificial-intelligence, #automated-machine-learning, #business-process-management, #column, #data-scientist, #databricks, #datarobot, #developer, #devops, #enterprise, #extra-crunch, #finance, #machine-learning, #market-analysis, #ml, #mlops, #security, #software-development, #workflow

Google Cloud launches new tools for deploying ML pipelines

Google Cloud today announced the beta launch of Cloud AI Platform Pipelines, a new enterprise-grade service that is meant to give developers a single tool to deploy their machine learning pipelines, together with tools for monitoring and auditing them.

“When you’re just prototyping a machine learning (ML) model in a notebook, it can seem fairly straightforward,” Google notes in today’s announcement. “But when you need to start paying attention to the other pieces required to make an ML workflow sustainable and scalable, things become more complex.” And as complexity grows, building a repeatable and auditable process becomes harder.

That, of course, is where Pipelines comes in. It gives developers the ability to build these repeatable processes. As Google notes, there are two parts to the service: the infrastructure for deploying and running those workflows, and the tools for building and debugging the pipelines. The service automates processes like setting up Kubernetes Engine clusters and storage, as well as manually configuring Kubeflow Pipelines. It also uses TensorFlow Extended for building TensorFlow-based workflows and the Argo workflow engine for running the pipelines.

In addition to the infrastructure services, you also get visual tools for building the pipelines, versioning, artifact tracking and more.

With all of this, getting started only takes a few clicks, Google promises, though actually configuring the pipelines isn’t exactly trivial, of course. Google Cloud is adding a bit of complexity (or flexibility, depending on your perspective) here, given that you can use both the Kubeflow Pipelines SDK and the TensorFlow Extended SDK for authoring pipelines.
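For a sense of what authoring a pipeline with the Kubeflow Pipelines SDK looks like, here is a minimal two-step example written against the v1-style kfp Python SDK; the container images and commands are placeholders rather than a real ML workload, and the TensorFlow Extended route looks different:

```python
# Minimal Kubeflow Pipelines (v1-style kfp SDK) example of authoring a
# pipeline; images and commands are placeholders, not a real ML workload.
import kfp
from kfp import dsl

def preprocess_op() -> dsl.ContainerOp:
    return dsl.ContainerOp(
        name="preprocess",
        image="python:3.8-slim",
        command=["python", "-c", "print('preprocessing data')"],
    )

def train_op() -> dsl.ContainerOp:
    return dsl.ContainerOp(
        name="train",
        image="python:3.8-slim",
        command=["python", "-c", "print('training model')"],
    )

@dsl.pipeline(name="demo-pipeline",
              description="Two-step demo: preprocess, then train.")
def demo_pipeline():
    preprocess = preprocess_op()
    train = train_op()
    train.after(preprocess)   # run training only after preprocessing finishes

if __name__ == "__main__":
    # Compile to an Argo workflow spec that the Pipelines service can run.
    kfp.compiler.Compiler().compile(demo_pipeline, "demo_pipeline.yaml")
```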

#business-process-management, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #google, #google-cloud, #kubeflow, #kubernetes, #machine-learning, #tc