Data scientists: Bring the narrative to the forefront

By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.

However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.

The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.

Data alone doesn’t spur innovation — rather, it’s data-driven storytelling that helps uncover hidden trends, powers personalization, and streamlines processes.

Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”

The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.

Make the abstract more tangible

Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.

For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.
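As a worked example of anchoring an abstract figure, the daily-creation estimate above can be restated in the article's own DVD terms. This is an illustrative sketch; the conversion constants (decimal units, roughly 2 GB per hour of DVD-quality video) are assumptions chosen to roughly match the one-exabyte comparison earlier in the piece.

```python
# Illustrative sketch: turn an abstract data volume into relatable
# reference points. Conversion constants are assumptions for the example.

EXABYTE_IN_GB = 1_000_000_000        # 1 EB = 10^9 GB (decimal units)
DVD_QUALITY_GB_PER_HOUR = 2          # rough figure for DVD-quality video

def as_reference_points(exabytes_per_day: float) -> dict:
    """Express a daily data volume as more tangible comparisons."""
    gb = exabytes_per_day * EXABYTE_IN_GB
    video_hours = gb / DVD_QUALITY_GB_PER_HOUR
    return {
        "gigabytes_per_day": gb,
        "dvd_video_years": video_hours / (24 * 365),
    }

print(as_reference_points(463))
```

Reframing "463 exabytes a day" as tens of millions of years of video gives an audience a reference point to hold on to, which is the storytelling move the section argues for.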


Enterprise security attackers are one password away from your worst day

If the definition of insanity is doing the same thing over and over and expecting a different outcome, then one might say the cybersecurity industry is insane.

Criminals continue to innovate with highly sophisticated attack methods, but many security organizations still use the same technological approaches they did 10 years ago. The world has changed, but cybersecurity hasn’t kept pace.

Distributed systems, with people and data everywhere, mean the perimeter has disappeared. And the hackers couldn’t be more excited. The same technology approaches, like correlation rules, manual processes, and reviewing alerts in isolation, do little more than remedy symptoms while hardly addressing the underlying problem.

Credentials are supposed to be the front gates of the castle, but as the SOC is failing to change, it is failing to detect. The cybersecurity industry must rethink its strategy to analyze how credentials are used and stop breaches before they become bigger problems.

It’s all about the credentials

Compromised credentials have long been a primary attack vector, but the problem has only grown worse in the mid-pandemic world. The acceleration of remote work has increased the attack footprint as organizations struggle to secure their network while employees work from unsecured connections. In April 2020, the FBI said that cybersecurity attacks reported to the organization grew by 400% compared to before the pandemic. Just imagine where that number is now in early 2021.

It only takes one compromised account for an attacker to enter Active Directory and create their own credentials. In such an environment, all user accounts should be considered potentially compromised.

Nearly all of the hundreds of breach reports I’ve read have involved compromised credentials. More than 80% of hacking breaches are now enabled by brute force or the use of lost or stolen credentials, according to the 2020 Data Breach Investigations Report. The most effective and commonly used strategy is the credential stuffing attack, where digital adversaries break in with stolen credentials, exploit the environment, then move laterally to gain higher-level access.
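To make the credential-analysis idea concrete, here is a minimal, hypothetical sketch: flag accounts that accumulate failed logins from many distinct source IPs, one common signature of credential-based attacks. The event format and threshold are assumptions for illustration, not any vendor's implementation.

```python
from collections import defaultdict

# Hypothetical sketch: flag accounts seeing failed logins from unusually
# many distinct source IPs. Event format and threshold are assumptions.

def flag_stuffing_targets(events, ip_threshold=3):
    """events: iterable of (account, source_ip, success) tuples."""
    failed_ips = defaultdict(set)
    for account, source_ip, success in events:
        if not success:
            failed_ips[account].add(source_ip)
    return {acct for acct, ips in failed_ips.items() if len(ips) >= ip_threshold}

events = [
    ("alice", "10.0.0.1", False),
    ("alice", "10.0.0.2", False),
    ("alice", "10.0.0.3", False),
    ("bob", "10.0.0.9", False),   # a single mistyped password
    ("bob", "10.0.0.9", True),
]
print(flag_stuffing_targets(events))  # → {'alice'}
```

The point of the sketch is the shift in strategy the section calls for: analyzing how credentials are used over time, rather than reviewing each alert in isolation.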


Should Dell have pursued a more aggressive debt-reduction move with VMware?

When Dell announced it was spinning out VMware yesterday, the move itself wasn’t surprising: there had been public speculation for some time. But Dell could have structured the deal a number of ways; it chose to spin VMware out as a separate company with a special dividend rather than pursue an outright sale.

The dividend route, which involves a payment to shareholders of between $11.5 billion and $12 billion, has the advantage of being tax-free (or at least that’s what Dell hopes as it petitions the IRS). For Dell, which owns 81% of VMware, the dividend translates to somewhere between $9.3 billion and $9.7 billion in cash, which the company plans to use to pay down a portion of the huge debt it still holds from its $58 billion EMC purchase in 2016.

VMware was the crown jewel in that transaction, giving Dell an inroad to the cloud it had lacked prior to the deal. For context, VMware popularized the notion of the virtual machine, a concept that led to the development of cloud computing as we know it today. It has since expanded much more broadly beyond that, giving Dell a solid foothold in cloud native computing.

Dell hopes to have its cake and eat it too with this deal: it generates a large slug of cash to pay down debt while securing a five-year commercial agreement that should keep the two companies closely aligned. Dell CEO Michael Dell will remain chairman of the VMware board, which should help smooth the post-spinout relationship.

But could Dell have extracted more cash out of the deal?

Doing what’s best for everyone

Patrick Moorhead, principal analyst at Moor Insights & Strategy, says that beyond the cash transaction, the deal provides a way for the companies to continue working closely together with the least amount of disruption.

“In the end, this move is more about maximizing the Dell and VMware stock price [in a way that] doesn’t impact customers, ISVs or the channel. Wall Street wasn’t valuing the two companies together nearly as [strongly] as I believe it will as separate entities,” Moorhead said.


Tecton teams with founder of Feast open source machine learning feature store

Tecton, the company that pioneered the notion of the machine learning feature store, has teamed up with the founder of the open source feature store project called Feast. Today the company announced the release of version 0.10 of the open source tool.

The feature store is a concept that the Tecton founders came up with when they were engineers at Uber. Shortly thereafter, an engineer named Willem Pienaar read the founders’ Uber blog posts on building a feature store and went to work building Feast as an open source version of the concept.

“The idea of Tecton [involved bringing] feature stores to the industry, so we build basically the best in class, enterprise feature store. […] Feast is something that Willem created, which I think was inspired by some of the early designs that we published at Uber. And he built Feast and it evolved as kind of like the standard for open source feature stores, and it’s now part of the Linux Foundation,” Tecton co-founder and CEO Mike Del Balso explained.

Tecton later hired Pienaar, who is today an engineer at the company where he leads their open source team. While the company did not originally start off with a plan to build an open source product, the two products are closely aligned, and it made sense to bring Pienaar on board.

“The products are very similar in a lot of ways. So I think there’s a similarity there that makes this somewhat symbiotic, and there is no explicit convergence necessary. The Tecton product is a superset of what Feast has. So it’s an enterprise version with a lot more advanced functionality, but at Feast we have a battle-tested feature store that’s open source,” Pienaar said.

As we wrote in a December 2020 story on the company’s $35 million Series B, it describes a feature store as “an end-to-end machine learning management system that includes the pipelines to transform the data into what are called feature values, then it stores and manages all of that feature data and finally it serves a consistent set of data.”
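That description can be made concrete with a toy sketch of the concept: transform raw data into feature values, store them, and serve a consistent set to a model. This illustrates the idea only; it is not Feast's or Tecton's actual API, and the feature names are invented for the example.

```python
# Toy sketch of the feature-store concept: transform raw data into feature
# values, store them, and serve a consistent row to a model. An
# illustration of the idea, not Feast's or Tecton's actual API.

class ToyFeatureStore:
    def __init__(self):
        self._features = {}  # (entity_id, feature_name) -> value

    def ingest(self, entity_id, raw_events):
        # "Transformation pipeline": derive feature values from raw data.
        self._features[(entity_id, "trip_count")] = len(raw_events)
        self._features[(entity_id, "avg_fare")] = (
            sum(e["fare"] for e in raw_events) / len(raw_events)
        )

    def get_features(self, entity_id, names):
        # Serving: return a consistent set of features for a model.
        return {n: self._features[(entity_id, n)] for n in names}

store = ToyFeatureStore()
store.ingest("driver_42", [{"fare": 10.0}, {"fare": 14.0}])
print(store.get_features("driver_42", ["trip_count", "avg_fare"]))
# → {'trip_count': 2, 'avg_fare': 12.0}
```

The key property, in both the toy and the real systems, is that training and serving read the same stored feature values, so a model sees consistent data in both phases.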

Del Balso says that from a business perspective, contributing to the open source feature store exposes his company to a different group of users, and the commercial and open source products can feed off one another as they build the two products.

“What we really like, and what we feel is very powerful here, is that we’re deeply in the Feast community and get to learn from all of the interesting use cases […] to improve the Tecton product. And similarly, we can use the feedback that we’re hearing from our enterprise customers to improve the open source project. That’s the kind of cross learning, and ideally that feedback loop involved there,” he said.

The plan is for Tecton to continue being a primary contributor with a team inside Tecton dedicated to working on Feast. Today, the company is releasing version 0.10 of the project.


BigEye (formerly Toro) scores $17M Series A to automate data quality monitoring

As companies create machine learning models, operations teams need to ensure the data used for the models is of sufficient quality, a process that can be time-consuming. BigEye (formerly Toro), an early-stage startup, is helping by automating data quality monitoring.

Today the company announced a $17 million Series A led by Sequoia Capital with participation from existing investor Costanoa Ventures. That brings the total raised to $21 million, including the $4 million seed round the startup raised last May.

When we spoke to BigEye CEO and co-founder Kyle Kirwan last May, he said the seed round was going to be focused on hiring a team — they are 11 now — and building more automation into the product, and he says they have achieved that goal.

“The product can now automatically tell users what data quality metrics they should collect from their data, so they can point us at a table in Snowflake or Amazon Redshift or whatever and we can analyze that table and recommend the metrics that they should collect from it to monitor the data quality — and we also automated the alerting,” Kirwan explained.

He says that the company is focusing on data operations issues with model inputs, such as a table that isn’t updating when it’s supposed to, missing rows, or duplicate entries. BigEye can automate alerts for those kinds of issues and speed up the process of getting model data ready for training and production.
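A minimal sketch of what such automated checks might look like (the check logic, field names and thresholds here are assumptions for illustration, not BigEye's implementation):

```python
from datetime import datetime, timedelta

# Illustrative data-quality checks: staleness, missing rows, duplicates.
# Logic and thresholds are assumptions, not BigEye's implementation.

def data_quality_alerts(rows, key, last_updated, expected_min_rows,
                        max_staleness=timedelta(hours=24), now=None):
    now = now or datetime.utcnow()
    alerts = []
    if now - last_updated > max_staleness:
        alerts.append("table is stale: last update exceeds staleness window")
    if len(rows) < expected_min_rows:
        alerts.append("possible missing rows: count below expected minimum")
    keys = [r[key] for r in rows]
    if len(keys) != len(set(keys)):
        alerts.append("duplicate entries detected on key column")
    return alerts

rows = [{"id": 1}, {"id": 2}, {"id": 2}]
print(data_quality_alerts(
    rows, key="id",
    last_updated=datetime(2021, 4, 1),
    expected_min_rows=100,
    now=datetime(2021, 4, 16),
))
```

In this toy run all three checks fire; the automation BigEye describes goes further by recommending which metrics to collect per table in the first place.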

Bogomil Balkansky, the partner at Sequoia who is leading today’s investment, sees the company attacking an important part of the machine learning pipeline. “Having spearheaded the data quality team at Uber, Kyle and Egor have a clear vision to provide always-on insight into the quality of data to all businesses,” Balkansky said in a statement.

As the founding team begins building the company, Kirwan says that building a diverse team is a key goal for them and something they are keenly aware of.

“It’s easy to hire a lot of other people that fit a certain mold, and we want to be really careful that we’re doing the extra work to [understand that just because] it’s easy to source people within our network, we need to push and make sure that we’re hiring a team that has different backgrounds and different viewpoints and different types of people on it because that’s how we’re going to build the strongest team,” he said.

BigEye offers on-premises and SaaS solutions, and while it’s working with paying customers like Instacart, Crux Informatics, and Lambda School, the product won’t be generally available until later in the year.


IBM acquires Italy’s MyInvenio to integrate process mining directly into its suite of automation tools

Automation has become a big theme in enterprise IT, with organizations using RPA, no-code and low-code tools, and other technology to speed up work and bring more insights and analytics into how they do things every day. Today IBM is announcing an acquisition as it hopes to take on a bigger role in providing those automation services: the IT giant has acquired myInvenio, an Italian startup that builds and operates process mining software.

Process mining is the part of the automation stack that tracks data produced by a company’s software, as well as how the software works, in order to provide guidance on what a company could and should do to improve it. In the case of myInvenio, the company’s approach involves making a “digital twin” of an organization to help track and optimize processes. IBM is interested in how myInvenio’s tools are able to monitor data in areas like sales, procurement, production and accounting to help organizations identify what might be better served with more automation, which it can in turn run using RPA or other tools as needed.

Terms of the deal are not being disclosed. It is not clear if myInvenio had any outside investors (we’ve asked and are awaiting a response). This is the second acquisition IBM has made out of Italy. (The first was in 2014, a company called CrossIdeas that now forms part of the company’s security business.)

IBM and myInvenio are not exactly strangers: the two inked a deal as recently as November 2020 to integrate the Italian startup’s technology into IBM’s bigger automation services business globally.

Dinesh Nirmal, GM of IBM Automation, said in an interview that the reason IBM acquired the company was two-fold. First, it lets IBM integrate the technology more closely into the company’s Cloud Pak for Business Automation, which sits on and is powered by Red Hat OpenShift and has other automation capabilities already embedded within it, specifically robotic process automation (RPA), document processing, workflows and decisions.

Second and perhaps more importantly, it will mean that IBM will not have to tussle for priority for its customers in competition with other solution partners that myInvenio already had. IBM will be the sole provider.

“Partnerships are great, but in a partnership you also have the option to partner with others, and when it comes to priority, who decides?” he said. “From the customer perspective, will they work just on our deal, or others first? Now, our customers will get the end result of this… We can bring a single solution to an end user or an enterprise, saying, ‘Look, you have document processing, RPA, workflow, mining.’ That is the beauty of this and what customers will see.”

He said that IBM currently serves customers across a range of verticals including financial, insurance, healthcare and manufacturing with its automation products.

Notably, this is not the first acquisition that IBM has made to build out this stack. Last year, it acquired WDG to expand into robotic process automation.

And interestingly, it’s not even the only partnership that IBM has had in process mining. Just earlier this month, it announced a deal with one of the bigger names in the field, Celonis, a German startup valued at $2.5 billion in 2019.

Ironically, at the time, my colleague Ron wondered aloud why IBM wasn’t just buying Celonis outright in that deal. It’s hard to speculate whether price was one reason. Remember: we don’t know the terms of this acquisition, but given that myInvenio was off the fundraising radar, chances are it was a little less than Celonis’ price tag.

We’ve asked and IBM has confirmed that it will continue to work with Celonis alongside now offering its own native process mining tools.

“In keeping with IBM’s open approach and $1 billion investment in ecosystem, [Global Business Services, IBM’s enterprise services division] works with a broad range of technologies based on client and market demand, including IBM AI and Automation software,” a spokesperson said in a statement. “Celonis focuses on execution management which supports GBS’ transformation of clients’ business processes through intelligent workflows across industries and domains. Specifically, Celonis has deep connectivity into enterprise systems such as Salesforce, SAP, Workday or ServiceNow, so the Celonis EMS platform helps GBS accelerate clients’ transformations and BPO engagements with these ERP platforms.”

Indeed, at the end of the day, companies that offer services, especially suites of services, are working in environments where they have to be open to customers using their own technology, or bringing in something else.

There may have been another force pushing IBM to bring more of this technology in-house, and that’s wider competitive climate. Earlier this year, SAP acquired another European startup in the process mining space, Signavio, in a deal reportedly worth about $1.2 billion. As more of these companies get snapped up by would-be IBM rivals, and those left standing are working with a plethora of other parties, maybe it was high time for IBM to make sure it had its own horse in the race.

“Through IBM’s planned acquisition of myInvenio, we are revolutionizing the way companies manage their process operations,” said Massimiliano Delsante, CEO, myInvenio, who will be staying on with the deal. “myInvenio’s unique capability to automatically analyze processes and create simulations — what we call a ‘Digital Twin of an Organization’ —  is joining with IBM’s AI-powered automation capabilities to better manage process execution. Together we will offer a comprehensive solution for digital process transformation and automation to help enterprises continuously transform insights into action.”


Cado Security locks in $10M for its cloud-native digital forensics platform

As computing systems become bigger and more complex, forensics has become an increasingly important part of how organizations secure them. As the recent SolarWinds breach has shown, it’s not always just a matter of being able to identify data loss, or prevent hackers from coming in in the first place. In cases where a network has already been breached, running a thorough investigation is often the only way to identify what happened, whether a breach is still active, and whether a malicious hacker can strike again.

As a sign of this growing priority, a startup called Cado Security, which has built forensics technology native to the cloud to run those investigations, is announcing $10 million in funding to expand its business.

Cado’s tools today are used directly by organizations, but also security companies like Redacted — a somewhat under-the-radar security startup in San Francisco co-founded by Facebook’s former chief security officer Max Kelly and John Hering, the co-founder of Lookout. It uses Cado to carry out the forensics part of its work.

The funding for London-based Cado is being led by Blossom Capital, with existing investors Ten Eleven Ventures also participating, among others. As another signal of demand, this Series A is coming only six months after Cado raised its seed round.

The task of securing data on digital networks has grown increasingly complex over the years: not only are there more devices, more data and a wider range of configurations and uses around it, but malicious hackers have become increasingly sophisticated in their approaches to needling inside networks and doing their dirty work.

The move to the cloud has also been a major factor. While it has helped a wave of organizations expand and run much bigger computing processes as part of their business operations, it has also increased the so-called attack surface and made investigations much more complicated, not least because a lot of organizations run elastic processes, scaling their capacity up and down: this means when something is scaled down, logs of previous activity essentially disappear.

Cado’s Response product — which works proactively on a network and all of its activity after it’s installed — is built to work across cloud, on-premise and hybrid environments. Currently it’s available for AWS EC2 deployments and Docker, Kubernetes, OpenShift and AWS Fargate container systems, and the plan is to expand to Azure very soon. (Google Cloud Platform is less of a priority at the moment, CEO James Campbell said, since it rarely comes up with current and potential customers.)

Campbell co-founded Cado with Christopher Doman (the CTO) last April, with the concept for the company coming out of their experience working on security services together at PwC, and separately at government organizations (Campbell, in Australia) and AlienVault (the security firm acquired by AT&T). In all of those roles, one persistent issue the two continued to encounter was inadequate forensics data, which is essential for tracking the most complex breaches.

A lot of legacy forensics tools, in particular those tackling the trove of data in the cloud, were based on “processing data with open source and pulling together analysis in spreadsheets,” Campbell said. “There is a need to modernize this space for the cloud era.”

In a typical breach, it can take up to a month to run a thorough investigation to figure out what is going on, since, as Doman describes it, forensics looks at “every part of the disk, the files in a binary system. You just can’t find what you need without going to that level, those logs. We would look at the whole thing.”

However, that posed a major problem. “Having a month with a hacker running around before you can do something about it is just not acceptable,” Campbell added. The result, typically, is that other forensics tools investigate only about 5% of an organization’s data.

The solution — for which Cado has filed patents, the pair said — has essentially involved building big data tools that can automate and speed up the very labor intensive process of looking through activity logs to figure out what looks unusual and to find patterns within all the ones and zeros.
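One simple instance of that pattern-finding idea is ranking log events by rarity so analysts see unusual activity first. This is a hedged illustration of the principle only; real forensics pipelines are far more involved, and the log lines below are invented for the example.

```python
from collections import Counter

# Illustrative sketch: surface the rarest events in an activity log, on
# the theory that high-volume routine events drown out unusual activity.

def rarest_events(log_lines, top_n=3):
    counts = Counter(log_lines)
    return [event for event, _ in sorted(counts.items(), key=lambda kv: kv[1])[:top_n]]

log = (
    ["user login ok"] * 500
    + ["file read /var/www"] * 200
    + ["new admin account created"]         # rare: worth an analyst's time
    + ["outbound transfer to unknown host"]
)
print(rarest_events(log, top_n=2))
```

Automating even this crude triage is what "gives security teams more room to focus on what the hacker is getting up to," in Campbell's phrasing, instead of reading every line by hand.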

“That gives security teams more room to focus on what the hacker is getting up to, the remediation aspect,” Campbell explained.

Arguably, if there were better, faster tracking and investigation technology in place, something like SolarWinds could have been better mitigated.

The plan for the company is to bring in more integrations to cover more kinds of systems, and go beyond deployments that you’d generally classify as “infrastructure as a service.”

“Over the past year, enterprises have compressed their cloud adoption timelines while protecting the applications that enable their remote workforces,” said Imran Ghory, partner at Blossom Capital, in a statement. “Yet as high-profile breaches like SolarWinds illustrate, the complexity of cloud environments makes rapid investigation and response extremely difficult since security analysts typically are not trained as cloud experts. Cado Security solves for this with an elegant solution that automates time-consuming tasks like capturing forensically sound cloud data so security teams can move faster and more efficiently. The opportunity to help Cado Security scale rapidly is a terrific one for Blossom Capital.”


Dell is spinning out VMware in a deal expected to generate over $9B for the company

Dell announced this afternoon that it’s spinning out VMware, a move that has been suspected for some time. Dell acquired VMware as part of the massive $58 billion EMC acquisition (announced as $67 billion) in 2015.

The way the deal works is that Dell plans to offer VMware shareholders a special dividend of between $11.5 billion and $12 billion. As Dell owns approximately 81% of those shares, that would work out to somewhere between $9.3 billion and $9.7 billion coming into Dell’s coffers when the deal closes later this year.

Even when it was part of EMC, VMware had a special status: it operates as a separate entity with its own executive team and board of directors, and its stock has been sold separately as well.

“Both companies will remain important partners, providing Dell Technologies with a differentiated advantage in how we bring solutions to customers. At the same time, Dell Technologies will continue to modernize its core infrastructure and PC businesses and embrace new opportunities through an open ecosystem to grow in hybrid and private cloud, edge and telecom,” Dell CEO Michael Dell said in a statement.

While there is a lot of CEO speak in that statement, it appears to mean that the move is mostly administrative, as the companies will continue to work closely together even after the spinoff is official. Michael Dell will remain chairman of both companies. What’s more, the company plans to use the cash proceeds from the deal to help pay down the massive debt it still has left over from the EMC deal.

The deal is expected to close at the end of this year, but it has to clear a number of regulatory hurdles first. That includes garnering a favorable ruling from the IRS that the deal qualifies for a tax-free spin-off, which seems to be a considerable hurdle for a deal like this.

This is a breaking story. We will have more soon.


PlexTrac raises $10M Series A round for its collaboration-centric security platform

PlexTrac, a Boise, ID-based security service that aims to provide a unified workflow automation platform for red and blue teams, today announced that it has raised a $10 million Series A funding round led by Noro-Moseley Partners and Madrona Venture Group. StageDot0 ventures also participated in this round, which the company plans to use to build out its team and grow its platform.

With this new round, the company, which was founded in 2018, has now raised a total of $11 million, with StageDot0 leading its 2019 seed round.

PlexTrac CEO and President Dan DeCloss


“I have been on both sides of the fence: the specialist who comes in and does the assessment, produces that 300-page report and then comes back a year later to find that some of the critical issues had not been addressed at all. And not because the organization didn’t want to, but because it was lost in that report,” PlexTrac CEO and President Dan DeCloss said. “These are some of the most critical findings for an entity from a risk perspective. By making it collaborative, both red and blue teams are united on the same goal we all share: to protect the network and assets.”

With an extensive career in security that included time as a penetration tester for Veracode and the Mayo Clinic, as well as senior information security advisor for Anthem, among other roles, DeCloss has quite a bit of first-hand experience that led him to found PlexTrac. Specifically, he believes that it’s important to break down the wall between offense-focused red teams and defense-centric blue teams.


“Historically there has been more of a cloak-and-dagger relationship, but those walls are breaking down, and rightfully so; there isn’t that much of that mentality today. People recognize they are on the same mission, whether they are an internal security team or an external team,” he said. “With the PlexTrac platform, the red and blue teams have a better view into the other team’s tactics and techniques, and it makes the whole process an educational exercise for everyone.”

At its core, PlexTrac makes it easier for security teams to produce their reports — and hence frees them up to actually focus on ‘real’ security work. To do so, the service integrates with most of the popular scanners, like Qualys and Veracode, but also with tools like ServiceNow and Jira to help teams coordinate their workflows. All the data flows into real-time reports that then help teams monitor their security posture. The service also features a dedicated tool, WriteupsDB, for managing reusable write-ups to help teams deliver consistent reports for a variety of audiences.
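As a rough illustration of that consolidation workflow, consider merging findings from multiple tools into one severity-ordered report. The field names and severity scale here are assumptions for the example, not PlexTrac's data model.

```python
# Illustrative sketch: consolidate findings from several tools into one
# severity-ordered report. Field names and severity scale are assumptions.

def consolidate(findings_by_tool):
    report = []
    for tool, findings in findings_by_tool.items():
        for f in findings:
            report.append({"source": tool, **f})
    # Highest severity first, so critical issues aren't lost in the report.
    return sorted(report, key=lambda f: f["severity"], reverse=True)

findings = {
    "scanner-a": [{"issue": "outdated TLS", "severity": 3}],
    "scanner-b": [{"issue": "SQL injection", "severity": 5},
                  {"issue": "verbose errors", "severity": 2}],
}
report = consolidate(findings)
print([f["issue"] for f in report])  # → ['SQL injection', 'outdated TLS', 'verbose errors']
```

Ordering by severity is a toy version of the problem DeCloss describes: making sure critical findings don't get "lost in that report."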

“Current tools for planning, executing, and reporting on security testing workflows are either nonexistent (manual reporting, spreadsheets, documents, etc…) or exist as largely incomplete features of legacy platforms,” Madrona’s S. Somasegar and Chris Picardo write in today’s announcement. “The pain point for security teams is real and PlexTrac is able to streamline their workflows, save time, and greatly improve output quality. These teams are on the leading edge of attempting to find and exploit vulnerabilities (red teams) and defend and/or eliminate threats (blue teams).”

 


Upstack raises $50M for its platform and advisory to help businesses plan and buy for digital transformation

Digital transformation has been one of the biggest catchphrases of the past year, with many an organization forced to reckon with aging IT, a lack of digital strategy, or simply the challenges of growth after being faced with newly remote workforces, customers doing everything online and other tech demands.

Now, a startup called Upstack that has built a platform to help those businesses evaluate how to grapple with those next steps — including planning and costing out different options and scenarios, and then ultimately buying solutions — is announcing financing to do some growth of its own.

The New York startup has picked up funding of $50 million, money that it will be using to continue building out its platform and expanding its services business.

The funding is coming from Berkshire Partners, and it’s being described as an “initial investment.” The firm, which makes private equity and late-stage growth investments, typically puts between $100 million and $1 billion into its portfolio companies, so this could end up as a bigger number, especially when you consider the size of the market Upstack is tackling: the cloud and internet infrastructure brokerage industry generates annual revenues “in excess of $70 billion,” the company estimates.

We’re asking about the valuation, but PitchBook notes that the median valuation in its deals is around $211 million. Upstack had previously raised around $35 million.

Upstack today already provides tools to large enterprises, government organizations and smaller businesses to compare offerings and plan out pricing for different scenarios covering a range of IT areas, including private, public and hybrid cloud deployments; data center investments; network connectivity; business continuity; and mobile services. The plan is to bring more categories into the mix, including unified communications and security.

Notably, Upstack itself is profitable and names a lot of customers that are themselves tech companies — they include Cisco, Accenture, cloud storage company Backblaze, Riverbed and Lumen — a mark of how digital transformation and planning for it are not necessarily a core competency even of digital businesses, let alone those that are not technology companies. It says it has helped complete over 3,700 IT projects across 1,000 engagements to date.

“Upstack was founded to bring enterprise-grade advisory services to businesses of all sizes,” said Christopher Trapp, founder and CEO, in a statement. “Berkshire’s expertise in the data center, connectivity and managed services sectors aligns well with our commitment to enabling and empowering a world-class ecosystem of technology solutions advisors with a platform that delivers higher value to their customers.”

The core of Upstack’s proposition is a platform that system integrators, or advisors, plus end users themselves, can use to design and compare pricing for different services and solutions. This is an unsung but critical aspect of the ecosystem: We love to hear and write about all the interesting enterprise technology that is being developed, but the truth of the matter is that buying and using that tech is never just a simple click on a “buy” button.

Even for smaller organizations, buying tech can be a hugely time-consuming task. It involves evaluating different companies and what they have to offer — which can differ widely in the same category, and gets more complex when you start to compare different technological approaches to the same problem.

It also includes the task of designing solutions to fit one’s particular network. And finally, there are the calculations that need to be made to determine the real cost of services once implemented in an organization. The platform also gives users the ability to present their work, which forms a critical part of the evaluating and decision-making process. When you think about all of this, it’s no wonder that so many organizations have opted to follow the “if it ain’t broke, don’t fix it” school of digital strategy.

As technology has evolved, the concept of digital transformation itself has become more complicated, making tools like Upstack’s more in demand both by companies and the people they hire to do this work for them. Upstack also employs a group of about 15 advisors — consultants — who also provide insight and guidance in the procurement process, and it seems some of the funding will also be used to invest in expanding that team.

(Incidentally, the model of balancing technology with human experts is one used by other enterprise startups that are built around the premise of helping businesses procure technology: BlueVoyant, a security startup that has built a platform to help businesses manage and use different security services, also retains advisors who are experts in that field.)

The advisors are part of the business model: Upstack’s customers can either pay Upstack a consulting fee to work with its advisors, or Upstack receives a commission from suppliers that a company ends up using, having evaluated and selected them via the Upstack platform.

The company competes with traditional systems integrators and consultants, but it seems that the fact that it has built a tech platform that some of its competitors also use is one reason why it’s caught the eye of investors, and also seen strong growth.

Indeed, when you consider the breadth of services that a company might use within their infrastructure — whether it’s software to run sales or marketing, or AI to run a recommendation for products on a site, or business intelligence or RPA — it will be interesting to see how and if Upstack considers deeper moves into these areas.

“Upstack has quickly become a leader in a large, rapidly growing and highly fragmented market,” said Josh Johnson, principal at Berkshire Partners, in a statement. “Our experience has reinforced the importance of the agent channel to enterprises designing and procuring digital infrastructure. Upstack’s platform accelerates this digital transformation by helping its advisors better serve their enterprise customers. We look forward to supporting Upstack’s continued growth through M&A and further investment in the platform.”


Meroxa raises $15M Series A for its real-time data platform

Meroxa, a startup that makes it easier for businesses to build the data pipelines to power both their analytics and operational workflows, today announced that it has raised a $15 million Series A funding round led by Drive Capital. Existing investors Root, Amplify and Hustle Fund also participated in this round, which together with the company’s previously undisclosed $4.2 million seed round now brings total funding in the company to $19.2 million.

The promise of Meroxa is that businesses can use a single platform for their various data needs and won’t need a team of experts to build their infrastructure and then manage it. At its core, Meroxa provides a single Software-as-a-Service solution that connects relational databases to data warehouses and then helps businesses operationalize that data.

Image Credits: Meroxa

“The interesting thing is that we are focusing squarely on relational and NoSQL databases into data warehouse,” Meroxa co-founder and CEO DeVaris Brown told me. “Honestly, people come to us as a real-time FiveTran or real-time data warehouse sink. Because, you know, the industry has moved to this [extract, load, transform] format. But the beautiful part about us is, because we do change data capture, we get that granular data as it happens.” And businesses want this very granular data to be reflected inside of their data warehouses, Brown noted, but he also stressed that Meroxa can expose this stream of data as an API endpoint or point it to a Webhook.

The company is able to do this because its core architecture is somewhat different from other data pipeline and integration services that, at first glance, seem to offer a similar solution. Because of this, users can use the service to connect different tools to their data warehouse but also build real-time tools on top of these data streams.

Image Credits: Meroxa

“We aren’t a point-to-point solution,” Meroxa co-founder and CTO Ali Hamidi explained. “When you set up the connection, you aren’t taking data from Postgres and only putting it into Snowflake. What’s really happening is that it’s going into our intermediate stream. Once it’s in that stream, you can then start hanging off connectors and say, ‘Okay, well, I also want to peek into the stream, I want to transfer my data, I want to filter out some things, I want to put it into S3.’”

With this flexibility, Hamidi noted, a lot of the company’s customers start with a pretty standard use case and then quickly expand into other areas as well.
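The intermediate-stream architecture Hamidi describes can be sketched roughly in a few lines. This is a minimal, invented model for illustration only; none of the names or shapes here are Meroxa’s actual API. The point is simply that change events from a source fan out to any number of attached connectors rather than flowing point-to-point to a single sink:

```python
# Sketch of an intermediate stream: consumers "hang off" the same
# event stream instead of the source writing to one fixed destination.
class Stream:
    def __init__(self):
        self.connectors = []

    def attach(self, connector):
        # register another consumer of the same stream
        self.connectors.append(connector)

    def publish(self, event):
        # every attached connector sees every event as it happens
        for connector in self.connectors:
            connector(event)

warehouse, bucket = [], []

stream = Stream()
stream.attach(warehouse.append)  # e.g. a data warehouse sink
# a filtered sink, e.g. only updates routed to object storage
stream.attach(lambda e: bucket.append(e) if e["op"] == "update" else None)

# a change-data-capture event from the source database
stream.publish({"op": "update", "table": "orders", "row": {"id": 42}})

print(len(warehouse), len(bucket))  # 1 1
```

Adding a third destination later is just another `attach` call; the source and existing sinks are untouched, which is the flexibility described above.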

Brown and Hamidi met during their time at Heroku, where Brown was a director of product management and Hamidi a lead software engineer. But while Heroku made it very easy for developers to publish their web apps, there wasn’t anything comparable in the highly fragmented database space. The team acknowledges that there are a lot of tools that aim to solve these data problems, but few of them focus on the user experience.

Image Credits: Meroxa

“When we talk to customers now, it’s still very much an unsolved problem,” Hamidi said. “It seems kind of insane to me that this is such a common thing and there is no ‘oh, of course you use this tool because it addresses all my problems.’ And so the angle that we’re taking is that we see user experience not as a nice-to-have, it’s really an enabler, it is something that enables a software engineer or someone who isn’t a data engineer with 10 years of experience in wrangling Kafka and Postgres and all these things. […] That’s a transformative kind of change.”

It’s worth noting that Meroxa uses a lot of open-source tools but the company has also committed to open-sourcing everything in its data plane as well. “This has multiple wins for us, but one of the biggest incentives is in terms of the customer, we’re really committed to having our agenda aligned. Because if we don’t do well, we don’t serve the customer. If we do a crappy job, they can just keep all of those components and run it themselves,” Hamidi explained.

Today, Meroxa, which the team founded in early 2020, has over 24 employees (and is 100% remote). “I really think we’re building one of the most talented and most inclusive teams possible,” Brown told me. “Inclusion and diversity are very, very high on our radar. Our team is 50% black and brown. Over 40% are women. Our management team is 90% underrepresented. So not only are we building a great product, we’re building a great company, we’re building a great business.”  


Zoho launches new low code workflow automation product

Workflow automation has been one of the key trends this year so far, and Zoho, a company known for its suite of affordable business tools, has joined the parade with a new low code workflow product called Qntrl (pronounced control).

Zoho’s Rodrigo Vaca, who is in charge of Qntrl’s marketing, says that most of the solutions we’ve been seeing are built for larger enterprise customers. Zoho is aiming for the mid-market with a product that requires less technical expertise than traditional business process management tools.

“We enable customers to design their workflows visually without the need for any particular kind of prior knowledge of business process management notation or any kind of that esoteric modeling or discipline,” Vaca told me.

While Vaca says Qntrl could require some technical help to connect a workflow to more complex backend systems like CRM or ERP, it allows a less technical end user to drag and drop the components and then get help to finish the rest.

“We certainly expect that when you need to connect to NetSuite or SAP you’re going to need a developer. If nothing else, the IT guys are going to ask questions, and they will need to provide access,” Vaca said.

He believes this product is putting this kind of tooling in reach of companies that may have been left out of workflow automation for the most part, or which have been using spreadsheets or other tools to create crude workflows. With Qntrl, you drag and drop components, and then select each component and configure what happens before, during and after each step.

What’s more, Qntrl provides a central place for processing and understanding what’s happening within each workflow at any given time, and who is responsible for completing it.

We’ve seen bigger companies like Microsoft, SAP, ServiceNow and others offering this type of functionality over the last year as low code workflow automation has taken center stage in business.

This has become a more pronounced need during the pandemic when so many workers could not be in the office. It made moving work in a more automated workflow more imperative, and we have seen companies moving to add more of this kind of functionality as a result.

Brent Leary, principal analyst at CRM Essentials, says that Zoho is attempting to remove some of the complexity from this kind of tool.

“It handles the security pieces to make sure the right people have access to the data and processes used in the workflows in the background, so regular users can drag and drop to build their flows and processes without having to worry about that stuff,” Leary told me.

Zoho Qntrl is available starting today at $7 per user per month.


Docugami’s new model for understanding documents cuts its teeth on NASA archives

You hear so much about data these days that you might forget that a huge amount of the world runs on documents: a veritable menagerie of heterogeneous files and formats holding enormous value yet incompatible with the new era of clean, structured databases. Docugami plans to change that with a system that intuitively understands any set of documents and intelligently indexes their contents — and NASA is already on board.

If Docugami’s product works as planned, anyone will be able to take piles of documents accumulated over the years and near-instantly convert them to the kind of data that’s actually useful to people.

Because it turns out that running just about any business ends up producing a ton of documents. Contracts and briefs in legal work, leases and agreements in real estate, proposals and releases in marketing, medical charts, etc. Not to mention the various formats: Word docs, PDFs, scans of paper printouts of PDFs exported from Word docs, and so on.

Over the last decade there’s been an effort to corral this problem, but movement has largely been on the organizational side: put all your documents in one place, share and edit them collaboratively. Understanding the document itself has pretty much been left to the people who handle them, and for good reason — understanding documents is hard!

Think of a rental contract. We humans understand when the renter is named as Jill Jackson, that later on, “the renter” also refers to that person. Furthermore, in any of a hundred other contracts, we understand that the renters in those documents are the same type of person or concept in the context of the document, but not the same actual person. These are surprisingly difficult concepts for machine learning and natural language understanding systems to grasp and apply. Yet if they could be mastered, an enormous amount of useful information could be extracted from the millions of documents squirreled away around the world.

What’s up, .docx?

Docugami founder Jean Paoli says they’ve cracked the problem wide open, and while it’s a major claim, he’s one of few people who could credibly make it. Paoli was a major figure at Microsoft for decades, and among other things helped create the XML format — you know all those files that end in x, like .docx and .xlsx? Paoli is at least partly to thank for them.

“Data and documents aren’t the same thing,” he told me. “There’s a thing you understand, called documents, and there’s something that computers understand, called data. Why are they not the same thing? So my first job [at Microsoft] was to create a format that can represent documents as data. I created XML with friends in the industry, and Bill accepted it.” (Yes, that Bill.)

The formats became ubiquitous, yet 20 years later the same problem persists, having grown in scale with the digitization of industry after industry. But for Paoli the solution is the same. At the core of XML was the idea that a document should be structured almost like a webpage: boxes within boxes, each clearly defined by metadata — a hierarchical model more easily understood by computers.
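The “boxes within boxes” idea is easy to see in a toy example. Here is a hypothetical lease fragment, invented purely for illustration (it is not Docugami’s or any real schema), showing how a document represented as hierarchical, metadata-labeled data can be queried by a program instead of string-matched as flat text:

```python
import xml.etree.ElementTree as ET

# A hypothetical lease as hierarchical data: each piece is nested
# inside another and named for what it contains.
doc = """
<lease>
  <parties>
    <renter>Jane Roe</renter>
    <landlord>Acme Properties</landlord>
  </parties>
  <terms>
    <amount currency="USD">20000</amount>
    <duration unit="years">5</duration>
  </terms>
</lease>
"""

root = ET.fromstring(doc)
# Structured questions, not text search: "who is the renter?"
renter = root.findtext("parties/renter")
amount = root.find("terms/amount")
print(renter)                                # Jane Roe
print(amount.text, amount.get("currency"))   # 20000 USD
```

The same query run against a hundred different leases returns a hundred different renters, which is exactly the kind of cross-document consistency the flat-text view of a document cannot offer.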

Illustration showing a document corresponding to pieces of another document.

Image Credits: Docugami

“A few years ago I drank the AI kool-aid, got the idea to transform documents into data. I needed an algorithm that navigates the hierarchical model, and they told me that the algorithm you want does not exist,” he explained. “The XML model, where every piece is inside another, and each has a different name to represent the data it contains — that has not been married to the AI model we have today. That’s just a fact. I hoped the AI people would go and jump on it, but it didn’t happen.” (“I was busy doing something else,” he added, to excuse himself.)

The lack of compatibility with this new model of computing shouldn’t come as a surprise — every emerging technology carries with it certain assumptions and limitations, and AI has focused on a few other, equally crucial areas like speech understanding and computer vision. The approach taken there doesn’t match the needs of systematically understanding a document.

“Many people think that documents are like cats. You train the AI to look for their eyes, for their tails… documents are not like cats,” he said.

It sounds obvious, but it’s a real limitation: advanced AI methods like segmentation, scene understanding, multimodal context, and such are all a sort of hyper-advanced cat detection that has moved beyond cats to detect dogs, car types, facial expressions, locations, etc. Documents are too different from one another, or in other ways too similar, for these approaches to do much more than roughly categorize them.

And as for language understanding, it’s good in some ways but not in the ways Paoli needed. “They’re working sort of at the English language level,” he said. “They look at the text but they disconnect it from the document where they found it. I love NLP people, half my team is NLP people — but NLP people don’t think about business processes. You need to mix them with XML people, people who understand computer vision, then you start looking at the document at a different level.”

Docugami in action

Illustration showing a person interacting with a digital document.

Image Credits: Docugami

Paoli’s goal couldn’t be reached by adapting existing tools (beyond mature primitives like optical character recognition), so he assembled his own private AI lab, where a multi-disciplinary team has been tinkering away for about two years.

“We did core science, self-funded, in stealth mode, and we sent a bunch of patents to the patent office,” he said. “Then we went to see the VCs, and Signalfire basically volunteered to lead the seed round at $10 million.”

Coverage of the round didn’t really get into the actual experience of using Docugami, but Paoli walked me through the platform with some live documents. I wasn’t given access myself and the company wouldn’t provide screenshots or video, saying it is still working on the integrations and UI, so you’ll have to use your imagination… but if you picture pretty much any enterprise SaaS service, you’re 90 percent of the way there.

As the user, you upload any number of documents to Docugami, from a couple dozen to hundreds or thousands. These enter a machine understanding workflow that parses the documents, whether they’re scanned PDFs, Word files, or something else, into an XML-esque hierarchical organization unique to the contents.

“Say you’ve got 500 documents, we try to categorize it in document sets, these 30 look the same, those 20 look the same, those 5 together. We group them with a mix of hints coming from how the document looked, what it’s talking about, what we think people are using it for, etc,” said Paoli. Other services might be able to tell the difference between a lease and an NDA, but documents are too diverse to slot into pre-trained ideas of categories and expect it to work out. Every set of documents is potentially unique, and so Docugami trains itself anew every time, even for a set of one. “Once we group them, we understand the overall structure and hierarchy of that particular set of documents, because that’s how documents become useful: together.”

Illustration showing a document being turned into a report and a spreadsheet.

Image Credits: Docugami

That doesn’t just mean it picks up on header text and creates an index, or lets you search for words. The data that is in the document, for example who is paying whom, how much and when, and under what conditions, all that becomes structured and editable within the context of similar documents. (It asks for a little input to double check what it has deduced.)

It can be a little hard to picture, but now just imagine that you want to put together a report on your company’s active loans. All you need to do is highlight the information that’s important to you in an example document — literally, you just click “Jane Roe” and “$20,000” and “5 years” anywhere they occur — and then select the other documents you want to pull corresponding information from. A few seconds later you have an ordered spreadsheet with names, amounts, dates, anything you wanted out of that set of documents.
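Under an assumed data model (invented here for illustration, not Docugami’s actual output), that highlight-then-extract step amounts to pulling the same named fields out of every document in the set once they have been identified in one example:

```python
# Hypothetical parsed documents: each loan, already structured by the
# ingestion step, exposes the same named fields.
documents = [
    {"borrower": "Jane Roe",  "amount": 20000, "term_years": 5},
    {"borrower": "John Doe",  "amount": 15000, "term_years": 3},
    {"borrower": "Ann Smith", "amount": 30000, "term_years": 7},
]

# The fields the user "highlighted" in the example document.
selected = ["borrower", "amount", "term_years"]

# Build the spreadsheet-like report: one row per document.
rows = [[doc[field] for field in selected] for doc in documents]
for row in rows:
    print(row)
```

The hard part, of course, is the step this sketch assumes away: turning heterogeneous files into documents that share those named fields in the first place.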

All this data is meant to be portable too, of course — there are integrations planned with various other common pipes and services in business, allowing for automatic reports, alerts if certain conditions are reached, automated creation of templates and standard documents (no more keeping an old one around with underscores where the principals go).

Remember, this is all half an hour after you uploaded them in the first place, no labeling or pre-processing or cleaning required. And the AI isn’t working from some preconceived notion or format of what a lease document looks like. It’s learned all it needs to know from the actual docs you uploaded — how they’re structured, where things like names and dates figure relative to one another, and so on. And it works across verticals and uses an interface anyone can figure out in a few minutes. Whether you’re in healthcare data entry or construction contract management, the tool should make sense.

The web interface where you ingest and create new documents is one of the main tools, while the other lives inside Word. There Docugami acts as a sort of assistant that’s fully aware of every other document of whatever type you’re in, so you can create new ones, fill in standard information, comply with regulations, and so on.

Okay, so processing legal documents isn’t exactly the most exciting application of machine learning in the world. But I wouldn’t be writing this (at all, let alone at this length) if I didn’t think this was a big deal. This sort of deep understanding of document types can be found here and there among established industries with standard document types (such as police or medical reports), but have fun waiting until someone trains a bespoke model for your kayak rental service. But small businesses have just as much value locked up in documents as large enterprises — and they can’t afford to hire a team of data scientists. And even the big organizations can’t do it all manually.

NASA’s treasure trove

Image Credits: NASA

The problem is extremely difficult, yet to humans seems almost trivial. You or I could glance through 20 similar documents and compile a list of names and amounts easily, perhaps even in less time than it takes for Docugami to crawl them and train itself.

But AI, after all, is meant to imitate and exceed human capacity, and it’s one thing for an account manager to do monthly reports on 20 contracts — quite another to do a daily report on a thousand. Yet Docugami accomplishes both equally easily — which is where it fits into both the enterprise world, where scaling this kind of operation is crucial, and NASA, which is buried under a backlog of documentation from which it hopes to glean clean data and insights.

If there’s one thing NASA’s got a lot of, it’s documents. Its reasonably well maintained archives go back to its founding, and many important ones are available by various means — I’ve spent many a pleasant hour perusing its cache of historical documents.

But NASA isn’t looking for new insights into Apollo 11. Through its many past and present programs, solicitations, grant programs, budgets, and of course engineering projects, it generates a huge amount of documents — being, after all, very much a part of the federal bureaucracy. And as with any large organization with its paperwork spread over decades, NASA’s document stash represents untapped potential.

Expert opinions, research precursors, engineering solutions, and a dozen more categories of important information are sitting in files searchable perhaps by basic word matching but otherwise unstructured. Wouldn’t it be nice for someone at JPL to get it in their head to look at the evolution of nozzle design, and within a few minutes have a complete and current list of documents on that topic, organized by type, date, author, and status? What about the patent advisor who needs to provide a NIAC grant recipient information on prior art — shouldn’t they be able to pull those old patents and applications up with more specificity than any with a given keyword?

The NASA SBIR grant, awarded last summer, isn’t for any specific work, like collecting all the documents of such and such a type from Johnson Space Center or something. It’s an exploratory or investigative agreement, as many of these grants are, and Docugami is working with NASA scientists on the best ways to apply the technology to their archives. (One of the best applications may be to the SBIR and other small business funding programs themselves.)

Another SBIR grant with the NSF differs in that, while at NASA the team is looking into better organizing tons of disparate types of documents with some overlapping information, at NSF they’re aiming to better identify “small data.” “We are looking at the tiny things, the tiny details,” said Paoli. “For instance, if you have a name, is it the lender or the borrower? The doctor or the patient name? When you read a patient record, penicillin is mentioned, is it prescribed or prohibited? If there’s a section called allergies and another called prescriptions, we can make that connection.”

“Maybe it’s because I’m French”

When I pointed out the rather small budgets involved with SBIR grants and how his company couldn’t possibly survive on these, he laughed.

“Oh, we’re not running on grants! This isn’t our business. For me, this is a way to work with scientists, with the best labs in the world,” he said, while noting many more grant projects were in the offing. “Science for me is a fuel. The business model is very simple – a service that you subscribe to, like Docusign or Dropbox.”

The company is only just now beginning its real business operations, having made a few connections with integration partners and testers. But over the next year it will expand its private beta and eventually open it up — though there’s no timeline on that just yet.

“We’re very young. A year ago we were like five, six people, now we went and got this $10M seed round and boom,” said Paoli. But he’s certain that this is a business that will be not just lucrative but will represent an important change in how companies work.

“People love documents. Maybe it’s because I’m French,” he said, “but I think text and books and writing are critical — that’s just how humans work. We really think people can help machines think better, and machines can help people think better.”


Microsoft goes all in on healthcare with $19.7B Nuance acquisition

When Microsoft announced it was acquiring Nuance Communications this morning for $19.7 billion, you could be excused for doing a Monday morning double take at the hefty price tag.

That’s surely a lot of money for a company on a $1.4 billion run rate, but Microsoft, which has already partnered with the speech-to-text market leader on several products over the last couple of years, saw a company firmly embedded in healthcare and decided to go all in.

And $20 billion is certainly all in, even for a company the size of Microsoft. But 2020 forced us to change the way we do business, from restaurants to retailers to doctors. In fact, the pandemic in particular changed the way we interact with our medical providers. We learned very quickly that you don’t have to drive to an office, wait in a waiting room, then in an exam room, all to see the doctor for a few minutes.

Instead, we can get on the line, have a quick chat and be on our way. It won’t work for every condition, of course — there will always be times the physician needs to see you — but for many appointments, such as reviewing test results or talk therapy, telehealth could suffice.

Microsoft CEO Satya Nadella says that Nuance is at the center of this shift, especially with its use of cloud and artificial intelligence, and that’s why the company was willing to pay the amount it did to get it.

“AI is technology’s most important priority, and healthcare is its most urgent application. Together, with our partner ecosystem, we will put advanced AI solutions into the hands of professionals everywhere to drive better decision-making and create more meaningful connections, as we accelerate growth of Microsoft Cloud in Healthcare and Nuance,” Nadella said in a post announcing the deal.

Microsoft sees this deal doubling what was already a considerable total addressable market to nearly $500 billion. While TAMs always tend to run high, that is still a substantial number.

It also fits with Gartner data, which found that by 2022, 75% of healthcare organizations will have a formal cloud strategy in place. The AI component only adds to that number and Nuance brings 10,000 existing customers to Microsoft including some of the biggest healthcare organizations in the world.

Brent Leary, founder and principal analyst at CRM Essentials, says the deal could provide Microsoft with a ton of health data to help feed the underlying machine learning models and make them more accurate over time.

“There is going to be a ton of health data being captured through telemedicine interactions, and this could create a whole new level of health intelligence,” Leary told me.

That of course could drive a lot of privacy concerns where health data is involved, and it will be up to Microsoft, which just experienced a major breach on its Exchange email server products last month, to assure the public that their sensitive health data is being protected.

Leary says that ensuring data privacy is going to be absolutely key to the success of the deal. “The potential this move has is pretty powerful, but it will only be realized if the data and insights that could come from it are protected and secure — not only protected from hackers but also from unethical use. Either could derail what could be a game changing move,” he said.

Microsoft also seemed to recognize that when it wrote, “Nuance and Microsoft will deepen their existing commitments to the extended partner ecosystem, as well as the highest standards of data privacy, security and compliance.”

We are clearly on the edge of a sea change when it comes to how we interact with our medical providers in the future. COVID pushed medicine deeper into the digital realm in 2020 out of simple necessity. It wasn’t safe to go into the office unless absolutely necessary.

The Nuance acquisition, which is expected to close some time later this year, could help Microsoft shift deeper into the market. It could even bring Teams into it as a meeting tool, but it’s all going to depend on the trust level people have with this approach, and it will be up to the company to make sure that both healthcare providers and the people they serve have that.


Microsoft is acquiring Nuance Communications for $19.7B

Microsoft agreed today to acquire Nuance Communications, a leader in speech-to-text software, for $19.7 billion. Bloomberg broke the story over the weekend that the two companies were in talks.

In a post announcing the deal, the company said this was about increasing its presence in the healthcare vertical, a place where Nuance has done well in recent years. In fact, the company announced the Microsoft Cloud for Healthcare last year, and this deal is about accelerating its presence there. Nuance’s products in this area include Dragon Ambient eXperience, Dragon Medical One and PowerScribe One for radiology reporting.

“Today’s acquisition announcement represents the latest step in Microsoft’s industry-specific cloud strategy,” the company wrote. The acquisition also builds on several integrations and partnerships the two companies have made in the last couple of years.

The company boasts 10,000 healthcare customers, according to information on its website. Those include AthenaHealth, Johns Hopkins, Mass General Brigham and Cleveland Clinic to name but a few, and it was that customer base that attracted Microsoft to pay the price it did to bring Nuance into the fold.

Nuance CEO Mark Benjamin will remain with the company and report to Scott Guthrie, Microsoft’s EVP in charge of the cloud and AI group.

Nuance has a complex history. It went public in 2000 and began buying speech recognition products, including Dragon Dictate from Lernout & Hauspie in 2001. It merged with a company called ScanSoft in 2005. That company began life in 1992 as Visioneer, a scanning company.

Today, the company has a number of products including Dragon Dictate, a consumer and business speech-to-text product that dates back to the early 1990s. It’s also involved in speech recognition, chatbots and natural language processing, particularly in healthcare and other verticals.

The company has 6,000 employees spread across 27 countries. In its most recent earnings report from November 2020, which was for Q4 2020, the company reported $352.9 million in revenue, compared to $387.6 million in the same period a year prior. That’s not the direction a company wants to go in, but it is still a run rate of over $1.4 billion.
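For a sense of how that run rate is derived, here is the arithmetic in a few lines of Python. The revenue figures come from the article; the helper functions are purely illustrative:

```python
# Sanity-check the run-rate math on Nuance's Q4 2020 numbers.
# Figures are from the article; the helpers are just for illustration.

def annualized_run_rate(quarterly_revenue_m: float) -> float:
    """Annualize a single quarter's revenue (in $M)."""
    return quarterly_revenue_m * 4

def yoy_change_pct(current_m: float, prior_m: float) -> float:
    """Year-over-year percentage change between two quarters (in $M)."""
    return (current_m - prior_m) / prior_m * 100

q4_2020 = 352.9   # $M, most recent quarter reported
q4_2019 = 387.6   # $M, same quarter a year prior

print(f"Run rate: ${annualized_run_rate(q4_2020) / 1000:.2f}B")  # ~ $1.41B
print(f"YoY change: {yoy_change_pct(q4_2020, q4_2019):+.1f}%")   # ~ -9.0%
```

That declining quarter is what makes the "not the direction a company wants to go in" caveat, even with a run rate above $1.4 billion.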

At the time of that earnings call, the company also announced it was selling its medical transcription and electronic health record (EHR) Go-Live services to Assured Healthcare Partners and Aeries Technology Group. Company CEO Benjamin said this was about helping the company concentrate on its core speech services.

“With this sale, we will reach an important milestone in our journey towards a more focused strategy of advancing our Conversational AI, natural language understanding and ambient clinical intelligence solutions,” Benjamin said in a statement at the time.

It’s worth noting that Microsoft already has a number of speech recognition and chatbot products of its own, including desktop speech-to-text services in Windows and on Azure, but it took the chance to buy a market leader and go deeper into the healthcare vertical.

The transaction has already been approved by both company boards and Microsoft reports it expects the deal to close by the end of this year, subject to standard regulatory oversight and approval by Nuance shareholders.

This would mark the second-largest purchase by Microsoft ever, surpassed only by the $26.2 billion the company paid for LinkedIn in 2016.


SnackMagic picks up $15M to expand from build-your-own snack boxes into a wider gifting marketplace

The office shutdown at the start of the Covid-19 pandemic last year spurred huge investment in digital transformation and a wave of tech companies helping with that, but there were some distinct losers in the shift, too — specifically those whose business models were predicated on serving the very offices that disappeared overnight. Today, one of the companies that had to make an immediate pivot to keep itself afloat is announcing a round of funding, after finding itself not just growing at a clip, but making a profit as well.

SnackMagic, a build-your-own snack box service, has raised $15 million in a Series A round of funding led by Craft Ventures, with Luxor Capital also participating.

(Both investors have an interesting track record in the food-on-demand space: Most recently, Luxor co-led a $528 million round in Glovo in Spain, while Craft backs/has backed the likes of Cloud Kitchens, Postmates and many more).

The funding comes on the back of a strong year for the company, which hit a $20 million revenue run rate in eight months and turned profitable in December 2020.

Founder and CEO Shaunuk Amin said in an interview that the plan is to use the funding both to continue growing SnackMagic’s existing business and to extend into other gifting categories. Currently, you can ship snacks anywhere in the world, but the customizable boxes are only available in North America, serviced by SnackMagic’s primary warehouse. (Recipients are gifted an amount that they can spend, and they choose what they want in the box themselves, either from SnackMagic’s full menu or from a branded subset that a business has created.) Other locations get pre-packed boxes of snacks for now, but the plan is to gradually extend the pick-and-mix model to more geographies, starting with the U.K.

Alongside this, the company plans to continue widening the categories of items that people can gift each other beyond chocolates, chips, hot sauces and other fun food items, into areas like alcohol, meal kits and non-food items. There’s also scope for expanding to more use cases in areas like corporate gifting, marketing and consumer services, and analytics coming out of its sales.

Amin calls the data that SnackMagic is amassing about customer interest in different brands and products “the hidden gem” of the platform.

“It’s one of the most interesting things,” he said. Brands that want to add their items to the wider pool of products — which today numbers between 700 and 800 items — also get access to a dashboard where they monitor what’s selling, how much stock is left of their own items, and so on. “One thing that is very opaque [in the CPG world] is good data.”

For many of the bigger companies that lack their own direct sales channels, it’s a significantly richer data set than what they typically get from selling items in the average brick and mortar store, or from a bigger online retailer like Amazon. “All these bigger brands like Pepsi and Kellogg not only want to know this about their own products more but also about the brands they are trying to buy,” Amin said. Several of them, he added, have approached his company to partner and invest, so I guess we should watch this space.

SnackMagic’s success comes from a somewhat unintended, unlikely beginning, and it’s a testament to the power of compelling yet extensible technology that can be scaled and repurposed if necessary. In its case, that means personalization technology, logistics management, product inventory and accounting, and lots of data analytics.

The company started out as Stadium, a lunch delivery service in New York City built around a simple observation: when co-workers ordered lunch or dinner together for the office — say, around a team-building event or a late-night working session, or just on a regular work day — people often hankered for different things to eat.

In many cases, people make separate orders for the different items, but that has downsides: if you’re ordering to eat together, things won’t arrive at the same time; if it’s being expensed, the accounting is more complicated; and if you’re thinking about carbon footprints, multiple deliveries are less efficient, too.

Stadium’s solution was a platform that provided access to multiple restaurants’ menus, and people could pick from all of them for a single order. The business had been operating for six years and was really starting to take off.

“We were quite well known in the city, and we had plans to expand, and we were on track for March 2020 being our best month ever,” Amin said. Then, Covid-19 hit. “There was no one left in the office,” he said. Revenue disappeared overnight, since the idea of delivering many items to one place instantly stopped being a need.

Amin said that they took a look at the platform they had built to pick many options (and many different costs, and the accounting that came with that) and thought about how to use that for a different end. It turned out that even with people working remotely, companies wanted to give props to their workers, either just to say hello and thanks, or around a specific team event, in the form of food and treats — all the more so since the snacks that typically stock office canteens and kitchens were no longer there for workers to tap.

It’s interesting, but perhaps also unsurprising, that one of the by-products of our new way of working has been the rise of more services that cater (no pun intended) to people working in more decentralised ways, and that companies exploring how to better reward people in those environments are also seeing a bump.

Just yesterday, we wrote about a company called Alyce raising $30 million for its corporate gifting platform that is also based on personalization — using AI to help understand the interests of the recipient to make better choices of items that a person might want to receive.

Alyce is taking a somewhat different approach to SnackMagic: it’s not holding any products itself, and there is no warehouse but rather a platform that links up buyers with those providing products. And Alyce’s initial audience is different, too: instead of internal employees (the first, but not final, focus for SnackMagic) it is targeting corporate gifting, or presents that sales and marketing people might send to prospects or current clients as a please and thank you gesture.

But you can also see how and where the two might meet in the middle — and compete not just with each other, but the many other online retailers, Amazon and otherwise, plus the consumer goods companies themselves looking for ways of diversifying business by extending beyond the B2C channel.

“We don’t worry about Amazon. We just get better,” Amin said when I asked him about whether he worried that SnackMagic was too easy to replicate. “It might be tough anyway,” he added, since “others might have the snacks but picking and packing and doing individual customization is very different from regular e-commerce. It’s really more like scalable gifting.”

Investors are impressed with the quick turnaround and identification of a market opportunity, and with how quickly the company retooled its tech to make it fit for purpose.

“SnackMagic’s immediate success was due to an excellent combination of timing, innovative thinking and world-class execution,” said Bryan Rosenblatt, principal investor at Craft Ventures, in a statement. “As companies embrace the future of a flexible workplace, SnackMagic is not just a snack box delivery platform but a company culture builder.”


Daily Crunch: KKR invests $500M into Box

Box gets some financial ammunition against an activist investor, Samsung launches the Galaxy SmartTag+ and we look at the history of CryptoPunks. This is your Daily Crunch for April 8, 2021.

The big story: KKR invests $500M into Box

Private equity firm KKR is making an investment into Box that should help the cloud content management company buy back shares from activist investor Starboard Value, which might otherwise have claimed a majority of board seats and forced a sale.

After the investment, Aaron Levie will remain with Box as its CEO, but independent board member Bethany Mayer will become the chair, while KKR’s John Park is joining the board as well.

“The KKR move is probably the most important strategic move Box has made since it IPO’d,” said Alan Pelz-Sharpe of Deep Analysis. “KKR doesn’t just bring a lot of money to the deal, it gives Box the ability to shake off some naysayers and invest in further acquisitions.”

The tech giants

Samsung’s AirTags rival, the Galaxy SmartTag+, arrives to help you find lost items via AR — This is a version of Samsung’s lost-item finder that supports Bluetooth Low Energy and ultra-wideband technology.

Spotify stays quiet about launch of its voice command ‘Hey Spotify’ on mobile — Access to the “Hey Spotify” voice feature is rolling out more broadly, but Spotify isn’t saying anything officially.

Verizon and Honda want to use 5G and edge computing to make driving safer — The two companies are piloting different safety scenarios at the University of Michigan’s Mcity, a test bed for connected and autonomous vehicles.

Startups, funding and venture capital

Norway’s Kolonial rebrands as Oda, bags $265M on a $900M valuation to grow its online grocery delivery business in Europe — Oda’s aim is to provide “a weekly shop” for prices that compete against those of traditional supermarkets.

Tines raises $26M Series B for its no-code security automation platform — Tines co-founders Eoin Hinchy and Thomas Kinsella were both in senior security roles at DocuSign before they left to start their own company in 2018.

Yext co-founder unveils Dynascore, which dynamically synchronizes music and video — This is the first product from Howard Lerman’s new startup Wonder Inventions.

Advice and analysis from Extra Crunch

Four strategies for getting attention from investors — MaC Venture Capital founder Marlon Nichols joined us at TechCrunch Early Stage to discuss his strategies for early-stage investing, and how those lessons can translate into a successful launch for budding entrepreneurs.

How to get into a startup accelerator — Neal Sáles-Griffin, managing director of Techstars Chicago, explains when and how to apply to a startup accelerator.

Understanding how fundraising terms can affect early-stage startups — Fenwick & West partner Dawn Belt breaks down some of the terms that trip up first-time entrepreneurs.

(Extra Crunch is our membership program, which helps founders and startup teams get ahead. You can sign up here.)

Everything else

The Cult of CryptoPunks — Ethereum’s “oldest NFT project” may not actually be the first, but it’s the wildest.

Biden proposes gun control reforms to go after ‘ghost guns’ and close loopholes — President Joe Biden has announced a new set of initiatives by which he hopes to curb the gun violence he described as “an epidemic” and “an international embarrassment.”

Apply to Startup Battlefield at TechCrunch Disrupt 2021 — All you need is a killer pitch, an MVP, nerves of steel and the drive and determination to take on all comers to claim the coveted Disrupt Cup.

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 3pm Pacific, you can subscribe here.


Immersion cooling to offset data centers’ massive power demands gains a big booster in Microsoft

LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.

Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.

The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog. 

While that claim may be true, liquid cooling is a well-known approach to moving heat around to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.

As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of Bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.

It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.

“Air cooling is not enough”

More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.

“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”

For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.

The results, from an energy-consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI, and found that two-phase immersion cooling reduced power consumption for a given server by 5% to 15% (every little bit helps).
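To put that 5% to 15% range in fleet-scale terms, here is a rough back-of-the-envelope sketch in Python. The per-server wattage and fleet size are illustrative assumptions, not Microsoft figures; only the reduction range comes from the company’s reported results:

```python
# Rough envelope math on what a 5-15% per-server power reduction means at
# fleet scale. Server draw and fleet size below are illustrative assumptions;
# only the reduction range comes from Microsoft's reported results.

def fleet_savings_kw(servers: int, watts_per_server: float,
                     reduction: float) -> float:
    """Total power saved (kW) across a fleet at a fractional reduction."""
    return servers * watts_per_server * reduction / 1000

SERVERS = 100_000   # hypothetical fleet size
WATTS = 800.0       # hypothetical draw per server (CPU + GPU + overhead)

low = fleet_savings_kw(SERVERS, WATTS, 0.05)
high = fleet_savings_kw(SERVERS, WATTS, 0.15)
print(f"Savings: {low:,.0f} kW to {high:,.0f} kW")  # 4,000 kW to 12,000 kW
```

Even at the low end, that is megawatts of capacity freed up across a hypothetical fleet of this size, which is why "every little bit helps" at cloud scale.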

Meanwhile, companies like Submer claim they reduce energy consumption by 50%, water use by 99%, and take up 85% less space.

For cloud computing companies, the ability to keep these servers up and running during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.

“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”

At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance however has come at a significant environmental cost.

“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.

Solutions under the sea

If submerging servers in experimental liquids offers one potential solution to the problem — then sinking them in the ocean is another way that companies are trying to cool data centers without expending too much power.

Microsoft has already been operating an undersea data center for the past two years. The company actually trotted out the tech last year as part of a push to aid in the search for a COVID-19 vaccine.

These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.

The liquid cooling project is most similar to Microsoft’s Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes, without any onsite maintenance by people.

In those data centers, nitrogen replaces the engineered fluid, and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.

Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).

Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones’ locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.

With its two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean-cooling tech onto the shore. “We brought the sea to the servers rather than put the datacenter under the sea,” Microsoft’s Alissa said in a company statement.

Ioannis Manousakis, a principal software engineer with Azure (left), and Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development (right), walk past a container at a Microsoft datacenter where computer servers in a two-phase immersion cooling tank are processing workloads. Photo by Gene Twedt for Microsoft.


Quiq acquires Snaps to create a combined customer messaging platform

At first glance, Quiq and Snaps might sound like similar startups — they both help businesses talk to their customers via text messaging and other messaging apps. But Snaps CEO Christian Brucculeri said “there’s almost no overlap in what we do” and that the companies are “almost complete complements.”

That’s why Quiq (based in Bozeman, Montana) is acquiring Snaps (based in New York). The entire Snaps team is joining Quiq, with Brucculeri becoming senior vice president of sales and customer success for the combined organization.

Quiq CEO Mike Myer echoed Brucculeri’s point, comparing the situation to dumping two pieces of a jigsaw puzzle on the floor and discovering “the two pieces fit perfectly.” More specifically, he told me that Quiq has generally focused on customer service messaging, with a “do it yourself, toolset approach.” After all, the company was founded by two technical co-founders, and Myer joked, “We can’t understand why [a customer] can’t just call an API.”

Snaps, meanwhile, has focused more on marketing conversations, and on a managed service approach where it handles all of the technical work for its customers. In addition, Myer said that while Quiq has “really focused on platform aspect from beginning” — building integrations with more than a dozen messaging channels including Apple Business Chat, Google’s Business Messages, Instagram, Facebook Messenger and WhatsApp — it doesn’t have “a deep natural language or conversational AI capability” the way Snaps does.

Myer added that demand for Quiq’s offering has been growing dramatically, with revenue up 300% year-over-year in the last six months of 2020. At the same time, he suggested that the divisions between marketing and customer service are beginning to dissolve, with service teams increasingly given sales goals, and “at younger, more commerce-focused organizations, they don’t have this differentiation between marketing and customer service” at all.

Apparently the two companies were already working together to create a combined offering for direct messaging on Instagram, which prompted broader discussions about how to bring the two products together. Moving forward, they will offer a combined platform for a variety of customers under the Quiq brand. (Quiq’s customers include Overstock.com, West Elm, Men’s Wearhouse and Brinks Home Security, while Snaps’ include Lane Bryant, Live Nation, General Assembly, Clairol and Nioxin.) Brucculeri said this will give businesses one product to manage their conversations across “the full customer journey.”

“The key term you’re hearing is conversation,” Myer added. “It’s not about a ticket or a case or a question […] it’s an ongoing conversation.”

Snaps had raised $11.3 million in total funding from investors including Signal Peak Ventures. The financial terms of the acquisition were not disclosed.


Industry experts bullish on $500M KKR investment in Box, but stock market remains skeptical

When Box announced it was getting a $500 million investment from private equity firm KKR this morning, it was hard not to see it as a positive move for the company. It has been operating under the shadow of Starboard Value, and this influx of cash could give it a way forward independent of the activist investors.

Industry experts we spoke to were all optimistic about the deal, seeing it as a way for the company to regain control while giving it a bushel of cash to make some moves. However, early returns from the stock market were not as upbeat, with the stock price plunging this morning.

Alan Pelz-Sharpe, principal analyst at Deep Analysis, a firm that follows the content management market closely, says that it’s a significant move for Box and opens up a path to expanding through acquisition.

“The KKR move is probably the most important strategic move Box has made since it IPO’d. KKR doesn’t just bring a lot of money to the deal, it gives Box the ability to shake off some naysayers and invest in further acquisitions,” Pelz-Sharpe told me, adding, “Box is no longer a startup; it’s a rapidly maturing company, and organic growth will only take you so far. Inorganic growth is what will take Box to the next level.”

Dion Hinchcliffe, an analyst at Constellation Research, who covers the work from home trend and the digital workplace, sees it similarly, saying the investment allows the company to focus longer term again.

“Box very much needs to expand in new markets beyond its increasingly commoditized core business. The KKR investment will give them the opportunity to realize loftier ambitions long term so they can turn their established market presence into a growth story,” he said.

Pelz-Sharpe says that it also changes the power dynamic after a couple of years of having Starboard pushing the direction of the company.

“In short, as a public company there are investors who want a quick flip and others that want to grow this company substantially before an exit. This move with KKR potentially changes the dynamic at Box and may well put Aaron Levie back in the driver’s seat.”

Josh Stein, a partner at DFJ, early investor in Box and longtime board member, says that it shows Box is moving in the right direction.

“I think it makes a ton of sense. Management has done a great job growing the business and taking it to profitability. With KKR’s new investment, you have two of the top technology investors in the world putting significant capital into going long on Box,” Stein said.

Perhaps Stein’s optimism is warranted. In its most recent earnings report from last month, the company announced revenue of $198.9 million, up 8% year-over-year, with FY2021 revenue closing at $771 million, up 11%. What’s more, the company is cash-flow positive and has offered an optimistic outlook.

“As previously announced, Box is committed to achieving a revenue growth rate between 12-16%, with operating margins of between 23-27%, by fiscal 2024,” the company reiterated in a statement this morning.

Investors remain skeptical, however, with the company’s stock price getting hammered this morning. As of publication the share price was down over 9%. At this point, market investors may be waiting for the next earnings report to see if the company is headed in the right direction. For now, the $500 million certainly gives the company options, regardless of what Wall Street thinks in the short term.


KKR hands Box a $500M lifeline

Box announced this morning that private equity firm KKR is investing $500 million in the company, a move that could help the struggling cloud content management vendor get out from under pressure from activist investor Starboard Value.

The company plans to use the proceeds in what’s called a “Dutch auction”-style sale to buy back shares from certain investors at the price determined by the auction, an activity that should take place after the company announces its next earnings report in May. This would presumably involve buying out Starboard, which took a 7.5% stake in the company in 2019.
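For readers unfamiliar with the mechanism: in a Dutch-auction buyback, shareholders tender shares at prices within a range, and the company accepts offers from the lowest ask upward until its budget is spent, paying every accepted seller the same clearing price. A minimal sketch of that clearing logic in Python (the offers and budget below are invented, not Box’s actual terms):

```python
# Minimal Dutch-auction buyback sketch: accept tenders from the lowest ask
# upward until the budget is exhausted; all accepted sellers receive the
# same clearing price. Offers and budget here are hypothetical.

def clearing_price(offers, budget):
    """offers: list of (ask_price, shares). Returns (price, shares_bought)."""
    bought, price = 0, None
    for ask, shares in sorted(offers):              # cheapest asks first
        # Total shares affordable if everyone accepted so far is paid `ask`.
        affordable = int(budget // ask) - bought
        take = min(shares, max(affordable, 0))
        if take == 0:
            break
        bought += take
        price = ask   # clearing price rises to the highest accepted ask
    return price, bought

offers = [(20.0, 10_000), (21.0, 15_000), (22.0, 30_000)]  # (ask, shares)
price, shares = clearing_price(offers, budget=500_000)     # 21.0, 23,809 shares
```

The key property is the single price: early, lower tenders still get paid the final clearing price, which is what makes tendering at a low ask safe for shareholders.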

Last month Reuters reported that Starboard could be looking to take over a majority of the board seats when the company board meets in June. That could have set them up to take some action, most likely forcing a sale.

While it’s not clear what will happen now, it seems likely that, with this cash, Box will be able to stave off action from Starboard and, with KKR in the picture, take a longer-term view. Box CEO Aaron Levie sees the move as a vote of confidence from KKR in Box’s approach.

“KKR is one of the world’s leading technology investors with a deep understanding of our market and a proven track record of partnering successfully with companies to create value and drive growth. With their support, we will be even better positioned to build on Box’s leadership in cloud content management as we continue to deliver value for our customers around the world,” Levie said in a statement.

Under the terms of the deal, John Park, Head of Americas Technology Private Equity at KKR, will be joining the Box board of directors. The company also announced that independent board member Bethany Mayer will be appointed chairman of the board, effective on May 1st.

Earlier this year, the company bought e-signature startup SignRequest, which could help open up a new set of workflows for the company as it tries to expand its market. With KKR’s backing, it’s not unreasonable to expect that Box, which is cash flow positive, could be taking additional steps to expand the platform in the future.

Box stock was down over 8% premarket, a signal that perhaps Wall Street isn’t thrilled with the announcement, but the cash influx should give Box some breathing room to reset and push forward.


Alyce, an AI-based personalised corporate gifting startup, raises $30M

Swag has a long and patchy history in the world of business. For every hip pair of plaid socks, there are five t-shirts you may never wear, an itchy scarf, a notepad your kids might use, and an ugly mug; and, most likely, thousands of dollars and lots of time invested to make those presents a reality. Now, a startup that has built a service to rethink the concept behind corporate gifts and make them more effective is today announcing a round of funding to continue expanding its business — and one sign that it may be on to something is its progress so far.

Alyce, a Boston startup, has built an AI platform that plugs into the various apps you might use to track your relationships in your working life — sales prospects, business partners, colleagues — and uses that information to personalise gift recommendations for those people. The company has raised a $30 million Series B that it will use to continue building out its platform, signing up more users and hiring more people for its team.

This round is being led by General Catalyst, with Boston Seed Capital, Golden Ventures, Manifest, Morningside and Victress Capital — all previous backers — also participating.

Alyce says that it has grown 300% year-over-year between 2019 and 2020, tackling a corporate gifting and promotional items industry that ASI Market Research estimates is worth around $24.7 billion annually. Its customers today include Adobe’s Marketo, G2, Lenovo, Wex, Invision, DialPad, GrubHub, and 6Sense.

As with so many other apps and services aimed at productivity and people management, Alyce notes that this year of working remotely — which has tested many a relationship and job function, led to massive inbound and outbound digital activity (the screen is where everything gets played out now), and frankly burned a lot of us out — has also given it a new kind of relevance.

“As everyone was flooded with spam last year, unsubscribing soared,” Greg Segall, founder and CEO of Alyce, said in a statement. “When a prospect opts out, that’s forever. It’s clear that both brands and customers crave the same thing – a much more purposeful and relatable way to engage.”

Alyce’s contribution to more quality engagement comes in the form of AI-fueled personalization.

Linking up with the other tools people typically use to track their communications with people — they include Marketo, Salesforce, Vidyard and Google’s email and calendar apps — the system has been built with algorithms that read details from those apps to build a picture of the preferences and tastes of the intended gift recipient. It then uses that to come up with a list of items that might appeal to that person from a wider list it has compiled of some 10,000 items in all. (And yes, these can also include more traditional corporate swag items like those socks or mugs.) Then, instead of sending an actual gift, “Swag Select”, as Alyce’s service is called, sends a gift code that lets the recipient redeem an item of their own choosing from a personalised, narrowed-down list.

Alyce itself doesn’t actually hold or distribute the presents: it connects up with third parties that send these out. (It prices its service based on how much it is used, and how many more tools a user might want to have to personalise and send out gifts.)

Yes, you might argue that a lot of this sounds actually very impersonal — the gift giver is not directly involved in the selection or sending of a present at all, which instead is “selected” by way of AI. Essentially, this is a variation of the personalization and recommendation technology that has been built to serve ads, suggest products to you on e-commerce sites, and more.

But on the other hand, it’s an interesting solution to the problem of figuring out what to get someone, which can be a challenge even when you know a person well, and harder still when you don’t, while at the same time helping to create and fulfill a gesture that, at the end of the day, is about being thoughtful of the recipient, not really about the gift itself.

(You could also argue, I think, that since the gift lists are based on a person’s observations about the recipient, there are in fact some personal touches here, even if they have been run through an algorithmic mill before getting to you.)

And ultimately, the aim of these gifts is to say “thank you for this work relationship, which I appreciate”, or “please buy more printer paper from me” — not “I’m sorry for being rude to you at dinner last night.” Although… if this works as it should, there might well be an opportunity to extend the model to more use cases: for example, brands looking for ways to change up their direct mail marketing campaigns, or yes, people who want to patch things up after a spat the night before.

Notably, General Catalyst is indeed interested in the bigger gifting category, pointing to the potential for this service to be scaled in the future.

“At General Catalyst, we are proud to lead the latest round of funding for Alyce as the company has reimagined the gifting category with technology and impact. The ability to deliver products and experiences that both the giver and recipient feel good about is incredibly powerful,” said Larry Bohn, Managing Director at General Catalyst, in a statement.

#alyce, #artificial-intelligence, #corporate-gifts, #ecommerce, #enterprise, #gifting, #marketing, #sales-and-marketing, #swag, #tc


EHR startup Canvas Medical raises $17M and partners with insurance heavyweight Anthem

Canvas Medical, an electronic health records (EHR) startup, today announced its $17 million Series A and a new partnership with Anthem, one of the biggest health insurance companies in the country.

The round was co-led by Inspired Capital and IA Ventures, with participation from Upfront Ventures. This round brings the company’s total funding to date to $20 million. 

The San Francisco-based company, which launched in 2015, aims to help doctors experience a more efficient — and painless — approach to delivering value-based care by offering an EHR platform that promises “80% fewer clicks, 3x faster workflows, and the ability to truly work on one screen,” said Andrew Hines, the company’s CEO and founder.

Andrew Hines. Image Credits: Canvas Medical

Value-based care is a delivery model where providers are paid based on patient health outcomes as opposed to the traditional pay-per-service model where doctors are reimbursed per visit.

We’ve seen a transition in the U.S. toward value-based care over the last several years, and that shift is also being reflected in how doctors are getting reimbursed. As a result, existing EHR companies find themselves having to add bells and whistles to their platforms, which in turn has compromised the doctor’s workflow experience.

“What has happened over time is we have asked our clinicians to become sophisticated coders. They are clicking through screens that are cluttered, that are not designed with human factors in mind,” said Steve Strongwater in Catalyst, a journal on innovation in care delivery published by the New England Journal of Medicine. Strongwater is a physician and the CEO of Atrius Health in Boston.

“Current EHRs are a workplace hazard from an ergonomics perspective,” said Hines. “It’s like if you sit in the wrong chair day in and day out, your back is going to hurt.” 

While technology has made many people’s jobs easier, that’s not the case for doctors. Studies have shown that EHRs are actually a source of physician burnout in the U.S., which is in and of itself a problem of national concern. 

The EHR market is extremely fragmented (there are several hundred EHR companies in the U.S.) which makes sharing medical records between physicians a challenge. Because health insurance claims contain significant medical information, insurance companies are a reliable alternative source for a lot of the important data about their members. But if a doctor needs to access that information for treatment purposes – which they have to do regularly – they have to log into a different portal or access a different report depending on each patient’s insurance. That’s one of the problems Canvas aims to solve, and their partnership with Anthem is just the beginning.

While there’s often a major amount of inertia — and associated cost — with changing EHRs, Hines, a data scientist-turned-entrepreneur, says the company assuages these concerns by leading its sales efforts with its numbers.

“Doctors who use Canvas experience 30% more productivity in the first month and are able to save 1-2 hours a day charting — which allows them to see more patients or go home early,” he added.

 

#anthem, #apps, #canvas-medical, #cloud, #electronic-health-records, #enterprise, #funding, #health, #ia-ventures, #inspired-capital, #recent-funding, #saas, #startups, #tc, #venture-capital


Okta expands into privileged access management and identity governance reporting

Okta today announced it was expanding its platform into a couple of new areas. Up to this point, the company has been known for its identity access management product, giving companies the ability to sign into multiple cloud products with a single sign-on. Today, the company is moving into two new areas: privileged access and identity governance.

Privileged access gives companies the ability to provide access on an as-needed basis to a limited number of people for key administrative services inside a company. This could be your database, your servers or any part of your technology stack that is highly sensitive, where you want to tightly control who can access those systems.

Okta CEO Todd McKinnon says that Okta has always been good at locking down general users’ access to cloud services like Salesforce, Office 365 and Gmail. What these cloud services have in common is that you access them via a web interface.

Administrators access these specialty accounts using different protocols. “It’s something like secure shell, or you’re using a terminal on your computer to connect to a server in the cloud, or it’s a database connection where you’re actually logging in with a SQL connection, or you’re connecting to a container which is the Kubernetes protocol to actually manage the container,” McKinnon explained.

Privileged access offers a couple of key features including the ability to limit access to a given time window and to record a video of the session so there is an audit trail of exactly what happened while someone was accessing the system. McKinnon says that these features provide additional layers of protection for these sensitive accounts.
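To make the model concrete, here is a minimal Python sketch of how time-boxed privileged access with an audit trail can work in principle. This is an illustration of the concept only, not Okta’s API; every name in it is hypothetical:

```python
from datetime import datetime, timedelta

class PrivilegedGrant:
    """Hypothetical time-boxed access grant with an audit trail (not Okta's API)."""

    def __init__(self, user, resource, duration_minutes):
        self.user = user
        self.resource = resource
        self.expires_at = datetime.utcnow() + timedelta(minutes=duration_minutes)
        self.audit_log = []  # a real system would use tamper-evident storage

    def access(self, action, now=None):
        now = now or datetime.utcnow()
        allowed = now < self.expires_at
        # Every attempt is recorded, allowed or not, for later review.
        self.audit_log.append((now.isoformat(), self.user, self.resource, action, allowed))
        return allowed

grant = PrivilegedGrant("admin@example.com", "prod-db", duration_minutes=30)
grant.access("SELECT * FROM users")          # True: the window is still open
late = datetime.utcnow() + timedelta(hours=1)
grant.access("DROP TABLE users", now=late)   # False: window expired, but still logged
```

In a production system the grant would be issued through an approval workflow and the log shipped to tamper-evident storage; the sketch only shows why the two features (a hard expiry and an always-on record of every attempt) complement each other.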

He says that it will be fairly trivial to carve out these accounts because Okta already has divided users into groups and can give these special privileges to only those people in the administrative access group. The challenge was figuring out how to get access to these other kinds of protocols.

The governance piece provides a way for security operations teams to run detailed reports and look for issues related to identity. “Governance provides exception reporting so you can give that to your auditors, and more importantly you can give that to your security team to make sure that you figure out what’s going on and why there is this deviation from your stated policy,” he said.

All of this when combined with the $6.5 billion acquisition of Auth0 last month is part of a larger plan by the company to be what McKinnon calls the identity cloud. He sees a market with several strategic clouds and he believes identity is going to be one of them.

“Because identity is so strategic for everything, it’s unlocking your customer access, it’s unlocking your employee access, it’s keeping everything secure. And so this expansion, whether it’s customer identity with zero trust or whether it’s doing more on the workforce identity with not just access, but privileged access and identity governance. It’s about identity evolving in this primary cloud,” he said.

While both of these new products were announced today at the company’s virtual Oktane customer conference, they won’t be generally available until the first quarter of next year.

#cloud, #enterprise, #identity, #okta, #security, #tc, #todd-mckinnon


Swyft raises $17.5 million to bring same-day delivery to all the retailers that aren’t Amazon

Thanks to major players like Amazon and Walmart, we’ve become accustomed to next- or same-day delivery. But the pandemic has also renewed our interest in buying from smaller businesses and retailers.

Swyft, a company that has just raised $17.5 million in Series A, helps retailers of any size provide affordable same-day delivery. The round was co-led by Inovia Capital and Forerunner Ventures, with participation from Shopify and existing investors Golden Ventures and Trucks VC.

Swyft is a marketplace, connecting a network of shipping carriers with vendors. But the company also provides software to those carriers to make them more efficient, and turns them into a vast network that allows them to pick up more inventory without adding to their infrastructure.

In other words, several regional carriers may play a part in delivering a parcel shipped via Swyft without making any big changes to their original routes or adding new drivers, trucks, etc.

To date, major players in both shipping and retail have dominated this space, thanks in large part to their ability to deliver quickly. Swyft is looking to amass an army, for lack of a better term, made up of all the smaller players, including mom-and-pop retailers and vendors as well as smaller, regional carriers. Banded together through software, these carriers and retailers can match the scale and influence of the behemoths without spending a fortune.

Swyft was co-founded by Aadil Kazmi (CEO), Zeeshan Hamid (Head of Engineering), and Maraz Rahman (Head of Sales). Kazmi and Hamid both spent their careers at Amazon, working on data and last-mile operations for the behemoth. Rahman was an early employee at a YC-backed proptech startup.

The trio started asking themselves early last year why retailers weren’t able to offer same-day delivery and chose to tackle the gap they discovered.

The key ingredient to Swyft is not its aggregation of couriers, but the software it provides to them. Because Swyft is increasing demand for these carriers, it also needs to make them more efficient. The back-end software allows carriers to digitize or automate a good deal of what they’re traditionally doing by hand.

CEO Aadil Kazmi says that Swyft is able to come in anywhere between 25 and 30 percent cheaper than the incumbent option.

“I don’t know what percent of your purchases are from Amazon, but for me it’s like 150 percent,” said Eurie Kim, general partner at Forerunner Ventures. “I’d prefer to buy elsewhere with the pandemic, and support local and independent brands, but Amazon’s trained us all to have fast and free shipping. It feels like an opportunity where the consumer experience is really lacking and the burden on merchants and retailers is extremely heavy.”

Swyft currently has 16 full-time employees. Twelve percent are female and 75 percent are people of color, according to the company.

Since April 2020, Swyft has facilitated the delivery of more than 180,000 packages, and expanded gross margin from 78 percent to 82 percent, thanks in large part to revenue from the software side of the business and a zero-asset model.

#enterprise, #eurie-kim, #forerunner-ventures, #recent-funding, #startups, #tc


Pathlight, a performance management tool for customer-facing teams and the individuals in them, raises $25M

The longer we continue to work with either all or part of our teams in remote, out-of-physical-office environments, the more imperative it becomes for those teams to have some tools in place to keep the channels of communication and management open, and for the individuals in those teams to have a sense of how well they are performing. Today, one of the startups that provides a team productivity app with that in mind is announcing a round of funding to fuel its growth.

Pathlight, which has built a performance management platform for customer-facing teams — sales, field service and support — to help managers and employees themselves to track and analyze how they are doing, to coach them when and where it’s needed, and to communicate updates and more, has picked up $25 million — money that it will be using to continue growing its customer base and the functionality across its app.

The funding is being led by Insight Partners, with previous backers Kleiner Perkins and Quiet Capital also participating, alongside Uncorrelated Ventures; Jeremy Stoppelman, CEO of Yelp; David Glazer, CFO of Palantir; and Michael Ovitz, co-founder of CAA and Owner of Broad Beach Ventures. Pathlight has now raised $35 million.

Pathlight today provides users with a range of tools to visualize team and individual performance across various parameters set by managers, using data that teams integrate from other platforms like Salesforce, Zendesk and Outreach, among others.

Using that data and specific metrics for the job in question, managers can then initiate conversations with individuals to focus in on specific areas where things need attention, and provide some coaching to help fix it. It can also be used to provide team-wide updates and encouragement, which sits alongside whatever other tools a person might use in their daily customer-facing work.
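As a rough illustration of the kind of roll-up such a platform performs (a sketch, not Pathlight’s actual logic; the data shape and field names are invented), per-rep metrics can be computed from raw ticket data and compared against a manager-set target:

```python
from statistics import mean

# Hypothetical ticket records pulled in from a helpdesk integration.
tickets = [
    {"rep": "ana", "resolved": True,  "csat": 5},
    {"rep": "ana", "resolved": True,  "csat": 4},
    {"rep": "ben", "resolved": False, "csat": 2},
    {"rep": "ben", "resolved": True,  "csat": 3},
]

def rep_metrics(tickets):
    """Aggregate resolution rate and average CSAT score per rep."""
    by_rep = {}
    for t in tickets:
        by_rep.setdefault(t["rep"], []).append(t)
    return {
        rep: {
            "resolution_rate": mean(1.0 if t["resolved"] else 0.0 for t in ts),
            "avg_csat": mean(t["csat"] for t in ts),
        }
        for rep, ts in by_rep.items()
    }

def needs_coaching(metrics, csat_floor=3.5):
    """Flag reps whose average CSAT falls below a target set by the manager."""
    return [rep for rep, m in metrics.items() if m["avg_csat"] < csat_floor]

metrics = rep_metrics(tickets)
print(needs_coaching(metrics))  # ['ben']
```

The value of the product is presumably in the integrations, dashboards and coaching workflow layered on top, but the core loop is this: normalize data from many systems, compute per-person metrics against goals, and surface who needs attention.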

Since launching in March 2020, the startup has picked up good traction, with customers including Twilio, Earnin, Greenhouse, and CLEAR. But perhaps even more importantly, the pandemic and resulting switch to remote work has underscored how necessary tools like Pathlight’s have become: the startup says that engagement on its platform has shot up 300% in the last 12 months.

Alexander Kvamme, the CEO of Pathlight, said that he first became aware of the challenges of communicating across customer-facing teams, and having transparency on how they are doing as individuals and as a group, when he was at Yelp. Yelp had acquired his startup, reservations service SeatMe, and used the acquisition to build and run Yelp Reservations.

He quickly realized there weren’t really effective tools to see how individuals on the sales team were doing, how they were performing against the goals the company wanted to achieve based on the sales data it already had in other systems, how to work more effectively with people to communicate when something needed changing, and how to tailor all of that to new variations in the formula — in Yelp’s case, how to sell new products like a reservations service alongside advertising and other Yelp services for businesses.

“Whether it’s five or 3,000 people, the problem doesn’t go away,” he said. “Everyone uses their own systems, and it hurts front line employees when they don’t know how they are doing, or don’t get recognition when they are doing well, or don’t get coaching when they are not. Our thesis was that if software is eating the world, and you as a company are buying more software and analytics, over time managers will be more like data analysts. So we are providing a way for managers to be more data-driven.”

Five years down the line, Kvamme got the bug again to start a company and decided to return to that problem, teaming up with co-founder Trey Doig, the engineer who designed SeatMe and then turned it into Yelp Reservations and is now Pathlight’s CTO.

As they see it, the challenge has still not really been addressed. That’s not to say there aren’t a number of companies — competitors to Pathlight — looking to fill that gap as well. Another people management platform called Lattice last year picked up $45 million (I’m guessing it will be raising money again around about now); HubSpot, Zoho, SalesLoft and a number of others are also taking different approaches to the same challenge: front-line customer-facing people spend the majority of their time and attention interacting with people, so there need to be better tools in place to help them figure out how to make that communication more effective, and to figure out what is working and what is not.

And all of this, of course, is not at all new: it’s not like we all woke up one day and suddenly wanted to know how we are doing at work, or managers suddenly felt they needed to communicate with staff.

What has changed, however, is how we work: many of us have not seen the inside of our offices for more than a year at this point, and for a large proportion of us, we may never return again, or if we do it will be under different circumstances.

All of this means that some of the more traditional indicators of our performance, the praise, the management relationships and the learning from teammates, are simply not there anymore.

In customer-facing areas like sales, support and field service, that lack of contact may be even more acute, since many of the teams working in these environments have long relied on huddles and communication throughout the day, week and month to continuously tweak work and improve it. So while tools like Pathlight’s will be useful as data analytics provision for teams regardless of how we work, it can be argued that they are even more important right now.

“I think people have started to realize that if you can empower the front line to be more independent, your numbers will go up and do better,” Kvamme said.

This is part of what went into the investment decision made here.

“With the acceleration of digital transformation across the enterprise, it’s not enough to rethink the way we work—we must also rethink the way we manage,” said Jeff Lieberman, MD at Insight Partners. “Pathlight is ushering in a new age of data-driven management, an ethos that we believe every enterprise will need to embrace—quickly. We are excited to partner with the Pathlight team as they bring their powerful platform to companies across the world.”

#customer-service, #customer-support, #enterprise, #funding, #pathlight, #performance-management, #sales


Esri brings its flagship ArcGIS platform to Kubernetes

Esri, the geographic information system (GIS), mapping and spatial analytics company, is hosting its (virtual) developer summit today. Unsurprisingly, it is making a couple of major announcements at the event that range from a new design system and improved JavaScript APIs to support for running ArcGIS Enterprise in containers on Kubernetes.

The Kubernetes project was a major undertaking for the company, Esri Product Managers Trevor Seaton and Philip Heede told me. Traditionally, like so many similar products, ArcGIS was architected to be installed on physical boxes, virtual machines or cloud-hosted VMs. And while it doesn’t really matter to end-users where the software runs, containerizing the application means that it is far easier for businesses to scale their systems up or down as needed.

Esri ArcGIS Enterprise on Kubernetes deployment

“We have a lot of customers — especially some of the larger customers — that run very complex questions,” Seaton explained. “And sometimes it’s unpredictable. They might be responding to seasonal events or business events or economic events, and they need to understand not only what’s going on in the world, but also respond to their many users from outside the organization coming in and asking questions of the systems that they put in place using ArcGIS. And that unpredictable demand is one of the key benefits of Kubernetes.”

Deploying Esri ArcGIS Enterprise on Kubernetes

The team could have chosen to go the easy route and put a wrapper around its existing tools to containerize them and call it a day, but as Seaton noted, Esri used this opportunity to re-architect its tools and break them down into microservices.

“It’s taken us a while because we took three or four big applications that together make up [ArcGIS] Enterprise,” he said. “And we broke those apart into a much larger set of microservices. That allows us to containerize specific services and add a lot of high availability and resilience to the system without adding a lot of complexity for the administrators — in fact, we’re reducing the complexity as we do that and all of that gets installed in one single deployment script.”

While Kubernetes simplifies a lot of the management experience, a lot of companies that use ArcGIS aren’t yet familiar with it. And as Seaton and Heede noted, the company isn’t forcing anyone onto this platform. It will continue to support Windows and Linux just like before. Heede also stressed that it’s still unusual — especially in this industry — to see a complex, fully integrated system like ArcGIS being delivered in the form of microservices and multiple containers that its customers then run on their own infrastructure.

Image Credits: Esri

In addition to the Kubernetes announcement, Esri also today announced new JavaScript APIs that make it easier for developers to create applications that bring together Esri’s server-side technology and the scalability of doing much of the analysis on the client-side. Back in the day, Esri would support tools like Microsoft’s Silverlight and Adobe/Apache Flex for building rich web-based applications. “Now, we’re really focusing on a single web development technology and the toolset around that,” Esri product manager Julie Powell told me.

A bit later this month, Esri also plans to launch its new design system to make it easier and faster for developers to create clean and consistent user interfaces. This design system will launch April 22, but the company already provided a bit of a teaser today. As Powell noted, the challenge for Esri is that its design system has to help the company’s partners to put their own style and branding on top of the maps and data they get from the ArcGIS ecosystem.

 

#computing, #developer, #enterprise, #esri, #gis, #javascript, #kubernetes, #linux, #microsoft-windows, #software, #tc, #vms


Aporia raises $5M for its AI observability platform

Machine learning (ML) models are only as good as the data you feed them. That’s true during training, but also once a model is put in production. In the real world, the data itself can change as new events occur and even small changes to how databases and APIs report and store data could have implications on how the models react. Since ML models will simply give you wrong predictions and not throw an error, it’s imperative that businesses monitor their data pipelines for these systems.

That’s where tools like Aporia come in. The Tel Aviv-based company today announced that it has raised a $5 million seed round for its monitoring platform for ML models. The investors are Vertex Ventures and TLV Partners.

Image Credits: Aporia

Aporia co-founder and CEO Liran Hason, after five years with the Israel Defense Forces, previously worked on the data science team at Adallom, a security company that was acquired by Microsoft in 2015. After the sale, he joined venture firm Vertex Ventures before starting Aporia in late 2019. But it was during his time at Adallom that he first encountered the problems that Aporia is now trying to solve.

“I was responsible for the production architecture of the machine learning models,” he said of his time at the company. “So that’s actually where, for the first time, I got to experience the challenges of getting models to production and all the surprises that you get there.”

The idea behind Aporia, Hason explained, is to make it easier for enterprises to implement machine learning models and leverage the power of AI in a responsible manner.

“AI is a super powerful technology,” he said. “But unlike traditional software, it highly relies on the data. Another unique characteristic of AI, which is very interesting, is that when it fails, it fails silently. You get no exceptions, no errors. That becomes really, really tricky, especially when getting to production, because in training, the data scientists have full control of the data.”

But as Hason noted, a production system may depend on data from a third-party vendor and that vendor may one day change the data schema without telling anybody about it. At that point, a model — say for predicting whether a bank’s customer may default on a loan — can’t be trusted anymore, but it may take weeks or months before anybody notices.

Aporia constantly tracks the statistical behavior of the incoming data and when that drifts too far away from the training set, it will alert its users.
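The underlying idea is simple enough to sketch in a few lines of Python (an illustration of the general technique, not Aporia’s implementation): capture summary statistics of a feature at training time, then flag incoming batches whose statistics drift too far from that baseline.

```python
from statistics import mean, stdev

class DriftMonitor:
    """Alert when a feature's incoming mean drifts too far from the training baseline."""

    def __init__(self, training_values, threshold=3.0):
        self.baseline_mean = mean(training_values)
        self.baseline_std = stdev(training_values)
        self.threshold = threshold  # measured in baseline standard deviations

    def check(self, batch):
        # z-score of the batch mean relative to the training distribution
        drift = abs(mean(batch) - self.baseline_mean) / self.baseline_std
        return drift > self.threshold

monitor = DriftMonitor(training_values=[10, 11, 9, 10, 12, 10, 11])
print(monitor.check([10, 11, 10]))   # False: batch looks like the training data
print(monitor.check([55, 60, 58]))   # True: likely a schema or upstream change
```

A production monitor would track many features at once and use more robust statistics (distribution-level tests rather than a single mean), but the alerting logic follows this same baseline-versus-batch comparison.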

One thing that makes Aporia unique is that it gives its users an almost IFTTT- or Zapier-like graphical tool for setting up the logic of these monitors. It comes pre-configured with more than 50 combinations of monitors and provides full visibility into how they work behind the scenes. That, in turn, allows businesses to fine-tune the behavior of these monitors for their own specific business case and model.

Initially, the team thought it could build generic monitoring solutions. But it realized that this would not only be a very complex undertaking, but also that the data scientists who build the models know exactly how those models should work and what they need from a monitoring solution.

“Monitoring production workloads is a well-established software engineering practice, and it’s past time for machine learning to be monitored at the same level,” said Rona Segev, founding partner at TLV Partners. “Aporia’s team has strong production-engineering experience, which makes their solution stand out as simple, secure and robust.”

 

#adallom, #aporia, #artificial-intelligence, #enterprise, #machine-learning, #microsoft, #ml, #recent-funding, #startups, #tc, #tel-aviv, #tlv-partners, #vertex-ventures
