5 ways AI can help mitigate the global shipping crisis

With the fourth quarter now upon us, every industry faces a challenge in managing a holiday production calendar that will deliver the goods. The key for startups looking to defend the quarter from disruptions is to adopt a proactive, data-driven approach to inventory management.

Here are five methods we’ve been counseling clients to adopt:

  • Use data and analytics to identify and map out the inventory being affected by the global shipping crisis. If you don’t have the data about what is on a ship transporting your materials, then use this crisis as an opportunity to justify prioritizing supply chain digital transformation with data, IoT and advanced analytics (e.g., machine learning and simulation). You need to know the location of your goods at all times if you are going to successfully gauge what impact a shortage will have on your operation.

    Ultimately, AI will help startups understand how myriad disruptions affect their supply chain so they can better respond with a Plan B when the unthinkable happens.

  • If you don’t have the data readily available, then you need to partner with a vendor and use a secure environment to share second-party data that delivers AI-driven, actionable insights into the business impact for all parties involved, from startup to retailer to the consumer.
  • Simulate and forecast the impact of these supply-side issues on the demand side. Conduct scenario-planning exercises and use them to inform critical business decisions. If this ability is not in place, an emergency like a pandemic, civil unrest or an uncontrollable rate hike will wreak havoc on your business plan. Use this situation as an opportunity to put a disaster management program in place to prepare for potential risks (a minimal simulation sketch follows this list).
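
Below is a minimal scenario-planning sketch, in plain Python, of the kind referenced in the last bullet: it Monte Carlo-simulates how an uncertain shipping delay on inbound inventory could affect the share of holiday demand a startup can actually fulfill. All quantities, names and distributions here are hypothetical, not a method prescribed by the article.

```python
import random

def simulate_fill_rate(units_on_ship, daily_demand, horizon_days,
                       delay_low=5, delay_high=30, runs=10_000):
    """Monte Carlo estimate of the share of demand that can be fulfilled over the horizon."""
    fill_rates = []
    for _ in range(runs):
        delay = random.randint(delay_low, delay_high)   # days the shipment sits in the backlog
        selling_days = max(horizon_days - delay, 0)     # days left to sell once goods arrive
        sellable = min(units_on_ship, selling_days * daily_demand)
        fill_rates.append(sellable / (horizon_days * daily_demand))
    return sum(fill_rates) / runs

if __name__ == "__main__":
    # e.g. 50,000 units at sea, 800 units/day of demand, a 90-day holiday window
    print(f"expected fill rate: {simulate_fill_rate(50_000, 800, 90):.1%}")
```

Running the same function across a handful of delay ranges is the simplest form of the scenario planning described above: it turns "what if the ship is three weeks late?" into a number a planner can act on.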

    #analytics, #artificial-intelligence, #business-intelligence, #column, #ec-column, #ec-manufacturing-and-supply-chain, #internet-of-things, #logistics, #machine-learning, #startups, #supply-chain, #supply-chain-management

FullStory raises $103M at a $1.8B valuation to combat rage clicks on websites and apps

Even with all the years of work that have been put into improving how screen-based interfaces work, our experiences with websites, mobile apps, and any other interactive service you might use still often come up short: we can’t find what we want, we’re bombarded with exactly what we don’t need, or the flow is just buggy in one way or another.

Now, FullStory, one of the startups that’s built a platform to identify when all of the above happens and provide suggestions to publishers for fixing it — it’s obsessed enough with the issue that it went so far as to trademark the phrase “Rage Clicks”, the focus of its mission — is announcing a big round of funding, a sign of its success and ambitions to do more.

The Atlanta-based company has closed a Series D round of $103 million, an oversubscribed round that actually was still growing between me interviewing the company and publishing this story (when we talked last week the figure was $100 million). Permira’s growth fund — which has previously invested in other customer experience startups like Klarna and Nexthink — is leading this round, with previous investors Kleiner Perkins, GV, Stripes, Dell Technologies Capital, Salesforce Ventures, and Glynn Capital also participating.

FullStory, which has raised close to $170 million to date, has confirmed that the investment values the company at $1.8 billion.

Scott Voigt, FullStory’s founder and CEO, tells me that FullStory currently has some 3,100 paying customers on its books across verticals like retail, SaaS, finance, and travel (customers include Peloton, the Financial Times, VMware and JetBlue), which collectively are on course to rack up more than 15 billion user sessions this year — working out to 1 trillion interactions involving clicks, navigations, highlights, scrolls, and frustration signals. It says that annual recurring revenue has to date risen by more than 70% year-on-year.

The plan now will be to continue investing in R&D to bring more real-time intelligence into its products, “and pass those insights on to customers,” and also to “move more aggressively into Europe and Asia Pacific,” he added.

FullStory competes with others like Glassbox and Decibel, although it also claims its tools have more presence on websites than its three biggest competitors combined.

Working across different divisions like product, customer success and marketing, and engineering, FullStory uses machine learning algorithms to analyze how people navigate websites and other digital interfaces.

If approved as part of the “consent gate” users might encounter because of, say, GDPR regulations, it then tracks things like when people click excessively in one area over a short period of time because of delays (the so-called “rage clicks”); when a click leads nowhere because of, for example, a blip in a piece of JavaScript; or when a person scrolls or moves their mouse, cursor or finger in a frustrated (fast) way — again with little or no subsequent activity resulting from it (or with the customer’s activity ceasing altogether). It doesn’t use eye tracking or anything like sentiment analysis on the data customers put into, say, customer response windows, nor does it have plans to.
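
FullStory’s actual detection models are proprietary and not described in detail here, but a rough sketch of the heuristic the paragraph above describes might look like the following: flag a “rage click” whenever several clicks land on roughly the same spot within a short window. The thresholds and field names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Click:
    t: float   # timestamp in seconds
    x: int     # pixel coordinates of the click
    y: int

def find_rage_clicks(clicks, max_gap=0.7, radius=30, min_run=3):
    """Return (start, end) index ranges where >= min_run clicks land close together in time and space."""
    runs, start = [], 0
    for i in range(1, len(clicks) + 1):
        in_run = (
            i < len(clicks)
            and clicks[i].t - clicks[i - 1].t <= max_gap
            and abs(clicks[i].x - clicks[start].x) <= radius
            and abs(clicks[i].y - clicks[start].y) <= radius
        )
        if not in_run:
            if i - start >= min_run:
                runs.append((start, i - 1))
            start = i
    return runs

session = [Click(0.0, 100, 200), Click(0.3, 102, 199), Click(0.5, 101, 201), Click(5.0, 400, 50)]
print(find_rage_clicks(session))  # [(0, 2)] -- three rapid clicks on the same spot
```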

FullStory then packages up the insights that it does collect into data streams that can be used with various visualization tools (having Salesforce as a strategic backer is interesting in this regard, given that it owns Tableau), or spreadsheets, or whatever a customer chooses to put them into. While it doesn’t offer direct remediation (perhaps an area it could tackle in the future), it does offer suggestions for alternative actions to fix whatever problems are arising.

Part of what has given FullStory a big boost in recent times (this round is by far the biggest fundraise the company has ever done) is the fact that, in today’s world, digital business has become the centerpiece of all business. Because Covid-19 and the need for social distancing have taken away some of the traffic of in-person experiences like going to stores, organizations that built their experiences online from the start are seeing unprecedented amounts of traffic; and they are now joined by organizations that have shifted into digital experiences simply to stay in business.

All of that has contributed to a huge amount of content online, and a big shift in mindset to making it better (and in the most urgent of cases, even more basically, simply usable), and that has resulted in the stars aligning for companies like FullStory.

“The category was so nascent to begin with that we had to explain the concept to customers,” Voigt told me of the company’s early days, when selling meant selling would-be customers on the very idea of digital experience insights. “But digital experience, in the wake of Covid-19, suddenly mattered more than it ever has before, and the continued amount of inbound interest has been an afterburner for us.” He noted that demand is increasing among mid-market and enterprise organizations, and something that has also helped FullStory grow is the general movement of talent in the industry.

“Our customers tend to take their tools with them when they change their jobs,” he said. Those tools include FullStory’s analytics.

Bringing more AI into the task of structuring what might otherwise be unstructured data has been a big boost to the world of analytics, and investors are interested in FullStory because of how it’s taken that trend and grown its business on top of it.

“We are very excited to partner with the FullStory team as they continue to expand and build a truly extraordinary technology brand that improves the digital experience for all stakeholders,” said Alex Melamud, who led the transaction on behalf of Permira Growth, in a statement.

“Traditional analytics have been upended by AI- and ML-enabled approaches that can instantly uncover nuanced patterns and anomalies in customer behavior,” said Bruce Chizen, a senior advisor at Permira, in a statement. “Leveraging both structured and unstructured data, FullStory has rapidly established itself as the market and technology leader in DXI and is now the fastest-growing company in the category and the de facto system of record for all digital experience data.” Chizen is joining the FullStory Board with this round.

#analytics, #digital-experience, #ecommerce, #enterprise, #fullstory, #funding

Firebolt raises $127M more for its new approach to cheaper and more efficient big data analytics

Snowflake changed the conversation for many companies when it comes to the potential of data warehousing. Now one of the startups that’s hoping to disrupt the disruptor is announcing a big round of funding to expand its own business.

Firebolt, which has built a new kind of cloud data warehouse that promises much more efficient, and cheaper, analytics around whatever is stored within it, is announcing a major Series B of $127 million on the heels of huge demand for its services.

The company, which only came out of stealth mode in December, is not disclosing its valuation with this round, which brings the total raised by the Israeli company to $164 million. New backers Dawn Capital and K5 Global are in this round, alongside previous backers Zeev Ventures, TLV Partners, Bessemer Venture Partners, and Angular Ventures.

Nor is it disclosing many details about its customers at the moment. CEO and co-founder Eldad Farkash told me in an interview that most of them are US-based, and that the numbers have grown from the dozen or so that were using Firebolt when it was still in stealth mode (it worked quietly for a couple of years building its product and onboarding customers before finally launching six months ago). They are all migrating from existing data warehousing solutions like Snowflake or BigQuery. In other words, its customers are already cloud-native, big-data companies: it’s not trying to proselytize on the basic concept but work with those who are already in a specific place as a business.

“If you’re not using Snowflake or BigQuery already, we prefer you come back to us later,” he said. Judging by the size and quick succession of the round, that focus is paying off.

The challenge that Firebolt set out to tackle is that while data warehousing has become a key way for enterprises to analyze, update and manage their big data stores — after all, your data is only as good as the tools you have to parse it and keep it secure — typically data warehousing solutions are not efficient, and they can cost a lot of money to maintain.

The challenge was seen first-hand by the three founders of Firebolt, Farkash (CEO), Saar Bitner (COO) and Ariel Yaroshevich (CTO), at a previous company, the business intelligence powerhouse Sisense, where Farkash was a co-founder and the other two were members of the founding team. At Sisense, the company continually came up against an issue: when dealing in terabytes of data, cloud data warehouses strained to deliver good performance to power its analytics and other tools, and the only way to mitigate that was to keep piling on more cloud capacity. And that started to become very expensive.

Firebolt set out to fix that by taking a different approach, re-architecting the concept. As Farkash sees it, while data warehousing has indeed been a big breakthrough in big data, it has started to feel like a dated solution as data troves have grown.

“Data warehouses are solving yesterday’s problem, which was, ‘How do I migrate to the cloud and deal with scale?’ ” he told me back in December. Google’s BigQuery, Amazon’s Redshift and Snowflake are fitting answers for that issue, he believes, but “we see Firebolt as the new entrant in that space, with a new take on design and technology. We change the discussion from one of scale to one of speed and efficiency.”

The startup claims that its performance is up to 182 times faster than that of other data warehouses, thanks to a SQL-based system built on academic research that had yet to be applied anywhere, around handling data in a lighter way with new techniques for compression and parsing. Data lakes in turn can be connected with a wider data ecosystem, and what that translates to is a much smaller requirement for cloud capacity, and lower costs.

Fast forward to today, and the company says the concept is gaining a lot of traction with engineers and developers in industries like business intelligence, customer-facing services that need to parse a lot of information and serve it to users in real time, and back-end data applications. That is proving out what investors suspected would be a shift even before the startup launched, stealthily or otherwise.

“I’ve been an investor at Firebolt since their Series A round and before they had any paying customers,” said Oren Zeev of Zeev Ventures. “What had me invest in Firebolt is mostly the team. A group of highly experienced executives mostly from the big data space who understand the market very well, and the pain organizations are experiencing. In addition, after speaking to a few of my portfolio companies and Firebolt’s initial design partners, it was clear that Firebolt is solving a major pain, so all in all, it was a fairly easy decision. The market in which Firebolt operates is huge if you consider the valuations of Snowflake and Databricks. Even more importantly, it is growing rapidly as the migration from on-premise data warehouse platforms to the cloud is gaining momentum, and as more and more companies rely on data for their operations and are building data applications.”

#analytics, #big-data, #data-warehousing, #enterprise, #europe, #firebolt, #funding

Enterprise AI platform Dataiku launches managed service for smaller companies

Dataiku is going downstream with a new product today called Dataiku Online. As the name suggests, Dataiku Online is a fully managed version of Dataiku. It lets you take advantage of the data science platform without going through a complicated setup process that involves a system administrator and your own infrastructure.

If you’re not familiar with Dataiku, the platform lets you turn raw data into advanced analytics, run some data visualization tasks, create data-backed dashboards and train machine learning models. In particular, Dataiku can be used by data scientists, but also business analysts and less technical people.

The company has been mostly focused on big enterprise clients. Right now, Dataiku has more than 400 customers, such as Unilever, Schlumberger, GE, BNP Paribas, Cisco, Merck and NXP Semiconductors.

There are two ways to use Dataiku. You can install the software solution on your own on-premises servers. You can also run it on a cloud instance. With Dataiku Online, the startup offers a third option and takes care of setup and infrastructure for you.

“Customers using Dataiku Online get all the same features that our on-premises and cloud instances provide, so everything from data preparation and visualization to advanced data analytics and machine learning capabilities,” co-founder and CEO Florian Douetteau said. “We’re really focused on getting startups and SMBs on the platform — there’s a perception that small or early-stage companies don’t have the resources or technical expertise to get value from AI projects, but that’s simply not true. Even small teams that lack data scientists or specialty ML engineers can use our platform to do a lot of the technical heavy lifting, so they can focus on actually operationalizing AI in their business.”

Customers using Dataiku Online can take advantage of Dataiku’s pre-built connectors. For instance, you can connect your Dataiku instance with a cloud data warehouse, such as Snowflake Data Cloud, Amazon Redshift and Google BigQuery. You can also connect to a SQL database (MySQL, PostgreSQL…), or you can just run it on CSV files stored on Amazon S3.

And if you’re just getting started and you have to work on data ingestion, Dataiku works well with popular data ingestion services. “A typical stack for our Dataiku Online Customers involves leveraging data ingestion tools like FiveTran, Stitch or Alooma, that sync to a cloud data warehouse like Google BigQuery, Amazon Redshift or Snowflake. Dataiku fits nicely within their modern data stacks,” Douetteau said.

Dataiku Online is a nice offering to get started with Dataiku. High-growth startups might start with Dataiku Online as they tend to be short on staff and want to be up and running as quickly as possible. But as you become bigger, you could imagine switching to a cloud or on-premise installation of Dataiku. Employees can keep using the same platform as the company scales.

#ai, #analytics, #artificial-intelligence, #data, #dataiku, #developer, #enterprise-ai, #machine-learning, #startups, #tc

Productivity startup Time is Ltd raises $5.6M to be the ‘Google Analytics for company time’

Productivity analytics startup Time is Ltd wants to be the Google Analytics for company time. Or perhaps a sort of “Apple Screen Time” for companies. Whatever the case, the founders reckon that if you can map how time is spent in a company, enormous productivity gains can be unlocked and money better spent.

It’s now raised a $5.6 million late seed funding round led by Mike Chalfen, of London-based Chalfen Ventures, with participation from Illuminate Financial Management and existing investor Accel. Acequia Capital and former Seal Software chairman Paul Sallaberry are also contributing to the new round, as is former Seal board member Clark Golestani. Furthermore, Ulf Zetterberg, founder and former CEO of contract discovery and analytics company Seal Software, is joining as President and co-founder.

The venture is the latest from serial entrepreneur Jan Rezab, better known for founding SocialBakers, which was acquired last year.

We are all familiar with inefficient meetings, pestering notifications from chat and video conferencing tools, and the deluge of emails. Time is Ltd. says it plans to address this by acquiring insights and data from platforms such as Microsoft 365, Google Workspace, Zoom, Webex, MS Teams, Slack and more. The data and insights gathered would then help managers understand and take a new approach to measuring productivity, engagement and collaboration, the startup says.

The startup says it has now gathered 400 indicators that companies can choose from. For example, a task set by The Wall Street Journal for Time is Ltd. found that the average response time for Slack messages was 16.3 minutes, compared with 72 minutes for email.
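
Time is Ltd. hasn’t published how it computes such indicators; purely as a hypothetical sketch of the kind of calculation involved, the snippet below averages message-to-first-reply times per channel from (channel, sent_at, replied_at) records. All field names and figures are invented for illustration.

```python
from datetime import datetime

def avg_response_minutes(records, channel):
    """Average minutes between a message and its first reply for one channel."""
    deltas = [
        (replied - sent).total_seconds() / 60
        for ch, sent, replied in records
        if ch == channel and replied is not None
    ]
    return sum(deltas) / len(deltas) if deltas else None

records = [
    ("slack", datetime(2021, 6, 1, 9, 0), datetime(2021, 6, 1, 9, 15)),
    ("email", datetime(2021, 6, 1, 9, 0), datetime(2021, 6, 1, 10, 20)),
]
print(avg_response_minutes(records, "slack"))  # 15.0
print(avg_response_minutes(records, "email"))  # 80.0
```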

Chalfen commented: “Measuring hybrid and distributed work patterns is critical for every business. Time Is Ltd.’s platform makes such measurement easily available and actionable for so many different types of organizations that I believe it could make work better for every business in the world.”

Rezab said: “The opportunity to analyze these kinds of collaboration and communication data in a privacy-compliant way alongside existing business metrics is the future of understanding the heartbeat of every company – I believe in 10 years time we will be looking at how we could have ignored insights from these platforms.”

Tomas Cupr, Founder and Group CEO of Rohlik Group, the European leader of e-grocery, said: “Alongside our traditional BI approaches using performance data, we use Time is Ltd. to help improve the way we collaborate in our teams and improve the way we work both internally and with our vendors – data that Time is Ltd. provides is a must-have for business leaders.”

#accel, #analytics, #apple, #articles, #board-member, #business-intelligence, #ceo, #chairman, #computing, #digital-marketing, #e-grocery, #europe, #google, #leader, #london, #microsoft, #mike-chalfen, #seal-software, #serial-entrepreneur, #slack, #socialbakers, #software, #tc, #the-wall-street-journal, #time-is-ltd, #video-conferencing, #webex

June makes product analytics more accessible

Meet June, a new startup that wants to make it easier to create analytics dashboards and generate reports even if you’re not a product analytics expert. June is built on top of your Segment data. Like many no-code startups, it uses templates and a graphical interface so that non-technical users can start using it.

“What we do today is instant analytics and that’s why we’re building it on top of Segment,” co-founder and CEO Enzo Avigo told me. “It lets you access data much more quickly.”

Segment acts as the data collection and data repository for your analytics. After that, you can start playing with your data in June. Eventually, June plans to diversify its data sources.

“Our long-term vision is to become the Airtable of analytics,” Avigo said.

If you’re familiar with Airtable, June may look familiar. The company has built a template library to help you get started. For instance, June helps you track user retention, active users, your acquisition funnel, engagement, feature usage, etc.

Image Credits: June

Once you pick a template, you can start building a report by matching data sources with templates. June automatically generates charts, sorts your user base into cohorts and shows you important metrics. You can create goals so that you receive alerts in Slack whenever something good or bad is happening.

Advanced users can also use June so that everyone in the team is using the same tool. They can create custom SQL queries and build a template based on those queries.
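
June’s own query format isn’t documented in this article, so the following is only an illustrative stand-in for the kind of custom SQL an advanced user might turn into a template: a weekly-active-users query, run here against a local SQLite copy of Segment-style track events. The table and column names are assumptions.

```python
import sqlite3

WEEKLY_ACTIVE_USERS = """
SELECT strftime('%Y-%W', received_at) AS week,
       COUNT(DISTINCT user_id)        AS active_users
FROM   track_events
GROUP  BY week
ORDER  BY week;
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE track_events (user_id TEXT, event TEXT, received_at TEXT)")
conn.executemany(
    "INSERT INTO track_events VALUES (?, ?, ?)",
    [("u1", "report_created", "2021-06-01"), ("u2", "report_created", "2021-06-02"),
     ("u1", "report_created", "2021-06-09")],
)
for row in conn.execute(WEEKLY_ACTIVE_USERS):
    print(row)   # one (week, active_users) row per calendar week
```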

The company raised a seed round of $1.85 million led by Point Nine. Y Combinator, Speedinvest, Kima Ventures, eFounders and Base Case also participated, as well as several business angels.

Prior to June, the startup’s two co-founders worked for Intercom. They noticed that analytics tools were too hard to use for many people, who as a result didn’t rely on analytics to make educated decisions.

There are hundreds of companies using June every week and that number is growing by 10% per week. Right now, the product is free but the company plans to charge based on usage.

Image Credits: June

#analytics, #developer, #europe, #france, #france-newsletter, #fundings-exits, #june, #product-analytics, #segment, #startups

New Instagram insights make its TikTok competitor Reels more appealing

Over the last year, Instagram has added a slew of features to help independent creators make a living, like Instagram Shop and Shopping in Reels. Today, Instagram launched new Insights for Reels and Live on its Professional Dashboard, giving businesses and creators essential data about the reach of their content. These tools will help Reels catch up with its competitor TikTok, which already offers users detailed analytics. As Instagram and TikTok continue trying to keep up with one another, it can only be a good thing for influencers and small businesses that use these platforms to bolster their income. 

Previously, Instagram creators could only view publicly available metrics, like the views, likes, or comments on a Reel. Now, they will be able to access data like Accounts Reached, Saves, and Shares for their Reels. Instagram will also share the number of Peak Concurrent Viewers that tune in to watch their Live videos. Plus, in the Account Insights section of the app, Instagram will add breakdowns that show users what kinds of accounts they are reaching, and which content formats are generating their strongest engagement. 

For entrepreneurs and content creators whose businesses run on social commerce, these analytics might not change the game, but they certainly make it easier to play. Shopping in Reels makes in-app sales more convenient, but until now, scant data was available to help businesses tailor their Reels to reach potential customers. On the other hand, TikTok’s analytics have long provided creators with data on their videos’ average watch time, types of traffic sources, and performance by geographic location. The viral video app announced earlier this month that it would work with specific brands, like the streetwear label Hype, to test in-app sales. This would deepen its competition with Instagram, but it’s still unclear when the feature will be widely available. So, Instagram’s Insights, combined with established in-app shopping, can create a perfect storm for content creators to better reach and monetize their target audiences.

“I always thought it was weird that there were no Insights for Reels. Sometimes it feels like shooting in the dark,” Quinn Jones told TechCrunch. Jones is one of the owners of KIKAY, a handmade jewelry business based in Los Angeles. With over 90,000 followers across Instagram and TikTok, the Gen-Z creators rely on social media to expand their audience and increase their sales. Though KIKAY has gone viral on TikTok, Jones said that Instagram has been the best way for the small business to gain followers.

“Insights are definitely going to be useful going forward,” said Jones. “It’s currently hard to tell the actual effective reach your videos have, and seeing Insights means more feedback to help improve content.”

For influencers, these analytics are also helpful for collaborating with brands on sponsored content. 

“I’ve been wanting Insights for Reels for the longest time. All we know now is views, likes, and comments,” said Cara Cochran, an LGBTQ+ content creator and microinfluencer. She notes that brands have already been pushing creators to make videos on Reels ever since Instagram redesigned its interface to place the short videos front-and-center. 

“Now that they are rolling out analytics, I think we will see a lot of brands push for more and more Reels instead of just static posts,” she says. “I think it brings their products to life in a whole new way, and it almost works like a commercial for them instead of just a static ad.” 

Instagram will begin rolling out Insights today. The company also says that over the coming months, it will add tools to help creators measure engagement over a preset time frame and begin to support Insights on desktop. 

#analytics, #apps, #creators, #insights, #instagram, #social-media, #social-media-marketing, #tiktok

This crypto surveillance startup — ‘We’re bomb sniffing dogs’ — just raised Series A funding

Solidus Labs, a company that says its surveillance and risk-monitoring software can detect manipulation across cryptocurrency trading platforms, is today announcing $20 million in Series A funding led by Evolution Equity Partners, with participation from Hanaco Ventures, Avon Ventures, 645 Ventures, the cryptocurrency derivatives exchange FTX, and a sprinkling of government officials, including former CFTC commissioner Chris Giancarlo and former SEC commissioner Troy Paredes.

It’s pretty great timing, given the various signals coming from the U.S. government just last week that it’s intent on improving its crypto monitoring efforts — such as the U.S. Treasury’s call for stricter cryptocurrency compliance with the IRS.

Of course, Solidus didn’t spring into existence last week. Rather, Solidus was founded in 2017 by several former Goldman Sachs employees who worked on the firm’s electronic trading desk for equities. At the time, Bitcoin was only becoming buzzier, but while the engineers anticipated different use cases for the cryptocurrency, they also recognized that a lack of compliance tools would be a barrier to its adoption by bigger financial institutions, so they left to build those tools.

Fast forward and today Solidus employs 30 people, has raised $23.75 million altogether, and is in the process of doubling its head count to address growing demand. We talked with Solidus’s New York-based cofounder and CEO Asaf Meir — who was himself one of those former Goldman engineers — about the company late last week. Excerpts from that chat follow, edited lightly for length.

TC: Who are your customers?

AM: We work with exchanges, broker dealers, OTC desks, liquidity providers, and regulators — anyone who is exposed to the risk of buying and selling cryptocurrencies, crypto assets or digital assets, whatever you want to call them.

TC: What are you promising to uncover for them?

AM: What we detect, largely speaking, is volume and price manipulation, and that has to do with wash trading, spoofing, layering, pump and dumps, and an additional growing library of crypto native alerts that truly only exist in our unique market.

We had a 400% increase in inbound demand over 2020 driven largely by two factors, I think. One is regulatory scrutiny. Globally, regulators have gone off to market participants, letting them know that they have to ask for permission not forgiveness. The second reason — which I like better — is the drastic institutional increase in appetite toward exposure for this asset class. Every institution, the first question they ask any executing platform is: ‘What are your risk mitigation tools? How do you make sure there is market integrity?’

TC: We talked a couple of months ago, and you mentioned having a growing pipeline of customers, like the trading platform Bittrex in Seattle. Is demand coming primarily from the U.S.?

AM: We have demand in Asia and in Europe, as well, so we will be opening offices there, too.

TC: Is your former employer Goldman a customer?

AM: I can’t comment on that, but I would say there isn’t a bank right now that isn’t thinking about how they’re going to get exposure to crypto assets, and in order to do that in a safe, compliant and robust way, they have to employ crypto-specific solutions.

Right now, there’s the new frontier — the clients we’re currently working with, which are these crypto-pure exchanges, broker dealers, liquidity providers, and even traditional financial institutions that are coming into crypto and opening a crypto operation or a crypto desk. Then there’s the new new frontier; your NFTs, stablecoins, indexes, lending platforms, decentralized protocols and God knows what [else] all of a sudden reaching out to us, telling us they want to do the right thing, to ensure the users on their platform are well-protected, and that trading activities are audited, and [to enlist us] to prevent any manipulation.

TC: How does your subscription service work and who is building the tech?

AM: We consume private data from our clients — all their trading data — and we then put it in our detection models, which we ultimately surface through insights and alerts on our dashboard, which they have access to.

As for who is building it, we have a lot of fintech engineers who are coming from Goldman and Morgan Stanley and Citi and bringing that traditional knowledge of large trading systems at scale; we also have incredible data scientists out of Israel whose expertise is in anomaly detection, which they are applying to financial crime, working with us.

TC: What do these crimes look like?

AM: When we started out, there was much more wholesale manipulation happening whether through wash trading or pump-and-dumps — things that are more easy to perform. What we’re seeing today are extremely sophisticated manipulation schemes where bad actors are able to exploit different executing platforms. We’re quite literally surfacing new alerts that if you were to use a legacy, rule-based system you wouldn’t be able to [surface] because you’re not really sure what you’re looking for. We oftentimes have an alert that we haven’t named yet; we just know that this type of behavior is considered manipulative in nature and that our client should be looking into it.
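
Solidus doesn’t disclose how its models work, and Meir is drawing a contrast with the legacy rule-based systems he mentions. Purely to illustrate what such a rule-based check looks like (and emphatically not Solidus Labs’ approach), here is a naive wash-trade flag: the same two accounts taking opposite sides of the same pair within a short window. All field names and thresholds are invented.

```python
from collections import namedtuple

Trade = namedtuple("Trade", "ts pair buyer seller qty price")

def flag_wash_trades(trades, window_s=60):
    """Flag trade pairs on the same market where two accounts swap sides within window_s seconds."""
    flagged, by_pair = [], {}
    for t in sorted(trades, key=lambda tr: tr.ts):
        recent = [p for p in by_pair.get(t.pair, []) if t.ts - p.ts <= window_s]
        for p in recent:
            if t.buyer == p.seller and t.seller == p.buyer:   # same parties, sides reversed
                flagged.append((p, t))
        recent.append(t)
        by_pair[t.pair] = recent
    return flagged

trades = [
    Trade(0,  "BTC-USD", "acct_1", "acct_2", 1.0, 39_000),
    Trade(30, "BTC-USD", "acct_2", "acct_1", 1.0, 39_000),
]
print(len(flag_wash_trades(trades)))  # 1 suspicious pair
```

The limitation Meir describes is visible even in this toy: the rule only catches what it was written to catch, which is why the company pitches model-driven anomaly detection instead.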

TC: Can you elaborate a bit more about these new anomalies?

AM: I’m conflicted about how much can we share of our clients’ private data. But one thing we’re seeing is [a surge in] account extraction attacks, which is when through different ways, bad actors are able to gain access to an account’s funds and are able in a sophisticated way to trade out of the exchange or broker dealer or custodian. That’s happening in different social engineering-related ways, but we’re able, through account deviation and account profiling, to alert the exchange or broker dealer or financial institution we’re working with to avoid that.

We’re about detection and prevention, not about tracing [what went wrong and where] after the fact. And we can do that regardless of knowing even personal identifiable information about that account. It’s not about the name or the IP address; it’s all about the attributes of trading. In fact, if we have an exchange in Hong Kong that’s experiencing a pump-and-dump on a certain coin pair, we can preemptively warn the rest of our client base so they can take steps to prepare and protect themselves.

TC: On the prevention front, could you also stop that activity on the Hong Kong exchange? Are you empowered by your clients to step in if you detect something anomalous?

AM: We’re bomb sniffing dogs, so we’re not coming to disable the bot. We know how to take the data and point out manipulation, but it’s then up to the financial institution to handle the case.

Pictured above: Seated, left to right, are CTO Praveen Kumar and CEO Asaf Meir. Standing is COO Chen Arad.

#645-ventures, #analytics, #asaf-meir, #blockchain, #chainalysis, #crypto, #elementus, #evolution-equity-partners, #ftx, #hanaco-ventures, #recent-funding, #solidus-labs, #startups, #surveillance, #tc, #venture-capital

Piano raises $88M for analytics, subscription and personalization tools for publishers, adds LinkedIn as investor

As publishers face up to whatever might be their next existential crisis — there are so many options from which to choose, including Substack stealing all their writers; or Clubhouse pulling in people through audio-based conversations where news and analysis intermingle seamlessly with networking — a startup that’s helping them build more tools to keep their businesses and audiences intact, and hopefully grow, is announcing a growth round of its own.

Piano, which provides analytics and subscription services to publishers, has closed a round of $88 million, funding that it will be using to continue building out the technology that it provides to its customers, as well as forge into newer areas where it can better connect audiences online.

The funding comes on the heels of a strong period for Piano. The company works with around 1,000 customers — they include CNBC, Wall Street Journal, NBC Sports, Insider, The Economist, Gannett, Le Parisien, Nielsen, MIT Technology Review, The Telegraph, South China Morning Post and (disclaimer) TechCrunch; and it has seen revenues grow 400% since 2019.

Piano has an interesting new backer in this round that might point to what form those newer areas of development might take. LinkedIn, the Microsoft-owned social networking site aimed at the working world, is participating in this Series B, which is being led by previous backer Updata Partners. Rittenhouse Ventures, which is based in Piano’s hometown of Philadelphia, is also participating.

(Piano is not disclosing valuation with this round but I understand it’s operating on a $75 million annual run rate currently. It has now raised just over $241 million.)

Trevor Kaufman, Piano’s CEO, would not be drawn on how specifically it will be working with LinkedIn, but it’s notable that the latter company has long held back from leveraging the profiles it holds on its 740 million users to do much outside of the core LinkedIn experience.

That could be applied in a number of ways, for example similar to Facebook and Google logins to third-party sites; or for providing an identity layer to comment on stories; or even building a way to manage logins via a LinkedIn profile, which could then potentially be used to help people manage and read/consume all the content they subscribe to. Or something totally different: LinkedIn has a lot of unrealised potential that Piano could help tap.

“Members are increasingly turning to LinkedIn to stay informed on the news and views that shape their respective industries – critical to this is the work we do with trusted publishers and journalists,” said Scott Roberts, VP and Head of Business Development at LinkedIn, in a statement. “The opportunity to collaborate with Piano to help unlock more value for publisher content on LinkedIn makes it a natural strategic investment opportunity.”

The last year has seen many of us spending significantly more time indoors, and for a number of us that has also meant more reading, especially of smaller and more digestible formats such as periodicals. In a way, it’s no surprise that models like Substack’s have emerged and apparently thrived in this period, where writers are looking for different approaches and ways of connecting with readers while publishers by and large are conserving costs and strategies to weather out the storm.

Piano’s rise in that context is especially interesting, as in many cases it’s not reinventing the wheel for publishers but providing them with the tools to better leverage the content production that they already have in place. What’s notable is that in the process, it’s been able to capitalize on changing sentiments in the publishing industry. Whereas paywalls and subscriptions have in the past been seen as a drag on traffic (and the ads that get sold against it) and only useful for those in the world of B2B, now they are increasingly becoming more commonplace in a much wider range of settings, Kaufman said.

Piano’s tools are notable not just as basic levers to manage subscriptions (free and paid) but as a more sophisticated set of analytics that provide more insight into how content is being read, which can in turn be used to develop those subscription tiers and determine the likelihood of people subscribing (Nieman Lab has a good article on how that works here).

To add to that, now another area where Piano is likely to develop more products is in the area of newsletters. No, not the Substack kind, but building tools for publishers to help them build out newsletter businesses that they can monetize if they choose. Indeed, the other kind of newsletter venture is far from Piano’s agenda.

“I can’t imagine a more damaging entity for journalism than Substack,” Kaufman told me. “I think it’s gotten a tremendous amount of attention from writers because it is a fantasy come true for journalists, this idea that you can make $500k a year for writing on occasion. But nothing can be farther from the truth.” He believes the model is so pumped up by venture funding that it’s not a viable one for the long haul.

That remains to be seen, I suppose, and of course Piano has a strong vested interest in supporting its publisher customers. What it mainly says to me is that there are still some innings left in this game, and maybe some more games in a longer series.

The company may also be dipping into more M&A, given how fragmented the audience development, analytics and measurement space is. In March of this year, the company acquired AT Internet, a French company, to better manage and crunch analytics from across a number of silos, including traffic, advertising, subscriptions, engagement and more.

“Piano’s recent growth has been outstanding, and we continue to be impressed by the expanding set of capabilities they bring to both media companies and brands looking to drive more revenue from their audiences,” said Jon Seeber, general partner at Updata Partners and a member of Piano’s board, in a statement. “They now have a true end-to-end platform that can power all aspects of the customer journey, allowing their clients to incorporate only the highest-quality data from across touchpoints to create the best experiences for users.”

#analytics, #funding, #media, #newsletters, #paywalls, #piano, #publishers, #publishing, #subscriptions

Google Analytics prepares for life after cookies

As consumer behavior and expectations around privacy have shifted — and operating systems and browsers have adapted to this — the age of cookies as a means of tracking user behavior is coming to an end. Few people will bemoan this, but advertisers and marketers rely on having insights into how their efforts translate into sales (and publishers like to know how their content performs as well). Google is obviously aware of this and it is now looking to machine learning to ready its tools like Google Analytics for this post-cookie future.

Vidhya Srinivasan, VP/GM, Advertising at Google

Last year, the company already brought several machine learning tools to Google Analytics. At the time, the focus was on alerting users to significant changes in their campaign performance, for example. Now, it is taking this a step further by using its machine learning systems to model user behavior when cookies are not available.

It’s hard to overestimate the importance of this shift, but according to Vidhya Srinivasan, Google’s VP and GM for Ads Buying, Analytics and Measurement, who joined the company two years ago after a long stint at Amazon (and IBM before that), it’s also the only way to go.

“The principles we outlined to drive our measurement roadmap are based on shifting consumer expectations and ecosystem paradigms. Bottom line: the future is consented. It’s modeled. It’s first-party. So that’s what we’re using as our guide for the next gen of our products and solutions,” she said in her first media interview after joining Google.

It’s still early days and a lot of users may yet consent and opt in to tracking and sharing their data in some form or another. But the early indications are that this will be a minority of users. Unsurprisingly, first-party data and the data Google can gather from users who consent becomes increasingly valuable in this context.

Because of this, Google is now also making it easier to work with this so-called ‘consented data’ and to create better first-party data through improved integrations with tools like the Google Tag Manager.

Last year, Google launched Consent Mode, which helps advertisers manage cookie behavior based on local data-protection laws and user preferences. For advertisers in the EU and in the U.K., Consent Mode allows them to adjust their Google tags based on a user’s choices and soon, Google will launch a direct integration with Tag Manager to make it easier to modify and customize these tags.

How Consent Mode works today.

What’s maybe more important, though, is that Consent Mode will now use conversion modeling for users who don’t consent to cookies. Google says this can recover about 70% of ad-click-to-conversion journeys that would otherwise be lost to advertisers.
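
Google hasn’t detailed the modeling itself, which is ML-based and considerably more sophisticated than this, but the basic intuition can be sketched as follows: infer conversions for non-consented traffic from the behavior observed on consented traffic. The function and numbers below are purely illustrative.

```python
def modeled_conversions(consented_clicks, consented_conversions, unconsented_clicks):
    """Estimate total conversions when only consented journeys are observable end to end."""
    if consented_clicks == 0:
        return consented_conversions
    observed_rate = consented_conversions / consented_clicks   # conversion rate of consented users
    return consented_conversions + observed_rate * unconsented_clicks

# e.g. 8,000 consented clicks with 400 observed conversions, plus 2,000 cookie-less clicks
print(modeled_conversions(8_000, 400, 2_000))  # 500.0 modeled total conversions
```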

In addition, Google is also making it easier to bring first-party data (in a privacy-forward way) into Google Analytics to improve measurements and its models.

“Revamping a popular product with a long history is something people are going to have opinions about – we know that. But we felt strongly that we needed Google Analytics to be relevant to changing consumer behavior and ready for a cookie-less world – so that’s what we’re building,” Srinivasan said. “The machine learning that Google has invested in for years — that experience is what we’re putting in action to drive the modeling underlying this tech. We take having credible insights and reporting in the market seriously. We know that doing the work on measurement is critical to market trust. We don’t take the progress we’ve made for granted and we’re looking to continue iterating to ensure scale, but above all we’re prioritizing user trust.”

#advertising-tech, #amazon, #analytics, #articles, #computing, #european-union, #gm, #google, #google-analytics, #ibm, #machine-learning, #operating-systems, #tc, #tracking, #united-kingdom, #vp, #web-analytics, #world-wide-web

Analytics as a service: Why more enterprises should consider outsourcing

With an increasing number of enterprise systems, growing teams, a rising proliferation of the web and multiple digital initiatives, companies of all sizes are creating loads of data every day. This data contains excellent business insights and immense opportunities, but its sheer volume has made it impossible for companies to derive actionable insights from it consistently.

According to Verified Market Research, the analytics-as-a-service (AaaS) market is expected to grow to $101.29 billion by 2026. Organizations that have not started on their analytics journey or are spending scarce data engineer resources to resolve issues with analytics implementations are not identifying actionable data insights. Through AaaS, managed services providers (MSPs) can help organizations get started on their analytics journey immediately without extravagant capital investment.

MSPs can take ownership of the company’s immediate data analytics needs, resolve ongoing challenges and integrate new data sources to manage dashboard visualizations, reporting and predictive modeling — enabling companies to make data-driven decisions every day.

AaaS could come bundled with multiple business-intelligence-related services. Primarily, the service includes (1) services for data warehouses; (2) services for visualizations and reports; and (3) services for predictive analytics, artificial intelligence (AI) and machine learning (ML). When a company partners with an MSP for analytics as a service, the organization is able to tap into business intelligence easily, instantly and at a lower cost of ownership than doing it in-house. This empowers the enterprise to focus on delivering better customer experiences, be unencumbered in decision-making and build data-driven strategies.

In today’s world, where customers value experiences over transactions, AaaS helps businesses dig deeper into their customers’ psyche and tap insights to build long-term winning strategies. It also enables enterprises to forecast and predict business trends by looking at their data, and allows employees at every level to make informed decisions.

#analytics, #analytics-as-a-service, #artificial-intelligence, #big-data, #business-intelligence, #column, #data-management, #data-mining, #ec-column, #ec-enterprise-applications, #enterprise, #machine-learning, #predictive-analytics

Hacking my way into analytics: A creative’s journey to design with data

Growing up, did you ever wonder how many chairs you’d have to stack to reach the sky?

No? I guess that’s just me then.

As a child, I always asked a lot of “how many/much” questions. Some were legitimate (“How much is 1 USD in VND?”); some were absurd (“How tall is the sky and can it be measured in chairs?”). So far, I’ve managed to maintain my obnoxious statistical probing habit without making any mortal enemies in my 20s. As it turns out, that habit comes with its perks when working in product.

My first job as a product designer was at a small but energetic fintech startup whose engineers also dabbled in pulling data. I constantly bothered them with questions like, “How many exports did we have from that last feature launched?” and “How many admins created at least one rule on this page?” I was curious about quantitative analysis but did not know where to start.

I knew I wasn’t the only one. Even then, there was a growing need for basic data literacy in the tech industry, and it’s only getting more taxing by the year. Words like “data-driven,” “data-informed” and “data-powered” increasingly litter every tech organization’s product briefs. But where does this data come from? Who has access to it? How might I start digging into it myself? How might I leverage this data in my day-to-day design once I get my hands on it?

Data discovery for all: What’s in the way?

“Curiosity is our compass” is one of Kickstarter’s guiding principles. Powered by a desire for knowledge and information, curiosity is the enemy of many larger, older and more structured organizations — whether they admit it or not — because it hinders the production flow. Curiosity makes you pause and take time to explore and validate the “ask.” Asking as many what’s, how’s, why’s, who’s and how many’s as possible is important to help you learn if the work is worth your time.

#analytics, #business-intelligence, #column, #data-analysis, #data-management, #data-tools, #database, #developer, #ec-column, #startups

With the right tools, predicting startup revenue is possible

For a long time, “revenue” seemed to be a taboo word in the startup world. Fortunately, things have changed with the rise of SaaS and alternative funding sources such as revenue-based investing VCs. Still, revenue modeling remains a challenge for founders. How do you predict earnings when you are still figuring it out?

The answer is twofold: You need to make your revenue predictable, repeatable and scalable in the first place, plus make use of tools that will help you create projections based on your data. Here, we’ll suggest some ways you can get more visibility into your revenue, find the data that really matter and figure out how to put a process in place to make forecasts about it.

Base projections on repeatable, scalable results

Aaron Ross is a co-author of “Predictable Revenue,” a book based on his experience of creating a process and team that helped grow Salesforce’s revenue by more than $100 million. “Predictable” is the key word here: “You want growth that doesn’t require guessing, hope and frantic last-minute deal-hustling every quarter- and year-end,” he says.

This makes recurring revenue particularly desirable, though it is by no means the be-all-end-all of predictable revenue. On one hand, there is always the risk that recurring revenue won’t last, as customers may churn and organic growth runs out of gas. On the other, there is a broader picture for predictable revenue that goes beyond subscription-based models.
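
As a minimal sketch of that dynamic (my own simplification, not a method from “Predictable Revenue”), the projection below shows how churn eventually caps monthly recurring revenue no matter how long new bookings keep coming in.

```python
def project_mrr(starting_mrr, new_mrr_per_month, monthly_churn, months=12):
    """Project monthly recurring revenue given steady new bookings and a flat churn rate."""
    mrr, path = starting_mrr, []
    for _ in range(months):
        mrr = mrr * (1 - monthly_churn) + new_mrr_per_month
        path.append(round(mrr, 2))
    return path

# $50k MRR today, $5k of new MRR added each month, 3% monthly revenue churn
print(project_mrr(50_000, 5_000, 0.03))
# Growth flattens toward new_mrr_per_month / monthly_churn (about $167k) no matter how long you wait.
```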

Ross and his co-author, Marylou Tyler, outline three steps to predictable revenue: predictable lead generation, a dedicated sales development team and consistent sales systems. They wrote an entire book about it, so it would be hard to sum it up here. The takeaway? You shouldn’t base your projections on processes and results that aren’t repeatable and scalable.

Cross the hot coals

In their early days, startups usually grow via word of mouth, luck and sheer hustle. The problem is that it likely won’t lead to sustainable growth; as the saying goes, what got you here won’t get you there. In between, there is typically a phase of uncertainty and missed results that Ross refers to as “the hot coals.”

Before the hot coals, predicting revenue is vain at best, and oftentimes impossible. I, for one, remember being at a loss when an old-school investor asked me for five-year profit-and-loss projections when my now-defunct startup was nowhere near a stable money-making path. Not all seed investors expect this, so there was obviously a mismatch here, but the challenge is still the same for most founders: How do you bridge the gap between traditional projections and the reality of a startup?

#analytics, #business-intelligence, #chargebee, #chartmogul, #finance, #fishtown-analytics, #fivetran, #forecasting, #saas, #salesforce, #segment, #startups, #stitch, #tc, #y-combinator-alumni, #zuora

Orca Security raises $210M Series C at a unicorn valuation

Orca Security, an Israeli cybersecurity startup that offers an agent-less security platform for protecting cloud-based assets, today announced that it has raised a $210 million Series C round at a $1.2 billion valuation. The round was led by Alphabet’s independent growth fund CapitalG and Redpoint Ventures. Existing investors GGV Capital, ICONIQ Growth and angel syndicate Silicon Valley CISO Investment also participated. YL Ventures, which led Orca’s seed round and participated in previous rounds, is not participating in this round — and it’s worth noting that the firm recently sold its stake in Axonius after that company reached unicorn status.

If all of this sounds familiar, that may be because Orca only raised its $55 million Series B round in December, after it announced its $20.5 million Series A round in May. That’s a lot of funding rounds in a short amount of time, but something we’ve been seeing more often in the last year or so.

Orca Security co-founders Gil Geron (left) and Avi Shua (right). Image Credits: Orca Security

As Orca co-founder and CEO Avi Shua told me, the company is seeing impressive growth and it — and its investors — want to capitalize on this. The company ended last year beating its own forecast from a few months before, which he noted was already aggressive, by more than 50%. Its current slate of customers includes Robinhood, Databricks, Unity, Live Oak Bank, Lemonade and BeyondTrust.

“We are growing at an unprecedented speed,” Shua said. “We were 20-something people last year. We are now closer to a hundred and we are going to double that by the end of the year. And yes, we’re using this funding to accelerate on every front, from dramatically increasing the product organization to add more capabilities to our platform, for post-breach capabilities, for identity access management and many other areas. And, of course, to increase our go-to-market activities.”

Shua argues that most current cloud security tools don’t really work in this new environment. Many, because they are driven by metadata, can only detect a small fraction of the risks, and agent-based solutions may take months to deploy and still not cover a business’ entire cloud estate. The promise of Orca Security is that it can not only cover a company’s entire range of cloud assets but that it is also able to help security teams prioritize the risks they need to focus on. It does so by using what the company calls its “SideScanning” technology, which allows it to map out a company’s entire cloud environment and file systems.

“Almost all tools are essentially just looking at discrete risk trees and not the forest. The risk is not just about how pickable the lock is, it’s also where the lock resides and what’s inside the box. But most tools just look at the issues themselves and prioritize the most pickable lock, ignoring the business impact and exposure — and we change that.”
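
Orca’s actual scoring is proprietary; purely to illustrate the prioritization idea in that quote, the toy score below combines severity (“how pickable the lock is”) with exposure (“where the lock resides”) and business impact (“what’s inside the box”). Every weight here is invented.

```python
def priority(severity, internet_exposed, holds_sensitive_data):
    """Toy risk score in [0, 10]; the weights are purely illustrative."""
    score = severity                                  # e.g. a CVSS-like 0-10 base severity
    score *= 1.5 if internet_exposed else 0.8         # "where the lock resides"
    score *= 1.5 if holds_sensitive_data else 0.7     # "what's inside the box"
    return min(round(score, 1), 10)

findings = [
    ("critical CVE on an internal VM holding no data", priority(9.8, False, False)),
    ("medium CVE on a public asset holding customer PII", priority(5.5, True, True)),
]
for name, score in sorted(findings, key=lambda f: -f[1]):
    print(score, name)   # the exposed asset with sensitive data outranks the higher raw severity
```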

It’s no secret that there isn’t a lot of love lost between Orca and some of its competitors. Last year, Palo Alto Networks sent Orca Security a sternly worded letter (PDF) to stop it from comparing the two services. Shua was not amused at the time and decided to fight it. “I completely believe there is space in the markets for many vendors, and they’ve created a lot of great products. But I think the thing that simply cannot be overlooked, is a large company that simply tries to silence competition. This is something that I believe is counterproductive to the industry. It tries to harm competition, it’s illegal, it’s unconstitutional. You can’t use lawyers to take your competitors out of the media.”

Currently, though, it doesn’t look like Orca needs to worry too much about the competition. As GGV Capital managing partner Glenn Solomon told me, as the company continues to grow and bring in new customers — and learn from the data it pulls in from them — it is also able to improve its technology.

“Because of the novel technology that Avi and [Orca Security co-founder and CPO] Gil [Geron] have developed — and that Orca is now based on — they see so much. They’re just discovering more and more ways and have more and more plans to continue to expand the value that Orca is going to provide to customers. They sit in a very good spot to be able to continue to leverage information that they have and help DevOps teams and security teams really execute on good hygiene in every imaginable way going forward. I’m super excited about that future.”

As for this funding round, Shua noted that he found CapitalG to be a “huge believer” in this space and an investor that is looking to invest into the company for the long run (and not just trying to make a quick buck). The fact that CapitalG is associated with Alphabet was obviously also a draw.

“Being associated with Alphabet, which is one of the three major cloud providers, allowed us to strengthen the relationship, which is definitely a benefit for Orca,” he said. “During the evaluation, they essentially put Orca in front of the security leadership at Google. Definitely, they’ve done their own very deep due diligence as part of that.”

#alphabet, #analytics, #axonius, #capitalg, #cloud, #cybersecurity-startup, #enterprise, #ggv-capital, #identity-access-management, #orca-security, #recent-funding, #redpoint-ventures, #security, #startups, #tc, #yl-ventures

No-code business intelligence service y42 raises $2.9M seed round

Berlin-based y42 (formerly known as Datos Intelligence), a data warehouse-centric business intelligence service that promises to give businesses access to an enterprise-level data stack that’s as simple to use as a spreadsheet, today announced that it has raised a $2.9 million seed funding round led by La Famiglia VC. Additional investors include the co-founders of Foodspring, Personio and Petlab.

The service, which was founded in 2020, integrates with over 100 data sources, covering all the standard B2B SaaS tools from Airtable to Shopify and Zendesk, as well as database services like Google’s BigQuery. Users can then transform and visualize this data, orchestrate their data pipelines and trigger automated workflows based on this data (think sending Slack notifications when revenue drops or emailing customers based on your own custom criteria).
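
As a rough illustration of the kind of automated workflow described above, a trigger that posts to Slack when revenue dips below a threshold could look something like the following sketch. The warehouse query, threshold and webhook URL are placeholders, and this is not y42’s actual workflow syntax:

```python
# Hypothetical sketch of a "notify Slack when revenue drops" trigger.
# The warehouse query, threshold and webhook URL are all placeholders;
# this is not y42's actual workflow syntax.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder
REVENUE_THRESHOLD = 10_000.0

def fetch_daily_revenue() -> float:
    """Stand-in for a warehouse query such as:
    SELECT SUM(amount) FROM orders WHERE order_date = CURRENT_DATE
    """
    return 8_250.0  # hard-coded for illustration

def main() -> None:
    revenue = fetch_daily_revenue()
    if revenue < REVENUE_THRESHOLD:
        requests.post(
            SLACK_WEBHOOK_URL,
            json={"text": f"Revenue alert: today's revenue is ${revenue:,.2f}, "
                          f"below the ${REVENUE_THRESHOLD:,.2f} threshold."},
            timeout=10,
        )

if __name__ == "__main__":
    main()
```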

Like similar startups, y42 extends the idea of the data warehouse, which was traditionally used for analytics, and helps businesses operationalize this data. At the core of the service is a lot of open-source software, and the company, for example, contributes to GitLab’s Meltano platform for building data pipelines.

y42 founder and CEO Hung Dang.

“We’re taking the best-of-breed open-source software. What we really want to accomplish is to create a tool that is so easy to understand and that enables everyone to work with their data effectively,” y42 founder and CEO Hung Dang told me. “We’re extremely UX obsessed and I would describe us as a no-code/low-code BI tool — but with the power of an enterprise-level data stack and the simplicity of Google Sheets.”

Before y42, Vietnam-born Dang co-founded a major events company that operated in over 10 countries and made millions in revenue (but with very thin margins), all while finishing up his studies with a focus on business analytics. And that in turn led him to also found a second company that focused on B2B data analytics.

Image Credits: y42

Even while building his events company, he noted, he was always very product- and data-driven. “I was implementing data pipelines to collect customer feedback and merge it with operational data — and it was really a big pain at that time,” he said. “I was using tools like Tableau and Alteryx, and it was really hard to glue them together — and they were quite expensive. So out of that frustration, I decided to develop an internal tool that was actually quite usable and in 2016, I decided to turn it into an actual company. ”

He then sold this company to a major publicly listed German company. An NDA prevents him from talking about the details of this transaction, but maybe you can draw some conclusions from the fact that he spent time at Eventim before founding y42.

Given his background, it’s maybe no surprise that y42’s focus is on making life easier for data engineers and, at the same time, putting the power of these platforms in the hands of business analysts. Dang noted that y42 typically provides some consulting work when it onboards new clients, but that’s mostly to give them a head start. Given the no-code/low-code nature of the product, most analysts are able to get started pretty quickly  — and for more complex queries, customers can opt to drop down from the graphical interface to y42’s low-code level and write queries in the service’s SQL dialect.

The service itself runs on Google Cloud, and the 25-person team manages about 50,000 jobs per day for its clients. The company’s customers include the likes of LifeMD, Petlab and Everdrop.

Until raising this round, Dang self-funded the company and had also raised some money from angel investors. But La Famiglia felt like the right fit for y42, especially due to its focus on connecting startups with more traditional enterprise companies.

“When we first saw the product demo, it struck us how on top of analytical excellence, a lot of product development has gone into the y42 platform,” said Judith Dada, general partner at La Famiglia VC. “More and more work with data today means that data silos within organizations multiply, resulting in chaos or incorrect data. y42 is a powerful single source of truth for data experts and non-data experts alike. As former data scientists and analysts, we wish that we had y42 capabilities back then.”

Dang tells me he could have raised more but decided that he didn’t want to dilute the team’s stake too much at this point. “It’s a small round, but this round forces us to set up the right structure. For the Series A, which we plan for towards the end of this year, we’re talking about a dimension which is 10x,” he told me.

#alteryx, #analytics, #berlin, #big-data, #business-intelligence, #business-software, #ceo, #cloud, #data, #data-analysis, #data-management, #data-warehouse, #enterprise, #general-partner, #information-technology, #judith-dada, #recent-funding, #shopify, #sql, #startups, #vietnam

It’s time to abandon business intelligence tools

Organizations spend ungodly amounts of money — millions of dollars — on business intelligence (BI) tools. Yet, adoption rates are still below 30%. Why is this the case? Because BI has failed businesses.

Logi Analytics’ 2021 State of Analytics: Why Users Demand Better survey showed that knowledge workers spend more than five hours a day in analytics, and more than 99% consider analytics very to extremely valuable when making critical decisions. Unfortunately, many are dissatisfied with their current tools due to the loss of productivity, multiple “sources of truth,” and the lack of integration with their current tools and systems.

A gap exists between the functionalities provided by current BI and data discovery tools and what users want and need.

Throughout my career, I’ve spoken with many executives who wonder why BI continues to fail them, especially when data discovery tools like Qlik and Tableau have gained such momentum. The reality is, these tools are great for a very limited set of use cases among a limited audience of users — and the adoption rates reflect that reality.

Data discovery applications allow analysts to link with data sources and perform self-service analysis, but still come with major pitfalls. Lack of self-service customization, the inability to integrate into workflows with other applications, and an overall lack of flexibility seriously impact the ability of most users (who aren’t data analysts) to derive meaningful information from these tools.

BI platforms and data discovery applications are supposed to turn insight into action, informing decisions at every level of the organization. But many organizations are instead left with costly investments that actually create inefficiencies, hinder workflows and exclude the vast majority of employees who could benefit from those operational insights. Now that’s what I like to call a lack of ROI.

Business leaders across a variety of industries — including “legacy” sectors like manufacturing, healthcare and financial services — are demanding better and, in my opinion, they should have gotten it long ago.

It’s time to abandon BI — at least as we currently know it.

Here’s what I’ve learned over the years about why traditional BI platforms and newer tools like data discovery applications fail and what I’ve gathered from companies that moved away from them.

The inefficiency breakdown is killing your company

Traditional BI platforms and data discovery applications require users to exit their workflow to attempt data collection. And, as you can guess, stalling teams in the middle of their workflow creates massive inefficiencies. Instead of having the data you need to make a decision readily available, you have to exit the application, enter another application, secure the data and then reenter the original application.

According to the 2021 State of Analytics report, 99% of knowledge workers had to spend additional time searching for information they couldn’t easily locate in their analytics solution.

#analytics, #business-intelligence, #column, #ec-enterprise-applications, #ec-future-of-work, #enterprise, #information-technology, #real-time-data, #saas, #startups

Noogata raises $12M seed round for its no-code enterprise AI platform

Noogata, a startup that offers a no-code AI solution for enterprises, today announced that it has raised a $12 million seed round led by Team8, with participation from Skylake Capital. The company, which was founded in 2019 and counts Colgate and PepsiCo among its customers, currently focuses on e-commerce, retail and financial services, but it notes that it will use the new funding to power its product development and expand into new industries.

The company’s platform offers a collection of what are essentially pre-built AI building blocks that enterprises can then connect to third-party tools like their data warehouse, Salesforce, Stripe and other data sources. An e-commerce retailer could use this to optimize its pricing, for example, thanks to recommendations from the Noogata platform, while a brick-and-mortar retailer could use it to plan which assortment to allocate to a given location.

Image Credits: Noogata

“We believe data teams are at the epicenter of digital transformation and that to drive impact, they need to be able to unlock the value of data. They need access to relevant, continuous and explainable insights and predictions that are reliable and up-to-date,” said Noogata co-founder and CEO Assaf Egozi. “Noogata unlocks the value of data by providing contextual, business-focused blocks that integrate seamlessly into enterprise data environments to generate actionable insights, predictions and recommendations. This empowers users to go far beyond traditional business intelligence by leveraging AI in their self-serve analytics as well as in their data solutions.”

Image Credits: Noogata

We’ve obviously seen a plethora of startups in this space lately. The proliferation of data — and the advent of data warehousing — means that most businesses now have the fuel to create machine learning-based predictions. What’s often lacking, though, is the talent. There’s still a shortage of data scientists and developers who can build these models from scratch, so it’s no surprise that we’re seeing more startups that are creating no-code/low-code services in this space. The well-funded Abacus.ai, for example, targets about the same market as Noogata.

“Noogata is perfectly positioned to address the significant market need for a best-in-class, no-code data analytics platform to drive decision-making,” writes Team8 managing partner Yuval Shachar. “The innovative platform replaces the need for internal build, which is complex and costly, or the use of out-of-the-box vendor solutions which are limited. The company’s ability to unlock the value of data through AI is a game-changer. Add to that a stellar founding team, and there is no doubt in my mind that Noogata will be enormously successful.”

#analytics, #artificial-intelligence, #big-data, #business, #business-intelligence, #computing, #data-warehouse, #e-commerce, #enterprise, #machine-learning, #noogata, #recent-funding, #salesforce, #startups, #stripe, #team8

TigerGraph raises $105M Series C for its enterprise graph database

TigerGraph, a well-funded enterprise startup that provides a graph database and analytics platform, today announced that it has raised a $105 million Series C funding round. The round was led by Tiger Global and brings the company’s total funding to over $170 million.

“TigerGraph is leading the paradigm shift in connecting and analyzing data via scalable and native graph technology with pre-connected entities versus the traditional way of joining large tables with rows and columns,” said TigerGraph founder and CEO Yu Xu. “This funding will allow us to expand our offering and bring it to many more markets, enabling more customers to realize the benefits of graph analytics and AI.”

Current TigerGraph customers include the likes of Amgen, Citrix, Intuit, Jaguar Land Rover and UnitedHealth Group. Using a SQL-like query language (GSQL), these customers can use the company’s services to store and quickly query their graph databases. At the core of its offerings is the TigerGraphDB database and analytics platform, but the company also offers a hosted service, TigerGraph Cloud, with pay-as-you-go pricing, hosted either on AWS or Azure. With GraphStudio, the company also offers a graphical UI for creating data models and visually analyzing them.

The promise for the company’s database services is that they can scale to tens of terabytes of data with billions of edges. Its customers use the technology for a wide variety of use cases, including fraud detection, customer 360, IoT, AI, and machine learning.
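
Xu’s framing of pre-connected entities versus joined tables is easiest to see with a toy traversal. The snippet below is purely conceptual Python, not GSQL or TigerGraph’s API: hopping from a flagged account to everything within two hops, the kind of query behind fraud-detection and customer-360 use cases, becomes a walk over adjacency lists rather than a chain of table joins.

```python
# Conceptual illustration of graph traversal (not GSQL / TigerGraph code):
# find every entity within two hops of a flagged account.
from collections import deque

edges = {
    "acct:alice": ["card:1111", "device:phone-a"],
    "card:1111": ["acct:alice", "acct:bob"],   # shared card -> possible fraud ring
    "acct:bob": ["card:1111", "device:phone-b"],
    "device:phone-a": ["acct:alice"],
    "device:phone-b": ["acct:bob"],
}

def within_hops(start: str, max_hops: int) -> set[str]:
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_hops:
            continue
        for neighbor in edges.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, depth + 1))
    return seen - {start}

print(within_hops("acct:alice", 2))
# {'card:1111', 'device:phone-a', 'acct:bob'}
```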

Like so many other companies in this space, TigerGraph is benefiting from a tailwind created by the fact that many enterprises have accelerated their digital transformation projects during the pandemic.

“Over the last 12 months with the COVID-19 pandemic, companies have embraced digital transformation at a faster pace driving an urgent need to find new insights about their customers, products, services, and suppliers,” the company explains in today’s announcement. “Graph technology connects these domains from the relational databases, offering the opportunity to shrink development cycles for data preparation, improve data quality, identify new insights such as similarity patterns to deliver the next best action recommendation.”

#amgen, #analytics, #articles, #artificial-intelligence, #aws, #business-intelligence, #ceo, #citrix, #citrix-systems, #computing, #data, #database, #enterprise, #graph-database, #intuit, #jaguar-land-rover, #machine-learning, #tiger-global

Twinco Capital scores €3M for its supply chain finance solution

Twinco Capital, a Madrid and Amsterdam-based startup making it easier to access supply chain finance, has raised €3 million in funding.

Leading the round is Spanish VC fund Mundi Ventures, with participation from previous backer Finch Capital and several unnamed angels. Twinco Capital also has a debt facility with the Spanish investment bank EBN Banco de Negocios, which is common for any type of lending company.

Founded in 2016 by Sandra Nolasco and Carmen Marin Romano, Twinco Capital offers a supply chain finance solution that includes purchase order funding. To do this, it integrates with large corporates on the purchase side and then funds suppliers by paying up to 60% of the purchase order value upfront and the remainder immediately upon delivery.

The entire process is digital, promising a quick decision and fast deployment of funds, and is powered by Twinco’s supply chain analytics and the data it is able to access by partnering with both sides of the supply chain.

“The financing of global supply chains is expensive and inefficient, the burden of the cost is mostly borne by the suppliers and in particular by those that are SMEs in emerging markets,” explains Twinco Capital co-founder and CEO Sandra Nolasco.

“Take any global supply chain, such as apparel, automotive, electronics etc. Exporters in countries like Bangladesh, China or Vietnam that have been supplying European companies for years, with stable commercial relationships. However, their creditworthiness is still measured only on the basis of annual financials, making access to competitive liquidity a major obstacle for growth”.

By having visibility on both sides, including upcoming orders, Twinco provides liquidity to the suppliers “from purchase order to final invoice payment”.

“We do that by analyzing supply chain data – the performance of the suppliers, the network effects between common suppliers and buyers (and many more data points I am not allowed to mention!),” says the Twinco CEO. “In short, using advanced data analytics we can better assess, price and significantly mitigate risk. The good news is that the more transactions we fund, the more suppliers and buyers we add, the more robust is our risk assessment. We believe there is a strong network effect”.

To that end, Twinco makes money by charging a “discount fee” for each purchase order it funds. “Since default rates are a fraction of that fee, we can unlock significant value,” says Nolasco.
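
To make the mechanics concrete, here is a small worked example of the funding flow Nolasco describes. Only the up-to-60% advance figure comes from the company; the order value and the discount fee rate below are invented for illustration:

```python
# Worked example of purchase-order funding, with made-up numbers.
# Only the "up to 60% upfront" figure comes from the article; the order
# value, the fee rate and how the fee is netted out are assumptions.
order_value = 100_000.00     # purchase order from the buyer (assumed)
advance_rate = 0.60          # up to 60% paid when the order is placed
discount_fee_rate = 0.03     # hypothetical fee charged by the funder

advance = order_value * advance_rate             # 60,000 paid upfront
balance_on_delivery = order_value - advance      # 40,000 paid at delivery
discount_fee = order_value * discount_fee_rate   # 3,000 fee on the order
supplier_net = order_value - discount_fee        # 97,000 received in total

print(f"Upfront: {advance:,.2f}, on delivery: {balance_on_delivery:,.2f}, "
      f"fee: {discount_fee:,.2f}, supplier nets: {supplier_net:,.2f}")
```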

Meanwhile, the fintech is also unlocking an asset class for investors and competes with local banks that are much more manual and don’t benefit from increased visibility via network effects. Nolasco says that to ensure interests are aligned, the company uses a portion of equity to also invest in the purchase orders it funds.

#analytics, #europe, #finch-capital, #fundings-exits, #mundi-ventures, #startups, #supply-chain, #tc, #twinco-capital

Iteratively raises $5.4M to help companies build data pipelines they can trust

As companies gather more data, ensuring that they can trust the quality of that data is becoming increasingly important. An analytics pipeline is only as good as the data it collects, after all, and messy data — or outright bugs — can easily lead to issues further down the line.

Seattle-based Iteratively wants to help businesses build data pipelines they can trust. The company today announced a $5.4 million seed funding round led by Google’s AI-centric Gradient Ventures fund. Fika Ventures and early Iteratively investor PSL Ventures also participated, with Gradient Ventures partner Zach Bratun-Glennon joining the company’s board.

Patrick Thompson, Iteratively’s co-founder and CEO, started working on Iteratively about two years ago. Before that, he worked at Atlassian and at Syncplicity, where he met his co-founder Ondrej Hrebicek. After getting started, the team spent six months doing customer discovery, and the theme they picked up on was that companies didn’t trust the data they captured.

“We interviewed a ton of companies who built internal solutions, trying to solve this particular problem. We actually built one at Atlassian, as well, so I was very much intimately familiar with this pain. And so we decided to bring a product to market that really helps alleviate the pain,” he told me.

Image Credits: Iteratively

In a lot of companies, the data producers and data consumers don’t really talk to each other — and if they do, it’s often only through a spreadsheet or wiki. Iteratively aims to provide a collaborative environment to bring these different groups together and create a single source of truth for all stakeholders. “Typically, there’s a handoff process, either on a JIRA ticket or a Confluence page or spreadsheet, where they try to hand over these requirements — and generally, it’s never really implemented correctly, which then causes a lot of pain points down the line,” Thompson explained.

Currently, Iteratively focuses on event streaming data for product and marketing analytics — the kind of data that typically flows into a Mixpanel, Amplitude or Segment. Iteratively itself sits at the origin of the data, say an app, and then validates the data and routes it to whatever third-party solution a company may use. That means the tool sits right where the data is generated, but this setup also means that none of the data ever flows through Iteratively’s own servers.

Image Credits: Iteratively

“We don’t actually see the data,” Thompson stressed. “We’re not a data set processor. We’re a wrapper over the top of your own analytics pipeline or your own third party SaaS tools, but we verify the payloads as they flow through our SDK on the client.”
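
A stripped-down version of that pattern, verify the payload on the client and only then pass it along, might look like the following sketch. It uses the generic open-source jsonschema package rather than Iteratively’s SDK, and the destination call is a placeholder:

```python
# Minimal sketch of client-side event validation before forwarding to an
# analytics destination. Uses the generic jsonschema package; this is not
# Iteratively's SDK, and the destination call is a placeholder.
from jsonschema import validate, ValidationError

SIGNUP_EVENT_SCHEMA = {
    "type": "object",
    "properties": {
        "event": {"const": "user_signed_up"},
        "user_id": {"type": "string"},
        "plan": {"enum": ["free", "pro", "enterprise"]},
    },
    "required": ["event", "user_id", "plan"],
    "additionalProperties": False,
}

def send_to_analytics(event: dict) -> None:
    print(f"Forwarded: {event}")  # placeholder for a Segment/Amplitude/Mixpanel call

def track(event: dict) -> None:
    try:
        validate(instance=event, schema=SIGNUP_EVENT_SCHEMA)
    except ValidationError as err:
        # Flag the broken event for the data team instead of polluting downstream tools.
        print(f"Dropped invalid event: {err.message}")
        return
    send_to_analytics(event)

track({"event": "user_signed_up", "user_id": "u_123", "plan": "pro"})
track({"event": "user_signed_up", "user_id": "u_456", "plan": "platinum"})  # rejected
```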

Over time, though, that may change, he acknowledged, and Iteratively may do some data processing as well, but likely with a focus on metadata and observability.

Since the company doesn’t actually process any of the data itself, it’s charging customers by seat and not based on how many events move through their pipelines, for example. That may obviously change over time as the company looks into doing some data processing on its side as well.

Currently, Iteratively has about 10 employees and plans to grow that to 20 by the end of the year. The company plans to hire across R&D, sales and marketing.

“Iteratively’s software has a unique approach to enabling company-wide collaboration and enforcing data quality,” said Gradient Ventures’ Bratun-Glennon. “Going forward, we believe that intelligent analytics and data-driven business decision-making will differentiate successful companies and best-in-class products. Iteratively’s mission, product and team are poised to give each of their customers these capabilities.”

#analytics, #articles, #artificial-intelligence, #atlassian, #business-intelligence, #co-founder, #data, #data-processing, #developer, #fika-ventures, #gradient-ventures, #information-technology, #mixpanel, #seattle

Hightouch raises $2.1M to help businesses get more value from their data warehouses

Hightouch, a SaaS service that helps businesses sync their customer data across sales and marketing tools, is coming out of stealth and announcing a $2.1 million seed round. The round was led by Afore Capital and Slack Fund, with a number of angel investors also participating.

At its core, Hightouch, which participated in Y Combinator’s Summer 2019 batch, aims to solve the customer data integration problems that many businesses today face.

During their time at Segment, Hightouch co-founders Tejas Manohar and Josh Curl witnessed the rise of data warehouses like Snowflake, Google’s BigQuery and Amazon Redshift — that’s where a lot of Segment data ends up, after all. As businesses adopt data warehouses, they now have a central repository for all of their customer data. Typically, though, this information is then only used for analytics purposes. Together with former Bessemer Ventures investor Kashish Gupta, the team decided to see how they could innovate on top of this trend and help businesses activate all of this information.

Hightouch co-founders Kashish Gupta, Josh Curl and Tejas Manohar.

“What we found is that, with all the customer data inside of the data warehouse, it doesn’t make sense for it to just be used for analytics purposes — it also makes sense for these operational purposes like serving different business teams with the data they need to run things like marketing campaigns — or in product personalization,” Manohar told me. “That’s the angle that we’ve taken with Hightouch. It stems from us seeing the explosive growth of the data warehouse space, both in terms of technology advancements as well as like accessibility and adoption. […] Our goal is to be seen as the company that makes the warehouse not just for analytics but for these operational use cases.”

It helps that all of the big data warehousing platforms have standardized on SQL as their query language — and because the warehousing services have already solved the problem of ingesting all of this data, Hightouch doesn’t have to worry about this part of the tech stack either. And as Curl added, Snowflake and its competitors never quite went beyond serving the analytics use case either.

Image Credits: Hightouch

As for the product itself, Hightouch lets users create SQL queries and then send that data to different destinations  — maybe a CRM system like Salesforce or a marketing platform like Marketo — after transforming it to the format that the destination platform expects.

Expert users can write their own SQL queries for this, but the team also built a graphical interface to help non-developers create their own queries. The core audience, though, is data teams — and they, too, will likely see value in the graphical user interface because it will speed up their workflows as well. “We want to empower the business user to access whatever models and aggregation the data user has done in the warehouse,” Gupta explained.
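
Conceptually, a sync of this sort boils down to three steps: query the warehouse, reshape the rows, push them to the destination’s API. The sketch below uses psycopg2 against a Postgres-compatible warehouse and a placeholder CRM endpoint; it is a rough illustration of the pattern, not Hightouch’s implementation:

```python
# Rough illustration of a warehouse-to-CRM sync: query, reshape, push.
# Connection details, the query and the CRM endpoint are placeholders;
# this is not Hightouch's implementation.
import psycopg2
import requests

CRM_ENDPOINT = "https://crm.example.com/api/contacts/upsert"  # placeholder

def sync_high_value_customers() -> None:
    conn = psycopg2.connect("dbname=warehouse user=analytics")  # assumed DSN
    try:
        with conn.cursor() as cur:
            cur.execute("""
                SELECT email, first_name, lifetime_value
                FROM analytics.customers
                WHERE lifetime_value > 1000
            """)
            rows = cur.fetchall()
    finally:
        conn.close()

    # Reshape warehouse rows into the format the destination expects.
    payload = [
        {"Email": email, "FirstName": first_name, "LTV__c": float(ltv)}
        for email, first_name, ltv in rows
    ]
    requests.post(CRM_ENDPOINT, json=payload, timeout=30)

if __name__ == "__main__":
    sync_high_value_customers()
```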

The company is agnostic to how and where its users want to operationalize their data, but the most common use cases right now focus on B2C companies, where marketing teams often use the data, as well as sales teams at B2B companies.

Image Credits: Hightouch

“It feels like there’s an emerging category here of tooling that’s being built on top of a data warehouse natively, rather than being a standard SaaS tool where it is its own data store and then you manage a secondary data store,” Curl said. “We have a class of things here that connect to a data warehouse and make use of that data for operational purposes. There’s no industry term for that yet, but we really believe that that’s the future of where data engineering is going. It’s about building off this centralized platform like Snowflake, BigQuery and things like that.”

“Warehouse-native,” Manohar suggested as a potential name here. We’ll see if it sticks.

Hightouch originally raised its round after its participation in the Y Combinator demo day but decided not to disclose it until it felt like it had found the right product/market fit. Current customers include the likes of Retool, Proof, Stream and Abacus, in addition to a number of significantly larger companies the team isn’t able to name publicly.

#afore-capital, #analytics, #articles, #big-data, #business-intelligence, #cloud, #computing, #data-management, #data-warehouse, #developer, #enterprise, #information, #personalization, #recent-funding, #slack-fund, #software-as-a-service, #startups, #tc

Microsoft launches Azure Purview, its new data governance service

As businesses gather, store and analyze an ever-increasing amount of data, tools for helping them discover, catalog, track and manage how that data is shared are also becoming increasingly important. With Azure Purview, Microsoft is launching a new data governance service into public preview today that brings together all of these capabilities in a new data catalog with discovery and data governance features.

As Rohan Kumar, Microsoft’s corporate VP for Azure Data, told me, this has become a major pain point for enterprises. While they may be very excited about getting started with data-heavy technologies like predictive analytics, their data- and privacy-focused executives want to make sure that the way the data is used is compliant and that the company has received the right permissions to use its customers’ data, for example.

In addition, companies also want to make sure that they can trust their data and know who has access to it and who made changes to it.

“[Purview] is a unified data governance platform which automates the discovery of data, cataloging of data, mapping of data, lineage tracking — with the intention of giving our customers a very good understanding of the breadth of the data estate that exists to begin with, and also to ensure that all these regulations that are there for compliance, like GDPR, CCPA, etc, are managed across an entire data estate in ways which enable you to make sure that they don’t violate any regulation,” Kumar explained.

At the core of Purview is its catalog that can pull in data from the usual suspects like Azure’s various data and storage services but also third-party data stores including Amazon’s S3 storage service and on-premises SQL Server. Over time, the company will add support for more data sources.

Kumar described this process as a ‘multi-semester investment,’ so the capabilities the company is rolling out today are only a small part of what’s on the overall roadmap already. With this first release today, the focus is on mapping a company’s data estate.

Image Credits: Microsoft

“Next [on the roadmap] is more of the governance policies,” Kumar said. “Imagine if you want to set things like ‘if there’s any PII data across any of my data stores, only this group of users has access to it.’ Today, setting up something like that is extremely complex and most likely you’ll get it wrong. That’ll be as simple as setting a policy inside of Purview.”
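
To picture what such a rule might express, here is a purely illustrative policy-as-code sketch in Python. It is emphatically not Purview’s policy syntax, which had not shipped at the time of writing; it only captures the intent Kumar describes:

```python
# Purely illustrative policy-as-code sketch of the rule Kumar describes
# ("any PII data, only this group can access it"). This is NOT Purview's
# actual policy syntax; names and structure here are invented.
pii_access_policy = {
    "name": "restrict-pii-to-privacy-team",
    "applies_to": {"classification": "PII", "scope": "all-registered-data-stores"},
    "allow": {"groups": ["privacy-officers"]},
}

def can_access(user_groups: set[str], asset_classifications: set[str]) -> bool:
    """Toy evaluator: PII-classified assets are readable only by allowed groups."""
    if "PII" in asset_classifications:
        return bool(user_groups & set(pii_access_policy["allow"]["groups"]))
    return True

print(can_access({"data-engineers"}, {"PII"}))          # False
print(can_access({"privacy-officers"}, {"PII"}))        # True
print(can_access({"data-engineers"}, {"sales-data"}))   # True
```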

In addition to launching Purview, the Azure team also today launched Azure Synapse, Microsoft’s next-generation data warehousing and analytics service, into general availability. The idea behind Synapse is to give enterprises — and their engineers and data scientists — a single platform that brings together data integration, warehousing and big data analytics.

“With Synapse, we have this one product that gives a completely no code experience for data engineers, as an example, to build out these [data] pipelines and collaborate very seamlessly with the data scientists who are building out machine learning models, or the business analysts who build out reports for things like Power BI.”

Among Microsoft’s marquee customers for the service, which Kumar described as one of the fastest-growing Azure services right now, are FedEx, Walgreens, Myntra and P&G.

“The insights we gain from continuous analysis help us optimize our network,” said Sriram Krishnasamy, senior vice president, strategic programs at FedEx Services. “So as FedEx moves critical high value shipments across the globe, we can often predict whether that delivery will be disrupted by weather or traffic and remediate that disruption by routing the delivery from another location.”

Image Credits: Microsoft

#analytics, #big-data, #business-intelligence, #cloud, #computing, #data, #data-management, #data-protection, #developer, #enterprise, #general-data-protection-regulation, #information, #microsoft, #rohan-kumar, #tc

Amazon S3 Storage Lens gives IT visibility into complex S3 usage

As your S3 storage requirements grow, it gets harder to understand exactly what you have, and this is especially true when your storage crosses multiple regions. This could have broad implications for administrators, who are forced to build their own solutions to get that missing visibility. AWS changed that this week when it announced a new product called Amazon S3 Storage Lens, a way to understand highly complex S3 storage environments.

The tool provides analytics that help you understand what’s happening across your S3 object storage installations, and to take action when needed. “This is the first cloud storage analytics solution to give you organization-wide visibility into object storage, with point-in-time metrics and trend lines as well as actionable recommendations,” the company wrote in a blog post announcing the new service.

Amazon S3 Storage Lens Console

Image Credits: Amazon

The idea is to present a set of 29 metrics in a dashboard that help you “discover anomalies, identify cost efficiencies and apply data protection best practices,” according to the company. IT administrators can get a view of their storage landscape and can drill down into specific instances when necessary, such as if there is a problem that requires attention. The product comes out of the box with a default dashboard, but admins can also create their own customized dashboards, and even export S3 Lens data to other Amazon tools.
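
For teams that want those numbers outside the console, the Storage Lens configurations are also reachable from the AWS SDK. A rough sketch with boto3’s s3control client follows; the account ID is a placeholder, and the exact request and response fields should be checked against the current API reference:

```python
# Rough sketch: enumerate Storage Lens configurations for an account via
# boto3's s3control client. The account ID is a placeholder, and the exact
# request/response fields should be verified against the current API docs.
import boto3

ACCOUNT_ID = "123456789012"  # placeholder

s3control = boto3.client("s3control")
response = s3control.list_storage_lens_configurations(AccountId=ACCOUNT_ID)

for config in response.get("StorageLensConfigurationList", []):
    # Each entry describes one dashboard (the default one or a custom one).
    print(config)
```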

For companies with complex storage requirements — as in thousands or even tens of thousands of S3 storage instances — that have had to kludge together ways to understand what’s happening across those systems, this gives them a single view across it all.

S3 Storage Lens is now available in all AWS regions, according to the company.

#amazon-s3, #analytics, #aws, #cloud, #dashboards, #enterprise, #storage

Databricks launches SQL Analytics

AI and data analytics company Databricks today announced the launch of SQL Analytics, a new service that makes it easier for data analysts to run their standard SQL queries directly on data lakes. And with that, enterprises can now easily connect their business intelligence tools like Tableau and Microsoft’s Power BI to these data repositories as well.

SQL Analytics will be available in public preview on November 18.

In many ways, SQL Analytics is the product Databricks has long been looking to build and that brings its concept of a ‘lake house’ to life. It combines the performance of a data warehouse, where you store data after it has already been transformed and cleaned, with a data lake, where you store all of your data in its raw form. The data in the data lake — a concept that Databricks co-founder and CEO Ali Ghodsi has long championed — is typically only transformed when it gets used. That makes data lakes cheaper, but also a bit harder to handle for users.

Image Credits: Databricks

“We’ve been saying Unified Data Analytics, which means unify the data with the analytics. So data processing and analytics, those two should be merged. But no one picked that up,” Ghodsi told me. But ‘lake house’ caught on as a term.

“Databricks has always offered data science, machine learning. We’ve talked about that for years. And with Spark, we provide the data processing capability. You can do [extract, transform, load]. That has always been possible. SQL Analytics enables you to now do the data warehousing workloads directly, and concretely, the business intelligence and reporting workloads, directly on the data lake.”

The general idea here is that with just one copy of the data, you can enable both traditional data analyst use cases (think BI) and the data science workloads (think AI) Databricks was already known for. Ideally, that makes both use cases cheaper and simpler.

The service sits on top of an optimized version of Databricks’ open-source Delta Lake storage layer, which enables it to complete queries quickly. Delta Lake also provides auto-scaling endpoints to keep query latency consistent, even under high loads.
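
The idea of one copy of the data serving both BI and data-science workloads is straightforward to picture with PySpark and a Delta table. The snippet below is a generic Delta Lake example run in a Delta-enabled Spark environment (which Databricks provides by default), not the SQL Analytics workspace itself, and the path and table names are placeholders:

```python
# Generic Delta Lake example (not the SQL Analytics UI): register a Delta
# table that lives in the data lake and query it with standard SQL.
# Assumes a Delta-enabled Spark environment; path and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()

# Point a SQL-addressable table at data stored in the lake as Delta.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales
    USING DELTA
    LOCATION 's3://my-data-lake/sales'
""")

# The same copy of the data serves a BI-style aggregation...
spark.sql("""
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
""").show()

# ...and can still be loaded as a DataFrame for data-science workloads.
df = spark.table("sales")
print(df.count())
```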

While data analysts can query these data sets directly, using standard SQL, the company also built a set of connectors to BI tools. Its BI partners include Tableau, Qlik, Looker and Thoughtspot, as well as ingest partners like Fivetran, Fishtown Analytics, Talend and Matillion.

Image Credits: Databricks

“Now more than ever, organizations need a data strategy that enables speed and agility to be adaptable,” said Francois Ajenstat, Chief Product Officer at Tableau. “As organizations are rapidly moving their data to the cloud, we’re seeing growing interest in doing analytics on the data lake. The introduction of SQL Analytics delivers an entirely new experience for customers to tap into insights from massive volumes of data with the performance, reliability and scale they need.”

In a demo, Ghodsi showed me what the new SQL Analytics workspace looks like. It’s essentially a stripped-down version of the standard code-heavy experience that Databricks users are familiar with. Unsurprisingly, SQL Analytics provides a more graphical experience that focuses on visualizations rather than Python code.

While there are already some data analysts on the Databricks platform, this obviously opens up a large new market for the company — something that would surely bolster its plans for an IPO next year.

#ali-ghodsi, #analytics, #apache-spark, #artificial-intelligence, #business-intelligence, #cloud, #data-analysis, #data-lake, #data-management, #data-processing, #data-science, #data-warehouse, #databricks, #democrats, #enterprise, #fishtown-analytics, #fivetran, #information, #looker, #machine-learning, #python, #sql, #tableau, #talend

Rockset announces $40M Series B as data analytics solution gains momentum

Rockset, a cloud-native analytics company, announced a $40 million Series B investment today led by Sequoia with help from Greylock, the same two firms that financed its Series A. The startup has now raised a total of $61.5 million, according to the company.

As co-founder and CEO Venkat Venkataramani told me at the time of the Series A in 2018, there is a lot of manual work involved in getting data ready to use and it acts as a roadblock to getting to real insight. He hoped to change that with Rockset.

“We’re building out our service with innovative architecture and unique capabilities that allows full-featured fast SQL directly on raw data. And we’re offering this as a service. So developers and data scientists can go from useful data in any shape, any form to useful applications in a matter of minutes. And it would take months today,” he told me in 2018.

In fact, “Rockset automatically builds a converged index on any data — including structured, semi-structured, geographical and time series data — for high-performance search and analytics at scale,” the company explained.

It seems to be resonating with investors and customers alike as the company raised a healthy B round and business is booming. Rockset supplied a few metrics to illustrate this. For starters, revenue grew 290% in the last quarter. While they didn’t provide any foundational numbers for that percentage growth, it is obviously substantial.

In addition, the startup reports adding hundreds of new users, again not nailing down any specific numbers, and queries on the platform are up 313%. Without specifics, it’s hard to know what that means, but that seems like healthy growth for an early stage startup, especially in this economy.

Mike Vernal, a partner at Sequoia, sees a company helping to get data to work faster than other solutions, which require a lot of handling first. “Rockset, with its innovative new approach to indexing data, has quickly emerged as a true leader for real-time analytics in the cloud. I’m thrilled to partner with the company through its next phase of growth,” Vernal said in a statement.

The company was founded in 2016 by the creators of RocksDB. The startup had previously raised a $3 million seed round when they launched the company and the $18.5 million A round in 2018.

#analytics, #cloud, #cloud-native, #data, #enterprise, #fundings-exits, #greylock, #recent-funding, #rockset, #sequoia, #startups, #tc

SimilarWeb raises $120M for its AI-based market intelligence platform for sites and apps

Israeli startup SimilarWeb has made a name for itself with an AI-based platform that lets sites and apps track and understand traffic not just on their own sites, but those of its competitors. Now, it’s taking the next step in its growth. The startup has raised $120 million, funding it will use to continue expanding its platform both through acquisitions and investing in its own R&D, with a focus on providing more analytics services to larger enterprises alongside its current base of individuals and companies of all sizes that do business on the web.

Co-led by ION Crossover Partners and Viola Growth, the round doubles the total amount that the startup has raised to date to $240 million. Or Offer, SimilarWeb’s founder and CEO, said in an interview that it was not disclosing its valuation this time around except to say that his company is now “playing in the big pool.” It counts more than half of the Fortune 100 as customers, with Walmart, P&G, Adidas and Google among them.

For some context, it hit an $800 million valuation in its last equity round, in 2017.

SimilarWeb’s technology competes with other analytics and market intelligence providers, ranging from the likes of Nielsen and ComScore through to the Apptopias of the world, in that, at its most basic level, it gives users a dashboard with insights into where people are going on desktop and mobile. Where it differs, Offer said, is in how it gets to its information, and what else it’s doing in the process.

For starters, it focuses not just on how many people are visiting, but also on what is triggering that activity — the “why,” as it were. Using a host of AI tech such as machine learning algorithms and deep learning — like a lot of tech out of Israel, it’s being built by people with deep expertise in this area — Offer says that SimilarWeb is crunching data from a number of different sources to extrapolate its insights.

He declined to give much detail on those sources but told me that he cheered the arrival of privacy gates and cookie lists for helping ferret out, expose and sometimes eradicate some of the more nefarious “analytics” services out there, and said that SimilarWeb has not been affected at all by that swing to more data protection, since it’s not an analytics service, strictly speaking, and doesn’t sniff data on sites in the same way. It’s also exploring widening its data pool, he added:

“We are always thinking about what new signals we could use,” he said. “Maybe they will include CDNs. But it’s like Google with its rankings in search. It’s a never ending story to try to get the highest accuracy in the world.”

The global health pandemic has driven a huge amount of activity on the web this year, with people turning to sites and apps not just for leisure — something to do while staying indoors, to offset all the usual activities that have been cancelled — but for business, whether it be consumers using e-commerce services for shopping, or workers taking everything online and to the cloud to continue operating.

That has also seen a boost of business for all the various companies that help the wheels turn on that machine, SimilarWeb included.

“Consumer behavior is changing dramatically, and all companies need better visibility,” said Offer. “It started with toilet paper and hand sanitizer, then moved to desks and office chairs, but now it’s not just e-commerce but everything. Think about big banks, whose business was 70% offline and is now 70-80% online. Companies are building and undergoing a digital transformation.”

That in turn is driving more people to understand how well their web presence is working, he said, with the basic big question being: “What is my marketshare, and how does that compare to my competition? Everything is about digital visibility, especially in times of change.”

Like many other companies, SimilarWeb did see an initial dip in business, Offer said, and to that end the company has taken on some debt as part of Israel’s Paycheck Protection Program, to help safeguard some jobs that needed to be furloughed. But he added that most of its customers prior to the pandemic kicking off are now back, along with customers from new categories that hadn’t been active much before, like automotive portals.

That change in customer composition is also opening some doors of opportunity for the company. Offer noted that in recent months, a lot of large enterprises — which might have previously used SimilarWeb’s technology indirectly, via a consultancy, for example — have been coming to the company direct.

“We’ve started a new advisory service [where] our own expert works with a big customer that might have more deep and complex questions about the behaviour we are observing. They are questions all big businesses have right now.” The service sounds like a partly educational effort — teaching companies that are not necessarily digital-first to be more proactive — and partly consulting.

New customer segments, and new priorities in the world of business, are two of the things that drove this round, say investors.

“SimilarWeb was always an incredible tool for any digital professional,” said Gili Iohan of ION Crossover Partners, in a statement. “But over the last few months it has become apparent that traffic intelligence — the unparalleled data and digital insight that SimilarWeb offers — is an absolute essential for any company that wants to win in the digital world.”

As for acquisitions, SimilarWeb has historically made these to accelerate its technical march. For example, in 2015 it acquired Quettra to move deeper into mobile analytics, and it acquired Swayy to move into content discovery insights (key for e-commerce intelligence). Offer would not go into much detail about what it has identified as a further target, but given that there are quite a lot of companies building tech in this area currently, there may well be a case for some consolidation around bigger platforms that combine some of these features and functionality. Offer said the company was looking at “companies with great data and digital intelligence, with a good product. There are a lot of opportunities right now on the table.”

The company will also be doing some hiring, with the plan to be to add 200 more people globally by January (it has around 600 employees today).

“Since we joined the company three years ago, SimilarWeb has executed a strategic transformation from a general-purpose measurement platform to vertical-based solutions, which has significantly expanded its market opportunity and generated immense customer value,” said Harel Beit-On, Founder and General Partner at Viola Growth, in a statement. “With a stellar management team of accomplished executives, we believe this round positions the company to own the digital intelligence category, and capitalize on the acceleration of the digital era.”

#analytics, #artificial-intelligence, #ecommerce, #enterprise, #europe, #market-intelligence, #recent-funding, #similarweb, #startups, #web-traffic

Google Meet and other Google services go down (Updated)

Google’s engineers aren’t having a good day today. This afternoon, a number of Google services went offline or are barely reachable. These services include Google Meet, Drive, Docs, Analytics, Classroom and Calendar, for example.

While Google’s own status dashboards don’t show any issues, we’re seeing reports from around the world from people who aren’t able to reach any of these services. Best we can tell, these issues started around 6pm PT.

It’s unusual for this number of Google services to go down at once. Usually, it’s only a single service that is affected. This time around, however, it’s clearly a far broader issue.

We’ve reached out to Google and will update this post once we hear more about what happened.

Update (6:30pm PT): and we’re back. It looks like most Google services are now recovering.

#analytics, #calendar, #classroom, #companies, #docs, #drive, #gmail, #google, #google-meet, #san-francisco-bay-area, #tc, #websites, #x

Mailchimp launches new AI tools as it continues its transformation to marketing platform

Mailchimp may have started out as an easy-to-use newsletter tool, but that was almost 20 years ago. The company still does email, but at its core it is now a marketing automation platform for small businesses that also offers a website builder, basic online stores, digital ad support and analytics to make sense of it all. As before, the company’s main goal is to make all these features easy to use for small business users.

Image Credits: Mailchimp

Today, Mailchimp, which has never taken outside funding, is taking the next step in its own transformation with the launch of a set of AI-based tools that give small businesses easy access to the same kind of capabilities that their larger competitors now use. That includes personalized product recommendations for shoppers and forecasting tools for behavioral targeting to see which users are most likely to buy something, for example. But there’s now also a new AI-backed tool to help business owners design their own visual asset (based in part on its acquisition of Sawa), as well as a tool to help them write better email subject lines.

There’s also a new tool that helps businesses choose the next best action. It looks at all of the data the service aggregates and gives users actionable recommendations for how to improve their email campaign performance.

Image Credits: Mailchimp

“The journey to get here started about four years ago,” Mailchimp’s founding CEO Ben Chestnut told me. “We were riding high. Email was doing amazing for us. And things look so good. And I had a choice, I felt I could sell the business and make a lot of money. I had some offers. Or I could just coast, honestly. I could just be a hero in email and keep it simple and just keep raking in the money. Or I could take on another really tough challenge, which would be act two of  Mailchimp. And I honestly didn’t know what that would be. To be honest with you, that was four years ago, it could have been anything really.”

But after talking to the team, including John Foreman, the head of data analytics at the time and now Mailchimp’s CPO, Chestnut put the company on this new path to go after the marketing automation space. In part, he told me, he did so because he noted that the email space was getting increasingly crowded. “You know how that ends. I mean, you can’t stay there forever with this many competitors. So I knew that we had to up our game,” he said.

And that meant going well beyond email and building numerous new products.

Image Credits: Mailchimp

“It was a huge transformation for us,” Chestnut acknowledged. “We had to get good at building for other customer segments at the time, like e-commerce customers and others. And that was new for us, too. It’s all kinds of new disciplines for us. To inflict that kind of change on your employees is very, very rough. I just can’t help but look back with gratitude that my employees were willing to go on this journey with me. And they actually had faith in me and this release — this fall release — is really the culmination of everything we’ve been working on for four years to me.”

One thing that helped was that Mailchimp already had e-commerce customers — and as Chestnut noted, they were pushing the system to its limit. Only a few years ago, the culture at Mailchimp looked at them as somewhat annoying, though, Chestnut admitted, because they were quite demanding. They didn’t even make the company a lot of money either. At the time, non-profits were Mailchimp’s best customers, but they weren’t pushing the technology to its limits.

Despite this transformation, Mailchimp hasn’t made a lot of acquisitions to accelerate this process. Chestnut argues that a lot of what it is doing — say, adding direct mail — is more or less an extension of what it was already good at. But it did make some small AI and ML acquisitions to bring the right expertise in-house, as well as two e-commerce acquisitions, including Lemonstand. Most recently, Mailchimp acquired Courier, a British magazine, newsletter and podcast, marking its first move into the print business.

With this new set of products and services, Mailchimp is now aiming to give small businesses access to the same capabilities the larger e-commerce players have long had, but without the complexity.

To build tools based on machine learning, one needs data — and that’s something Mailchimp already had.

“We’ve been doing marketing for decades,” Mailchimp CPO Foreman said. “And we have millions of small businesses on the platform. And so not only do we build all these tools ourselves, which allows us to integrate them from a visual design perspective — they’re not necessarily acquisitions — but we have this common data set from years and years of doing marketing across millions of businesses, billions of customers we’re talking to, and so we thought, how can we use intelligence — artificial intelligence, machine learning, etc. — to also sand down how all of these tools connect.”

Chestnut says he isn’t likely to put the company on a similar transformation anytime soon. “I really believe you can only take on one major transformation per decade,” he said. “And so you better pick the right one and you better invest it. We’re all in on this all-in-one marketing platform that’s e-commerce enabled. That is unique enough. And now what I’m trying to get my company to do is go deep.”

#advertising-tech, #analytics, #articles, #artificial-intelligence, #automation, #ben-chestnut, #business, #cloud-applications, #computing, #email,