3 keys to pricing early-stage SaaS products

I’ve met hundreds of founders over the years, and most, particularly early-stage founders, share one common go-to-market gripe: Pricing.

For enterprise software, traditional pricing methods like per-seat models are often easier to figure out for products that are hyper-specific, especially those used by people in essentially the same way, such as Zoom or Slack. However, it’s a different ball game for startups that offer services or products that are more complex.

Most startups struggle with a per-seat model because their products, unlike Zoom and Slack, are used in a litany of ways. Salesforce, for example, employs regular seat licenses and admin licenses — customers can opt for lower pricing for solutions that have low-usage parts — while other products are priced based on negotiation as part of annual renewals.
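
To make the mixed-license idea concrete, here is a minimal sketch of how a blended per-seat quote might be computed. The tier names and prices are hypothetical, not Salesforce's (or any vendor's) actual rates.

```python
# Hypothetical per-seat pricing with mixed license tiers.
# Tier names and prices are illustrative, not any vendor's real rates.

PRICE_PER_SEAT_PER_MONTH = {
    "full": 150,       # regular seat license
    "admin": 300,      # admin license
    "low_usage": 25,   # discounted license for low-usage parts of the product
}

def annual_contract_value(seats_by_tier: dict) -> int:
    """Sum the monthly seat fees across tiers and annualize them."""
    monthly = sum(PRICE_PER_SEAT_PER_MONTH[tier] * count
                  for tier, count in seats_by_tier.items())
    return monthly * 12

# Example: 40 regular users, 2 admins, 100 occasional users.
print(annual_contract_value({"full": 40, "admin": 2, "low_usage": 100}))  # 109200
```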

You may have a strong champion in a CIO you’re selling to or a very friendly person handling procurement, but it won’t matter if the pricing can’t be easily explained and understood. Complicated or unclear pricing adds more friction.

Early pricing discussions should center around the buyer’s perspective and the value the product creates for them. It’s important for founders to think about the output and the outcome, and a number they can reasonably defend to customers moving forward. Of course, self-evaluation is hard, especially when you’re asking someone else to pay you for something you’ve created.

This process will take time, so here are three tips to smooth the ride.

Pricing is a journey

Pricing is not a fixed exercise. The enterprise software business involves a lot of intangible aspects, and a software product’s perceived value, quality, and user experience can be highly variable.

The pricing journey is long and, despite what some founders might think, jumping head-first into customer acquisition isn’t the first stop. Instead, step one is making sure you have a fully fledged product.

If you’re a late-seed or Series A company, you’re focused on landing those first 10-20 customers and racking up some wins to showcase in your investor and board deck. But when you grow your organization to the point where the CEO isn’t the only person selling, you’ll want to have your go-to-market position figured out.

Many startups fall into the trap of thinking: “We need to figure out what pricing looks like, so let’s ask 50 hypothetical customers how much they would pay for a solution like ours.” I don’t agree with this approach, because the product hasn’t been finalized yet. You haven’t figured out product-market fit or product messaging and you want to spend a lot of time and energy on pricing? Sure, revenue is important, but you should focus on finding the path to accruing revenue versus finding a strict pricing model.


Real-time database platform SingleStore raises $80M more, now at a $940M valuation

Organizations are swimming in data these days, and so solutions to help manage and use that data in more efficient ways will continue to see a lot of attention and business. In the latest development, SingleStore — which provides a platform to enterprises to help them integrate, monitor and query their data as a single entity, regardless of whether that data is stored in multiple repositories — is announcing another $80 million in funding, money that it will be using to continue investing in its platform, hiring more talent and overall business expansion. Sources close to the company tell us that the company’s valuation has grown to $940 million.

The round, a Series F, is being led by Insight Partners, with new investor Hewlett Packard Enterprise, and previous backers Khosla Ventures, Dell Capital, Rev IV, Glynn Capital, and GV (formerly Google Ventures) also participating. The startup has raised $264 million to date, including an $80 million Series E as recently as last December, just on the heels of rebranding from MemSQL.

The fact that there are three major strategic investors in this Series F — HPE, Dell and Google — may say something about the traction that SingleStore is seeing, but so too do its numbers: a 300%+ increase in new customer acquisition for its cloud service and 150%+ year-over-year growth in cloud revenue.

Raj Verma, SingleStore’s CEO, said in an interview that its cloud revenues have grown by 150% year over year and now account for some 40% of all revenues (up from 10% a year ago). New customer numbers, meanwhile, have grown by over 300%.

“The flywheel is now turning around,” Verma said. “We didn’t need this money. We’ve barely touched our Series E. But I think there has been a general sentiment among our board and management that we are now ready for the prime time. We think SingleStore is one of the best kept secrets in the database market. Now we want to aggressively be an option for people looking for a platform for intensive data applications or if they want to consolidate databases to 1 from 3, 5 or 7 repositories. We are where the world is going: real-time insights.”

Database management — and the need for more efficient and cost-effective tools to handle it — is an ever-growing priority, one that definitely got a fillip in the last 18 months with Covid-19 pushing people into more remote working environments. That means SingleStore is not without competitors, with others in the same space including Amazon, Microsoft, Snowflake, PostgreSQL, MySQL, Redis and more. Others like Firebolt are tackling the challenges of handling large, disparate data repositories from another angle. (Some of these, I should point out, are also partners: SingleStore works with data stored on AWS, Microsoft Azure, Google Cloud Platform, and Red Hat, and Verma describes those who do compute work as “not database companies; they are using their database capabilities for consumption for cloud compute.”)

But the company has carved a place for itself with enterprises and has thousands now on its books, including GE, IEX Cloud, Go Guardian, Palo Alto Networks, EOG Resources, and SiriusXM + Pandora.

“SingleStore’s first-of-a-kind cloud database is unmatched in speed, scale, and simplicity by anything in the market,” said Lonne Jaffe, managing director at Insight Partners, in a statement. “SingleStore’s differentiated technology allows customers to unify real-time transactions and analytics in a single database.” Vinod Khosla from Khosla Ventures added that “SingleStore is able to reduce data sprawl, run anywhere, and run faster with a single database, replacing legacy databases with the modern cloud.”


Evervault’s ‘encryption as a service’ is now open access

Dublin-based Evervault, a developer-focused security startup which sells encryption via API and is backed by a raft of big-name investors including the likes of Sequoia, Kleiner Perkins and Index Ventures, is coming out of closed beta today — announcing open access to its encryption engine.

The startup says some 3,000 developers are on its waitlist to kick the tyres of its encryption engine, which it calls E3.

Among “dozens” of companies in its closed preview are drone delivery firm Manna, fintech startup Okra, and healthtech company Vital. Evervault says it’s targeting its tools at developers at companies with a core business need to collect and process four types of data: Identity & contact data; Financial & transaction data; Health & medical data; and Intellectual property.

The first suite of products it offers on E3 are called Relay and Cages; the former providing a new way for developers to encrypt and decrypt data as it passes in and out of apps; the latter offering a secure method — using trusted execution environments running on AWS — to process encrypted data by isolating the code that processes plaintext data from the rest of the developer stack.

Evervault is the first company to get a product deployed on Amazon Web Services’ Nitro Enclaves, per founder Shane Curran.

“Nitro Enclaves are basically environments where you can run code and prove that the code that’s running in the data itself is the code that you’re meant to be running,” he tells TechCrunch. “We were the first production deployment of a product on AWS Nitro Enclaves — so in terms of the people actually taking that approach we’re the only ones.”

It shouldn’t be news to anyone to say that data breaches continue to be a serious problem online. And unfortunately it’s sloppy security practices by app makers — or even a total lack of attention to securing user data — that’s frequently to blame when plaintext data leaks or is improperly accessed.

Evervault’s fix for this unfortunate ‘feature’ of the app ecosystem is to make it super simple for developers to bake in encryption via an API — taking the strain of tasks like managing encryption keys. (“Integrate Evervault in 5 minutes by changing a DNS record and including our SDK,” is the developer-enticing pitch on its website.)

“At the high level what we’re doing… is we’re really focusing on getting companies from [a position of] not approaching security and privacy from any perspective at all — up and running with encryption so that they can actually, at the very least, start to implement the controls,” says Curran.

“One of the biggest problems that companies have these days is they basically collect data and the data sort of gets sprawled across both their implementation and their test sets as well. The benefit of encryption is that  you know exactly when data was accessed and how it was accessed. So it just gives people a platform to see what’s happening with the data and start implementing those controls themselves.”

With C-Suite executives paying increasing mind to the need to properly secure data — thanks to years of horrific data breach scandals (and breach déjà vu), and also because of updated data protection laws like Europe’s General Data Protection Regulation (GDPR) which has beefed up penalties for lax security and data misuse — a growing number of startups are now pitching services that promise to deliver ‘data privacy’, touting tools they claim will protect data while still enabling developers to extract useful intel.

Evervault’s website also deploys the term “data privacy” — which it tells us it defines to mean that “no unauthorized party has access to plaintext user/customer data; users/customers and authorized developers have full control over who has access to data (including when and for what purpose); and, plaintext data breaches are ended”. (So encrypted data could, in theory, still leak — but the point is the information would remain protected as a result of still being robustly encrypted.)

Among a number of techniques being commercialized by startups in this space is homomorphic encryption — a process that allows for analysis of encrypted data without the need to decrypt the data.
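
As a rough intuition for what computing on encrypted data means, here is a toy sketch using textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields an encryption of the product of the plaintexts. This is purely illustrative (textbook RSA with tiny primes is completely insecure, and fully homomorphic schemes support arbitrary computation rather than a single operation); it is not how Evervault or any production system would implement the technique.

```python
# Toy illustration of a homomorphic property: textbook RSA is multiplicatively
# homomorphic, so Enc(a) * Enc(b) mod n decrypts to a * b.
# Insecure demo values (the classic small-prime example); never use in practice.

n, e, d = 3233, 17, 2753   # n = 61 * 53, e public exponent, d private exponent

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

enc_a, enc_b = encrypt(7), encrypt(3)
enc_product = (enc_a * enc_b) % n      # multiply the ciphertexts only

assert decrypt(enc_product) == 7 * 3   # the computing party never saw 7, 3, or 21 in the clear
print(decrypt(enc_product))            # 21
```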

Evervault’s first offering doesn’t go that far — although its ‘encryption manifesto’ notes that it’s keeping a close eye on the technique. And Curran confirms it is likely to incorporate the approach in time. But he says its first focus has been to get E3 up and running with an offering that can help a broad swathe of developers.

“Fully homomorphic [encryption] is great. The biggest challenge if you’re targeting software developers who are building normal services it’s very hard to build general purpose applications on top of it. So we take another approach — which is basically using trusted execution environments. And we worked with the Amazon Web Services team on being their first production deployment of their new product called Nitro Enclaves,” he tells TechCrunch.

“The bigger focus for us is less about the underlying technology itself and it’s more about taking what the best security practices are for companies that are already investing heavily in this and just making them accessible to average developers who don’t even know how encryption works,” Curran continues. “That’s where we get the biggest nuance of Evervault vs some of these other privacy and security companies — we build for developers who don’t normally think about security when they’re building things and try to build a great experience around that… so it’s really just about bridging the gap between ‘the state of the art’ and bringing it to average developers.”

“Over time fully homomorphic encryption is probably a no-brainer for us but both in terms of performance and flexibility for your average developer to get up and running it didn’t really make sense for us to build on it in its current form. But it’s something we’re looking into. We’re really looking at what’s coming out of academia — and if we can fit it in there. But in the meantime it’s all this trusted execution environment,” he adds.

Curran suggests Evervault’s main competitor at this point is open source encryption libraries — so basically developers opting to ‘do’ the encryption piece themselves. Hence it’s zeroing in on the service aspect of its offering; taking on encryption management tasks so developers don’t have to, while also reducing their security risk by ensuring they don’t have to touch data in the clear.
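
For a sense of what that do-it-yourself baseline looks like, here is a minimal sketch of field-level encryption using the open source Python cryptography library (Fernet). Note that the developer ends up owning key management and handling the plaintext before encrypting it, which is exactly the gap Curran describes.

```python
# Minimal DIY field-level encryption with the open source `cryptography` library.
# The developer must generate, store and rotate the key, and the plaintext passes
# through their code before encryption -- the gap Curran points to.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # key management is now the developer's problem
fernet = Fernet(key)

plaintext = b"4242 4242 4242 4242"   # sensitive field arrives in the clear
token = fernet.encrypt(plaintext)    # store `token` in the database, not the plaintext

# Later, an authorized code path decrypts it again:
assert fernet.decrypt(token) == plaintext
```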

“When we’re looking at those sort of developers — who’re already starting to think about doing it themselves — the biggest differentiator with Evervault is, firstly the speed of integration, but more importantly it’s the management of encrypted data itself,” Curran suggests. “With Evervault we manage the keys but we don’t store any data and our customers store encrypted data but they don’t store keys. So it means that even if they want to encrypt something with Evervault they never have all the data themselves in plaintext — whereas with open source encryption they’ll have to have it at some point before they do the encryption. So that’s really the base competitor that we see.”

“Obviously there are some other projects out there — like Tim Berners-Lee’s Solid project and so on. But it’s not clear that there’s anybody else taking the developer-experience focused approach to encryption specifically. Obviously there’s a bunch of API security companies… but encryption through an API is something we haven’t really come across in the past with customers,” he adds.

While Evervault’s current approach sees app makers’ data hosted in dedicated trusted execution environments running on AWS, the information still exists there as plaintext — for now. But as encryption continues to evolve, it’s possible to envisage a future where apps aren’t just encrypted by default (Evervault’s stated mission is to “encrypt the web”) but where user data, once ingested and encrypted, never needs to be decrypted — as all processing can be carried out on ciphertext.

Homomorphic encryption has unsurprisingly been called the ‘holy grail’ of security and privacy — and startups like Duality are busy chasing it. But the reality on the ground, online and in app stores, remains a whole lot more rudimentary. So Evervault sees plenty of value in getting on with trying to raise the encryption bar more generally.

Curran also points out that plenty of developers aren’t actually doing much processing of the data they gather — arguing therefore that caging plaintext data inside a trusted execution environment can thus abstract away a large part of the risk related to these sort of data flows anyway. “The reality is most developers who are building software these days aren’t necessarily processing data themselves,” he suggests. “They’re actually just sort of collecting it from their users and then sharing it with third party APIs.

“If you look at a startup building something with Stripe — the credit card flows through their systems but it always ends up being passed on somewhere else. I think that’s generally the direction that most startups are going these days. So you can trust the execution — depending on the security of the silicon in an Amazon data center kind of makes the most sense.”

On the regulatory side, the data protection story is a little more nuanced than the typical security startup spin.

While Europe’s GDPR certainly bakes security requirements into law, the flagship data protection regime also provides citizens with a suite of access rights attached to their personal data — a key element that’s often overlooked in developer-first discussions of ‘data privacy’.

Evervault concedes that data access rights haven’t been front of mind yet, with the team’s initial focus being squarely on encryption. But Curran tells us it plans — “over time” — to roll out products that will “simplify access rights as well”.

“In the future, Evervault will provide the following functionality: Encrypted data tagging (to, for example, time-lock data usage); programmatic role-based access (to, for example, prevent an employee seeing data in plaintext in a UI); and, programmatic compliance (e.g. data localization),” he further notes on that.

 


Microsoft says NSA needs to undo its $10B cloud computing contract with Amazon

An aerial view of the NSA. (Image credit: nsa.gov)

Amazon Web Services (AWS) has been named the winner of a $10 billion cloud computing contract, called “WildandStormy,” for the National Security Agency. But Microsoft, no doubt still salty about Amazon’s successful challenge of Redmond’s $10 billion JEDI contract with the Pentagon, filed a formal bid protest with the Government Accountability Office last month.

Microsoft says that if the NSA had properly evaluated the bids, Microsoft would have won. The GAO will decide the outcome of the protest by the end of October. The news was first reported by the trade publication Washington Technology.

The award and protest come as US intelligence agencies have been looking at overhauling their computing and storage resources over the last several years. Currently, many of the agencies’ cloud operations use so-called GovCloud products from various vendors, including Amazon’s AWS and Microsoft’s Azure. The initial move to the cloud was spurred years ago by the exponential increase in data that intelligence agencies were gathering and analyzing. That increase was outpacing the agencies’ ability to store it all in-house. AWS was an early winner and secured a $600 million contract with the CIA in 2013.



Cloud infrastructure market kept growing in Q2 reaching $42B

It’s often said in baseball that a prospect has a high ceiling, reflecting the tremendous potential of a young player with plenty of room to get better. The same could be said for the cloud infrastructure market, which just keeps growing with little sign of slowing down any time soon. The market hit $42 billion in total revenue with all major vendors reporting, up $2 billion from Q1.

Synergy Research reports that the revenue grew at a speedy 39% clip, the fourth consecutive quarter that it has increased. AWS led the way per usual, but Microsoft continued growing at a rapid pace and Google also kept the momentum going.

AWS continues to defy market logic, increasing its growth rate by five percentage points over the previous quarter to 37%, an amazing feat for a company with the market maturity of AWS. That accounted for $14.81 billion in revenue for Amazon’s cloud division, putting it close to a $60 billion run rate, good for a market-leading 33% share. While that share has remained fairly steady for a number of years, the revenue continues to grow as the market pie grows ever larger.

Microsoft grew even faster at 51%, and while Microsoft’s cloud infrastructure data isn’t always easy to nail down, its 20% market share (according to Synergy Research) puts it at roughly $8.4 billion in revenue, up from $7.8 billion last quarter, as it continues to push upward.

Google too continued its slow and steady progress under the leadership of Thomas Kurian, leading the growth numbers with a 54% increase in cloud revenue in Q2 on revenue of $4.2 billion, good for 10% market share, the first time Google Cloud has reached double figures in Synergy’s quarterly tracking data. That’s up from $3.5 billion last quarter.

Synergy Research cloud infrastructure market share chart. Image Credits: Synergy Research

After the Big 3, Alibaba held steady over Q1 at 6% (but will only report this week) with IBM falling a point from Q1 to 4% as Big Blue continues to struggle in pure infrastructure as it makes the transition to more of a hybrid cloud management player.

John Dinsdale, chief analyst at Synergy, says that the big three are spending big to help fuel this growth. “Amazon, Microsoft and Google in aggregate are typically investing over $25 billion in capex per quarter, much of which is going towards building and equipping their fleet of over 340 hyperscale data centers,” he said in a statement.

Meanwhile Canalys had similar numbers, but saw the overall market slightly higher at $47 billion. Their market share broke down to Amazon with 31%, Microsoft with 22% and Google with 8% of that total number.

Canalys analyst Blake Murray says that part of the reason companies are shifting workloads to the clouds is to help achieve environmental sustainability goals as the cloud vendors are working toward using more renewable energy to run their massive data centers.

“The best practices and technology utilized by these companies will filter to the rest of the industry, while customers will increasingly use cloud services to relieve some of their environmental responsibilities and meet sustainability goals,” Murray said in a statement.

Regardless of whether companies are moving to the cloud to get out of the data center business or because they hope to piggyback on the sustainability efforts of the big 3, companies are continuing a steady march to the cloud. With some estimates of worldwide cloud usage at around 25%, the potential for continued growth remains strong, especially with many markets still untapped outside the U.S.

That bodes well for the big three and for other smaller operators who can find a way to tap into slices of market share that add up to big revenue. “There remains a wealth of opportunity for smaller, more focused cloud providers, but it can be hard to look away from the eye-popping numbers coming out of the big three,” Dinsdale said.

In fact, it’s hard to see the ceiling for these companies any time in the foreseeable future.


Sean Gallagher and an AI expert break down our crazy machine-learning adventure

We’ve spent the past few weeks burning copious amounts of AWS compute time trying to invent an algorithm to parse Ars’ front-page story headlines to predict which ones will win an A/B test—and we learned a lot. One of the lessons is that we—and by “we,” I mainly mean “me,” since this odyssey was more or less my idea—should probably have picked a less, shall we say, ambitious project for our initial outing into the machine-learning wilderness. Now, a little older and a little wiser, it’s time to reflect on the project and discuss what went right, what went somewhat less than right, and how we’d do this differently next time.

Our readers had tons of incredibly useful comments, too, especially as we got into the meaty part of the project—comments that we’d love to get into as we discuss the way things shook out. The vagaries of the edit cycle meant that the stories were being posted quite a bit after they were written, so we didn’t have a chance to incorporate a lot of reader feedback as we went, but it’s pretty clear that Ars has some top-shelf AI/ML experts reading our stories (and probably groaning out loud every time we went down a bit of a blind alley). This is a great opportunity for you to jump into the conversation and help us understand how we can improve for next time—or, even better, to help us pick smarter projects if we do an experiment like this again!

Our chat kicks off on Wednesday, July 28, at 1:00 pm Eastern Time (that’s 10:00 am Pacific Time and 17:00 UTC). Our three-person panel will consist of Ars Infosec Editor Emeritus Sean Gallagher and me, along with Amazon Senior Principal Technical Evangelist (and AWS expert) Julien Simon. If you’d like to register so that you can ask questions, use this link here; if you just want to watch, the discussion will be streamed on the Ars Twitter account and archived as an embedded video on this story’s page. Register and join in or check back here after the event to watch!



Ars AI headline experiment finale—we came, we saw, we used a lot of compute time

(Image credit: Aurich Lawson | Getty Images)

We may have bitten off more than we could chew, folks.

An Amazon engineer told me that when he heard what I was trying to do with Ars headlines, the first thing he thought was that we had chosen a deceptively hard problem. He warned that I needed to be careful about properly setting my expectations. If this was a real business problem… well, the best thing he could do was suggest reframing the problem from “good or bad headline” to something less concrete.

That statement was the most family-friendly and concise way of framing the outcome of my four-week, part-time crash course in machine learning. As of this moment, my PyTorch kernels aren’t so much torches as they are dumpster fires. The accuracy has improved slightly, thanks to professional intervention, but I am nowhere near deploying a working solution. Today, as I am allegedly on vacation visiting my parents for the first time in over a year, I sat on a couch in their living room working on this project and accidentally launched a model training job locally on the Dell laptop I brought—with a 2.4 GHz Intel Core i3 7100U CPU—instead of in the SageMaker copy of the same Jupyter notebook. The Dell locked up so hard I had to pull the battery out to reboot it.



Sophos acquires Braintrace to supercharge its threat detection capabilities

Thoma Bravo-owned Sophos has announced it’s acquiring Braintrace, a cybersecurity startup that provides organizations visibility into suspicious network traffic patterns. Terms of the deal were not disclosed.

Braintrace, which was founded in 2016 and has raised $10 million in funding, has developed a network detection and response (NDR) solution that helps organizations to easily inspect network traffic to identify and filter out suspicious activity. It does this using remote network packet capture (RNCAP) technology, which provides visibility into network traffic patterns, including encrypted traffic, without the need for man-in-the-middle decryption. It also provides visibility into cloud network traffic, a task that typically needs to be carried out on-site, and supports all of the major cloud providers including AWS and Microsoft Azure.
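
As a loose illustration of the general idea (watching traffic patterns and metadata rather than decrypting payloads), the sketch below uses the Scapy library to count TCP destinations without touching packet contents. It is not Braintrace's RNCAP technology, just a toy example of network-metadata visibility.

```python
# A toy traffic-pattern view: count TCP destinations without inspecting payloads.
# Purely illustrative of network-metadata visibility; not Braintrace's RNCAP tech.
from collections import Counter

from scapy.all import IP, TCP, sniff  # capturing usually requires root/admin privileges

destination_counts = Counter()

def record(pkt):
    if pkt.haslayer(IP) and pkt.haslayer(TCP):
        destination_counts[(pkt[IP].dst, pkt[TCP].dport)] += 1

# Capture 200 TCP packets, then report the busiest destinations.
sniff(filter="tcp", prn=record, count=200, store=False)
for (dst, port), hits in destination_counts.most_common(5):
    print(f"{dst}:{port} -> {hits} packets")
```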

The deal will see Sophos integrate Braintrace’s NDR technology into its own adaptive cybersecurity ecosystem, which underpins all of its security products and services. The technology will also help Sophos collect data from firewalls, proxies and VPNs, allowing it to look for network traffic that contains instructions for malware like TrickBot, and attackers that misuse Cobalt Strike, as well as pre-empting other malicious traffic that might lead to ransomware attacks.

Braintrace’s developers, data scientists and security analysts have joined Sophos’ global managed threat response (MTR) and rapid response teams as part of the deal.

Commenting on the deal, which Sophos claims will make it one of the largest and fastest-growing managed detection and response (MDR) providers, the company’s CEO Joe Levy said: “We’re excited that Braintrace built this technology specifically to provide better security outcomes to their MDR customers. It’s hard to beat the effectiveness of solutions built by teams of skilled practitioners and developers to solve real-world cybersecurity problems.”

Bret Laughlin, co-founder and CEO of Braintrace, added: “We built Braintrace’s NDR technology from the ground up for detection and now, with Sophos, it will fit into a complete system to provide cross-product detection and response across a multi-vendor ecosystem.”

The deal comes a little over a year after Thoma Bravo completed its $3.9 billion takeover of Sophos, and sees the private equity firm further increasing its reach in the cybersecurity space. It acquired security vendor Proofpoint for $12.3 billion back in April, and recently led a $225 million funding round in zero trust unicorn Illumio.


Our AI headline experiment continues: Did we break the machine?

(Image credit: Aurich Lawson | Getty Images)

We’re in phase three of our machine-learning project now—that is, we’ve gotten past denial and anger, and we’re now sliding into bargaining and depression. I’ve been tasked with using Ars Technica’s trove of data from five years of headline tests, which pair two ideas against each other in an “A/B” test to let readers determine which one to use for an article. The goal is to try to build a machine-learning algorithm that can predict the success of any given headline. And as of my last check-in, it was… not going according to plan.

I had also spent a few dollars on Amazon Web Services compute time to discover this. Experimentation can be a little pricey. (Hint: If you’re on a budget, don’t use the “AutoPilot” mode.)

We’d tried a few approaches to parsing our collection of 11,000 headlines from 5,500 headline tests—half winners, half losers. First, we had taken the whole corpus in comma-separated value form and tried a “Hail Mary” (or, as I see it in retrospect, a “Leeroy Jenkins“) with the Autopilot tool in AWS’ SageMaker Studio. This came back with an accuracy result in validation of 53 percent. This turns out to be not that bad, in retrospect, because when I used a model specifically built for natural-language processing—AWS’ BlazingText—the result was 49 percent accuracy, or even worse than a coin-toss. (If much of this sounds like nonsense, by the way, I recommend revisiting Part 2, where I go over these tools in much more detail.)
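
For readers who want a comparable baseline of their own, here is a minimal sketch of a bag-of-words headline classifier built with scikit-learn. It assumes a CSV with headline text and a binary won/lost label (the file and column names are placeholders), and it is a rough stand-in rather than the SageMaker Autopilot or BlazingText setup described above.

```python
# A minimal win/lose headline classifier baseline (TF-IDF + logistic regression).
# Assumes a CSV with `headline` text and a binary `won` label; names are placeholders,
# and this is a rough stand-in, not the SageMaker Autopilot/BlazingText setup above.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("headline_tests.csv")   # ~11,000 rows: half winners, half losers
X_train, X_test, y_train, y_test = train_test_split(
    df["headline"], df["won"], test_size=0.2, random_state=42, stratify=df["won"]
)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # unigrams + bigrams
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)

# Anything much above 0.50 beats the coin-toss baseline discussed above.
print("validation accuracy:", accuracy_score(y_test, model.predict(X_test)))
```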



Here’s the new car Formula 1 hopes will improve racing in 2022

On Thursday in Silverstone, England, ahead of this weekend’s British Grand Prix, Formula 1 revealed next year’s car to the public. 2022 will see the biggest shake-up to the sport’s technical regulations since the introduction of the turbocharged hybrid powertrains in 2014. There’s been a fundamental change in the way the car creates its aerodynamic downforce, with the goal being to make it easier for F1 cars to race each other closely. Ars spoke to Rob Smedley, director of data systems at F1, to find out why and how the new car came to be.

What’s the problem?

The cars that will race each other at Silverstone this weekend use the air to generate grip through a combination of the front wing and rear diffuser. And they make an awful lot of downforce, which is part of the reason F1 lap times have reached historic lows. The problem is what happens to the air after it’s passed over an F1 car’s body—it becomes a massive wake of disturbed air. A wing running in turbulent air won’t work nearly as efficiently as a wing running in clean air, and that means it’s very hard for one car to follow another closely enough to try and overtake—something that F1 fans have told the sport they want to see more often.

“As the [2021] car moves in, let’s say a second behind, then it’s losing around 25 percent of its downforce,” Smedley said. “As it moves in to about half a second—a closing distance and getting to the point where they could start to have this wheel to wheel interaction—at that point it loses 40 percent of its downforce. So the loss is immense.”



Nym gets $6M for its anonymous overlay mixnet to sell privacy as a service

Switzerland-based privacy startup Nym Technologies has raised $6 million, which is being loosely pegged as a Series A round.

Earlier raises included a $2.5M seed round in 2019. The founders also took in grant money from the European Union’s Horizon 2020 research fund during an earlier R&D phase developing the network tech.

The latest funding will be used to continue commercial development of network infrastructure which combines an old idea for obfuscating the metadata of data packets at the transport network layer (Mixnets) with a crypto inspired reputation and incentive mechanism to drive the required quality of service and support a resilient, decentralized infrastructure.

Nym’s pitch is it’s building “an open-ended anonymous overlay network that works to irreversibly disguise patterns in Internet traffic”.

Unsurprisingly, given its attention to crypto mechanics, investors in the Series A have strong crypto ties — and cryptocurrency-related use-cases are also where Nym expects its first users to come from — with the round led by Polychain Capital, with participation from a number of smaller European investors including Eden Block, Greenfield One, Maven11, Tioga, and 1kx.

Commenting in a statement, Will Wolf of Polychain Capital, said: “We’re incredibly excited to partner with the Nym team to further their mission of bringing robust, sustainable and permissionless privacy infrastructure to all Internet users. We believe the Nym network will provide the strongest privacy guarantees with the highest quality of service of any mixnet and thus may become a very valuable piece of core internet infrastructure.”

The Internet’s ‘original sin’ was that core infrastructure wasn’t designed with privacy in mind. Therefore the level of complexity involved in Mixnets — shuffling and delaying encrypted data packets in order to shield sender-to-recipient metadata from adversaries with a global view of a network — probably seemed like over-engineering all the way back when the web’s scaffolding was being pieced together.

But then came Bitcoin and the crypto boom and — also in 2013 — the Snowden revelations, which ripped the veil off the NSA’s ‘collect it all’ mantra, as Booz Allen Hamilton sub-contractor Edward Snowden risked it all to dump data on his own (and other) governments’ mass surveillance programs. Suddenly network-level adversaries were front page news. And so was Internet privacy.

Since Snowden’s big reveal, there’s been a slow burn of momentum for privacy tech — with rising consumer awareness fuelling usage of services like e2e encrypted email and messaging apps. Sometimes in spurts and spikes, related to specific data breaches and scandals. Or indeed privacy-hostile policy changes by mainstream tech giants (hi Facebook!).

Legal clashes between surveillance laws and data protection rights are also causing growing b2b headaches, especially for US-based cloud services. While growth in cryptocurrencies is driving demand for secure infrastructure to support crypto trading.

In short, the opportunity for privacy tech, both b2b and consumer-facing, is growing. And the team behind Nym thinks conditions look ripe for general purpose privacy-focused networking tech to take off too.

Of course there is already a well known anonymous overlay network in existence: Tor, which does onion routing to obfuscate where traffic was sent from and where it ends up.

The node-hopping component of Nym’s network shares a feature with the Tor network. But Tor does not do packet mixing — and Nym’s contention is that a functional mixnet can provide even stronger network-level privacy.

It sets out the case on its website — arguing that “Tor’s anonymity properties can be defeated by an entity that is capable of monitoring the entire network’s ‘entry’ and ‘exit’ nodes” since it does not take the extra step of adding “timing obfuscation” or “decoy traffic” to obfuscate the patterns that could be exploited to deanonymize users.

“Although these kinds of attacks were thought to be unrealistic when Tor was invented, in the era of powerful government agencies and private companies, these kinds of attacks are a real threat,” Nym suggests, further noting another difference in that Tor’s design is “based on a centralized directory authority for routing”, whereas Nym fully decentralizes its infrastructure.

Proving that suggestion will be quite the challenge, of course. And Nym’s CEO is upfront in his admiration for Tor — saying it is the best technology for securing web browsing right now.

“Most VPNs and almost all cryptocurrency projects are not as secure or as private as Tor — Tor is the best we have right now for web browsing,” says Nym founder and CEO Harry Halpin. “We do think Tor made all the right decisions when they built the software — at the time there was no interest from venture capital in privacy, there was only interest from the US government. And the Internet was too slow to do a mixnet. And what’s happened is speed up 20 years, things have transformed.

“The US government is no longer viewed as a defender of privacy. And now — weirdly enough — all of a sudden venture capital is interested in privacy and that’s a really big change.”

With such a high level of complexity involved in what Nym’s doing it will, very evidently, need to demonstrate the robustness of its network protocol and design against attacks and vulnerabilities on an ongoing basis — such as those seeking to spot patterns or identify dummy traffic and be able to relink packets to senders and receivers.

The tech is open source but Nym confirms the plan is to use some of the Series A funding for an independent audit of new code.

It also touts the number of PhDs it’s hired to-date — and plans to hire a bunch more, saying it will be using the new round to more than double its headcount, including hiring cryptographers and developers, as well as marketing specialists in privacy.

The main motivation for the raise, per Halpin, is to spend on more R&D to explore — and (he hopes) — solve some of the more specific use-cases it’s kicking around, beyond the basic one of letting developers use the network to shield user traffic (a la Tor).

Nym’s whitepaper, for example, touts the possibility for the tech being used to enable users to prove they have the right to access a service without having to disclose their actual identity to the service provider.

Another big difference vs Tor is that Tor is a not-for-profit — whereas Nym wants to build a for-profit business around its Mixnet.

It intends to charge users for access to the network — so for the obfuscation-as-a-service of having their data packets mixed into a crowd of shuffled, encrypted and proxy node-hopped others.

But potentially also for some more bespoke services — with Nym’s team eyeing specific use-cases such as whether its network could offer itself as a ‘super VPN’ to the banking sector to shield their transactions; or provide a secure conduit for AI companies to carry out machine learning processing on sensitive data-sets (such as healthcare data) without risking exposing the information itself.

“The main reason we raised this Series A is we need to do more R&D to solve some of these use-cases,” says Halpin. “But what impressed Polychain was they said wow there’s all these people that are actually interested in privacy — that want to run these nodes, that actually want to use the software. So originally when we envisaged this startup we were imagining more b2b use-cases I guess and what I think Polychain was impressed with was there seemed to be demand from b2c; consumer demand that was much higher than expected.”

Halpin says they expect the first use-cases and early users to come from the crypto space — where privacy concerns routinely attach themselves to blockchain transactions.

The plan is to launch the software by the end of the year or early next, he adds.

“We will have at least some sort of chat applications — for example it’s very easy to use our software with Signal… so we do think something like Signal is an ideal use-case for our software — and we would like to launch with both a [crypto] wallet and a chat app,” he says. “Then over the next year or two — because we have this runway — we can work more on kind of higher speed applications. Things like try to find partnerships with browsers, with VPNs.”

At this (still fairly early) stage of the network’s development — an initial testnet was launched in 2019 — Nym’s eponymous network has amassed over 9,000 nodes. These distributed, crowdsourced providers are only earning a NYM reputation token for now, and it remains to be seen how much exchangeable crypto value they might earn in the future as suppliers of key infrastructure if/when usage takes off.

Why didn’t Mixnets as a technology take off before, though? After all, the idea dates back to the 1980s. There’s a range of reasons, according to Halpin — issues with scalability being one of them. And a key design “innovation” he points to vis-a-vis its implementation of Mixnet technology is the ability to keep adding nodes so the network is able to scale to meet demand.

Another key addition is that the Nym protocol injects dummy traffic packets into the shuffle to make it harder for adversaries to decode the path of any particular message — aiming to bolster the packet mixing process against vulnerabilities like correlation attacks.

The Nym network’s crypto-style reputation and incentive mechanism — which works to ensure the quality of mixing (“via a novel proof of mixing scheme”, as its whitepaper puts it) — is another differentiating component Halpin flags.

“One of our core innovations is we scale by adding servers. And the question is how do we add servers? To be honest we added servers by looking at what everyone had learned about reputation and incentives from cryptocurrency systems,” he tells TechCrunch. “We copied that — those insights — and attached them to mix networks. So the combination of the two things ends up being pretty powerful.

“The technology does essentially three things… We mix packets. You want to think about an unencrypted packet like a card, an encrypted packet you flip over so you don’t know what the card says, you collect a bunch of cards and you shuffle them. That’s all that mixing is — it just randomly permutates the packets… Then you hand them to the next person, they shuffle them. You hand them to the third person, they shuffle them. And then they hand the cards to whoever is at the end. And as long as different people gave you cards at the beginning you can’t distinguish those people.”
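
Halpin's card analogy maps fairly directly onto code. Below is a deliberately simplified sketch of a single mix node: it batches incoming packets, strips (in this toy, merely pretends to strip) its layer of encryption, shuffles the batch and forwards it, so that output order no longer reveals input order. A real mixnet such as Nym's adds layered packet formats, timing obfuscation, cover traffic and incentives on top of this basic step.

```python
# A deliberately simplified mix-node step: batch, strip a layer, shuffle, forward.
# Illustrates the card-shuffling analogy only; real mixnets (including Nym's) add
# layered packet encryption, timing obfuscation, cover traffic and incentives.
import secrets

class ToyMixNode:
    def __init__(self, batch_size: int = 4):
        self.batch_size = batch_size
        self.pool = []

    def strip_layer(self, packet: bytes) -> bytes:
        # Stand-in for removing this node's layer of encryption.
        return packet  # placeholder: a real node would decrypt its layer here

    def receive(self, packet: bytes) -> list:
        """Collect packets; once a full batch arrives, shuffle and release it."""
        self.pool.append(self.strip_layer(packet))
        if len(self.pool) < self.batch_size:
            return []                      # hold packets until the batch fills
        batch, self.pool = self.pool, []
        # Cryptographically strong shuffle so output order is unlinkable to input order.
        shuffled = []
        while batch:
            shuffled.append(batch.pop(secrets.randbelow(len(batch))))
        return shuffled                    # forwarded to the next mix node

node = ToyMixNode()
for pkt in [b"a", b"b", b"c", b"d"]:
    out = node.receive(pkt)
print(out)  # the four packets, in an order unrelated to arrival order
```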

More generally, Nym also argues it’s an advantage to be developing mixnet technology that’s independent and general purpose — folding all sorts and types of traffic into a shuffled pack — suggesting it can achieve greater privacy for users’ packets in this pooled crowd vs similar tech offered by a single provider to only their own users (such as the ‘privacy relay’ network recently announced by Apple).

In the latter case, an attacker already knows that the relayed traffic is being sent by Apple users who are accessing iCloud services. Whereas — as a general purpose overlay layer — Nym can, in theory, provide contextual coverage to users as part of its privacy mix. So another key point is that the level of privacy available to Nym users scales as usage does.

Historical performance issues with bandwidth and latency are other reasons Halpin cites for Mixnets being largely left on the academic shelf. (There have been some other deployments, such as Loopix — which Nym’s whitepaper says its design builds on by extending it into a “general purpose incentivized mixnet architecture” — but it’s fair to say the technology hasn’t exactly gone mainstream.)

Nonetheless, Nym’s contention is the tech’s time is finally coming; firstly because technical challenges associated with Mixnets can be overcome — because of gains in Internet bandwidth and compute power; as well as through incorporating crypto-style incentives and other design tweaks it’s introducing (e.g. dummy traffic) — but also, and perhaps most importantly, because privacy concerns aren’t simply going to disappear.

Indeed, Halpin suggests governments in certain countries may ultimately decide their exposure to certain mainstream tech providers which are subject to state mass surveillance regimes — whether that’s the US version or China’s flavor or elsewhere —  simply isn’t tenable over the longer run and that trusting sensitive data to corporate VPNs based in countries subject to intelligence agency snooping is a fool’s game.

(And it’s interesting to note, for example, that the European Data Protection Supervisor is currently conducting a review of EU bodies’ use of mainstream US cloud services from AWS and Microsoft to check whether they are in compliance with last summer’s Schrems II ruling by the CJEU, which struck down the EU-US Privacy Shield deal, after again finding US surveillance law to be essentially incompatible with EU privacy rights… )

Nym is betting that some governments will — eventually — come looking for alternative technology solutions to the spying problem. Although government procurement cycles make that play a longer game.

In the near term, Halpin says they expect interest and usage for the metadata-obscuring tech to come from the crypto world where there’s a need to shield transactions from view of potential hackers.

“The websites that [crypto] people use — these exchanges — have also expressed interest,” he notes, flagging that Nym also took in some funding from Binance Labs, the VC arm of the cryptocurrency exchange, after it was chosen to go through the Lab’s incubator program in 2018.

The issue for crypto users is their networks are (relatively) small, per Halpin — which makes them vulnerable to deanonymization attacks.

“The thing with a small network is it’s easy for random people to observe this. For example people who want to hack your exchange wallet — which happens all the time. So what cryptocurrency exchanges and companies that deal with cryptocurrency are concerned about is typically they do not want the IP address of their wallet revealed for certain kinds of transactions,” he adds. “This is a real problem for cryptocurrency exchanges — and it’s not that their enemy is the NSA; their enemy could be — and almost always is — an unknown, often lone individual but highly skilled hacker. And these kinds of people can do network observations, on smaller networks like cryptocurrency networks, that are essentially as powerful as what the NSA could do to the entire Internet.”

There are now a range of startups seeking to decentralize various aspects of Internet or common computing infrastructure — from file storage to decentralized DNS. And while some of these tout increased security and privacy as core benefits of decentralization — suggesting they can ‘fix’ the problem of mass surveillance by having an architecture that massively distributes data, Halpin argues that a privacy claim being routinely attached to decentralized infrastructure is misplaced. (He points to a paper he co-authored on this topic, entitled Systematizing Decentralization and Privacy: Lessons from 15 Years of Research and Deployments.)

“Almost all of those projects gain decentralization at the cost of privacy,” he argues. “Because any decentralized system is easier to observe — because the crowd has been spread out — than a centralized system, to a large extent, if the adversary is sufficiently powerful to observe all the participants in the system. And historically we believe that most people who are interested in decentralization are not experts in privacy and underestimate how easy it is to observe decentralized systems — because most of these systems are actually pretty small.”

He points out there are “only” 10,000 full nodes in Bitcoin, for example, and a similar amount in Ethereum — while other, newer and more nascent decentralized services are likely to have fewer nodes, maybe even just a few hundred or thousand.

And while the Nym network has a similar amount of nodes to Bitcoin, the difference is it’s a mixnet too — so it’s not just decentralized but it’s also using multiple layers of encryption and traffic mixing and the various other obfuscation steps which he says “none of these other people do”.

“We assume the enemy is observing everything in our software,” he adds. “We are not what we call ‘security through obscurity’ — security through obscurity means you assume the enemy just can’t see everything; isn’t looking at your software too carefully; doesn’t know where all your servers are. But — realistically — in an age of mass surveillance, the enemy will know where all your services are and they can observe all the packets coming in, all the packets coming out. And that’s a real problem for decentralized networks.”

Post-Snowden, there’s certainly been growing interest in privacy by design — and a handful of startups and companies have been able to build momentum for services that promise to shield users’ data, such as DuckDuckGo (non-tracking search); Protonmail (e2e encrypted email); and Brave (privacy-safe browsing). Apple, of course, also very successfully markets its premium hardware under a ‘privacy respecting’ banner.

Halpin says he wants Nym to be part of that movement; building privacy tech that can touch the mainstream.

“Because there’s so much venture capital floating into the market right now I think we have a once in a generation chance — just as everyone was excited about p2p in 2000 — we have a once in a generation chance to build privacy technology and we should build companies which natively support privacy, rather than just trying to bolt it on, in a half hearted manner, onto non-privacy respecting business models.

“Now I think the real question — which is why we didn’t raise more money — is, is there enough consumer and business demand that we can actually discover what the cost of privacy actually is? How much are people willing to pay for it and how much does it cost? And what we do is we do privacy on such a fundamental level is we say what is the cost of a privacy-enhanced byte or packet? So that’s what we’re trying to figure out: How much would people pay just for a privacy-enhanced byte and how much does just a privacy enhanced byte cost? And is this a small enough marginal cost that it can be added to all sorts of systems — just as we added TLS to all sorts of systems and encryption.”


Scaling CockroachDB in the red ocean of relational databases

Most database startups avoid building relational databases, since that market is dominated by a few goliaths. Oracle, MySQL and Microsoft SQL Server have embedded themselves into the technical fabric of large- and medium-size companies going back decades. These established companies have a lot of market share and a lot of money to quash the competition.

So rather than trying to compete in the relational database market, over the past decade, many database startups focused on alternative architectures such as document-centric databases (like MongoDB), key-value stores (like Redis) and graph databases (like Neo4J). But Cockroach Labs went against conventional wisdom with CockroachDB: It intentionally competed in the relational database market with its relational database product.

While it did face an uphill battle to penetrate the market, Cockroach Labs saw a surprising benefit: It didn’t have to invent a market. All it needed to do was grab a share of a market that also happened to be growing rapidly.

Cockroach Labs has a bright future, compelling technology, a lot of money in the bank and has an experienced, technically astute executive team.

In previous parts of this EC-1, I looked at the origins of CockroachDB, presented an in-depth technical description of its product as well as an analysis of the company’s developer relations and cloud service, CockroachCloud. In this final installment, we’ll look at the future of the company, the competitive landscape within the relational database market, its ability to retain talent as it looks toward a potential IPO or acquisition, and the risks it faces.

CockroachDB’s success is not guaranteed. It has to overcome significant hurdles to secure a profitable place for itself among a set of well-established database technologies that are owned by companies with very deep pockets.

It’s not impossible, though. We’ll first look at MongoDB as an example of how a company can break through the barriers for database startups competing with incumbents.

When life gives you Mongos, make MongoDB

Dev Ittycheria, MongoDB CEO, rings the Nasdaq Stock Market Opening Bell. Image Credits: Nasdaq, Inc

MongoDB is a good example of the risks that come with trying to invent a new database market. The company started out as a purely document-centric database at a time when that approach was the exception rather than the rule.

Web developers like document-centric databases because they address a number of common use cases in their work. For example, a document-centric database works well for storing comments to a blog post or a customer’s entire order history and profile.
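
To make that concrete, here is a hypothetical sketch of what such a record might look like in a document-centric store like MongoDB. The collection and field names are invented for illustration, and the snippet assumes a local MongoDB instance.

```python
# A hypothetical customer document of the kind a document-centric store handles well:
# the profile, order history and nested line items live together in one record.
# Collection and field names are illustrative only; assumes a local MongoDB instance.
from pymongo import MongoClient

customer_doc = {
    "email": "pat@example.com",
    "name": "Pat Example",
    "orders": [
        {
            "order_id": "A-1001",
            "placed_at": "2021-07-01",
            "items": [
                {"sku": "mug-01", "qty": 2, "unit_price": 12.50},
                {"sku": "tee-04", "qty": 1, "unit_price": 25.00},
            ],
        }
    ],
    "blog_comments": [
        {"post": "launch-announcement", "body": "Congrats on the release!"},
    ],
}

client = MongoClient("mongodb://localhost:27017")
client.shop.customers.insert_one(customer_doc)   # no schema migration needed
# Fetch the whole profile plus history in a single read:
print(client.shop.customers.find_one({"email": "pat@example.com"}))
```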


Feeding the machine: We give an AI some headlines and see what it does

Turning the lens on ourselves, as it were.

There’s a moment in any foray into new technological territory that you realize you may have embarked on a Sisyphean task. Staring at the multitude of options available to take on the project, you research your options, read the documentation, and start to work—only to find that actually just defining the problem may be more work than finding the actual solution.

Reader, this is where I found myself two weeks into this adventure in machine learning. I familiarized myself with the data, the tools, and the known approaches to problems with this kind of data, and I tried several approaches to solving what on the surface seemed to be a simple machine learning problem: Based on past performance, could we predict whether any given Ars headline will be a winner in an A/B test?

Things have not been going particularly well. In fact, as I finished this piece, my most recent attempt showed that our algorithm was about as accurate as a coin flip.


#ai-ml, #amazon, #analysis, #artificial-intelligence, #aws, #biz-it, #feature, #feature-report, #features, #is-our-machine-learning, #machine-learning, #natural-language-processing, #nlp, #sagemaker

Extra Crunch roundup: Crucial API metrics, US startup funding, advanced SEO tactics

On a recent episode of Extra Crunch Live, Retail Zipline founder Melissa Wong and Emergence Capital investor Lotti Siniscalco joined Managing Editor Jordan Crook to walk attendees through Zipline’s Series A deck.

Interestingly, the conversation revealed that Wong declined an invitation to do a virtual pitch and insisted on an in-person meeting.

“She was one of the few or maybe the only CEO who ever stood up to pitch the entire team,” said Siniscalco.

“She pointed to the screen projected behind her to help us stay on the most relevant piece of information. The way she did it really made us stay with her. Like, we couldn’t break eye contact.”


Full Extra Crunch articles are only available to members.
Use discount code ECFriday to save 20% off a one- or two-year subscription.


Beyond Wong’s pitch technique, this post also examines some of the key “customer love” metrics that helped Zipline win the day, such as CAC, churn rates and net promoter score.

“In retrospect, I really underestimated the competitive advantage of coming from the industry,” said Wong. “But it resulted in the numbers in our deck, because I know what customers want, what they want to buy next, how to keep them happy and I was able to be way more capital-efficient.”

Read our recap with highlights from their conversation, or click through to watch a video with their entire chat.

Thanks very much for reading Extra Crunch this week!

Walter Thompson
Senior Editor, TechCrunch
@yourprotagonist

Investors don’t expect the US startup funding market to slow down

Global venture capital reached $156 billion in Q2 2021, a YOY increase of 157%. A record number of unicorns found their feet during the same period and valuations rose across the board, report Anna Heim and Alex Wilhelm in today’s edition of The Exchange.

Even if round counts didn’t set all-time highs, “the general vibe of Q2 venture capital data was clear: It’s a great time for startups looking to raise capital.”

Anna and Alex are interviewing VCs in different regions to find out why they’re feeling so generous and optimistic. Today, they started with the following U.S.-based investors:

  • Amy Cheetham, principal, Costanoa Ventures
  • Marlon Nichols, founding managing partner, MaC Venture Capital
  • Vanessa Larco, partner, New Enterprise Associates
  • Jeff Grabow, venture capital leader, EY US

Despite the hype, construction tech will be hard to disrupt

Image of two construction workers examining blueprints next to a laptop to represent tech on construction sites.

Image Credits: AzmanJaka / Getty Images

The construction industry might seem like a sector in need of innovation, Safe Site Check In CEO and founder David Ward writes in a guest column, but there are unique challenges that make construction firms slow to adapt to new technology.

From the way construction projects are funded to complicated local regulations, there’s no one-size-fits-all solution for the construction industry’s tech problems.

Construction tech might be appealing to investors, Ward writes, but it must be “easy to use, easy to deploy or access while on a job site, and improve productivity almost immediately.”

 

3 analysts weigh in: What are Andy Jassy’s top priorities as Amazon’s new CEO?

Jeff Bezos, executive chairman and Andy Jassy, CEO at Amazon

Image Credits: AP Photo/Isaac Brekken/John Locher

Now that he’s stepping away from AWS and taking over for Jeff Bezos, what are the biggest challenges facing incoming Amazon CEO Andy Jassy?

Enterprise reporter Ron Miller reached out to three analysts to get their take:

  • Robin Ody, Canalys
  • Sucharita Kodali, Forrester
  • Ed Anderson, Gartner

Amazon is listed second in the Fortune 500, but it’s not all sunshine and roses — maintaining growth, unionization pushes and the potential for antitrust regulation at home and abroad are just a few of the challenges Jassy faces.

“I think the biggest to-do is to just continue that momentum that the company has had for the last several years,” Kodali says. “He has to make sure that they don’t lose that. If he does that, I mean, he will win.”

The most important API metric is time to first call

Close up of a stopwatch resting on a laptop's trackpad.

Image Credits: Peter Dazeley / Getty Images

Publishing an API isn’t enough for any startup: Once it’s released, the hard work of cultivating a developer base begins.

Postman’s head of developer relations, Joyce Lin, wrote a guest post for Extra Crunch based on the findings of a study aimed at increasing adoption of APIs that utilize a public workspace.

Lin found that the most important metric for a public API is time to first call (TTFC). It makes sense — faster TTFC allows developers to begin using new tools quickly. As a result, “legitimately streamlining TTFC results in a larger market potential of better-educated users for the later stages of your developer journey,” writes Lin.

This post isn’t just for the developers in our audience: TTFC is a metric that product and growth teams should also keep top of mind, Lin suggests.

“Even if your market is defined as a limited subset of the developer community, any enhancements you make to TTFC equate to a larger available market.”
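As a rough illustration of how a team might measure TTFC from its own analytics events, here is a short sketch; the event names, file and columns are hypothetical and not taken from Postman’s study.

```python
# A minimal sketch of computing time to first call (TTFC) from product analytics
# events. Assumes a hypothetical export with one row per event and columns
# user_id, event ("signed_up" / "first_api_call") and timestamp.
import pandas as pd

events = pd.read_csv("api_events.csv", parse_dates=["timestamp"])

signup = (
    events[events["event"] == "signed_up"]
    .groupby("user_id")["timestamp"].min()
)
first_call = (
    events[events["event"] == "first_api_call"]
    .groupby("user_id")["timestamp"].min()
)

# Subtraction aligns on user_id; users who never made a call drop out as NaT.
ttfc = (first_call - signup).dropna()
print("median TTFC:", ttfc.median())
print("share calling within 5 minutes:",
      (ttfc <= pd.Timedelta(minutes=5)).mean())
```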

 

Q3 IPO cycle starts strong with Couchbase pricing and Kaltura relisting

Image Credits: olli0815/iStock

Couchbase and Kaltura offered new filings Monday, with NoSQL provider Couchbase setting an initial price range for its IPO and Kaltura resurrecting its public offering with a fresh price range and new financial information.

“Both bits of news should help us get a handle on how the Q3 2021 IPO cycle is shaping up at the start,” Alex Wilhelm writes.

 

5 advanced-ish SEO tactics to win in 2021

SEO tactics for the underdog

Image Credits: PM Images (opens in a new window)/ Getty Images

Mark Spera, the head of growth marketing at Minted, offers SEO tips to help smaller sites stand out.

He writes in a guest column that Google’s algorithm “errs on the side of caution,” which leads the search engine to favor larger, more established websites.

“The cards aren’t in your favor, so you need to be even more strategic than the big guys,” he writes. “This means executing on some cutting-edge hacks to increase your SEO throughput and capitalize on some of the arbitrage still left in organic search. I call these five tactics ‘advanced-ish,’ because none of them are complicated, but all of them are supremely important for search marketers in 2021.”

#advertising-tech, #andy-jassy, #api, #aws, #couchbase, #developer, #emergence-capital, #entrepreneurship, #jeff-bezos, #kaltura, #lotti-siniscalco, #melissa-wong, #private-equity, #startups, #tc, #venture-capital

Microsoft confirms it’s buying cybersecurity startup RiskIQ

Microsoft has confirmed it’s buying RiskIQ, a San Francisco-based cybersecurity company that provides threat intelligence and cloud-based software as a service for organizations.

Terms of the deal, which will see RiskIQ’s threat intelligence services integrated into Microsoft’s flagship security offerings, were not disclosed, although Bloomberg previously reported that Microsoft will pay more than $500 million in cash for the company. Microsoft declined to confirm the reported figure.

The announcement comes amid a heightened security landscape as organizations shift to remote and hybrid working strategies.

RiskIQ scours the web, mapping out details about websites and networks, domain name records, certificates and other information, like WHOIS registration data, providing customers visibility into what assets, devices and services can be accessed outside of a company’s firewall. That helps companies lock down their assets and limit their attack surface from malicious actors. It’s that data in large part that helped the company discover and understand Magecart, a collection of groups that inject credit card stealing malware into vulnerable websites.
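As a loose illustration of the kind of internet-facing data such a product catalogs (this is not RiskIQ’s technology, just the Python standard library), here is a sketch that resolves a host and reads the TLS certificate it presents; “example.com” is a placeholder.

```python
# An illustrative sketch, not RiskIQ's tooling: two pieces of external-facing data
# an attack-surface inventory records for a host, its current DNS resolution and
# the public TLS certificate it serves. Standard library only; host is a placeholder.
import socket
import ssl

host = "example.com"

# What does the name resolve to right now?
addresses = sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
print(host, "->", addresses)

# What certificate does the host present, and when does it expire?
context = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
        print("issuer:", cert.get("issuer"))
        print("expires:", cert.get("notAfter"))
```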

Microsoft says that by embedding RiskIQ’s technologies into its core products, its customers will be able to build a more comprehensive view of the global threats to their businesses as workforces continue to work outside of the traditional office environment.

The deal will also help organizations keep an eye on supply-chain risks, Microsoft says. This is likely a growing priority for many: an attack on software provider SolarWinds last year affected at least 18,000 of its customers, and just this month IT vendor Kaseya fell victim to a ransomware attack that spread to more than 1,000 downstream businesses.

Eric Doerr, vice president of cloud security at Microsoft, said: “RiskIQ helps customers discover and assess the security of their entire enterprise attack surface — in the Microsoft cloud, AWS, other clouds, on-premises, and from their supply chain. With more than a decade of experience scanning and analyzing the internet, RiskIQ can help enterprises identify and remediate vulnerable assets before an attacker can capitalize on them.”

RiskIQ was founded in 2009 and has raised a total of $83 million over four rounds of funding. Elias Manousos, who co-founded RiskIQ and serves as its chief executive, said he was “thrilled” at the acquisition.

“The vision and mission of RiskIQ is to provide unmatched internet visibility and insights to better protect and inform our customers and partners’ security programs,” said Manousos. “Our combined capabilities will enable best-in-class protection, investigations, and response against today’s threats.”

The acquisition is one of several Microsoft has made in the cybersecurity space in recent months. The software giant last year bought Israeli security startup CyberX in a bid to boost its Azure IoT business, and just last month it acquired Internet of Things security firm ReFirm Labs.

#aws, #azure-iot, #cloud-based-software, #cloud-computing, #computer-security, #computing, #cyberx, #kaseya, #microsoft, #ransomware, #riskiq, #san-francisco, #security, #software, #solarwinds, #supply-chain, #technology, #vulnerability

3 analysts weigh in: What are Andy Jassy’s top priorities as Amazon’s new CEO?

It’s not easy following a larger-than-life founder and CEO of an iconic company, but that’s what former AWS CEO Andy Jassy faces this week as he takes over for Jeff Bezos, who moves into the executive chairman role. Jassy must deal with myriad challenges as he becomes the head honcho at the No. 2 company on the Fortune 500.

How he handles these challenges will define his tenure at the helm of the online retail giant. We asked several analysts to identify the top problems he will have to address in his new role.

Ensure a smooth transition

Handling that transition smoothly and showing investors and the rest of the world that it’s business as usual at Amazon is going to be a big priority for Jassy, said Robin Ody, an analyst at Canalys. He said it’s not unlike what Satya Nadella faced when he took over as CEO at Microsoft in 2014.

“The biggest task is that you’re following Jeff Bezos, so his overarching issue is going to be stability and continuity. … The eyes of the world are on that succession. So managing that I think is the overall issue and would be for anyone in the same position,” Ody said.

Forrester analyst Sucharita Kodali said Jassy’s biggest job is just to keep the revenue train rolling. “I think the biggest to-do is to just continue that momentum that the company has had for the last several years. He has to make sure that they don’t lose that. If he does that, I mean, he will win,” she said.

Maintain company growth

As an online retailer, the company has thrived during COVID, generating $386 billion in revenue in 2020, up more than $100 billion over the prior year. As Jassy takes over and things return to something closer to normal, will he be able to keep the revenue pedal to the metal?

#amazon, #andy-jassy, #aws, #ecommerce, #enterprise, #jeff-bezos, #personnel, #tc

The single vendor requirement ultimately doomed the DoD’s $10B JEDI cloud contract

When the Pentagon killed the JEDI cloud program yesterday, it was the end of a long and bitter road for a project that never seemed to have a chance. The question is why it didn’t work out in the end, and ultimately I think you can blame the DoD’s stubborn adherence to a single vendor requirement, a condition that never made sense to anyone, even the vendor that ostensibly won the deal.

In March 2018, the Pentagon announced a mega $10 billion, decade-long cloud contract to build the next generation of cloud infrastructure for the Department of Defense. It was dubbed JEDI, which aside from the Star Wars reference, was short for Joint Enterprise Defense Infrastructure.

The idea was a 10-year contract with a single vendor that started with an initial two-year option. If all was going well, a five-year option would kick in and finally a three-year option would close things out, with earnings of $1 billion a year.

While the total value of the contract, had it been completed, was quite large, a billion a year for companies the size of Amazon, Oracle or Microsoft is not a ton of money in the scheme of things. It was more about the prestige of winning such a high-profile contract and what it would mean for sales bragging rights. After all, if you passed muster with the DoD, you could probably handle just about anyone’s sensitive data, right?

Regardless, the idea of a single-vendor contract went against the conventional wisdom that the cloud gives you the option of working with best-in-class vendors. Microsoft, the eventual winner of the ill-fated deal, acknowledged that the single-vendor approach was flawed in an interview in April 2018:

Leigh Madden, who heads up Microsoft’s defense effort, says he believes Microsoft can win such a contract, but it isn’t necessarily the best approach for the DoD. “If the DoD goes with a single award path, we are in it to win, but having said that, it’s counter to what we are seeing across the globe where 80 percent of customers are adopting a multi-cloud solution,” Madden told TechCrunch.

Perhaps it was doomed from the start because of that. Yet even before the requirements were fully known, there were complaints that it would favor Amazon, the market share leader in cloud infrastructure. Oracle was particularly vocal, taking its complaints directly to the former president before the RFP was even published. It would later file a complaint with the Government Accountability Office and a couple of lawsuits alleging that the entire process was unfair and designed to favor Amazon. It lost every time — and of course, Amazon wasn’t ultimately the winner.

While there was a lot of drama along the way, in April 2019 the Pentagon named two finalists, and it was probably not too surprising that they were the two cloud infrastructure market leaders: Microsoft and Amazon. Game on.

The former president interjected himself directly in the process in August of that year, when he ordered the defense secretary to review the matter over concerns that the process favored Amazon, a complaint that to that point had been refuted several times over by the DoD, the Government Accountability Office and the courts. To further complicate matters, a book by former defense secretary Jim Mattis claimed the president told him to “screw Amazon out of the $10 billion contract.” The president’s goal appeared to be getting back at Bezos, who also owns The Washington Post.

In spite of all these claims that the process favored Amazon, when the winner was finally announced in October 2019, late on a Friday afternoon no less, the winner was not in fact Amazon. Instead, Microsoft won the deal, or at least it seemed that way. It wouldn’t be long before Amazon would dispute the decision in court.

By the time AWS re:Invent hit a couple of months after the announcement, former AWS CEO Andy Jassy was already pushing the idea that the president had unduly influenced the process.

“I think that we ended up with a situation where there was political interference. When you have a sitting president, who has shared openly his disdain for a company, and the leader of that company, it makes it really difficult for government agencies, including the DoD, to make objective decisions without fear of reprisal,” Jassy said at that time.

Then came the litigation. In November, Amazon indicated it would be challenging the decision to choose Microsoft, charging that it was driven by politics and not technical merit. In January 2020, Amazon filed a request with the court that the project should stop until the legal challenges were settled. In February, a federal judge agreed with Amazon and stopped the project. It would never restart.

In April, the DoD completed its own internal investigation of the contract procurement process and found no wrongdoing. As I wrote at the time:

While controversy has dogged the $10 billion, decade-long JEDI contract since its earliest days, a report by the DoD’s Inspector General’s Office concluded today that, while there were some funky bits and potential conflicts, overall the contract procurement process was fair and legal and the president did not unduly influence the process in spite of public comments.

Last September the DoD completed a review of the selection process and it once again concluded that Microsoft was the winner, but it didn’t really matter as the litigation was still in motion and the project remained stalled.

The legal wrangling continued into this year, and yesterday the Pentagon finally pulled the plug on the project once and for all, saying it was time to move on because times have changed since 2018, when it announced its vision for JEDI.

The DoD finally came to the conclusion that a single vendor approach wasn’t the best way to go, and not because it could never get the project off the ground, but because it makes more sense from a technology and business perspective to work with multiple vendors and not get locked into any particular one.

“JEDI was developed at a time when the Department’s needs were different and both the CSPs’ (cloud service providers) technology and our cloud conversancy was less mature. In light of new initiatives like JADC2 (the Pentagon’s initiative to build a network of connected sensors) and AI and Data Acceleration (ADA), the evolution of the cloud ecosystem within DoD, and changes in user requirements to leverage multiple cloud environments to execute mission, our landscape has advanced and a new way-ahead is warranted to achieve dominance in both traditional and non-traditional warfighting domains,” said John Sherman, acting DoD Chief Information Officer in a statement.

In other words, the DoD would benefit more from adopting a multi-cloud, multi-vendor approach like pretty much the rest of the world. That said, the department also indicated it would limit the vendor selection to Microsoft and Amazon.

“The Department intends to seek proposals from a limited number of sources, namely the Microsoft Corporation (Microsoft) and Amazon Web Services (AWS), as available market research indicates that these two vendors are the only Cloud Service Providers (CSPs) capable of meeting the Department’s requirements,” the department said in a statement.

That’s not going to sit well with Google, Oracle or IBM, but the department further indicated it would continue to monitor the market to see if other CSPs had the chops to handle their requirements in the future.

In the end, the single-vendor requirement contributed greatly to an overly competitive and politically charged atmosphere that resulted in the project never coming to fruition. Now the DoD has to play technology catch-up, having lost three years to the histrionics of the entire JEDI procurement process, and that could be the most lamentable part of this long, sordid technology tale.

#amazon, #andy-jassy, #aws, #cloud, #drama, #enterprise, #microsoft, #pentagon-jedi-contract, #tc

Pentagon kills Microsoft’s $10B JEDI cloud contract, says tech is now outdated

Image Credits: US Department of Defense

Following years of controversy and intrigue, the Pentagon canceled its JEDI cloud computing contract with Microsoft today.

Microsoft was awarded the contract in October 2019, but work stalled as Amazon, the other finalist, mounted a legal challenge. Now, the Department of Defense has scrapped the entire project, saying that it’s out of date.

“The Department has determined that, due to evolving requirements, increased cloud conversancy, and industry advances, the JEDI Cloud contract no longer meets its needs,” a Pentagon spokesperson said in a statement.


#amazon, #aws, #cloud-computing, #department-of-defense, #dod-jedi-contract, #microsoft-azure, #pentagon, #policy

Today is Day 1 for Andy Jassy as former AWS CEO moves to Amazon corner office

Amazon founder and CEO Jeff Bezos has always liked to motivate his employees by saying every day is Day 1. Well, it is actually Day 1 for his successor Andy Jassy, who officially moves into the corner office at Amazon today.

Bezos announced that he would be stepping down as CEO in February to focus on other interests including his charities Day 1 Fund and the Bezos Earth Fund, Blue Origin, the billionaire’s space company and The Washington Post, the newspaper he bought in 2013.

As he steps away, he will remain as executive chairman, but it will be Jassy, who up until now has spent most of his career at Amazon building the tremendously successful AWS cloud infrastructure arm, to keep a good thing going.

Jassy joined Amazon in 1997, spent time working as Bezos’ executive assistant and helped formulate the idea that would become Amazon Web Services, a series of integrated web services. He has been at AWS since its earliest days, helping build it from the initial idea into a $50 billion juggernaut. He was promoted to AWS CEO in 2016.

The Wall Street Journal reported the other day that AWS would rank 69th on the Fortune 500 if it were a stand-alone company, with the cloud unit currently on a $54 billion run rate. While that’s impressive, he is taking over the full company, which itself ranks No. 2 on the same list and which generated $386 billion in revenue last year as the pandemic pushed shopping online and Amazon was able to increase sales dramatically.

Jassy will face a number of challenges as he takes over including keeping that growth going as COVID slows down and people can begin to shop in person again. He also needs to deal with a federal government antitrust movement in the U.S. and the EU, a push to unionize Amazon warehouses and a general fear of Amazon’s growing market clout.

As part of the executive musical chairs such a shift in leadership tends to force, former Tableau CEO Adam Selipsky, who spent over a decade with Jassy helping to build the unit before moving to run Tableau in 2016, will take over as AWS CEO, replacing Jassy.

In pre-market trading, Amazon stock was up 0.42% suggesting perhaps that Wall Street expects a smooth leadership transition at the company. Jassy has been a key member of the executive team for a number of years, and highly successful in his own right as he built AWS from humble beginnings to its current status, but now Amazon is his company to run and he will need to prove that he is up to the task.

#adam-selipsky, #amazon, #andy-jassy, #aws, #cloud, #ecommerce, #jeff-bezos, #personnel

Dispense with the chasm? No way!

Jeff Bussgang, a co-founder and general partner at Flybridge Capital, recently wrote an Extra Crunch guest post that argued it is time for a refresh when it comes to the technology adoption life cycle and the chasm. His argument went as follows:

  1. VCs in recent years have drastically underestimated the size of SAMs (serviceable addressable markets) for their startup investments because they were “trained to think only a portion of the SAM is obtainable within any reasonable window of time because of the chasm.”
  2. The chasm is no longer the barrier it once was because businesses have finally understood that software is eating the world.
  3. As a result, the early majority has joined up with the innovators and early adopters to create an expanded early market. Effectively, they have defected from the mainstream market to cross the chasm in the other direction, leaving only the late majority and the laggards on the other side.
  4. That is why we now are seeing multiple instances of very large high-growth markets that appear to have no limit to their upside. There is no chasm to cross until much later in the life cycle, and it isn’t worth much effort to cross it then.

Now, I agree with Jeff that we are seeing remarkable growth in technology adoption at levels that would have astonished investors from prior decades. In particular, I agree with him when he says:

The pandemic helped accelerate a global appreciation that digital innovation was no longer a luxury but a necessity. As such, companies could no longer wait around for new innovations to cross the chasm. Instead, everyone had to embrace change or be exposed to an existential competitive disadvantage.

But this is crossing the chasm! Pragmatic customers are being forced to adopt because they are under duress. It is not that they buy into the vision of software eating the world. It is because their very own lunches are being eaten. The pandemic created a flotilla of chasm-crossings because it unleashed a very real set of existential threats.

The key here is to understand the difference between two buying decision processes, one governed by visionaries and technology enthusiasts (the early adopters and innovators), the other by pragmatists (the early majority). The early group makes their decisions based on their own analyses. They do not look to others for corroborative support. Pragmatists do. Indeed, word-of-mouth endorsements are by far the most impactful input not only about what to buy and when but also from whom.

#amazon, #aws, #cloud-computing, #column, #ec-column, #ec-enterprise-applications, #enterprise, #google, #healthcare, #jeff-bussgang, #microsoft, #product-development, #product-management, #software-as-a-service, #software-developers, #startups, #venture-capital

DOJ files 7 new charges against alleged Capital One hacker

The U.S. Department of Justice (DOJ) has filed seven new charges against Paige Thompson, the former Amazon Web Services (AWS) engineer accused of hacking Capital One and stealing the personal data of more than 100 million Americans.

The new charges, which include six counts of computer fraud and abuse and one count of access device fraud, were revealed in court documents filed earlier this month and obtained by The Record. The previous indictment charged Thompson with one count each of wire fraud and computer crime and abuse, which meant she faced up to five years in prison and a fine of up to $250,000. As a result of the additional charges, Thompson now faces up to 20 years of jail time.

The superseding indictment has also expanded the number of victimized companies from the four listed in the 2019 indictment to eight. In addition to Capital One, a U.S. state agency, a U.S. public research university and an international telecommunications conglomerate, the list now includes a data and threat protection company, an organization that specializes in digital rights management (DRM), a provider of higher education learning technology, and a supplier of call center solutions. The companies have not been named, but security firm CyberInt previously said that Vodafone, Ford, Michigan State University and the Ohio Department of Transportation may all be victims of the breach.

Thompson, who used the handle “erratic” online and was identified after boasting about her activities on GitHub, remains accused of using her knowledge from her previous employment as a software engineer at Amazon to create a program that identified which customers of a cloud computing company (the indictment doesn’t name the company, but it has been identified as Amazon Web Services) had misconfigured firewalls. Once the tool found its target misconfiguration, Thompson allegedly exploited it to extract privileged account credentials.
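For defenders rather than attackers, the flip side of that kind of scanning is sketched below: auditing your own AWS security groups for inbound rules open to the entire internet, one common “misconfigured firewall” pattern. This is our illustration, not the tool described in the indictment, and it assumes boto3 with AWS credentials configured.

```python
# A defensive sketch, not the program described in the indictment: flag AWS
# security groups that allow inbound traffic from anywhere (0.0.0.0/0).
# Assumes boto3 and AWS credentials with EC2 read access.
import boto3

ec2 = boto3.client("ec2")

for page in ec2.get_paginator("describe_security_groups").paginate():
    for group in page["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            open_ranges = [
                r for r in rule.get("IpRanges", []) if r.get("CidrIp") == "0.0.0.0/0"
            ]
            if open_ranges:
                print(
                    f"{group['GroupId']} ({group.get('GroupName', '')}) allows "
                    f"{rule.get('FromPort', 'all')}-{rule.get('ToPort', 'all')} from 0.0.0.0/0"
                )
```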

The prior indictment alleges that once Thompson gained access to victims’ cloud infrastructure using the stolen credentials, she then accessed and downloaded data to a server at her residence in Seattle. It remains unclear whether any of the information was passed to third parties.

In the case of the Capital One breach, which the company confirmed in July 2019, the stolen data comprised 106 million credit card applications, which included names, addresses, phone numbers, and dates of birth, along with 140,000 Social Security numbers, 80,000 bank account numbers, and some credit scores and transaction data. Capital One, which replaced its cybersecurity chief four months after the incident, was fined $80 million in August 2020 for the security breach and its failure to keep its users’ financial data secure.

Prosecutors also allege that Thompson copied and stole data from at least 30 entities in total that used the same cloud provider, and claim that, in some cases, she used this access to set up cryptocurrency mining operations using victims’ cloud computing power – a practice known as cryptojacking.

Thompson pleaded not guilty and was released on pre-trial bond in August 2019. She was initially set to face trial in November 2019, but the trial was delayed to March 2020 due to the huge amount of information the prosecution had to analyze.

The trial was later rescheduled to October 2020 due to the pandemic, then to June 2021, then October 2021, and now to March 14, 2022, with prosecutors still citing the need for more time to analyze the data collected from Thompson’s devices.

 

#aws, #capital-one, #cryptojacking, #data-breach, #department-of-justice, #hacking, #security

Shopify drops its App Store commissions to 0% on developers’ first million in revenue

Following similar moves by Apple, Google, and more recently Amazon, among others, e-commerce platform Shopify announced today it’s also lowering its cut of developer revenue across its app marketplace, the Shopify App Store, as well as the new Shopify Theme Store. The news was announced today alongside a host of other developer-related news and updates for the Shopify platform at the company’s Unite 2021 Conference, including updates to Checkout, APIs, developer tooling and frameworks, among other things.

Shopify says its app developer partners earned $233 million in 2020 alone, more than 2018 and 2019 combined — an increase that can likely be attributed, in part, to the COVID-19 pandemic and the rapid shift to e-commerce that resulted. Today, there are over 6,000 publicly available apps across the Shopify App Store, and on average, a merchant will use around six apps to run their business.

Now, Shopify says it will drop its commissions on app developer revenue to 0%, down from 20%, for developers who make less than $1 million annually on its platform. This benchmark will also reset annually, giving developers — and, particularly those on the cusp of $1 million — more earning potential. And when Shopify’s revenue share kicks in, it will now only be 15% of “marginal” revenue. That means developers will pay 15% only on revenue they make that’s over the $1 million mark.
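For illustration, the arithmetic of the new revenue share works out roughly as sketched below; the function is ours, not Shopify’s.

```python
# A small sketch of the revenue-share arithmetic described above: 0% on a
# developer's first $1M in annual App Store revenue, then 15% only on the
# portion above $1M (the threshold resets each year). Function name is ours.
def shopify_commission(annual_revenue: float) -> float:
    threshold = 1_000_000
    if annual_revenue <= threshold:
        return 0.0
    return 0.15 * (annual_revenue - threshold)

for revenue in (250_000, 1_000_000, 1_200_000):
    fee = shopify_commission(revenue)
    print(f"${revenue:,}: fee ${fee:,.0f}, developer keeps ${revenue - fee:,.0f}")
```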

The same business model will apply to Shopify’s Theme Store, which opens to developer submissions July 15.

As the two stores are separate entities, the $1 million revenue share metric applies to each store individually. The new business model will begin on August 1, 2021 and will be made available to developers who register by providing their account details in their partner dashboard.

Shopify says the more developer-friendly business model will mean a drop in company revenue, but it doesn’t expect the impact “to be material” because the change will encourage greater innovation and development.

The changes to Shopify’s App Store follow a shift in the broader app store market around developer commissions.

Last year, amid increased regulatory scrutiny over how it runs its App Store, Apple announced it would reduce the App Store commissions for smaller businesses under a new program where developers earning up to $1 million per year would only have to pay a 15% commission on in-app purchases. Google and Amazon have since followed suit, each with their own particular spin on the concept. For example, in Google’s case, the fee is 15% on the first million the developer earns. Amazon is still charging a higher percentage at 20%, but is tacking on AWS credits as a perk.

Apple and Google, in particular, hope these changes can help shield them from antitrust investigations over their alleged app store monopolies, while also giving developers a better reason to participate in their own slice of the app economy.

Outside of mobile, Microsoft this year agreed to match the 12% cut on game sales that Epic Games takes on its Windows Store, as a means of increasing the pressure on its rivals. With the larger update to the new Windows 11 Store, it will allow developers to use their own payment platforms, while keeping its commission at 15% on apps.

To date, much of the momentum in the market has been focused on lowering the cut of app and games sales. Shopify’s app platform is different — it’s about apps that are used to enhance an e-commerce business, like those that help with shipping and delivery, marketing, merchandising, store design, customer service and more. These are not consumer-facing apps, but they are still marketed in an app store environment.

While the change to developers’ revenue share is the big news today from Unite 2021, it shouldn’t overshadow the host of updates Shopify announced related to its larger platform.

Among the updates are: the debut of Online Store 2.0, a more flexible and customizable update to Shopify’s Liquid platform (its templating language), which Netflix was the first to test; investments in custom storefronts for faster response times; a new React framework for building custom storefronts called Hydrogen; a way to host Hydrogen storefronts on Shopify called Oxygen; support for more Metafields for products and product variants and custom content that’s built on top; a speedier Shopify Checkout; Checkout Extensions (customizations built by developers); easier and more powerful Shopify Scripts; a Payments Platform for integrating third-party payment gateways into Checkout; updates to its Storefront API; and more.

The company today also shared a few more business metrics, noting, for instance, that last year over 450 million people checked out on Shopify, totaling $120 billion in gross merchandise volume. It said its Shopify partners — which include app developers, theme builders, designers, agencies and experts — earned $12.5 billion in revenue in 2020, up 84% year-over-year, and 4x the revenue of Shopify’s own platform.

#amazon, #api, #app-store, #apple, #apps, #aws, #computing, #developer, #e-commerce, #ecommerce, #epic-games, #google, #itunes, #microsoft, #microsoft-store, #microsoft-windows, #shopify, #software, #spotify, #windows-store

How Western Union is fighting back against fintech startups

The saying goes that, “You can’t teach an old dog new tricks.” That may or may not be true, but at least one “old dog” is working hard to disprove that saying.

Western Union has been operating in the cross-border payments space for nearly 150 years (yes, you read that right – 150 years) and today, globally, it serves almost 150 million customers – representing senders and receivers.

In recent years, a number of fintech startups have emerged to challenge Western Union in the massive space – from Wise (formerly TransferWise) to Remitly to WorldRemit. But the payments giant seems up for the challenge and has been investing heavily in its digital operations in an attempt to beat fintechs at their own game.

As we all know, the COVID-19 pandemic led to a massive acceleration of the trend of all things moving to digital in nearly all industries. Money transfer was no exception. In 2020, Western Union benefited from that acceleration. Its overall digital money transfer revenues – including WU.com and its digital partnership business – climbed by 38% to more than $850 million, up from over $600 million in 2019. 

Speaking of WU.com, the company’s online transactions site, it saw a nearly 30% gain in annual active customers to 8.6 million. 

This year, the company projected that its digital money transfer revenues are on track to exceed $1 billion in 2021 after first-quarter revenue growth of 45% to a new quarterly high of $242 million.

Today, Western Union claims to hold the largest cross-border, digital, peer-to-peer payments network in terms of scale, revenue and channels.

The emphasis on beefing up its digital operations – an initiative that actually began in the second half of 2019, according to the company – and expanding those digital offerings to more countries led to Western Union’s overall business profile shifting over the past 15 months. 

Digital channels in 2020 made up 29% of transactions and 20% of revenue for the company’s consumer-to-consumer (C2C) business, up from 16% and 14%, respectively, in 2019.

Western Union also “open sourced” its platform to third-party financial institutions in a move it says is a “step towards creating an end-to-end payments processing hub.”

TechCrunch talked with Shelly Swanback, Western Union’s president of product and platform, about the company’s digital strategy and what’s next beyond payments for the company (hint: it involves banking products). 

This interview has been edited for clarity and brevity.

TC: Let’s start out by hearing how the COVID-19 pandemic impacted your business, and what kinds of steps you took as a company to adapt?

Swanback: As COVID started playing out, just like any other company, I thought ‘What do we need to do to rally around our customers because our customers who rely on retail locations may not be able to get to their retail location as the COVID lockdowns started happening?’

One of the things we learned from that experience is this notion of everyday innovation. Innovation isn’t always blockchain or some emerging technology. Sometimes the best innovation is just about innovating every day with the products and services that you have. 

For example, we had some places in the world where we actually needed to figure out how we could do home delivery of cash. Delivering cash is different than delivering pizza as you can imagine, as there are a whole lot of regulatory items and security items. We very quickly figured out how we can deliver cash in Sri Lanka and Nepal, Jordan and some other places across the world. 

Another example lies in addressing how some folks were just a little intimidated by digital technology. I thought, ‘What if we set up what we called a video digital location, where people could call in and do a video call with us and we could help them with their money transfer?’ It turned out that there actually wasn’t as much customer demand for that as we might have thought.

But the great news — and this is a good lesson, I think, for many organizations — is what we actually did there in terms of KYC (Know Your Customer), which is a big thing in the financial services industry. So, all the technology we set up for this digital location for customers to upload their documents electronically and not have to be in front of an agent, we’re using today, just in a different way.

TC: I know Western Union has touted the fact that its strong physical presence in so many locations actually benefits the growth of its digital operations, as well as an expansion into other offerings beyond payments. Can you elaborate on that?

Swanback: The success and acceleration that we’re having in our digital business and of course the quarterly results are great, and we want to continue to do that. But for me, what’s most exciting is the solid foundation it gives us to build toward this idea of having a more meaningful account-based relationship with our customers and the ability to offer them more than just money transfer.

We have the fortune of having a trusted brand that’s known globally and trusted for something that’s very near and dear to our customers. What we’re hearing from our customers is they would trust us to provide additional services. So one of the things that we’re beginning to put plans in place for, and beginning to do some market tests on, is building an ecosystem or building a marketplace if you will. It will all be catered around the 270 million migrants across the world and really connecting them to each other, connecting them to their families and connecting them to merchants who want to sell them goods or provide them services that are very culturally relevant to them,  either where they happen to be living and working or providing them services back home to their families. 

Later in the fall, we’re going to be launching our first market test in Europe. We’re going to be offering a bank account, debit card, and multi-currency accounts tied of course into our money transfer services, as well as a few other things as we get closer to the market launch. But this really is our first test around providing a more comprehensive set of services.

TC: You recently announced a tie-up with Google Pay and some others. What is the significance of those partnerships?

Swanback: We want to be able to offer our cross-border capabilities and platform in more of a co-branded or white-label fashion, so that we can reach those customers that might still prefer to just be a customer of a bank. As an example, we recently announced that Google Pay users can log in to their app and can do cross-border transfers.

I think that’s an important part of our strategy – going after the direct relationship with customers and at the same time being able to offer our platform to others who already have a direct relationship with our customers. This is also part of our whole technology modernization right now, of course. We’re very, very strong in the C2C segment, but the way we’re going about our technology modernization is one that provides us optionality to continue to expand in other segments – whether it be consumer to business, business to consumer, or even business to business.

TC: Tell me more about this “modernization.”

Swanback: Like many financial organizations and many existing global organizations, part of our massive technology modernization program is moving to the cloud. So we were well on our way from migrating many of our applications to an AWS Cloud Platform. We’re pretty excited about the progress that we’re making there.

Also, over the last 12 to 18 months, we’ve migrated a good portion of our customer agent transactions, like the core of our data, to Snowflake. We’ve mined 33 data warehouses, and we’ve got 20 petabytes of data in the cloud. And so, that in itself is just the starting point. We’re modernizing our apps on top of this data foundation and really starting to use artificial intelligence and machine learning. But we’re not using it in the back-end processes like many other organizations; we’re using it for operational interactions with our customers, in the front office. For example, we launched a telephone money transfer product where a customer talks to a virtual assistant and it’s 100% digitized. It’s actually one of the best customer experiences we’ve seen.

#artificial-intelligence, #aws, #bank, #banking, #business, #cross-border-payments, #debit-card, #e-commerce, #economy, #europe, #finance, #google, #jordan, #machine-learning, #marketing, #nepal, #online-shopping, #payments, #payments-network, #peer-to-peer, #president, #remitly, #sri-lanka, #supply-chain-management, #tc, #virtual-assistant, #western-union, #worldremit

Edge Delta raises $15M Series A to take on Splunk

Seattle-based Edge Delta, a startup that is building a modern distributed monitoring stack that is competing directly with industry heavyweights like Splunk, New Relic and Datadog, today announced that it has raised a $15 million Series A funding round led by Menlo Ventures and Tim Tully, the former CTO of Splunk. Previous investors MaC Venture Capital and Amity Ventures also participated in this round, which brings the company’s total funding to date to $18 million.

“Our thesis is that there’s no way that enterprises today can continue to analyze all their data in real time,” said Edge Delta co-founder and CEO Ozan Unlu, who has worked in the observability space for about 15 years already (including at Microsoft and Sumo Logic). “The way that it was traditionally done with these primitive, centralized models — there’s just too much data. It worked 10 years ago, but gigabytes turned into terabytes and now terabytes are turning into petabytes. That whole model is breaking down.”

Image Credits: Edge Delta

He acknowledges that traditional big data warehousing works quite well for business intelligence and analytics use cases. But that’s not real-time and also involves moving a lot of data from where it’s generated to a centralized warehouse. The promise of Edge Delta is that it can offer all of the capabilities of this centralized model by allowing enterprises to start to analyze their logs, metrics, traces and other telemetry right at the source. This, in turn, also allows them to get visibility into all of the data that’s generated there, instead of many of today’s systems, which only provide insights into a small slice of this information.

While competing services tend to have agents that run on a customer’s machine, those agents typically only compress the data, encrypt it and then send it on to its final destination; Edge Delta’s agent starts analyzing the data right at the local level. With that, if you want to, for example, graph error rates from your Kubernetes cluster, you wouldn’t have to gather all of this data and send it off to your data warehouse, where it has to be indexed before it can be analyzed and graphed.

With Edge Delta, you could instead have every single node draw its own graph, which Edge Delta can then combine later on. With this, Edge Delta argues, its agent is able to offer significant performance benefits, often by orders of magnitude. This also allows businesses to run their machine learning models at the edge, as well.
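A toy sketch of that pattern (ours, not Edge Delta’s implementation) is below: each node reduces its own raw log lines to per-minute counts, and only those small summaries are merged centrally to compute a cluster-wide error rate. The log format is assumed purely for the example.

```python
# A toy sketch of the pattern described above, not Edge Delta's implementation:
# each node's agent reduces its own raw logs to a tiny summary (error and total
# counts per minute), and only those summaries are merged centrally.
from collections import Counter

def summarize_node(log_lines):
    """Runs on each node: reduce raw log lines to per-minute counts."""
    totals, errors = Counter(), Counter()
    for line in log_lines:
        minute, level = line[:16], line.split(" ")[2]  # assumes "YYYY-MM-DD HH:MM LEVEL ..."
        totals[minute] += 1
        if level == "ERROR":
            errors[minute] += 1
    return {"totals": totals, "errors": errors}

def merge(summaries):
    """Runs centrally: combine tiny per-node summaries instead of shipping raw logs."""
    totals, errors = Counter(), Counter()
    for s in summaries:
        totals.update(s["totals"])
        errors.update(s["errors"])
    return {m: errors[m] / totals[m] for m in sorted(totals)}

node_a = summarize_node(["2021-07-12 10:01 ERROR db timeout", "2021-07-12 10:01 INFO ok"])
node_b = summarize_node(["2021-07-12 10:01 INFO ok", "2021-07-12 10:02 ERROR oom"])
print(merge([node_a, node_b]))  # per-minute error rate across the cluster
```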

Image Credits: Edge Delta

“What I saw before I was leaving Splunk was that people were sort of being choosy about where they put workloads for a variety of reasons, including cost control,” said Menlo Ventures’ Tim Tully, who joined the firm only a couple of months ago. “So this idea that you can move some of the compute down to the edge and lower latency and do machine learning at the edge in a distributed way was incredibly fascinating to me.”

Edge Delta is able to offer a significantly cheaper service, in large part because it doesn’t have to run a lot of compute and manage huge storage pools itself since a lot of that is handled at the edge. And while the customers obviously still incur some overhead to provision this compute power, it’s still significantly less than what they would be paying for a comparable service. The company argues that it typically sees about a 90 percent improvement in total cost of ownership compared to traditional centralized services.

Image Credits: Edge Delta

Edge Delta charges based on volume, and it isn’t shy about comparing its prices with Splunk’s: it does so right on its pricing calculator. Indeed, in talking to Tully and Unlu, Splunk was clearly on everybody’s mind.

“There’s kind of this concept of unbundling of Splunk,” Unlu said. “You have Snowflake and the data warehouse solutions coming in from one side, and they’re saying, ‘hey, if you don’t care about real time, go use us.’ And then we’re the other half of the equation, which is: actually there’s a lot of real-time operational use cases and this model is actually better for those massive stream processing datasets that you required to analyze in real time.”

But despite this competition, Edge Delta can still integrate with Splunk and similar services. Users can still take their data, ingest it through Edge Delta and then pass it on to the likes of Sumo Logic, Splunk, AWS’s S3 and other solutions.

Image Credits: Edge Delta

“If you follow the trajectory of Splunk, we had this whole idea of building this business around IoT and Splunk at the Edge — and we never really quite got there,” Tully said. “I think what we’re winding up seeing collectively is the edge actually means something a little bit different. […] The advances in distributed computing and sophistication of hardware at the edge allows these types of problems to be solved at a lower cost and lower latency.”

The Edge Delta team plans to use the new funding to expand its team and support all of the new customers that have shown interest in the product. For that, it is building out its go-to-market and marketing teams, as well as its customer success and support teams.

 

#aws, #big-data, #business-intelligence, #cloud, #computing, #cto, #data-security, #data-warehouse, #datadog, #enterprise, #information-technology, #mac-venture-capital, #machine-learning, #menlo-ventures, #microsoft, #new-relic, #real-time, #recent-funding, #seattle, #splunk, #startups, #sumo-logic, #system-administration, #tc, #technology

Salesforce, AWS announce extended partnership with further two-way integration

Salesforce and AWS represent the two most successful cloud companies in their respective categories. Over the last few years the two cloud giants have had an evolving partnership. Today they announced plans for a new set of integration capabilities to make it easier to share data and build applications that cross the two platforms.

Patrick Stokes, EVP and GM for Platform at Salesforce, points out that the companies have worked together in the past to provide features like secure sharing between the two services, but they were hearing from customers that they wanted to take it further and today’s announcement is the first step towards making that happen.

“[The initial phases of the partnership] have really been massively successful. We’re learning a lot from each other and from our mutual customers about the types of things that they want to try to accomplish, both within the Salesforce portfolio of products, as well as all the Amazon products, so that the two solutions complement each other really nicely. And customers are asking us for more, and so we’re excited to enter into this next phase of our partnership,” Stokes explained.

He added, “The goal really is to unify our platforms, so bring [together] all the power of the Amazon services with all of the power of the Salesforce platform.” These capabilities could be the next step in accomplishing that.

This involves a couple of new features the companies are working on to help developers on both the platform and application side of the equation. For starters that includes enabling developers to virtualize Amazon data inside Salesforce without having to do all the coding to make that happen manually.

“More specifically, we’re going to virtualize Amazon data within the Salesforce platform, so whether you’re working with an S3 bucket, Amazon RDS or whatever it is we’re going to make it so that that the data is virtualized and just appears just like it’s native data on the Salesforce platform,” he said.

Similarly, developers building applications on Amazon will be able to access Salesforce data and have it appear natively in Amazon. This involves providing connectors between the two systems to make the data flow smoothly without a lot of coding to make that happen.

The companies are also announcing event sharing capabilities, which makes it easier for both Amazon and Salesforce customers to build microservices-based applications that cross both platforms.

“You can build microservices-oriented architecture that spans the services of Salesforce and Amazon platforms, again without having to write any code. To do that, [we’re developing] out of the box connectors so you can click and drag the events that you want.”

The companies are also announcing plans to make it easier from an identity and access management perspective to access the platforms with a guided setup. Finally, the companies are working on applications to build Amazon Chime communications tooling into Service Cloud and other Salesforce services to build things like virtual call centers using AWS machine learning technology.

Amazon VP of Global Marketing Rachel Thornton says that having the two cloud giants work together in this way should make it easier for developers to create solutions that span the two platforms. “I just think it unlocks such possibilities for developers, and the faster and more innovative developers can be, it just unlocks opportunities for businesses, and creates better customer experiences,” Thornton said.

It’s worth noting that Salesforce also has extensive partnerships with other cloud providers including Microsoft Azure and Google Cloud Platform.

As is typically the case with Salesforce announcements, while all of these capabilities are being announced today, they are still in the development stage and won’t go into beta testing until later this year with GA expected sometime next year. The companies are expected to release more details about the partnership at Dreamforce and re:Invent, their respective customer conferences later this year.

#aws, #cloud, #developer, #enterprise, #partnerships, #saas, #salesforce, #tc

Vantage raises $4M to help businesses understand their AWS costs

Vantage, a service that helps businesses analyze and reduce their AWS costs, today announced that it has raised a $4 million seed round led by Andreessen Horowitz. A number of angel investors, including Brianne Kimmel, Julia Lipton, Stephanie Friedman, Calvin French Owen, Ben and Moisey Uretsky, Mitch Wainer and Justin Gage, also participated in this round.

Vantage started out with a focus on making the AWS console a bit easier to use — and help businesses figure out what they are spending their cloud infrastructure budgets on in the process. But as Vantage co-founder and CEO Ben Schaechter told me, it was the cost transparency features that really caught on with users.

“We were advertising ourselves as being an alternative AWS console with a focus on developer experience and cost transparency,” he said. “What was interesting is — even in the early days of early access before the formal GA launch in January — I would say more than 95% of the feedback that we were getting from customers was entirely around the cost features that we had in Vantage.”

Image Credits: Vantage

Like any good startup, the Vantage team looked at this and decided to double down on these features and highlight them in its marketing, though it kept the existing AWS Console-related tools as well. The reason the other tools didn’t quite take off, Schaechter believes, is because more and more, AWS users have become accustomed to infrastructure-as-code to do their own automatic provisioning. And with that, they spend a lot less time in the AWS Console anyway.

“But one consistent thing — across the board — was that people were having a really, really hard time twelve times a year, where they would get a shock AWS bill and had to figure out what happened. What Vantage is doing today is providing a lot of value on the transparency front there,” he said.

Over the course of the last few months, the team added a number of new features to its cost transparency tools, including machine learning-driven predictions (both on the overall account level and service level) and the ability to share reports across teams.
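For context, the raw material such a tool works from is available through AWS’s Cost Explorer API; the sketch below (our illustration, not Vantage’s code) pulls one month of spend grouped by service, assuming boto3 and credentials with Cost Explorer access.

```python
# A sketch of the raw data a cost-transparency tool draws on (not Vantage's own
# code): AWS Cost Explorer, queried for one month of spend grouped by service.
# Assumes boto3 and AWS credentials with Cost Explorer access.
import boto3

ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2021-06-01", "End": "2021-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if amount > 0:
        print(f"{service}: ${amount:,.2f}")
```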

Image Credits: Vantage

While Vantage expects to add support for other clouds in the future, likely starting with Azure and then GCP, that’s actually not what the team is focused on right now. Instead, Schaechter noted, the team plans to add support for bringing in data from third-party cloud services instead.

“The number one line item for companies tends to be AWS, GCP, Azure,” he said. “But then, after that, it’s Datadog, Cloudflare, Sumo Logic, things along those lines. Right now, there’s no way to see a P&L or an ROI from a cloud usage-based perspective. Vantage can be the tool where that’s showing you, essentially, all of your cloud costs in one space.”

That is likely the vision the investors bought into as well, and even though Vantage is now going up against enterprise tools like Apptio’s Cloudability and VMware’s CloudHealth, Schaechter doesn’t seem to be all that worried about the competition. He argues that these are tools that were born in a time when AWS had only a handful of services and only a few ways of interacting with them. He believes that Vantage, as a modern self-service platform, will have quite a few advantages over these older services.

“You can get up and running in a few clicks. You don’t have to talk to a sales team. We’re helping a large number of startups at this stage all the way up to the enterprise, whereas Cloudability and Cloud Health are, in my mind, kind of antiquated enterprise offerings. No startup is choosing to use those at this point, as far as I know,” he said.

The team, which until now mostly consisted of Schaechter and his co-founder and CTO Brooke McKim, bootstrapped the company up to this point. Now they plan to use the new capital to build out the team (the company is actively hiring right now), both on the development and go-to-market sides.

The company offers a free starter plan for businesses that track up to $2,500 in monthly AWS cost, with paid plans starting at $30 per month for those who need to track larger accounts.

#amazon-web-services, #andreessen-horowitz, #apptio, #aws, #brianne-kimmel, #cloud, #cloud-computing, #cloud-infrastructure, #cloud-services, #cloudability, #cloudflare, #computing, #datadog, #enterprise, #information-technology, #machine-learning, #recent-funding, #startups, #sumo-logic, #tc, #technology, #vmware