Stairwell secures $20M Series A to help organizations outsmart attackers

Back when Stairwell emerged from stealth in 2020, the startup was shrouded in secrecy. Now with $20 million in Series A funding, its founder and CEO Mike Wiacek — who previously served as chief security officer at Chronicle, Google’s moonshot cybersecurity company — is ready to talk.

Alongside the $20 million raise, an investment round co-led by Sequoia Capital and Accel, Stairwell is launching Inception, a threat hunting platform that aims to help organizations determine whether they are compromised now or were at any point in the past. Unlike other threat detection platforms, Inception takes an “inside out” approach to cybersecurity, which starts by looking inward at a company’s data.

“This helps you study what’s in your environment first before you start thinking about what’s happening in the outside world,” Wiacek tells TechCrunch. “The beautiful thing about that approach is that’s not information that outside parties, a.k.a. the bad guys, are privy to.”

This data, all of which is treated as suspicious, is continuously evaluated in light of new indicators and new threat intelligence. Stairwell claims this enables organizations to detect anomalies within just days, rather than the industry average of 280 days, as well as to “bootstrap” future detections.

“If you go and buy a threat intelligence feed from Vendor X, do you really think that someone who’s spending hundreds of thousands, or even millions of dollars to conduct an offensive campaign isn’t going to make sure that whatever they’re using isn’t in that feed?” said Wiacek. “They know what McAfee knows and they know what other antivirus engines know, but they don’t know what you know, and that’s a very powerful advantage that you have there.”

Stairwell’s $20 million in Series A funding, which comes less than 12 months after it secured $4.5 million in seed funding, will be used to further advance the Inception platform and to increase the startup’s headcount; the Palo Alto-based firm currently has a modest headcount of 21.

The Inception platform, which the startup claims finally enables enterprises to “outsmart the bad guys”, is launching in early release for a limited number of customers, with full general availability scheduled for 2022.

“I just wish we had a product to market when SolarWinds happened,” Wiacek added.

#accel, #anomali, #ceo, #computer-security, #computing, #google-cloud, #inception, #information-technology, #mcafee, #palo-alto, #security, #sequoia-capital, #solarwinds, #stairwell, #system-administration

Planet Labs and Google Cloud join forces in data analysis agreement

Satellite operator Planet Labs is beefing up its existing partnership with Google Cloud. Under a new agreement, Planet customers can use Google Cloud to store and process data, and access Google’s other products such as its data analytics warehouse BigQuery.

The two companies’ collaboration stretches back to 2017, when Google sold its satellite imaging business, Terra Bella, to Planet. As part of the sale agreement, Google also signed a multi-year contract to license Planet’s Earth imagery for its own use. Planet also uses Google Cloud services for its own internal data processing and hosting.

This latest agreement will let Planet customers use products like BigQuery to analyze large volumes of satellite imaging data, reflecting “a growing demand for planetary-scale satellite data analysis, powered by the cloud,” Planet said in a news release.
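
To make that concrete, here is a minimal sketch, using the google-cloud-bigquery Python client, of the kind of scene-metadata query the partnership enables. The project, dataset and column names are hypothetical; neither company has published a schema for the integration.

```python
# A minimal sketch of planetary-scale analysis on BigQuery. The dataset,
# table and columns below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project ID

query = """
    SELECT scene_id, cloud_cover, acquired_at
    FROM `my-gcp-project.planet_imagery.scene_metadata`
    WHERE cloud_cover < 0.1
      AND acquired_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    ORDER BY acquired_at DESC
"""

# Iterating the query job waits for completion and streams result rows.
for row in client.query(query):
    print(row.scene_id, row.cloud_cover, row.acquired_at)
```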

“Planet customers want scalable compute and storage,” Kevin Weil, Planet’s president of product and business, said. “Google Cloud customers want broader access to satellite data and analytics. This partnership is a win-win for both, as it helps customers transform their operations and compete in a digital-first world, powered by Planet’s unique data set.”

Planet operates a network of around 200 satellites – more than any government – and provides analytics services on the data it gathers. Last month, the company joined a slew of other space companies by announcing it was going public via a $2.8 billion merger with blank-check firm dMY Technology Group IV. The deal is anticipated to inject Planet with $545 million in cash, including a $200 million private investment in public equity (PIPE) from BlackRock-managed funds, Koch Strategic Platforms, Marc Benioff’s TIME Ventures and Google.

#aerospace, #data-analytics, #google, #google-cloud, #planet-labs, #satellite-constellation, #satellite-imagery, #space

Google announces EPYC-based Tau virtual machines for Cloud

Google this morning announced the launch of Tau, a new family of virtual machines built on AMD’s third-gen EPYC processor. According to the company, the new x86-compatible system offers a 42% price-performance boost over standard VMs. Google notably first started using AMD EPYC processors in its cloud back in 2017, while AWS’s EPYC-based offerings date back to 2018.

Google claims the Tau family “leapfrogs” existing cloud VMs. The systems come in a variety of configurations, ranging up to 60 vCPUs per VM and 4GB of memory per vCPU. Networking bandwidth goes up to 32 Gbps, and the VMs can be coupled with a variety of network-attached storage options.
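
To get a feel for what provisioning one of these machines looks like, here is a hedged sketch using the google-cloud-compute Python client. The project, zone and boot image are placeholders; t2d-standard-32 is the Tau shape matching the 32-vCPU, 128GB configuration whose pricing is cited below.

```python
# A hedged sketch of creating a Tau (T2D) VM programmatically. Project, zone
# and names are placeholders, not values from the announcement.
from google.cloud import compute_v1

project, zone = "my-gcp-project", "us-central1-a"  # assumed values

instance = compute_v1.Instance(
    name="tau-demo",
    machine_type=f"zones/{zone}/machineTypes/t2d-standard-32",
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-10"
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

operation = compute_v1.InstancesClient().insert(
    project=project, zone=zone, instance_resource=instance
)
operation.result()  # block until the create operation completes
```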

“Customers across every industry are dealing with more demanding and data-intensive workloads and looking for strategic ways to speed up performance and reduce costs,” Google Cloud CEO Thomas Kurian said in a press release. “Our work with key strategic partners like AMD has allowed us to broaden our offerings and deliver customers the best price performance for compute-heavy, business-critical applications, all on the cleanest cloud in the industry.”

Google has already signed up some high-profile customers for an early trial, including Twitter, Snap and DoIT.

“High performance at the right price point is a critical consideration as we work to serve the global public conversation,” Twitter Platform Lead Nick Tornow said in a blog post. “We are excited by initial tests that show potential for double digit performance improvement. We are collaborating with Google Cloud to more deeply evaluate benefits on price and performance for specific compute workloads that we can realize through use of the new Tau VM family.”

The Tau VMs will be arriving for Google Cloud in Q3 of this year. The company has already opened the system up to clients for pre-registration. Pricing is dependent on the configuration. For example, a 32vCPU VM sporting 128GB RAM will run around $1.35 an hour.

#cloud, #enterprise, #google, #google-cloud, #hardware, #virtual-machine

Google’s Airtable rival, Tables, graduates from beta test to become a Google Cloud product

Last fall, Google’s in-house incubator Area 120 introduced a new work-tracking tool called Tables, an Airtable rival that allows for tracking projects more efficiently using automation. Today, Google says Tables will officially “graduate” from Area 120 to become an official Google product by joining Google Cloud, a transition it expects to complete in the next year.

The Tables project was started by long-time Google employee, now Tables’ GM, Tim Gleason, who spent 10 years at the company and many more before that in the tech industry. He said he was inspired to work on Tables because he always had a difficult time tracking projects, as teams shared notes and tasks across different documents, which quickly got out of date.

Instead of tracking those sorts of notes and tasks associated with a project across various documents that have to be manually updated by team members, Tables uses bots to help take on some of the administrative duties involved in guiding team members through a project — like scheduling recurring email reminders when tasks are overdue, messaging a chat room when new forms are received, moving tasks to other people’s work queues, or updating tasks when schedules are changed.

The team saw Tables as a potential solution for a variety of use cases, including, of course, project management, as well as IT operations, customer service tracking, CRM, recruiting, product development and more.

The service was launched last September to test product market fit, Google says, and quickly found traction.

According to VP/GM and Head of Platform for Google Cloud Amit Zavery, early customer feedback was positive and the team saw customers adopting the service for multiple projects — another strong signal for its potential growth. He declined to say how many customers were already using the service, however.

The pandemic also likely played a role in Tables’ adoption, Zavery noted.

“If you saw what happened with COVID, I think work-tracking became a pretty big area of interest for many customers who we’re speaking to,” he says, explaining that everyone was trying to quickly digitize.

Popular use cases included inventory management, healthcare supply tracking and use in mortgage-lending workflows. However, the team found Tables was adopted across a variety of industries beyond these, as hoped. On average, customers would use Tables in a department with around 30 to 40 people, they found.

Most customers were abandoning more manual processes to use Tables instead, not coming from a rival service.

“Things were very fragmented in different documents or with different people, so using technologies like this really seems to have resonated very well,” Zavery says. “Now you had one central place for structured information you can access and do things on top of it versus trying to have 15 different sheets and figuring out how they are related because there’s no structure behind each of them.”

Another factor that prompted Tables’ adoption was how quickly people could be productive, thanks in part to its ability to integrate with existing data warehouses and other services. Currently, Tables supports Office 365, Microsoft Access, Google Sheets, Slack, Salesforce, Box and Dropbox, for example.

Tables was one of only a few Area 120 projects to launch with a paid business model, along with ticket seller Fundo, conversational ads platform AdLingo and Google’s recently launched Orion WiFi. During its beta, an individual could use Tables for free, with support for up to 100 tables and 1,000 rows. The paid plan was supposed to cost $10 per user per month, with support for up to 1,000 tables and 10,000 rows. This plan also included support for larger attachments, more actions and advanced history, sharing, forms, automation and views.

However, Google never began charging for its paid tier during the beta, it says.

As Tables graduates into Google Cloud’s lineup, it will be integrated with Google’s no-code app building platform, AppSheet, which has a free tier, allowing the freemium model to continue. Users who want additional features will be able to upgrade to a premium plan. It will also be offered as a standalone product, for those who want that experience.

Google will leverage Workspace to get Tables in front of more users, as well.

“It’s going to be delivered through Workspace integration, because that’s a very large community of users who expect some similar kind of functionality,” Zavery says. “That will be a big differentiator, when you talk about the breadth of things we can do — because of having that community of users on Sheets, the things they do with Drive, and the data they collect — we can automatically add this and augment their experience.”

The project taps into the growing interest in no-code, spreadsheet-powered database platforms — like Airtable, for example, which had closed on $185 million in Series D funding in the days before Tables’ release, valuing its business at $2.585 billion, post-money.

As Tables transitions to Google Cloud, the Tables beta version will remain free until a fully supported Cloud product becomes available in the next year. At that point, users will migrate to the new service.

Over time, Tables plans to add more functionality as it ties in with AppSheet, to make using the service more seamless — so people don’t have to hop around from one product to another to accomplish tasks. It will also work to provide better ease of use, mobile support and connectivity with more backend systems.

Official pricing hasn’t been finalized but shouldn’t be very different from the beta version.

#airtable, #apps, #appsheet, #area-120, #cloud-applications, #computing, #crm, #google, #google-cloud, #google-sheets, #inventory-management, #tables, #tc, #technology

Google updates Firebase with new personalization features, security tools and more

At its I/O developer conference, Google today announced a slew of updates to its Firebase developer platform, which, as the company also announced, now powers over 3 million apps.

There are a number of major updates here, most of which center on improving existing tools like Firebase Remote Config and Firebase’s monitoring capabilities, but there are also some completely new features, including the ability to create Android App Bundles and a new security tool called App Check.

“Helping developers be successful is what makes Firebase successful,” Firebase product manager Kristen Richards told me ahead of today’s announcements. “So we put helpfulness and helping developers at the center of everything that we do.” She noted that during the pandemic, Google saw a lot of people who started to focus on app development — both as learners and as professional developers. But the team also saw a lot of enterprises move to its platform as those companies looked to quickly bring new apps online.

Maybe the marquee Firebase announcement at I/O is the updated Remote Config. That’s always been a very powerful feature that allows developers to make changes to live production apps on the fly, without having to release a new version of their app. Developers can use this for anything from A/B testing to providing tailored in-app experiences to specific user groups.
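
For developers who haven’t touched it, the basic flow is: fetch the current template, change a parameter, publish it back, and live apps pick up the new value without a release. Below is a minimal sketch against the Remote Config REST API; the parameter name is made up and the snippet assumes default application credentials.

```python
# A minimal sketch of flipping a Remote Config parameter server-side, so the
# change reaches live apps without shipping a new binary. The parameter name
# "promo_banner_enabled" is hypothetical.
import google.auth
from google.auth.transport.requests import AuthorizedSession

SCOPE = "https://www.googleapis.com/auth/firebase.remoteconfig"
creds, project_id = google.auth.default(scopes=[SCOPE])
session = AuthorizedSession(creds)
url = f"https://firebaseremoteconfig.googleapis.com/v1/projects/{project_id}/remoteConfig"

resp = session.get(url)      # fetch the current template
template = resp.json()
etag = resp.headers["ETag"]  # required for optimistic locking on publish

template.setdefault("parameters", {})["promo_banner_enabled"] = {
    "defaultValue": {"value": "true"}
}

session.put(url, json=template, headers={"If-Match": etag})  # publish the change
```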

With this update, Google is introducing updates to the Remote Config console, to make it easier for developers to see how they are using this tool, as well as an updated publish flow and redesigned test results pages for A/B tests.

What’s most important, though, is that Google is taking Remote Config a step further now by launching a new Personalization feature that helps developers automatically optimize the user experience for individual users. “It’s a new feature of [Remote Config] that uses Google’s machine learning to create unique individual app experiences,” Richards explained. “It’s super simple to set up and it automatically creates these personalized experiences that’s tailored to each individual user. Maybe you have something that you would like, which would be something different for me. In that way, we’re able to get a tailored experience, which is really what customers expect nowadays. I think we’re all expecting things to be more personalized than they have in the past.”

Google is also improving a number of Firebase’s analytics and monitoring capabilities, including its Crashlytics service for figuring out app crashes. For game developers, that means improved support for games written with the help of the Unity platform, for example, but for all developers, the fact that Firebase’s Performance Monitoring service now processes data in real time is a major improvement over the previous delay of almost half a day, which was especially painful on launch day.

Firebase is also now finally adding support for Android App Bundles, Google’s relatively new format for packaging up all of an app’s code and resources, with Google Play optimizing the actual APK with the right resources for the kind of device the app gets installed on. This typically leads to smaller downloads and faster installs.

On the security side, the Firebase team is launching App Check, now available in beta. App Check helps developers guard their apps against outside threats and is meant to automatically block any traffic to online resources like Cloud Storage, Realtime Database and Cloud Functions for Firebase (with others coming soon) that doesn’t provide valid credentials.

The other update worth mentioning here is to Firebase Extensions, which launched a while ago but is getting support for a few more extensions today. These are new extensions from Algolia, Mailchimp and MessageBird that help bring features like Algolia’s search capabilities or MessageBird’s communications features directly to the platform. Google itself is also launching a new extension that helps developers detect comments that could be considered “rude, disrespectful, or unreasonable in a way that will make people leave a conversation.”

#algolia, #android, #cloud-computing, #computing, #developer, #firebase, #google, #google-allo, #google-cloud, #google-i-o-2021, #google-play, #google-search, #machine-learning, #mailchimp, #operating-systems, #product-manager, #tc

Google Cloud Run gets committed use discounts and new security features

Cloud Run, Google Cloud’s serverless platform for containerized applications, is getting committed use discounts. Users who commit to spending a given amount on using Cloud Run for a year will get a 17% discount on the money they commit. The company offers a similar pre-commitment discount scheme for VM-based Compute Engine instances, as well as automatic ‘sustained use’ discounts for machines that run for more than 25% of a month.
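
The discount math is straightforward; a quick illustration, where the $1,000 monthly figure is hypothetical and only the 17% rate comes from the announcement:

```python
# Back-of-the-envelope math for the committed use discount.
on_demand_monthly = 1_000.00  # hypothetical spend
discount = 0.17               # rate from the announcement

committed_monthly = on_demand_monthly * (1 - discount)
annual_savings = (on_demand_monthly - committed_monthly) * 12
print(f"committed: ${committed_monthly:,.2f}/mo, saving ${annual_savings:,.2f}/yr")
# committed: $830.00/mo, saving $2,040.00/yr
```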

In addition, Google Cloud is also introducing a number of new security features for Cloud Run, including the ability to mount secrets from the Google Cloud Secret Manager and binary authorization to help define and enforce policies about how containers are deployed on the service. Cloud Run users can now also use and manage their own encryption keys (by default, Cloud Run uses Google-managed keys), and a new Recommendation Hub inside of Cloud Run will offer users recommendations for how to better protect their Cloud Run services.
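
Mounting happens in the service’s deployment configuration, but for illustration, here is a hedged sketch of the closely related runtime pattern, reading a secret with the google-cloud-secret-manager Python client; the project and secret names are placeholders.

```python
# A hedged sketch of reading a Secret Manager secret from inside a Cloud Run
# service. The project and secret names are placeholders.
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
name = "projects/my-gcp-project/secrets/db-password/versions/latest"

response = client.access_secret_version(name=name)
db_password = response.payload.data.decode("utf-8")  # the secret material
```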

Aparna Sinha, who recently became the director of product management for Google Cloud’s serverless platform, noted that these updates are part of Google Cloud’s push to build what she calls the “next generation of serverless.”

“We’re really excited to introduce our new vision for serverless, which I think is going to help redefine this space,” she told me. “In the past, serverless has meant a certain narrower type of compute, which is focused on functions or a very specific kind of applications, web services, etc. — and what we are talking about with redefining serverless is focusing on the power of serverless, which is the developer experience and the ease of use, but broadening it into a much more versatile platform, where many different types of applications can be run, and building in the Google way of doing DevOps and security and a lot of integrations so that you have access to everything that’s the best of cloud.”

She noted that Cloud Run saw “tremendous adoption” during the pandemic, something she attributes to the fact that businesses were looking to speed up time-to-value from their applications. IKEA, for example, which famously had a hard time moving from in-store to online sales, bet on Google Cloud’s serverless platform to bring down the refresh time of its online store and inventory management system from three hours to less than three minutes after switching to this model.

“That’s kind of the power of serverless, I think, especially looking forward, the ability to build real-time applications that have data about the context, about the inventory, about the customer and can therefore be much more reactive and responsive,” Sinha said. “This is an expectation that customers will have going forward and serverless is an excellent way to deliver that as well as be responsive to demand patterns, especially when they’re changing so much in today’s uncertain environment.”

Since the container model gives businesses a lot of flexibility in what they want to run in these containers — and how they want to develop these applications, since Cloud Run is language-agnostic — Google is now seeing a lot of other enterprises move to this platform as well, both to deploy completely new applications and to modernize some of their existing services.

For the companies that have predictable usage patterns, the committed use discounts should be an attractive option and it’s likely the more sophisticated organizations that are asking for the kinds of new security features that Google Cloud is introducing today.

“The next generation of serverless combines the best of serverless with containers to run a broad spectrum of apps, with no language, networking or regional restrictions,” Sinha writes in today’s announcement. “The next generation of serverless will help developers build the modern applications of tomorrow—applications that adapt easily to change, scale as needed, respond to the needs of their customers faster and more efficiently, all while giving developers the best developer experience.”

#aparna-sinha, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #developer, #encryption, #google, #google-cloud, #google-compute-engine, #ikea, #online-sales, #product-management, #serverless-computing, #web-services

Google Cloud teams up with SpaceX’s Starlink for enterprise connectivity at network’s edge

SpaceX’s burgeoning Starlink satellite-based broadband internet service just got a big boost from a significant new partner: Google Cloud. Thanks to a new partnership between the two, SpaceX will now be locating Starlink ground stations right within Google’s existing data centers, providing the Starlink network with direct access to ground-based network infrastructure to help facilitate network connections for customers who are on the edges of the footprint of existing network access.

Starlink’s entire aim is to provide reliable, broadband-quality connections to areas that have typically been hard or impossible to reach with legacy ground-based network infrastructure, including cellular networks. The tie-up with Google means that business and public sector customers taking advantage of that new network reach will have access not only to internet connections, but also to cloud-based infrastructure and applications, including AI and machine learning capabilities, analytics and more.

This should not only bolster Starlink’s reliability in terms of its consumer clients, but also provide key capabilities for serving enterprise customers — another key target demographic for the growing Starlink business, though much of the public focus thus far for Starlink’s roll-out has been on residential access across its expanding beta.

Google and Starlink expect the offering to become available to enterprise customers sometime in the “second half of 2021,” according to a press release issued by the companies.

SpaceX has been very aggressive in building out the Starlink network in the past few months, launching 480 satellites in just around three months. All that in-space infrastructure buildout could well have been preamble to this collaboration and enterprise-focused service launch, in addition to helping SpaceX expand Starlink consumer service quality and availability.

#artificial-intelligence, #broadband, #google, #google-cloud, #internet-access, #machine-learning, #space, #spacecraft, #spaceflight, #spacex, #starlink, #tc, #telecommunications

Google’s Anthos multi-cloud platform gets improved logging, Windows container support and more

Google today announced a sizable update to its Anthos multi-cloud platform that lets you build, deploy and manage containerized applications anywhere, including on Amazon’s AWS and (in preview) on Microsoft Azure.

Version 1.7 includes new features like improved metrics and logging for Anthos on AWS, a new Connect gateway to interact with any cluster right from Google Cloud and a preview of Google’s managed control plane for Anthos Service Mesh. Other new features include Windows container support for environments that use VMware’s vSphere platform and new tools for developers to make it easier for them to deploy their applications to any Anthos cluster.

Today’s update comes almost exactly two years after Google CEO Sundar Pichai originally announced Anthos at its Cloud Next event in 2019 (before that, Google called this project the ‘Google Cloud Services Platform,’ which launched three years ago). Hybrid- and multi-cloud, it’s fair to say, takes a key role in the Google Cloud roadmap — and maybe more so for Google than for any of its competitors. And recently, Google brought on industry veteran Jeff Reed to become the VP of Product Management in charge of Anthos.

Reed told me that he believes that there are a lot of factors right now that are putting Anthos in a good position. “The wind is at our back. We bet on Kubernetes, bet on containers — those were good decisions,” he said. Increasingly, customers are also now scaling out their use of Kubernetes and have to figure out how to best scale out their clusters and deploy them in different environments — and to do so, they need a consistent platform across these environments. He also noted that when it comes to bringing on new Anthos customers, it’s really those factors that determine whether a company will look into Anthos or not.

He acknowledged that there are other players in this market, but he argues that Google Cloud’s take on this is also quite different. “I think we’re pretty unique in the sense that we’re from the cloud, cloud-native is our core approach,” he said. “A lot of what we talk about in [Anthos] 1.7 is about how we leverage the power of the cloud and use what we call ‘an anchor in the cloud’ to make your life much easier. We’re more like a cloud vendor there, but because we support on-prem, we see some of those other folks.” Those other folks being IBM/Red Hat’s OpenShift and VMware’s Tanzu, for example. 

The addition of support for Windows containers in vSphere environments also points to the fact that a lot of Anthos customers are classical enterprises that are trying to modernize their infrastructure, yet still rely on a lot of legacy applications that they are now trying to bring to the cloud.

Looking ahead, one thing we’ll likely see is more integrations with a wider range of Google Cloud products into Anthos. And indeed, as Reed noted, inside of Google Cloud, more teams are now building their products on top of Anthos themselves. In turn, that then makes it easier to bring those services to an Anthos-managed environment anywhere. One of the first of these internal services that run on top of Anthos is Apigee. “Your Apigee deployment essentially has Anthos underneath the covers. So Apigee gets all the benefits of a container environment, scalability and all those pieces — and we’ve made it really simple for that whole environment to run kind of as a stack,” he said.

I guess we can expect to hear more about this in the near future — or at Google Cloud Next 2021.

#anthos, #apigee, #aws, #ceo, #chrome-os, #cisco, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #enterprise, #google, #google-cloud, #google-cloud-platform, #ibm, #kubernetes, #microsoft, #microsoft-windows, #red-hat, #sundar-pichai, #vmware

Google Cloud joins the FinOps Foundation

Google Cloud today announced that it is joining the FinOps Foundation as a Premier Member.

The FinOps Foundation is a relatively new open-source foundation, hosted by the Linux Foundation, that launched last year. It aims to bring together companies in the ‘cloud financial management’ space to establish best practices and standards. As the term implies, ‘cloud financial management’ is about the tools and practices that help businesses manage and budget their cloud spend. There’s a reason, after all, that there are a number of successful startups that do nothing else but help businesses optimize their cloud spend (and ideally lower it).

Maybe it’s no surprise that the FinOps Foundation was born out of Cloudability’s quarterly Customer Advisory Board meetings. Until now, CloudHealth by VMware was the Foundation’s only Premier Member among its vendor members. Other members include Cloudability, Densify, Kubecost and SoftwareOne. With Google Cloud, the Foundation has now signed up its first major cloud provider.

“FinOps best practices are essential for companies to monitor, analyze, and optimize cloud spend across tens to hundreds of projects that are critical to their business success,” said Yanbing Li, Vice President of Engineering and Product at Google Cloud. “More visibility, efficiency, and tools will enable our customers to improve their cloud deployments and drive greater business value. We are excited to join FinOps Foundation, and together with like-minded organizations, we will shepherd behavioral change throughout the industry.”

Google Cloud has already committed to sending members to some of the Foundation’s various Special Interest Groups (SIGs) and Working Groups to “help drive open source standards for cloud financial management.”

“The practitioners in the FinOps Foundation greatly benefit when market leaders like Google Cloud invest resources and align their product offerings to FinOps principles and standards,” said J.R. Storment, Executive Director of the FinOps Foundation. “We are thrilled to see Google Cloud increase its commitment to the FinOps Foundation, joining VMware as the 2nd of 3 dedicated Premier Member Technical Advisory Council seats.”

#cloud, #cloud-computing, #cloud-infrastructure, #cloudability, #computing, #densify, #enterprise, #google, #google-cloud, #linux, #linux-foundation, #vmware

Google Cloud launches a new support option for mission critical workloads

Google Cloud today announced the launch of a new support option for its Premium Support customers that run mission-critical services on its platform. The new service, imaginatively dubbed Mission Critical Services (MCS), brings Google’s own experience with Site Reliability Engineering to its customers. This is not Google completely taking over the management of these services, though. Instead, the company describes it as a “consultative offering in which we partner with you on a journey toward readiness.”

Initially, Google will work with its customers to improve — or develop — the architecture of their apps and help them instrument the right monitoring systems and controls, as well as help them set and raise their service-level objectives (a key feature in the Site Reliability Engineering philosophy).
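
For readers unfamiliar with the SRE vocabulary, a service-level objective translates directly into an “error budget,” the amount of unreliability a service is allowed over a window. A minimal illustration of that arithmetic:

```python
# The arithmetic behind an availability SLO: the error budget is simply the
# downtime the objective permits over the measurement window.
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Allowed downtime, in minutes, for a given availability SLO."""
    return (1 - slo) * window_days * 24 * 60

for slo in (0.99, 0.999, 0.9999):
    print(f"{slo:.2%} availability -> {error_budget_minutes(slo):.1f} min/month")
# 99.00% -> 432.0, 99.90% -> 43.2, 99.99% -> 4.3
```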

Later, Google will also provide ongoing check-ins with its engineers and walk customers through tune-ups and architecture reviews. “Our highest tier of engineers will have deep familiarity with your workloads, allowing us to monitor, prevent, and mitigate impacts quickly, delivering the fastest response in the industry. For example, if you have any issues, 24 hours a day, seven days a week, we’ll spin up a live war room with our experts within five minutes,” Google Cloud’s VP for Customer Experience, John Jester, explains in today’s announcement.

This new offering is another example of how Google Cloud is trying to differentiate itself from the rest of the large cloud providers. Its emphasis today is on providing the high-touch service experiences that were long missing from its platform, with a clear emphasis on the needs of large enterprise customers. That’s what Thomas Kurian promised to do when he became the organization’s CEO and he’s clearly following through.

#artificial-intelligence, #ceo, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #enterprise, #google, #google-cloud, #google-workspace, #technology, #thomas-kurian, #world-wide-web

Databricks brings its lakehouse to Google Cloud

Databricks and Google Cloud today announced a new partnership that will bring to Databricks customers a deep integration with Google’s BigQuery platform and Google Kubernetes Engine. This will allow Databricks’ users to bring their data lakes and the service’s analytics capabilities to Google Cloud.

Databricks already features a deep integration with Microsoft Azure — one that goes well beyond this new partnership with Google Cloud — and the company is also an AWS partner. By adding Google Cloud to this list, the company can now claim to be the “only unified data platform available across all three clouds (Google, AWS and Azure).”

It’s worth stressing, though, that Databricks’ Azure integration is a bit of a different deal from this new partnership with Google Cloud. “Azure Databricks is a first-party Microsoft Azure service that is sold and supported directly by Microsoft. The first-party service is unique to our Microsoft partnership. Customers on Google Cloud will purchase directly from Databricks through the Google Cloud Marketplace,” a company spokesperson told me. That makes it a bit more of a run-of-the-mill partnership compared to the Microsoft deal, but that doesn’t mean the two companies aren’t just as excited about it.

“We’re delighted to deliver Databricks’ lakehouse for AI and ML-driven analytics on Google Cloud,” said Google Cloud CEO Thomas Kurian (or, more likely, one of the company’s many PR specialists who likely wrote and re-wrote this for him a few times before it got approved). “By combining Databricks’ capabilities in data engineering and analytics with Google Cloud’s global, secure network—and our expertise in analytics and delivering containerized applications—we can help companies transform their businesses through the power of data.”

Similarly, Databricks CEO Ali Ghodsi noted that he is “thrilled to partner with Google Cloud and deliver on our shared vision of a simplified, open, and unified data platform that supports all analytics and AI use-cases that will empower our customers to innovate even faster.”

And indeed, this is clearly a thrilling delight for everybody around, including customers like Conde Nast, whose Director of Data Engineering Nana Essuman is “excited to see leaders like Google Cloud and Databricks come together to streamline and simplify getting value from data.”

If you’re also thrilled about this, you’ll be able to hear more about it from both Ghodsi and Kurian at an event on April 6 that is apparently hosted by TechCrunch (though this is the first I’ve heard of it, too).

#ali-ghodsi, #artificial-intelligence, #aws, #bigquery, #cloud-computing, #cloud-infrastructure, #computing, #conde-nast, #databricks, #google, #google-cloud, #microsoft, #microsoft-azure, #partner, #tc, #thomas-kurian

Twitter expands Google Cloud partnership to ‘learn more from data, move faster’

Twitter is upping its data analytics game in the form of an expanded, multiyear partnership with Google Cloud.

The social media giant first began working with Google in 2018 to move Hadoop clusters to the Google Cloud platform as a part of its Partly Cloudy strategy.

With the expanded agreement, Twitter will move its offline analytics, data processing and machine learning workloads to Google’s Data Cloud.

I talked with Sudhir Hasbe, Google Cloud’s director of product management and data analytics, to better understand just what this means. He said the move will give Twitter the ability to analyze data faster as part of its goal to provide a better user experience.

You see, behind every tweet, like and retweet, there is a series of data points that helps Twitter understand things like just how people are using the service, and what type of content they might want to see.

Twitter’s data platform ingests trillions of events, processes hundreds of petabytes of data and runs tens of thousands of jobs on over a dozen clusters daily. 

By expanding its partnership with Google, Twitter is essentially adopting the company’s Data Cloud, including BigQuery, Dataflow, BigTable and machine learning (ML) tools to make more sense of, and improve, how Twitter features are used.
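
As an illustration of what such an offline analytics workload looks like in practice, here is a hedged sketch of an Apache Beam pipeline, executed by Dataflow, that counts events by type and writes the totals to BigQuery. All resource names are placeholders; nothing here comes from Twitter’s actual codebase.

```python
# A hedged sketch of a Dataflow (Apache Beam) batch job: count events by type
# and write the totals to BigQuery. Bucket, project and dataset names are
# placeholders; input is assumed to be newline-delimited JSON.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-gcp-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
        | "Parse" >> beam.Map(json.loads)  # one JSON event per line
        | "KeyByType" >> beam.Map(lambda e: (e["event_type"], 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: {"event_type": kv[0], "count": kv[1]})
        | "Write" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.event_counts",
            schema="event_type:STRING,count:INTEGER",
        )
    )
```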

Twitter declined a request for an interview but CTO Parag Agrawal said in a written statement that the company’s initial partnership was successful and led to enhanced productivity on the part of its engineering teams.  

“Building on this relationship and Google’s technologies will allow us to learn more from our data, move faster and serve more relevant content to the people who use our service every day,” he said.

Google Cloud’s Hasbe believes that organizations like Twitter need a highly scalable analytics platform so they can derive value from all the data they collect. By expanding its partnership with Google, Twitter can get significantly more use cases out of its cloud platform.

“Our platform is serverless and we can help organizations, like Twitter, automatically scale up and down,” Hasbe told TechCrunch.

“Twitter can bring massive amounts of data, analyze and get insights without the burden of having to worry about infrastructure or capacity management or how many machines or servers they might need,” he added. “None of that is their problem.” 

The shift will also make it easier for Twitter’s data scientists and other similar personnel to build machine learning models and do predictive analytics, according to Hasbe.

Other organizations that have recently turned to Google Cloud to help navigate the pandemic include Bed Bath & Beyond, Wayfair, Etsy and The Home Depot.

On February 2, TC’s Frederic Lardinois reported that while Google Cloud is seeing accelerated revenue growth, its losses are also increasing. This week, Google disclosed operating income/loss for its Google Cloud business unit in its quarterly earnings. Google Cloud lost $5.6 billion in Google’s fiscal year 2020, which ended December 31. That’s on $13 billion of revenue.

#apache-hadoop, #cloud, #cloud-computing, #cloud-infrastructure, #data-analysis, #data-processing, #google-cloud, #google-cloud-platform, #machine-learning, #twitter

Google Cloud launches Apigee X, the next generation of its API management platform

Google today announced the launch of Apigee X, the next major release of the Apigee API management platform it acquired back in 2016.

“If you look at what’s happening — especially after the pandemic started in March last year — the volume of digital activities has gone up in every kind of industry, all kinds of use cases are coming up. And one of the things we see is the need for a really high-performance, reliable, global digital transformation platform,” Amit Zavery, Google Cloud’s head of platform, told me.

He noted that the number of API calls has gone up 47 percent from last year and that the platform now handles about 2.2 trillion API calls per year.

At the core of the updates are deeper integrations with Google Cloud’s AI, security and networking tools. In practice, this means Apigee users can now deploy their APIs across 24 Google Cloud regions, for example, and use Google’s caching services in more than 100 edge locations.

In addition, Apigee X now integrates with Google’s Cloud Armor firewall and its Cloud Identity Access Management platform. This also means that Apigee users won’t have to use third-party tools for their firewall and identity management needs.

“We do a lot of AI/ML-based anomaly detection and operations management,” Zavery explained. “We can predict any kind of malicious intent or any other things which might happen to those API calls or your traffic by embedding a lot of those insights into our API platform. I think [that] is a big improvement, as well as new features, especially in operations management, security management, vulnerability management and making those a core capability so that as a business, you don’t have to worry about all these things. It comes with the core capabilities and that is really where the front doors of digital front-ends can shine and customers can focus on that.”

The platform now also makes better use of Google’s AI capabilities to help users identify anomalies or predict traffic for peak seasons. The idea here is to help customers automate a lot of standard tasks and, of course, improve security at the same time.

As Zavery stressed, API management is now about more than just managing traffic between applications. But more than just helping customers manage their digital transformation projects, the Apigee team is now thinking about what it calls ‘digital excellence.’ “That’s how we’re thinking of the journey for customers moving from not just ‘hey, I can have a front end,’ but what about all the excellent things you want to do and how we can do that,” Zavery said.

“During these uncertain times, organizations worldwide are doubling-down on their API strategies to operate anywhere, automate processes, and deliver new digital experiences quickly and securely,” said James Fairweather, Chief Innovation Officer at Pitney Bowes. “By powering APIs with new capabilities like reCAPTCHA Enterprise, Cloud Armor (WAF), and Cloud CDN, Apigee X makes it easy for enterprises like us to scale digital initiatives, and deliver innovative experiences to our customers, employees and partners.”

#api, #apigee, #artificial-intelligence, #caching, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #developer, #enterprise, #firewall, #google, #google-cloud, #google-cloud-platform

Google Cloud lost $5.6B in 2020

Google continues to bet heavily on Google Cloud and while it is seeing accelerated revenue growth, its losses are also increasing. Today, for the first time, Google disclosed operating income/loss for its Google Cloud business unit in its quarterly earnings. Google Cloud lost $5.6 billion in Google’s fiscal year 2020, which ended December 31. That’s on $13 billion of revenue.

While this may look a bit dire at first glance (cloud computing should be pretty profitable, after all), there are different ways of looking at this. On the one hand, losses are mounting, up from $4.3 billion in 2018 and $4.6 billion in 2019, but revenue is also seeing strong growth, up from $5.8 billion in 2018 and $8.9 billion in 2019. What we’re seeing here, more than anything else, is Google investing heavily in its cloud business.
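
Restating the article’s own figures as operating margins makes the trend easier to see:

```python
# Operating margin from the numbers above (figures in $B).
revenue = {2018: 5.8, 2019: 8.9, 2020: 13.0}
operating_loss = {2018: 4.3, 2019: 4.6, 2020: 5.6}

for year in revenue:
    margin = -operating_loss[year] / revenue[year]
    print(f"{year}: {margin:.0%} operating margin")
# 2018: -74%, 2019: -52%, 2020: -43% (losses grow, but margins improve)
```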

Google’s Cloud unit, led by its CEO Thomas Kurian, includes all of its cloud infrastructure and platform services, as well as Google Workspace (which you probably still refer to as G Suite). And that’s exactly where Google is making a lot of investments right now. Data centers, after all, don’t come cheap and Google Cloud launched four new regions in 2020 and started work on others. That’s on top of its investment in its core services and a number of acquisitions.

“Our strong fourth quarter performance, with revenues of $56.9 billion, was driven by Search and YouTube, as consumer and business activity recovered from earlier in the year,” Ruth Porat, CFO of Google and Alphabet, said. “Google Cloud revenues were $13.1 billion for 2020, with significant ongoing momentum, and we remain focused on delivering value across the growth opportunities we see.”

For now, though, Google’s core business, which saw a strong rebound in its advertising business in the last quarter, is subsidizing its cloud expansion.

Meanwhile, over in Seattle, AWS today reported revenue of $12.74 billion in the last quarter alone and operating income of $3.56 billion. For 2020, AWS’s operating income was $13.5 billion.

#alphabet, #amazon-web-services, #artificial-intelligence, #aws, #ceo, #cfo, #cloud-computing, #cloud-infrastructure, #companies, #computing, #diane-greene, #earnings, #google, #google-cloud, #google-cloud-platform, #ruth-porat, #seattle, #thomas-kurian, #world-wide-web

Ford bets on Google Cloud for its digital transformation

Google and Ford today announced a new partnership around bringing Android Automotive to Ford’s Ford- and Lincoln-branded cars, starting in 2023. But at the same time, the two companies also announced that Ford has chosen Google Cloud as its preferred cloud provider.

“With Google Cloud, Ford will digitally transform from the front office to the car to the manufacturing plant floor,” Google Cloud CEO Thomas Kurian said in a press conference today. “And there are a number of different applications, including modernizing product development, improving manufacturing and supply chain management, using computer vision AI for employee training, inspection of equipment on the assembly line and other applications.”

Kurian also noted that Google and Ford are working to find new ways to monetize Ford’s data through features like maintenance requests and trade-in alerts.

“At Ford, we’ve got world-class in-house data insights and analytics teams,” David McClelland, Ford’s VP for strategy and partnerships, said. “We’ve recruited significant software expertise and we’re making great progress in this area. And we’re moving rapidly towards commercializing our new self-driving business. And with this news that Thomas [Kurian] and I are announcing today, we’re turbocharging all of that.”

McClelland stressed that Google “brought the entire company to the table for us across cloud, Android, Maps and much more.” It’s maybe also no surprise, given Google’s expertise in this area, that Ford is looking to leverage Google Cloud’s AI tools as well. This work will go beyond the actual driving experience, too, and include work on modernizing Ford’s product development, manufacturing and supply chain, as well as predictive maintenance in Ford’s plants.

Like other car manufacturers, Ford is also looking to find ways to use the data it collects to create a connection to its drivers that goes beyond the buying experience and (maybe) the occasional maintenance visit to a dealership. For this to work, it needs to be able to understand its customers and offer personalized experiences.

Today’s announcement marks a bit of a turnaround for Ford, which had previously banded together with a group of other car manufacturers with the explicit goal of keeping Google’s role in the automotive industry to a minimum. Now, only a few years later, the two are coming together in one of the deeper partnerships in the industry.

It’s also worth mentioning that, not too long ago, Ford had a deep partnership with Microsoft, which provided the basis of Ford’s Sync technology.

“From the first moving assembly line to the latest driver-assist technology, Ford has set the pace of innovation for the automotive industry for nearly 120 years,” said Sundar Pichai, CEO of Google and Alphabet. “We’re proud to partner to apply the best of Google’s AI, data analytics, compute and cloud platforms to help transform Ford’s business and build automotive technologies that keep people safe and connected on the road.”

#android, #artificial-intelligence, #automotive, #ceo, #cloud, #companies, #computing, #ford, #ford-motor-company, #google, #google-cloud, #henry-ford, #lincoln, #manufacturing, #supply-chain-management, #technology, #thomas-kurian

Three-dimensional search engine Physna wants to be the Google of the physical world

In June of 1999, Sequoia Capital and Kleiner Perkins invested $25 million into an early stage company developing a new search engine called Google, paving the way for a revolution in how knowledge online was organized and shared.

Now, Sequoia Capital is placing another bet on a different kind of search engine, one for physical objects in three dimensions, just as the introduction of three dimensional sensing technologies on consumer phones is poised to create a revolution in spatial computing.

At least, that’s the bet that Sequoia Capital’s Shaun Maguire is making on the Cincinnati, Ohio-based startup Physna.

Maguire and Sequoia are leading a $20 million bet into the company alongside Drive Capital, the Columbus, Ohio-based venture firm founded by two former Sequoia partners, Mark Kvamme and Chris Olsen.

“There’s been this open problem in mathematics, which is how you do three dimensional search. How do you define a metric that gives you other similar three dimensional objects. This has a long history in mathematics,” Maguire said. “When I first met [Physna founder] Paul Powers, he had already come up with a wildly novel distance metric to compare different three dimensional objects. If you have one distance metric, you can find other objects that are a distance away. His thinking underlying that is so unbelievably creative. If I were to put it in the language of modern mathematics… it just involves a lot of really advanced ideas that actually also works.”
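
Physna’s actual metric is proprietary and Maguire doesn’t spell it out, but for intuition, here is a classic textbook distance between two 3D point clouds, the symmetric Hausdorff distance, computed with SciPy; similar shapes yield small values.

```python
# For intuition only: the symmetric Hausdorff distance between two 3D point
# clouds, a classic shape-comparison metric (not Physna's proprietary one).
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
model_a = rng.random((1000, 3))                     # points sampled from shape A
model_b = model_a + rng.normal(0, 0.01, (1000, 3))  # a slightly perturbed copy

d = max(
    directed_hausdorff(model_a, model_b)[0],
    directed_hausdorff(model_b, model_a)[0],
)
print(f"symmetric Hausdorff distance: {d:.4f}")     # small value -> similar shapes
```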

Powers’ idea — and Physna’s technology — was a long time coming.

A lawyer by training and an entrepreneur at heart, Powers came to the problem of three dimensional search through his old day job as an intellectual property lawyer.

Powers chose IP law because he thought it was the most interesting way to operate at the intersection of technology and law — and would provide good grounding for whatever company the serial entrepreneur would eventually launch next. While practicing, Powers hit upon a big problem: while some intellectual property theft around software and services was easy to catch, it was harder to identify when actual products or parts were being stolen as trade secrets. “We were always able to find 2D intellectual property theft,” Powers said, but catching IP theft in three dimensions was elusive.

From its launch in 2015 through 2019, Powers worked with co-founder and chief technology officer Glenn Warner Jr. on developing the product, which was initially intended to protect product designs from theft. Tragically just as the company was getting ready to unveil its transformation into the three dimensional search engine it had become, Warner died.

Powers soldiered on, rebuilding the company and its executive team with the help of Dennis DeMeyere, who joined the company in 2020 after a stint in Google’s office of the chief technology officer and technical director for Google Cloud.

“When I moved, I jumped on a plane with two checked bags and moved into a hotel, until I could rent a fully furnished home,” DeMeyere told Protocol last year.

Other heavy hitters were also drawn to the Cincinnati-based company thanks in no small part to Olsen and Kvamme’s Silicon Valley connections. They include GitHub’s chief technology officer, Jason Warner, who has a seat on the company’s board of directors alongside Drive Capital’s co-founder Kvamme, who serves as the chairman.

In Physna, Kvamme, Maguire, and Warner see a combination of GitHub and Google — especially after the launch last year of the company’s consumer-facing site, Thangs.

That site allows users to search for three dimensional objects by a description or by uploading a model or image. As Mike Murphy at Protocol noted, it’s a bit like Thingiverse, Yeggi or other sites used by 3D-printing hobbyists. What the site can also do is show users the collaborative history of each model and the model’s component parts — if it involves different objects.

Hence the GitHub and Google combination. And users can set up profiles to store their own models or collaborate and comment on public models.

What caught Maguire’s eye about the company was the way users were gravitating to the free site. “There were tens of thousands of people using it every day,” he said. It’s a replica of the way many successful companies try a freemium or professional consumer hybrid approach to selling products. “They have a free version and people are using it all the time and loving it. That is a foundation that they can build from,” said Maguire.

And Maguire thinks that the spatial computing wave is coming sooner than anyone may realize. “The new iPhone has LIDAR on it… This is the first consumer device that comes shipped with a 3D scanner with LIDAR and I think three dimensional is about to explode.”

Eventually, Physna could be a technology hub where users can scan three dimensional objects into their phones and have a representational model for reproduction either as a virtual object or as something that can be converted into a file for 3D printing.

Right now, hundreds of businesses have approached the company with different requests for how to apply its technology, according to Powers.

“One new feature will allow you to take a picture of something and not only show you what that is or where it goes, even if that is into a part of the assembly. We shatter a vase and with the vase shards we can show you how the pieces fit back together,” Powers said.

Typical contracts for the company’s software range from $25,000 to $50,000 for enterprise customers, but the software that powers Physna’s product is more than just a single application, according to Powers.

“We’re not just a product. We’re a fundamental technology,” said Powers. “There is a gap between the physical and the digital.”

For Sequoia and Drive Capital, Physna’s software is the technology to bridge that gap.

#california, #chairman, #chief-technology-officer, #chris-olsen, #cincinnati, #co-founder, #columbus, #computing, #drive-capital, #entrepreneur, #executive, #github, #google, #google-cloud, #iphone, #kleiner-perkins, #lawyer, #mark-kvamme, #ohio, #printing, #search-engine, #sequoia, #sequoia-capital, #sequoia-partners, #serial-entrepreneur, #shaun-maguire, #tc

Supabase raises $6M for its open-source Firebase alternative

Supabase, a YC-incubated startup that offers developers an open-source alternative to Google’s Firebase and similar platforms, today announced that it has raised a $6 million funding round led by Coatue, with participation from YC, Mozilla and a group of about 20 angel investors.

Currently, Supabase includes support for PostgreSQL databases and authentication tools, with a storage and serverless solution coming soon. It currently provides all the usual tools for working with databases — and listening to database changes — as well as a web-based UI for managing them. The team is quick to note that while the comparison with Google’s Firebase is inevitable, it is not meant to be a 1-to-1 replacement for it. And unlike Firebase, which uses a NoSQL database, Supabase is using PostgreSQL.
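
Supabase’s realtime layer actually works by tailing Postgres’s replication stream, but as a simpler stand-in for what “listening to database changes” means in Postgres terms, here is the database’s built-in LISTEN/NOTIFY mechanism via psycopg2; the connection details and channel name are placeholders.

```python
# Postgres-native change listening with LISTEN/NOTIFY (a stand-in
# illustration, not Supabase's replication-based implementation).
import select

import psycopg2
import psycopg2.extensions

conn = psycopg2.connect("dbname=app user=postgres host=localhost")  # placeholder DSN
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

cur = conn.cursor()
cur.execute("LISTEN table_changes;")  # a trigger would NOTIFY on this channel

while True:
    # Wait (up to 5s) for data on the connection, then drain notifications.
    if select.select([conn], [], [], 5) != ([], [], []):
        conn.poll()
        while conn.notifies:
            note = conn.notifies.pop(0)
            print(f"change event: {note.payload}")
```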

Indeed, the team relies heavily on existing open-source projects and contributes to them where it can. One of Supabase’s full-time employees maintains the PostgREST tool for building APIs on top of the database, for example.
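
PostgREST’s job is to expose a Postgres schema as a REST API. As a hedged sketch following Supabase’s documented conventions, querying a hypothetical “todos” table through such an endpoint looks like this (the URL and key are placeholders):

```python
# Querying a table through a PostgREST-style endpoint. URL, key and table
# name are placeholders following Supabase's documented conventions.
import requests

SUPABASE_URL = "https://my-project.supabase.co"  # placeholder
SUPABASE_KEY = "public-anon-key"                 # placeholder

rows = requests.get(
    f"{SUPABASE_URL}/rest/v1/todos",
    params={"select": "*", "done": "eq.false"},  # PostgREST filter syntax
    headers={"apikey": SUPABASE_KEY, "Authorization": f"Bearer {SUPABASE_KEY}"},
).json()
print(rows)
```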

“We’re not trying to build another system,” Supabase co-founder and CEO Paul Copplestone told me. “We just believe that already there are well-trusted, scalable enterprise open-source products out there and they just don’t have this usability component. So actually right now, Supabase is an amalgamation of six tools, soon to be seven. Some of them we built ourselves. If we go to market and can’t find anything that we think is going to be scalable — or really solve the problems — then we’ll build it and we’ll open-source it. But otherwise, we’ll use existing tools.”

The traditional route to market for open-source tools is to create a tool and then launch a hosted version — maybe with some additional features — to monetize the work. Supabase took a slightly different route and launched a hosted version right away.

If somebody wants to host the service themselves, the code is available, but running your own PaaS is obviously a major challenge, which is also why the team went with this approach. What you get with Firebase, he noted, is that it’s a few clicks to set everything up. Supabase wanted to be able to offer the same kind of experience. “That’s one thing that self-hosting just cannot offer,” he said. “You can’t really get the same wow factor that you can if we offered a hosted platform where you literally [have] one click and then a couple of minutes later, you’ve got everything set up.”

In addition, he also noted that he wanted to make sure the company could support the growing stable of tools it was building and commercializing its tools based on its database services was the easiest way to do so.

Like other Y Combinator startups, Supabase closed its funding round after the accelerator’s demo day in August. The team had considered doing a SAFE round, but it found the right group of institutional investors that offered founder-friendly terms to go ahead with this institutional round instead.

“It’s going to cost us a lot to compete with the generous free tier that Firebase offers,” Copplestone said. “And it’s databases, right? So it’s not like you can just keep them stateless and shut them down if you’re not really using them. [This funding round] gives us a long, generous runway and more importantly, for the developers who come in and build on top of us, [they can] take as long as they want and then start monetizing later on themselves.”

The company plans to use the new funding to continue to invest in its various tools and hire to support its growth.

“Supabase’s value proposition of building in a weekend and scaling so quickly hit home immediately,” said Caryn Marooney, general partner at Coatue and Facebook’s former VP of Global Communications. “We are proud to work with this team, and we are excited by their laser focus on developers and their commitment to speed and reliability.”

#caryn-marooney, #cloud-computing, #coatue, #computing, #database, #developer, #firebase, #google-cloud, #nosql, #platform-as-a-service, #postgresql, #recent-funding, #serverless-computing, #startups, #supabase, #tc

Google acquires Actifio to step into the area of data management and business continuity

In the same week that Amazon is holding its big AWS confab, Google is also announcing a move to raise its own enterprise game with Google Cloud. Today the company announced that it is acquiring Actifio, a data management company that helps businesses maintain data continuity so they are better prepared in the event of a security breach or other disaster recovery need. The deal squares Google up as a competitor against the likes of Rubrik, another big player in data continuity.

The terms of the deal were not disclosed in the announcement; we’re looking and will update as we learn more. Notably, when the company was valued at over $1 billion in a funding round back in 2014, it had said it was preparing for an IPO (which never happened). PitchBook data estimated its value at $1.3 billion in 2018, but earlier this year it appeared to be raising money at about a 60% discount to its recent valuation, according to data provided to us by Prime Unicorn Index.

The company was also involved in a patent infringement suit against Rubrik, which it filed earlier this year.

It had raised around $461 million, with investors including Andreessen Horowitz, TCV, Tiger, 83 North, and more.

With Actifio, Google is moving into what is one of the key investment areas for enterprises in recent years. The growth of increasingly sophisticated security breaches, coupled with stronger data protection regulation, has given a new priority to the task of holding and using business data more responsibly, and business continuity is a cornerstone of that.

Google describes the startup as a “leader in backup and disaster recovery” providing virtual copies of data that can be managed and updated for storage, testing, and more. The fact that it covers data in a number of environments — including SAP HANA, Oracle, Microsoft SQL Server, PostgreSQL, and MySQL databases; virtual machines (VMs) in VMware and Hyper-V; physical servers; and, of course, Google Compute Engine — means that it also gives Google a strong play to work with companies in hybrid and multi-vendor environments rather than just all-Google shops.

“We know that customers have many options when it comes to cloud solutions, including backup and DR, and the acquisition of Actifio will help us to better serve enterprises as they deploy and manage business-critical workloads, including in hybrid scenarios,” writes Brad Calder, VP of engineering, in the blog post. “In addition, we are committed to supporting our backup and DR technology and channel partner ecosystem, providing customers with a variety of options so they can choose the solution that best fits their needs.”

The company will join Google Cloud.

“We’re excited to join Google Cloud and build on the success we’ve had as partners over the past four years,” said Ash Ashutosh, CEO at Actifio, in a statement. “Backup and recovery is essential to enterprise cloud adoption and, together with Google Cloud, we are well-positioned to serve the needs of data-driven customers across industries.”

#actifio, #enterprise, #google-cloud

Amazon to invest $2.8 billion to build its second data center region in India

Amazon will invest about $2.8 billion in Telangana to set up a new AWS Cloud region in the southern state of India, a top Indian politician announced on Friday.

The investment will allow Amazon to launch an AWS Cloud region in Hyderabad city by mid-2022, said K. T. Rama Rao, Minister for Information Technology, Electronics & Communications, Municipal Administration and Urban Development and Industries & Commerce Departments, Government of Telangana.

The new region will be Amazon’s second AWS infrastructure region in India, Amazon said in a press release. The company itself did not disclose the size of the investment.

“The new AWS Asia Pacific (Hyderabad) Region will enable even more developers, startups, and enterprises as well as government, education, and non-profit organizations to run their applications and serve end users from data centers located in India,” the e-commerce giant said.

“Businesses in India are embracing cloud computing to reduce costs, increase agility, and enable rapid innovation to meet the needs of billions of customers in India and abroad,” said Peter DeSantis, Senior Vice President of Global Infrastructure and Customer Support, Amazon Web Services, in a statement. “Together with our AWS Asia Pacific (Mumbai) Region, we’re providing customers with more flexibility and choice, while allowing them to architect their infrastructure for even greater fault tolerance, resiliency, and availability across geographic locations.”

The investment illustrates the opportunities Amazon, which has poured over $6.5 billion into its India operations to date and leads the cloud market in the nation, sees in the world’s second largest internet market.

“This is a big win for the state government of Telangana for attracting this level of investment,” said Jayanth Kolla, chief analyst at consultancy firm Convergence Catalyst. He told TechCrunch that the move will also help Amazon better comply with India’s data localization policy. “We could see states launch their own similar laws in the future.”

AWS has courted several high-profile businesses as customers in recent years. Some of these include automobile giant Ashok Leyland, life insurance firm Aditya Birla Capital, edtech giant Byju’s, Axis Bank, Bajaj Capital, ClearTax, Dream11, Druva, Edelweiss, Edunext, Extramarks, Freshworks, HDFC Life, Mahindra Electric, Ola, Oyo, Policybazaar, Quantela, RBL Bank, redBus, Sharda University, Swiggy, Tata Sky, and Zerodha.

More to follow…

#amazon, #asia, #aws, #aws-region, #azure, #column, #google, #google-cloud, #india, #microsoft

Chronicle’s ex-CSO is starting a new company he doesn’t want you to know much about

Mike Wiacek is back with a new company, a year after departing his role as chief security officer at Chronicle, Google’s moonshot cybersecurity company.

Wiacek spent 13 years at Google, founding the company’s Threat Analysis Group, a unit dedicated to countering government and state-backed hacking efforts, and also co-founding Chronicle, which was rolled into Google Cloud after a series of high-profile executive departures, including his own.

Armed with over a decade of experience, Wiacek is now steering the ship at Stairwell, his new cybersecurity startup. Stairwell is now out of stealth after almost a year, having secured $4.5 million in seed funding led by venture firm Accel, with participation from Sequoia Capital, Gradient Ventures, and Allen & Company.

In a press release, the company said it wants “to provide security teams with accessible, user-centric tools that help them understand the pivotal relationships between their external and internal data sources. With this intelligence, organizations will be in a much stronger position to proactively combat the most sophisticated and dangerous cyber attacks.” And, in a call with TechCrunch, Wiacek said his vision for the new company is to “empower any team to defend against every attacker.”

But the founder took the unusual step of declining to say how the company plans to get there. Wiacek confirmed that Stairwell is building at least one product, but declined to offer details of what it is, what it does, or when it will be out.

Admittedly, that made the ensuing conversation rather difficult.

Wiacek said Jan Kang, former chief legal officer at Chronicle, has joined the startup, and the seed round will go towards staffing his team. Stairwell has ten employees at launch, largely focused on engineering, and plans to also include designers, physicists, and applied mathematicians.

As for what the company is working on, “I think it’s one of those we just have to wait and see,” said Wiacek.

I guess we’ll have to.

#accel, #allen-company, #business, #california, #companies, #cybersecurity-startup, #google, #google-cloud, #recent-funding, #security, #startup-company, #startups

Google Cloud launches its Business Application Platform based on Apigee and AppSheet

Unlike some of its competitors, Google Cloud has recently started emphasizing how its large lineup of different services can be combined to solve common business problems. Instead of trying to sell individual services, Google is focusing on solutions and the latest effort here is what it calls its Business Application Platform, which combines the API management capabilities of Apigee with the no-code application development platform of AppSheet, which Google acquired earlier this year.

As part of this process, Google is also launching a number of new features for both services today. The company is launching the beta of a new API Gateway, built on top of the open-source Envoy project, for example. This is a fully managed service that is meant to make it easier for developers to secure and manage their APIs across Google’s cloud computing services and serverless offerings like Cloud Functions and Cloud Run. The new gateway, which has been in alpha for a while now, offers all the standard features you’d expect, including authentication, key validation, and rate limiting.
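
From the API consumer’s side, calling a gateway-fronted endpoint is just an HTTP request with a key. Here is a minimal sketch; the gateway URL, path, and API key are hypothetical placeholders, and the exact authentication scheme depends on how the gateway is configured.

```python
import requests

# Hypothetical gateway host (*.gateway.dev) and API key. The gateway
# validates the key and applies rate limits before forwarding the request
# to the Cloud Functions or Cloud Run backend.
GATEWAY_URL = "https://my-gateway-abc123.uc.gateway.dev/v1/orders"

resp = requests.get(GATEWAY_URL, headers={"x-api-key": "YOUR_API_KEY"})
resp.raise_for_status()
print(resp.json())
```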

As for its low-code service AppSheet, the Google Cloud team is now making it easier to bring in data from third-party applications thanks to the general availability of Apigee as a data source for the service. AppSheet already supported standard sources like MySQL, Salesforce, and G Suite, but this new feature adds a lot of flexibility to the service.

With more data comes more complexity, so AppSheet is also launching new tools for automating processes inside the service today, thanks to the early access launch of AppSheet Automation. Like the rest of AppSheet, the promise here is that developers won’t have to write any code. Instead, AppSheet Automation provides a visual interface that, according to Google, “provides contextual suggestions based on natural language inputs.”

“We are confident the new category of business application platforms will help empower both technical and line of business developers with the core ability to create and extend applications, build and automate workflows, and connect and modernize applications,” Google notes in today’s announcement. And indeed, this looks like a smart way to combine the no-code environment of AppSheet with the power of Apigee.

#alpha, #api, #api-management, #apigee, #appsheet, #cloud, #cloud-applications, #cloud-computing, #computing, #developer, #enterprise, #envoy, #google, #google-cloud, #google-cloud-platform, #mysql, #salesforce, #serverless-computing, #tc

Google Cloud lets businesses create their own text-to-speech voices

Google launched a few updates to its Contact Center AI product today, but the most interesting one is probably the beta of its new Custom Voice service, which will let companies create their own text-to-speech voices to best represent their brands.

Maybe your company has a well-known spokesperson, for example, but it would be pretty arduous to have them record every sentence in an automated response system, or to bring them back to the studio whenever you launch a new product or procedure. With Custom Voice, businesses can bring their voice talent into the studio and have them record a script provided by Google. The company will then take those recordings and train its speech models based on them.

As of now, this seems to be a somewhat manual task on Google’s side. Training and evaluating the model will take “several weeks,” the company says, and Google itself will conduct its own tests of the trained model before sending it back to the business that commissioned it. After that, the business must follow Google’s own testing process to evaluate the results and sign off on them.
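
Once a model has been approved, using it should look much like a standard Cloud Text-to-Speech request. Here is a minimal sketch with the Python client; the custom voice name is a hypothetical placeholder, and the exact way a commissioned voice is referenced may differ in the final product.

```python
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()

response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(text="Thanks for calling. How can I help?"),
    voice=texttospeech.VoiceSelectionParams(
        language_code="en-US",
        name="acme-custom-brand-voice",  # hypothetical custom voice name
    ),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.MP3
    ),
)

with open("greeting.mp3", "wb") as out:
    out.write(response.audio_content)  # the synthesized audio bytes
```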

For now, these custom voices are still in beta and only American English is supported so far.

It’s also worth noting that Google’s review process is meant to ensure that the result is aligned with its internal AI Principles, which it released back in 2018.

Given the lengthy process involved in creating custom voices, I wouldn’t expect them to become mainstream for contact center solutions quickly. While custom voices will just be a gimmick for some brands (remember those custom voices for stand-alone GPS systems back in the day?), they will allow the more forward-thinking brands to distinguish their contact center experiences from those of the competition. Nobody likes calling customer support, but a more thoughtful experience that doesn’t make you feel like you’re talking to a random phone tree may at least help alleviate some of the stress.

#artificial-intelligence, #branding, #cloud, #contact-center, #developer, #enterprise, #google, #google-cloud, #tc, #text-to-speech

Decrypted: How a teenager hacked Twitter, Garmin’s ransomware aftermath

A 17-year-old Florida teenager is accused of perpetrating one of the year’s biggest and most high-profile hacks: Twitter.

A federal 30-count indictment filed in Tampa said Graham Ivan Clark used a phone spearphishing attack to pivot through multiple layers of Twitter’s security and bypassed its two-factor authentication to gain access to an internal “admin” tool that let the hacker take over any account. With two accomplices named in a separate federal indictment, Clark — who went by the online handle “Kirk” — allegedly used the tool to hijack the accounts of dozens of celebrities and public figures, including Bill Gates, Elon Musk and former president Barack Obama, to post a cryptocurrency scam netting over $100,000 in bitcoin in just a few hours.

It was, by all accounts, a sophisticated attack that required technical skills and an ability to trick and deceive to pull off the scam. Some security professionals were impressed, comparing the attack to one that had the finesse and professionalism of a well-resourced nation-state attacker.

But a profile in The New York Times describes Clark as an “adept scammer with an explosive temper.”

In the teenager’s defense, the attack could have been much worse. Instead of pushing a scam that promised to “double your money,” Clark and his compatriots could have wreaked havoc. In 2013, hackers hijacked the Associated Press’ Twitter account and tweeted a false report of an attack on the White House, sending the markets plummeting — only to quickly recover after the all-clear was given.

But with control of some of the world’s most popular Twitter accounts, Clark was for a few hours in July one of the most powerful people in the world. If found guilty, the teenager could spend his better years behind bars.

Here’s more from the past week.


THE BIG PICTURE

Garmin hobbles back after ransomware attack, but questions remain

#accel, #amazon-web-services, #cloud-computing, #computer-security, #crime, #cybercrime, #data-breach, #decrypted, #elon-musk, #extra-crunch, #garmin, #google-cloud, #growth-marketing, #law-enforcement, #market-analysis, #phishing, #privacy, #ransomware, #security, #security-breaches, #series-a, #social, #startups, #tc, #twitter, #u-s-treasury, #venture-capital

Google signs up Verizon for its AI-powered contact center services

Google today announced that it has signed up Verizon as the newest customer of its Google Cloud Contact Center AI service, which aims to bring natural language recognition to the often inscrutable phone menus that many companies still use today (disclaimer: TechCrunch is part of the Verizon Media Group). For Google, that’s a major win, but it’s also a chance for the Google Cloud team to highlight some of the work it has done in this area. It’s also worth noting that the Contact Center AI product is a good example of Google Cloud’s strategy of packaging up many of its disparate technologies into products that solve specific problems.

“A big part of our approach is that machine learning has enormous power but it’s hard for people,” Google Cloud CEO Thomas Kurian told me in an interview ahead of today’s announcement. “Instead of telling people, ‘well, here’s our natural language processing tools, here is speech recognition, here is text-to-speech and speech-to-text — and why don’t you just write a big neural network of your own to process all that?’ Very few companies can do that well. We thought that we can take the collection of these things and bring that as a solution to people to solve a business problem. And it’s much easier for them when we do that and […] that it’s a big part of our strategy to take our expertise in machine intelligence and artificial intelligence and build domain-specific solutions for a number of customers.”

The company first announced Contact Center AI at its Cloud Next conference two years ago, and it became generally available last November. The promise here is that it will allow businesses to build smarter contact center solutions that rely on speech recognition to provide customers with personalized support, while allowing human agents to focus on more complex issues. A lot of this is driven by Google Cloud’s Dialogflow tool for building conversational experiences across multiple channels.
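
For a sense of what building on Dialogflow looks like, here is a minimal sketch of a text query using the Python client (recent versions of google-cloud-dialogflow); the project ID, session ID, and utterance are placeholders.

```python
from google.cloud import dialogflow

def detect_intent(project_id: str, session_id: str, text: str) -> str:
    """Send one user utterance to a Dialogflow agent and return its reply."""
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result.fulfillment_text

# Placeholders for illustration only.
print(detect_intent("my-gcp-project", "caller-123", "I want to check my data plan"))
```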

“Our view is that AI technology has reached a stage of maturity where it can be meaningfully applied to solving business problems that customers face,” he said. “One of the most important things that companies need is to differentiate the customer experience through helpful and convenient service — and it has never been more important, especially during the period we’re all in.”

Not too long ago, bots — and especially text-based bots — went through the trough of disillusionment, but Kurian argues that we’ve reached a very different stage now and that these tools can now provide real business value. What’s different now is that a tool like Contact Center AI has more advanced natural language processing capabilities and is able to handle multiple questions at the same time and maintain the context of the conversation.

“The first generation of something called chatbots — they kind of did something but they didn’t really do much because they thought that all questions can be answered with one sentence and that human beings don’t have a conversation,” he noted, adding that Google’s tools are able to automatically create dialogs using a company’s existing database of voice calls and chats that have happened in the past.

When necessary, Contact Center AI can automatically hand the call off to a human agent when it isn’t able to solve a problem, but another interesting feature is its ability to essentially shadow the human agent and automatically provide real-time assistance.

“We have a capability called Agent Assist, where the technology is assisting the agent and that’s the central premise that we built — not to replace the agent but assist the agent.”

Because of the COVID-19 pandemic, more companies are now accelerating their digital transformation projects. Kurian said that this is also true for companies that want to modernize their contact centers, given that for many businesses, this has now become their main way to interact with their customers.

As for Verizon, Kurian noted that this was a very large project that has to handle very high call volumes and a large variety of incoming questions.

“We have worked with Verizon for many, many years in different contexts as Alphabet and so we’ve known the customer for a long time,” said Kurian. “They have started using our cloud. They also experimented with other technologies and so we sort of went in three phases. Phase One is to get a discussion with the customer around the use of our technology for chat, then the focus is on saying you shouldn’t just do chat, you should do chat and voice on a common platform to avoid the kind of thing where you get one response online and a different response when you call. And then we’ve had our engineers working with them — virtually obviously, not physically.”

He noted that Google has seen quite a bit of success with Contact Center AI in the telco space, but also among government agencies, for example, especially in Europe and Asia. In some verticals like retail, he noted, Google Cloud’s customers are mostly focused on chat, while the company is seeing more voice usage among banks, for example. In the telco business, Google sees both across its customers, so it probably made sense for Verizon to bet on both voice and chat with its implementation.

“Verizon’s commitment to innovation extends to all aspects of the customer experience,” said Verizon global CIO and SVP Shankar Arumugavelu in today’s announcement. “These customer service enhancements, powered by the Verizon collaboration with Google Cloud, offer a faster and more personalized digital experience for our customers while empowering our customer support agents to provide a higher level of service.”

#articles, #artificial-intelligence, #asia, #ceo, #cloud-computing, #dialogflow, #europe, #google, #google-cloud, #machine-learning, #natural-language-processing, #neural-network, #speech-recognition, #tc, #techcrunch, #technology, #text-to-speech, #thomas-kurian, #verizon-media-group

Google reportedly cancelled a cloud project meant for countries including China

Updated with Google’s full statement

After reportedly spending a year and a half working on a cloud service meant for China and other countries, Google cancelled the project, called “Isolated Region,” in May, due partly to geopolitical and pandemic-related concerns. Bloomberg reports that Isolated Region would have enabled the company to offer cloud services in countries that want to keep and control data within their borders.

According to two Google employees who spoke to Bloomberg, the project was part of a larger initiative called “Sharded Google” to create data and processing infrastructure that is completely separate from the rest of the company’s network. Isolated Region began in early 2018 in response to Chinese regulations that require foreign tech companies entering the country to form a joint venture with a local company that would hold control over user data. Isolated Region was meant to help meet requirements like this in China and other countries, while also addressing U.S. national security concerns.

Bloomberg’s sources said the project was paused in China in January 2019, and focus was redirected to Europe, the Middle East and Africa instead, before Isolated Region was ultimately cancelled in May, though Google has since considered offering a smaller version of Google Cloud Platform in China.

After the story was first published, a Google representative told Bloomberg that Isolated Region wasn’t shut down because of geopolitical issues or the pandemic, and that the company “does not offer and has not offered cloud platform services inside China.”

Instead, she said Isolated Region was cancelled because “other approaches we were actively pursuing offered better outcomes. We have a comprehensive approach to addressing these requirements that covers the governance of data, operational practices and survivability of software. Isolated Region was just one of the paths we explored to address these requirements.”

Alphabet, Google’s parent company, broke out Google Cloud as its own line item for the first time in its fourth-quarter and full-year earnings report, released in February. It revealed that its run rate grew 53.6% during the last year to just over $10 billion in 2019, making it a more formidable rival to competitors Amazon and Microsoft.

Update: In a media statement, Google said:

“We’ve seen emerging requirements around adoption of cloud technology from customers and regulatory bodies in many different parts of the world. We have a comprehensive approach to addressing these requirements that covers the governance of data, operational practices, and survivability of software. Isolated Region was just one of the paths we explored to address these requirements. What we learned from customer conversations and input from government stakeholders in Europe and elsewhere is that other approaches we were also actively pursuing offered better outcomes. Isolated Region was not shut down over geopolitical concerns or the pandemic. Google does not offer and has not offered cloud platform services inside China, and Google Cloud is not weighing options to offer the Google Cloud Platform in China.”

#alphabet, #china, #cloud-computing, #europe, #google, #google-cloud, #isolated-region, #security, #sharded-google, #tc

Nvidia’s Ampere GPUs come to Google Cloud

Nvidia today announced that its new Ampere-based data center GPUs, the A100 Tensor Core GPUs, are now available in alpha on Google Cloud. As the name implies, these GPUs were designed for AI workloads, as well as data analytics and high-performance computing solutions.

The A100 promises a significant performance improvement over previous generations. Nvidia says the A100 can boost training and inference performance by over 20x compared to its predecessors (though you’ll see closer to 6x or 7x improvements in most benchmarks) and tops out at about 19.5 TFLOPs of single-precision performance and 156 TFLOPs for Tensor Float 32 (TF32) workloads.

Image Credits: Nvidia
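
For readers who want to sanity-check claims like these on their own instances, here is a minimal benchmark sketch in PyTorch comparing matmul throughput with TF32 off and on; it assumes an A2 VM with an attached A100 and a CUDA-enabled PyTorch build that exposes the TF32 switch.

```python
import time
import torch

def matmul_tflops(n: int = 8192, iters: int = 10) -> float:
    """Time square matrix multiplies on the GPU and return achieved TFLOPs."""
    a = torch.randn(n, n, device="cuda")
    b = torch.randn(n, n, device="cuda")
    _ = a @ b  # warm-up
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        _ = a @ b
    torch.cuda.synchronize()
    elapsed = (time.time() - start) / iters
    return 2 * n**3 / elapsed / 1e12  # one matmul costs ~2*n^3 FLOPs

torch.backends.cuda.matmul.allow_tf32 = False
print(f"FP32 matmul: {matmul_tflops():.1f} TFLOPs")

torch.backends.cuda.matmul.allow_tf32 = True  # route through TF32 tensor cores
print(f"TF32 matmul: {matmul_tflops():.1f} TFLOPs")
```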

“Google Cloud customers often look to us to provide the latest hardware and software services to help them drive innovation on AI and scientific computing workloads,” said Manish Sainani, Director of Product Management at Google Cloud, in today’s announcement. “With our new A2 VM family, we are proud to be the first major cloud provider to market Nvidia A100 GPUs, just as we were with Nvidia’s T4 GPUs. We are excited to see what our customers will do with these new capabilities.”

Google Cloud users can get access to instances with up to 16 of these A100 GPUs, for a total of 640GB of GPU memory and 1.3TB of system memory.

#ampere, #artificial-intelligence, #cloud, #computing, #developer, #enterprise, #google-cloud, #nvidia, #tc

Google Cloud launches Filestore High Scale, a new storage tier for high-performance computing workloads

Google Cloud today announced the launch of Filestore High Scale, a new storage option — and tier of Google’s existing Filestore service — for workloads that can benefit from access to a distributed high-performance storage option.

With Filestore High Scale, which is based on technology Google acquired when it bought Elastifile in 2019, users can deploy shared file systems with hundreds of thousands of IOPS, tens of GB/s of throughput, and hundreds of TBs of capacity.
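
Because Filestore exports a standard NFS share, attaching a client VM is straightforward; the sketch below, with a hypothetical instance IP and share name, mounts the share and writes to it from Python.

```python
import subprocess
from pathlib import Path

# Hypothetical Filestore instance IP and share name; the share mounts like
# any other NFS file system (run once per client VM, requires root).
subprocess.run(
    ["sudo", "mount", "-t", "nfs", "10.0.0.2:/share1", "/mnt/filestore"],
    check=True,
)

# Once mounted, thousands of clients can share the same namespace.
results = Path("/mnt/filestore/results")
results.mkdir(exist_ok=True)
(results / "run-001.txt").write_text("screening batch complete\n")
```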

“Virtual screening allows us to computationally screen billions of small molecules against a target protein in order to discover potential treatments and therapies much faster than traditional experimental testing methods,” says Christoph Gorgulla, a postdoctoral research fellow at Harvard Medical School’s Wagner Lab, which has already put the new service through its paces. “As researchers, we hardly have the time to invest in learning how to set up and manage a needlessly complicated file system cluster, or to constantly monitor the health of our storage system. We needed a file system that could handle the load generated concurrently by thousands of clients, which have hundreds of thousands of vCPUs.”

The standard Google Cloud Filestore service already supports some of these use cases, but the company notes that it specifically built Filestore High Scale for high-performance computing (HPC) workloads. In today’s announcement, the company specifically focuses on biotech use cases around COVID-19. Filestore High Scale is meant to support tens of thousands of concurrent clients, which isn’t necessarily a standard use case, but developers who need this kind of power can now get it in Google Cloud.

In addition to High Scale, Google also today announced that all Filestore tiers now offer beta support for NFS IP-based access controls, an important new feature for those companies that have advanced security requirements on top of their need for a high-performance, fully managed file storage service.

#cloud, #cloud-computing, #cloud-infrastructure, #computing, #enterprise, #file-storage, #google, #google-cloud

Google makes it easier to migrate VMware environments to its cloud

Google Cloud today announced the next step in its partnership with VMware: the Google Cloud VMware Engine. This fully managed service provides businesses with a full VMware Cloud Foundation stack on Google Cloud to help them easily migrate their existing VMware-based environments to Google’s infrastructure. Cloud Foundation is VMware’s stack for hybrid and private cloud deployments.

Given Google Cloud’s focus on enterprise customers, it’s no surprise that the company continues to bet on partnerships with the likes of VMware to attract more of these companies’ workloads. Less than a year ago, Google announced that VMware Cloud Foundation would come to Google Cloud and that it would start supporting VMware workloads. Then, last November, Google Cloud acquired CloudSimple, a company that specialized in running VMware environments and that Google had already partnered with for its original VMware deployments. The company describes today’s announcement as the third step in this journey.

VMware Engine provides users with all of the standard Cloud Foundation components: vSphere, vCenter, vSAN, NSX-T and HCX. With this, Google Cloud General Manager June Yang notes in today’s announcement, businesses can quickly stand up their own software-defined data center in the Google Cloud.

“Google Cloud VMware Engine is designed to minimize your operational burden, so you can focus on your business,” she notes. “We take care of the lifecycle of the VMware software stack and manage all related infrastructure and upgrades. Customers can continue to leverage IT management tools and third-party services consistent with their on-premises environment.”

Google is also working with third-party providers like NetApp, Veeam, Zerto, Cohesity and Dell Technologies to ensure that their solutions work on Google’s platform, too.

“As customers look to simplify their cloud migration journey, we’re committed to building cloud services to help customers benefit from the increased agility and efficiency of running VMware workloads on Google Cloud,” said Bob Black, Dell Technologies Global Lead Alliance Principal at Deloitte Consulting. “By combining Google Cloud’s technology and Deloitte’s business transformation experience, we can enable our joint customers to accelerate their cloud migration, unify operations, and benefit from innovative Google Cloud services as they look to modernize applications.”

#cloud, #cloudsimple, #computing, #enterprise, #google, #google-cloud, #vmware