Microsoft brings more of its Azure services to any Kubernetes cluster

At its Build developer conference today, Microsoft announced a new set of Azure services (in preview) that businesses can now run on virtually any CNCF-conformant Kubernetes cluster with the help of its Azure Arc multi-cloud service.

Azure Arc, similar to tools like Google’s Anthos or AWS’s upcoming EKS Anywhere, gives businesses a single tool to manage their container clusters across clouds and on-premises data centers. Since its launch in late 2019, Arc has also enabled some core Azure services to run directly in these clusters, though the early focus was on a small set of data services, with machine learning tools added later. With today’s update, the company is greatly expanding the set of containerized Azure services that work with Arc.

These new services include Azure App Service for building and managing web apps and APIs, Azure Functions for event-driven programming, Azure Logic Apps for building automated workflows, Azure Event Grid for event routing, and Azure API Management for… you guessed it… managing internal and external APIs.

“The app services are now Azure Arc-enabled, which means customers can deploy Web Apps, Functions, API gateways, Logic Apps and Event Grid services on pre-provisioned Kubernetes clusters,” Microsoft explained in its annual “Book of News” for this year’s Build. “This takes advantage of features including deployment slots for A/B testing, storage queue triggers and out-of-box connectors from the app services, regardless of run location. With these portable turnkey services, customers can save time building apps, then manage them consistently across hybrid and multicloud environments using Azure Arc.”
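
To make the event-driven model concrete, here is a minimal sketch of an Azure Function with a storage queue trigger, written against the Python programming model. The queue name, payload shape and function name are assumptions for illustration, and a real deployment also needs the usual function.json binding plus an Arc-enabled App Service plan behind it.

```python
import json
import logging

import azure.functions as func


def main(msg: func.QueueMessage) -> None:
    # Runs whenever a message lands on the bound storage queue,
    # whether the function is hosted in Azure or on an Arc-connected cluster.
    payload = json.loads(msg.get_body().decode("utf-8"))
    logging.info("Processing order %s", payload.get("orderId"))
```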

Microsoft Visual Studio 2022 will (finally) enter the 64-bit world

Earlier today, Microsoft offered us a peek at Visual Studio 2022, which will offer its first public preview builds later this summer. If you’re into the Visual Studio ecosystem, this looks like a killer upgrade.

Visual Studio enters the 64-bit world… finally

With Visual Studio 2022, you’ll finally be able to take advantage of all of your system RAM. Earlier versions of Visual Studio are 32-bit applications, which caps the main process at roughly 2GiB of usable memory no matter how much RAM the machine actually has.

The new VS 2022 is fully 64-bit—which is what makes it possible, in the demo Microsoft showed, to open a whopping 1,600 projects and roughly 300,000 files at once.

Microsoft outage knocks sites and services offline

Microsoft is experiencing a major outage, so that’s why you can’t get any work done.

Besides its homepage, Microsoft services are down, log-in pages aren’t loading, and even the company’s status pages are kaput. Worse, Microsoft’s cloud service Azure appears to be offline as well, causing outages for any sites and services that rely on it.

It’s looking like a networking issue, according to the status page — when it loaded. Microsoft also tweeted that it was related to DNS, the internet system that translates web addresses to computer-readable internet numbers. It’s an important function of how the internet works, so not ideal when it suddenly breaks.

We’ve reached out for comment, and we’ll follow up when we know more.

Microsoft launches Azure Percept, its new hardware and software platform to bring AI to the edge

Microsoft today announced Azure Percept, its new hardware and software platform for bringing more of its Azure AI services to the edge. Percept combines Microsoft’s Azure cloud tools for managing devices and creating AI models with hardware from Microsoft’s device partners. The general idea is to make it far easier for all kinds of businesses to build and deploy AI for tasks like object detection, anomaly detection, shelf analytics and keyword spotting at the edge by giving them an end-to-end solution that takes them from building AI models to deploying them on compatible hardware.

To kickstart this, Microsoft also launched a hardware development kit today with an intelligent camera for vision use cases (dubbed Azure Percept Vision). The kit features hardware-enabled AI modules for running models at the edge, but it can also be connected to the cloud. Users will also be able to trial their proofs-of-concept in the real world because the development kit conforms to the widely used 80/20 T-slot framing architecture.

In addition to Percept Vision, Microsoft is also launching Azure Percept Audio for audio-centric use cases.

Azure Percept devices, including Trust Platform Module, Azure Percept Vision and Azure Percept Audio

“We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” said Roanne Sones, the corporate vice president of Microsoft’s edge and platform group. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”

Percept customers will have access to Azure’s Cognitive Services and machine learning models, and Percept devices will automatically connect to Azure’s IoT Hub.
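
That connection uses standard Azure IoT plumbing under the hood; Percept handles it automatically, but a plain device would do something like the following hedged sketch with the Azure IoT device SDK for Python (the connection string and payload are placeholders).

```python
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string for a device registered in an IoT Hub.
CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Send one telemetry message, e.g. the output of a local vision model.
client.send_message(Message('{"label": "person", "confidence": 0.93}'))
client.disconnect()
```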

Microsoft says it is working with silicon and equipment manufacturers to build an ecosystem of “intelligent edge devices that are certified to run on the Azure Percept platform.” Over the course of the next few months, Microsoft plans to certify third-party devices for inclusion in this program, which will ideally allow its customers to take their proofs-of-concept and easily deploy them to any certified devices.

“Anybody who builds a prototype using one of our development kits, if they buy a certified device, they don’t have to do any additional work,” said Christa St. Pierre, a product manager in Microsoft’s Azure edge and platform group.

St. Pierre also noted that all of the components of the platform will have to conform to Microsoft’s responsible AI principles — and go through extensive security testing.

Microsoft’s Azure Arc multi-cloud platform now supports machine learning workloads

With Azure Arc, Microsoft offers a service that allows its customers to run Azure in any Kubernetes environment, no matter where that container cluster is hosted. From Day One, Arc supported a wide range of use cases, but one feature that was sorely missing when it first launched was support for machine learning (ML). But one of the advantages of a tool like Arc is that it allows enterprises to run their workloads close to their data and today, that often means using that data to train ML models.

At its Ignite conference, Microsoft today announced that it is bringing exactly this capability to Azure Arc with the addition of Azure Machine Learning to the set of Arc-enabled data services.

“By extending machine learning capabilities to hybrid and multicloud environments, customers can run training models where the data lives while leveraging existing infrastructure investments. This reduces data movement and network latency, while meeting security and compliance requirements,” Azure GM Arpan Shah writes in today’s announcement.

This new capability is now available to Arc customers.

In addition to bringing this new machine learning capability to Arc, Microsoft also today announced that Azure Arc enabled Kubernetes, which allows users to deploy standard Kubernetes configurations to their clusters anywhere, is now generally available.

Also new in this world of hybrid Azure services is support for Azure Kubernetes Service on Azure Stack HCI. That’s a mouthful, but Azure Stack HCI is Microsoft’s platform for running Azure on a set of standardized, hyperconverged hardware inside a customer’s data center. The idea pre-dates Azure Arc, but it remains a plausible alternative for enterprises that want to run Azure in their own data centers, and it continues to enjoy support from vendors like Dell, Lenovo, HPE, Fujitsu and DataOn.

On the open-source side of Arc, Microsoft also today stressed that Arc is built to work with any Kubernetes distribution that conforms to the Cloud Native Computing Foundation (CNCF) standard, and that it has worked with Red Hat, Canonical, Rancher and now Nutanix to test and validate their Kubernetes implementations on Azure Arc.

Microsoft Azure expands its NoSQL portfolio with Managed Instances for Apache Cassandra

At its Ignite conference today, Microsoft announced the launch of Azure Managed Instance for Apache Cassandra, its latest NoSQL database offering and a competitor to Cassandra-centric companies like Datastax. Microsoft describes the new service as a “semi-managed” offering that will help companies bring more of their Cassandra-based workloads into its cloud.

“Customers can easily take on-prem Cassandra workloads and add limitless cloud scale while maintaining full compatibility with the latest version of Apache Cassandra,” Microsoft explains in its press materials. “Their deployments gain improved performance and availability, while benefiting from Azure’s security and compliance capabilities.”

Like its counterpart, Azure SQL Managed Instance, the idea here is to give users access to a scalable, cloud-based database service. To use Cassandra in Azure before, businesses had to either move to Cosmos DB, its highly scalable database service that supports the Cassandra, MongoDB, SQL and Gremlin APIs, or manage their own fleet of virtual machines or on-premises infrastructure.
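
Because the service maintains full compatibility with open-source Apache Cassandra, existing client code should keep working. As a minimal sketch, connecting with the standard open-source Python driver looks like this (contact point, credentials, keyspace and table are placeholders):

```python
from cassandra.auth import PlainTextAuthProvider
from cassandra.cluster import Cluster

# Placeholder contact point and credentials for the managed cluster.
auth = PlainTextAuthProvider(username="cassandra", password="<password>")
cluster = Cluster(["<contact-point>"], auth_provider=auth)
session = cluster.connect("my_keyspace")

for row in session.execute("SELECT user_id, name FROM users LIMIT 10"):
    print(row.user_id, row.name)
```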

Cassandra was originally developed at Facebook and then open-sourced in 2008. A year later, it joined the Apache Foundation, and today it’s used widely across the industry, with companies like Apple and Netflix betting on it for some of their core services. AWS launched a managed Cassandra-compatible service at its re:Invent conference in 2019 (it’s called Amazon Keyspaces today), while Microsoft only launched the Cassandra API for Cosmos DB last November. With today’s announcement, though, the company can now offer a full range of Cassandra-based services for enterprises that want to move these workloads to its cloud.

Microsoft’s Dapr open-source project to help developers build cloud-native apps hits 1.0

Dapr, the Microsoft-incubated open-source project that aims to make it easier for developers to build event-driven, distributed cloud-native applications, hit its 1.0 milestone today, signifying the project’s readiness for production use cases. Microsoft launched the Distributed Application Runtime (that’s what “Dapr” stands for) back in October 2019. Since then, the project has released 14 updates and the community has launched integrations with virtually all major cloud providers, including Azure, AWS, Alibaba and Google Cloud.

The goal for Dapr, Microsoft Azure CTO Mark Russinovich told me, was to democratize cloud-native development for enterprise developers.

“When we go look at what enterprise developers are being asked to do — they’ve traditionally been doing client, server, web plus database-type applications,” he noted. “But now, we’re asking them to containerize and to create microservices that scale out and have no-downtime updates — and they’ve got to integrate with all these cloud services. And many enterprises are, on top of that, asking them to make apps that are portable across on-premises environments as well as cloud environments or even be able to move between clouds. So just tons of complexity has been thrown at them that’s not specific to or not relevant to the business problems they’re trying to solve.”

And a lot of the development involves re-inventing the wheel to make their applications reliably talk to various other services. The idea behind Dapr is to give developers a single runtime that, out of the box, provides the tools that developers need to build event-driven microservices. Among other things, Dapr provides various building blocks for things like service-to-service communications, state management, pub/sub and secrets management.
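
As a rough sketch of what those building blocks look like in practice, here is a hypothetical snippet using the Dapr Python SDK against a locally running sidecar; the component names “statestore” and “pubsub” are the defaults from Dapr’s samples and would in reality be whatever components an operator has configured.

```python
from dapr.clients import DaprClient

with DaprClient() as client:
    # State management: persist a value in whatever state store backs the component.
    client.save_state(store_name="statestore", key="order-42", value='{"status": "paid"}')

    # Read it back.
    item = client.get_state(store_name="statestore", key="order-42")
    print(item.data)

    # Pub/sub: publish an event to the broker behind the "pubsub" component.
    client.publish_event(pubsub_name="pubsub", topic_name="orders", data='{"orderId": 42}')
```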

Image Credits: Dapr

“The goal with Dapr was: let’s take care of all of the mundane work of writing one of these cloud-native distributed, highly available, scalable, secure cloud services, away from the developers so they can focus on their code. And actually, we took lessons from serverless, from Functions-as-a-Service where with, for example Azure Functions, it’s event-driven, they focus on their business logic and then things like the bindings that come with Azure Functions take care of connecting with other services,” Russinovich said.

He also noted that another goal here was to do away with language-specific models and to create a programming model that can be leveraged from any language. Enterprises, after all, tend to use multiple languages in their existing code, and a lot of them are now looking at how to best modernize their existing applications — without throwing out all of their current code.

As Russinovich noted, the project now has more than 700 contributors outside of Microsoft (though the core committers are largely from Microsoft) and a number of businesses started using it in production before the 1.0 release. One of the larger cloud providers that is already using it is Alibaba. “Alibaba Cloud has really fallen in love with Dapr and is leveraging it heavily,” he said. Other organizations that have contributed to Dapr include HashiCorp and early users like ZEISS, Ignition Group and New Relic.

And while it may seem a bit odd for a cloud provider to be happy that its competitors are using its innovations already, Russinovich noted that this was exactly the plan and that the team hopes to bring Dapr into a foundation soon.

“We’ve been on a path to open governance for several months and the goal is to get this into a foundation. […] The goal is opening this up. It’s not a Microsoft thing. It’s an industry thing,” he said — but he wasn’t quite ready to say to which foundation the team is talking.

 

Databricks brings its lakehouse to Google Cloud

Databricks and Google Cloud today announced a new partnership that will give Databricks customers a deep integration with Google’s BigQuery platform and Google Kubernetes Engine. This will allow Databricks’ users to bring their data lakes and the service’s analytics capabilities to Google Cloud.

Databricks already features a deep integration with Microsoft Azure — one that goes well beyond this new partnership with Google Cloud — and the company is also an AWS partner. By adding Google Cloud to this list, the company can now claim to be the “only unified data platform available across all three clouds (Google, AWS and Azure).”
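
In practical terms, the BigQuery integration means a Databricks notebook on Google Cloud can read BigQuery tables straight into Spark. A minimal sketch, assuming the spark-bigquery connector that Databricks bundles on GCP and a placeholder table name:

```python
# Inside a Databricks notebook, where `spark` is the preconfigured SparkSession.
df = (spark.read.format("bigquery")
      .option("table", "my-project.sales.orders")   # placeholder project.dataset.table
      .load())

df.groupBy("country").count().show()
```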

It’s worth stressing, though, that Databricks’ Azure integration is a bit of a different deal from this new partnership with Google Cloud. “Azure Databricks is a first-party Microsoft Azure service that is sold and supported directly by Microsoft. The first-party service is unique to our Microsoft partnership. Customers on Google Cloud will purchase directly from Databricks through the Google Cloud Marketplace,” a company spokesperson told me. That makes it a bit more of a run-of-the-mill partnership compared to the Microsoft deal, but that doesn’t mean the two companies aren’t just as excited about it.

“We’re delighted to deliver Databricks’ lakehouse for AI and ML-driven analytics on Google Cloud,” said Google Cloud CEO Thomas Kurian (or, more likely, one of the company’s many PR specialists who likely wrote and re-wrote this for him a few times before it got approved). “By combining Databricks’ capabilities in data engineering and analytics with Google Cloud’s global, secure network—and our expertise in analytics and delivering containerized applications—we can help companies transform their businesses through the power of data.”

Similarly, Databricks CEO Ali Ghodsi noted that he is “thrilled to partner with Google Cloud and deliver on our shared vision of a simplified, open, and unified data platform that supports all analytics and AI use-cases that will empower our customers to innovate even faster.”

And indeed, this is clearly a thrilling delight for everybody around, including customers like Conde Nast, whose Director of Data Engineering Nana Essuman is “excited to see leaders like Google Cloud and Databricks come together to streamline and simplify getting value from data.”

If you’re also thrilled about this, you’ll be able to hear more about it from both Ghodsi and Kurian at an event on April 6 that is apparently hosted by TechCrunch (though this is the first I’ve heard of it, too).

Microsoft earnings: Xbox hardware sales shot up 86% with Series X/S

The Xbox Series X, which launched in November.

Microsoft delivered its earnings report for Q2 2021 yesterday, and the company has continued its sprint of very strong quarters, again driven primarily by Azure and the cloud. But that same old story isn’t the only one here: the report also tells us a thing or two about the new Xbox’s performance, as well as Windows and Office.

Overall, Microsoft beat analyst expectations. The company’s top-level revenue grew 17 percent year over year, reaching $43.08 billion. Analysts had expected $40.18 billion. $14.6 billion of that was from the business segment Microsoft calls “Intelligent Cloud,” which most notably includes Azure but also some other professional services like GitHub.

Cloud wasn’t the only positive story, though. The More Personal Computing segment—which includes Windows, Xbox and Surface—grew 15 percent compared to the previous year to just over $15 billion. That included an 86 percent increase in Xbox hardware sales, as well as a 40 percent increase in Xbox content and services—the former of those includes the launch of the Xbox Series X/S consoles in November, and the latter includes Game Pass, which Microsoft has been pushing hard as a core value proposition for the Xbox game platform.

With a $50B run rate in reach, can anyone stop AWS?

AWS, Amazon’s flourishing cloud arm, has been growing at a rapid clip for more than a decade. An early public cloud infrastructure vendor, it has taken advantage of first-to-market status to become the most successful player in the space. In fact, one could argue that many of today’s startups wouldn’t have gotten off the ground without the formation of cloud companies like AWS giving them easy access to infrastructure without having to build it themselves.

In Amazon’s most-recent earnings report, AWS generated revenues of $11.6 billion, good for a run rate of more than $46 billion. That makes the next AWS milestone a run rate of $50 billion, something that could be in reach in less than two quarters if it continues its pace of revenue growth.

The good news for competing companies is that in spite of the market size and relative maturity, there is still plenty of room to grow.

The cloud division’s growth is slowing in percentage terms as it comes firmly up against the law of large numbers: AWS has to grow every quarter against an ever-larger revenue base. The result of this dynamic is that while AWS’ year-over-year growth rate is slowing over time — from 35% in Q3 2019 to 29% in Q3 2020 — the pace at which it is adding $10 billion chunks of annual revenue run rate is accelerating.
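
The arithmetic behind those figures is simple enough to sketch; the snippet below annualizes the quarterly number and extrapolates the reported ~29% year-over-year growth rate. This is purely illustrative back-of-the-envelope math, not a forecast.

```python
quarterly_revenue = 11.6e9            # AWS revenue last quarter, in dollars
run_rate = quarterly_revenue * 4      # annualized run rate, roughly $46.4 billion
print(f"run rate: ${run_rate / 1e9:.1f}B")

# At ~29% year-over-year growth, quarterly revenue compounds at roughly
# 1.29 ** 0.25 (about 1.066) per quarter, so count the quarters until the
# annualized run rate crosses $50 billion.
quarters = 0
while quarterly_revenue * 4 < 50e9:
    quarterly_revenue *= 1.29 ** 0.25
    quarters += 1
print(f"quarters until a $50B run rate: {quarters}")  # roughly two
```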

At the AWS re:Invent customer conference this year, AWS CEO Andy Jassy talked about the pace of change over the years, saying that it took the following number of months to grow its run rate by $10 billion increments:

123 months ($0 to $10 billion)
23 months ($10 billion to $20 billion)
13 months ($20 billion to $30 billion)
12 months ($30 billion to $40 billion)

Image Credits: TechCrunch (data from AWS)

Extrapolating from the above trend, it should take AWS fewer than 12 months to scale from a run rate of $40 billion to $50 billion. Stating the obvious, Jassy said “the rate of growth in AWS continues to accelerate.” He also took the time to point out that AWS is now the fifth-largest enterprise IT company in the world, ahead of enterprise stalwarts like SAP and Oracle.

What’s amazing is how quickly AWS reached this scale — the service didn’t even exist until 2006. That growth rate makes us ask a question: Can anyone hope to stop AWS’ momentum?

The short answer is that it doesn’t appear likely.

Cloud market landscape

A good place to start is surveying the cloud infrastructure competitive landscape to see if there are any cloud companies that could catch the market leader. According to Synergy Research, AWS remains firmly in front, and it doesn’t look like any competitor could catch AWS anytime soon unless some market dynamic caused a drastic change.

Synergy Research Cloud marketshare leaders. Amazon is first, Microsoft is second and Google is third.

Image Credits: Synergy Research

With around a third of the market, AWS is the clear front-runner. Its closest and fiercest rival, Microsoft, has around 20%. To put that into perspective a bit, last quarter AWS had $11.6 billion in revenue compared to Microsoft’s $5.2 billion Azure result. Microsoft’s equivalent cloud number is growing faster, at 47%, but like AWS’ growth rate, that figure has begun to drop steadily as Azure gains market share and revenue and falls victim to the same law of large numbers.

Microsoft announces its first Azure data center region in Denmark

Microsoft continues to expand its global Azure data center presence at a rapid clip. After announcing new regions in Austria and Taiwan in October, the company today revealed its plans to launch a new region in Denmark.

As with many of Microsoft’s recent announcements, the company is also attaching a commitment to provide digital skills to 200,000 people in the country (in this case, by 2024).

“With this investment, we’re taking the next step in our longstanding commitment to provide Danish society and businesses with the digital tools, skills and infrastructure needed to drive sustainable growth, innovation, and job creation. We’re investing in Denmark’s digital leap into the future – all in a way that supports the country’s ambitious climate goals and economic recovery,” said Nana Bule, General Manager, Microsoft Denmark.

Azure regions

Image Credits: Microsoft

The new data center, which will be powered by 100% renewable energy and feature multiple availability zones, will support what has now become the standard set of Microsoft cloud products: Azure, Microsoft 365, Dynamics 365 and Power Platform.

As usual, the idea here is to provide low-latency access to Microsoft’s tools and services. It has long been Microsoft’s strategy to blanket the globe with local data centers. Europe is a prime example of this, with regions (both operational and announced) in about a dozen countries already. In the U.S., Azure currently offers 13 regions (including three exclusively for government agencies), with a new region on the West Coast coming soon.

“This is a proud day for Microsoft in Denmark,” said Brad Smith, President, Microsoft. “Building a hyper-scale datacenter in Denmark means we’ll store Danish data in Denmark, make computing more accessible at even faster speeds, secure data with our world-class security, protect data with Danish privacy laws, and do more to provide to the people of Denmark our best digital skills training. This investment reflects our deep appreciation of Denmark’s green and digital leadership globally and our commitment to its future.”

Microsoft announces its first Azure data center region in Taiwan

After announcing its latest data center region in Austria earlier this month and an expansion of its footprint in Brazil, Microsoft today unveiled its plans to open a new region in Taiwan. This new region will augment its existing presence in East Asia, where the company already runs data centers in China (operated by 21Vianet), Hong Kong, Japan and Korea. This new region will bring Microsoft’s total presence around the world to 66 cloud regions.

Similar to its recent expansion in Brazil, Microsoft also pledged to provide digital skilling for over 200,000 people in Taiwan by 2024 and it is growing its Taiwan Azure Hardware Systems and Infrastructure engineering group, too. That’s in addition to investments in its IoT and AI research efforts in Taiwan and the startup accelerator it runs there.

“Our new investment in Taiwan reflects our faith in its strong heritage of hardware and software integration,” said Jean-Philippe Courtois, Executive Vice President and President, Microsoft Global Sales, Marketing and Operations. “With Taiwan’s expertise in hardware manufacturing and the new datacenter region, we look forward to greater transformation, advancing what is possible with 5G, AI and IoT capabilities spanning the intelligent cloud and intelligent edge.”

Image Credits: Microsoft

The new region will offer access to the core Microsoft Azure services, along with support for Microsoft 365, Dynamics 365 and Power Platform. That’s pretty much Microsoft’s playbook for launching all of its new regions these days. Like virtually all of Microsoft’s new data center regions, this one will also offer multiple availability zones.

Microsoft Azure announces its first region in Austria

Microsoft today announced its plans to launch a new data center region in Austria, its first in the country. With nearby Azure regions in Switzerland, Germany, France and a planned new region in northern Italy, this part of Europe now has its fair share of Azure coverage. Microsoft also noted that it plans to launch a new ‘Center of Digital Excellence’ in Austria to “modernize Austria’s IT infrastructure, public governmental services and industry innovation.”

In total, Azure now features 65 cloud regions — though that number includes some that aren’t online yet. As its competitors like to point out, not all of them feature multiple availability zones yet, but the company plans to change that. Until then, the fact that there’s usually another nearby region can often make up for that.

Image Credits: Microsoft

Speaking of availability zones: in addition to announcing this new data center region, Microsoft also today announced plans to expand its cloud in Brazil, with new availability zones to enable high-availability workloads launching in the existing Brazil South region in 2021. Currently, this region only supports Azure workloads but will add support for Microsoft 365, Dynamics 365 and Power Platform over the course of the next few months.

This announcement is part of a large commitment to building out its presence in Brazil. Microsoft is also partnering with the Ministry of Economy “to help job matching for up to 25 million workers and is offering free digital skilling with the capacity to train up to 5.5 million people” and to use its AI to protect the rainforest. That last part may sound a bit naive, but the specific plan here is to use AI to predict likely deforestation zones based on data from satellite images.

Microsoft challenges Twilio with the launch of Azure Communication Services

Microsoft today announced the launch of Azure Communication Services, a new set of features in its cloud that enable developers to add voice and video calling, chat and text messaging to their apps, as well as old-school telephony.

The company describes the new set of services as the “first fully managed communication platform offering from a major cloud provider,” and that seems right: Google and AWS offer some of these features — the AWS notification service, for example — but not as part of a cohesive communication service. Indeed, Azure Communication Services looks more like a competitor to the core features of Twilio or the up-and-coming MessageBird.

Over the course of the last few years, Microsoft has built up a lot of experience in this area, in large part thanks to the success of its Teams service. Unsurprisingly, that’s something Microsoft is also playing up in its announcement.

“Azure Communication Services is built natively on top of a global, reliable cloud — Azure. Businesses can confidently build and deploy on the same low latency global communication network used by Microsoft Teams to support 5B+ meeting minutes daily,” writes Scott Van Vliet, Corporate Vice President for Intelligent Communication at the company.

Microsoft also stresses that it offers a set of additional smart services that developers can tap into to build out their communication services, including its translation tools, for example. The company also notes that its services are encrypted to meet HIPAA and GDPR standards.

Like similar services, developers access the various capabilities through a set of new APIs and SDKs.

As for the core services, the capabilities here are pretty much what you’d expect. There’s voice and video calling (and the ability to shift between them). There’s support for chat and starting in October, users will also be able to send text messages. Microsoft says developers will be able to send these to users anywhere, with Microsoft positioning it as a global service.
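
To give a feel for what that looks like from a developer’s perspective, here is a hedged sketch of sending an SMS with the Azure Communication Services Python SDK; the connection string and phone numbers are placeholders, and the exact parameter names may differ slightly between SDK versions.

```python
from azure.communication.sms import SmsClient

# Placeholder connection string for an Azure Communication Services resource.
sms_client = SmsClient.from_connection_string(
    "endpoint=https://<resource>.communication.azure.com/;accesskey=<key>"
)

results = sms_client.send(
    from_="+18005551234",   # a number provisioned through the service
    to="+18005556789",
    message="Your order has shipped.",
)
```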

Provisioning phone numbers, too, is part of the service: developers will be able to provision numbers for in-bound and out-bound calls, port existing numbers, request new ones and — most importantly for contact-center users — integrate them with existing on-premises equipment and carrier networks.

“Our goal is to meet businesses where they are and provide solutions to help them be resilient and move their business forward in today’s market,” writes Van Vliet. “We see rich communication experiences – enabled by voice, video, chat, and SMS – continuing to be an integral part in how businesses connect with their customers across devices and platforms.”

Microsoft Azure launches new availability zones in Canada and Australia

Microsoft Azure offers developers access to more data center regions than its competitors, but it was late to the game of offering different availability zones in those regions for high-availability use cases. After a few high-profile issues a couple of years ago, it accelerated its roadmap for building availability zones. Currently, 12 of Microsoft’s regions feature availability zones, and as the company announced at its Ignite conference, both the Canada Central and Australia regions will now feature availability zones as well.

In addition, the company today promised that it would launch availability zones in each country it operates data centers in within the next 24 months.

The idea of an availability zone is to offer users access to data centers that are in the same geographic region but are physically separate, each with its own power, networking and connectivity infrastructure. That way, if one of those data centers goes offline for whatever reason, another one in the same area can take over.

In its early days, Microsoft Azure took a slightly different approach and focused on regions without availability zones, arguing that geographic expansion was more important than offering zones. Google took a somewhat similar approach, but it now offers three availability zones for virtually all of its regions (and four in Iowa). The general idea was that developers could always choose multiple regions for high-availability applications, but that still introduces additional latencies, for example.

Microsoft brings data services to its Arc multi-cloud management service

Microsoft today launched a major update to its Arc multi-cloud service that allows Azure customers to run and manage workloads across clouds — including those of Microsoft’s competitors — and their on-premises data centers. First announced at Microsoft Ignite in 2019, Arc was always meant to not just help users manage their servers but to also allow them to run data services like Azure SQL and Azure Database for PostgreSQL close to where their data sits.

Today, the company is making good on this promise with the preview launch of Azure Arc enabled data services with support for, as expected, Azure SQL and Azure Database for PostgreSQL.
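
Since an Arc-deployed database exposes an ordinary endpoint, applications can keep using their standard drivers. A minimal sketch with psycopg2, assuming a hypothetical Arc-hosted PostgreSQL instance (host, database and credentials are placeholders):

```python
import psycopg2

# Placeholder connection details for a PostgreSQL instance deployed through Arc.
conn = psycopg2.connect(
    host="postgres.arc.example.internal",
    port=5432,
    dbname="appdb",
    user="app",
    password="<password>",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT version()")
    print(cur.fetchone()[0])
```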

In addition, Microsoft is making the core feature of Arc, Arc-enabled servers, generally available. These are the tools at the core of the service that allow enterprises to use the standard Azure Portal to manage and monitor their Windows and Linux servers across their multi-cloud and edge environments.

Image Credits: Microsoft

“We’ve always known that enterprises are looking to unlock the agility of the cloud — they love the app model, they love the business model — while balancing a need to maintain certain applications and workloads on premises,” Rohan Kumar, Microsoft’s corporate VP for Azure Data said. “A lot of customers actually have a multi-cloud strategy. In some cases, they need to keep the data specifically for regulatory compliance. And in many cases, they want to maximize their existing investments. They’ve spent a lot of CapEx.”

As Kumar stressed, Microsoft wants to meet customers where they are, without forcing them to adopt a container architecture, for example, or replace their specialized engineered appliances to use Arc.

“Hybrid is really [about] providing that flexible choice to our customers, meeting them where they are, and not prescribing a solution,” he said.

He admitted that this approach makes engineering the solution more difficult, but the team decided that the baseline should be a container endpoint and nothing more. And for the most part, Microsoft packaged up the tools its own engineers were already using to run Azure services on the company’s own infrastructure to manage these services in a multi-cloud environment.

“In hindsight, it was a little challenging at the beginning, because, you can imagine, when we initially built them, we didn’t imagine that we’ll be packaging them like this. But it’s a very modern design point,” Kumar said. But the result is that supporting customers is now relatively easy because it’s so similar to what the team does in Azure, too.

Kumar noted that another selling point for the Azure data services is that the version of Azure SQL they include is essentially evergreen, letting customers stop worrying about SQL Server licensing and end-of-life support questions.

Google Cloud’s new BigQuery Omni will let developers query data in GCP, AWS and Azure

At its virtual Cloud Next ’20 event, Google today announced a number of updates to its cloud portfolio, but the public alpha launch of BigQuery Omni is probably the highlight of this year’s event. Powered by Google Cloud’s Anthos hybrid-cloud platform, BigQuery Omni allows developers to use the BigQuery engine to analyze data that sits in multiple clouds, including those of Google Cloud competitors like AWS and Microsoft Azure — though for now, the service only supports AWS, with Azure support coming later.

Using a unified interface, developers can analyze this data locally without having to move data sets between platforms.

“Our customers store petabytes of information in BigQuery, with the knowledge that it is safe and that it’s protected,” said Debanjan Saha, the GM and VP of Engineering for Data Analytics at Google Cloud, in a press conference ahead of today’s announcement. “A lot of our customers do many different types of analytics in BigQuery. For example, they use the built-in machine learning capabilities to run real-time analytics and predictive analytics. […] A lot of our customers who are very excited about using BigQuery in GCP are also asking, ‘how can they extend the use of BigQuery to other clouds?’ ”

Image Credits: Google

Google has long said that it believes that multi-cloud is the future — something that most of its competitors would probably agree with, though they all would obviously like you to use their tools, even if the data sits in other clouds or is generated off-platform. It’s the tools and services that help businesses to make use of all of this data, after all, where the different vendors can differentiate themselves from each other. Maybe it’s no surprise then, given Google Cloud’s expertise in data analytics, that BigQuery is now joining the multi-cloud fray.

“With BigQuery Omni customers get what they wanted,” Saha said. “They wanted to analyze their data no matter where the data sits and they get it today with BigQuery Omni.”

Image Credits: Google

He noted that Google Cloud believes that this will help enterprises break down their data silos and gain new insights into their data, all while allowing developers and analysts to use a standard SQL interface.
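
That standard interface is the regular BigQuery client; the sketch below shows what such a query might look like from Python, with a placeholder table name (under Omni, the underlying data could sit in S3 rather than Google Cloud, but the query itself doesn’t change).

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT country, COUNT(*) AS orders
    FROM `my-project.sales.orders`   -- placeholder table
    GROUP BY country
    ORDER BY orders DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.country, row.orders)
```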

Today’s announcement is also a good example of how Google’s bet on Anthos is paying off by making it easier for the company to not just allow its customers to manage their multi-cloud deployments but also to extend the reach of its own products across clouds. This also explains why BigQuery Omni isn’t available for Azure yet, given that Anthos for Azure is still in preview, while AWS support became generally available in April.

Microsoft and SAS announce deep technology partnership

Microsoft and SAS, the privately held enterprise data management and analytics company (and not the airline), today announced a far-reaching partnership that will see Microsoft’s Azure become SAS’s preferred cloud and deep integrations of SAS’s various products into Microsoft’s cloud portfolio, ranging from Azure to Dynamics 365 and PowerBI. The two companies also plan to launch new joint solutions for their customers.

While you may not necessarily be familiar with 44-year-old SAS, the North Carolina-based company counts more than 90 of the top 100 Fortune 1000 companies among its customers. Marquee customers include the likes of Allianz, Discover, Honda, HSBC, Lockheed Martin, Lufthansa and Nestle. While it provides tools and services for companies across a wide range of verticals, they all focus on helping these companies better manage their data and turn it into actionable analytics. Like similar data-centric companies, these days that includes a lot of work on machine learning, too.

SAS COO and CTO Oliver Schabenberger

“It is a technology partnership,” SAS COO and CTO Oliver Schabenberger told me ahead of today’s announcement. “Our customers are increasingly moving to the cloud. I have something that I call the ‘principles of analytics.’ The first principle is: analytics follows the data — and increasingly, data is moving to the cloud. We have our own cloud operation at SAS. We have done enterprise hosting for over 20 years and have a lot of experience in that. So one of the strategic questions that I asked myself is how do we combine what we love so much about our own cloud and managed services and working directly with a customer with the scale, the agility and the reach of a public cloud?”

The answer to that for SAS was a partnership with Microsoft. Both companies, Schabenberger said, are looking at how to democratize access to technologies like machine learning and analytics, but they are also trying to build data visualization tools and other services that make it easier for anybody within a company to work with the increasingly large data sets that most enterprises now gather.

“The technologies of SAS and Microsoft to me go hand in hand,” said Schabenberger. “They really complement each other. What Microsoft’s doing with Dynamics, with Power Platform, I can envision a new class of business applications — all low-code, no-code — where data and analytics drive logic and drive decisioning. And so for us, what’s really interesting, fascinating and innovative about this relationship is that this is not about bringing a service to Azure, or an integration into Synapse. It is really looking at the entire Microsoft Cloud estate, if you will, from Azure to integrating with AD, with AKS, with [Azure] Database for PostgreSQL. These are obvious things, but then looking at Microsoft 365, Dynamics 365 and Power Platform, how can we be part of this ecosystem? I think that’s a very powerful integration.”

It’s important to note that this is not an exclusive agreement, and Schabenberger stressed that SAS will continue to offer support for customers who choose a different public cloud provider.

Scott Guthrie, Microsoft executive VP of its Cloud and AI group, echoed this. “We couldn’t be more excited on the Microsoft side for this partnership. If you look at pretty much any business out there, they’re using SAS for analytics and they’re using Microsoft software as well. And the thing that Oliver called out and what we really look for in strategic partnerships like this is, where can we help our mutual customers do more and achieve more? And I think both from a technology alignment perspective and then also from a mission statement and culture perspective, that’s where we’re so aligned.”

Both Guthrie and Schabenberger stressed how deep the integrations here are. As an example, Guthrie noted that users will be able to take SAS models and embed them into SQL Server statements — and there will be similar integrations for Microsoft products into SAS’s tools, too. Guthrie also noted that the two companies will go to market together in a deep way, too, leveraging the existing salesforces of both companies. “So it’s a little different from what we might do with a startup, which tends to not have a big salesforce. But as part of this partnership, you’ll definitely see our go-to-market deep alignment and Microsoft sellers will be heavily incented to promote and push the SAS integration and likewise, SAS is going to be highly incented to drive this integration from their perspective as well.”

One interesting aspect here is that both companies offer competing products, be that around data management and analytics, as well as data visualization. Guthrie and Schabenberger were quite open about this, though. “I’m perfectly comfortable with that,” said Schabenberger. “I’ve recognized for a long time that our customers have choices and they exercise those choices. And if we bring the right technology to bear and offer it to them, then I’m proud of the technology we built. We’re not the best at everything and I am really looking forward actually to focusing on our core competency, where we’re strongest — and I’m happy to have customers make other choices. […] We have an existing customer base that wants to make use of their existing investment in SAS technology, but also wants to modernize, wants to be part of a cloud ecosystem, wants to operate with agility and speed — and we can combine all that.”

“We’ve been around long enough and we’re big enough and we have enough customers to also realize, what really matters is making your customers successful,” noted Guthrie. “And the complementary capabilities that we’re bringing together by partnering is so powerful that, yes, there might be some overlap in a few places, but for the most part, this is such a powerful accelerant for our customers and we’re going to both benefit from that.”

Docker expands relationship with Microsoft to ease developer experience across platforms

When Docker sold off its enterprise division to Mirantis last fall, that didn’t mark the end of the company. In fact, Docker still exists and has refocused as a cloud-native developer tools vendor. Today it announced an expanded partnership with Microsoft around simplifying running Docker containers in Azure.

As its new mission suggests, it involves tighter integration between Docker and a couple of Azure developer tools including Visual Studio Code and Azure Container Instances (ACI). According to Docker, it can take developers hours or even days to set up their containerized environment across the two sets of tools.

The idea of the integration is to make it easier, faster and more efficient to include Docker containers when developing applications with the Microsoft tool set. Docker CEO Scott Johnston says it’s a matter of giving developers a better experience.

“Extending our strategic relationship with Microsoft will further reduce the complexity of building, sharing and running cloud-native, microservices-based applications for developers. Docker and VS Code are two of the most beloved developer tools and we are proud to bring them together to deliver a better experience for developers building container-based apps for Azure Container Instances,” Johnston said in a statement.

Among the features they are announcing is the ability to log into Azure directly from the Docker command line interface, a big simplification that reduces going back and forth between the two sets of tools. What’s more, developers can set up a Microsoft ACI environment complete with a set of configuration defaults. Developers will also be able to switch easily between their local desktop instance and the cloud to run applications.

These and other integrations are designed to make it easier for developers who use both Azure and Docker to work in the Microsoft cloud service without having to jump through a lot of extra hoops.

It’s worth noting that these integrations are starting out in beta, but the company promises they should be released sometime in the second half of this year.

Microsoft launches new tools for building fairer machine learning models

At its Build developer conference, Microsoft today put a strong emphasis on machine learning, but in addition to plenty of new tools and features, the company also highlighted its work on building more responsible and fairer AI systems — both in the Azure cloud and in Microsoft’s open-source toolkits.

These include new tools for differential privacy and a system for ensuring that models work well across different groups of people, as well as new tools that enable businesses to make the best use of their data while still meeting strict regulatory requirements.

As developers are increasingly tasked with building AI models, they regularly have to ask themselves whether the systems are “easy to explain” and whether they “comply with non-discrimination and privacy regulations,” Microsoft notes in today’s announcement. But to do that, they need tools that help them better interpret their models’ results. One of those is InterpretML, which Microsoft launched a while ago; another is the Fairlearn toolkit, which can be used to assess the fairness of ML models, is currently available as an open-source tool and will be built into Azure Machine Learning next month.
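
As a rough sketch of the kind of assessment Fairlearn enables, here is a toy example using the MetricFrame API from recent versions of the library (the data is made up, and the API surface at the time of this announcement differed slightly):

```python
import pandas as pd
from fairlearn.metrics import MetricFrame
from sklearn.metrics import accuracy_score

# Toy labels and predictions standing in for a real model evaluation.
y_true = [1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0]
sensitive = pd.Series(["A", "A", "A", "A", "B", "B", "B", "B"], name="group")

mf = MetricFrame(
    metrics=accuracy_score,
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=sensitive,
)

print(mf.overall)       # accuracy over everyone
print(mf.by_group)      # accuracy broken out per group
print(mf.difference())  # largest gap between groups
```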

As for differential privacy, which makes it possible to get insights from private data while still protecting private information, Microsoft today announced WhiteNoise, a new open-source toolkit that’s available both on GitHub and through Azure Machine Learning. WhiteNoise is the result of a partnership between Microsoft and Harvard’s Institute for Quantitative Social Science.

Azure Cognitive Services learns more languages

It wouldn’t be a Microsoft Build without a bunch of new capabilities for Azure Cognitive Services, Microsoft’s cloud-based AI tools for developers.

The first new feature is what Microsoft calls the “personalized apprentice mode,” which allows the existing Personalizer API to learn about user preferences in real time and in parallel with existing apps, all without being exposed to the user until it reaches pre-set performance goals.

With this update, the Cognitive Services Speech Service, the unified Azure API for speech-to-text, text-to-speech and translation, is coming to 27 new locales and the company promises a 20 percent reduction in word error rates for its speech transcription services. For the Neural Text to Speech service, Microsoft promises that it has reduced the pronunciation error rate by 50 percent and it’s now bringing this service to 11 new locales with 15 new voices, too. It is also adding a pronunciation assessment to the service.
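
For context on what the Speech Service looks like from code, here is a hedged sketch of one-shot speech recognition with the Speech SDK for Python; the key, region and locale are placeholders.

```python
import azure.cognitiveservices.speech as speechsdk

# Placeholder key and region for a Speech resource.
speech_config = speechsdk.SpeechConfig(subscription="<key>", region="westeurope")
speech_config.speech_recognition_language = "de-DE"  # any supported locale

recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
result = recognizer.recognize_once()  # listens on the default microphone

if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print(result.text)
```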

Also new is an addition to QnA Maker, a no-code service that can automatically read FAQs, support websites, product manuals and other documents and turn them into Q&A pairs. In the process, it creates something akin to a knowledge base, which users can now also collaboratively edit with the addition of role-based access control to the service.

In addition, Azure Cognitive Search, a related service that focuses on — you guessed it — search, is also getting a couple of new capabilities. Using the same natural language understanding engine that powers Bing and Microsoft Office, Azure Cognitive Search is getting a new custom search ranking feature (in preview) that allows users to build their own search rankings based on their specific needs. As Microsoft notes, a home improvement retailer could use this to build its own search ranking system to augment the existing Cognitive Search results.

Microsoft launches Azure Synapse Link to help enterprises get faster insights from their data

At its Build developer conference, Microsoft today announced Azure Synapse Link, a new enterprise service that allows businesses to analyze their data faster and more efficiently, using an approach that’s generally called ‘hybrid transaction/analytical processing’ (HTAP). That’s a mouthful, but it essentially means enterprises can use the same database system for analytical and transactional workloads. Traditionally, enterprises had to make a tradeoff: either build a single, often highly over-provisioned system for both, or maintain separate systems for transactional and analytics workloads.

Last year, at its Ignite conference, Microsoft announced Azure Synapse Analytics, an analytics service that combines analytics and data warehousing to create what the company calls “the next evolution of Azure SQL Data Warehouse.” Synapse Analytics brings together data from Microsoft’s services and those from its partners and makes it easier to analyze.

“One of the key things, as we work with our customers on their digital transformation journey, there is an aspect of being data-driven, of being insights-driven as a culture, and a key part of that really is that once you decide there is some amount of information or insights that you need, how quickly are you able to get to that? For us, time to insight and a secondary element, which is the cost it takes, the effort it takes to build these pipelines and maintain them with an end-to-end analytics solution, was a key metric we have been observing for multiple years from our largest enterprise customers,” said Rohan Kumar, Microsoft’s corporate VP for Azure Data.

Synapse Link takes the work Microsoft did on Synapse Analytics a step further by removing the barriers between Azure’s operational databases and Synapse Analytics, so enterprises can immediately get value from the data in those databases without going through a data warehouse first.

“What we are announcing with Synapse Link is the next major step in the same vision that we had around reducing the time to insight,” explained Kumar. “And in this particular case, a long-standing barrier that exists today between operational databases and analytics systems is these complex ETL (extract, transform, load) pipelines that need to be set up just so you can do basic operational reporting or where, in a very transactionally consistent way, you need to move data from your operational system to the analytics system, because you don’t want to impact the performance of the operational system in any way because that’s typically dealing with, depending on the system, millions of transactions per second.”

ETL pipelines, Kumar argued, are typically expensive and hard to build and maintain, yet enterprises are now building new apps — and maybe even line of business mobile apps — where any action that consumers take and that is registered in the operational database is immediately available for predictive analytics, for example.

From the user perspective, enabling this only takes a single click to link the two, while it removes the need for managing additional data pipelines or database resources. That, Kumar said, was always the main goal for Synapse Link. “With a single click, you should be able to enable real-time analytics on your operational data in ways that don’t have any impact on your operational systems, so you’re not using the compute part of your operational system to do the query; you actually have to transform the data into a columnar format, which is more adaptable for analytics, and that’s really what we achieved with Synapse Link.”

Because traditional HTAP systems on-premises typically share their compute resources with the operational database, those systems never quite took off, Kumar argued. In the cloud, with Synapse Link, though, that impact doesn’t exist because you’re dealing with two separate systems. Now, once a transaction gets committed to the operational database, the Synapse Link system transforms the data into a columnar format that is more optimized for the analytics system — and it does so in real time.

For now, Synapse Link is only available in conjunction with Microsoft’s Cosmos DB database. As Kumar told me, that’s because that’s where the company saw the highest demand for this kind of service, but you can expect the company to add support for Azure SQL, Azure Database for PostgreSQL and Azure Database for MySQL in the future.
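For Cosmos DB, the analytical store that Synapse Link reads from is switched on per container. Here is a hedged sketch using the azure-cosmos Python SDK; the endpoint, key and names are placeholders, the analytical_storage_ttl parameter reflects my reading of that SDK, and the account itself needs Synapse Link enabled separately.

```python
# Hedged sketch: create a Cosmos DB container with an analytical store so that
# Synapse Link can query it. Placeholder endpoint, key and names.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://<your-account>.documents.azure.com:443/",  # placeholder endpoint
    credential="<your-account-key>",                    # placeholder key
)
database = client.create_database_if_not_exists("appdb")

container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/region"),
    analytical_storage_ttl=-1,  # assumed: -1 keeps the analytical copy indefinitely
)
```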

#azure-sql-data-warehouse, #business-intelligence, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #data-management, #data-warehouse, #developer, #enterprise, #microsoft, #microsoft-build-2020, #microsoft-azure, #mysql, #postgresql, #sql, #tc

0

Azure Arc, Microsoft’s service for managing cloud resources anywhere, is now in public preview

At its Build developer conference, Microsoft today announced that Azure Arc, its service for managing cloud resources anywhere, including competing clouds like AWS and GCP and platforms like Red Hat’s OpenShift, is now in public preview.

Microsoft first announced this Kubernetes-based solution at its Ignite event in Orlando last September. One feature that makes it stand out is that it builds on some of what Microsoft learned from its Azure Stack project for bringing Azure services to its customers’ data centers (and unsurprisingly, Azure Arc also supports deployments on Azure Stack). Thanks to this, Azure Arc doesn’t just allow you to manage containerized workloads anywhere but also includes the ability to bring services like Azure SQL Database and Azure Database for PostgreSQL to these platforms. It’s also worth noting that while this is a Microsoft service, it supports both Windows and Linux servers.
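In practice, putting an existing cluster under Arc’s management starts with a registration step. Here is a rough sketch that shells out to the Azure CLI’s connectedk8s extension from Python; the cluster and resource-group names are placeholders, and the CLI (with that extension installed and logged in) is assumed to be available.

```python
# Hedged sketch: connect an existing Kubernetes cluster to Azure Arc via the
# Azure CLI's "connectedk8s" extension. Names below are placeholders.
import subprocess

subprocess.run(
    [
        "az", "connectedk8s", "connect",
        "--name", "onprem-cluster-01",       # placeholder cluster name
        "--resource-group", "arc-demo-rg",   # placeholder resource group
    ],
    check=True,  # raise if the CLI reports an error
)
```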

As part of today’s public preview launch, Microsoft also announced that Arc now supports SUSE Linux Enterprise Server and the SUSE CaaS Platform. “Azure Arc for servers gives customers a central management control plane with security and governance capabilities for SUSE Linux Enterprise Server systems hosted outside of the Azure cloud, such as edge deployments,” says SUSE President of Engineering and Innovation Thomas Di Giacomo.

It’s no secret that most large cloud vendors now have some kind of multi-cloud management service that’s similar to Azure Arc. Google is betting heavily on Anthos, for example, while AWS offers its fully managed Outposts service. They all have slightly different characteristics and philosophies, but the fact that every major cloud player now offers some version of this is a clear sign that enterprises don’t want to be locked into using a single cloud, even as these services make them place a bet on a specific vendor for their management services.

In a related set of announcements, Microsoft also launched a large set of new features for Azure Stack. This includes the private preview of Azure Stack Hub fleet management for monitoring deployments across Azure and Azure Stack Hub, as well as GPU partitioning using AMD GPUs, which is also now in private preview. This last part matters not just for visualization workloads but also for enabling graphics-intensive workloads on virtualized desktop environments through Azure Stack Hub for enterprises that use AMD GPUs in their servers. With GPU partitioning, admins can give multiple users access to a share of the overall GPU’s power.

#amd, #artificial-intelligence, #azure, #cloud-computing, #cloud-infrastructure, #cloud-storage, #computing, #fleet-management, #google, #linux, #microsoft, #microsoft-build-2020, #microsoft-windows, #microsoft-azure, #postgresql, #red-hat, #sql, #tc

0

New earnings report shows Microsoft’s shift to cloud and subscriptions is working

Image: Xbox Series X, due in late 2020. It’s tall. And it has a modified controller compared to the Xbox One pad. (credit: Xbox)

The gauntlet of tech earnings reports has mostly come to a close, and there’s a wide range of performance. Almost every part of the tech industry has been rattled by COVID-19, but Microsoft managed to report accelerated growth and strong performance for all of its businesses. It’s a sign that the software company’s efforts to reinvent itself may be working—and that cloud and subscription services will define the company (and with it, customers’ experiences with its products) for years to come.

Microsoft’s Q3 2020 earnings report showed significant growth for all three of the company’s business segments, which hasn’t always happened even in a “normal” quarter. Productivity, which includes services like Office and LinkedIn, grew 16 percent year over year to $11.7 billion in revenue, a small step down from $11.8 billion in the immediately preceding quarter. Cloud, which includes Azure and GitHub, grew 27 percent year over year to $12.3 billion. And personal computing, an umbrella that covers Windows, Xbox, and Surface, grew a more modest 3 percent year over year to $11 billion.

All told, Microsoft’s revenue for the quarter was $35 billion, down $2 billion from the previous quarter but up 15 percent from last year’s Q3. Even Xbox, which saw an 11 percent drop last quarter, grew by three points. Microsoft this week announced that Xbox Game Pass, a Netflix-like subscription for accessing about 100 games on the Xbox One and Windows 10 platforms, reached 10 million subscribers—more evidence that subscription services and the like are now integral to the company’s strategy across all its businesses.

Read 3 remaining paragraphs | Comments

#azure, #cloud-services, #earnings, #microsoft, #microsoft-azure, #stock-market, #tech

0

Google Cloud’s fully-managed Anthos is now generally available for AWS

A year ago, back in the days of in-person conferences, Google officially announced the launch of its Anthos multi-cloud application modernization platform at its Cloud Next conference. The promise of Anthos was always that it would allow enterprises to write their applications once, package them into containers and then manage their multi-cloud deployments across GCP, AWS, Azure and their on-prem data centers.

Until now, support for AWS and Azure was only available in preview, but today, the company is making support for AWS and on-premises generally available. Microsoft Azure support remains in preview, though.

“As an AWS customer now, or a GCP customer, or a multi-cloud customer, […] you can now run Anthos on those environments in a consistent way, so you don’t have to learn any proprietary APIs and be locked in,” Eyal Manor, the VP of engineering in charge of Anthos, told me. “And for the first time, we enable the portability between different infrastructure environments as opposed to what has happened in the past where you were locked into a set of API’s.”

Manor stressed that Anthos was designed to be multi-cloud from day one. As for why AWS support is launching ahead of Azure, Manor said that there was simply more demand for it. “We surveyed the customers and they said, hey, we want, in addition to GCP, we want AWS,” he said. But support for Azure will come later this year and the company already has a number of preview customers for it. In addition, Anthos will also come to bare metal servers in the future.
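Mechanically, attaching an existing AWS or on-prem cluster to Anthos goes through a GKE Hub membership registration. Here is a rough sketch of what that step looked like around this release, wrapped in Python; the membership name, kubeconfig context and service-account key file are all placeholders.

```python
# Hedged sketch: register a non-GKE cluster with Anthos via a GKE Hub
# membership, using the gcloud CLI from Python. Names and paths are placeholders.
import subprocess

subprocess.run(
    [
        "gcloud", "container", "hub", "memberships", "register",
        "aws-cluster-01",                                     # placeholder membership name
        "--context=aws-cluster-01-context",                   # kubeconfig context for the cluster
        "--service-account-key-file=./anthos-register.json",  # placeholder GCP service account key
    ],
    check=True,
)
```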

Looking even further ahead, Manor also noted that better support for machine learning workloads is on the way. Many businesses, after all, want to be able to update and run their models right where their data resides, no matter what cloud that may be. There, too, the promise of Anthos is that developers can write the application once and then run it anywhere.

“I think a lot of the initial response and excitement was from the developer audiences,” Jennifer Lin, Google Cloud’s VP of product management, told me. “Eric Brewer had led a white paper that we did to say that a lot of the Anthos architecture sort of decouples the developer and the operator stakeholder concerns. There hadn’t been a multi-cloud shared software architecture where we could do that and still drive emerging and existing applications with a common shared software stack.”

She also noted that a lot of Google Cloud’s ecosystem partners endorsed the overall Anthos architecture early on because they, too, wanted to be able to write once and run anywhere — and so do their customers.

Plaid is one of the launch partners for these new capabilities. “Our customers rely on us to be always available and as a result we have very high reliability requirements,” said Naohiko Takemura, Plaid’s head of engineering. “We pursued a multi-cloud strategy to ensure redundancy for our critical KARTE service. Google Cloud’s Anthos works seamlessly across GCP and our other cloud providers, preventing any business disruption. Thanks to Anthos, we prevent vendor lock-in, avoid managing cloud-specific infrastructure, and our developers are not constrained by cloud providers.”

With this release, Google Cloud is also bringing deeper support for virtual machines to Anthos, as well as improved policy and configuration management.

Over the next few months, the Anthos Service Mesh will also add support for applications that run in traditional virtual machines. As Lin told me, “a lot of this is about driving better agility and taking the complexity out of it so that we have abstractions that work across any environment, whether it’s legacy or new or on-prem or AWS or GCP.”

#amazon-web-services, #api, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #developer, #enterprise, #google, #google-cloud, #google-cloud-platform, #machine-learning, #microsoft, #microsoft-azure, #netapp, #product-management, #tc

0