Google’s Anthos multi-cloud platform gets improved logging, Windows container support and more

Google today announced a sizable update to its Anthos multi-cloud platform that lets you build, deploy and manage containerized applications anywhere, including on Amazon’s AWS and (in preview) on Microsoft Azure.

Version 1.7 includes new features like improved metrics and logging for Anthos on AWS, a new Connect gateway to interact with any cluster right from Google Cloud and a preview of Google’s managed control plane for Anthos Service Mesh. Other new features include Windows container support for environments that use VMware’s vSphere platform and new tools for developers to make it easier for them to deploy their applications to any Anthos cluster.

Today’s update comes almost exactly two years after Google CEO Sundar Pichai originally announced Anthos at its Cloud Next event in 2019 (before that, Google called this project the ‘Google Cloud Services Platform,’ which launched three years ago). Hybrid- and multi-cloud, it’s fair to say, play a key role in the Google Cloud roadmap — and maybe more so for Google than for any of its competitors. And recently, Google brought on industry veteran Jeff Reed to become the VP of Product Management in charge of Anthos.

Reed told me that he believes that there are a lot of factors right now that are putting Anthos in a good position. “The wind is at our back. We bet on Kubernetes, bet on containers — those were good decisions,” he said. Increasingly, customers are also now scaling out their use of Kubernetes and have to figure out how to best scale out their clusters and deploy them in different environments — and to do so, they need a consistent platform across these environments. He also noted that when it comes to bringing on new Anthos customers, it’s really those factors that determine whether a company will look into Anthos or not.

He acknowledged that there are other players in this market, but he argues that Google Cloud’s take on this is also quite different. “I think we’re pretty unique in the sense that we’re from the cloud, cloud-native is our core approach,” he said. “A lot of what we talk about in [Anthos] 1.7 is about how we leverage the power of the cloud and use what we call ‘an anchor in the cloud’ to make your life much easier. We’re more like a cloud vendor there, but because we support on-prem, we see some of those other folks.” Those other folks being IBM/Red Hat’s OpenShift and VMware’s Tanzu, for example. 

The addition of support for Windows containers in vSphere environments also points to the fact that a lot of Anthos customers are classical enterprises that are trying to modernize their infrastructure, yet still rely on a lot of legacy applications that they are now trying to bring to the cloud.

Looking ahead, one thing we’ll likely see is more integrations with a wider range of Google Cloud products into Anthos. And indeed, as Reed noted, inside of Google Cloud, more teams are now building their products on top of Anthos themselves. In turn, that then makes it easier to bring those services to an Anthos-managed environment anywhere. One of the first of these internal services that run on top of Anthos is Apigee. “Your Apigee deployment essentially has Anthos underneath the covers. So Apigee gets all the benefits of a container environment, scalability and all those pieces — and we’ve made it really simple for that whole environment to run kind of as a stack,” he said.

I guess we can expect to hear more about this in the near future — or at Google Cloud Next 2021.

 

#anthos, #apigee, #aws, #ceo, #chrome-os, #cisco, #cloud, #cloud-computing, #cloud-infrastructure, #computing, #enterprise, #google, #google-cloud, #google-cloud-platform, #ibm, #kubernetes, #microsoft, #microsoft-windows, #red-hat, #sundar-pichai, #vmware


Window Snyder’s new startup Thistle Technologies raises $2.5M seed to secure IoT devices

The Internet of Things has a security problem. The past decade has seen wave after wave of new internet-connected devices, from sensors through to webcams and smart home tech, often manufactured in bulk but with little — if any — consideration to security. Worse, many device manufacturers make no effort to fix security flaws, while others simply leave out the software update mechanisms needed to deliver patches altogether.

That leaves an entire swath of insecure and unpatchable devices destined to fail and be thrown out when they break down or are inevitably hacked.

Security veteran Window Snyder thinks there is a better way. Her new startup, Thistle Technologies, is backed with $2.5 million in seed funding from True Ventures with the goal of helping IoT manufacturers reliably and securely deliver software updates to their devices.

Snyder founded Thistle last year and named it after the flowering plant whose sharp prickles deter animals from eating it. “It’s a defense mechanism,” Snyder told TechCrunch, a fitting name for a defensive technology company. The startup aims to help device manufacturers that lack the personnel or resources to integrate update mechanisms into their devices’ software, so those devices can receive security updates and better defend against security threats.

“We’re building the means so that they don’t have to do it themselves. They want to spend the time building customer-facing features anyway,” said Snyder. Prior to founding Thistle, Snyder worked in senior cybersecurity positions at Apple, Intel, and Microsoft, and also served as chief security officer at Mozilla, Square, and Fastly.

Thistle lands on the security scene at a time when IoT needs it most. Botnet operators are known to scan the internet for devices with weak default passwords and hijack their internet connections to pummel victims with floods of internet traffic, knocking entire websites and networks offline. In 2016, a record-breaking distributed denial-of-service attack launched by the Mirai botnet on internet infrastructure giant Dyn knocked some of the biggest websites — Shopify, SoundCloud, Spotify, Twitter — offline for hours. Mirai had ensnared thousands of IoT devices into its network at the time of the attack.

Other malicious hackers target IoT devices as a way to get a foot into a victim’s network, allowing them to launch attacks or plant malware from the inside.

Since device manufacturers have done little to solve their security problems on their own, lawmakers are looking at legislation to curb some of the more egregious security mistakes, like using default — and often unchangeable — passwords and selling devices with no way to deliver security updates.

California paved the way after passing an IoT security law in 2018, with the U.K. following shortly after in 2019. The U.S. has no federal law governing basic IoT security standards.

Snyder said the push to introduce IoT cybersecurity laws could be “an easy way for folks to get into compliance” without having to hire fleets of security engineers. Having an update mechanism in place also helps keep IoT devices around for longer — potentially for years longer — simply by being able to push fixes and new features.

“To build the infrastructure that’s going to allow you to continue to make those devices resilient and deliver new functionality through software, that’s an incredible opportunity for these device manufacturers. And so I’m building a security infrastructure company to support that security needs,” she said.

With the seed round in the bank, Snyder said the company is focused on hiring device and back-end engineers, product managers, and building new partnerships with device manufacturers.

Phil Black, co-founder of True Ventures — Thistle’s seed round investor — described the company as “an astute and natural next step in security technologies.” He added: “Window has so many of the qualities we look for in founders. She has deep domain expertise, is highly respected within the security community, and she’s driven by a deep passion to evolve her industry.”

#apple, #bank, #botnet, #california, #co-founder, #computer-security, #computing, #cybercrime, #cyberwarfare, #dyn, #fastly, #intel, #internet-of-things, #internet-traffic, #malware, #microsoft, #mirai, #science-and-technology, #security, #shopify, #soundcloud, #spotify, #startups, #technology, #true-ventures, #united-kingdom, #united-states


Apple and Google pressed in antitrust hearing on whether app stores share data with product development teams

In today’s antitrust hearing in the U.S. Senate, Apple and Google representatives were questioned on whether they have a “strict firewall” or other internal policies in place that prevent them from leveraging the data from third-party businesses operating on their app stores to inform the development of their own competitive products. Apple, in particular, was called out for the practice of copying other apps by Senator Richard Blumenthal (D-CT), who said the practice had become so common that it earned a nickname with Apple’s developer community: “sherlocking.”

The term comes from Sherlock, Apple’s search tool from the early 2000s (which has its own Wikipedia entry under software). A third-party developer, Karelia Software, created an alternative tool called Watson. Following the success of Karelia’s product, Apple added Watson’s functionality to its own search tool, and Watson was effectively put out of business. The nickname “Sherlock” later became shorthand for any time Apple copies an idea from a third-party developer, threatening or even destroying that developer’s business.

Over the years, developers have claimed Apple “sherlocked” a number of apps, including Konfabulator (desktop widgets), iPodderX (podcast manager), Sandvox (app for building websites) and Growl (a notification system for Mac OS X) and, in more recent years, F.lux (blue light reduction tool for screens), Duet and Luna (apps that make the iPad a secondary display), as well as various screen-time-management tools. Now Tile claims Apple has also unfairly entered its market with AirTag.

During his questioning, Blumenthal asked Apple and Google’s representatives at the hearing — Kyle Andeer, Apple’s chief compliance officer, and Wilson White, Google’s senior director of Public Policy & Government Relations, respectively — if they employed any sort of “firewall” between their app stores and their business strategy.

Andeer somewhat dodged the question, saying, “Senator, if I understand the question correctly, we have separate teams that manage the App Store and that are engaged in product development strategy here at Apple.”

Blumenthal then clarified what he meant by “firewall.” He explained that it doesn’t mean whether or not there are separate teams in place, but whether there’s an internal prohibition on sharing data between the App Store and the people who run Apple’s other businesses.

Andeer then answered, “Senator, we have controls in place.”

He went on to note that over the past 12 years, Apple has only introduced “a handful of applications and services,” and in every instance, there are “dozens of alternatives” on the App Store. And, sometimes, the alternatives are more popular than Apple’s own product, he noted.

“We don’t copy. We don’t kill. What we do is offer up a new choice and a new innovation,” Andeer stated.

His argument may hold true when there are strong rivalries, like Spotify versus Apple Music, or Netflix versus Apple TV+, or Kindle versus Apple Books. But it’s harder to stretch it to areas where Apple makes smaller enhancements — like when Apple introduced Sidecar, a feature that allowed users to make their iPad a secondary display. Sidecar ended the need for a third-party app, after apps like Duet and Luna first proved the market.

Another example was when Apple built screen-time controls into its iOS software, but didn’t provide the makers of third-party screen-time apps with an API so consumers could use their preferred apps to configure Apple’s Screen Time settings via the third-party’s specialized interface or take advantage of other unique features.

Blumenthal said he interpreted Andeer’s response as to whether Apple has a “data firewall” as a “no.”

Posed the same question, Google’s representative, White, said his understanding was that Google had “data access controls in place that govern how data from our third-party services are used.”

Blumenthal pressed him to clarify if this was a “firewall,” meaning, he clarified again, “do you have a prohibition against access?”

“We have a prohibition against using our third-party services to compete directly with our first-party services,” White said, adding that Google has “internal policies that govern that.”

The senator said he would follow up on this matter with written questions, as his time expired.

#airtag, #api, #app-store, #apple, #apple-books, #apple-inc, #apple-tv, #apps, #computing, #firewall, #google, #ios, #ipad, #itunes, #kindle, #luna, #mac-os-x, #netflix, #richard-blumenthal, #senator, #sherlock, #sidecar, #smartphones, #spotify, #u-s-senate, #watson


Expenses startup Pleo preps $100M Series C funding, launches new bill payments service

Late-stage Fintech startup Pleo, which offers expense management tools and ‘smart’ company Mastercards, says it plans to raise a Series C round of funding this summer. It’s also launching a B2B bill payments service this week.

Co-founder and CEO Jeppe Rindom told me via a call: “We have money until 2022, but we’ve seen incredible momentum in the past couple of quarters, and we are getting a lot of inbound interest so we will be fundraising a Series C round in the Summer and will be raising around $100 million.”

Pleo has raised $78.8 million to date. Its last funding round was $56M in May 2019. Its main investors include Speedinvest, Creandum, Kinnevik, Stripes and Founders.

The startup competes on some levels with Dext, Soldo, Spendesk and Expensify.

Pleo is today launching Bills, a platform to consolidate, track and pay business-to-business bills in line with a supplier’s terms. It will offer free-of-charge domestic transfers.

Bills are automatically processed using Pleo’s OCR technology, cross-referenced for duplication and validated for authenticity before being approved for payment.

In addition, the platform will offer approval controls for admins.

Rindom added: “Since 66% of admins told us that they spent half their time on processing bills as well as authenticating the validity of them, it became our mission to simplify this complicated process and provide an end-to-end overview of it.”

Pleo was founded in Copenhagen in 2015 by Rindom and Niccolo Perra, who were early team members at Tradeshift.

#computing, #copenhagen, #europe, #expensify, #kinnevik, #pleo, #robotics, #robots, #tc, #tradeshift


Algorithm Virtually Unfolds a Historical Letter without Unsealing It

In the centuries before envelopes, “letterlocking” secured a message’s information

— Read more on ScientificAmerican.com

#computing, #tech


Will Quantum Computing Ever Live Up to Its Hype?

One expert warns that the field is overpromising, while another says his firm is on the verge of building “useful” machines

— Read more on ScientificAmerican.com

#computing, #tech


Apple’s new iMac finally gets an actually good webcam

Apple introduced new iMacs at its event on Tuesday, outfitted with its M1 processor and redesigned inside and out from the ground up. The hardware is impressive, but one of the biggest improvements for everyone’s Zoom-heavy life might be the webcam. Apple said it’s the “best camera ever in a Mac,” which honestly wouldn’t take much, but its specs suggest it actually is a big upgrade.

The camera finally achieves 1080p video capture, and Apple has also equipped it with a larger sensor that should provide greatly improved low-light performance. The M1 chip has better image signal processing capabilities and uses computational video to correct and improve the image on the fly, which has already brought image-quality benefits to existing MacBook Air and MacBook Pro hardware with the same old, bad webcam equipment.

That should mean this iMac actually has really good image quality — or at least not image quality you need to be embarrassed about. The on-board machine learning processor in the M1, which Apple calls the Neural Engine, will be working in real-time to optimize lighting and do noise reduction, too.

On top of the camera, Apple touts new beamforming mics in a three-mic array that will optimize audio, focusing on your voice and eliminating background noise. All told, this should finally be a Mac that provides a videoconferencing experience that doesn’t feel like it’s stuck in the early 2000s.

#apple, #apple-inc, #apple-spring-hardware-event-2021, #computers, #computing, #imac, #macbook, #macbook-air, #machine-learning, #steve-jobs, #tc, #teleconferencing, #webcam


Announcing our TC Sessions: SaaS virtual event happening October 27

Software-as-a-Service (SaaS) is now the default business model for most B2B and B2C software startups. And while it’s been around for a while now, its momentum keeps accelerating and the ecosystem continues to expand as technologists and marketers are getting more sophisticated about how to build and sell SaaS products. For all of them, we’re pleased to announce TechCrunch Sessions: SaaS 2021, a one-day virtual event that will examine the state of SaaS to help startup founders, developers and investors understand the state of play and what’s next.

The single-day event will take place 100% virtually on October 27 and will feature actionable advice, Q&A with some of SaaS’s biggest names, and plenty of networking opportunities. $75 Early Bird Passes are now on sale. Book your passes today to save $100 before prices go up.

We’re not quite ready to disclose our agenda yet, but you can expect a mix of superstars from across the industry, ranging from some of the largest tech companies to up-and-coming startups that are pushing the limits of SaaS.

The plan is to look at a broad spectrum of what’s happening with B2B startups and give you actionable insights into how to build and/or improve your own product. If you’re just getting started, we want you to come away with new ideas for how to start your company, and if you’re already on your way, our sessions on scaling both your technology and your marketing organization will help you get to that $100 million annual run rate faster.

In addition to other founders, you’ll also hear from enterprise leaders who decide what to buy — and the mistakes they see startups make when they try to sell to them.

But SaaS isn’t only about managing growth — though ideally, that’s a problem founders will face sooner or later. Some of the other specific topics we will look at are how to keep your services safe in an ever-growing threat environment, how to use open source to your advantage and how to smartly raise funding for your company.

We will also highlight how B2B and B2C companies can handle the glut of data they now produce and use it to build machine learning models in the process. We’ll talk about how SaaS startups can both do so themselves and help others in the process. There’s nary a startup that doesn’t want to use some form of AI these days, after all.

And because this is 2021, chances are we’ll also talk about building remote companies and the lessons SaaS startups can learn from the last year of working through the pandemic.

Don’t miss out. Book your $75 Early Bird pass today and save $100.

#business-models, #computing, #enterprise, #entrepreneurship, #machine-learning, #private-equity, #saas, #software, #software-as-a-service, #startup-company, #tc


Pulumi launches version 3.0 of its infrastructure-as-code platform

Pulumi was one of the first of what is now a growing number of infrastructure-as-code startups and today, at its developer conference, the company is launching version 3.0 of its cloud engineering platform. With 70 new features and about 1,000 improvements since version 2.0, this is Pulumi’s biggest release yet.

The new release includes features that range from support for Google Cloud as an infrastructure provider (now in preview) to a new Automation API that turns Pulumi into a library that can be called from other applications. That allows developers to write tools that, for example, provision and configure dedicated infrastructure for each customer of a SaaS application.
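To make that concrete, here is a minimal sketch of the idea using Pulumi’s Python Automation API; the project name, stack-naming scheme, region and the per-customer S3 bucket are illustrative assumptions, not details from the release.

    # Minimal sketch: driving Pulumi as a library via its Automation API (Python).
    # Project/stack names, region and the bucket resource are illustrative assumptions.
    import pulumi
    import pulumi_aws as aws
    from pulumi import automation as auto


    def pulumi_program():
        # Resources a SaaS vendor might provision per customer, e.g. a dedicated bucket.
        bucket = aws.s3.Bucket("customer-data")
        pulumi.export("bucket_name", bucket.id)


    def provision_for_customer(customer_id: str) -> str:
        # Each customer gets its own stack, created or selected on the fly.
        stack = auto.create_or_select_stack(
            stack_name=f"customer-{customer_id}",
            project_name="saas-infra",
            program=pulumi_program,
        )
        stack.set_config("aws:region", auto.ConfigValue(value="us-west-2"))
        result = stack.up(on_output=print)  # provision or update the infrastructure
        return result.outputs["bucket_name"].value


    if __name__ == "__main__":
        print(provision_for_customer("acme"))

The point is less the specific resources than the control flow: the provisioning program becomes a function an application can call whenever a new tenant signs up.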


The company is also launching Pulumi Packages and Components for creating opinionated infrastructure building blocks that developers can then call up from their preferred languages.

Also new is support for Pulumi’s CI/CD Assistant across all the company’s paid plans. This feature makes it easier to deploy cloud infrastructure and applications through more than a dozen popular CI/CD platforms, including the likes of AWS Code Service, Azure DevOps, CircleCI, GitLab CI, Google Cloud Build, Jenkins, Travis CI and Spinnaker. Until now, you needed to be on a Team Pro or Enterprise plan to use this, but it’s now available to all paying users.

In addition, the company is expanding some of its enterprise features with, for example, SAML SSO, SCIM synchronization and new role types.

“When we started out on Pulumi, we knew we wanted to enable developers and infrastructure teams to collaborate more closely to build more innovative software,” said Joe Duffy, Pulumi co-founder and CEO. “What we didn’t know yet is that we’d end up calling this ‘Cloud Engineering,’ that our customers would call it that too, and that they would go on this journey with us. We are now centering our entire platform around this core idea which is now accelerating as the modern cloud continues to disrupt entire business models. Pulumi 3.0 is an exciting milestone in realizing this vision of the future — democratizing access to the cloud and helping teams build better software together — with much more to come.”

#api, #aws, #cloud-computing, #cloud-infrastructure, #co-founder, #computing, #continuous-integration, #devops, #gitlab, #identity-management, #jenkins, #joe-duffy, #pulumi, #software-engineering, #tc, #technology, #version-control


Facebook is expanding Spotify partnership with new ‘Boombox’ project

Facebook is deepening its relationship with music company Spotify and will allow users to listen to music hosted on Spotify while browsing through its apps as part of a new initiative called “Project Boombox,” Facebook CEO Mark Zuckerberg said Monday.

Facebook is building an in-line audio player that will allow users to listen to songs or playlists being shared on the platforms without being externally linked to Spotify’s app or website. Zuckerberg highlighted the feature as another product designed to improve the experience of creators on its platforms, specifically the ability of musicians to share their work, “basically making audio a first-class type of media,” he said.

The news was revealed in a wide-ranging interview with reporter Casey Newton on the company’s future pursuits in the audio world as Facebook aims to keep pace with upstart efforts like Clubhouse and increased activity in the podcasting world. Zuckerberg notably didn’t give a timeline on the feature rollout of “Project Boombox” and it was not further detailed in a company blog post summarizing the audio announcements.

“We think that audio is going to be a first-class medium and that there are all these different products to be built across this whole spectrum,” said Zuckerberg. “Of course, it includes some areas that have been, you know, popular recently, like podcasting and kind of live audio rooms like this, but I also think that there’s some interesting things that are underexplored in the area overall.”

Spotify already has a fairly productive relationship with the Facebook and Instagram platforms. In recent years the music and podcasts platform has been integrated more deeply into Instagram Stories, where users can share content from the service, a feature that’s also been available in Facebook Stories.

#ceo, #computing, #facebook, #instagram, #mark-zuckerberg, #operating-systems, #reporter, #social-media, #social-software, #software, #spotify, #tc, #the-social-network


Xbox Cloud Gaming beta starts rolling out on iOS and PC this week

The era of cloud gaming hasn’t arrived with the intensity that may have seemed imminent a couple years ago when major tech platforms announced their plays. In 2021, the market is still pretty much non-existent despite established presences from nearly all of tech’s biggest players.

Microsoft has been slow to roll out its Xbox Cloud Gaming beta to users widely across platforms, but that’s likely because it knows that, unlike other upstart platforms, there’s not a huge advantage to rushing out the gate first. This week, the company will begin rolling out the service on iOS and PC to Game Pass Ultimate users, sending invites to a limited number of users and scaling up over time.

“The limited beta is our time to test and learn; we’ll send out more invites on a continuous basis to players in all 22 supported countries, evaluate feedback, continue to improve the experience, and add support for more devices,” wrote Xbox’s Catherine Gluckstein in a blog post. “Our plan is to iterate quickly and open up to all Xbox Game Pass Ultimate members in the coming months so more people have the opportunity to play Xbox in all-new ways.”

The service has been available in beta for Android users since last year but it’s been a slow expansion to other platforms outside that world.

A big part of that slowdown has been the result of Apple playing hardball with cloud gaming platform providers, whose business models represent a major threat to App Store gaming revenues. Apple announced a carve-out provision for cloud-gaming platforms that would maintain dependency on the App Store and in-app purchase frameworks but none of the providers seemed very happy with Apple’s solution. As a result, Xbox Cloud Gaming will operate entirely through the web on iOS inside mobile Safari.

#android, #app-store, #apple, #cloud-computing, #cloud-gaming, #computing, #gaming, #microsoft, #xbox, #xbox-cloud-gaming, #xbox-game-pass-ultimate


Atlassian acquires ThinkTilt

Atlassian today announced that it has acquired Brisbane, Australia-based ThinkTilt, the company behind the popular Jira-centric no-code/low-code form builder ProForma. The two companies did not disclose the price of the acquisition.

The acquisition is meant to help strengthen Jira Service Management, Atlassian’s version of Jira that focuses on IT service management (ITSM). Launched in November 2020, Jira Service Management is meant to remove the barriers between development and IT operations and provide them with a unified platform, but it also enables other teams (think HR, legal or finance) to set up their own service operations.

Edwin Wong, Atlassian’s head of product, IT, tells me that the company already has over 30,000 customers who use Jira Service Management (though to be fair, Jira Service Management is in part a rebrand of Jira Service Desk with additional ITSM functionality, so a lot of these users were previous Jira Service Desk customers).

“One thing that I keep hearing from our customers when we speak to them, is that what makes us different is that Jira Service Management really helps them deliver value quickly, without the cost and complexity of some of the other ITSM solutions that they’ve used in the past. It’s just easier to set up, get going and maintain,” Wong said.


And while at launch, the company’s focus was very much on bringing developers and IT together, Wong stressed that today’s announcement is very much about how IT can help other business teams develop services as well — and cope with the reality of remote and hybrid work.

“Employees now expect digital experiences from the employers and their colleagues as much as they expect them in every aspect of their consumer lives, as these two things blend together,” Wong noted. “Fact is, you can’t really walk up to the HR team anymore when you’re working remote and say, ‘hey, I’ve got someone coming in.’ You can’t go tap on their shoulder and say, ‘hey, upgrade that campaign for me.’ That’s not really going to work anymore.” But ThinkTilt, Wong argues, helps businesses “create amazing customer and employee experiences, and allows anyone to do that really quickly and easily.”

Unsurprisingly, ProForma comes with all the tools you would need to create forms (and there are a lot of them) and it is already deeply integrated with Jira and Jira Service Management. It also features over 300 templates for often used business flows like candidate approval tracking in an HR system, for example. “What we’re really providing for our customers is not just the features and saying, ‘hey, figure it out yourselves,’ but really that practice and the [ThinkTilt] team really brings with them an amazing amount of knowledge with that,” Wong said about ProForma’s set of templates.

One neat tool more companies should offer: ProForma features a fully functional demo mode that lets you try out the product before even signing up for its free trial.

Jira Service Management, of course, can already build all these workflows, too. That is, after all, what the product is all about. But with ProForma, an HR team could capture all the information they need to capture for a given workflow, with Jira Service Management becoming the backend for those operations. Or they can easily create forms based on existing workflows, too, and enhance the user experience that way.

Over the course of the last five years, Atlassian regularly acquired a company or two per year. Currently, though, it feels like this pace is picking up a bit. Indeed, the acquisition of ThinkTilt marks the company’s fourth acquisition in the last twelve months. In February, it acquired visualization and analytics company Chartio, while in 2020, it acquired helpdesk tool Halp and the asset management company Mindville. If anything, I would expect this pace to increase in the next year as Atlassian aims to capitalize on current trends.

#atlassian, #australia, #brisbane, #chartio, #computing, #finance, #halp, #jira, #management, #mindville, #software, #tc


Data scientists: Bring the narrative to the forefront

By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.

However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.

The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.

Data alone doesn’t spur innovation — rather, it’s data-driven storytelling that helps uncover hidden trends, powers personalization, and streamlines processes.

Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”

The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.

Make the abstract more tangible

Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.
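As a toy illustration, and using only the rough estimates cited at the top of this piece, a few lines of Python can restate an abstract daily data volume in terms of the DVD reference point:

    # Toy example: anchoring an abstract number to a concrete reference point.
    # Both figures are the approximations cited in this article, not precise values.
    EXABYTES_PER_DAY = 463                    # projected daily data creation by 2025
    DVD_VIDEO_YEARS_PER_EXABYTE = 50_000      # one exabyte ~ 50,000 years of DVD-quality video

    dvd_years_per_day = EXABYTES_PER_DAY * DVD_VIDEO_YEARS_PER_EXABYTE
    print(f"{EXABYTES_PER_DAY} EB/day is roughly {dvd_years_per_day:,} years "
          f"of DVD-quality video, created every single day.")
    # -> 463 EB/day is roughly 23,150,000 years of DVD-quality video, created every single day.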

For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.

#column, #computing, #data, #data-management, #data-visualization, #developer, #ec-column, #ec-consumer-applications, #ec-enterprise-applications, #enterprise, #machine-learning, #peter-wang, #startups, #storytelling


Facebook brings software subscriptions to the Oculus Quest

Subscription pricing is landing on Facebook’s Oculus Store, giving VR developers another way to monetize content on Facebook’s Oculus Quest headset.

Developers will be allowed to add premium subscriptions to paid or free apps, with Facebook presumably taking its standard percentage fee at the same time. Oculus and the developers on its platform have been riding the success of the company’s recent Quest 2 headset; Facebook hasn’t detailed sales numbers, but has noted that the months-old $299 headset has already outsold every other Oculus headset to date.

Subscription pricing is an unsurprising development but signals that some developers believe they have a loyal enough group of subscribers to bring in sizable recurring revenue. Facebook shipped the first Oculus Rift just over five years ago, and it’s been a zig-zagging path to finding early consumer success during that time. A big challenge has been building a dynamic developer ecosystem that offers something engaging to users while ensuring that VR devs can operate sustainably.

At launch, there are already a few developers debuting subscriptions for a number of different app types, spanning exercise, meditation, social, productivity and DJing. In addition to subscriptions, the new monetization path also allows developers to let users try out paid apps on a free trial basis.

The central question is how many Quest users there are that utilize their devices enough to justify a number of monthly subscriptions, but for developers looking to monetize their hardcore users, this is another utility that they likely felt was missing from the Oculus Store.

#computing, #display-technology, #facebook, #facebook-horizon, #mixed-reality, #oculus, #oculus-rift-s, #technology, #virtual-reality, #wearable-devices


Grocery startup Mercato spilled years of data, but didn’t tell its customers

A security lapse at online grocery delivery startup Mercato exposed tens of thousands of customer orders, TechCrunch has learned.

A person with knowledge of the incident told TechCrunch that it happened in January, after one of the company’s cloud storage buckets, hosted on Amazon’s cloud, was left open and unprotected.

The company fixed the data spill, but has not yet alerted its customers.

Mercato was founded in 2015 and helps over a thousand smaller grocers and specialty food stores get online for pickup or delivery, without having to sign up for delivery services like Instacart or Amazon Fresh. Mercato operates in Boston, Chicago, Los Angeles, and New York, where the company is headquartered.

TechCrunch obtained a copy of the exposed data and verified a portion of the records by matching names and addresses against known existing accounts and public records. The data set contained more than 70,000 orders dating between September 2015 and November 2019, and included customer names and email addresses, home addresses, and order details. Each record also included the IP address of the device used to place the order.

The data set also included the personal data and order details of company executives.

It’s not clear how the security lapse happened, since storage buckets on Amazon’s cloud are private by default, or when the company learned of the exposure.
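That default is worth underlining: a bucket only becomes publicly readable if someone loosens its access settings. As a rough sketch of the kind of check an operator could run with boto3 (the bucket name is a placeholder, and a full audit would also cover bucket policies and object ACLs):

    # Rough sketch: flag an S3 bucket whose public-access protections are weakened (boto3).
    # The bucket name is a placeholder; this is not a complete exposure audit.
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")
    BUCKET = "example-order-backups"  # hypothetical bucket name

    try:
        cfg = s3.get_public_access_block(Bucket=BUCKET)["PublicAccessBlockConfiguration"]
        if all(cfg.values()):
            print(f"{BUCKET}: public access fully blocked")
        else:
            print(f"{BUCKET}: some public-access protections are disabled: {cfg}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"{BUCKET}: no public access block configured at all")
        else:
            raise

    # Also flag ACL grants to the AllUsers group, which make objects world-readable.
    for grant in s3.get_bucket_acl(Bucket=BUCKET)["Grants"]:
        if grant["Grantee"].get("URI", "").endswith("/global/AllUsers"):
            print(f"{BUCKET}: ACL grants {grant['Permission']} to everyone")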

Companies are required to disclose data breaches or security lapses to state attorneys general, but no notices have been published in states where they are required by law, such as California. The data set contained records on more than 1,800 California residents, more than three times the number needed to trigger mandatory disclosure under the state’s data breach notification laws.

It’s also not known if Mercato disclosed the incident to investors ahead of its $26 million Series A raise earlier this month. Velvet Sea Ventures, which led the round, did not respond to emails requesting comment.

In a statement, Mercato chief executive Bobby Brannigan confirmed the incident but declined to answer our questions, citing an ongoing investigation.

“We are conducting a complete audit using a third party and will be contacting the individuals who have been affected. We are confident that no credit card data was accessed because we do not store those details on our servers. We will continually inform all authoritative bodies and stakeholders, including investors, regarding the findings of our audit and any steps needed to remedy this situation,” said Brannigan.


Know something, say something. Send tips securely over Signal and WhatsApp to +1 646-755-8849. You can also send files or documents using our SecureDrop. Learn more

#amazon, #boston, #california, #chicago, #cloud-computing, #cloud-infrastructure, #cloud-storage, #computer-security, #computing, #data-breach, #data-security, #ecommerce, #food, #instacart, #los-angeles, #mercato, #new-york, #security, #technology, #united-states, #velvet-sea-ventures


Google’s FeedBurner moves to a new infrastructure but loses its email subscription service

Google today announced that it is moving FeedBurner to a new infrastructure but also deprecating its email subscription service.

If you’re an internet user of a certain age, chances are you used Google’s FeedBurner to manage the RSS feeds of your personal blogs and early podcasts at some point. During the Web 2.0 era, it was the de facto standard for feed management and analytics, after all. Founded in 2004, with Dick Costolo as one of its co-founders (before he became Twitter’s CEO in 2010), it was acquired by Google in 2007.

Ever since, FeedBurner lingered in an odd kind of limbo. While Google had no qualms shutting down popular services like Google Reader in favor of its ill-fated social experiments like Google+, FeedBurner just kept burning feeds day in and day out, even as Google slowly deprecated some parts of the service, most notably its advertising integrations.

I don’t know that anybody spent much time thinking about the service, and RSS has slowly (and sadly) fallen into obscurity, yet FeedBurner was probably easy enough to maintain that Google kept it going. And despite everything, shutting it down would probably break enough tools for publishers to create quite an uproar. The TechCrunch RSS feed, to which you are surely subscribed in your desktop RSS reader, is http://feeds.feedburner.com/TechCrunch/, after all.
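Most of those integrations are trivial consumers of the feed. A minimal sketch with the feedparser library (my choice of tool, not anything Google prescribes), pointed at the TechCrunch FeedBurner URL above, shows how little they typically ask of the service:

    # Minimal sketch: consuming a FeedBurner-hosted RSS feed with feedparser.
    import feedparser

    FEED_URL = "http://feeds.feedburner.com/TechCrunch/"

    feed = feedparser.parse(FEED_URL)
    print(feed.feed.get("title", "untitled feed"))

    for entry in feed.entries[:5]:
        # Each entry carries the usual RSS fields: title, link and publication date.
        print(f"- {entry.get('title')} ({entry.get('published', 'no date')})")
        print(f"  {entry.get('link')}")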

So here we are, 14 years later, and Google today announced that it is “making several upcoming changes to support the product’s next chapter.” It’s moving the service to a new, more stable infrastructure.

But in July, it is also shutting down some non-core features that don’t directly involve feed management, most importantly the FeedBurner email subscription service that allowed you to get emailed alerts when a feed updates. Feed owners will be able to download their email subscriber lists (and will be able to do so after July, too). With that, Blogger’s FollowByEmail widget will also be deprecated (and hey, did you start this day thinking you’d read about FeedBurner AND Blogger on TechCrunch without having to travel back to 2007?).

Google stresses that other core FeedBurner features will remain in place, but given the popularity of email newsletters, that’s a bit of an odd move.

#ceo, #computing, #dick-costolo, #feedburner, #google, #google-reader, #news-aggregators, #nostalgia, #rss, #tc, #technology, #twitter, #web-2-0, #world-wide-web


Triller owner gets a new CEO with acquisition of Amplify.AI; also acquires live streaming service FITE TV

Would-be TikTok competitor Triller, operated by parent company TrillerNet, is gaining a new CEO, the company announced today. The short-form video app said it’s acquiring an A.I.-based customer engagement platform, Amplify.AI, whose co-founder Mahi de Silva will now become TrillerNet’s CEO. Existing CEO Mike Lu will transition to President of TrillerNet and will focus on investor relations. The company separately announced the acquisition of FITE TV, a live event and pay-per-view combat sports streaming platform.

New CEO Mahi de Silva had been closely involved with Triller before today. The company’s press release says he has been serving as non-executive chairman since 2016, but his LinkedIn puts the year at 2019 (which would be following Triller’s 2019 funding by Proxima Media, when the press release at the time noted he was assuming the role of “chairman”). Both dates are wrong, the company discovered when we reached out for clarity; the correct year is 2018.

Ahead of the acquisition, de Silva had been serving as CEO and co-founder to Amplify.AI since 2017, and before that was CEO of Opera Mediaworks, the marketing and advertising arm of Opera Software, and co-founder and CEO of Botworx.

Amplify.AI, which works with brands in CPG, financial services, automotive, telecom, politics, and digital media, among others, will continue to operate as a subsidiary of TrillerNet following the deal. Other team members include former RSA and Verisign executive Ram Moskowitz who helped design and develop the digital certificates for SSL and code signing; and Amplify.ai co-founder and CTO Manoj Malhotra, a pioneer in B2C SMS messaging, the company notes.

TrillerNet also today announced it’s acquiring another strategic property to help shift its business further into the direction of live events: FITE TV. This deal gives Triller more of a foothold in the live events and pay-per-view streaming market, it says. As a result, FITE, which touts 10 million users, will become the exclusive digital distributor of all Triller Fight Club boxing events going forward.

“Acquiring FITE is part of the larger Triller strategy to bring together content, creators and commerce for the first time and the only place where they truly interact,” said Triller’s Ryan Kavanaugh, the former head of movie studio Relativity Media (and controversial figure) whose Proxima Media became Triller’s majority investor in 2019. “We have invested hundreds of millions of dollars and believe we have created a better more efficient e-commerce content platform,” he added.

The acquisition follows several others TrillerNet has made to expand into live events, now that becoming a TikTok replacement in the U.S. is no longer a viable option, as the Trump ban was put on hold by the Biden administration. Triller also acquired live music streaming platform Verzuz, founded by Swizz Beatz and Timbaland, in March. And it operates Triller Fight Club in partnership with Snoop Dogg, as well as a streaming platform, Triller TV.

While specific deal terms were not revealed, Triller told TechCrunch it’s spent $250 million in the aggregate on its acquisitions, including Halogen, Mashtraxx, Verzuz, FITE and Amplify today.

#amplify, #apps, #biden-administration, #ceo, #chairman, #co-founder, #computing, #digital-media, #executive, #financial-services, #fundings-exits, #internet-culture, #mike-lu, #opera-mediaworks, #opera-software, #president, #sms, #snoop, #software, #ssl, #tiktok, #triller, #trump, #united-states, #verisign, #video-hosting


Gay dating site Manhunt hacked, thousands of accounts stolen

Manhunt, a gay dating app that claims to have 6 million male members, has confirmed it was hit by a data breach in February after a hacker gained access to the company’s accounts database.

In a notice filed with the Washington attorney general’s office, Manhunt said the hacker “gained access to a database that stored account credentials for Manhunt users,” and “downloaded the usernames, email addresses and passwords for a subset of our users in early February 2021.”

The notice did not say how the passwords were scrambled, if at all, to prevent them from being read by humans. Passwords scrambled using weak algorithms can sometimes be decoded into plain text, allowing malicious hackers to break into their accounts.
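For context on why that matters, here is a hedged, standard-library-only sketch of the difference between a fast, unsalted hash (trivially matched against precomputed tables once leaked) and a salted, deliberately slow one; the parameters are illustrative and say nothing about what Manhunt actually used:

    # Sketch: why the 'scrambling' method matters. Parameters are illustrative only.
    import hashlib
    import os
    import secrets

    password = "hunter2"  # an obviously weak example password

    # Fast, unsalted MD5: every user with this password gets the same digest,
    # so leaked hashes can be reversed with precomputed lookup tables.
    weak = hashlib.md5(password.encode()).hexdigest()

    # Salted, memory-hard scrypt: unique salt per user, deliberately costly to brute-force.
    salt = os.urandom(16)
    strong = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)

    def verify(candidate: str, salt: bytes, stored: bytes) -> bool:
        digest = hashlib.scrypt(candidate.encode(), salt=salt, n=2**14, r=8, p=1)
        return secrets.compare_digest(digest, stored)

    print(weak)                             # identical across users and sites
    print(verify("hunter2", salt, strong))  # True
    print(verify("wrong", salt, strong))    # False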

Following the breach, Manhunt force-reset account passwords and began alerting users in mid-March. Manhunt did not say what percentage of its users had their data stolen or how the data breach happened, but said that more than 7,700 Washington state residents were affected.

The company’s attorneys did not reply to an email requesting comment.

But questions remain about how Manhunt handled the breach. In March, the company tweeted that, “At this time, all Manhunt users are required to update their password to ensure it meets the updated password requirements.” The tweet did not say that user accounts had been stolen.

Manhunt was launched in 2001 by Online-Buddies Inc., which also offered gay dating app Jack’d before it was sold to Perry Street in 2019 for an undisclosed sum. Just months before the sale, Jack’d had a security lapse that exposed users’ private photos and location data.

Dating sites store some of the most sensitive information on their users, and are frequently a target of malicious hackers. In 2015, Ashley Madison, a dating site that encouraged users to have an affair, was hacked, exposing names, and postal and email addresses. Several people died by suicide after the stolen data was posted online. A year later, dating site AdultFriendFinder was hacked, exposing more than 400 million user accounts.

In 2018, same-sex dating app Grindr made headlines for sharing users’ HIV status with data analytics firms.

In other cases, poor security — in some cases none at all — led to data spills involving some of the most sensitive data. In 2019, Rela, a popular dating app for gay and queer women in China, left a server unsecured with no password, allowing anyone to access sensitive data — including sexual orientation and geolocation — on more than 5 million app users. Months later, Jewish dating app JCrush exposed around 200,000 user records.




#jack, #apps, #articles, #ashley-madison, #china, #computer-security, #computing, #cryptography, #data-breaches, #password, #securedrop, #security, #security-breaches


PlexTrac raises $10M Series A round for its collaboration-centric security platform

PlexTrac, a Boise, ID-based security service that aims to provide a unified workflow automation platform for red and blue teams, today announced that it has raised a $10 million Series A funding round led by Noro-Moseley Partners and Madrona Venture Group. StageDot0 ventures also participated in this round, which the company plans to use to build out its team and grow its platform.

With this new round, the company, which was founded in 2018, has now raised a total of $11 million, with StageDot0 leading its 2019 seed round.

PlexTrac CEO and President Dan DeCloss


“I have been on both sides of the fence, the specialist who comes in and does the assessment, produces that 300-page report and then comes back a year later to find that some of the critical issues had not been addressed at all.  And not because the organization didn’t want to but because it was lost in that report,” PlexTrac CEO and President Dan DeCloss said. “These are some of the most critical findings for an entity from a risk perspective. By making it collaborative, both red and blue teams are united on the same goal we all share, to protect the network and assets.”

With an extensive career in security that included time as a penetration tester for Veracode and the Mayo Clinic, as well as senior information security advisor for Anthem, among other roles, DeCloss has quite a bit of first-hand experience that led him to found PlexTrac. Specifically, he believes that it’s important to break down the wall between offense-focused red teams and defense-centric blue teams.


“Historically there has been more of the cloak and dagger relationship but those walls are breaking down – and rightfully so, there isn’t that much of that mentality today – people recognize they are on the same mission whether they are an internal security team or an external team,” he said. “With the PlexTrac platform the red and blue teams have a better view into the other teams’ tactics and techniques – and it makes the whole process into an educational exercise for everyone.”

At its core, PlexTrac makes it easier for security teams to produce their reports, which frees them up to focus on ‘real’ security work. To do so, the service integrates with most of the popular scanners, like Qualys and Veracode, but also with tools like ServiceNow and Jira to help teams coordinate their workflows. All the data flows into real-time reports that help teams monitor their security posture. The service also features a dedicated tool, WriteupsDB, for managing reusable write-ups to help teams deliver consistent reports for a variety of audiences.

“Current tools for planning, executing, and reporting on security testing workflows are either nonexistent (manual reporting, spreadsheets, documents, etc…) or exist as largely incomplete features of legacy platforms,” Madrona’s S. Somasegar and Chris Picardo write in today’s announcement. “The pain point for security teams is real and PlexTrac is able to streamline their workflows, save time, and greatly improve output quality. These teams are on the leading edge of attempting to find and exploit vulnerabilities (red teams) and defend and/or eliminate threats (blue teams).”

 

#cloud-applications, #computer-security, #computing, #enterprise, #information-technology, #madrona-venture-group, #mayo-clinic, #noro-moseley-partners, #qualys, #recent-funding, #red-team, #security, #servicenow, #startups


FBI launches operation to remotely remove Microsoft Exchange server backdoors

A Texas court has authorized an FBI operation to “copy and remove” backdoors from hundreds of Microsoft Exchange email servers in the United States, months after hackers used four previously undiscovered vulnerabilities to attack thousands of networks.

The Justice Department announced the operation on Tuesday, which it described as “successful.” It’s believed this is the first known case of the FBI effectively cleaning up private networks following a cyberattack.

In March, Microsoft discovered a new China state-sponsored hacking group — Hafnium — targeting Exchange servers run from company networks. The four vulnerabilities, when chained together, allowed the hackers to break into a vulnerable Exchange server and steal its contents. Microsoft fixed the vulnerabilities, but the patches did not remove the backdoors from servers that had already been breached. Within days, other hacking groups began hitting vulnerable servers with the same flaws to deploy ransomware.

The number of infected servers dropped as patches were applied. But hundreds of Exchange servers remained vulnerable because the backdoors are difficult to find and eliminate, the Justice Department said in a statement.

“This operation removed one early hacking group’s remaining web shells which could have been used to maintain and escalate persistent, unauthorized access to U.S. networks,” the statement said. “The FBI conducted the removal by issuing a command through the web shell to the server, which was designed to cause the server to delete only the web shell (identified by its unique file path).”

The FBI said it’s attempting to contact owners of servers from which it removed the backdoors by email.

Assistant attorney general John C. Demers said the operation “demonstrates the Department’s commitment to disrupt hacking activity using all of our legal tools, not just prosecutions.”

The Justice Department also said the operation only removed the backdoors, but did not patch the vulnerabilities exploited by the hackers to begin with or remove any malware left behind.

Neither the FBI nor the Justice Department commented by press time.

#backdoor, #china, #computing, #cryptography, #cybercrime, #cyberwarfare, #department-of-justice, #federal-bureau-of-investigation, #hacking, #justice-department, #malware, #microsoft, #ransomware, #security, #security-breaches, #spyware, #technology, #texas, #united-states


Risk startup LogicGate confirms data breach

Risk and compliance startup LogicGate has confirmed a data breach. But unless you’re a customer, you probably didn’t hear about it.

An email sent by LogicGate to customers earlier this month said that on February 23 an unauthorized third party obtained credentials to its Amazon Web Services-hosted cloud storage servers, which store customer backup files for its flagship platform Risk Cloud, a service that helps companies identify and manage their risk and compliance with data protection and security standards. LogicGate says Risk Cloud can also help find security vulnerabilities before they are exploited by malicious hackers.

The credentials “appear to have been used by an unauthorized third party to decrypt particular files stored in AWS S3 buckets in the LogicGate Risk Cloud backup environment,” the email read.

“Only data uploaded to your Risk Cloud environment on or prior to February 23, 2021, would have been included in that backup file. Further, to the extent you have stored attachments in the Risk Cloud, we did not identify decrypt events associated with such attachments,” it added.

LogicGate did not say how the AWS credentials were compromised. An email update sent by LogicGate last Friday said the company anticipates finding the root cause of the incident by this week.

But LogicGate has not made any public statement about the breach. It’s also not clear if the company contacted all of its customers or only those whose data was accessed. LogicGate counts Capco, SoFi, and Blue Cross Blue Shield of Kansas City as customers.

We sent LogicGate a list of questions, including how many customers were affected and whether the company had alerted U.S. state authorities as required by state data breach notification laws. When reached, LogicGate chief executive Matt Kunkel confirmed the breach but declined to comment, citing an ongoing investigation. “We believe it’s best to communicate developments directly to our customers,” he said.

Kunkel would not say, when asked, if the attacker also exfiltrated the decrypted customer data from its servers.

Data breach notification laws vary by state, but companies that fail to report security incidents can face heavy fines. Under Europe’s GDPR rules, companies can face fines of up to 4% of their annual turnover for violations.

In December, LogicGate secured $8.75 million in fresh funding, totaling more than $40 million since it launched in 2015.



#amazon, #amazon-web-services, #blue-cross-blue-shield, #capco, #cloud, #cloud-computing, #cloud-storage, #computer-security, #computing, #data-breach, #data-security, #europe, #health-insurance, #securedrop, #security, #security-breaches, #sofi, #united-states


Meroxa raises $15M Series A for its real-time data platform

Meroxa, a startup that makes it easier for businesses to build the data pipelines to power both their analytics and operational workflows, today announced that it has raised a $15 million Series A funding round led by Drive Capital. Existing investors Root, Amplify and Hustle Fund also participated in this round, which together with the company’s previously undisclosed $4.2 million seed round now brings total funding in the company to $19.2 million.

The promise of Meroxa is that businesses can use a single platform for their various data needs and won’t need a team of experts to build their infrastructure and then manage it. At its core, Meroxa provides a single Software-as-a-Service solution that connects relational databases to data warehouses and then helps businesses operationalize that data.

Image Credits: Meroxa

“The interesting thing is that we are focusing squarely on relational and NoSQL databases into data warehouse,” Meroxa co-founder and CEO DeVaris Brown told me. “Honestly, people come to us as a real-time FiveTran or real-time data warehouse sink. Because, you know, the industry has moved to this [extract, load, transform] format. But the beautiful part about us is, because we do change data capture, we get that granular data as it happens.” And businesses want this very granular data to be reflected inside of their data warehouses, Brown noted, but he also stressed that Meroxa can expose this stream of data as an API endpoint or point it to a Webhook.

The company is able to do this because its core architecture is somewhat different from other data pipeline and integration services that, at first glance, seem to offer a similar solution. Because of this, users can use the service to connect different tools to their data warehouse but also build real-time tools on top of these data streams.

Image Credits: Meroxa

“We aren’t a point-to-point solution,” Meroxa co-founder and CTO Ali Hamidi explained. “When you set up the connection, you aren’t taking data from Postgres and only putting it into Snowflake. What’s really happening is that it’s going into our intermediate stream. Once it’s in that stream, you can then start hanging off connectors and say, ‘Okay, well, I also want to peek into the stream, I want to transfer my data, I want to filter out some things, I want to put it into S3.’”

With this flexibility, Hamidi noted, a lot of the company’s customers start with a pretty standard use case and then quickly expand into other areas as well.
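
To make Hamidi’s description concrete, here is a purely illustrative Python sketch of that fan-out pattern: change-data-capture events land in one intermediate stream, and independent consumers hang off it without touching the source. The function and event names here are hypothetical stand-ins, not Meroxa’s actual API.

```python
import json

def change_stream():
    """Hypothetical stand-in for a CDC feed coming out of Postgres."""
    yield {"op": "insert", "table": "orders", "row": {"id": 1, "total": 42.0}}
    yield {"op": "update", "table": "orders", "row": {"id": 1, "total": 45.0}}
    yield {"op": "delete", "table": "orders", "row": {"id": 1}}

def to_warehouse(event):
    # e.g. append the raw change record to a warehouse staging table
    print("warehouse sink:", json.dumps(event))

def to_s3(event):
    # filter before a sink: skip deletes, archive everything else
    if event["op"] != "delete":
        print("s3 sink:", json.dumps(event))

def to_webhook(event):
    # push the granular change to an API endpoint or webhook as it happens
    print("webhook:", json.dumps(event))

# Every consumer sees every event; adding or removing a sink
# never requires changing the source connection.
for event in change_stream():
    for sink in (to_warehouse, to_s3, to_webhook):
        sink(event)
```

The point of the pattern, per Hamidi, is that the intermediate stream decouples sources from sinks, which is what lets customers start with one standard use case and bolt on others later.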

Brown and Hamidi met during their time at Heroku, where Brown was a director of product management and Hamidi a lead software engineer. But while Heroku made it very easy for developers to publish their web apps, there wasn’t anything comparable in the highly fragmented database space. The team acknowledges that there are a lot of tools that aim to solve these data problems, but few of them focus on the user experience.

Image Credits: Meroxa

“When we talk to customers now, it’s still very much an unsolved problem,” Hamidi said. “It seems kind of insane to me that this is such a common thing and there is no ‘oh, of course you use this tool because it addresses all my problems.’ And so the angle that we’re taking is that we see user experience not as a nice-to-have, it’s really an enabler, it is something that enables a software engineer or someone who isn’t a data engineer with 10 years of experience in wrangling Kafka and Postgres and all these things. […] That’s a transformative kind of change.”

It’s worth noting that Meroxa uses a lot of open-source tools but the company has also committed to open-sourcing everything in its data plane as well. “This has multiple wins for us, but one of the biggest incentives is in terms of the customer, we’re really committed to having our agenda aligned. Because if we don’t do well, we don’t serve the customer. If we do a crappy job, they can just keep all of those components and run it themselves,” Hamidi explained.

Today, Meroxa, which the team founded in early 2020, has over 24 employees (and is 100% remote). “I really think we’re building one of the most talented and most inclusive teams possible,” Brown told me. “Inclusion and diversity are very, very high on our radar. Our team is 50% black and brown. Over 40% are women. Our management team is 90% underrepresented. So not only are we building a great product, we’re building a great company, we’re building a great business.”  

#api, #business-intelligence, #cloud, #computing, #data-management, #data-warehouse, #database, #developer, #drive-capital, #enterprise, #heroku, #hustle-fund, #information-technology, #nosql, #product-management, #recent-funding, #software-engineer, #startups, #web-apps


Apple said to be developing Apple TV/HomePod combo and iPad-like smart speaker display

Apple is reportedly working on a couple of new options for a renewed entry into the smart home, including a mash-up of the Apple TV with a HomePod speaker, and an integrated camera for video chat, according to Bloomberg. It’s also said to be working on a smart speaker that basically combines a HomePod with an iPad, providing something similar to Amazon’s Echo Show or Google’s Nest Hub in functionality.

The Apple TV/HomePod hybrid would still connect to a television for outputting video, and would offer similar access to all the video and gaming services that the current Apple TV does, while the speaker component would provide sound output, music playback, and Siri integration. It would also include a built-in camera for using video conferencing apps on the TV itself, the report says.

That second device would be much more like existing smart assistant display devices on the market today, with an iPad-like screen providing integrated visuals. According to Bloomberg, the project could involve attaching the iPad via a “robotic arm” that would allow it to move to accommodate a user moving around and keep them in frame during video chat sessions.

Bloomberg doesn’t provide any specific timelines for release of any of these potential products, and it sounds like they’re still very much in the development phase, which means Apple could easily abandon these plans depending on its evaluation of their potential. Apple just recently discontinued its original HomePod, the $300 smart speaker it debuted in 2018.

Rumors abound about a refreshed Apple TV arriving sometime this year, which should boast a faster processor and also an updated remote control. It could bring other hardware improvements, like support for a faster 120Hz refresh rate available on more modern TVs.

#apple, #apple-inc, #apple-tv, #assistant, #computing, #hardware, #homepod, #ios, #ipad, #portable-media-players, #siri, #smart-speaker, #speaker, #tablet-computers, #tc, #touchscreens, #video-conferencing


APKPure app contained malicious adware, say researchers

Security researchers say APKPure, a widely popular app for installing older or discontinued Android apps from outside of Google’s app store, contained malicious adware that flooded the victim’s device with unwanted ads.

Kaspersky Lab said that it alerted APKPure on Thursday that its most recent app version, 3.17.18, contained malicious code that siphoned off data from a victim’s device without their knowledge, and pushed ads to the device’s lock screen and in the background to generate fraudulent revenue for the adware operators.

But the researchers said that the malicious code had the capacity to download other malware, potentially putting affected victims at further risk.

The researchers said the APKPure developers likely introduced the malicious code, which came bundled in a software development kit, or SDK, from an unverified source. APKPure removed the malicious code and pushed out a new version, 3.17.19, and the developers no longer list the malicious version on the site.

APKPure was set up in 2014 to allow Android users access to a vast bank of Android apps and games, including old versions, as well as app versions from other regions that are no longer on Android’s official app store Google Play. It later launched an Android app, which also has to be installed outside Google Play, serving as its own app store to allow users to download older apps directly to their Android devices.

APKPure is ranked as one of the most popular sites on the internet.

But security experts have long warned against installing apps from outside the official app stores, since quality and security vary wildly and much Android malware relies on victims installing malicious apps from untrusted sources. Google scans all Android apps that make it into Google Play, but some have slipped through the cracks before.

TechCrunch contacted APKPure for comment but did not hear back.

#android, #apkpure, #app-store, #apps, #bank, #computing, #google, #google-play, #mobile-app, #mobile-linux, #mobile-malware, #online-advertising, #privacy, #security, #smartphones, #software, #technology


NLPCloud.io helps devs add language processing smarts to their apps

While visual ‘no code‘ tools are helping businesses get more out of computing without the need for armies of in-house techies to configure software on behalf of other staff, access to the most powerful tech tools — at the ‘deep tech’ AI coal face — still requires some expert help (and/or costly in-house expertise).

This is where bootstrapped French startup NLPCloud.io is plying a trade in MLOps/AIOps, or ‘compute platform as a service’ (it runs the queries on its own servers), with a focus on natural language processing (NLP), as its name suggests.

Developments in artificial intelligence have, in recent years, led to impressive advances in the field of NLP — a technology that can help businesses scale their capacity to intelligently grapple with all sorts of communications by automating tasks like named entity recognition, sentiment analysis, text classification, summarization, question answering, and part-of-speech tagging, freeing up (human) staff to focus on more complex/nuanced work. (Although it’s worth emphasizing that the bulk of NLP research has focused on the English language — meaning that’s where this tech is most mature; so associated AI advances are not universally distributed.)

Production-ready (pre-trained) NLP models for English are readily available ‘out of the box’. There are also dedicated open source frameworks offering help with training models. But businesses wanting to tap into NLP still need to have the DevOps resources and chops to implement NLP models.

NLPCloud.io is catering to businesses that don’t feel up to the implementation challenge themselves — offering a “production-ready NLP API” with the promise of “no DevOps required”.

Its API is based on Hugging Face and spaCy open-source models. Customers can either choose to use ready-to-use pre-trained models (it selects the “best” open source models; it does not build its own); or they can upload custom models developed internally by their own data scientists — which it says is a point of differentiation vs SaaS services such as Google Natural Language (which uses Google’s ML models) or Amazon Comprehend and Monkey Learn.
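
To give a sense of what those ready-to-use, pre-trained models look like, here is a minimal sketch using the two open-source frameworks the API draws on. This is generic spaCy and Hugging Face Transformers usage, not NLPCloud.io’s own code, and it assumes both packages and spaCy’s small English model are installed.

```python
# pip install spacy transformers torch
# python -m spacy download en_core_web_sm
import spacy
from transformers import pipeline

# Named entity recognition with a pre-trained spaCy English pipeline.
nlp = spacy.load("en_core_web_sm")
doc = nlp("NLPCloud.io is a bootstrapped startup based in Grenoble, France.")
print([(ent.text, ent.label_) for ent in doc.ents])

# Sentiment analysis with a default pre-trained transformer model.
classifier = pipeline("sentiment-analysis")
print(classifier("This support ticket is urgent: nothing works at all!"))
```

Getting this far on a laptop is easy; the startup’s pitch is that serving such models reliably in production is not.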

NLPCloud.io says it wants to democratize NLP by helping developers and data scientists deliver these projects “in no time and at a fair price”. (It has a tiered pricing model based on requests per minute, which starts at $39 per month and ranges up to $1,199 per month, at the enterprise end, for one custom model running on a GPU. It also offers a free tier so users can test models at low request velocity without incurring a charge.)

“The idea came from the fact that, as a software engineer, I saw many AI projects fail because of the deployment to production phase,” says sole founder and CTO Julien Salinas. “Companies often focus on building accurate and fast AI models but today more and more excellent open-source models are available and are doing an excellent job… so the toughest challenge now is being able to efficiently use these models in production. It takes AI skills, DevOps skills, programming skill… which is why it’s a challenge for so many companies, and which is why I decided to launch NLPCloud.io.”

The platform launched in January 2021 and now has around 500 users, including 30 who are paying for the service. The startup, which is based in Grenoble, in the French Alps, is a team of three for now, plus a couple of independent contractors. (Salinas says he plans to hire five people by the end of the year.)

“Most of our users are tech startups but we also start having a couple of bigger companies,” he tells TechCrunch. “The biggest demand I’m seeing is both from software engineers and data scientists. Sometimes it’s from teams who have data science skills but don’t have DevOps skills (or don’t want to spend time on this). Sometimes it’s from tech teams who want to leverage NLP out-of-the-box without hiring a whole data science team.”

“We have very diverse customers, from solo startup founders to bigger companies like BBVA, Mintel, Senuto… in all sorts of sectors (banking, public relations, market research),” he adds.

Use cases of its customers include lead generation from unstructured text (such as web pages), via named entities extraction; and sorting support tickets based on urgency by conducting sentiment analysis.

Content marketers are also using its platform for headline generation (via summarization). While text classification capabilities are being used for economic intelligence and financial data extraction, per Salinas.

He says his own experience as a CTO and software engineer working on NLP projects at a number of tech companies led him to spot an opportunity in the challenge of AI implementation.

“I realized that it was quite easy to build acceptable NLP models thanks to great open-source frameworks like spaCy and Hugging Face Transformers but then I found it quite hard to use these models in production,” he explains. “It takes programming skills in order to develop an API, strong DevOps skills in order to build a robust and fast infrastructure to serve NLP models (AI models in general consume a lot of resources), and also data science skills of course.

“I tried to look for ready-to-use cloud solutions in order to save weeks of work but I couldn’t find anything satisfactory. My intuition was that such a platform would help tech teams save a lot of time, sometimes months of work for the teams who don’t have strong DevOps profiles.”

“NLP has been around for decades but until recently it took whole teams of data scientists to build acceptable NLP models. For a couple of years, we’ve made amazing progress in terms of accuracy and speed of the NLP models. More and more experts who have been working in the NLP field for decades agree that NLP is becoming a ‘commodity’,” he goes on. “Frameworks like spaCy make it extremely simple for developers to leverage NLP models without having advanced data science knowledge. And Hugging Face’s open-source repository for NLP models is also a great step in this direction.

“But having these models run in production is still hard, and maybe even harder than before as these brand new models are very demanding in terms of resources.”
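
As a rough illustration of what “using these models in production” involves at its most basic, here is a deliberately minimal sketch (not NLPCloud.io’s actual stack) that wraps a pre-trained spaCy pipeline behind an HTTP endpoint with Flask; the endpoint and model names are placeholders chosen for the example.

```python
# pip install flask spacy && python -m spacy download en_core_web_sm
from flask import Flask, jsonify, request
import spacy

app = Flask(__name__)
nlp = spacy.load("en_core_web_sm")  # loaded once at startup, not per request

@app.route("/entities", methods=["POST"])
def entities():
    # Expects a JSON body like {"text": "..."} and returns extracted entities.
    text = request.get_json(force=True).get("text", "")
    doc = nlp(text)
    return jsonify([{"text": ent.text, "label": ent.label_} for ent in doc.ents])

if __name__ == "__main__":
    app.run(port=8080)
```

Everything Salinas describes as the hard part (robust infrastructure, scaling, GPU serving, monitoring) is exactly what a single process like this leaves out, which is where the DevOps work begins.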

The models NLPCloud.io offers are picked for performance — where “best” means it has “the best compromise between accuracy and speed”. Salinas also says they are paying mind to context, given NLP can be used for diverse use cases — hence proposing a number of models so as to be able to adapt to a given use.

“Initially we started with models dedicated to entities extraction only but most of our first customers also asked for other use cases too, so we started adding other models,” he notes, adding that they will continue to add more models from the two chosen frameworks — “in order to cover more use cases, and more languages”.

SpaCy and Hugging Face, meanwhile, were chosen as the source for the models offered via its API based on their track record as companies, the NLP libraries they offer and their focus on production-ready frameworks — a combination that allows NLPCloud.io to offer a selection of models that are fast and accurate, working within the bounds of their respective trade-offs, according to Salinas.

“SpaCy is developed by a solid company in Germany called Explosion.ai. This library has become one of the most used NLP libraries among companies who want to leverage NLP in production ‘for real’ (as opposed to academic research only). The reason is that it is very fast, has great accuracy in most scenarios, and is an opinionated framework, which makes it very simple to use by non-data scientists (the tradeoff is that it gives less customization possibilities),” he says.

“Hugging Face is an even more solid company that recently raised $40M for a good reason: They created a disruptive NLP library called ‘transformers’ that improves a lot the accuracy of NLP models (the tradeoff is that it is very resource intensive though). It gives the opportunity to cover more use cases like sentiment analysis, classification, summarization… In addition to that, they created an open-source repository where it is easy to select the best model you need for your use case.”

While AI is advancing at a clip within certain tracks — such as NLP for English — there are still caveats and potential pitfalls attached to automating language processing and analysis, with the risk of getting stuff wrong or worse. AI models trained on human-generated data have, for example, been shown to reflect the embedded biases and prejudices of the people who produced the underlying data.

Salinas agrees NLP can sometimes face “concerning bias issues”, such as racism and misogyny. But he expresses confidence in the models they’ve selected.

“Most of the time it seems [bias in NLP] is due to the underlying data used to train the models. It shows we should be more careful about the origin of this data,” he says. “In my opinion the best solution in order to mitigate this is that the community of NLP users should actively report something inappropriate when using a specific model so that this model can be paused and fixed.”

“Even if we doubt that such a bias exists in the models we’re proposing, we do encourage our users to report such problems to us so we can take measures,” he adds.

 

#amazon, #api, #artificial-intelligence, #artificial-neural-networks, #bbva, #computing, #developer, #devops, #europe, #germany, #google, #hugging-face, #ml, #natural-language-processing, #nlpcloud-io, #public-relations, #software-development, #speech-recognition, #startups, #transformer


Facebook ran ads for a fake ‘Clubhouse for PC’ app planted with malware

Cybercriminals have taken out a number of Facebook ads masquerading as a Clubhouse app for PC users in order to target unsuspecting victims with malware, TechCrunch has learned.

TechCrunch was alerted Wednesday to Facebook ads tied to several Facebook pages impersonating Clubhouse, the drop-in audio chat app only available on iPhones. Clicking on the ad would open a fake Clubhouse website, including a mocked-up screenshot of what the non-existent PC app looks like, with a download link to the malicious app.

When opened, the malicious app tries to communicate with a command and control server to obtain instructions on what to do next. One sandbox analysis of the malware showed the malicious app tried to infect the isolated machine with ransomware.

But overnight, the fake Clubhouse websites — which were hosted in Russia — went offline. In doing so, the malware also stopped working. Guardicore’s Amit Serper, who tested the malware in a sandbox on Thursday, said the malware received an error from the server and did nothing more.

The fake website was set up to look like Clubhouse’s real website, but featured a malicious PC app. (Image: TechCrunch)

It’s not uncommon for cybercriminals to tailor their malware campaigns to piggyback off the successes of wildly popular apps. Clubhouse has reportedly topped more than 8 million global downloads to date despite an invite-only launch. That high demand prompted a scramble to reverse-engineer the app and build bootleg versions that get around Clubhouse’s gated walls, as well as government censors in places where the app is blocked.

Each of the Facebook pages impersonating Clubhouse had only a handful of likes, but all were still active at the time of publication. When reached, Facebook wouldn’t say how many account owners had clicked on the ads pointing to the fake Clubhouse websites.

At least nine ads were placed this week between Tuesday and Thursday. Several of the ads said Clubhouse “is now available for PC,” while another featured a photo of co-founders Paul Davidson and Rohan Seth. Clubhouse did not return a request for comment.

The ads have been removed from Facebook’s Ad Library, but we have published a copy. It’s also not clear how the ads made it through Facebook’s processes in the first place.

 

#android, #apps, #clubhouse, #computing, #facebook, #malware, #russia, #sandbox, #security


Immersion cooling to offset data centers’ massive power demands gains a big booster in Microsoft

LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.

Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.

The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog. 

While that claim may be true, liquid cooling is a well-known approach to dealing with moving heat around to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.

As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.

It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.

“Air cooling is not enough”

More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.

“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”

For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.

The results, from an energy consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI, and the investigation revealed that two-phase immersion cooling reduced power consumption for any given server by 5% to 15% (every little bit helps).

Meanwhile, companies like Submer claim they reduce energy consumption by 50%, water use by 99%, and take up 85% less space.

For cloud computing companies, the ability to keep these servers up and running even during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.

“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”

At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance however has come at a significant environmental cost.

“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.

Solutions under the sea

If submerging servers in experimental liquids offers one potential solution to the problem — then sinking them in the ocean is another way that companies are trying to cool data centers without expending too much power.

Microsoft has already been operating an undersea data center for the past two years. The company actually trotted out the tech last year as part of a push to aid in the search for a COVID-19 vaccine.

These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.

The liquid cooling project is most similar to Microsoft’s Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed sealed inside submarine-like tubes without any onsite maintenance by people.

In those data centers, a nitrogen atmosphere takes the place of an engineered fluid, and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.

Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).

Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones’ locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.

With its two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean cooling onto the shore. “We brought the sea to the servers rather than put the datacenter under the sea,” Microsoft’s Alissa said in a company statement.

Ioannis Manousakis, a principal software engineer with Azure (left), and Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development (right), walk past a container at a Microsoft datacenter where computer servers in a two-phase immersion cooling tank are processing workloads. Photo by Gene Twedt for Microsoft.

#artificial-intelligence, #bitcoin, #cloud, #computing, #data-center, #energy-consumption, #energy-efficiency, #enterprise, #liquid-cooling, #microsoft, #saas, #tc


Education non-profit Edraak ignored a student data leak for two months

Edraak, an online education non-profit, exposed the private information of thousands of students after uploading student data to an unprotected cloud storage server, apparently by mistake.

The non-profit, founded by Jordan’s Queen Rania and headquartered in the kingdom’s capital, was set up in 2013 to promote education across the Arab region. The organization works with several partners, including the British Council and edX, a consortium set up by Harvard, Stanford, and MIT.

In February, researchers at U.K. cybersecurity firm TurgenSec found one of Edraak’s cloud storage servers containing the data of at least tens of thousands of students, including spreadsheets with students’ names, email addresses, gender, birth year, country of nationality, and some class grades.

TurgenSec, which runs Breaches.UK, a site for disclosing security incidents, alerted Edraak to the security lapse. A week later, their email was acknowledged by the organization but the data continued to spill. Emails seen by TechCrunch show the researchers tried to alert others who worked at the organization via LinkedIn requests, and its partners, including the British Council.

Two months passed and the server remained open. At TurgenSec’s request, TechCrunch contacted Edraak, which closed the server a few hours later.

In an email this week, Edraak chief executive Sherif Halawa told TechCrunch that the storage server was “meant to be publicly accessible, and to host public course content assets, such as course images, videos, and educational files,” but that “student data is never intentionally placed in this bucket.”

“Due to an unfortunate configuration bug, however, some academic data and student information exports were accidentally placed in the bucket,” Halawa confirmed.

“Unfortunately our initial scan did not locate the misplaced data that made it there accidentally. We attributed the elements in the Breaches.UK email to regular student uploads. We have now located these misplaced reports today and addressed the issue,” Halawa said.

The server is now closed off to public access.

It’s not clear why Edraak ignored the researchers’ initial email, which disclosed the location of the unprotected server, or why the organization’s response was not to ask for more details. When reached, British Council spokesperson Catherine Bowden said the organization received an email from TurgenSec but mistook it for a phishing email.

Edraak’s CEO Halawa said that the organization had already begun notifying affected students about the incident, and put out a blog post on Thursday.

Last year, TurgenSec found an unencrypted customer database belonging to U.K. internet provider Virgin Media that was left online by mistake, containing records linking some customers to adult and explicit websites.

Send tips securely over Signal and WhatsApp to +1 646-755-8849. You can also send files or documents using our SecureDrop. Learn more

#articles, #british-council, #ceo, #computing, #cyberspace, #education, #edx, #email, #harvard, #jordan, #linkedin, #mit, #online-education, #phishing, #security, #server, #spamming, #spokesperson, #stanford, #united-kingdom, #virgin-media, #web-server


Twitter said to have held acquisition talks with Clubhouse on potential $4B deal

Twitter held talks with Clubhouse around a potential acquisition of the live drop-in audio networking platform, with a deal value somewhere around $4 billion, according to a report from Bloomberg. TechCrunch has also confirmed the discussions took place from a source familiar with the conversations.

While the talks occurred over the past several months, they’re no longer taking place, though the reason they ended isn’t known, according to the report. It’s also worth noting that just a few days ago, Bloomberg reported that Clubhouse was seeking to raise a new round of funding at a valuation of around $4 billion; the report detailing the potential acquisition talks indicates that the discussions with Twitter collapsed first, leading to a change in strategy to pursue additional capital in exchange for equity investment.

Twitter has its own product very similar to Clubhouse — Spaces, a drop-in audio chatroom feature that it has been rolling out gradually to its user base over the past few months. Clubhouse, meanwhile, just launched the first of its monetization efforts, Clubhouse Payments, which lets users send direct payments to other creators on the platform, provided that person has enabled receipt of said payments.

Interestingly, the monetization effort from Clubhouse actually doesn’t provide them with any money; instead, it’s monetization for recipient users who get 100% of the funds directed their way, minus a small cut for processing that goes directly to Stripe, the payment provider Clubhouse is using to enable the virtual tips.

While we aren’t privy to the specifics of these talks between Twitter and Clubhouse, it does seem like an awfully high price tag for the social network to pay for the audio app, especially given its own progress with Spaces. Clubhouse’s early traction has been undeniable, but there are a lot of questions remaining about its longevity, and it’s also being cloned left and right by other platforms, raising the age-old startup question of whether it’s a feature or a product in its own right.

Whatever went down, the timing of this revelation seems likely to prime the pump for Clubhouse’s conversations with potential investors at its target valuation for the round it’s looking to raise. Regardless, it’s exciting to have this kind of activity, buzz and attention paid to a consumer software play after many years of what one could argue has been a relatively lackluster period for the category.

#apps, #clubhouse, #computing, #freeware, #internet-culture, #ma, #mobile-applications, #monetization, #operating-systems, #social-media, #social-network, #software, #startups, #tc, #twitter, #valuation


Esri brings its flagship ArcGIS platform to Kubernetes

Esri, the geographic information system (GIS), mapping and spatial analytics company, is hosting its (virtual) developer summit today. Unsurprisingly, it is making a couple of major announcements at the event that range from a new design system and improved JavaScript APIs to support for running ArcGIS Enterprise in containers on Kubernetes.

The Kubernetes project was a major undertaking for the company, Esri Product Managers Trevor Seaton and Philip Heede told me. Traditionally, like so many similar products, ArcGIS was architected to be installed on physical boxes, virtual machines or cloud-hosted VMs. And while it doesn’t really matter to end-users where the software runs, containerizing the application means that it is far easier for businesses to scale their systems up or down as needed.

Esri ArcGIS Enterprise on Kubernetes deployment

“We have a lot of customers — especially some of the larger customers — that run very complex questions,” Seaton explained. “And sometimes it’s unpredictable. They might be responding to seasonal events or business events or economic events, and they need to understand not only what’s going on in the world, but also respond to their many users from outside the organization coming in and asking questions of the systems that they put in place using ArcGIS. And that unpredictable demand is one of the key benefits of Kubernetes.”
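
As a generic illustration of the elasticity Seaton is describing, scaling a containerized deployment out to absorb a burst of demand can be a single API call. This sketch uses the standard Kubernetes Python client, not Esri’s tooling, and the deployment and namespace names are hypothetical placeholders.

```python
# pip install kubernetes
from kubernetes import client, config

# Load credentials from the local kubeconfig (e.g. on an admin workstation).
config.load_kube_config()
apps = client.AppsV1Api()

# Hypothetical deployment name; an actual ArcGIS Enterprise install would
# expose its own service names. Scale it out to handle a spike in requests.
apps.patch_namespaced_deployment_scale(
    name="arcgis-enterprise-web",
    namespace="gis",
    body={"spec": {"replicas": 10}},
)
```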

Deploying Esri ArcGIS Enterprise on Kubernetes

The team could have chosen to go the easy route and put a wrapper around its existing tools to containerize them and call it a day, but as Seaton noted, Esri used this opportunity to re-architect its tools and break them down into microservices.

“It’s taken us a while because we took three or four big applications that together make up [ArcGIS] Enterprise,” he said. “And we broke those apart into a much larger set of microservices. That allows us to containerize specific services and add a lot of high availability and resilience to the system without adding a lot of complexity for the administrators — in fact, we’re reducing the complexity as we do that and all of that gets installed in one single deployment script.”

While Kubernetes simplifies a lot of the management experience, a lot of companies that use ArcGIS aren’t yet familiar with it. And as Seaton and Heede noted, the company isn’t forcing anyone onto this platform. It will continue to support Windows and Linux just like before. Heede also stressed that it’s still unusual — especially in this industry — to see a complex, fully integrated system like ArcGIS being delivered in the form of microservices and multiple containers that its customers then run on their own infrastructure.

Image Credits: Esri

In addition to the Kubernetes announcement, Esri also today announced new JavaScript APIs that make it easier for developers to create applications that bring together Esri’s server-side technology and the scalability of doing much of the analysis on the client-side. Back in the day, Esri would support tools like Microsoft’s Silverlight and Adobe/Apache Flex for building rich web-based applications. “Now, we’re really focusing on a single web development technology and the toolset around that,” Esri product manager Julie Powell told me.

A bit later this month, Esri also plans to launch its new design system to make it easier and faster for developers to create clean and consistent user interfaces. This design system will launch April 22, but the company already provided a bit of a teaser today. As Powell noted, the challenge for Esri is that its design system has to help the company’s partners to put their own style and branding on top of the maps and data they get from the ArcGIS ecosystem.

 

#computing, #developer, #enterprise, #esri, #gis, #javascript, #kubernetes, #linux, #microsoft-windows, #software, #tc, #vms
