Confluent CEO Jay Kreps is coming to TC Sessions: SaaS for a fireside chat

As companies process ever-increasing amounts of data, moving it in real time is a huge challenge for organizations. Confluent is a streaming data platform built on top of the open source Apache Kafka project that’s been designed to process massive numbers of events. To discuss this, and more, Confluent CEO and co-founder Jay Kreps will be joining us at TC Sessions: SaaS on Oct 27th for a fireside chat.

Data is a big part of the story we are telling at the SaaS event, as it has such a critical role in every business. Kreps has said in the past that data streams are at the core of every business, from sales to orders to customer experiences. As he wrote in a company blog post announcing the company’s $250 million Series E in April 2020, Confluent is working to process all of this data in real time — and that was a big reason why investors were willing to pour so much money into the company.

“The reason is simple: though new data technologies come and go, event streaming is emerging as a major new category that is on a path to be as important and foundational in the architecture of a modern digital company as databases have been,” Kreps wrote at the time.

The company’s streaming data platform takes a multi-faceted approach to streaming and builds on the open source Kafka project. While anyone can download and use Kafka, as with many open source projects, companies may lack the resources or expertise to deal with the raw open source code. Many a startup has been built on open source to help simplify whatever the project does, and Confluent and Kafka are no different.
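
The core abstraction Kafka is built around is an append-only log of events that many consumers read independently, each tracking its own offset. As a rough illustration only — a toy in-memory sketch, not Confluent’s or Kafka’s actual API, with invented class and method names:

```python
from collections import defaultdict

class ToyEventLog:
    """A minimal in-memory sketch of Kafka's core idea: an append-only
    log that many consumers read at their own pace via per-consumer offsets."""

    def __init__(self):
        self.log = []                    # append-only list of events
        self.offsets = defaultdict(int)  # each consumer's read position

    def produce(self, event):
        self.log.append(event)

    def consume(self, consumer_id, max_events=10):
        start = self.offsets[consumer_id]
        batch = self.log[start:start + max_events]
        self.offsets[consumer_id] += len(batch)  # commit the new offset
        return batch

log = ToyEventLog()
log.produce({"type": "order", "id": 1})
log.produce({"type": "order", "id": 2})

print(log.consume("billing"))    # billing sees both events
print(log.consume("billing"))    # nothing new for billing yet
print(log.consume("shipping"))   # shipping reads from its own offset
```

Real Kafka adds durability, partitioning and replication on top of this model, but the offset bookkeeping shown here is why many consumers can share one stream without interfering with one another.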

Kreps told us in 2017 that companies using Kafka as a core technology include Netflix, Uber, Cisco and Goldman Sachs. But those companies have the resources to manage complex software like this. Mere mortal companies can pay Confluent to access a managed cloud version, or they can install and manage it themselves on the cloud infrastructure provider of their choice.

The project was actually born at LinkedIn in 2011 when its engineers were tasked with building a tool to process the enormous number of events flowing through the platform. The company eventually open sourced the technology it had created and Apache Kafka was born.

Confluent launched in 2014 and raised over $450 million along the way. In its last private round in April 2020, the company scored a $4.5 billion valuation on a $250 million investment. As of today, it has a market cap of over $17 billion.

In addition to our discussion with Kreps, the conference will also include Google’s Javier Soltero, Amplitude’s Olivia Rose, as well as investors Kobie Fuller and Casey Aylward, among others. We hope you’ll join us. It’s going to be a thought-provoking lineup.

Buy your pass now to save up to $100 when you book by October 1. We can’t wait to see you in October!

#apache-kafka, #casey-aylward, #cisco, #cloud, #cloud-computing, #computing, #confluent, #developer, #enterprise, #event-streaming, #free-software, #goldman-sachs, #google, #javier-soltero, #jay-kreps, #kobie-fuller, #linkedin, #microsoft, #netflix, #open-source, #saas, #software, #software-as-a-service, #tc, #tc-sessions-saas-2021, #uber

The stars are aligning for federal IT open source software adoption

In recent years, the private sector has been spurning proprietary software in favor of open source software and development approaches. For good reason: The open source avenue saves money and development time by using freely available components instead of writing new code, enables new applications to be deployed quickly and eliminates vendor lock-in.

The federal government has been slower to embrace open source, however. Efforts to change are complicated by the fact that many agencies employ large legacy IT infrastructure and systems to serve millions of people and are responsible for a plethora of sensitive data. Washington spends tens of billions every year on IT, but with each agency essentially acting as its own enterprise, decision-making is far more decentralized than it would be at, say, a large bank.

While the government has made a number of moves in a more open direction in recent years, the story of open source in federal IT has often seemed more about potential than reality.

But there are several indications that this is changing and that the government is reaching its own open source adoption tipping point. The costs of producing modern applications to serve increasingly digital-savvy citizens keep rising, and budget-constrained agencies must find ways to improve service while saving taxpayer dollars.

Sheer economics dictate an increased role for open source, as do a variety of other benefits. Because its source code is publicly available, open source software encourages continuous review by others outside the initial development team to promote increased software reliability and security, and code can be easily shared for reuse by other agencies.

Here are five signs I see that the U.S. government is increasingly rallying around open source.

More dedicated resources for open source innovation

Two initiatives have gone a long way toward helping agencies advance their open source journeys.

18F, a team within the General Services Administration that acts as a consultancy to help other agencies build digital services, is an ardent open source backer. Its work has included developing a new application for accessing Federal Election Commission data, as well as software that has allowed the GSA to improve its contractor hiring process.

18F — short for GSA headquarters’ address of 1800 F St. — reflects the same grassroots ethos that helped spur open source’s emergence and momentum in the private sector. “The code we create belongs to the public as a part of the public domain,” the group says on its website.

Five years ago this August, the Obama administration introduced a new Federal Source Code Policy that called on every agency to adopt an open source approach, create a source code inventory, and publish at least 20% of new custom-developed code as open source. The administration also launched Code.gov, giving agencies a place to locate open source solutions that other departments are already using.

The results have been mixed, however. Most agencies are now consistent with the federal policy’s goal, though many still have work to do in implementation, according to Code.gov’s tracker. And a report by a Code.gov staffer found that some agencies were embracing open source more than others.

Still, Code.gov says the growth of open source in the federal government has gone farther than initially estimated.

A push from the new administration

The American Rescue Plan, a $1.9 trillion pandemic relief bill that President Biden signed in early March 2021, contained $1 billion for the GSA’s Technology Modernization Fund, which finances new federal technology projects. In January, the White House said upgrading federal IT infrastructure and addressing recent breaches such as the SolarWinds hack was “an urgent national security issue that cannot wait.”

It’s fair to assume open source software will form the foundation of many of these efforts, because White House technology director David Recordon is a long-time open source advocate and once led Facebook’s open source projects.

A changing skills environment

Federal IT employees who spent much of their careers working on legacy systems are starting to retire, and their successors are younger people who came of age in an open source world and are comfortable with it.

About 81% of private sector hiring managers surveyed by the Linux Foundation said hiring open source talent is a priority and that they’re more likely than ever to seek out professionals with certifications. You can be sure the public sector is increasingly mirroring this trend as it recognizes a need for talent to support open source’s growing foothold.

Stronger capabilities from vendors

By partnering with the right commercial open source vendor, agencies can drive down infrastructure costs and more efficiently manage their applications. For example, vendors have made great strides in addressing security requirements laid out by policies such as the Federal Information Security Modernization Act (FISMA), Federal Information Processing Standards (FIPS) and the Federal Risk and Authorization Management Program (FedRAMP), making it easier to deal with compliance.

In addition, some vendors offer powerful infrastructure automation tools and generous support packages, so federal agencies don’t have to go it alone as they accelerate their open source strategies. Linux distributions like Ubuntu provide a consistent developer experience from laptop/workstation to the cloud, and at the edge, for public clouds, containers, and physical and virtual infrastructure.

This makes application development a well-supported activity, with 24/7 access to enterprise support teams through web portals, knowledge bases or by phone.

The pandemic effect

Whether it’s accommodating more employees working from home or meeting higher citizen demand for online services, COVID-19 has forced large swaths of the federal government to up their digital game. Open source allows legacy applications to be moved to the cloud, new applications to be developed more quickly, and IT infrastructures to adapt to rapidly changing demands.

As these signs show, the federal government continues to move rapidly from talk to action in adopting open source.

Who wins? Everyone!

#column, #developer, #federal-election-commission, #free-software, #government, #linux, #linux-foundation, #open-source-software, #open-source-technology, #opinion, #policy, #solarwinds, #ubuntu

Chilean fintech Xepelin secures $230M in debt and equity from Kaszek, high-profile angels

Chilean startup Xepelin, which has created a financial services platform for SMEs in Latin America, has secured $30 million in equity and $200 million in credit facilities.

LatAm venture fund Kaszek Ventures led the equity portion of the financing, which also included participation from partners of DST Global and a slew of other firms and founders/angel investors. LatAm- and U.S.-based asset managers and hedge funds — including Chilean pension funds — provided the credit facilities. In total over its lifetime, Xepelin has raised over $36 million in equity and $250 million in asset-backed facilities.

Also participating in the round were Picus Capital; Kayak Ventures; Cathay Innovation; MSA Capital; Amarena; FJ Labs; Gilgamesh and Kavak founder and CEO Carlos Garcia; Jackie Reses, executive chairman of Square Financial Services; Justo founder and CEO Ricardo Weder; Tiger Global Management Partner John Curtius; GGV’s Hans Tung; and Gerry Giacoman, founder and CEO of Clara, among others.

Nicolás de Camino and Sebastian Kreis founded Xepelin in mid-2019 with the mission of changing the fact that “only 5% of companies in all LatAm countries have access to recurring financial services.”

“We want all SMEs in LatAm to have access to financial services and capital in a fair and efficient way,” the pair said.

Xepelin is built on a SaaS model designed to give SMEs a way to organize their financial information in real time. Embedded in its software is a way for companies to apply for short-term working capital loans “with just three clicks, and receive the capital in a matter of hours,” the company claimed.

It has developed an AI-driven underwriting engine, which the execs said gives it the ability to make real-time loan approval decisions.

“Any company in LatAm can onboard in just a few minutes and immediately access a free software that helps them organize their information in real time, including cash flow, revenue, sales, tax, bureau info — sort of a free CFO SaaS,” de Camino said. “The circle is virtuous: SMEs use Xepelin to improve their financial habits, obtain more efficient financing, pay their obligations, and collaborate effectively with clients and suppliers, generating relevant impacts in their industries.”

The fintech currently has over 4,000 clients in Chile and Mexico, and its Mexican business is growing “four times faster” than Xepelin did when it first launched in Chile. Over the past 22 months, it has loaned more than $400 million to SMBs in the two countries. It currently has a portfolio of active loans for $120 million and an asset-backed facility for more than $250 million.

Overall, the company has been seeing a growth rate of 30% per month, the founders said. It has 110 employees, up from 20 a year ago.

Xepelin has more than 60 partnerships (a number that it said is growing each week) with midmarket corporate companies, allowing their suppliers to onboard to its platform for free and gain access to accounts payable and revenue-based financing. The company also sells its portfolio of non-recourse loans to financial partners, which it says mitigates credit risk exposure and enhances its platform and data play.

“When we talk about creating the largest digital bank for SMEs in LatAm, we are not saying that our goal is to create a bank; perhaps we will never ask for the license to have one, and to be honest, everything we do, we do it differently from the banks, something like a non-bank, a concept used today to exemplify focus,” the founders said.

Both de Camino and Kreis said they share a passion for making financial services more accessible to SMEs all across Latin America and have backgrounds rooted deep in different areas of finance.

“Our goal is to scale a platform that can solve the true pains of all SMEs in LatAm, all in one place that also connects them with their entire ecosystem, and above all, democratized in such a way that everyone can access it,” Kreis said, “regardless of whether you are a company that sells billions of dollars or just a thousand dollars, getting the same service and conditions.”

For now, the company is nearly exclusively focused on the B2B space, but in the future, it believes several of its services “will be very useful for all SMEs and companies in LatAm.” 

“Xepelin has developed technology and data science engines to deliver financing to SMBs in Latin America in a seamless way,” Nicolas Szekasy, co-founder and managing partner at Kaszek Ventures, said in a statement. “The team has deep experience in the sector and has proven a perfect fit of their user-friendly product with the needs of the market.”

Chile was home to another large funding earlier this week. NotCo, a food technology company making plant-based milk and meat replacements, closed on a $235 million Series D round that gives it a $1.5 billion valuation.

#chile, #digital-bank, #dst-global, #finance, #financial-services, #fintech, #free-software, #funding, #fundings-exits, #hans-tung, #justo, #kaszek-ventures, #latin-america, #mexico, #msa-capital, #picus-capital, #recent-funding, #ricardo-weder, #saas, #square-financial-services, #startups, #tiger-global-management, #venture-capital

An internal code repo used by New York State’s IT office was exposed online

A code repository used by the New York state government’s IT department was left exposed on the internet, allowing anyone to access the projects inside, some of which contained secret keys and passwords associated with state government systems.

The exposed GitLab server was discovered on Saturday by Dubai-based SpiderSilk, a cybersecurity company credited with discovering data spills at Samsung, Clearview AI and MoviePass.

Organizations use GitLab to collaboratively develop and store their source code — as well as the secret keys, tokens and passwords needed for the projects to work — on servers that they control. But the exposed server was accessible from the internet and configured so that anyone from outside the organization could create a user account and log in unimpeded, SpiderSilk’s chief security officer Mossab Hussin told TechCrunch.

When TechCrunch visited the GitLab server, the login page showed it was accepting new user accounts. It’s not known exactly how long the GitLab server was accessible in this way, but historic records from Shodan, a search engine for exposed devices and databases, show the GitLab server was first detected on the internet on March 18.

SpiderSilk shared several screenshots showing that the GitLab server contained secret keys and passwords associated with servers and databases belonging to New York State’s Office of Information Technology Services. Fearing the exposed server could be maliciously accessed or tampered with, the startup asked for help in disclosing the security lapse to the state.

TechCrunch alerted the New York governor’s office to the exposure a short time after the server was found. Several emails to the governor’s office with details of the exposed GitLab server were opened but were not responded to. The server went offline on Monday afternoon.

Scot Reif, a spokesperson for New York State’s Office of Information Technology Services, said the server was “a test box set up by a vendor, there is no data whatsoever, and it has already been decommissioned by ITS.” (Reif declared his response “on background” and attributable to a state official, which would require both parties to agree to the terms in advance, but we are printing the reply as we were not given the opportunity to reject the terms.)

When asked, Reif would not say who the vendor was or if the passwords on the server were changed. Several projects on the server were marked “prod,” common shorthand for “production,” a term for servers in active use. Reif also would not say if the incident was reported to the state’s Attorney General’s office. When reached, a spokesperson for the Attorney General did not comment by press time.

TechCrunch understands the vendor is Indotronix-Avani, a New York-based company with offices in India, and owned by venture capital firm Nigama Ventures. Several screenshots show some of the GitLab projects were modified by a project manager at Indotronix-Avani. The vendor touts New York State on its website, along with other government customers, including the U.S. State Department and the U.S. Department of Defense.

Indotronix-Avani spokesperson Mark Edmonds did not respond to requests for comment.

#attorney-general, #clearview-ai, #continuous-integration, #dubai, #echelon, #free-software, #git, #gitlab, #government, #india, #information-technology, #moviepass, #password, #samsung, #search-engine, #security, #software, #spidersilk, #spokesperson, #venture-capital, #version-control

ProtonMail gets a slick new look, as privacy tech eyes the mainstream

End-to-end encrypted email service ProtonMail has refreshed its design, updating with a cleaner look and a more customizable user interface — including the ability to pick from a bunch of themes (dark and contrasting versions are both in the mix).

Last month the Swiss company officially announced passing 50M users globally, as it turned seven years old. Over those years privacy tech has come a long way in terms of usability — which in turn has helped drive adoption.

ProtonMail’s full integration of PGP, for example, makes the gold standard of e2e encryption invisibly accessible to mainstream internet users, providing them with a technical guarantee that the company itself cannot poke around in their messages.

Its new look (see screenshot gallery below) is really just a cherry on the cake of that underlying end-to-end encryption — but as usage of its product continues to step up it’s necessarily paying more attention to design and user interface details…

Proton has also been busy building out a suite of productivity tools which it can cross-promote to webmail users, using the same privacy promise as its sales pitch (it talks about offering an “encrypted ecosystem”).

And while ProtonMail is a freemium product, which can be a red flag for digital privacy, Proton’s business has the credibility of always having had privacy engineering at its core. Its business model is to monetize via paying users — who it says are subsidizing the free tier of its tools.

One notable change to the refreshed ProtonMail web app is an app switcher that lets users quickly switch between (or indeed discover) its other apps: Proton Calendar and Proton Drive (an e2e encrypted cloud storage offering, currently still in beta).

The company also offers a VPN service, although it’s worth emphasizing that while Proton’s pledge is that it doesn’t track users’ web browsing, the service architecture of VPNs is different so there’s no technical ‘zero access’ guarantee here, as there is with Proton’s other products.

A difference of color in the icons Proton displays in the app switcher — where Mail, Calendar and Drive are colored purple like its wider brand livery and only the VPN is tinted green — is perhaps intended to represent that distinction.

Other tweaks to the updated ProtonMail interface include redesigned keyboard shortcuts, which the company says make it easier to check messages, and quick filters to sort mail by read or unread status.

The company’s Import-Export app — to help users transfer messages so they can make the switch from another webmail provider — exited beta back in November.

Zooming out, adoption of privacy tech is growing for a number of reasons. As well as the increased accessibility and usability that’s being driven by developers of privacy tech tools like Proton, rising awareness of the risks around digital data breaches and privacy-hostile ad models is a parallel and powerful driver — to the point where iPhone maker Apple now routinely draws attention to rivals’ privacy-hostile digital activity in its marketing for iOS, seeking to put clear blue water between how it treats users’ data vs the data-mining competition.

Proton, the company behind ProtonMail, is positioned to benefit from the same privacy messaging. So it’s no surprise to see it making use of the iOS App Privacy disclosures introduced by Apple last year to highlight its own competitive distinction.

Here, for example, it’s pointing users’ attention to background data exchanges which underlie Google-owned Gmail and contrasting all those direct lines feeding into Google’s ad targeting business with absolutely no surveillance at all of ProtonMail users’ messages…

Comparison of the privacy disclosures of ProtonMail’s iOS app vs Gmail’s (Image credits: Proton)

Commenting on ProtonMail’s new look in a statement, Andy Yen, founder and CEO, added: “Your email is your life. It’s a record of your purchases, your conversations, your friends and loved ones. If left unprotected it can provide a detailed insight into your private life. We believe users should have a choice on how and with whom their data is shared. With the redesigned ProtonMail, we are offering an even easier way for users to take control of their data.”

#andy-yen, #apple, #apps, #digital-privacy, #e2e, #e2e-encryption, #email-encryption, #encryption, #europe, #free-software, #gmail, #google, #iphone, #privacy, #productivity-tools, #proton, #protonmail, #vpn, #web-app, #webmail

GitLab acquires UnReview as it looks to bring more ML tools to its platform

DevOps platform GitLab today announced that it has acquired UnReview, a machine learning-based tool that helps software teams pick the best reviewers when developers check in their latest code. GitLab, which is looking to bring more of these machine learning capabilities to its platform, will integrate UnReview’s capabilities into its own code review workflow. The two companies did not disclose the price of the acquisition.

“Last year we decided that the future of DevOps includes ML/AI, both within the DevOps lifecycle as well as the growth of adoption of ML/AI with our customers,” David DeSanto, GitLab’s senior director, Product Management – Dev & Sec, told me. He noted that when GitLab recently surveyed its customers, 75% of the teams said they are already using AI/ML. The company started by adding a bot to the platform that can automatically label issues, which then led to the team meeting with UnReview and, finally, acquiring it.

Image Credits: GitLab

“Our primary focus for the second half of this year in bringing on UnReview is to help automate the selection of code reviewers. It’s a very interesting problem to solve, even we at GitLab occasionally end up picking the wrong reviewers based off of what people know,” DeSanto noted.

GitLab launched its original code review components last year. As Wayne Haber, GitLab’s director of Engineering, noted, that was still a very manual process. Even with the new system, teams still retain full control over which reviewers will be assigned to a merge request, but the tool will automatically — and transparently — rank potential reviewers based on who the system believes is best suited to this task.

“I am grateful for the opportunity to share my passion for data science and machine learning with GitLab and its community,” said Alexander Chueshev, UnReview’s founder (and now a senior full stack engineer at GitLab). “I look forward to enhancing the user experience by playing a role in integrating UnReview into the GitLab platform and extending machine learning and artificial intelligence into additional DevOps stages in the future.”

DeSanto noted that GitLab now has quite a bit of experience in acquiring companies and integrating them into its stack. “We’re always looking to acquire strong teams and strong concepts that can help accelerate our roadmap or strategy or help the platform in general,” he said. “And you can see it over the last couple of years of acquisitions. When we were looking at extending what we did in security, we acquired two leaders in the security space to help build that portfolio out. And that’s fully integrated today. […] In the case of this, UnReview is doing something that we thought we may need to do in the future. They had already built it, they were able to show the value of it, and it became a good partnership between the two companies, which then led to this acquisition.”

One interesting wrinkle here is that GitLab offers both a hosted SaaS service and allows users to run their own on-premises systems as well. Running an ML service like UnReview on-premises isn’t necessarily something that most businesses are equipped to do, so at first, UnReview will be integrated with the SaaS service. The team is still looking at how to best bring it to its self-hosted user base, including a hybrid model.

#artificial-intelligence, #cloud, #continuous-integration, #developer, #devops, #engineer, #free-software, #git, #gitlab, #go, #ma, #machine-learning, #ml, #software-engineering, #tc, #unreview, #version-control

Iterative raises $20M for its MLOps platform

Iterative, an open-source startup that is building an enterprise AI platform to help companies operationalize their models, today announced that it has raised a $20 million Series A round led by 468 Capital and Mesosphere co-founder Florian Leibert. Previous investors True Ventures and Afore Capital also participated in this round, which brings the company’s total funding to $25 million.

The core idea behind Iterative is to provide data scientists and data engineers with a platform that closely resembles a modern GitOps-driven development stack.

After spending time in academia, Iterative co-founder and CEO Dmitry Petrov joined Microsoft as a data scientist on the Bing team in 2013. He noted that the industry has changed quite a bit since then. While early on, the questions were about how to build machine learning models, today the problem is how to build predictable processes around machine learning, especially in large organizations with sizable teams. “How can we make the team productive not the person? This is a new challenge for the entire industry,” he said.

Big companies (like Microsoft) were able to build their own proprietary tooling and processes to build their AI operations, Petrov noted, but that’s not an option for smaller companies.

Currently, Iterative’s stack consists of a couple of different components that sit on top of tools like GitLab and GitHub. These include DVC for running experiments and data and model versioning, CML, the company’s CI/CD platform for machine learning, and the company’s newest product, Studio, its SaaS platform for enabling collaboration between teams. Instead of reinventing the wheel, Iterative essentially provides data scientists who already use GitHub or GitLab to collaborate on their source code with a tool like DVC Studio that extends this to help them collaborate on data and metrics, too.
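
The data-versioning piece of this stack rests on a simple idea: keep large data files in a content-addressed cache and commit only a small pointer file (containing the file’s hash) to Git. Here is a rough stdlib-only sketch of that idea — the `.ptr` file name, JSON layout and `track` helper are invented for illustration; DVC’s actual `.dvc` file format and commands differ:

```python
import hashlib
import json
import os
import shutil
import tempfile

def checksum(path):
    """Content hash of a data file (DVC uses MD5 for the same purpose)."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def track(data_path, cache_dir):
    """Copy the file into a content-addressed cache and write a tiny
    pointer file that a Git repo could version instead of the data itself."""
    md5 = checksum(data_path)
    os.makedirs(cache_dir, exist_ok=True)
    shutil.copy(data_path, os.path.join(cache_dir, md5))
    pointer = data_path + ".ptr"  # hypothetical name; DVC writes .dvc files
    with open(pointer, "w") as f:
        json.dump({"md5": md5, "path": os.path.basename(data_path)}, f)
    return pointer, md5

# Demo in a temporary directory
workdir = tempfile.mkdtemp()
data = os.path.join(workdir, "train.csv")
with open(data, "w") as f:
    f.write("x,y\n1,2\n")

pointer, md5 = track(data, os.path.join(workdir, "cache"))
```

Committing only the pointer keeps the Git history small, yet any teammate can fetch the exact bytes by hash from a shared cache or remote — the property on which tools like DVC build collaboration on data and models.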

Image Credits: Iterative

“DVC Studio enables machine learning developers to run hundreds of experiments with full transparency, giving other developers in the organization the ability to collaborate fully in the process,” said Dmitry Petrov, CEO and founder of Iterative. “The funding today will help us bring more innovative products and services into our ecosystem.”

Petrov stressed that he wants to build an ecosystem of tools, not a monolithic platform. When the company closed this current funding round about three months ago, Iterative had about 30 employees, many of whom were previously active in the open-source community around its projects. Today, that number is already closer to 60.

“Data, ML and AI are becoming an essential part of the industry and IT infrastructure,” said Leibert, general partner at 468 Capital. “Companies with great open source adoption and bottom-up market strategy, like Iterative, are going to define the standards for AI tools and processes around building ML models.”

#afore-capital, #artificial-intelligence, #cloud, #cybernetics, #data-scientist, #developer, #enterprise, #free-software, #funding, #fundings-exits, #git, #github, #gitlab, #learning, #machine-learning, #microsoft, #ml, #recent-funding, #saas, #software-engineering, #startups, #true-ventures, #version-control

Microsoft Azure launches enterprise support for PyTorch

Microsoft today announced PyTorch Enterprise, a new Azure service that provides developers with additional support when using PyTorch on Azure. It’s basically Microsoft’s commercial support offering for PyTorch.

PyTorch is a Python-centric open-source machine learning framework with a focus on computer vision and natural language processing. It was originally developed by Facebook and is, at least to some degree, comparable to Google’s popular TensorFlow framework.

Frank X. Shaw, Microsoft’s corporate VP for communications, described the new PyTorch Enterprise service as providing developers with “a more reliable production experience for organizations using PyTorch in their data sciences work.”

With PyTorch Enterprise, members of Microsoft’s Premier and Unified support program will get benefits like prioritized requests, hands-on support and solutions for hotfixes, bugs and security patches, Shaw explained. Every year, Microsoft will also select one PyTorch release for long-term support.

Azure already made it relatively easy to use PyTorch, and Microsoft has long invested in the library by, for example, taking over the development of PyTorch for Windows last year. As Microsoft noted in today’s announcement, the latest release of PyTorch will be integrated with Azure Machine Learning, and the company promises to contribute the PyTorch code it develops back to the public PyTorch distribution.

Enterprise support will be available for PyTorch version 1.8.1 and up on Windows 10 and a number of popular Linux distributions.

“This new enterprise-level offering by Microsoft closes an important gap. PyTorch gives our researchers unprecedented flexibility in designing their models and running their experiments,” said Jeremy Jancsary, Senior Principal Research Scientist at Nuance. “Serving these models in production, however, can be a challenge. The direct involvement of Microsoft lets us deploy new versions of PyTorch to Azure with confidence.”

With this new offering, Microsoft is taking a page out of the open-source monetization playbook for startups by offering additional services on top of an open-source project. But since PyTorch wasn’t developed by a startup that then watched a major cloud provider offer its own commercial version on top of the open-source code, this feels like a rather uncontroversial move.


#artificial-intelligence, #deep-learning, #developer, #facebook, #free-software, #machine-learning, #microsoft, #microsoft-windows, #natural-language-processing, #premier, #programming-languages, #python, #pytorch, #software, #tensorflow

From bootstrapped to a $2.1B valuation, ReCharge raises $227M for subscription management platform

ReCharge, a provider of subscription management software for e-commerce, announced today that it has raised $227 million in a Series B growth round at a $2.1 billion valuation. 

Summit Partners, ICONIQ Growth and Bain Capital Ventures provided the capital.

Notably, Santa Monica, California-based ReCharge was bootstrapped for several years before raising $50 million in a previously undisclosed Series A from Summit Partners in January of 2020. And, it’s currently cash flow positive, according to company execs. With this round, ReCharge has raised a total of $277 million in funding.

Over the years, the company’s SaaS platform has evolved from a subscription billing/payments platform to include a broader set of offerings aimed at helping e-commerce businesses boost revenues and cut operating costs.

Specifically, ReCharge’s cloud-based software is designed to give e-commerce merchants a way to offer and manage subscriptions for physical products. It also aims to help these brands, primarily direct-to-consumer companies, grow by providing them with ways to “easily” add subscription offerings to their business with the goal of turning one-time purchasers “into loyal, repeat customers.”

The company has some impressive growth metrics, no doubt in part driven by the COVID-19 pandemic’s push to all things digital. ReCharge’s ARR grew 146% in 2020, while revenue grew over 136% over the same period, according to co-founder and CEO Oisin O’Connor, although he declined to reveal hard numbers. The startup has 15,000 customers and 20 million subscribers across 180 countries on its platform. Customers include Harry’s, Oatly, Fiji Water, Billie and Native. Even prior to the pandemic, the company had doubled its processing volume every year for five years, and it has processed over $5.3 billion in transactions since its 2014 inception.

ReCharge also has 328 employees, up from 140 in January of 2020.

“We saw many brick and mortar stores, such as Oatly, offer their products through subscriptions as a result of the pandemic in 2020,” O’Connor told TechCrunch. “Certain categories such as food & beverage and pet foods were some of the fastest growing segments in total subscriber count, with 100% and 147% increases, respectively, as non-discretionary spending shifted online.”

He was surprised to see that growth also extend beyond the most obvious categories. For example, ReCharge saw beauty care products subscribers grow by 120% last year.

“Overall, we saw a 91% subscriber growth in 2020 across the board in all categories of subscriptions,” O’Connor told TechCrunch. “We believe there is a combination of factors at play: the pandemic, the rise of physical subscriptions and the rise of direct-to-consumer buying.”

ReCharge plans to use its fresh capital to accelerate hiring in both R&D (engineering and product) and go-to-market functions such as sales, marketing and customer success. It plans to continue its expansion into other e-commerce platforms such as BigCommerce, Salesforce Commerce Cloud and Magento, and outside of North America into other geographic markets, starting with Europe. ReCharge also plans to “broaden” its acquisition scope so that it can “accelerate” its time-to-market in certain domains, according to O’Connor, and of course build upon its products and services.

Yoonkee Sull, partner at ICONIQ Growth, said his firm has been watching the rapid rise of subscription commerce for several years “as more merchants have looked for ways to deepen relationships with loyal customers and consumers increasingly have sought out more convenient and flexible ways to buy from their favorite brands.”

Ultimately, ICONIQ is betting on its belief that ReCharge “will continue to take significant share in a fast-growing market,” he told TechCrunch.

Sull believes the ReCharge team identified the subscription e-commerce opportunity early on and addresses the numerous nuanced needs of the market with “a fully-featured product that uniquely enables both the smallest merchants and largest brands to easily adopt and scale with their platform.”

Andrew Collins, managing director at Summit Partners, was impressed that the company saw so much growth without external capital for years, due to its “efficiency and discipline.”

“The ReCharge team identified a true product-market fit and built a product that customers love — which has fueled strong organic growth as the business has scaled,” Collins added.

#bain-capital-ventures, #bigcommerce, #california, #cloud, #cloud-based-software, #e-commerce, #ecommerce, #europe, #food, #free-software, #funding, #fundings-exits, #iconiq-growth, #magento, #north-america, #oatly, #payments, #recent-funding, #recharge, #saas, #salesforce, #santa-monica, #software, #startup, #startups, #summit-partners, #tc, #venture-capital

Whereby, which allows more collaboration over video calls, raises $12M from Point Nine and 20 Angels

Zoom, Microsoft and Google all rocketed to the top of the charts in the virtual meetings stakes during the pandemic, but a plucky startup from Norway had other ideas. Video meeting startup Whereby has now raised $12 million from German VC Point Nine, the SaaStr fund and a group of more than 20 angel investors.

Angel investors include Josh Buckley (CEO, Product Hunt), Elizabeth Yin (Hustle Fund) and Jason M. Lemkin (founder of SaaStr).

Øyvind Reed, CEO at Whereby, said in a statement: “The past year has led many of us to question the future of work, with video meetings set to remain a big part of our lives. More than ever, the tools we use to connect have to enable effective and enjoyable meetings, providing focus, collaboration and wellbeing.”

Whereby’s platform has three pricing plans (including free) and allows users to embed tools like Google Docs, Trello and Miro directly in their meetings, unlike other video platforms.

Whereby was demonstrated to me by co-founder Ingrid Ødegaard on a coffee table during 2016’s Oslo Innovation Week. I immediately set up my username, which has persisted even as the startup changed its name from Appear.in. Ingrid told me during an interview that they “tried to be much more human-centric and really focus on some of the human problems that come with collaborating remotely. One of the big mistakes that a lot of people make is just replicating the behavior that they had in the office… whereas we think that you actually need to work in a fundamentally different way. We want to help people do that by making it really easy to jump in and have a meeting when you need to. But our goal is not to push people to have more meetings, quite the opposite.”

The startup’s secret weapon is enterprise integrations. If you had a video consultation with a UK GP in the last year, it was probably over Whereby (indeed, mine was!). Whereby won a contract with the NHS for its remote video patient consultations during the pandemic; competitors for this include Jitsi and AccurX. The company claims it saw a 450% increase in users across 150 countries last year.

“Last year we saw the mass adoption of video meetings,” said Christoph Janz, Partner at Point Nine. “Now it’s about taking the user experience to the next level and Whereby will be leading that charge. It’s amazing to see a Scandinavian startup playing in the same league as the tech giants.”

#ceo, #christoph-janz, #co-founder, #elizabeth-yin, #europe, #founder, #free-software, #google, #ingrid, #jitsi, #josh-buckley, #microsoft, #miro, #nhs, #norway, #point-nine, #producthunt, #reed, #saastr, #shakil-khan, #spotify, #tc, #technology, #trello, #web-conferencing, #zoom

Scarf helps open-source developers track how their projects are being used

Almost by default, open-source developers get very little insight into who uses their projects. In part, that’s the beauty of open source, but for developers who want to monetize their projects, it’s also a bit of a curse because they get very little data back from these projects. While you usually know who bought your proprietary software — and those tools often send back some telemetry, too — that’s not something that holds true for open-source code. Scarf is trying to change that.

In its earliest incarnation, Scarf founder Avi Press tried to go the telemetry route for getting this kind of data. He had written a few successful developer tools and as they got more popular, he realized that he was spending an increasingly large amount of time supporting his users.


Scarf co-founder and CEO Avi Press (Image Credits: Scarf)

“This project was now really sapping my time and energy, but also clearly providing value to big companies,” he said. “And that’s really what got me thinking that there’s probably an opportunity to maybe provide support or build features just for these companies, or do something to try to make some money from that, or really just better support those commercial users.” But he also quickly realized that he had virtually no data about how the project was being used beyond what people told him directly and download stats from GitHub and other places. So as he tried to monetize the project, he had very little data to inform his decisions and he had no way of knowing which companies to target directly that were already quietly using his code.

“If you were working at any old company — pushing code out to an app or a website — if you pushed out code without any observability, that would be reckless. You would get fired over something like that. Or maybe not, but it’s a really poor decision to make. And this is the norm for every domain of software — except open source.”

Image Credits: Scarf

That led to the first version of Scarf: a package manager that would provide usage analytics and make it easy to sell different versions of a project. But that wasn’t quite something the community was ready to accept — and a lot of people questioned the open-source nature of the project.

“What really came out of those conversations, even chatting with people who were really, really against this kind of approach — everyone agrees that the package registries already have all of this data. So NPM and Docker and all these companies that have this data — there are many, many requests of developers for this data,” Press said, and noted that there is obviously a lot of value in this data.

So the new Scarf takes a more sophisticated approach. While it still offers an NPM library that phones home, along with pixel tracking for documentation, its focus is now on registries. What the company is essentially launching this week is a kind of middle layer between the code and the registry, which allows developers to, for example, point users of their containers to the Scarf registry first; Scarf then sits in front of the Docker Hub or the GitHub Container Registry.

“You tell us, where are your containers located? And then your users pull the image through Scarf and Scarf just redirects the traffic to wherever it needs to go. But then all the traffic that flows through Scarf, we can expose that to the maintainers. What company did that pull come from? Was it on a laptop or on CI? What cloud provider was it on? What container runtime was it using? What version of the software did they pull down? And all of these things that are actually pretty trivial to answer from this traffic — and the registries could have been doing this whole time but unfortunately have not done so.”
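Conceptually, the middle layer Press describes is a thin redirect proxy: a client asks Scarf for an image, Scarf records metadata it can infer from the request, and the client is sent on to the real registry. The sketch below is a rough illustration of that idea only; `PullGateway` and all of its field names are invented for this example and are not Scarf’s actual API.

```python
# Hypothetical sketch of a registry "middle layer" in the style Press
# describes. None of these names come from Scarf itself.
from dataclasses import dataclass, field
from urllib.parse import urljoin


@dataclass
class PullGateway:
    upstream: str                       # real registry, e.g. Docker Hub
    log: list = field(default_factory=list)

    def handle_pull(self, image_path: str, user_agent: str) -> str:
        """Record who pulled what, then hand back the upstream URL
        (in a real service this would be an HTTP 302 redirect)."""
        # CI runners and container runtimes identify themselves in the
        # user agent, which is one way to infer a coarse environment.
        env = "ci" if "ci" in user_agent.lower() else "workstation"
        self.log.append({"image": image_path, "env": env})
        return urljoin(self.upstream, image_path)


gw = PullGateway(upstream="https://registry.hub.docker.com/")
redirect = gw.handle_pull("v2/scarf/demo/manifests/1.2.0",
                          "docker/20.10 ci-runner")
```

The client still gets its image bytes from the upstream registry; the gateway only sees (and logs) the request metadata that flows through it, which is exactly the data Press says registries could have exposed all along.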

To fund its efforts, Scarf recently raised a $2 million seed funding round led by Wave Capital, with participation from 468 Capital and a number of angel investors.

#computing, #developer, #docker, #energy, #free-software, #github, #go, #npm, #open-source-software, #programming-languages, #recent-funding, #scarf, #software, #startups, #wave-capital

The rise of the activist developer

The last few months have put technology and its role in society, especially in the United States, in the spotlight.

We need a serious conversation on the equitable and ethical use of tech, what can be done to combat the spread of misinformation and more. As we work to solve these problems, however, I hope this dialogue doesn’t overshadow one silver lining of the past year: The rise of the developer activists who are using tech for good.

They stepped up like never before to tackle numerous global issues, demonstrating they not only love solving incredibly hard problems, but can do it well and at scale.

We need a serious conversation on the equitable and ethical use of tech, what can be done to combat the spread of misinformation and more.

The responsibility lies with all of us to empower this community to unleash their entrepreneurial growth mindset and ensure more people have the opportunity to create a sustainable future for all. I’m calling on my colleagues, our industry, our governments and more to join me in supporting a new wave of developer-led activism and renew efforts to collectively close the skills gap that exists today.

From the COVID-19 pandemic, to climate change, to racial injustice, developers are playing a crucial role in creating new technologies to help people navigate today’s volatile world. Many of these developers are working on social problems on their own time, using open-source software that they can share globally. This work is helping to save lives and going forward, will help millions more.

The international research community acted early to share data and genetic sequences with one another in open-source projects that helped advance our early understanding of coronavirus and how to mobilize efforts to stop it. The ability for researchers to track genetic codes around the world in near real-time is crucial to our response.

St. Jude Children’s Research Hospital was able to digitize its contract signature process in just 10 days during this critical time. A team of four developers hailing from Taiwan, Brazil, Mongolia and India helped farmers navigate climate change by using weather data to make more informed crop management decisions.

From the civil rights and anti-war movements of the 1950s and 1960s through the recent rallies supporting the Black Lives Matter movement, people have used passion and protests to shape the conversations that lead to a better future. Now, this rich history of people-powered action has an important new set of tools: The data, software and tech know-how that’s needed to mount a coordinated global and local response to our greatest challenges.

Today’s software developers are akin to civil engineers in the 1940s and 1950s who designed bridges and roads, creating an infrastructure that paved the path for enormous widespread progress.

The open-source code community already collaborates and shares, producing innovations that belong to everyone, focusing on progress over perfection. If a hurricane is about to create havoc in your community, don’t just fill sandbags, hit your keyboard and use open-source technologies to not only help your community, but to scale solutions to help others. DroneAID, for example, is an open-source tool that uses visual recognition to detect and count SOS icons on the ground from drones flying overhead, and then automatically plots emergency needs on a map for first responders.

A recent GitHub study shows that open-source project creation is up 25% since April of last year. Developers are signing on to contribute to open-source communities and virtual hackathons during their downtime, using their skills to create a more sustainable world.

In 2018, I helped found Call for Code with IBM, David Clark Cause and United Nations Human Rights to empower the global developer community, and a big part of our mission was to create the infrastructure needed to shepherd big ideas into real-world deployments. For our part, IBM provides the 24-million-person developer community access to the same technology being used by our enterprise clients, including our open hybrid cloud platform, AI, blockchain and quantum computing.

One winner, Prometeo, with a team including a firefighter, nurse and developers, created a system that uses artificial intelligence and the Internet of Things to safeguard firefighters as they battle blazes and has been tested in multiple regions in Spain. We’ve seen developers help teachers share virtual information for homeschooling; measure the carbon footprint impact of consumer purchases; update small businesses with COVID-19 policies; help farmers navigate climate change; and improve the way businesses manage lines amid the pandemic.

This past year, Devpost partnered with the World Health Organization (WHO) and challenged developers to create COVID-19 mitigation solutions in categories including health, vulnerable populations and education. The Ford Foundation and Mozilla led a fellowship program to connect technologists, activists, journalists and scientists, and strengthen organizations working at the convergence of technology and social justice. The U.S. Digital Response (USDR) connected pro-bono technologists to work with government and organizations responding to crisis.

The most complex global and societal issues can be broken down into smaller solvable tech challenges. But to solve our most complex problems, we need the brains of every country, every class, every gender. The skills-gap crisis is a global phenomenon, making it critical that we equip the next generation of problem solvers with the training and resources they need to turn great ideas into impactful solutions.

This year, we can expect to see a newly energized community of developers working across the boundaries of companies, states and countries to take on some of the world’s biggest problems.

But they can’t do it alone. These developer activists need our support, encouragement and help pinpointing the most crucial problems to address, and they need the tools to bring solutions to every corner of the world.

The true power of technology lies with those who want to change the world for good. To ensure anyone who wants to create change has the tools, resources and skillsets to do so, we must renew our focus on closing the skills gap and addressing deep inequalities in our society.

Our future depends on getting this right.

#column, #covid-19, #developer, #free-software, #hackathon, #open-source-software, #opinion

The Rust programming language finds a new home in a non-profit foundation

Rust, the programming language (not the survival game), now has a new home: the Rust Foundation. AWS, Huawei, Google, Microsoft and Mozilla banded together to launch this new foundation today and put a two-year commitment to a million-dollar budget behind it. This budget will allow the project to “develop services, programs, and events that will support the Rust project maintainers in building the best possible Rust.”

Rust started out as a side project inside of Mozilla to develop an alternative to C/C++. Designed by Mozilla Research’s Graydon Hoare, with contributions from the likes of JavaScript creator Brendan Eich, Rust became the core language for some of the fundamental features of the Firefox browser and its Gecko engine, as well as Mozilla’s Servo engine. Today, Rust is the most-loved language among developers. But with Mozilla’s layoffs in recent months, much of the Rust team lost their jobs and the future of the language became unclear without a main sponsor, though the project itself has thousands of contributors and a lot of corporate users, so the language itself wasn’t going anywhere.

A large open-source project often needs some kind of guidance, and the new foundation will provide this — it also takes a legal entity to manage various aspects of the community, including the trademark, for example. The new Rust board will feature five directors from the five founding members, as well as five directors from project leadership.

“Mozilla incubated Rust to build a better Firefox and contribute to a better Internet,” writes Bobby Holley, Mozilla and Rust Foundation Board member, in a statement. “In its new home with the Rust Foundation, Rust will have the room to grow into its own success, while continuing to amplify some of the core values that Mozilla shares with the Rust community.”

All of the corporate sponsors have a vested interest in Rust and are using it to build (and re-build) core aspects of some of their stacks. Google recently said that it will fund a Rust-based project that aims to make the Apache webserver safer, for example, while Microsoft recently formed a Rust team, too, and is using the language to rewrite some core Windows APIs. AWS recently launched Bottlerocket, a new Linux distribution for containers that, for example, features a build system that was largely written in Rust.

 

#aws, #brendan-eich, #firefox, #free-software, #gecko, #google, #huawei, #javascript, #microsoft, #mozilla, #mozilla-foundation, #programming-languages, #rust, #servo, #software, #tc

GitLab reshuffles its paid subscription plans, drops its Bronze/Starter tier

GitLab, the increasingly popular DevOps platform, today announced a major update to its subscription model. The company is doing away with its $4/month Bronze/Starter package. Current users will be able to renew one more time at the existing price or move to a higher tier (and receive a significant discount for the first three years after they do so).

The company’s free tier, it is worth noting, is not going away and GitLab argues that it includes “89% of the features in Bronze/Starter.”

As GitLab founder and CEO Sid Sijbrandij told me, this was a difficult decision for the team. He acknowledged that this is a big change for those on the Bronze plan. “I hope that they see that we did our homework and that we have great legacy pricing,” Sijbrandij said, and added that the company will listen to feedback from its users.

To ease the pain, Bronze users will be able to renew their existing subscription before January 26, 2022 for an additional year at the existing price. They can also opt to move to the Premium tier at a discounted price for the next three years, starting at $6/user/month in Year 1, but that price then goes up to $9/user/month and $15/user/month in Year 2 and 3 respectively. For new users, the Bronze package is no longer available, starting now.
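The transition pricing above is easy to sanity-check with a little arithmetic; the snippet below just totals the stated per-user monthly prices over the three discounted years (the names are mine, not GitLab’s).

```python
# Per-user monthly prices for the discounted Premium transition, as
# stated in the announcement: $6 in year 1, then $9 and $15.
DISCOUNT_PATH = [6, 9, 15]


def three_year_cost_per_user(monthly_prices):
    """Total per-user cost across the transition, 12 months per year."""
    return sum(12 * price for price in monthly_prices)


total = three_year_cost_per_user(DISCOUNT_PATH)  # (6 + 9 + 15) * 12
```

That works out to $360 per user over the three years, versus $144 for three more years of Bronze at $4/user/month — a useful baseline for Bronze customers weighing the upgrade.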

Image Credits: GitLab

In the end, this was a purely financial decision for GitLab. As Sijbrandij told me, the company was losing money on every Bronze-tier customer. “The Bronze tier, we were selling at a loss,” he said. “We were just losing money every time we sold it — just on hosting and support. To be a sustainable business, this was a move we had to make. It’s a big transition for our customers but we want to make sure we’re a sustainable company and we can keep investing.”

Sijbrandij told me the team looked at increasing the price of the Bronze tier to make it profitable. “We looked at all options, but in the end, you’re going to have an offering that is very similar to Premium. It would be too much overlap between the two,” he explained.

With this change, GitLab now offers three tiers: Free, Premium and Ultimate (it’s also doing away with the “Silver/Premium” and “Gold/Ultimate” naming).

The free tier, which in terms of total users is the most popular plan on GitLab, will remain in place. While it is surely a loss-leader for GitLab, it only comes with limited CI/CD credits and doesn’t include any support options, so the overall loss here must have been worth it for the company. Sijbrandij also noted that, as an open core company, having a free and open offering is simply a must.

#continuous-integration, #developer, #free-software, #git, #gitlab, #sid-sijbrandij, #tc, #version-control

Twitter’s vision of decentralization could also be the far-right’s internet endgame

This week, Twitter CEO Jack Dorsey finally responded publicly to the company’s decision to ban President Trump from its platform, writing that Twitter had “faced an extraordinary and untenable circumstance” and that he did not “feel pride” about the decision. In the same thread, he took time to call out a nascent Twitter-sponsored initiative called “bluesky,” which is aiming to build up an “open decentralized standard for social media” that Twitter is just one part of.

Researchers involved with bluesky reveal to TechCrunch an initiative still in its earliest stages that could fundamentally shift the power dynamics of the social web.

Bluesky is aiming to build a “durable” web standard that will ultimately ensure that platforms like Twitter have less centralized responsibility in deciding which users and communities have a voice on the internet. While this could protect speech from marginalized groups, it may also upend modern moderation techniques and efforts to prevent online radicalization.

Jack Dorsey, co-founder and chief executive officer of Twitter Inc., arrives after a break during a House Energy and Commerce Committee hearing in Washington, D.C., U.S., on Wednesday, Sept. 5, 2018. Republicans pressed Dorsey for what they said may be the “shadow-banning” of conservatives during the hearing. Photographer: Andrew Harrer/Bloomberg via Getty Images

What is bluesky?

Just as Bitcoin lacks a central bank to control it, a decentralized social network protocol operates without central governance, meaning Twitter would only control its own app built on bluesky, not other applications on the protocol. The open and independent system would allow applications to see, search and interact with content across the entire standard. Twitter hopes that the project can go far beyond what the existing Twitter API offers, enabling developers to create applications with different interfaces or methods of algorithmic curation, potentially paying entities across the protocol like Twitter for plug-and-play access to different moderation tools or identity networks.

A widely adopted, decentralized protocol is an opportunity for social networks to “pass the buck” on moderation responsibilities to a broader network, one person involved with the early stages of bluesky suggests, allowing individual applications on the protocol to decide which accounts and networks its users are blocked from accessing.

Social platforms like Parler or Gab could theoretically rebuild their networks on bluesky, benefitting from its stability and the network effects of an open protocol. Researchers involved are also clear that such a system would also provide a meaningful measure against government censorship and protect the speech of marginalized groups across the globe.

Bluesky’s current scope is firmly in the research phase, people involved tell TechCrunch, with about 40-50 active members from different factions of the decentralized tech community surveying the software landscape and putting together proposals for what the protocol should ultimately look like. Twitter has told early members that it hopes to hire a project manager in the coming weeks to build out an independent team that will start crafting the protocol itself.

Bluesky’s initial members were invited by Twitter CTO Parag Agrawal early last year. It was later determined that the group should open the conversation up to folks representing some of the more recognizable decentralized network projects, including Mastodon and ActivityPub, who joined the working group hosted on the secure chat platform Element.

Jay Graber, founder of decentralized social platform Happening, was paid by Twitter to write up a technical review of the decentralized social ecosystem, an effort to “help Twitter evaluate the existing options in the space,” she tells TechCrunch.

“If [Twitter] wanted to design this thing, they could have just assigned a group of guys to do it, but there’s only one thing that this little tiny group of people could do better than Twitter, and that’s not be Twitter,” said Golda Velez, another member of the group who works as a senior software engineer at Postmates and co-founded civ.works, a privacy-centric social network for civic engagement.

The group has had some back and forth with Twitter executives on the scope of the project, eventually forming a Twitter-approved list of goals for the initiative. They define the challenges that the bluesky protocol should seek to address while also laying out what responsibilities are best left to the application creators building on the standard.

A Twitter spokesperson declined to comment.

Parrot.VC Twitter account

Image: TechCrunch

Who is involved

The pain points enumerated in the document, viewed by TechCrunch, encapsulate some of Twitter’s biggest shortcomings. They include “how to keep controversy and outrage from hijacking virality mechanisms,” as well as a desire to develop “customizable mechanisms” for moderation, though the document notes that the applications, not the overall protocol, are “ultimately liable for compliance, censorship, takedowns, etc.”

“I think the solution to the problem of algorithms isn’t getting rid of algorithms — because sorting posts chronologically is an algorithm — the solution is to make it an open pluggable system by which you can go in and try different algorithms and see which one suits you or use the one that your friends like,” says Evan Henshaw-Plath, another member of the working group. He was one of Twitter’s earliest employees and has been building out his own decentralized social platform called Planetary.
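A toy sketch of the pluggable system Henshaw-Plath describes: the feed is just data, and the ranking rule — chronological or otherwise — is a swappable function that the client, not the protocol, chooses. Nothing here is bluesky code; every name is illustrative.

```python
# Illustrative only: a feed where the ranking algorithm is pluggable.
from datetime import datetime

posts = [
    {"text": "first", "at": datetime(2021, 1, 1), "likes": 5},
    {"text": "second", "at": datetime(2021, 1, 2), "likes": 1},
]


def chronological(feed):
    # Sorting by timestamp is itself just one algorithm among many.
    return sorted(feed, key=lambda p: p["at"], reverse=True)


def by_engagement(feed):
    return sorted(feed, key=lambda p: p["likes"], reverse=True)


# A client picks whichever ranking its user prefers.
ALGORITHMS = {"chronological": chronological, "engagement": by_engagement}


def render_feed(feed, choice="chronological"):
    return [p["text"] for p in ALGORITHMS[choice](feed)]
```

Swapping `choice` changes the order of the same underlying posts, which is the whole point: the data layer stays shared while curation becomes a per-application decision.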

His platform is based on the secure scuttlebutt protocol, which allows users to browse networks offline in an encrypted fashion. Early on, Planetary had been in talks with Twitter for a corporate investment as well as a personal investment from CEO Jack Dorsey, Henshaw-Plath says, but the competitive nature of the platform prompted some concern among Twitter’s lawyers, and Planetary ended up receiving an investment from Twitter co-founder Biz Stone’s venture fund Future Positive. Stone did not respond to interview requests.

After agreeing on goals, Twitter had initially hoped for the broader team to arrive at some shared consensus but starkly different viewpoints within the group prompted Twitter to accept individual proposals from members. Some pushed Twitter to outright adopt or evolve an existing standard while others pushed for bluesky to pursue interoperability of standards early on and see what users naturally flock to.

One of the developers in the group hoping to bring bluesky onto their standard was Mastodon creator Eugen Rochko who tells TechCrunch he sees the need for a major shift in how social media platforms operate globally.

“Banning Trump was the right decision though it came a little bit too late. But at the same time, the nuance of the situation is that maybe it shouldn’t be a single American company that decides these things,” Rochko tells us.

Like several of the other members in the group, Rochko has been skeptical at times about Twitter’s motivation with the bluesky protocol. Shortly after Dorsey’s initial announcement in 2019, Mastodon’s official Twitter account tweeted out a biting critique, writing, “This is not an announcement of reinventing the wheel. This is announcing the building of a protocol that Twitter gets to control, like Google controls Android.”

Today, Mastodon is arguably one of the most mature decentralized social platforms. Rochko claims that the network of decentralized nodes has more than 2.3 million users spread across thousands of servers. In early 2017, the platform had its viral moment on Twitter, prompting an influx of “hundreds of thousands” of new users alongside some inquisitive potential investors whom Rochko has rebuffed in favor of a donation-based model.

Image Credits: TechCrunch

Inherent risks

Not all of the attention Rochko has garnered has been welcome. In 2019, Gab, a social network favored by right-wing extremists, brought its entire platform onto the Mastodon network after integrating the platform’s open source code, bringing Mastodon its single biggest web of users and its most undesirable liability all at once.

Rochko quickly disavowed the network and aimed to sever its ties to other nodes on the Mastodon platform and convince application creators to do the same. But a central fear of decentralization advocates was quickly realized, as the platform type’s first “success story” was a home for right-wing extremists.

This fear has been echoed in decentralized communities this week as app store owners and networks have taken another right-wing social network, Parler, off the web after violent content surfaced on the site in the lead-up and aftermath of riots at the U.S. Capitol, leaving some developers fearful that the social network may set up home on their decentralized standard.

“Fascists are 100% going to use peer-to-peer technologies, they already are and they’re going to start using it more… If they get pushed off of mainstream infrastructure or people are surveilling them really closely, they’re going to have added motivation,” said Emmi Bevensee, a researcher studying extremist presences on decentralized networks. “Maybe the far-right gets stronger footholds on peer-to-peer before the people who think the far-right is bad do because they were effectively pushed off.”

A central concern is that commoditizing decentralized platforms through efforts like bluesky will provide a more accessible route for extremists kicked off current platforms to maintain an audience and provide casual internet users a less janky path towards radicalization.

“Peer-to-peer technology is generally not that seamless right now. Some of it is; you can buy Bitcoin in Cash App now, which, if anything, is proof that this technology is going to become much more mainstream and adoption is going to become much more seamless,” Bevensee told TechCrunch. “In the current era of this mass exodus from Parler, they’re obviously going to lose a huge amount of audience that isn’t dedicated enough to get on IPFS. Scuttlebutt is a really cool technology but it’s not as seamless as Twitter.”

Extremists adopting technologies that promote privacy and strong encryption is far from a new phenomenon; encrypted chat apps like Signal and Telegram have been at the center of such controversies in recent years. Bevensee says the tendency of right-wing extremist networks to adopt decentralized network tech has been “extremely demoralizing” to those early developer communities — though she notes that the same technologies can and do benefit “marginalized people all around the world.”

Though people connected to bluesky’s early moves see a long road ahead for the protocol’s development and adoption, they also see an evolving landscape with Parler and President Trump’s recent deplatforming that they hope will drive other stakeholders to eventually commit to integrating with the standard.

“Right at this moment I think that there’s going to be a lot of incentive to adopt, and I don’t just mean by end users, I mean by platforms, because Twitter is not the only one having these really thorny moderation problems,” Velez says. “I think people understand that this is a critical moment.”

#android, #biz-stone, #ceo, #co-founder, #computing, #encryption, #free-software, #gab, #google, #house-energy-and-commerce-committee, #jack-dorsey, #peer-to-peer, #photographer, #president, #social, #social-media, #social-media-platforms, #social-network, #social-networks, #tc, #technology, #text-messaging, #trump, #twitter, #united-states, #washington-d-c, #web-applications

You can now securely submit tips to TechCrunch using SecureDrop

For the past few years, some of the biggest stories on TechCrunch have come from you.

We’ve revealed internal employee battles at some of the world’s biggest tech companies, reported on unexpected layoffs during the pandemic, uncovered safety violations, shown how Facebook paid teens to snoop on their private data, revealed a major hack that a company tried to cover up, and exposed workplace discrimination, secretive startups and government wrongdoing.

We have been able to report on these important issues in large part because sources have reached out with information that companies and governments don’t want to come to light.

Now we’re making it easier and more secure for you to contact TechCrunch reporters and editors.

Today, we are launching our own SecureDrop, a tip submission system that allows you to securely and anonymously reach out with information, files and documents for us to investigate.

This is what TechCrunch’s SecureDrop looks like. (Image: TechCrunch)

SecureDrop is a system designed to allow us to communicate with you while protecting your identity from virtually all kinds of tracking. It works in part thanks to the Tor anonymity network, which bounces and encrypts your internet traffic through several servers on its way to its destination in order to make tracking almost impossible. Accessing Tor — and our SecureDrop — requires the free Tor Browser. That means we won’t know who you are, unless you tell us.

We know what sources risk in revealing information that the powerful want to keep secret.

Rest assured, our SecureDrop is physically controlled by TechCrunch, and anything you submit to SecureDrop is encrypted and can only be viewed by TechCrunch editors. And the software itself, maintained by the non-profit Freedom of the Press Foundation, goes through regular security audits and will be kept up-to-date with the latest fixes.

You can access instructions on how to use our SecureDrop by going to: techcrunch.com/securedrop

We’re looking forward to hearing from you.

#computing, #dark-web, #free-software, #freedom-of-the-press-foundation, #internet-traffic, #securedrop, #security, #whistleblowing

Google grants $3 million to the CNCF to help it run the Kubernetes infrastructure

Back in 2018, Google announced that it would provide $9 million in Google Cloud Platform credits — divided over three years — to the Cloud Native Computing Foundation (CNCF) to help it run the development and distribution infrastructure for the Kubernetes project. Previously, Google owned and managed those resources for the community. Today, the two organizations announced that Google is adding on to this grant with another $3 million annual donation to the CNCF to “help ensure the long-term health, quality and stability of Kubernetes and its ecosystem.”

As Google notes, the funds will go to the testing and infrastructure of the Kubernetes project, which currently sees over 2,300 monthly pull requests that trigger about 400,000 integration test runs, all of which use about 300,000 core hours on GCP.

“I’m really happy that we’re able to continue to make this investment,” Aparna Sinha, a director of product management at Google and the chairperson of the CNCF governing board, told me. “We know that it is extremely important for the long-term health, quality and stability of Kubernetes and its ecosystem and we’re delighted to be partnering with the Cloud Native Computing Foundation on an ongoing basis. At the end of the day, the real goal of this is to make sure that developers can develop freely and that Kubernetes, which is of course so important to everyone, continues to be an excellent, solid, stable standard for doing that.”

Sinha also noted that Google contributes a lot of code to the project, with 128,000 code contributions in the last twelve months alone. But on top of these technical contributions, the team is also making in-kind contributions through community engagement and mentoring, for example, in addition to the kind of financial contributions the company is announcing today.

“The Kubernetes project has been growing so fast — the releases are just one after the other,” said Priyanka Sharma, the General Manager of the CNCF. “And there are big changes, all of this has to run somewhere. […] This specific contribution of the $3 million, that’s where that comes in. So the Kubernetes project can be stress-free, [knowing] they have enough credits to actually run for a full year. And that security is critical because you don’t want Kubernetes to be wondering where will this run next month. This gives the developers and the contributors to the project the confidence to focus on feature sets, to build better, to make Kubernetes ever-evolving.”

It’s worth noting that while both Google and the CNCF are putting their best foot forward here, there have been some questions about Google’s management of the Istio service mesh project, which was incubated by Google and IBM a few years ago. At some point in 2017, there was a proposal to bring it under the CNCF umbrella, but that never happened. This year, Istio became one of the founding projects of Open Usage Commons, though that group is mostly concerned with trademarks, not with project governance. And while all of this may seem like a lot of inside baseball — and it is — it has led some members of the open-source community to question Google’s commitment to organizations like the CNCF.

“Google contributes to a lot of open-source projects. […] There’s a lot of them, many are with open-source foundations under the Linux Foundation, many of them are otherwise,” Sinha said when I asked her about this. “There’s nothing new, or anything to report about anything else. In particular, this discussion — and our focus very much with the CNCF here is on Kubernetes, which I think — out of everything that we do — is by far the biggest contribution or biggest amount of time and biggest amount of commitment relative to anything else.”

#aparna-sinha, #cloud, #cloud-computing, #cloud-infrastructure, #cloud-native-computing-foundation, #cloud-native-computing, #cncf, #computing, #developer, #free-software, #google, #google-cloud-platform, #kubernetes, #priyanka-sharma, #product-management, #tc, #web-services

Google opens its Fuchsia operating system to outside developers

For the longest time, Google’s new Fuchsia operating system remained a bit of a mystery — with little information in terms of the company’s plans for it, even as the team behind it brought the code to GitHub under a standard open-source license. These days, we know that it’s Google’s first attempt at developing a completely new kernel and general purpose operating system that promises to be more than just an experiment (or a retention project to keep senior engineers from jumping ship). For the most part, though, Google has remained pretty mum about the subject.

It seems like Google is ready to start talking about Fuchsia a bit more now. The company today announced that it is expanding the Fuchsia open-source community and opening it up to contributions from the public. Typically, companies start opening up their open-source projects to outside contributors once they feel they have achieved a stable foundation that others can build on.

“Starting today, we are expanding Fuchsia‘s open source model to make it easier for the public to engage with the project,” the team writes. “We have created new public mailing lists for project discussions, added a governance model to clarify how strategic decisions are made, and opened up the issue tracker for public contributors to see what’s being worked on. As an open source effort, we welcome high-quality, well-tested contributions from all. There is now a process to become a member to submit patches, or a committer with full write access.”

Google is also publishing a technical roadmap for Fuchsia, with a driver framework, file system performance and expanding the input pipeline for accessibility at the top of the list.

The company also specifically notes that Fuchsia is not ready for general product development or even as a development target. Anybody with the right technical chops can clone the repository and build the code, though. Google already provides quite a bit of documentation around how to do that today, as well as an emulator.

Google also notes that it aims to build an inclusive open-source community around the project. “Fuchsia is an open source project that is inclusive by design, from the architecture of the platform itself, to the open source community that we’re building. The project is still evolving rapidly, but the underlying principles and values of the system have remained relatively constant throughout the project.”

#computing, #developer, #free-software, #fuchsia, #github, #google, #operating-system, #operating-systems, #software, #version-control

Parrot Software has $1.2 million to grow its restaurant point-of-sale and management service in Mexico

The two founders of Parrot Software, Roberto Cebrián and David Villarreal, first met in high school in Monterrey, Mexico. In the eleven years since, both have pursued successful careers in the tech industry and become family (they’re brothers-in-law).

Now, they’re starting a new business together, leveraging Cebrián’s experience running a point-of-sale company and Villarreal’s time working first at Uber and then at the high-growth scooter and bike rental startup Grin.

Cebrián’s experience founding the point-of-sale company S3 Software laid the foundation for Parrot Software and its point-of-sale service for managing restaurant operations.

“Roberto has been in the industry for the past six or seven years,” said Villarreal. “And he was telling me that no one has been serving [restaurants] properly… Roberto pitched me the idea and I got super involved and decided to start the company.”

Parrot Software co-founders Roberto Cebrián and David Villarreal. Image Credit: Parrot Software

Like Toast in the U.S., Parrot manages payments, including online payments and real-time ordering, along with integrations into services that can manage a restaurant’s back-end operations too, according to Villarreal. Those services include things like delivery software, accounting and loyalty systems.

The company is already live in over 500 restaurants in Mexico and is used by chains including Cinnabon, Dairy Queen, Grupo Costeño, and Grupo Pangea.

Based in Monterrey, Mexico, the company has managed to attract a slew of high profile North American investors including Joe Montana’s Liquid2 Ventures, Foundation Capital, Superhuman angel fund, Toby Spinoza, the vice president of DoorDash, and Ed Baker, a product lead at Uber.

Since its launch, the company has managed to land contracts in 10 cities, with the largest presence in Northeastern Mexico, around Monterrey, said Villarreal.

The market for restaurant management software is large and growing. It’s a big category that’s expected to reach $6.94 billion in sales worldwide by 2025, according to a report from Grand View Research.

Investors in the U.S. market certainly believe in the potential opportunity for a business like Toast. That company has raised nearly $1 billion in funding from firms like Bessemer Venture Partners, the private equity firm TPG, and Tiger Global Management.

#bessemer-venture-partners, #doordash, #foundation-capital, #free-software, #grin, #mexico, #point-of-sale, #reporter, #software, #tc, #tiger-global-management, #trade, #uber, #united-states

Mirantis brings extensions to its Lens Kubernetes IDE, launches a new Kubernetes distro

Earlier this year, Mirantis, the company that now owns Docker’s enterprise business, acquired Lens, a desktop application that provides developers with something akin to an IDE for managing their Kubernetes clusters. At the time, Mirantis CEO Adrian Ionel told me that the company wants to offer enterprises the tools to quickly build modern applications. Today, it’s taking another step in that direction with the launch of an extensions API for Lens that will take the tool far beyond its original capabilities.

In addition to this update to Lens, Mirantis also today announced a new open-source project: k0s. The company describes it as “a modern, 100% upstream vanilla Kubernetes distro that is designed and packaged without compromise.”

It’s a single optimized binary without any OS dependencies (besides the kernel). Based on upstream Kubernetes, k0s supports Intel and Arm architectures and can run on any Linux host or Windows Server 2019 worker nodes. Given these requirements, the team argues that k0s should work for virtually any use case, ranging from local development clusters to private datacenters, telco clusters and hybrid cloud solutions.

“We wanted to create a modern, robust and versatile base layer for various use cases where Kubernetes is in play. Something that leverages vanilla upstream Kubernetes and is versatile enough to cover use cases ranging from typical cloud-based deployments to various edge/IoT types of cases,” said Jussi Nummelin, Senior Principal Engineer at Mirantis and founder of k0s. “Leveraging our previous experiences, we really did not want to start maintaining the setup and packaging for various OS distros. Hence the packaging model of a single binary to allow us to focus more on the core problem rather than different flavors of packaging such as debs, rpms and what-nots.”

Mirantis, of course, has a bit of experience in the distro game. In its earliest iteration, back in 2013, the company offered one of the first major OpenStack distributions, after all.

As for Lens, the new API, which will go live next week to coincide with KubeCon, will enable developers to extend the service with support for other Kubernetes-integrated components and services.

“Extensions API will unlock collaboration with technology vendors and transform Lens into a fully featured cloud native development IDE that we can extend and enhance without limits,” said Miska Kaipiainen, the co-founder of the Lens open-source project and senior director of engineering at Mirantis. “If you are a vendor, Lens will provide the best channel to reach tens of thousands of active Kubernetes developers and gain distribution to your technology in a way that did not exist before. At the same time, the users of Lens enjoy quality features, technologies and integrations easier than ever.”

The company has already lined up a number of popular CNCF projects and vendors in the cloud-native ecosystem to build integrations. These include Kubernetes security vendors Aqua and Carbonetes, API gateway maker Ambassador Labs and AIOps company Carbon Relay. Venafi, nCipher, Tigera, Kong and StackRox are also currently working on their extensions.

“Introducing an extensions API to Lens is a game-changer for Kubernetes operators and developers, because it will foster an ecosystem of cloud-native tools that can be used in context with the full power of Kubernetes controls, at the user’s fingertips,” said Viswajith Venugopal, StackRox software engineer and developer of KubeLinter. “We look forward to integrating KubeLinter with Lens for a more seamless user experience.”

#adrian-ionel, #api, #carbon-relay, #ceo, #cloud, #cloud-infrastructure, #co-founder, #developer, #enterprise, #founder, #free-software, #intel, #kubernetes, #lens, #linux, #mirantis, #miska-kaipiainen, #openstack, #tc, #windows-server

DOJ says it seized over $1 billion in bitcoin from the Silk Road drugs marketplace

Two days ago, about $1 billion worth of bitcoin that had sat dormant since the seizure of the Silk Road marketplace in 2013, one of the biggest underground drug websites on the dark web, suddenly changed hands.

Who took it? Mystery over. It was the U.S. government.

In a statement Thursday, the Justice Department confirmed it had seized the 70,000 bitcoins generated in revenue from drug sales on the Silk Road marketplace. At the time of the seizure, the bitcoin was worth more than $1 billion.

“Silk Road was the most notorious online criminal marketplace of its day. The successful prosecution of Silk Road’s founder in 2015 left open a billion-dollar question. Where did the money go? Today’s forfeiture complaint answers this open question at least in part,” said U.S. Attorney David Anderson in remarks.

“$1 billion of these criminal proceeds are now in the United States’ possession,” he said.

Silk Road was for a time the “most sophisticated and extensive criminal marketplace on the Internet,” per the Justice Department statement. In 2013, its founder and administrator Ross Ulbricht was arrested and the site seized. Ulbricht was convicted in 2015 and sentenced to two life terms and an additional 40 years, for his role in the operation. Prosecutors said the site had close to 13,000 listings for drugs and other illegal services, and generated millions of bitcoin.

The Justice Department said Thursday that the seized bitcoin would be subject to forfeiture proceedings.

#computing, #cryptocurrency, #dark-web, #department-of-justice, #free-software, #internet, #ross-ulbricht, #security, #silk-road, #u-s-government, #united-states

Dataloop raises $11M Series A round for its AI data management platform

Dataloop, a Tel Aviv-based startup that specializes in helping businesses manage the entire data lifecycle for their AI projects, including helping them annotate their datasets, today announced that it has now raised a total of $16 million. This includes a $5 million seed round that was previously unreported, as well as an $11 million Series A round that recently closed.

The Series A round was led by Amiti Ventures with participation from F2 Venture Capital, crowdfunding platform OurCrowd, NextLeap Ventures and SeedIL Ventures.

“Many organizations continue to struggle with moving their AI and ML projects into production as a result of data labeling limitations and a lack of real time validation that can only be achieved with human input into the system,” said Dataloop CEO Eran Shlomo. “With this investment, we are committed, along with our partners, to overcoming these roadblocks and providing next generation data management tools that will transform the AI industry and meet the rising demand for innovation in global markets.”

Image Credits: Dataloop

For the most part, Dataloop specializes in helping businesses manage and annotate their visual data. It’s agnostic to the vertical its customers are in, but we’re talking about anything from robotics and drones to retail and autonomous driving.

The platform itself centers on a ‘human in the loop’ model that complements automated systems with the ability for humans to train and correct the model as needed. It combines a hosted annotation platform with a Python SDK and REST API for developers, as well as a serverless functions-as-a-service environment that runs on top of a Kubernetes cluster for automating dataflows.
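As a rough illustration of the ‘human in the loop’ idea — this is a hypothetical Python sketch, not Dataloop’s actual SDK, and the field names and threshold are invented — the core routing step might look like this:

```python
# Hypothetical sketch of a human-in-the-loop annotation flow:
# auto-accept confident model predictions, route the rest to
# human annotators for review before they enter the training set.

def route_predictions(predictions, confidence_threshold=0.8):
    """Split model output into auto-accepted labels and items
    that need human review."""
    auto, review = [], []
    for item in predictions:
        if item["confidence"] >= confidence_threshold:
            auto.append(item)
        else:
            review.append(item)
    return auto, review

# Example data a visual-annotation pipeline might produce:
predictions = [
    {"image": "frame_001.jpg", "label": "car", "confidence": 0.97},
    {"image": "frame_002.jpg", "label": "bicycle", "confidence": 0.55},
    {"image": "frame_003.jpg", "label": "person", "confidence": 0.91},
]

auto, review = route_predictions(predictions)
print(len(auto), len(review))  # 2 1
```

The human-corrected items can then be fed back into training, which is what makes the loop more than a one-shot labeling pass.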

Image Credits: Dataloop

The company was founded in 2017. It’ll use the new funding to grow its presence in the U.S. and European markets, something that’s pretty standard for Israeli startups, and build out its engineering team as well.

#artificial-intelligence, #ceo, #enterprise, #free-software, #ml, #ourcrowd, #python, #serverless-computing, #tc, #tel-aviv, #united-states

Grid AI raises $18.6M Series A to help AI researchers and engineers bring their models to production

Grid AI, a startup that aims to help machine learning engineers work more efficiently, founded by William Falcon, the inventor of the popular open-source PyTorch Lightning project, today announced that it has raised an $18.6 million Series A funding round, which closed earlier this summer. The round was led by Index Ventures, with participation from Bain Capital Ventures and firstminute.

Falcon co-founded the company with Luis Capelo, who was previously the head of machine learning at Glossier. Unsurprisingly, the idea here is to take PyTorch Lightning, which launched about a year ago, and turn that into the core of Grid’s service. The main idea behind Lightning is to decouple the data science from the engineering.

The team argues that a few years ago, when data scientists tried to get started with deep learning, they didn’t always have the right expertise and it was hard for them to get everything right.

“Now the industry has an unhealthy aversion to deep learning because of this,” Falcon noted. “Lightning and Grid embed all those tricks into the workflow so you no longer need to be a PhD in AI nor [have] the resources of the major AI companies to get these things to work. This makes the opportunity cost of putting a simple model against a sophisticated neural network a few hours’ worth of effort instead of the months it used to take. When you use Lightning and Grid it’s hard to make mistakes. It’s like if you take a bad photo with your phone but we are the phone and make that photo look super professional AND teach you how to get there on your own.”

As Falcon noted, Grid is meant to help data scientists and other ML professionals “scale to match the workloads required for enterprise use cases.” Lightning itself can get them partially there, but Grid is meant to provide all of the services its users need to scale up their models to solve real-world problems.

What exactly that looks like isn’t quite clear yet, though. “Imagine you can find any GitHub repository out there. You get a local copy on your laptop and without making any code changes you spin up 400 GPUs on AWS — all from your laptop using either a web app or command-line-interface. That’s the Lightning “magic” applied to training and building models at scale,” Falcon said. “It is what we are already known for and has proven to be such a successful paradigm shift that all the other frameworks like Keras or TensorFlow, and companies have taken notice and have started to modify what they do to try to match what we do.”

The service is now in private beta.

With this new funding, Grid, which currently has 25 employees, plans to expand its team and strengthen its corporate offering via both Grid AI and the open-source project. Falcon tells me that he aims to build a diverse team, not least because he himself is an immigrant, born in Venezuela, and a U.S. military veteran.

“I have first-hand knowledge of the extent that unethical AI can have,” he said. “As a result, we have approached hiring our current 25 employees across many backgrounds and experiences. We might be the first AI company that is not all the same Silicon Valley prototype tech-bro.”

“Lightning’s open-source traction piqued my interest when I first learned about it a year ago,” Index Ventures’ Sarah Cannon told me. “So intrigued in fact I remember rushing into a closet in Helsinki while at a conference to have the privacy needed to hear exactly what Will and Luis had built. I promptly called my colleague Bryan Offutt who met Will and Luis in SF and was impressed by the ‘elegance’ of their code. We swiftly decided to participate in their seed round, days later. We feel very privileged to be part of Grid’s journey. After investing in seed, we spent a significant amount of time with the team, and the more time we spent with them the more conviction we developed. Less than a year later and pre-launch, we knew we wanted to lead their Series A.”

#artificial-intelligence, #bain-capital-ventures, #cloud, #deep-learning, #developer, #enterprise, #free-software, #github, #grid-ai, #helsinki, #index-ventures, #machine-learning, #ml, #neural-network, #pytorch, #recent-funding, #sarah-cannon, #startups, #tc, #torch, #united-states, #venezuela, #web-app, #william-falcon

Kong launches Kong Konnect, its cloud-native connectivity platform

At its (virtual) Kong Summit 2020, API platform Kong today announced the launch of Kong Konnect, its managed end-to-end cloud-native connectivity platform. The idea here is to give businesses a single service that allows them to manage the connectivity between their APIs and microservices, and to help developers and operators manage their workflows across Kong’s API Gateway, Kubernetes Ingress and Kuma Service Mesh runtimes.

“It’s a universal control plane delivery cloud that’s consumption-based, where you can manage and orchestrate API gateway runtime, service mesh runtime, and Kubernetes Ingress controller runtime — and even Insomnia for design — all from one platform,” Kong CEO and co-founder Augusto ‘Aghi’ Marietti told me.

The new service is now in private beta and will become generally available in early 2021.

Image Credits: Kong

At the core of the platform is Kong’s new so-called ServiceHub, which provides that single pane of glass for managing a company’s services across the organization (and makes them accessible across teams, too).

As Marietti noted, organizations can choose which runtime they want to use and purchase only those capabilities of the service that they currently need. The platform also includes built-in monitoring tools and supports any cloud, Kubernetes provider or on-premises environment, as long as they are Kubernetes-based.

The idea here, too, is to make all these tools accessible to developers and not just architects and operators. “I think that’s a key advantage, too,” Marietti said. “We are lowering the barrier by making a connectivity technology easier to be used by the 50 million developers — not just by the architects that were doing big grand plans at a large company.”

To do this, Konnect will be available as a self-service platform, reducing the friction of adopting the service.

Image Credits: Kong

This is also part of the company’s grander plan to go beyond its core API management services. Those services aren’t going away, but they are now part of the larger Kong platform. With its open-source Kong API Gateway, the company built the pathway to get to this point, but that’s a stable product now and Kong is clearly expanding beyond it with this cloud connectivity play, which takes the company’s existing runtimes and combines them to provide a more comprehensive service.

“We have upgraded the vision of really becoming an end-to-end cloud connectivity company,” Marietti said. “Whether that’s API management or Kubernetes Ingress, […] or Kuma Service Mesh. It’s about connectivity problems. And so the company uplifted that solution to the enterprise.”

#api, #augusto-marietti, #cloud, #cloud-computing, #cloud-infrastructure, #cloud-native-computing-foundation, #computing, #controller, #developer, #enterprise, #free-software, #kong, #kubernetes, #microservices, #openshift, #web-services

Pixie Labs raises $9.15M Series A round for its Kubernetes observability platform

Pixie, a startup that provides developers with tools to get observability into their Kubernetes-native applications, today announced that it has raised a $9.15 million Series A round led by Benchmark, with participation from GV. In addition, the company also today said that its service is now available as a public beta.

The company was co-founded by Zain Asgar (CEO), a former Google engineer working on Google AI and adjunct professor at Stanford, and Ishan Mukherjee (CPO), who led Apple’s Siri Knowledge Graph product team and also previously worked on Amazon’s Robotics efforts. Asgar had originally joined Benchmark to work on developer tools for machine learning. Over time, the idea changed to using machine learning to power tools to help developers manage large-scale deployments instead.

“We saw data systems, this move to the edge, and we felt like this old cloud 1.0 model of manually collecting data and shipping it to databases in the cloud seems pretty inefficient,” Mukherjee explained. “And the other part was: I was on call. I got gray hair and all that stuff. We felt like we could build this new generation of developer tools and get to Michael Jordan’s vision of intelligent augmentation, which is giving creatives tools where they can be a lot more productive.”

Image Credits: Pixie

The team argues that most competing monitoring and observability systems focus on operators and IT teams — and often involve a long manual setup process. But Pixie wants to automate most of this manual process and build a tool that developers want to use.

Pixie runs inside a developer’s Kubernetes platform and developers get instant and automatic visibility into their production environments. With Pixie, which the team is making available as a freemium SaaS product, there is no instrumentation to install. Instead, the team uses relatively new Linux kernel techniques like eBPF to collect data right at the source.
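eBPF lets a tool attach small programs to kernel hooks and read events where they happen, without changing application code. As a rough standalone illustration of collecting data "right at the source" (this is not Pixie's implementation; it requires root and the `bpftrace` tool), a one-liner can trace every file open across the system:

```shell
# Trace openat() syscalls system-wide and print which process opened
# which file. The data is gathered in-kernel via eBPF, with no
# instrumentation added to the applications themselves.
# (Illustrative only; needs root and bpftrace installed.)
sudo bpftrace -e 'tracepoint:syscalls:sys_enter_openat { printf("%s opened %s\n", comm, str(args->filename)); }'
```

Pixie applies the same kernel-level approach to application telemetry such as requests and latencies, which is why no code changes are needed to start collecting data.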

“One of the really cool things about this is that we can deploy Pixie in about a minute and you’ll instantly get data,” said Asgar. “Our goal here is that this really helps you when there are cases where you don’t want your business logic to be full of monitoring code, especially if you forget something — when you have an outage.”

Image Credits: Pixie

At the core of the developer experience is what the company calls “Pixie scripts.” Using a Python-like language (PxL), developers can codify their debugging workflows. The company’s system already features a number of scripts written by the team itself and the community at large. But as Asgar noted, not every user will write scripts. “The way scripts work, it’s supposed to capture human knowledge in that problem. We don’t expect the average user — or even the way above average developer — ever to touch a script or write one. They’re just going to use it in a specific scenario,” he explained.
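PxL itself is Python-like. As a loose sketch in plain Python of what "codifying a debugging workflow" means (the record fields, threshold, and function name here are invented for illustration and are not Pixie's actual API), a script is essentially a reusable query over telemetry:

```python
# Hypothetical illustration of a codified debugging workflow, written in
# plain Python rather than Pixie's actual PxL syntax. The telemetry
# records and their fields are made up for the example.

def slow_requests(records, threshold_ms=250):
    """Return the worst latency per service among requests over a threshold."""
    slow = [r for r in records if r["latency_ms"] > threshold_ms]
    by_service = {}
    for r in slow:
        by_service.setdefault(r["service"], []).append(r["latency_ms"])
    # Summarize to one number per service so an on-call engineer can see
    # at a glance where the outliers are.
    return {svc: max(lats) for svc, lats in by_service.items()}

requests = [
    {"service": "checkout", "latency_ms": 120},
    {"service": "checkout", "latency_ms": 480},
    {"service": "auth", "latency_ms": 90},
    {"service": "catalog", "latency_ms": 310},
]
print(slow_requests(requests))  # {'checkout': 480, 'catalog': 310}
```

Once such a workflow is captured as a script, other users can run it in their own clusters without rediscovering the debugging steps themselves — which is the knowledge-capture point Asgar describes.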

Looking ahead, the team plans to make these scripts and the scripting language more robust and usable, allowing developers to go from passively monitoring their systems to writing scripts that actively take action on their clusters based on the monitoring data the system collects.

“Zain and Ishan’s provocative idea was to move software monitoring to the source,” said Eric Vishria, General Partner at Benchmark. “Pixie enables engineering teams to fundamentally rethink their monitoring strategy as it presents a vision of the future where we detect anomalous behavior and make operational decisions inside the infrastructure layer itself. This allows companies of all sizes to monitor their digital experiences in a more responsive, cost-effective and scalable manner.”


JupiterOne raises $19M Series A to automate cyber asset management

Asset management might not be the most exciting topic, but it’s often an overlooked area of cyber-defense. Knowing exactly what assets your company has makes it easier to see where the security weak spots are.

That’s the problem JupiterOne is trying to fix.

“We built JupiterOne because we saw a gap in how organizations manage the security and compliance of their cyber assets day to day,” said Erkang Zheng, the company’s founder and chief executive.

The Morrisville, N.C.-based startup, which spun out from healthcare cloud firm LifeOmic in 2018, helps companies see all of their digital and cloud assets by integrating with dozens of services and tools, including Amazon Web Services, Cloudflare, and GitLab, and centralizing the results into a single monitoring tool.

JupiterOne says this makes it easier for companies to spot security issues and maintain compliance, with the aim of preventing security lapses and data breaches by catching problems early on.

The company already has Reddit, Databricks and Auth0 as customers, and just secured $19 million in its Series A, led by Bain Capital Ventures and with participation from Rain Capital and its parent company LifeOmic.

As part of the deal, Bain partner Enrique Salem will join JupiterOne’s board. “We see a large multibillion dollar market opportunity for this technology across mid-market and enterprise customers,” he said. Asset management is slated to be an $8.5 billion market by 2024.

Zheng told TechCrunch the company plans to use the funds to accelerate its engineering efforts and its go-to-market strategy, with new product features to come.


SUSE contributes EiriniX to the Cloud Foundry Foundation

SUSE today announced that it has contributed EiriniX, a framework for building extensions for Eirini, a technology that brings support for Kubernetes-based container orchestration to the Cloud Foundry platform-as-a-service project.

About a year ago, SUSE also contributed the KubeCF project to the foundation, which itself allows the Cloud Foundry Application Runtime — the core of Cloud Foundry — to run on top of Kubernetes.

Image Credits: SUSE

“At SUSE we are developing upstream first as much as possible,” said Thomas Di Giacomo, president of Engineering and Innovation at SUSE. “So, after experiencing the value of contributing