Under pressure from employees and supporters of reproductive rights, the company announced privacy changes for the post-Roe era.
More than a billion people worldwide have signed up for Google accounts, clicking through screens promising that “your personal info is private and safe.” This week, Google’s sign-up process came under fire when European Union consumer rights groups issued new privacy complaints suggesting that the opposite is true—that Google intentionally designs default settings to deceive new users into granting permissions to harvest and share a broad swath of personal info.
“The language Google uses at every step of the registration process is unclear, incomplete, and misleading,” the European consumer organization BEUC told Reuters. BEUC is helping to coordinate a potential civil lawsuit in Germany and several new complaints to data-protection authorities from consumer rights groups in France, Greece, the Czech Republic, Norway, and Slovenia.
The key issue in these complaints is how hard Google makes it for account users to choose privacy-friendly options. It’s much easier, the consumer groups argue, to set up an account to share personal info than to protect it. As TechCrunch reported, Google designed a one-click “express personalization” option that enables data tracking, while “manual personalization” requires 10 clicks to turn tracking off.
There is so much digital information about us out there that we can’t possibly control it all.
Financial companies collect a lot of payment data from customers. Prosecutors could subpoena those records for evidence of abortion, legal experts say.
Some have offered to cover travel and other expenses for employees, and are working to ease worker fears about safety and confidentiality.
A tracking tool installed on many hospitals’ websites has been collecting patients’ sensitive health information—including details about their medical conditions, prescriptions, and doctor’s appointments—and sending it to Facebook.
The Markup tested the websites of Newsweek’s top 100 hospitals in America. On 33 of them we found the tracker, called the Meta Pixel, sending Facebook a packet of data whenever a person clicked a button to schedule a doctor’s appointment. The data is connected to an IP address—an identifier that’s like a computer’s mailing address and can generally be linked to a specific individual or household—creating an intimate receipt of the appointment request for Facebook.
Canadian investigators determined that users of the Tim Hortons coffee chain’s mobile app “had their movements tracked and recorded every few minutes of every day,” even when the app wasn’t open, in violation of the country’s privacy laws.
“The Tim Hortons app asked for permission to access the mobile device’s geolocation functions but misled many users to believe information would only be accessed when the app was in use. In reality, the app tracked users as long as the device was on, continually collecting their location data,” according to an announcement Wednesday by Canada’s Office of the Privacy Commissioner. The federal office collaborated with provincial authorities in Quebec, British Columbia, and Alberta in the investigation of Tim Hortons.
“The app also used location data to infer where users lived, where they worked, and whether they were traveling,” the Office of the Privacy Commissioner said. “It generated an ‘event’ every time users entered or left a Tim Hortons competitor, a major sports venue, or their home or workplace.”
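The geofence “events” the Commissioner describes can be sketched as a toy simulation. Everything below is invented for illustration (the place names, coordinates, and radius); only the enter/exit labeling idea comes from the report.

```python
from math import hypot

PLACES = {"home": (0.0, 0.0), "work": (5.0, 5.0)}  # made-up coordinates
RADIUS = 0.5  # made-up geofence radius

def inside(point, center, radius=RADIUS):
    """True if a location sample falls within a place's geofence."""
    return hypot(point[0] - center[0], point[1] - center[1]) <= radius

def events(track):
    """Emit (place, 'enter'|'exit') events as a location track crosses
    geofences, roughly the way the report describes movements being labeled."""
    out = []
    current = {name: False for name in PLACES}  # is the user inside each place?
    for point in track:
        for name, center in PLACES.items():
            now = inside(point, center)
            if now and not current[name]:
                out.append((name, "enter"))
            elif current[name] and not now:
                out.append((name, "exit"))
            current[name] = now
    return out
```

A track that starts at home, moves away, and arrives at work yields an enter/exit pair for home and an enter for work.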
In the absence of federal privacy legislation, the state’s law is considered among the nation’s strongest.
The online system is broken. How do we fix it?
PimEyes is a paid service that finds photos of a person from across the internet, including some the person may not want exposed. “We’re just a tool provider,” its owner said.
Nations are accelerating efforts to control data produced within their perimeters, disrupting the flow of what has become a kind of digital currency.
In a post-Roe America, we’ll bear the costs of letting data collection undermine our liberty.
America still doesn’t have a federal data privacy law. But look what we have found — hope!
The confirmation of a third Democrat creates an opportunity for Lina Khan, the Federal Trade Commission’s chair, to advance efforts to rein in corporate power.
An abrupt and profound change to an established law is utterly destructive to respect for the law.
Federal privacy bills, security legislation, and antitrust measures to address the power of the tech giants have all failed to advance in Congress, despite hand-wringing and shows of bipartisan support.
Technology advances forced the Census Bureau to use sweeping measures to ensure privacy for respondents. The ensuing debate goes to the heart of what a census is.
Last year, Apple enacted App Tracking Transparency, a mandatory policy that forbids app makers from tracking user activity across other apps without first receiving those users’ explicit permission. Privacy advocates praised the initiative, and Facebook warned it would spell certain doom for companies that rely on targeted advertising. However, research published last week suggests that ATT, as it’s usually abbreviated, doesn’t always curb the surreptitious collection of personal data or the fingerprinting of users.
At the heart of ATT is the requirement that users must tap an “allow” button that appears when an app is installed. The prompt asks: “Allow [app] to track your activity across other companies’ apps and websites?” Without that consent, the app can’t access the so-called IDFA (Identifier for Advertisers), a unique identifier that iOS or iPadOS assigns to each device so advertisers can track users across other installed apps. At the same time, Apple also began requiring app makers to provide “privacy nutrition labels” declaring the types of user and device data they collect and how that data is used.
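Conceptually, the consent gate works like the toy model below. The class and method names are illustrative Python, not Apple’s actual API (on iOS the real calls live in Swift/Objective-C frameworks); the all-zero fallback identifier, however, matches what iOS returns when tracking is not authorized.

```python
import uuid

class Device:
    """Toy model of the ATT gate: each install gets one advertising
    identifier, but an app only sees it after the user taps 'allow'."""

    ZERO_IDFA = "00000000-0000-0000-0000-000000000000"

    def __init__(self):
        self._idfa = str(uuid.uuid4())  # one identifier per device
        self._consent = {}              # app name -> bool

    def request_tracking_authorization(self, app, user_allows):
        # In the real flow, the OS shows the
        # "Allow [app] to track your activity..." prompt here.
        self._consent[app] = user_allows
        return user_allows

    def identifier_for_advertisers(self, app):
        # Without consent the app receives an all-zero identifier,
        # which is useless for cross-app tracking.
        if self._consent.get(app):
            return self._idfa
        return self.ZERO_IDFA
```

Before consent, every app sees the same zeroed identifier; after an explicit “allow,” the app can read the device’s real IDFA.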
Loopholes, bypasses, and outright violations
Last week’s research paper found that while ATT in many ways works as intended, loopholes in the framework also give companies, particularly large ones like Google and Facebook, the opportunity to work around the protections and stockpile even more data. The paper also warned that despite Apple’s promise of more transparency, ATT might give many users a false sense of security.
The law professor Amy Gajda writes about the tug of war between the right to know and the right to be let alone.
Apple CEO Tim Cook took to the stage at the annual International Association of Privacy Professionals (IAPP) conference on Tuesday to talk about privacy, security, ad tracking, and sideloading.
Calling privacy “one of the most essential battles of our time,” Cook lambasted companies that monetize large-scale user data collection, comparing them to real-world stalkers.
By contrast, he claimed that Apple maintains “a commitment to protecting people from a data industrial complex built on a foundation of surveillance.” To vigorous applause from the audience of privacy professionals, he voiced his support for US privacy regulations akin to those passed in Europe in recent years.
The wireless carrier said that it was working with the F.B.I. and the Secret Service to investigate a recent wave of fraudulent messages, though the source did not appear to be Russian hackers.
A new bill in Congress says anonymity can be preserved.
Fifty-four families volunteered to share data on everything from sleeping habits to trash volume to help developers make a city from scratch in Busan.
Digital copies of licenses and state-issued IDs, which can also be stored on Apple Watches, will only work at security checkpoints at a Phoenix airport for now, officials said.
Two journalists dive into George Floyd’s life and family; Viola Davis reflects on her career; a historian explores the brutal underpinnings of the British Empire; and more.
The database used by the New York Police Department violates state law and the Constitution, the Legal Aid Society contends in a lawsuit.
A new state agency has a $10 million budget to regulate Google, Facebook and others. But first it needs to be created.
Some websites just can’t take “no” for an answer. Instead of respecting visitors’ choice to block third-party cookies—the identifiers that track browsing activity as a user moves from site to site—they find sneaky ways to bypass those settings. Now, makers of the Brave browser are taking action.
Earlier this week, Brave Nightly—the testing and development version of the browser—rolled out a feature designed to prevent what’s known as bounce tracking. The feature, called unlinkable bouncing, will reach general release in Brave version 1.37, slated for March 29.
Bounce tracking is one of the key ways websites circumvent third-party cookie blocking. When a browser prevents a website such as site.example from loading a third-party tracking cookie from a domain such as tracker.example, site.example pulls a fast one. When site.example detects that the tracker.example cookie can’t be set, it instead redirects the browser to the tracker.example site, sets a cookie from that domain, and then redirects back to the original page or a new destination.
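The redirect dance described above can be sketched as a small simulation. The domain names (site.example, tracker.example) come from the article; the URL shapes, cookie value, and function names are illustrative assumptions, not any real tracker’s code.

```python
def handle_request(url, cookies, third_party_blocked):
    """Return (next_url, cookies): next_url is a redirect target,
    or None if the page renders for the user."""
    if url.startswith("https://site.example/"):
        if third_party_blocked and "tracker.example" not in cookies:
            # The tracker's third-party cookie was blocked, so the site
            # bounces the browser through the tracker's own domain,
            # carrying the intended destination along.
            return "https://tracker.example/bounce?dest=" + url, cookies
        return None, cookies  # page loads normally
    if url.startswith("https://tracker.example/bounce"):
        # Visited as a first-party destination, tracker.example can now
        # set its cookie, then redirect straight back to the original page.
        cookies = dict(cookies, **{"tracker.example": "id=abc123"})
        return url.split("dest=", 1)[1], cookies
    return None, cookies

def navigate(start_url, cookies, third_party_blocked=True):
    """Follow redirects until a page actually renders."""
    url = start_url
    while True:
        next_url, cookies = handle_request(url, cookies, third_party_blocked)
        if next_url is None:
            return url, cookies
        url = next_url
```

The net effect: the user ends up on the page they asked for, but the browser has made a first-party visit to the tracker along the way and now carries its identifying cookie.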
Kurbo by WW, a weight loss app geared toward children, illegally collected data from users as young as 8 without their parents’ consent, the Federal Trade Commission said in a complaint.
An intrusive sweep that has spanned several provinces risks alienating Afghans and fueling the insurgency the new government is trying to stop.
A New York Times investigation reveals how Israel reaped diplomatic gains around the world from NSO’s Pegasus spyware — a tool America itself purchased but is now trying to ban.
Customs officials aim to save time and increase security by ramping up the use of facial recognition. But what about privacy? A biometrics specialist weighs in.
An official investigation refuted claims that the police had illegally hacked dozens of civilians using spyware from NSO Group, an Israeli company that has long attracted global scrutiny.
The agency, dealing with controversy over its decision to use facial recognition software, said it would allow taxpayers to authenticate their accounts with a live, virtual interview.
The sealing of the case against Hadley Palmer, 53, who must register as a sex offender for secretly filming minors, has revived questions about judicial favoritism for the wealthy. Lawyers say they were trying to protect her victims.
California lawmakers plan to introduce a new bill to protect children’s data online this Thursday, mirroring the UK’s recently introduced children’s code, as part of growing momentum globally for stricter regulation on Big Tech.
The California age-appropriate design-code bill will require many of the world’s biggest tech platforms headquartered in the state—such as social media group Meta and Google’s YouTube—to limit the amount of data they collect from young users and the location tracking of children in the state.
If passed into law, it will also place restrictions on profiling younger users for targeted advertising, mandate the introduction of “age-appropriate” content policies, and ban serving up behavioral nudges that might trick them into weakening their privacy protections.
It says it will give other companies plenty of time to adapt to changes to its Android software. Similar changes made by Apple affected big internet companies.
Hadley Palmer, 53, pleaded guilty to making images of children in intimate situations at her home in Belle Haven, a wealthy Connecticut enclave. The court record was sealed over an A.P. reporter’s objection.
A vast location-tracking network is being built around us so we don’t lose our keys: One couple’s adventures in the consumer tech surveillance state.
After Apple made it harder to track people on the internet, even tech giants felt the effects.
Jack Sweeney, a freshman at the University of Central Florida, said that Mr. Musk raised privacy and security concerns about his popular Twitter account, @ElonJet.