The bill was sent to the governor despite dividing Republicans. Some of them said the measure was too restrictive; others objected to limited exceptions for rape and incest.
But a 2021 law that bans abortions after about six weeks of pregnancy is still valid in the state.
Progressive groups and medical organizations have adopted inclusive language, which has led to terms like “pregnant people” and “chestfeeding.”
In the absence of federal privacy legislation, the state’s law is considered among the nation’s strongest.
But the court said Gov. Greg Abbott did not have the authority to order such investigations, acknowledging they could cause “irreparable harm.”
Mr. Depp sued his ex-wife, Ms. Heard, for an op-ed she wrote in The Washington Post referring to herself as a “public figure representing domestic abuse.”
Legislators also approved a measure restricting discussion of L.G.B.T.Q. issues in public schools.
Jurors in the civil case found that the Denver Police used excessive force against the 12 plaintiffs.
In a court hearing, a supervisor testified that the state agency responsible for child welfare was told to investigate parents of transgender children without exception.
One family was affected by the decision, but Gov. Greg Abbott’s order to investigate certain medically accepted treatments as child abuse is still in place.
The investigations by the state’s Department of Family and Protective Services started last week with an employee of the agency, according to the suit, after Gov. Greg Abbott called for such inquiries.
The voting rights decision is further evidence of an impatient conservative majority.
For months, the jail’s doctor has promoted the drug, which health experts say should not be used to treat or prevent Covid-19.
A Texas statute that bans abortion after six weeks of pregnancy was seemingly undercut by two court rulings, but the reality on the ground has not changed.
The Supreme Court will decide whether Boston, which allows many kinds of groups to raise flags outside its City Hall, can reject one bearing the Latin cross.
The state court order, which The Times said it would immediately oppose, raised concerns from First Amendment advocates.
“That’s not going to happen,” the president said about proposed compensation for a Trump administration policy that divided thousands of parents and children.
The things that are driving us apart are going to keep driving us apart.
The A.C.L.U. said that the Jefferson Parish Sheriff’s Office had opened an investigation into the violent encounter. The woman was treated at a hospital, and no charges were filed against her.
The organization acknowledged that changing references from women to people was a mistake — albeit a well-intentioned one.
A bad tweet reveals a rift over gender and reproduction.
Forty years after her appointment as the court’s first female justice, it’s worth reflecting on the path she took.
The justices will soon consider whether to hear a case arguing that the First Amendment requires disclosure of a secret court’s major rulings.
From vaccine mandates to religious liberty, your allies often matter more than your ideology.
According to a transcript released Friday, a member of the Minnesota State Patrol said there was “a purge of emails and text messages” after troopers responded to protests in Minneapolis last year.
Hello friends, and welcome back to Week in Review.
Last week, we dove into the truly bizarre machinations of the NFT market. This week, we’re talking about something with a bit more impact on the current state of the web — Apple’s NeuralHash kerfuffle.
the big thing
In the past month, Apple did something it generally has done an exceptional job avoiding — the company made what seemed to be an entirely unforced error.
In early August — seemingly out of nowhere** — the company announced that by the end of the year it would roll out a technology called NeuralHash that would actively scan the libraries of all iCloud Photos users, seeking out image hashes that matched known images of child sexual abuse material (CSAM). For obvious reasons, the on-device scanning could not be opted out of.
The announcement was not coordinated with other major consumer tech giants; Apple pushed forward on it alone.
Researchers and advocacy groups had almost uniformly negative feedback for the effort, raising concerns that it could create new channels of abuse for actors like governments seeking to detect on-device information they regard as objectionable. As my colleague Zach noted in a recent story, “The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.”
(The announcement also reportedly generated some controversy inside of Apple.)
The issue — of course — wasn’t that Apple was looking for ways to prevent the proliferation of CSAM while making as few device security concessions as possible. The issue was that Apple was unilaterally making a massive choice that would affect billions of customers (while likely pushing competitors toward similar solutions), and was doing so without external public input about possible ramifications or necessary safeguards.
Long story short: over the past month, researchers discovered Apple’s NeuralHash wasn’t as airtight as hoped, and the company announced Friday that it was delaying the rollout “to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Having spent several years in the tech media, I will say that the only reason to release news on a Friday morning ahead of a long weekend is to ensure that the announcement is read and seen by as few people as possible, and it’s clear why Apple would want that. It’s a major embarrassment, and as with any delayed rollout of this kind, it’s a sign that the internal teams weren’t adequately prepared and lacked the ideological diversity to gauge the scope of the issue they were tackling. This isn’t really a dig at the team building the feature so much as a dig at Apple for trying to solve a problem like this inside the Apple Park vacuum while adhering to its annual iOS release schedule.
Apple is increasingly looking to make privacy a key selling point for the iOS ecosystem, and as a result of this productization, has pushed development of privacy-centric features towards the same secrecy its surface-level design changes command. In June, Apple announced iCloud+ and raised some eyebrows when they shared that certain new privacy-centric features would only be available to iPhone users who paid for additional subscription services.
You obviously can’t tap public opinion for every product update, but perhaps wide-ranging and trail-blazing security and privacy features should be treated a bit differently than the average product update. Apple’s lack of engagement with research and advocacy groups on NeuralHash was pretty egregious and certainly raises some questions about whether the company fully respects how the choices they make for iOS affect the broader internet.
Delaying the feature’s rollout is a good thing, but let’s all hope they take that time to reflect more broadly as well.
** Though the announcement was a surprise to many, Apple’s development of this feature didn’t come completely out of nowhere. Those at the top of Apple likely felt that the winds of global tech regulation might be shifting toward outright bans of some methods of encryption in some of its biggest markets.
Back in October of 2020, then United States AG Bill Barr joined representatives from the UK, New Zealand, Australia, Canada, India and Japan in signing a letter raising major concerns about how implementations of encryption tech posed “significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children.” The letter effectively called on tech industry companies to get creative in how they tackled this problem.
Here are the TechCrunch news stories that especially caught my eye this week:
LinkedIn kills Stories
You may be shocked to hear that LinkedIn even had a Stories-like product on their platform, but if you did already know that they were testing Stories, you likely won’t be so surprised to hear that the test didn’t pan out too well. The company announced this week that they’ll be suspending the feature at the end of the month. RIP.
FAA grounds Virgin Galactic over questions about Branson flight
While all appeared to go swimmingly for Richard Branson’s trip to space last month, the FAA has some questions regarding why the flight seemed to unexpectedly veer so far off the cleared route. The FAA is preventing the company from further launches until they find out what the deal is.
Apple buys a classical music streaming service
While Spotify makes news every month or two for spending a massive amount acquiring a popular podcast, Apple seems to have eyes on a different market for Apple Music, announcing this week that they’re bringing the classical music streaming service Primephonic onto the Apple Music team.
TikTok parent company buys a VR startup
It isn’t a huge secret that ByteDance and Facebook have been trying to copy each other’s success at times, but many probably weren’t expecting TikTok’s parent company to wander into the virtual reality game. The Chinese company bought the startup Pico, which makes consumer VR headsets for China and enterprise VR products for North American customers.
Twitter tests an anti-abuse ‘Safety Mode’
The same features that make Twitter an incredibly cool product for some users can also make the experience awful for others, a realization that Twitter has seemingly been very slow to make. Its latest solution is more individual user controls, which Twitter is testing out with a new “Safety Mode” that pairs algorithmic intelligence with new user inputs.
Some of my favorite reads from our Extra Crunch subscription service this week:
Our favorite startups from YC’s Demo Day, Part 1
“Y Combinator kicked off its fourth-ever virtual Demo Day today, revealing the first half of its nearly 400-company batch. The presentation, YC’s biggest yet, offers a snapshot into where innovation is heading, from not-so-simple seaweed to a Clearco for creators….”
“…Yesterday, the TechCrunch team covered the first half of this batch, as well as the startups with one-minute pitches that stood out to us. We even podcasted about it! Today, we’re doing it all over again. Here’s our full list of all startups that presented on the record today, and below, you’ll find our votes for the best Y Combinator pitches of Day Two. The ones that, as people who sift through a few hundred pitches a day, made us go ‘oh wait, what’s this?’”
All the reasons why you should launch a credit card
“… if your company somehow hasn’t yet found its way to launch a debit or credit card, we have good news: It’s easier than ever to do so and there’s actual money to be made. Just know that if you do, you’ve got plenty of competition and that actual customer usage will probably depend on how sticky your service is and how valuable the rewards are that you offer to your most active users….”
Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology, which it chaotically announced last month, citing feedback from customers and policy groups.
That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.
In a statement on Friday morning, Apple told TechCrunch:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple’s so-called NeuralHash technology is designed to identify known CSAM on a user’s device without Apple having to possess the image or know its contents. Because a user’s photos stored in iCloud are end-to-end encrypted so that even Apple can’t access the data, NeuralHash instead scans for known CSAM on a user’s device, which Apple claims is more privacy-friendly than the current blanket scanning that cloud providers use.
But security experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, like governments, to implicate innocent victims or to manipulate the system to detect other materials that authoritarian nation states find objectionable.
Within a few weeks of announcing the technology, researchers said they were able to create “hash collisions” using NeuralHash, effectively tricking the system into thinking two entirely different images were the same.
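As a rough intuition for why hash collisions matter here, the sketch below implements a deliberately simple perceptual hash in Python. It bears no resemblance to Apple’s actual NeuralHash model; the function names, pixel values, and the collision itself are all illustrative assumptions, meant only to show how two visibly different inputs can produce identical hashes and trigger a false match.

```python
# Toy perceptual-hash sketch (illustrative only — NOT Apple's NeuralHash).

def average_hash(pixels):
    """Hash a flat list of grayscale pixel values: one bit per pixel,
    set when that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two clearly different "images" whose pixels fall on the same side of
# their respective means — so they collide under this toy hash.
img_a = [10, 200, 10, 200]   # high-contrast image
img_b = [90, 110, 90, 110]   # low-contrast image
assert hamming(average_hash(img_a), average_hash(img_b)) == 0

# Matching works by comparing hashes against a database of known hashes,
# which is exactly why collisions produce false positives.
known_hashes = {average_hash(img_a)}
print(average_hash(img_b) in known_hashes)  # prints: True
```

The hash collisions researchers demonstrated against NeuralHash follow the same pattern, only against a learned model rather than a mean-brightness rule: craft an input whose hash lands on a target value, and the matcher cannot tell it apart from the real thing.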
iOS 15 is expected out in the next few weeks.
- Apple confirms it will begin scanning iCloud Photos for child abuse images
- Apple’s CSAM detection tech is under fire — again
- Apple details child abuse detection and Messages safety features
- iOS 15 will warn parents and children about sexually explicit photos in Messages
The decision came in response to an American Civil Liberties Union challenge to a first-in-the-nation law enacted by Republican state legislators in April.
A late bloomer in law school, he founded a constitutional rights clinic to guard against government overreach and was a longtime general counsel at the A.C.L.U.
The judge said that federal officials had immunity, and that claims of a conspiracy to clear Lafayette Square for President Donald J. Trump’s walk across it were “simply too speculative.”
According to BuzzFeed News, Democratic Senator Ron Wyden and Representative Ted Lieu will introduce legislation later today that seeks to restrict police use of international mobile subscriber identity (IMSI) catchers. More commonly known as Stingrays, IMSI catchers and cell-site simulators are frequently used by police to collect information on suspects and intercept calls, SMS messages and other forms of communication. Law enforcement agencies in the US are not currently required to obtain a warrant to use the technology. The Cell-Site Simulator Act of 2021 seeks to change that.
IMSI catchers mimic cell towers to trick mobile phones into connecting with them. Once connected, they can collect data a device sends out, including its location and subscriber identity key. Cell-site simulators pose a two-fold problem.
The first is that they’re blunt instruments of surveillance. When used in a populated area, IMSI catchers can collect data from bystanders. The second is that they can pose a safety risk to the public. While IMSI catchers act like a cell tower, they don’t function as one: they can’t transfer calls to a public wireless network, so they can prevent a phone from connecting to 9-1-1. Despite the dangers they pose, their use is widespread. In 2018, the American Civil Liberties Union found at least 75 agencies in 27 states and the District of Columbia owned IMSI catchers.
In trying to address those concerns, the proposed legislation would require law enforcement agencies to make a case before a judge for why they should be allowed to use the technology. They would also need to explain why other surveillance methods wouldn’t be as effective. Moreover, it seeks to ensure those agencies delete any data they collect from people not listed on a warrant.
Although the bill reportedly doesn’t lay out a time limit on IMSI catcher use, it does push agencies to use the devices for the least amount of time possible. It also details exceptions where police could use the technology without a warrant. For instance, it would leave the door open for law enforcement to use the devices in contexts like bomb threats where an IMSI catcher can prevent a remote detonation.
“Our bipartisan bill ends the secrecy and uncertainty around Stingrays and other cell-site simulators and replaces it with clear, transparent rules for when the government can use these invasive surveillance devices,” Senator Ron Wyden told BuzzFeed News.
The bill has support from some Republicans. Senator Steve Daines of Montana and Representative Tom McClintock of California are co-sponsoring the proposed legislation. Organizations like the Electronic Frontier Foundation and the Electronic Privacy Information Center have also endorsed the bill.
This article was originally published on Engadget.
Yes, the civil liberties group is divided. What else is new?
The justices had been asked to decide whether one of the last sex-based distinctions in federal law should survive now that women can serve in combat.
An organization that has defended the First Amendment rights of Nazis and the Ku Klux Klan is split by an internal debate over whether supporting progressive causes is more important.
Rental applications will no longer ask about criminal convictions, and landlords’ use of background checks will be limited.
Many of the decisions of the Foreign Intelligence Surveillance Court have not seen the light of day. That’s irreconcilable with the Constitution.
As a senator, he backed legislation that led to mass incarceration and long drug sentences. Now his administration is signaling that he could use clemency to address inequities.
A.C.L.U. attorney Chase Strangio on the coordinated strategy behind the more than 100 anti-transgender bills introduced this year.
Boston Dynamics’ robot “dogs,” or similar versions thereof, are already being employed by police departments in Hawaii, Massachusetts and New York. Operating partly under the veil of experimentation, these police forces have given few answers about the benefits and costs of using these powerful surveillance devices.
The American Civil Liberties Union, in a position paper on CCOPS (community control over police surveillance), proposes an act to promote transparency and protect civil rights and liberties with respect to surveillance technology. To date, 19 U.S. cities have passed CCOPS laws, which means, in practical terms, that virtually all other communities have no requirement that police be transparent about their use of surveillance technologies.
For many, this ability to use new, unproven technologies in a broad range of ways presents a real danger. Stuart Watt, a world-renowned expert in artificial intelligence and the CTO of Turalt, is not amused.
Even seemingly fun and harmless “toys” have all the necessary functions and features to be weaponized.
“I am appalled both by the principle of the dogbots and by them in practice. It’s a big waste of money and a distraction from actual police work,” he said. “Definitely communities need to be engaged with. I am honestly not even sure what the police forces think the whole point is. Is it to discourage through a physical surveillance system, or is it to actually prepare people for some kind of enforcement down the line?
“Chunks of law enforcement have forgotten the whole ‘protect and serve’ thing, and do neither,” Watt added. “If they could use artificial intelligence to actually protect and actually serve vulnerable people, the homeless, folks addicted to drugs, sex workers, those in poverty and maligned minorities, it’d be tons better. If they have to spend the money on AI, spend it to help people.”
The ACLU is advocating exactly what Watt suggests. In proposed language to city councils across the nation, the ACLU makes it clear that:
The City Council shall only approve a request to fund, acquire, or use a surveillance technology if it determines the benefits of the surveillance technology outweigh its costs, that the proposal will safeguard civil liberties and civil rights, and that the uses and deployment of the surveillance technology will not be based upon discriminatory or viewpoint-based factors or have a disparate impact on any community or group.
From a legal perspective, Anthony Gualano, a lawyer and special counsel at Team Law, believes that CCOPS legislation makes sense on many levels.
“As police increase their use of surveillance technologies in communities around the nation, and the technologies they use become more powerful and effective to protect people, legislation requiring transparency becomes necessary to check what technologies are being used and how they are being used.”
For those not only worried about this Boston Dynamics dog, but all future incarnations of this supertech canine, the current legal climate is problematic because it essentially allows our communities to be testing grounds for Big Tech and Big Government to find new ways to engage.
Just last month, public pressure forced the New York Police Department to suspend use of a robotic dog, quite unassumingly named Digidog. The NYPD had deployed the tech hound at a public housing building in March, and that went over about as well as you could expect, placing the device on temporary leave and leading to discussions as to the immediate fate of this technology in New York.
The New York Times phrased it perfectly, observing that “the NYPD will return the device earlier than planned after critics seized on it as a dystopian example of overly aggressive policing.”
While these bionic dogs are powerful enough to take a bite out of crime, the police forces seeking to use them have a lot of public relations work to do first. A great place to begin would be for the police to actively and positively participate in CCOPS discussions, explaining what the technology involves, and how it (and these robots) will be used tomorrow, next month and potentially years from now.
Lawyers’ request to conduct additional DNA testing before Ledell Lee was executed had been denied.
Some parents are opting to try, once again, the dangerous trek north to rejoin the children who were kept in the U.S. by the Trump administration.
A handful of parents from Mexico and Central America who were deported under the Trump administration’s family separation policy will be reunited this week with their children.
The justices struggled to determine how the First Amendment applies to public schools’ power to punish students for social media posts and other off-campus speech.
If you live in a community with a homeowners association, chances are good that you may be limited to just the Stars and Stripes.
Deployed at a public housing building, the device drew condemnation as a stark example of police power and misplaced priorities.
Advocates and Biden administration officials say finding their families is difficult with each passing day.
The plaintiff, a Lebanese-American, says he was barred from flying after refusing to become an F.B.I. informant.
The episode highlighted how murky new guidelines toward past use are, particularly for a White House that has pledged to embrace progressive positions.
There have been 25 state bills introduced this year that would prohibit transgender athletes, mainly women, from competing on teams matching their gender identity. Here’s what we know.
Activists are pushing President Biden to uphold a campaign promise to add the nonbinary designation “X” to federal IDs.