Okay, Google: To protect women, collect less data about everyone
In post-Roe America, Google searches and location records can be evidence of a crime. Here are four ways Google should protect civil rights in its products now.
Perspective by Geoffrey A. Fowler, Columnist
Updated July 1, 2022 at 5:55 p.m. EDT | Published July 1, 2022 at 8:00 a.m. EDT
We the users want Google to delete our data. Our rights depend on it.
This is a moment I’ve long worried would arrive. The way tens of millions of Americans use everyday Google products has suddenly become dangerous. Following the Supreme Court decision to overturn the landmark Roe v. Wade ruling, anything Google knows about you could be acquired by police in states where abortion is now illegal. A search for “Plan B,” a ping to Google Maps at an abortion clinic or even a message you send about taking a pregnancy test could all become criminal evidence.
There is something Google could do about this: Stop collecting — and start deleting — data that could be used to prosecute abortions. Yet so far, Google and other Big Tech companies have committed to few product changes that might endanger their ability to profit off our personal lives. Nor have they publicly said how they might fight legal demands related to abortion prosecutions.
The core issue is Google knows too much about everyone, way beyond just abortion. How much does Google know? I checked, and it’s got about 167 gigabytes just on me, including lots of photos. That’s roughly equivalent to 83,500 Stephen King novels. (You can download your own data from Google Takeout, or see its map of your location history in your Google Maps Timeline.) Google built a $1.5 trillion business by grabbing every bit of data it can, with very few restrictions.
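That novel comparison is easy to sanity-check. Assuming roughly 2 megabytes of plain text per long novel (my working assumption, not a figure from the column), the arithmetic lines up:

```python
# Rough sanity check of the "83,500 Stephen King novels" comparison.
# Assumption (mine): a long novel is ~2 MB of plain text.
GIGABYTE = 1_000_000_000          # decimal gigabytes, as storage is usually measured
data_bytes = 167 * GIGABYTE       # what Google holds on the author
novel_bytes = 2_000_000           # ~2 MB per long plain-text novel

novels = data_bytes / novel_bytes
print(f"{novels:,.0f} novels")    # → 83,500 novels
```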
Most of us understand on some level that Google and other tech companies invade our privacy. But Silicon Valley has made us think the stakes are quite low. Google provides useful products, and in exchange we might be targeted with annoying ads. Big whoop.
Until now. The danger of all that data feels different after the end of Roe, said Shoshana Zuboff, an emerita Harvard Business School professor who popularized the term “surveillance capitalism” to describe Google’s business. “Every device becomes our potential enemy,” she told me.
Earlier this week, even the Department of Health and Human Services decided it needed to publish an advisory on locking down health information when using a smartphone “to protect yourself from potential discrimination, identity theft, or harm to your reputation.”
Zuboff, whose writings are like the “Silent Spring” of the digital age, is very concerned about where our surveillance society goes from here. “The harsh reality is that while we’re now worried about women who seek abortions being targeted, the same apparatus could be used to target any group or any subset of our population — or our entire population — at any moment, for any reason that it chooses,” she said. “No one is safe from this.”
Of course, Google isn’t alone in collecting intimate information. In the past week, many concerned patients have focused on the privacy practices of period-tracking apps, which store reproductive health data. Other Big Tech companies facilitate data grabs, too: Facebook watches you even when you’re not using it, Amazon’s products record you, and Apple makes it too easy for iPhone apps to track you.
But in many ways, Google’s reach into the life of a person seeking reproductive health information is unrivaled. Just one example: For much of this year in the United States, Google searches for “Am I pregnant?” have outranked “Do I have covid?” Searches for the emergency contraceptive drug “Plan B” far outnumber both combined.
The sheer volume of Google’s surveillance also makes it likely the most attractive police target. Across all topics, it received more than 40,000 subpoenas and search warrants in the United States in the first half of 2021 alone.
That means whatever Google does next, it can’t remain neutral — and will set the tone for how the entire industry balances our rights with the business imperative to grab more data.
Google didn’t make an executive available for an interview. “We’ve long focused on minimizing the data we use to make our products helpful and on building tools that allow people to control and delete data across our platforms,” emailed spokesman Matt Bryant.
Starting in 2019, Google began offering users a setting to retain certain data for set periods of time rather than indefinitely, and in 2020 it made 18 months the default.
In reality, Google knows very few people use its controls, and even 18 months is a very long time. The only way to really protect its users is to make whole swaths of data off-limits by default.
Four ways to build civil rights into Google products
So what are the most urgent kinds of data Google should stop collecting? I spoke to privacy advocates to start a list of demands.
“It is their responsibility as a company to keep people’s data secure — but as it currently stands, it shifts the work onto the user to figure out how to delete their data,” said Jelani Drew-Davi, campaigns director of Kairos, a left-leaning digital advocacy group.
I understand there’s a sad irony in this exercise. “Take a minute and just feel how intolerable it is for us to essentially be supplicants toward a massively wealthy, massively powerful data company, saying, ‘Please, please, please stop collecting sensitive data,’ ” said Zuboff.
“We shouldn’t be relying on the goodwill of individual companies to protect our data,” said Rep. Sara Jacobs (D-Calif.), who introduced a bill called My Body, My Data in response to the end of Roe that would put new limits on how companies store reproductive or sexual health data.
We know there’s zero chance Google will overnight exit the lucrative personal data business. And frankly, Congress has been asleep at the wheel on protecting our data rights for decades.
Yet I also know there are employees inside Google who want to do the right thing. There is immediate harm that could be reduced with even more subtle changes to how Google collects and stores our data.
Here’s an action plan for Google to build our civil rights into its products.
1) Delete search queries and web-browsing history
By default, Google keeps a record of what you search for (whether by typing or speaking) and the websites you visit in the Chrome browser. It saves this information to your Google account, where it’s linked to your email address, phone number or other identifying information.
As an individual user, you can change how long it retains this sort of data under Google’s “My Activity” settings, including telling Google to delete it immediately. Google could make these tools much easier to find and use, but even then only a fraction of its users will ever change the default settings.
Instead, Google should categorize some queries, websites and keywords as simply too sensitive to retain. It should delete anything related to sexual health from its records immediately, regardless of account settings.
While they’re at it, how about deleting information related to any health query whatsoever? The sad truth is Google is not covered by America’s existing health-privacy law. When people aren’t confident their information — or even just their research — is private, they may end up with worse health outcomes.
Some privacy advocates worry there’s no way to ever draw the lines around “sensitive” data that will actually protect people. Even queries “seemingly unrelated to abortion may still be used against people seeking care or those who assist them,” said Matt Cagle, senior staff attorney at the ACLU of Northern California.
So an even better solution would be for Google to change its default to delete all user data after one week — or less — unless we specifically ask for it to be held longer. It’s doable: Rival DuckDuckGo by default doesn’t share or save any user search or location histories.
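As a sketch of what those two ideas could look like together (the class, keyword list and time window here are my own illustration, not Google’s actual code), a logging layer could refuse to write sensitive queries at all and expire everything else after a week:

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: the keyword list and names below are hypothetical,
# not Google's implementation.
SENSITIVE_TERMS = {"pregnant", "plan b", "abortion", "clinic"}
RETENTION = timedelta(days=7)

class QueryLog:
    def __init__(self):
        self.entries = []  # (timestamp, query) pairs

    def record(self, query: str, now: datetime) -> None:
        # Sensitive queries are never written to the log in the first place.
        if any(term in query.lower() for term in SENSITIVE_TERMS):
            return
        self.entries.append((now, query))

    def purge(self, now: datetime) -> None:
        # Everything else expires after one week by default.
        self.entries = [(t, q) for t, q in self.entries if now - t < RETENTION]

log = QueryLog()
now = datetime(2022, 7, 1, tzinfo=timezone.utc)
log.record("am i pregnant", now)        # dropped immediately
log.record("best hiking trails", now)   # kept for now ...
log.purge(now + timedelta(days=8))      # ... but gone after a week
print(len(log.entries))                 # → 0
```

The point of the sketch is where the defaults sit: deletion happens unless the user opts out, rather than retention happening unless the user opts in.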
2) Stop saving individual location information
For almost any Google service you use, from search to maps, Google tries to get you to hand over location data with the promise of a better experience. On an Android phone, Google has at least eight ways to collect and use your location. It wants this, of course, not only to provide you more relevant information but also to show you much more targeted ads.
All of this information leaves Google with a map of your life that’s akin to a team of private investigators tracking your moves. And increasingly, Google is receiving what’s known as “geofence warrants,” where it’s asked to hand over the identities of people known to be in a certain area.
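To see why geofence warrants alarm privacy advocates: once per-user location pings exist in a database, listing everyone who was near a given address is a few lines of distance math. The coordinates and records below are invented for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))  # Earth radius ~6,371 km

# Hypothetical per-user location pings: (user_id, lat, lon).
pings = [
    ("user_a", 40.7130, -74.0061),  # a few dozen meters from the target below
    ("user_b", 40.7580, -73.9855),  # several kilometers away
]

# A "geofence" query: everyone within 200 meters of a target address.
target_lat, target_lon = 40.7128, -74.0060
inside = [uid for uid, lat, lon in pings
          if haversine_m(lat, lon, target_lat, target_lon) <= 200]
print(inside)  # → ['user_a']
```

The query is trivial precisely because the stored data is per-person; if the location history were never saved to identifiable accounts, there would be nothing for a dragnet to search.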
You can stop some of this location data harvesting by turning off location access on your phone, or telling your Google settings to “pause” saving location information to your account. But it’s time to recognize the consequences of gathering this data are greater than the benefits.
After this column was first published, Google announced on Friday that it would begin deleting location records of visits to “particularly personal” places, including abortion clinics. I’m glad it’s listening, but it’s unclear how much that step alone will help. How will it define “particularly personal”? Will it still record people traveling to and from those places? As with search queries, seemingly unrelated data can be unintentionally revealing.
In the wake of the Roe ruling — and, frankly, even before it — privacy advocates and even lawmakers called on Google to just stop storing individual location data.
“Do not collect this data in a way that’s vulnerable to digital dragnets,” said Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project. “If you are going to have this data for a single individual, or you can see everyone who went to a certain area — that is too much power.”
But wait, might this ruin the functionality of Google Maps? It doesn’t have to: Apple, for example, designed its maps service to not store personal information associated with how you’re using Apple Maps, except when you submit a rating or photo of a place.
3) Make Chrome’s ‘Incognito mode’ actually incognito
Google’s web browser is extraordinarily popular because it’s speedy — but it’s terrible for your privacy. It’s one of the few tech products I’ve ever just straight-up labeled “spyware” because it facilitates so much data collection not only by Google but lots of other companies as well.
One of the most dangerous parts of Chrome is the so-called Incognito mode, which tells users they can “browse privately.” What it really means is that while you’re using this mode, Chrome won’t save your browsing history on your computer. But it doesn’t necessarily make you anonymous to the websites you visit, your internet service provider or even Google itself (if you log in to your account).
One example: Just this week, my colleague Tatum Hunter reported that Google (as well as Facebook and TikTok) was sent personal information when patients used the Planned Parenthood website’s scheduling pages. The problem was marketing code embedded in the pages — and Chrome does little to stop that kind of tracking.
Google has the technical muscle to make Incognito actually mean something. Rivals such as Mozilla’s Firefox already block, by default, attempts by data brokers, and even by Facebook and Google, to track what you do online.
An even better version of Incognito would make sure that nobody could know what sites you’re visiting. Apple is testing a version of this with its paid iCloud Private Relay service. The nonprofit Tor offers free anonymous-surfing software, which sends internet traffic bouncing between volunteer computers around the world so it can’t be easily traced back to you. It has lately seen an uptick of use by people in Russia likely seeking unfiltered information about the war in Ukraine.
4) Better protect texts and messages
Are the chats we have on Google products totally private? The answer is, it depends.
For people with Android phones, last year Google finally turned on end-to-end encryption for the default messaging app, meaning the contents can be seen only by the participants. But it comes with some conditions: It applies only to conversations with just two people, and both parties have to be using Google’s Messages app. (When a conversation is actually encrypted, you’ll see a lock icon.)
That means chats with friends who use iPhones are definitely not private. We’d all be better off if Google and Apple could summon the will to work together on common secure messaging technology that would encrypt conversations across platforms by default. (Under a newly passed law, regulators in Europe may finally force them to open up and work together — at least in Europe.)
Google Chat, the messaging function built into other Google products, does encrypt content at rest and in transit. But when I asked whether Google could hand over the contents of chats if it gets a search warrant, the answer was still yes. That doesn’t fit my definition of private.
Google’s smart product designers should find ways to provide warnings to us when our current activities or settings may result in the retention of sensitive information — and leave us vulnerable.