Can the cops get your online data? In short, yes. A variety of US federal and state laws give law enforcement the power to obtain information you have provided to online services. But there are steps you, as a user or a service provider, can take to improve online privacy.
Law enforcement demanding access to your private online data goes back to the beginning of the internet. In fact, one of EFF’s first cases, Steve Jackson Games v. Secret Service, exemplified the now all-too-familiar story in which unfounded claims about illegal behavior resulted in overbroad seizures of user messages. But it’s not the ’90s anymore; the internet is an integral part of everyone’s life, and we all rely on organizations big and small to steward our data, from huge service providers like Google, Meta, or your ISP, to hobbyists hosting a blog or Mastodon server.
There is no “cloud,” just someone else's computer—and when the cops come knocking on their door, these hosts need to be willing to stand up for privacy, and to know how to do so to the fullest extent under the law. These legal limits are also important for users to know, not only to mitigate risks in their security plan when choosing where to share data, but also to understand whether these hosts are going to bat for them. Taking action together, service hosts and users can keep law enforcement from getting more data than it is allowed, protecting not just themselves but targeted populations, present and future.
This is distinct from law enforcement’s methods of collecting public data, such as the information now being collected on student visa applicants. Cops may use social media monitoring tools and sock puppet accounts to collect what you share publicly, or even within “private” communities. Police may also obtain the contents of communications in other ways that do not require court authorization, such as monitoring network traffic passively to catch metadata and possibly using advanced tools to partially reveal encrypted information. They can even outright buy information from online data brokers. Unfortunately, there are few restrictions on, and little oversight of, these practices—something EFF is fighting to change.
Below, however, is a general breakdown of the legal processes US law enforcement uses to access private data, and the categories of private data each process can disclose. Because this is a generalized summary, it is neither exhaustive nor legal advice. Please seek legal help if you have specific data privacy and security needs.
Type of data | Process used | Challenge prior to disclosure? | Proof needed |
---|---|---|---|
Subscriber information | Subpoena | Yes | Relevant to an investigation |
Non-content information, metadata | Court order; sometimes subpoena | Yes | Specific and articulable facts that info is relevant to an investigation |
Stored content | Search warrant | No | Probable cause that info will provide evidence of a crime |
Content in transit | Super warrant | No | Probable cause plus exhaustion and minimization |
Types of Data that Can be Collected
The laws protecting private data online generally follow a pattern: the more sensitive the personal data is, the greater factual and legal burden police have to meet before they can obtain it. Although this is not exhaustive, here are a few categories of data you may be sharing with services, and why police might want to obtain it.
- Subscriber Data: Information you provide in order to use the service. Think about ID or payment information, IP address location, email, phone number, and other information you provided when signing up.
- Law enforcement can learn who controls an anonymous account, and find other service providers to gather information from.
- Non-content data, or "metadata": This is saved information about your interactions on the service, such as when you used it, for how long, and with whom. It is analogous to what a postal worker can infer from the addressing on a sealed letter.
- Law enforcement can use this information to infer a social graph, login history, and other information about a suspect’s behavior.
- Stored content: This is the actual content you are sending and receiving, like your direct message history or saved drafts. This can cover any private information your service provider can access.
- Law enforcement seeks this most sensitive data to reveal evidence of a crime. Overly broad requests can also enable retroactive searches, sweep in information about other users, and take information out of its original context.
- Content in transit: This is the content of your communications as it is being communicated. This real-time access may also collect info which isn’t typically stored by a provider, like your voice during a phone call.
- Law enforcement can compel providers to wiretap their own services for a particular user—which may also implicate the privacy of users they interact with.
Legal Processes Used to Get Your Data
When US law enforcement has identified a service that likely has this data, they have a few tools to legally compel that service to hand it over and prevent users from knowing information is being collected.
Subpoena
Subpoenas are demands from a prosecutor, law enforcement, or a grand jury that do not require the approval of a judge before being sent to a service. The only restriction is that the demand be relevant to an investigation. Often the only time a court reviews a subpoena is when a service or user challenges it in court.
Due to the lack of direct court oversight in most cases, subpoenas are prone to abuse and overreach. Providers should scrutinize such requests carefully with a lawyer and push back before disclosure, particularly when law enforcement tries to use subpoenas to obtain more private data, such as the contents of communications.
Court Order
Court orders are similar to subpoenas, but are issued under specific statutes that require a court to authorize the demand. Under the Stored Communications Act, for example, a court can issue an order for non-content information if police provide specific and articulable facts showing that the information being sought is relevant to an investigation.
Like subpoenas, providers can usually challenge court orders before disclosure and inform the user(s) of the request, subject to law enforcement obtaining a gag order (more on this below).
Search Warrant
A warrant is a demand issued by a judge to permit police to search specific places or persons. To obtain a warrant, police must submit an affidavit (a written statement made under oath) establishing that there is a fair probability (or “probable cause”) that evidence of a crime will be found at a particular place or on a particular person.
Typically services cannot challenge a warrant before disclosure, as these requests are already approved by a magistrate. Sometimes police also ask judges to enter gag orders against the recipient of the warrant, preventing hosts from informing the public or the user that the warrant exists.
Super Warrant
Police seeking to intercept communications as they occur generally face the highest legal burden. Usually the affidavit needs to not only establish probable cause, but also make clear that other investigation methods are not viable (exhaustion) and that the collection avoids capturing irrelevant data (minimization).
Some laws also require approval from high-level law enforcement leadership before the request can be made, and some limit the types of crimes for which wiretaps may be used. The laws may also require law enforcement to periodically report back to the court about the wiretap, including whether they are minimizing the collection of non-relevant communications.
Generally these demands cannot be challenged while wiretapping is occurring, and providers are prohibited from telling the targets about the wiretap. But some laws require disclosure to targets and those who were communicating with them after the wiretap has ended.
Gag orders
Many of the legal authorities described above also permit law enforcement to simultaneously prohibit the service from telling the target of the legal process or the general public that the surveillance is occurring. These non-disclosure orders are prone to abuse and EFF has repeatedly fought them because they violate the First Amendment and prohibit public understanding about the breadth of law enforcement surveillance.
How Services Can (and Should) Protect You
This process isn't always clean-cut, and service providers must ultimately comply with lawful demands for users’ data, even ones they have challenged, once courts uphold the government’s demands.
Service providers outside the US also aren’t totally in the clear, as they must often comply with US law enforcement demands. This is usually because they either have a legal presence in the US or because they can be compelled through mutual legal assistance treaties and other international legal mechanisms.
However, services can do a lot by following a few best practices to defend user privacy, limiting the impact of these requests and in some cases making their service a less appealing door for the cops to knock on.
Put Cops through the Process
Paramount is the service provider's willingness to stand up for its users. Carving out exceptions or volunteering information outside of the legal framework erodes everyone's right to privacy. Even in exigent and urgent circumstances, the responsibility for deciding what to share rests not with you but with the legal process.
Smaller hosts, like those of decentralized services, might be intimidated by these requests, but consulting legal counsel will ensure requests are challenged when necessary. Organizations like EFF can sometimes provide legal help directly or connect service providers with alternative counsel.
Challenge Bad Requests
It’s not uncommon for law enforcement to overreach or make burdensome requests. Before offering information, services can push back on an improper demand informally, and then continue to do so in court. If the demand is overly broad, violates a user's First or Fourth Amendment rights, or has other legal defects, a court may rule that it is invalid and prevent disclosure of the user’s information.
Even if a court doesn’t invalidate the legal demand entirely, pushing back informally or in court can limit how much personal information is disclosed and mitigate privacy impacts.
Provide Notice
Unless otherwise restricted, service providers should give notice about requests and disclosures as soon as they can. This notice is vital for users to seek legal support and prepare a defense.
Be Clear With Users
It is important for users to understand whether a host is committed to pushing back on data requests to the full extent permitted by law. Privacy policies with fuzzy thresholds like "when deemed appropriate" or “when requested” make it ambiguous whether a user’s right to privacy will be respected. Best practice for providers requires not only clarity and a willingness to push back on law enforcement demands, but also a commitment to be transparent with the public about those demands, for example through regular transparency reports breaking down the countries and states making data requests.
Social media services should also consider clear guidelines for finding and removing sock puppet accounts operated by law enforcement on the platform, as these serve as a backdoor to government surveillance.
Minimize Data Collection
You can't be compelled to disclose data you don’t have. If you collect lots of user data, law enforcement will eventually come demanding it. Operating a service typically requires some collection of user data, even if it’s just login information; the problem arises when information is collected beyond what is strictly necessary.
Excess collection may seem convenient or useful for running the service, or even potentially valuable, as with behavioral tracking used for advertising. However, the more that’s collected, the more the service becomes a target for both legal demands and illegal data breaches.
For data that enables desirable features for the user, design choices can make privacy the default and give users additional (preferably opt-in) sharing choices.
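As a rough sketch of what privacy-by-default can look like in practice (the `Account` and `AccountSettings` structures below are hypothetical, not drawn from any particular service), a signup flow might store only what the service needs to operate and leave every optional sharing feature off until the user opts in:

```python
from dataclasses import dataclass, field

@dataclass
class AccountSettings:
    """Optional features default to the most private choice; users opt in."""
    show_profile_publicly: bool = False
    share_read_receipts: bool = False
    allow_contact_discovery: bool = False

@dataclass
class Account:
    """Collect only what is needed to operate the account."""
    username: str
    password_hash: str  # never the plaintext password
    settings: AccountSettings = field(default_factory=AccountSettings)
    # Deliberately absent: birthday, phone number, precise location,
    # advertising identifiers, and anything else not strictly necessary.

# A new account starts with every optional sharing feature disabled.
account = Account(username="example", password_hash="<argon2 hash>")
assert not account.settings.show_profile_publicly
```

The point is structural: data that is never collected, and settings that stay private unless a user flips them, cannot later be swept up in a legal demand.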
Shorter Retention
As another minimization strategy, hosts should regularly and automatically delete information when it is no longer necessary. For example, deleting logs of user activity can limit the scope of law enforcement’s retrospective surveillance—maybe limiting a court order to the last 30 days instead of the lifetime of the account.
Again, design choices, like giving users the ability to send disappearing messages and deleting messages from the server once they’re downloaded, can further limit the impact of future data requests. These design choices should also have privacy-preserving defaults.
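As one illustration of automated retention, the sketch below assumes a hypothetical SQLite `activity_logs` table with a `created_at` timestamp column; the same pattern applies to any datastore and could run on a daily schedule:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # keep only what a recent operational need justifies

def purge_old_logs(db_path: str) -> int:
    """Delete activity log rows older than the retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM activity_logs WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
        return cur.rowcount  # number of rows removed
```

Run routinely, a job like this means a later court order can only ever reach the last 30 days of logs, because nothing older exists.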
Avoid Data Sharing
Depending on the service being hosted, there may be some need to rely on another service to make everything work for users. Third-party login or ad services are common examples, with some amount of tracking built in. Information shared with these third parties should be minimized or avoided altogether, as they may not have a strict commitment to user privacy. Most notoriously, data brokers who sell advertising data give law enforcement another legal work-around, letting them simply buy data collected across many apps. This extends to decisions about what information is made public by default, and thus accessible to many third parties, and whether that is clear to users.
(True) End-to-End Encryption
Now that HTTPS is actually everywhere, most traffic between a service and a user can be easily secured—for free. This limits what onlookers can collect on users of the service, since messages between the two are in a secure “envelope.” However, this doesn’t change the fact that the service opens this envelope before passing it along to other users, or returning it to the same user. Every opened message is more information the service has to defend.
Better is end-to-end encryption (e2ee), which means providing users with secure envelopes that even the service provider cannot open. This is how a featureful messaging app like Signal can respond to requests with only three pieces of information: the account identifier (phone number), the date of creation, and the last date of access. More services should follow suit and limit access through encryption.
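To make the idea of an envelope the provider cannot open concrete, here is a minimal sketch using the PyNaCl library's sealed boxes. This illustrates the concept only; it is not how Signal or any particular messenger implements e2ee, and real protocols add key verification, forward secrecy, and more:

```python
from nacl.public import PrivateKey, SealedBox

# The recipient generates a keypair on their own device; the private key
# never leaves it. The service only ever sees the public key and ciphertext.
recipient_key = PrivateKey.generate()

# The sender encrypts to the recipient's public key.
ciphertext = SealedBox(recipient_key.public_key).encrypt(b"meet at noon")

# The server can store or relay the ciphertext, but without the private key
# it cannot read the message, and neither can anyone who subpoenas it.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```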
Note that while e2ee has become a popular marketing term, it is simply inaccurate when describing encryption designed to be broken or circumvented. Implementing “encryption backdoors” to break encryption on demand, or collecting information before or after the envelope is sealed on a user’s device (“client-side scanning”), is antithetical to encryption. Finally, note that e2ee does not protect against law enforcement obtaining the contents of communications if they gain access to any device used in the conversation, or if message history is stored unencrypted on the server.
Protecting Yourself and Your Community
As outlined, often the security of your personal data depends on the service providers you choose to use. But as a user you do still have some options. EFF’s Surveillance Self-Defense is a maintained resource with many detailed steps you can take. In short, you need to assess your risks, limit the services you use to those you can trust (as much as you can), improve settings, and when all else fails, accessorize with tools that prevent data sharing in the first place—like EFF’s Privacy Badger browser extension.
Remember that privacy is a team sport. It’s not enough to make these changes as an individual; it’s just as important to share what you’ve learned, educate others, and fight for better digital privacy policy at all levels of governance. Learn, get organized, and take action.