Sharing your personal fitness stats—lowered heart rates, calorie counts, jogging times, and GPS routes—sounds like a fun, competitive feature offered by today’s digital fitness trackers. But a recent report from The Washington Post highlights how this same feature can end up revealing not just where you are, where you’ve been, and how often you’ve traveled there, but also sensitive national security information.
According to The Washington Post report, the fitness tracking company Strava—whose software is built into devices made by Fitbit and Jawbone—published a “heat map” in November 2017 showing the activity of some of its 27 million users around the world. Unintentionally included in that map were the locations, daily routines, and possible supply routes of disclosed and undisclosed U.S. military bases and outposts, including what appear to be classified CIA sites.
Though the revealed information itself was anonymized—meaning viewers could not easily determine the identities of individual Strava customers from the map alone—the aggregated data still amounted to a serious breach of privacy.
Shared on Twitter, the map led to several discoveries, the report said.
“Adam Rawnsley, a Daily Beast journalist, noticed a lot of jogging activity on the beach near a suspected CIA base in Mogadishu, Somalia. Another Twitter user said he had located a Patriot missile system site in Yemen. Ben Taub, a journalist with the New Yorker, homed in on the location of U.S. Special Operations bases in the Sahel region of Africa.”
On Monday, according to a follow-up report by The Washington Post, the U.S. military said it was reviewing guidelines on how it uses wireless devices.
As the Strava map became more popular, the report said, Internet users were able to further de-anonymize the data, pairing it with information on Strava’s website.
According to The Washington Post's follow-up report:
“On one of the Strava sites, it is possible to click on a frequently used jogging route and see who runs the route and at what times. One Strava user demonstrated how to use the map and Google to identify by name a U.S. Army major and his running route at a base in Afghanistan.”
The media focused on one particular group affected by this breach: the U.S. military. But of course, regular people’s privacy is affected even more by leaks like this. For instance, according to a first-person account published in Quartz last year, one London jogger was surprised to learn that, even with strict privacy settings enabled on Strava, her best running times—along with her first and last name and photo—were still visible to strangers who peered into her digital exercise activity. These breaches came through an unintended bargain, in which customers traded their privacy for access to social fitness tracking features that didn’t exist several years ago.
And these breaches happened even though Strava attempted to anonymize its customers’ individual data. That clearly wasn’t enough. Our everyday understanding of “anonymous” is often wrong: cross-referencing one database against another can reveal all sorts of private information, defeating any attempt at meaningful online anonymity.
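To make the cross-referencing risk concrete, here is a minimal sketch of a so-called linkage attack: an “anonymized” activity log is matched against a public leaderboard on shared quasi-identifiers such as route and time of day. The data, field names, and matching rule below are entirely hypothetical and are not Strava’s actual format—they only illustrate why stripping names from records is not the same as anonymity.

```python
# Minimal sketch of a linkage attack: re-identifying people in an
# "anonymized" activity dataset by cross-referencing it with public
# profile data. All records and field names here are hypothetical.

anonymized_activities = [
    # (route_segment_id, start_hour) -- no names, no user IDs
    ("segment-417", 6),
    ("segment-417", 6),
    ("segment-982", 18),
]

public_leaderboard = [
    # Public profiles that list the segments a person runs and when.
    {"name": "Runner A", "segment": "segment-417", "usual_hour": 6},
    {"name": "Runner B", "segment": "segment-982", "usual_hour": 18},
]

def link_records(activities, profiles):
    """Match anonymous activities to named profiles on shared
    quasi-identifiers (here: route segment plus time of day)."""
    matches = []
    for segment, hour in activities:
        for profile in profiles:
            if profile["segment"] == segment and profile["usual_hour"] == hour:
                matches.append((segment, hour, profile["name"]))
    return matches

for segment, hour, name in link_records(anonymized_activities, public_leaderboard):
    print(f"{segment} at {hour:02d}:00 is likely {name}")
```

Even this toy example re-identifies every record, because the “anonymous” fields overlap with fields published elsewhere—the same basic dynamic behind pairing the heat map with Strava’s public route pages.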
While “gamified” fitness trackers, especially ones with social competition built in, are fun, they really just put a friendly face on Big Brother. When we hand control of our personal data—especially sensitive data such as location history—to third parties, we expect it to be kept private. When companies betray that trust, even in “anonymized” form, as with the Strava heat map, unintended privacy harms are almost guaranteed. Clearly communicated privacy settings can help in situations like these, but so can company decisions to better protect the data they publish online.