
In some ways, it reads like a bad novel: “Every Step You Fake,” a Canadian study of privacy and security in personal fitness devices. The report outlines two key areas in which these devices have significant security and privacy shortcomings — but just as you feel sympathy for the devices’ wearers, you learn they may be the “bad actor” in other cases. We can spot adversaries in every direction, but who’s the hero of this drama? And, frankly, does it need to be a drama?
The two shortcomings outlined in the report are that:

  • the devices’ radio-based transmissions can “leak” your presence and make you trackable (anonymously) through shopping malls that do that sort of thing; and
  • it’s possible to fake out some of the website collection servers so that you can “adjust” your results.

Well, wait. Why so much drama around a device you electively wear on your person? What are the actual problems that need solving here?

In the first case, tracking, let’s be clear that the shopping malls have no knowledge of who they are tracking in this instance; they know only that the same device is showing up in different areas of the mall, and how much time the wearer is spending in particular places. Presumably they could also note when the same tracker reappeared at a later date, if they keep and mine their data that way. The purported use of this tracking technology is to determine mall traffic patterns and people’s interests. The report gives higher marks to the fitness devices that mask this information, making them harder to track.
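The kind of anonymous tracking described above is easy to implement once a device broadcasts a stable identifier. Here is a minimal sketch — the addresses, zone names, and dwell times are hypothetical — of how repeated sightings of one Bluetooth address could be aggregated into a habits profile without ever learning a name:

```python
from collections import defaultdict

# Hypothetical sightings: (device_address, zone, minutes_observed).
# A stable broadcast address lets all three rows be linked to one visitor.
sightings = [
    ("AA:BB:CC:DD:EE:FF", "food court", 25),
    ("AA:BB:CC:DD:EE:FF", "electronics", 40),
    ("AA:BB:CC:DD:EE:FF", "food court", 10),
]

def dwell_profile(sightings):
    """Aggregate minutes spent per zone for each observed device address."""
    profile = defaultdict(lambda: defaultdict(int))
    for address, zone, minutes in sightings:
        profile[address][zone] += minutes
    return {addr: dict(zones) for addr, zones in profile.items()}

profile = dwell_profile(sightings)
# The mall learns no name, only habits:
# {'AA:BB:CC:DD:EE:FF': {'food court': 35, 'electronics': 40}}
```

Note what breaks the scheme: if the device randomizes its broadcast address, each sighting arrives under a different key and the profile never accumulates — which is exactly why the report scores address-masking devices higher.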
From the standpoint of the wearer, I have to say I don’t care to be tracked: I never enjoyed being followed around a shop by an overzealous clerk, and the feeling isn’t particularly reduced just because I can’t see the person who’s following me. But there are actually three avenues to pursue here:

  • Don’t wear the fitness device.
  • Only wear a fitness device that purports to mask the leaky data.
  • Work to make the tracking of device IDs illegal, or at least heavily circumscribed in terms of what can be done with the data.

None of them is actually a great answer, but the one most often assumed to be the best is the middle one: working on making the devices untrackable.

Let’s go back to the device-wearer-as-hacker side of the coin. Some fitness tracker server sites use weaker security protections than others, and people have figured out how to edit the data collected by the device to present a very different reality — for example, claiming that the wearer has walked 800 miles in a day.
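An obvious mitigation is for the collection server to sanity-check submissions rather than trusting the client outright. A minimal sketch — the threshold and function name are illustrative assumptions, not any vendor’s API — of a plausibility filter that would catch the 800-mile claim:

```python
# Roughly 100,000 steps is at the extreme upper end of plausible human
# activity in a day; 800 miles would be on the order of 1.6 million steps
# (at ~2,000 steps per mile). The cutoff is an assumption for illustration.
MAX_PLAUSIBLE_STEPS_PER_DAY = 100_000

def accept_submission(steps: int) -> bool:
    """Reject daily step counts a human could not plausibly produce."""
    return 0 <= steps <= MAX_PLAUSIBLE_STEPS_PER_DAY

accept_submission(12_500)     # an ordinary day: accepted
accept_submission(1_600_000)  # "800 miles in a day": rejected
```

Such checks only catch outlandish fakes; modest inflation still slips through, which is why the stronger fix is to authenticate the data on the device itself before it ever reaches the server.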
So, what are the consequences if the wearer fakes out their own data? We can expect that such trickery would have all the usual consequences associated with (false) bravado and boasting of accomplishments — just faster, because the services are connected to social networks and “accomplishments” can circle the globe in seconds. Additionally, there is of course self-delusion, and loss of utility of the device for your own health-management purposes.
But the world has already moved on, and there might be more significant, even monetary, implications if a wearer’s health insurance premiums depend on demonstrating that they have exercised.

So… what?

Yes, of course, companies should build more robust systems. But maybe we should remember that technology is a tool, which can either help or hinder but not replace our individual engagement with the world. Is it right to base judgements on these data to begin with? The data collected and the tracking are both representations that we assume adequately reflect a reality we’re interested in capturing. Any step taken from here is essentially layering more technology baggage to make a surrogate seem more reliable (while in fact making it more brittle).
When do we step back and focus on our actual interpersonal interactions, re-establishing the value of personal integrity and commitment, and putting some real power in the statement — for individuals, corporations, and governments — that “just because you can, doesn’t mean you should?”