Regulating the invisible ecosystem where thousands of firms possess data on billions of people
Currently there is no way for us to retract information that previously seemed harmless to share. Once tied to our identities, data about us can become part of our permanent record in the hands of whoever holds it, and of whomever they share it with, voluntarily or otherwise. The Cambridge Analytica data set from Facebook is itself but a lake within an ocean: a clarifying example of a pervasive but invisible ecosystem in which thousands of firms possess billions of data points across hundreds of millions of people, and can do a great deal with them below the radar of public scrutiny.
Several years ago Facebook began to limit what apps could scrape from friends’ profiles, even with permission, but the basic configuration of user consent as a bulwark against abuse hasn’t changed. Consent just doesn’t work. It asks too much of us to respond meaningfully to dialog boxes full of fine print as we try to work or enjoy ourselves online, and even that presumes, naïvely, that the promises on which our consent was premised will be kept.
There are several technical and legal advances that could make a difference.
On the policy front, we should look to how the law treats professionals with specialized skills who get to know clients’ troubles and secrets intimately. For example, doctors and lawyers draw a great deal of sensitive information from, and wield considerable power over, their patients and clients. There’s not only an ethical trust relationship there but also a legal one: that of a “fiduciary,” which at its core means that the professionals are obliged to place their clients’ interests ahead of their own.
The legal scholar Jack Balkin has convincingly argued that companies like Facebook and Twitter are in a similar relationship of knowledge about, and power over, their users — and thus should be considered “information fiduciaries.”
Doctors don’t ask patients whether they’d consent to a poison instead of a cure; they recommend what they genuinely believe to be in the patient’s interest. Too often the question “This app would like to access data about you, O.K.?” really amounts to “This app would like to abuse your personal data, O.K.?” Respecting users means protecting them from requests made in bad faith.