from the privacy-theater-incorporated dept.
Earlier this year Apple received ample coverage for making privacy easier for its customers by introducing a new, simple tracking opt-out button as part of an iOS 14.5 update. Early press reports heavily hyped the concept, which purportedly gave consumers control over which apps could collect and monetize user data or track user behavior across the internet. Advertisers (most notably Facebook) cried like a disappointed toddler at Christmas, given the obvious fact that giving users more control over data collection and monetization means less money for them.
By September researchers had begun to notice that Apple's opt-out system was somewhat performative anyway. The underlying system only really blocked app makers from accessing one bit of data: your phone's Identifier for Advertisers, or IDFA. There were numerous other ways for app makers to track users, so they quickly got to work doing exactly that, collecting everything from your IP address, battery charge, and volume levels to remaining device storage, all metrics that can be helpful in building a personalized profile of each and every Apple user.
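To make the fingerprinting claim concrete, here is a minimal sketch of the general technique in Python. The signal names, values, and hashing scheme below are invented for illustration; this is not Apple's API or any particular ad SDK's implementation, just the basic idea of fusing incidental device signals into a stable identifier that survives an IDFA opt-out:

```python
import hashlib

def fingerprint(signals: dict) -> str:
    """Collapse a set of incidental device signals into a short,
    stable identifier by hashing a canonical serialization."""
    canonical = "|".join(f"{key}={signals[key]}" for key in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical signals of the kind researchers observed apps collecting.
device = {
    "ip": "203.0.113.7",
    "battery_pct": 82,
    "volume": 0.6,
    "free_storage_gb": 23.4,
    "model": "iPhone12,1",
}

print(fingerprint(device))  # same signals -> same identifier, no IDFA needed
```

The point is that none of these signals is "tracking data" on its own, but together they are distinctive enough to re-link the same device across apps and sessions.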
[...] Here's the thing. There's been an absolute torrent of studies showing that "anonymizing" data is a largely meaningless term. It only takes a few additional snippets of data to re-identify "anonymized" users, yet the term is still thrown around by companies as a sort of "get out of jail free" card when it comes to not respecting user privacy. There's an ocean of data floating around the data broker space that comes from apps, OS makers, hardware vendors, and telecoms, and "anonymizing" that data doesn't really stop any of them from building detailed profiles on you.
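The re-identification point can be made concrete with a toy linkage sketch. The records below are invented, but the mechanism is the one the studies describe: strip the names, keep a few quasi-identifiers, and most rows still single out exactly one person:

```python
from collections import Counter

# "Anonymized" records: names removed, quasi-identifiers kept (toy data).
records = [
    {"zip": "02139", "birth_year": 1984, "sex": "F"},
    {"zip": "02139", "birth_year": 1984, "sex": "F"},  # the only duplicate
    {"zip": "02139", "birth_year": 1984, "sex": "M"},
    {"zip": "02139", "birth_year": 1990, "sex": "F"},
    {"zip": "94105", "birth_year": 1984, "sex": "F"},
]

keys = ("zip", "birth_year", "sex")
counts = Counter(tuple(r[k] for k in keys) for r in records)

# A record is re-identifiable when its quasi-identifier combination is unique.
unique = sum(1 for c in counts.values() if c == 1)
print(f"{unique} of {len(records)} records are uniquely re-identifiable")
# -> 3 of 5 records are uniquely re-identifiable
```

Anyone holding a second dataset that pairs those same quasi-identifiers with names (a voter roll, a loyalty database) can join the two and undo the "anonymization" for every unique row.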
Apple's opt-out button is largely decorative, helping the company brand itself as hyper privacy conscious without actually doing the heavy lifting required of such a shift [...]
In other words, it's B.A.D. (Broken As Designed).
Leaders today must be ready to take a stand on thorny social and political issues. A case study by Nien-hê Hsieh and Henry McGee examines how Apple CEO Tim Cook turned calls for data access into a rallying cry for privacy, and the complexities that followed:
Apple CEO Tim Cook didn't come to his post with an activist agenda, yet when law enforcement officials began pressuring the company to hand over iPhone users' data without their permission, Cook took what he believed was a moral stance to protect consumers' privacy.
[...] "We believe that a company that has values and acts on them can really change the world," Cook said in 2015, a year after Apple debuted new privacy measures that blocked law enforcement from accessing its customers' data. "There is an opportunity to do work that is infused with moral purpose." He said shareholders who were only looking for a return on investment "should get out of the stock."
A Harvard Business School case study and its revision, Apple: Privacy vs. Safety (A) and (B), illustrate the complex ramifications that companies should consider when putting their stake in the ground on challenging societal issues like privacy. The authors of the case offer a suggestion for CEOs: few corporations can expect to steer clear of the lightning-rod issues of the day, so perhaps it's best to meet them head on as part of the job.