…Don’t be Facebook.
At Google, of course, that would not mean “respect users’ privacy.” It would mean “don’t get caught.”
I see Facebook CEO Mark Zuckerberg is reacting to his company’s poor user-data stewardship by inviting regulation. Not regulation of his company; he’s asking for political advertising to be regulated.
“Actually, I’m not sure we shouldn’t be regulated,” Zuckerberg said in an interview with CNN that represented some of his first public remarks since the Cambridge Analytica controversy plunged his company into crisis and led to calls for his testimony before Congress.
“I actually think the question is more ‘What is the right regulation?’ rather than ‘Yes or no, should it be regulated?’” Zuckerberg told CNN.
The Facebook CEO said that “he would love to see” new transparency regulations for political advertisements. Facebook has been criticized for a lack of transparency.
OK, Mr. Zuckerberg, I’ll take a shot at “What is the right regulation?”
First, it’s not about political advertising. You’re looking to make government regulation a CYA for Facebook: “Look, we followed the regulations!” You’re asking to “consult” with government on how political advertising should be constrained. Foxes. Henhouse. Plus a helping of partisanship and financial self-interest.
Advertising isn’t the problem. The problem is your business model and its intentional lack of honesty.
The regulation of Facebook, Google, Amazon, Twitter, Apple, etc. should start from the premise that users own their identity data, including when it’s aggregated. This enables micro-payments to those whose data is aggregated, each time it is accessed or updated. Basically, an identity copyright law: if you’re using my identity, you have to pay me.
Defining ownership of the data as the individual’s would require absolute, positive opt-in – data can’t be sold unless specific permission is given, and never without payment. Big Data companies like their interminable click-through contracts; they love changing the terms of service at will; they love hiding the opt-out buttons. We need these contracts rewritten. One thing would happen for sure: the mandatory opt-in buttons would be prominent, and they would list the payment to be gained.
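The ownership-plus-payment scheme above can be sketched in a few lines of Python. Everything here – the ledger class, the two-cent fee, the method names – is a hypothetical illustration of the rule “no access without opt-in, no access without payment,” not an implementation of any real system.

```python
from dataclasses import dataclass, field

ACCESS_FEE_CENTS = 2  # assumed per-access micro-payment (illustrative rate)

@dataclass
class IdentityLedger:
    """Toy model: identity data is the user's property; access accrues payment."""
    consents: set = field(default_factory=set)    # user IDs that have opted in
    balances: dict = field(default_factory=dict)  # user ID -> cents owed

    def opt_in(self, user_id: str) -> None:
        """Record an explicit, positive opt-in for this user."""
        self.consents.add(user_id)

    def access(self, user_id: str) -> dict:
        """Access a user's identity data; refuse without consent, pay on success."""
        if user_id not in self.consents:
            raise PermissionError(f"{user_id} has not opted in; data cannot be sold")
        self.balances[user_id] = self.balances.get(user_id, 0) + ACCESS_FEE_CENTS
        return {"user": user_id, "fee_cents": ACCESS_FEE_CENTS}

ledger = IdentityLedger()
ledger.opt_in("alice")
ledger.access("alice")
ledger.access("alice")
print(ledger.balances["alice"])  # alice has accrued 4 cents
```

The point of the sketch is that the default inverts: without a recorded opt-in, the access call fails outright rather than silently succeeding, and every permitted access leaves an auditable debt to the user.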
Granting ownership of users’ data to the users themselves also encourages the companies that gather and store it to treat it with care, as a fiduciary duty. CEO Zuckerberg appears to agree that this is a good idea.
On Wednesday afternoon, Zuckerberg published a post promising to audit and restrict developer access to user data: “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”
He’s right: Facebook doesn’t deserve to serve you, for exactly the reason he gave. The word “serve” in that sentence can be interpreted in two very different ways, and Zuckerberg is only too happy to “serve” you to advertisers. This attitude is long-standing, as noted by The New Yorker in 2010:
In [an] exchange leaked to Silicon Alley Insider, Zuckerberg explained to a friend that his control of Facebook gave him access to any information he wanted on any Harvard student:
Zuck: yea so if you ever need info about anyone at harvard
Zuck: just ask
Zuck: i have over 4000 emails, pictures, addresses, sns
Friend: what!? how’d you manage that one?
Zuck: people just submitted it
Zuck: i don’t know why
Zuck: they “trust me”
Zuck: dumb f*cks
While Zuckerberg claims he’s matured since that exchange, “if you ever need info about anyone” nonetheless remains the raison d’être of Facebook. Zuckerberg went on to say, “I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again.” Well, since privacy violations and sleazy ethical conduct just keep happening, he must be a slow learner.
In 2006 Facebook’s introduction of “News Feed” made information public that users had intended to keep private. In 2009, Facebook made posts public by default, when they had been private, again by simply changing its ToS. That attracted the attention of the U.S. Federal Trade Commission. In 2011, Facebook was caught tracking you with its cookies even after you had logged out. Zuckerberg is worried about regulating advertising, but Facebook had no problem in 2013 with the posting of beheading videos. In 2014, the company was forced to acknowledge that it had conducted a psychology experiment intended to manipulate users’ emotions.
The current angst over Cambridge Analytica should be directed at Facebook’s business practices. The same thing happened in 2012 with the Obama campaign – except with Facebook’s active participation. At the time, this was considered a clever advertising use of social media by the Democrats.
So, suddenly, six years later, Zuckerberg wants political advertising regulated? You know he made the offer because his lobbyists would write the legislation. It would turn into a barrier to competition while likely eroding freedom of speech.
Facebook has repeatedly violated agreements with users, changed ToS without warning, hidden privacy controls deep within users’ profiles, made and allowed unethical use of its data, and directly participated in targeting election advertising. Maybe they’d be more careful, ethical and transparent if you owned the data.
A final word from Zuckerberg:
The real question for me is, do people have the tools that they need in order to make those decisions well? And I think that it’s actually really important that Facebook continually makes it easier and easier to make those decisions… If people feel like they don’t have control over how they’re sharing things, then we’re failing them.
Only one way to fix that. Give them control.