Network effect feudalism

This is the most important article I’ve read in 2019. Kudos to Allen Farrington.

I have struggled to make these points to others for a long time, and am generally viewed as a curmudgeon (charitably), or a paranoid fanatic (more typically), for my efforts. There is a bias toward the fallacious “I’ve got nothing to hide” response, because the harm is unseen. People don’t yet grasp that they are not the ones who decide whether they have something to hide*.

Farrington delineates the harm brilliantly.

Given that our rulers feel compelled to ‘do something’ about social media’s disdain for its peons, among whom they number, we can be sure government will make it worse and further entrench the incumbents.

Farrington’s comments on blockchains, free speech, Gab (of which I’m a long-time member), Ethereum (which George Gilder referenced in a recent, related interview), and anti-trust are enlightening. The economic analysis is thought-provoking. The political implications are consequential. A slice:

“It is not actually free,” [Facebook co-founder Chris] Hughes tells us, “and it certainly isn’t harmless.” But both [Hughes and Senator Elizabeth Warren] seem to believe that Facebook, Google and others succumb to the temptation to inflict such harm solely because they are big. Hence, the solution is to make them smaller. It doesn’t appear to have occurred to either of them that they are big because they inflict such harm.

Facebook and Google are not Standard Oil and AT&T. They operate business models whose network effects tend towards monopoly, due to continuous redeployment of increasing returns to scale. Users pay not with money but with data, which Facebook and Google then turn into productive capital that creates products for another group entirely. The quality of the service to the users—the unknowing and hence unrewarded capital providers—scales quadratically with the size of the network and, since they are free in monetary terms, any serious attempt to compete would require monumentally more capital than could ever generate a worthwhile return. The proper regulatory approach is not to cut off the heads of these hydras one at a time, but to acknowledge that these are fundamentally new economic entities.

Artificial intelligence makes this all the more imperative. By AI, I mean the honing of proprietary algorithms on enormous complexes of unwittingly generated data to identify patterns no human could—identifications that will be re-applied to dynamic pricing decisions and content filtering in order to make what will surely be called efficiency gains and improvements to the user experience. This would all be fine and dandy—as opposed to highly ethically suspect—if the contributors of the data had any idea of their own involvement, either in the contribution itself or in the eventual gain in efficiency. What is really happening here is that information that previously only existed transiently and socially will soon be turned into a kind of productive capital that will only have value in massive aggregations. This is why those who generate the data are happy to do so for free, for it is of no monetary value to them, and it is why the only people who will derive any productive value from it will be the already very well capitalized.

This is an unflattering, but perfectly accurate, description of the business models of Facebook and Google, who stalk you wherever you go on the web, wherever you bring your smartphone, and wherever you interact in any way with one of their trusted partners, all in an effort to manipulate your sensory environment and slip in as many ads as possible. This is so effective that they buy data from outside their platforms to supplement the potency of their manipulations…

[I]f something is free, it is difficult if not impossible to discern the kind of meaningful information that one might from a price in a market. The willingness to pay a price indicates a sincere belief and an honest commitment. There are costs to insincere or dishonest behaviour that will simply be dispersed throughout the network, rather than borne by the perpetrator.
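Farrington’s claim that service quality “scales quadratically with the size of the network” is Metcalfe’s law in miniature: a network of n users has n(n−1)/2 possible pairwise connections. A minimal sketch (the user counts are illustrative assumptions, not figures from the article) shows why a small challenger faces a chasm, not a gap:

```python
# Metcalfe's law sketch: a network's potential value is proportional
# to its number of distinct pairwise connections, n * (n - 1) / 2,
# which grows roughly quadratically in the user count n.

def pairwise_connections(n: int) -> int:
    """Number of distinct user-to-user links in a network of n users."""
    return n * (n - 1) // 2

incumbent = pairwise_connections(2_000_000_000)  # a Facebook-scale network (assumed)
upstart = pairwise_connections(2_000_000)        # a challenger with 0.1% of the users

# The challenger has one-thousandth of the users but roughly
# one-millionth of the connection value.
print(incumbent // upstart)  # → 1000000
```

This is the arithmetic behind “any serious attempt to compete would require monumentally more capital than could ever generate a worthwhile return”: a linear deficit in users becomes a quadratic deficit in value.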

It is not about the value of an individual’s data; “it is of no monetary value to them.”

You are not just the product Google and Facebook sell; you are the enabling capital in a vast pyramid scheme.

How can we preserve our identity capital? How can we price our data? By making identity data scarce:

“Participants in the [redesigned] network are discouraged from being dishonest or insincere by the price and permanence of their scarce identity…

Several clearly desirable features immediately present themselves. For example, the issue of gatekeepers who exist for technical reasons assigning themselves political authority would evaporate…

So here’s my plea: stop using big tech and venture into the wild.”

Yes. The network effect can only be blunted if individuals stop enhancing it. Call it utopian, but boycotting Google and Facebook is something you control; it doesn’t depend on Senator Warren’s tender, collectivist mercies, or on Facebook’s Social Justice agenda of the day.

“If a critical mass of users switches away from Google or Facebook, their collapse will be surprisingly quick. This is a very dramatic potential outcome, and I suspect it is more likely that, at a certain rate of user emigration, these companies, and others, will adapt their policies to be more free and open, so as to better compete in this new environment.”

The article is not a long read, but if you want to know what I’m talking about when I mention George Gilder, you’ll want to watch this 45-minute interview regarding Gilder’s book Life After Google. At times I wished for more Gilder and less interviewer, and for more depth on some ideas, but for a general audience it’s not a bad look at Google, AI, blockchain, and other topics related to Farrington’s post. A few gems from Gilder.


* “The old cliché is often mocked though basically true: there’s no reason to worry about surveillance if you have nothing to hide. That mindset creates the incentive to be as compliant and inconspicuous as possible: those who think that way decide it’s in their best interests to provide authorities with as little reason as possible to care about them. That’s accomplished by never stepping out of line. Those willing to live their lives that way will be indifferent to the loss of privacy because they feel that they lose nothing from it. Above all else, that’s what a Surveillance State does: it breeds fear of doing anything out of the ordinary by creating a class of meek citizens who know they are being constantly watched.”

~ Glenn Greenwald
