Don’t be doubleplus-ungood

An interesting article at Wired, by Nitasha Tiku:
Three Years of Misery Inside Google, the Happiest Company in Tech

It is sympathetic to leftist Googlers, and that sympathy seems justified in the case of the doxxing described. However, it fails to mention that among the first things to happen to James Damore was doxxing and defaming from the left. It should be remembered he wrote his common-sense, scientifically grounded memo in response to “sensitivity training” mandated for all employees. That the memo couldn’t be tolerated tells us all we need to know about Google culture.

The article ignores the deplatforming and/or demonetizing of numerous center-rightists who offend leftist sensitivities (Dennis Prager, for just one example). That Google can’t see this as a negative reflection on their culture tells us more than we need to know about their culture. Tiku doesn’t discuss harassment of conservatives by the left, which I must imagine constituted a “hostile work environment” long before this SHTF. We get little from her in that regard. We’re meant to think the majority of Googlers were harassed by a tiny group of conservatives, with little to no provocation.

It is further obvious that Google’s culture skews rigidly left based on political donations, its executives’ assistance to Barack Obama, and who the internal activist stars are. It’s clear from their push to hire based on sex, sexual orientation, skin color, and ethnicity that they agree with SJW tropes.

Still, this is a very interesting peek behind the Google curtain. Let’s look at some outtakes.

Larry Page and Sergey Brin, the former Montessori kids who founded Google as Stanford grad students in the late ’90s, had designed their company’s famously open culture to facilitate free thinking. Employees were “obligated to dissent” if they saw something they disagreed with, and they were encouraged to “bring their whole selves” to work rather than check their politics and personal lives at the door. And the wild thing about Google was that so many employees complied.

Complied? “Complied” is utter BS. They were selected via policies guaranteed to ensure their willingness to complain about any social or political issue that made them want to run to their “safe spaces,” and then sensitivity trained. It isn’t possible for the SJWs to check their identity-group politics at the door in the first place.

The culture was, and is, far from “open.” It is characterized by internecine tribal struggles between privileged, brilliant people with generally very poor social skills, whose idea of free thinking is mostly modeled after 1984 leavened with touches of Rules for Radicals.

[T]o a remarkable extent, Google’s workers really do take “Don’t Be Evil” to heart. C-suite meetings have been known to grind to a halt if someone asks, “Wait, is this evil?” To many employees, it’s axiomatic: Facebook is craven, Amazon is aggro, Apple is secretive, and Microsoft is staid, but Google genuinely wants to do good.

Well, for a given definition of “good.” And that is The. Whole. Problem. “Evil” is whatever does not toe the Progressive line. The motto should have been “Don’t be doubleplus-ungood.”

According to The Wall Street Journal, members of one mailing list brainstormed whether there might be ways to “leverage” Google’s search results to surface ways of helping immigrants; some proposed that the company should intervene in searches for terms like “Islam,” “Muslim,” or “Iran” that were showing “Islamophobic, algorithmically biased results.” (Google says none of those ideas were taken up.)

Didn’t take them up. But they openly considered substituting their opinions as reality because they are so sure of their righteousness and so drunk on their power. No one with a grounded definition of evil, and a gram of introspection, would have dared bring that up unless it was preceded with, “One thing we cannot do…”.

For this article, WIRED spoke with 47 current and former Google employees. Most of them requested anonymity.

Wouldn’t need anonymity if “Don’t be evil” meant anything, would they?

I hate that argument because it’s often applied to free speech: “I’ve done nothing wrong. I’ve got nothing to hide.” It’s quite clear from Google’s example that it’s other people who decide whether you have anything to hide. And a lot of those who work at Google want to decide to hide your opinion, or prevent you from forming it, and, at all costs, stop you from expressing it in the first place.

YouTube CEO Susan Wojcicki and head of communications Jessica Powell urged their colleagues to consider how they would have reacted if Damore had applied the same arguments to race, rather than gender. That persuaded them: The engineer had to go. In a note to employees, Pichai said he was firing Damore for perpetuating gender stereotypes.

If all else fails, play the race card. I don’t know Wojcicki’s or Powell’s race, but it smacks of cultural appropriation, doesn’t it?

Stereotypes can be valid, or we would see many more whites in the NBA, and many women’s world weight lifting records completely smashed. But, you can’t say those biologically male lifters doing the smashing aren’t really women. The same type of rational arguments DO apply to race. This is not how free thinkers react when there is scientific evidence, presented without vitriol. It is just a replay of the reaction to Murray’s The Bell Curve.

Pichai tried to assure the left without alienating the right. “To suggest a group of our colleagues have traits that make them less biologically suited to that work is offensive and not OK.”

Well, nobody said that. But claiming someone did indicates management’s attitude, which had to be double-reverse engineered from a faulty understanding of the word stereotype, and a “la la la, I can’t hear you”-inspired ignorance. Ostriches.

Two days after Damore was fired, Milo Yiannopoulos, the former tech editor at Breitbart, shared the Reddit collage image with 2 million Facebook followers. “Look at who works for Google, it all makes sense now,” he wrote—as if these eight employees had been the ones who made the decision to ax Damore.

“As if?” They WERE the ones who forced that decision!

At the time, Google was run as a triumvirate, with CEO Eric Schmidt playing the role of resident grown-up. Schmidt argued that if Google stopped censoring search results, it would never get back into China.

The “grownup” is the one who wants to knuckle under to totalitarian demands to suppress information. I.e., to betray Google’s mission: https://about.google/

It doesn’t say “except in countries with totalitarian dictatorships, that put millions into re-education camps because of their religion, and want to destroy the capitalist system that made Google possible.” I guess it’s not evil to interpret it as “useful to despots,” though.

“The legacy of the China decision was a giant dose of goodwill from Googlers around the world,” Schmidt wrote in How Google Works; it reaffirmed the company’s principles “governing how all tough decisions should be made.”

Schmidt takes credit for a policy he opposed. If you aren’t evil, it’s an easy decision.

As Google seemed to close in on winning the [Maven] contract, executives from the cloud team pondered how a deal with the Pentagon—especially one that could be linked to autonomous weapons—might reflect on Google’s non-evil brand. In September, a few weeks after the meeting with Mattis, they discussed spinning up some positive PR that would focus on the “vanilla cloud technology” aspects of the Maven contract. “Avoid at ALL COSTS any mention or implication of AI,” wrote Fei-Fei Li, a Stanford professor and Google Cloud’s chief scientist for AI.

Censor search in China and object to a US military contract. Fit those ethics together for me, will you? An ethical approach might have been not to do either one.

HR had become “weaponized,” they said; Googlers on both sides of the battle lines had become adept at working the refs—baiting colleagues into saying things that might violate the company’s code of conduct, then going to human resources to report them.

And what did they expect would happen after the cancel culture they encouraged became commonplace and was rewarded?

In early June 2018, Pichai finally published the AI principles that Google had promised its employees. They included a list of four applications of AI that Google would not pursue, including weapons, technologies that gather and use information “for surveillance violating internationally accepted norms,” and technology “whose purpose contravenes widely accepted principles of international law and human rights.”

“Don’t be evil” notwithstanding, they entertained a project to co-operate with the Chinese Communists that would shove human rights under the jackboot, and surveillance violating internationally accepted norms. They had to write that down. I guess it depends on your definition of “norms” and “principles.” Is it a norm because China does it to a billion and a half Chinese?

Google asked for chaos; I’m glad they’re getting it. The moral is: don’t let a bunch of middle-school mean girls run your company via internal social media.

Philosopher ‘pretenders to the throne’

This is a nice, short (7 min) introduction to Friedrich Hayek’s insights on emergent order. If you haven’t read The Road to Serfdom (free downloads at the link), maybe this will nudge you to do so.

“Order without intent: How spontaneous order built our world,” from The IHS on Vimeo.

Allowing order without intent to flourish is how we might avoid the tyranny of good intentions.

Related, from Edward Snowden:

“The most unflattering thing is to realize just how naïve and credulous I was and how that could make me into a tool of systems that would use my skills for an act of global harm. The class of which I am a part of, the global technological community, was for the longest time apolitical. We have this history of thinking: ‘We’re going to make the world better.’”

The idea that “making the world better” is apolitical shows Snowden is still naive and credulous. The toolmakers of the global technological community may have good intentions. They may be motivated by thoughts of the benefits they are bringing to humanity. They may also be motivated by profit and ideology.

How a better world is constituted, in any case, is an ethical and moral question beyond the ken of their meta-data, and in direct conflict with the ethical ‘principles’ demonstrated by their business models.

Who defines “better?” We have ample evidence Google/Facebook/Twitter aren’t up to the task.

“Making the world better” can be apolitical only in terms of each individual’s actions. It cannot be apolitical for giant corporations whose tools are designed to deceive users into acts of self harm: A system of fools.

Politics is the very essence of social media and the control of access to information.

Politics, noun. A strife of interests masquerading as a contest of principles. The conduct of public affairs for private advantage.
-Ambrose Bierce

And, in ways Bierce couldn’t imagine – conducting private affairs for public advantage. Affecting elections for example.

Snowden’s NSA is simply the government instantiation of the Facebook/Google/Twitter business models. They are all dedicated to making their subjects “better.”

“The urge to save humanity is almost always a false front for the urge to rule.”
-H. L. Mencken

Order with intent is the model practiced by authoritarians for “your own good,” public or private, from de Blasio to Google.

So, I’ll close with some relevant Friedrich Hayek quotations on good intentions, control of information, collectivist ethics, and the limits of knowledge: All of which apply to government and to the massive private enterprises whose control of information and manipulation of public opinion Hayek couldn’t imagine:

“Everything which might cause doubt about the wisdom of the government or create discontent will be kept from the people. The basis of unfavorable comparisons with elsewhere, the knowledge of possible alternatives to the course actually taken, information which might suggest failure on the part of the government to live up to its promises or to take advantage of opportunities to improve conditions–all will be suppressed. There is consequently no field where the systematic control of information will not be practiced and uniformity of views not enforced.”

“Our freedom of choice in a competitive society rests on the fact that, if one person refuses to satisfy our wishes, we can turn to another. But if we face a monopolist we are at his absolute mercy. And an authority directing the whole economic system of the country would be the most powerful monopolist conceivable…it would have complete power to decide what we are to be given and on what terms. It would not only decide what commodities and services were to be available and in what quantities; it would be able to direct their distributions between persons to any degree it liked.”

“All political theories assume, of course, that most individuals are very ignorant. Those who plead for liberty differ from the rest in that they include among the ignorant themselves as well as the wisest. Compared with the totality of knowledge which is continually utilized in the evolution of a dynamic civilization, the difference between the knowledge that the wisest and that the most ignorant individual can deliberately employ is comparatively insignificant.”

“To act on behalf of a group seems to free people of many of the moral restraints which control their behaviour as individuals within the group.”

“The idea of social justice is that the state should treat different people unequally in order to make them equal.”

Network effect feudalism

This is the most important article I’ve read in 2019. Kudos to Allen Farrington.

I have struggled to make these points to others for a long time, and am generally viewed as a curmudgeon (charitably), or a paranoid fanatic (more typical) for my efforts. There is a bias toward the fallacious “I’ve got nothing to hide” response, because the harm is unseen. People don’t yet grasp that they are not the ones to decide if they have something to hide*.

Farrington delineates the harm brilliantly.

Given that our rulers feel compelled to ‘do something’ about social media’s disdain for its peons, among whom they number, we can be sure government will make it worse and further entrench the incumbents.

Farrington’s comments on blockchains, free speech, Gab (of which I’m a long time member), Ethereum (which George Gilder referenced in a recent, related interview), and anti-trust are enlightening. The economic analysis is thought provoking. The political implications are consequential. A slice:

“It is not actually free,” [Facebook co-founder Chris] Hughes tells us, “and it certainly isn’t harmless.” But both [Hughes and Senator Elizabeth Warren] seem to believe that Facebook, Google and others succumb to the temptation to inflict such harm solely because they are big. Hence, the solution is to make them smaller. It doesn’t appear to have occurred to either of them that they are big because they inflict such harm.

Facebook and Google are not Standard Oil and AT&T. They operate business models whose network effects tend towards monopoly, due to continuous redeployment of increasing returns to scale. Users pay not with money but with data, which Facebook and Google then turn into productive capital that creates products for another group entirely. The quality of the service to the users—the unknowing and hence unrewarded capital providers—scales quadratically with the size of the network and, since they are free in monetary terms, any serious attempt to compete would require monumentally more capital than could ever generate a worthwhile return. The proper regulatory approach is not to cut off the heads of these hydras one at a time, but to acknowledge that these are fundamentally new economic entities.

Artificial intelligence makes this all the more imperative. By AI, I mean the honing of proprietary algorithms on enormous complexes of unwittingly generated data to identify patterns no human could—identifications that will be re-applied to dynamic pricing decisions and content filtering in order to make what will surely be called efficiency gains and improvements to the user experience. This would all be fine and dandy—as opposed to highly ethically suspect—if the contributors of the data had any idea of their own involvement, either in the contribution itself or in the eventual gain in efficiency. What is really happening here is that information that previously only existed transiently and socially will soon be turned into a kind of productive capital that will only have value in massive aggregations. This is why those who generate the data are happy to do so for free, for it is of no monetary value to them, and it is why the only people who will derive any productive value from it will be the already very well capitalized.

This is an unflattering, but perfectly accurate, description of the business models of Facebook and Google, who stalk you wherever you go on the web, wherever you bring your smartphone, and wherever you interact in any way with one of their trusted partners, all in an effort to manipulate your sensory environment and slip in as many ads as possible. This is so effective that they buy data from outside their platforms to supplement the potency of their manipulations…

[I]f something is free, it is difficult if not impossible to discern the kind of meaningful information that one might from a price in a market. The willingness to pay a price indicates a sincere belief and an honest commitment. There are costs to insincere or dishonest behaviour that will simply be dispersed throughout the network, rather than borne by the perpetrator.

It is not about the value of an individual’s data: “it is of no monetary value to them.”

You are not just the product Google and Facebook sell; you are the enabling capital in a vast pyramid scheme.

How can we preserve our identity capital? How can we price our data? By making identity data scarce:

“Participants in the [redesigned] network are discouraged from being dishonest or insincere by the price and permanence of their scarce identity…

Several clearly desirable features immediately present themselves. For example, the issue of gatekeepers who exist for technical reasons assigning themselves political authority would evaporate…

So here’s my plea: stop using big tech and venture into the wild.”

Yes. The network effect can only be blunted if individuals stop enhancing it. Call it utopian, but boycotting Google and Facebook is something you control, and doesn’t depend on Senator Warren’s tender, collectivist mercies. Or, Facebook’s Social Justice agenda of the day.

“If a critical mass of users switches away from Google or Facebook, their collapse will be surprisingly quick. This is a very dramatic potential outcome, and I suspect it is more likely that, at a certain rate of user emigration, these companies, and others, will adapt their policies to be more free and open, so as to better compete in this new environment.”

The article is not a long read, but if you want to know what I’m talking about when I mention George Gilder, you’ll want to watch this 45 minute interview regarding Gilder’s book Life After Google. I wished for more Gilder and less interviewer at times, and more depth on some ideas, but for a general audience it’s not a bad look at Google, AI, blockchain, and other things related to Farrington’s post. A few gems from Gilder.

*

“The old cliché is often mocked though basically true: there’s no reason to worry about surveillance if you have nothing to hide. That mindset creates the incentive to be as compliant and inconspicuous as possible: those who think that way decide it’s in their best interests to provide authorities with as little reason as possible to care about them. That’s accomplished by never stepping out of line. Those willing to live their lives that way will be indifferent to the loss of privacy because they feel that they lose nothing from it. Above all else, that’s what a Surveillance State does: it breeds fear of doing anything out of the ordinary by creating a class of meek citizens who know they are being constantly watched.”

~ Glenn Greenwald

Bait and switch

Google Finds It’s Underpaying Many Men as It Addresses Wage Equity

Here is the core point from that NYT article:

When Google conducted a study recently to determine whether the company was underpaying women and members of minority groups, it found, to the surprise of just about everyone, that men were paid less money than women for doing similar work.

Now, that’s a blockbuster, right? Feminists should be rejoicing. They aren’t. They are still whining, and the goalposts are being adjusted as you read this.

From Google’s point of view these results are a happy thing. If you wanted to spike some private suits, fire a shot across the bow of crazed employees, and stick a finger in the eye of the Labor Department all at once… you might want a study just like this.

For example:

The Labor Department is investigating whether the company systematically underpays women. It has been sued by former employees who claim they were paid less than men with the same qualifications.

However, according to critics, it isn’t enough that Google has been paying women more for equivalent work – they were started at lower salaries.

Google’s critics say it doesn’t come close to matching what a woman would make if she had been assigned to the appropriate pay grade in the first place…

This is a strange objection, because the data imply the opposite: Either men are started at lower salaries than they should be, or women get more substantial raises more quickly. Otherwise, how is it that men at Google are more likely to be underpaid?

Men disproportionately received raises and bonuses. Google apparently found that it’s men who are hired at lower than “equitable” salaries. Italics mine:

The company has done the study every year since 2012. At the end of 2017, it adjusted 228 employees’ salaries by a combined total of about $270,000. This year, new hires were included in the analysis for the first time, which Google said probably explained the big change in numbers.

Those who don’t get that relationship are probably not good candidates for high level software engineering jobs. They do better at diversity consulting.

Joelle Emerson, CEO of a company which profits by convincing its clients ‘increasing diversity’ is so hard it can’t be done without ‘woke’ consultants, explains:

Google seems to be advancing a “flawed and incomplete sense of equality” by making sure men and women receive similar salaries for similar work, said Joelle Emerson, chief executive of Paradigm, a consulting company that advises companies on strategies for increasing diversity. That is not the same as addressing “equity,” she said, which would involve examining the structural hurdles that women face as engineers.

Google, “by making sure men and women receive similar salaries for similar work” is doing it wrong. It needs to hire Ms. Emerson’s consultants.

You have to admit this is a nice twist on planned obsolescence. The “structural hurdles” will never be exhausted in the search for equality of outcome and the righteous battle to prevent diversity of thought.

A good example of Ms. Emerson’s definition of diversity would appear to be equal pay outcomes for those who can’t code, but only if they are female, or members of some other identity group not white or male.

“Equity” is a code word for equal outcome. In the ’60s, it was equal opportunity that drew sensible people to support changes in how women were treated. That’s all gone.

See also: Asymmetries in the workplace do not necessarily reflect gender discrimination for more examples of denialism from the Feminists:

  1. In countries with little to no institutional barriers to employment on the basis of identity, men and women often make choices (involving their own family and vocational priorities) that result in asymmetries in workplace representation and earnings (whether among Uber drivers or graduates of prestigious MBA programs).

  2. Men overwhelmingly outnumber women in the most dangerous jobs. This also doesn’t indicate that discrimination has taken place.

  3. While unequal treatment before the law and corruption should not be tolerated, different career and family choices (as well as preferences and aptitudes) that result in asymmetries in workplace representation and earnings neither result from conspiracies nor from oppression.

RTWT.

Thanks to the Internet of Things

Your trash disposal habits might now require a small EMP generator before you can safely throw away a lightbulb.

Recycling is definitely contraindicated without that EMP.  Or a 2 pound sledge (wear eye-protection).

The people scanning the conveyor belt to sort actual trash out of the recycling stream could quickly “monetize” burned out lightbulbs without even the bother of diving into a dumpster, and without any computer skills whatever.

Discarded smart lightbulbs reveal your wifi passwords, stored in the clear

I am quite sure this does not apply only to IoT lightbulbs.

The future is stupid, but not stupider than LIFX management: they sell you security breaches to implant in your own home yourself. Which would make you the stupidest of all.

The engineers at LIFX stored your WiFi credentials in plaintext on their “smart” lightbulbs (alongside an unencrypted RSA private key), so an enterprising garbage collector who’d ‘learned to code’ could get onto your home WiFi because you threw one away.
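For perspective on how little “coding” that takes: here is a minimal sketch that mimics the Unix `strings` utility, which is roughly all that’s needed to lift plaintext credentials out of a flash dump. The filename and the sample credentials are invented for illustration; reading the flash chip itself takes a few dollars of hardware.

```python
# Minimal sketch: recover plaintext credentials from a firmware dump.
# "bulb_flash.bin" is a hypothetical dump read off the bulb's flash chip.
# This just mimics the Unix `strings` utility -- no special skill required.
import re
import sys

def printable_strings(data: bytes, min_len: int = 6):
    """Yield runs of printable ASCII at least min_len bytes long."""
    for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
        yield match.group().decode("ascii")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "bulb_flash.bin"
    with open(path, "rb") as f:
        blob = f.read()
    for s in printable_strings(blob):
        # An SSID and passphrase stored in the clear show up right here.
        print(s)
```

If the credentials had been encrypted at rest, this trivial scan would turn up nothing useful; storing them in the clear is the whole ballgame.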

It isn’t believable that the engineers at LIFX failed to understand this problem.

Therefore, it wasn’t the engineers who decided to ship these Trojan Horses.

Therefore, protestation from LIFX that they’ve cleaned up their act is incredible.

That is, it is as credible as Google and Facebook when they claim they protect your privacy – even though selling it is how they prosper.

This is not to say LIFX planned to harvest your WiFi passwords.  It is to say they just didn’t give a shit.

I can’t wait until lightbulbs speak like HAL… I wonder if you can get HAL’s voice on Alexa or Google Home?

“Light?… Off.”
“Sorry, I can’t do that _your name here_.”

Sadly, most Millennials wouldn’t get the reference, not having seen 2001: A Space Odyssey. I’m sure they are installing these bulbs in their parents’ basements.