Snippets

The AI bubble is a political project

Numerous executives in tech repeatedly talk about how they think “AI” is going to replace workers. By their own account, that seems to be the point of the technology.

So far, the impact seems limited to a few select fields. Copywriters have been hit hard. Translators and illustrators are losing gigs everywhere I look. Training, especially in software development, has suffered badly. Voice actors are being replaced with generated voices.

The goal, if we are to take tech executives at their word, is to make these trends the norm, not the exception.

That is a political project. Attacking labour, deskilling the workforce, and driving down wages is fundamentally a political project, and an extremist one at that.

Centrally managing language, using pervasive chatbot adoption as a lever to change corporate writing at scale, is another explicit goal of these companies. When Musk and Altman argue about which of their respective chatbots is less “left-wing”, their intent is clearly to make all writing done with their tools less left-wing.

Centralised ideological control over all corporate writing is, again, an extremist political project, historically associated with violent authoritarianism.

We could go through how these tools are being used to power a wholesale takeover of our education systems, create a tiered system of healthcare access, and automate decisions to ensure that nobody can be held accountable for atrocious decisions, but it all comes down to the same, repeated point:

The AI Bubble is a right-wing political project that goes hand-in-hand with the ongoing resurgence of fascism.

Tech is a pop culture

Tech is a pop culture. Very few of the decisions made in the industry are made rationally or empirically. Studies and tests are used to justify the emotional decisions of the executive or management class. Infrastructure and stack decisions are made hedonistically – “cool” tech that makes the engineers and devs feel good about themselves almost always gets priority over “boring” tech that carries no risks.

The industry, especially the software side of tech, is driven by emotion and a sense of what is fashionable. There is genuinely more grounded engineering – materials, machinery, process, supply chains, etc. – taking place in the fashion industry than there ever has been in the software industry.

That’s why it’s a pop culture, not a fashion culture.

Toolmen

I think of the engineers and designers who have spent decades honing their skills, deepening personal and public creative practices in service both to the users of the systems they built and to their own brilliant spirits, now being told to park themselves in front of a sycophantic oracle that can be appeased only through rote dictates, and which never tires of lying even as their own minds and muscles atrophy from disuse. What is being automated here: the work or the people?

"It made me feel like I was eavesdropping"

The workers with knowledge of the project said it was initially intended to improve the chatbot’s voice capabilities, but the number of sexual or vulgar requests quickly turned it into an NSFW project.

“It was supposed to be a project geared toward teaching Grok how to carry on an adult conversation,” one of the workers said. “Those conversations can be sexual, but they’re not designed to be solely sexual.”

“I listened to some pretty disturbing things. It was basically audio porn. Some of the things people asked for were things I wouldn’t even feel comfortable putting in Google,” said a former employee who worked on Project Rabbit.

“It made me feel like I was eavesdropping,” they added, “like people clearly didn’t understand that there’s people on the other end listening to these things.”

A rude technology deserves a rude response

Critics have already written thoroughly about the environmental harms, the reinforcement of bias and generation of racist output, the cognitive harms and AI-supported suicides, the problems with consent and copyright, the way AI tech companies further the patterns of empire, how it’s a con that enables fraud and disinformation and harassment and surveillance, the exploitation of workers, its use as an excuse to fire workers and de-skill work, how these systems don’t actually reason and how probability and association are inadequate to the goal of intelligence, how people think it makes them faster when it makes them slower, how it is inherently mediocre and fundamentally conservative, how it is at its core a fascist technology rooted in the ideology of supremacy, defined not by its technical features but by its political ones.

[…]

I am here to be rude, because this is a rude technology, and it deserves a rude response. Miyazaki said, “I strongly feel that this is an insult to life itself.” Scam Altman said we can surround the solar system with a Dyson Sphere to hold data centers. Miyazaki is right, and Altman is wrong. Miyazaki tells stories that blend the ordinary and the fantastic in ways people find deeply meaningful. Altman tells lies for money.

LLMs don't do what they're usually presented as doing

LLMs, the technology underpinning the current AI hype wave, don’t do what they’re usually presented as doing. They have no innate understanding, they do not think or reason, and they have no way of knowing if a response they provide is truthful or, indeed, harmful. They work based on statistical continuation of token streams, and everything else is a user-facing patch on top.
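To make “statistical continuation of token streams” concrete, here is a deliberately crude sketch: a bigram table that picks the most frequent next token given the previous one. (This is an illustration of the principle only – real LLMs use neural networks over long contexts, and the training text and token names below are made up – but the interface is the same: tokens in, probability-weighted tokens out, with no notion of truth.)

```javascript
// Toy "statistical continuation": a bigram table mapping a token
// to counts of the tokens that followed it in some training text.
const bigrams = {
  the: { cat: 3, dog: 1 },
  cat: { sat: 2, ran: 1 },
  sat: { down: 4 },
};

// Pick the most frequent continuation of the previous token.
// No understanding, no truth-checking – just a frequency lookup.
function nextToken(prev) {
  const counts = bigrams[prev];
  if (!counts) return null;
  return Object.entries(counts).sort((a, b) => b[1] - a[1])[0][0];
}

// Continue a prompt token by token until we run out of data.
function continueText(start, steps) {
  const out = [start];
  for (let i = 0; i < steps; i++) {
    const next = nextToken(out[out.length - 1]);
    if (next === null) break;
    out.push(next);
  }
  return out.join(" ");
}

console.log(continueText("the", 3)); // "the cat sat down"
```

The output is fluent-looking continuation, not knowledge: the model would continue a false sentence exactly as confidently as a true one.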

JavaScript template literal as object property name

This will throw an error in every JavaScript engine:

Code language: JavaScript

{
  `mouseenter.${eventNamespace}`: handler,
}

Why? Because template literals are not actually literals, as confusing as that name is – they’re expressions, and a property name in an object literal must be an identifier, a string literal, or a number literal. However, JavaScript does let you use an arbitrary expression as a property name via computed property syntax (an ES2015 feature): wrap the template literal in square brackets like so:

Code language: JavaScript

{
  [`mouseenter.${eventNamespace}`]: handler,
}

This causes the JavaScript engine to evaluate the expression inside the brackets – the template literal – into a string, which is then used as the property name. Note that the brackets here are computed property syntax, not an array literal: no array is ever created or coerced.
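A minimal self-contained example (the `eventNamespace` value and the no-op `handler` are made up for illustration):

```javascript
const eventNamespace = "tooltip";
const handler = () => {};

// Computed property name: the bracketed expression is evaluated
// and its result becomes the property key.
const handlers = {
  [`mouseenter.${eventNamespace}`]: handler,
};

console.log(Object.keys(handlers)); // [ 'mouseenter.tooltip' ]
```

The same syntax accepts any expression – a variable, a function call, or a `Symbol` – which is why it works where a bare template literal does not.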