I had a good argument against generative AI a few years ago: Rick Rubin.

In an interview with Anderson Cooper, he’d described his role as a music producer who couldn’t work a soundboard.

Rubin: I have no technical ability. And I know nothing about music. … I know what I like and what I don’t like, and I’m decisive about what I like and what I don’t like.

Cooper: So what are you being paid for?

Rubin: The confidence I have in my taste, and my ability to express what I feel, has proven helpful for artists.

It’s that last sentence that clarified the distinction between human thought and LLM-based generative AI.

AI lacks taste.

Artificial Intelligence and Taste

The debate over whether AI in its current state is “intelligent” is semantic at this point. It’s smart in the way computers are smart—they can do some tasks much faster and with far fewer errors than we can.

Taste is a uniquely human attribute (we’ll set aside other animals, which arguably have tastes of their own).

Any taste an AI model might seem to exhibit is a result of training and programming by humans. Its taste is equivalent to a bias, not a critical opinion.

My understanding of LLM-based AI models is that they generate output by mathematically predicting each next word in a sentence based on the preceding words as well as additional context from their massive pools of training data.1

As a model is trained on more and more data, its output might change over time based on what it has ingested. But it’s not developing taste. It’s obeying algorithms.
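To make that concrete, here’s a deliberately tiny sketch of next-word prediction, in Python. It isn’t how a real LLM is built (those use neural networks trained on enormous corpora, and they predict tokens rather than whole words), but the loop captures the idea: at each step, pick the statistically likely next word given what came before.

```python
# Toy illustration only: a tiny bigram "language model" that picks the
# statistically most likely next word given the previous word.
# Real LLMs use neural networks over enormous corpora and operate on
# tokens, not whole words, but the generation loop is conceptually similar.
from collections import Counter, defaultdict

training_text = "the producer trusts his taste and the producer trusts his ears"

# Count how often each word follows each other word in the "training data."
follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def generate(start, length=5):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        # Always choose the most probable continuation: no opinion,
        # no taste, just arithmetic over what the training data contained.
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))  # e.g. "the producer trusts his taste and"
```

Swap the hand-counted table for a neural network and the sentence fragment for your whole prompt, and you have the rough shape of the real thing. Neither version forms an opinion along the way.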

That’s why it’s so frustrating to see business decision-makers abandon their creative colleagues so quickly in favor of “free” AI-based tools. It’s clear that AI is going to be transformative for some tasks—highly specialized or limited-scope tasks, summarization, data analysis, etc.

But for purely creative tasks—what you might call artistic work—taste is a critical attribute. Art is opinionated, or it’s boring.

For AI, there is no such thing as opinion. There is no taste. So content produced by AI can only be derivative.

The Place for AI in Art and Work

All this is on my mind again 1) because AI weaves its way into literally every discussion every day and 2) because Rick Rubin again appeared in my endless media scroll, but this time he was talking specifically about AI.

While discussing his digital book, The Way of Code: The Timeless Art of Vibe Coding, he analogized AI-powered coding to punk rock music:

“In the past, for music, you had to go to the conservatory and study for years and years. Then someday, you could play in a symphony,” he told Andreessen and Horowitz. “And then, when punk rock came along, you could maybe learn three chords in a day—and there were all these bands. That made it for everybody. How I started in music was punk rock. If you had something to say, you could say it. You didn’t need the expertise or skill set, other than your idea and your ability to convey it. And vibe coding is the same thing—it’s the punk rock of coding.”

He’s not suggesting we take the creativity out of art. He’s advocating for empowering creativity by using AI to make the tools accessible.

I’m a writer, and I’ll never advocate for replacing me with AI (it’s not as good as me, I promise). I’m sure no programmer would want to be replaced by AI.

But there’s a lot to be said for the ability to drive creativity with technology.

In fact, here’s a recent example of an actual software developer who has a positive outlook on AI in programming.

AI Doing the Dirty Work

Ken Kocienda is a longtime programmer (the inventor of the iOS autocorrect feature) who’s seeing the benefits of using AI in his workflow to enrich his creativity, not water it down.

He says:

I’m more productive with AI assistance. Today, I write fewer lines of code than ever—by hand in the old-fashioned way—yet I create more code than ever. What’s more, as far as I can tell, there is no detectable reduction in quality. I’m just faster at making changes, fixing bugs, and turning out more features.

The AI helps with the line-by-line manual work, and the human can focus their energy on the creative work.

Kocienda continues:

My mind feels freed up. I remain at the higher levels of abstraction, with more time to think about ideas and plans. There’s less cognitive overhead in attempting things, so I attempt more things.

With AI editing thousands of lines of code in seconds (a task that it’s frighteningly good at), a programmer can try ideas more freely and innovate faster.

AI as Tool, Not Artist

There seems to be a distinction forming between the kind of purely generative AI that first exploded in popularity a few years ago and the kinds of AI search engines or agentic AI tools that can browse the web, perform administrative tasks, analyze data, and help speed up workflows.

That second kind of AI doesn’t threaten creativity. Hopefully.

My original argument against generative AI was that it doesn’t have taste. It can’t. It still can’t.

My argument now is that it’s offensive to overvalue a dumb large language model and undervalue truly creative human brains: the people who have the focus, willpower, and ambition to act on that creativity and do great work. The artists.

AI doesn’t have taste, and that’s good. It doesn’t need to, because we have taste. Let AI tools help us act on our taste in more productive and innovative ways instead of replacing the creative process entirely.

  1. When I asked ChatGPT if this was a correct description, it replied: “Large Language Models (LLMs) like mine generate text by predicting the next word in a sequence based on the context provided by the previous words. This process relies on patterns learned from extensive training data, enabling the model to produce coherent and contextually relevant text. The models use complex mathematical structures, such as neural networks, to determine the most likely next word.”