John Carmack defends AI amid backlash over Microsoft’s generative Quake II demo

It really is worth reading Carmack's full tweet. He watched the thing that interests him most, systems engineering, get completely trivialized in video game development over time. So he's not talking out of his ass here. He's not shilling for anything, just offering a reasonable take based on his experience of living through similar things in the past.

I honestly couldn't think of a better way to write out how I feel about AI in my industry (programming). It's a tool I need to adapt to, it's going to eliminate some jobs like all low-code tools do, and there are some scary things about how people are using it that worry me.

The problem with his premise is that a core assertion he makes is objectively incorrect. Game engines have never, at any point, de-emphasized the engineering side at all. The evolution of game engines has continuously added new layers to core engineering.

Carmack's original game engine had a very rudimentary level scripting functionality. He sketched levels out on graph paper, created data tables to represent these levels, fed the data into the engine, and the engine produced arbitrary game play based on these levels. It would require zero engineering skill whatsoever for anyone to grab a sheet of graph paper and design levels ... and precious little to do the data-entry of converting graph paper notation to computer-readable data tables.

While there is very little plumbing in Carmack's nostalgia case, that split engine/level-design structure itself is really no different today. From that point to now, all that has happened is that level designers have been given computerized design tools instead of graph paper, along with an ever-growing palette of elements to work with.

The odd bit is that Carmack himself helped pioneer engine-side support for now-common design features like fully exposed scriptable APIs, level editors, etc. None of that can be leveraged by level designers and artists without a TON of back-end engineering every step of the way. This process has never stopped. The more features and capabilities exposed to the design teams, the MORE back-end engineering has been needed to support them.

Carmack's telling of things is always horribly convenient for whatever political point he's trying to make but often technically quite squishy if you poke at it even a little. He's just intellectually lazy.

Depending on what he means, I think it is generally correct AI tools will inevitably be used in certain ways for game development. At the same time, this objectively terrible tech demo is a pretty rough context from which to make the case. There is a real possibility trying to use generative transformer models this way is a technological dead-end.
 
Upvote
9 (9 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,400
(I work in this space on a commercial basis but my clients are now by and large CPG companies. I make interactive 3d stuff and videos, I use Unreal, Unity, 3ds Max, Adobe CC et al. I've been doing stuff in this area since 1989 and have worked in games, TV, online and print. I have no particular axe to grind).

But I think Carmack is right, especially with his farming analogy. It reminds me a bit of a time in the very early '90s when we were doing 3D using programs like Topaz and Linotype image-setters to create colour separations. I remember taking a sample brochure to the Print Industry Research Association in the UK and being told computers would never be up to doing colour; it was too demanding (we were using SGI workstations). When I took out the brochure (I'd actually gone to ask some questions about UCR/GCR) the argument immediately switched to a moral one about how dreadful it all was and how it would take people's jobs. Morality to one side, I think I'm still spinning from that pivot more than 30 years later.

My point is just this. This shit is going to happen, and it had better happen with creative professionals, people who actually know what they're doing, in control, not some social-media sensationalist thinking they can make the next big thing without any effort at all. AI is here. Stop bitching about it, find out what it's good at and what it's shit at (which is a lot), grab it by the balls, and use it like any other tool.

Otherwise we're all fucked.
That was still "people doing things, with a computer as the tool."

MLAs are "Use human creative works to train, then spit out prompt results based on what it was trained on."

AI for game development by its nature can't create anything new or interesting.
 
Upvote
0 (2 / -2)

Ozy

Ars Tribunus Angusticlavius
7,218
Depending on what he means, I think it is generally correct AI tools will inevitably be used in certain ways for game development. At the same time, this objectively terrible tech demo is a pretty rough context from which to make the case. There is a real possibility trying to use generative transformer models this way is a technological dead-end.
Used in what way? As the final playable game? Yeah, probably not feasible for the next decade(s). As a prototype to get a feel for certain aspects of gameplay? For more rapid iteration on tweaking game mechanics, level design, and character interactions? As a game-development 'tool' rather than the final product? He's absolutely spot on.
 
Upvote
-5 (1 / -6)

caramelpolice

Ars Scholae Palatinae
1,436
Subscriptor
The problem with his premise is that a core assertion he makes is objectively incorrect. Game engines have never, at any point, de-emphasized the engineering side at all. The evolution of game engines has continuously added new layers to core engineering.
Modern 3D game engines are some of the most complex and multifaceted pieces of software on the planet. They're doing real-time physics simulations, ray tracing, 3D animation, material shaders, enemy AI, environmental sound simulation and audio DSP, networking, all this and more at least 30 times a second, and some people get mad if they can't do it at least 144 times a second. Engineering on a modern game engine is insanely more complicated than the last time Carmack shipped a game.

And nothing Microsoft demonstrated here helps with any of it.
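
To put rough numbers on that (my own back-of-the-envelope arithmetic, not anything from the article or the demo), here's the per-frame time budget those frame rates leave for all of the work listed above:

```python
# Back-of-the-envelope arithmetic: the per-frame budget everything above
# (physics, rendering, AI, audio, networking) has to fit into.
for fps in (30, 60, 144):
    budget_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms per frame")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 144 fps -> 6.94 ms
```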
 
Upvote
6 (7 / -1)

Nalyd

Ars Tribunus Militum
2,845
Subscriptor
That was still "people doing things, with a computer as the tool."

MLAs are "Use human creative works to train, then spit out prompt results based on what it was trained on."

AI for game development by its nature can't create anything new or interesting.
But if creative people make the prompts it’s just another development tool.
 
Upvote
-3 (2 / -5)

Nalyd

Ars Tribunus Militum
2,845
Subscriptor
Because the costs, both monetary and environmental, are too high given the proposed benefits. It's like Tesla and FSD: it's always just around the corner, and humans wind up having to supervise these systems anyway. Sure, AI can generate content, but you'll still need humans to catch and fix errors.

There's also no business model that justifies this massive investment. Nobody is making money off of AGI, in fact it's a giant money hole. But rich motherfuckers would rather rely solely on technology to try and fix problems rather than admitting that they are the problem.
Again, your argument relies on a static view of how current models operate and on the current business situation for a technology in its infancy. Lots of stuff is developed at a loss and then, because of that investment, becomes more broadly useful.
 
Upvote
-8 (2 / -10)

Derecho Imminent

Ars Legatus Legionis
15,233
Subscriptor
Again, your argument relies on a static view of how current models operate and on the current business situation for a technology in its infancy. Lots of stuff is developed at a loss and then, because of that investment, becomes more broadly useful.
Sure, but you can't always tell which new ideas will be a success. Some win, many do not. People who say "this WILL be the future" are just trying to predict the future based on hopes and dreams. So far the evidence I've seen on LLMs is that they are weak, not revolutionary.
 
Upvote
5 (7 / -2)

Oak

Ars Tribunus Militum
2,484
Subscriptor++
The nested double quotation marks in the last paragraph made me visually rewind a bit to make sure I hadn't misread attribution before that point. Standard U.S. practice is to use single marks for inner quotation, double for outer. (The UK mostly does the reverse, at least in books, though its practice seems to vary in other media. Either way works as long as there's consistency.)
 
Upvote
0 (0 / 0)

robco

Ars Scholae Palatinae
781
Subscriptor++
Again your argument relies on a static view of how current models operate, and current business situations for a technology in its infancy. Lots of stuff is developed at a loss and then because of that investment becomes more broadly useful.
Again, always just around the corner, any day now. Why, give us all the electricity and silicon and we'll get there eventually! Why, the next-gen models will be even better (slightly)!

There are some genuine uses for AGI and even blockchain, but not near enough to justify this level of consumption. I don't want to burn down the planet just for mights and maybes.
 
Upvote
6 (6 / 0)

caramelpolice

Ars Scholae Palatinae
1,436
Subscriptor
Plenty of uncharitable comments here. Carmack has the right of it, eventually this will become yet another tool in the toolbox, and I expect there will be both good examples, and bad examples, of it being used.

Also, TV example, but ST:TNG's holodeck basically created storylines and events on-the-fly based on the user's request. If we want that (holodecks), we need this (AI game confabulation based on parameters).
I have no idea why Ars promoted this, but I vehemently disagree. You see, in the Star Trek future, capitalism is dead. People are free to pursue whatever they want. So, things like the holodeck aren't hurting anyone.

Not true here in the really real world. Complete opposite situations.
 
Upvote
7 (8 / -1)

caramelpolice

Ars Scholae Palatinae
1,436
Subscriptor
I have no idea why Ars promoted this, but I vehemently disagree. You see, in the Star Trek future, capitalism is dead. People are free to pursue whatever they want. So, things like the holodeck aren't hurting anyone.

Not true here in the really real world. Complete opposite situations.
weird how all these rich tech dudes love to talk about creating the holodeck or whatever but never plan to end war or racism or capitalism first. funny, that
 
Upvote
7 (8 / -1)

42Kodiak42

Ars Scholae Palatinae
837
Because you have to walk before you can run? Why does so much of the commentariat assume that the capabilities of current AI are the limit and no future improvement will occur? That improvement necessarily comes from tech development and demos like this one, even if it itself isn't a desirable experience.

Remember how much worse image generation was just 2 years ago? Baffles me how many of the presumably tech literate have such a static view on this stuff.
Except this isn't walking, this is squirming around on the floor in circles as you try to make the motions of running without standing up first. How is replacing the entire game engine with a visual generator trained on an extant game and hooked up to controller inputs supposed to pan out? All they've demonstrated is that this idea doesn't work, that they've failed, and there's too much going wrong to even identify failure points, limitations, or problems that need to be solved. This has only confirmed that their AI can't maintain a coherent game-state from its short-term context, and that's not an issue of technological refinement here, that's a fundamental limitation of the technology they're using.
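
To make that concrete, here's a rough sketch of the kind of loop being described. The names are entirely hypothetical (this is not Microsoft's actual API); it just illustrates why anything that falls outside the model's short context window simply stops existing:

```python
from collections import deque

# Hypothetical sketch of a "controller input + recent frames -> next frame"
# generator loop. CONTEXT_FRAMES is an assumed short horizon.
CONTEXT_FRAMES = 9

def run_generated_game(model, read_controller, render, first_frame):
    # The sliding window below is the model's ONLY memory of the game.
    history = deque([first_frame], maxlen=CONTEXT_FRAMES)
    while True:
        action = read_controller()
        # Predict the next frame from recent frames plus the current input.
        # Enemies, doors, or damage that scrolled out of the window no longer
        # exist as far as the generator is concerned.
        frame = model.predict_next_frame(list(history), action)
        history.append(frame)
        render(frame)
```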

And nobody is under the impression that "AI" will never get better, because "AI" never referred to any specific technology to begin with; it has always referred to whatever piece of software just did something that, until that point, only humans could do. AI referred to decision trees, chess robots, OCR, Markov chains, chatbots, simple neural networks, and now image generators and LLMs. Once they come up with a better tool that does the jobs of image generators and LLMs with a fundamentally different method, those too will stop being "AI".

And yes, image generation was worse two years ago; the problem now isn't that the results are unrecognizable nonsense, but that its peddlers don't pay the artists whose work made it possible, and that the lack of physical accuracy or coherent logic in image details renders the results unusable for many practical purposes such as concept art.
 
Upvote
7 (7 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,400
I have no idea why Ars promoted this, but I vehemently disagree. You see, in the Star Trek future, capitalism is dead. People are free to pursue whatever they want. So, things like the holodeck aren't hurting anyone.

Not true here in the really real world. Complete opposite situations.
And people are still the primary creative force in holodeck media.
 
Upvote
3 (3 / 0)
That was still "people doing things, with a computer as the tool."

MLAs are "Use human creative works to train, then spit out prompt results based on what it was trained on."

AI for game development by its nature can't create anything new or interesting.
Disagree. Even the most original and complex games can be described in plain language, using a few paragraphs detailing every subsystem, logical concept, physics rule, interface, and button interaction.

Creating a brand new style of art I could see being hard. But creating a revolutionary system is really just translating ideas into code.
 
Upvote
-8 (0 / -8)

steelcobra

Ars Tribunus Angusticlavius
9,400
Disagree. Even the most original and complex games can be described in plain language, using a few paragraphs detailing every subsystem, logical concept, physics rule, interface, and button interaction.

Creating a brand new style of art I could see being hard. But creating a revolutionary system is really just translating ideas into code.
How do MLAs accomplish any of that? They can't. And they definitely don't do anything "revolutionary."
 
Upvote
2 (2 / 0)
And people are still the primary creative force in holodeck media.
Right, because there is no financial incentive to replace humans in that fictional setting.

We DO NOT WANT corporations replacing us for satisfying mental work! It's one thing to replace unsatisfying, back-breaking menial labor, the stuff no one wants to do, but to have machines replace us for the satisfying work so we're FORCED to do the menial tasks just to survive? That's a dystopia. That's not the Federation; that's the Ferengi.
 
Upvote
2 (2 / 0)
How do MLAs accomplish any of that? They can't. And they definitely don't do anything "revolutionary."
Oh, I think you misunderstood my post. My point is that they don't have to. A reasonably creative person could describe a novel game in detail, and an AI could build it from the description.
 
Upvote
-7 (2 / -9)

WXW

Ars Scholae Palatinae
1,081
So one possible answer is they're not looking to create a published game, but to use this to speed development of internal prototypes, having the AI get down basic gameplay mechanics and generate assets in the style they want (think of the next Assassin's Creed prototype prompt being 'Assassin's Creed but set in medieval Japan') as a kind of draft that they can then build off of. A lot of the early development timeline is taken up with work like this, which you could conceivably do in hours instead of months. Sure, there are going to be people who take this idea and publish it (go look at the Nintendo Switch or PlayStation stores for 800 variations of store simulators), but presumably that's not what Microsoft is looking to do. I can imagine doing level-design mockups through a series of iterative prompts to get a rough idea of how the level might work and a playable version of it - all in hours - and then doing the actual work of building it, with no actual AI content in the final result. Shaving 6 months or a year off of your development timeline is no joke.
This is absolutely not useful for doing prototypes, in any way at all. It requires training with an already existing game, so you have to make the game/prototype first. It needs an absurd amount of data and processing power to work at all. It has horrible latency. The game mechanics are unreliable, they can change from one second to the next. Object and state persistence is terrible. Etc. etc. etc. You can't test a prototype this way.

If you want a prototype, just make a prototype, it will be much much better and much much faster. You don't need months, not even weeks for most prototypes; you can do it in hours for some. If you really want to use AI for whatever reason, you can use it for helping with programming and art; not that I would recommend it, but if somebody wants to say "you are judging it based on the capabilities today, not in the future!", whatever improvements you expect in AI, they will still be way better used for creating the actual prototype software, not hallucinating it.
 
Upvote
8 (8 / 0)

WXW

Ars Scholae Palatinae
1,081
It will replace a lot of people in the gaming industry. The question is: will they produce better games with good gameplay, or will they just regurgitate crap with it? Most of the games today have better graphics in many cases, but the gameplay is horrible...

Conversely, I also think that as the models get better and smaller, and as the computers also get better and smaller (i.e., imagine being able to do more things without being dependent on online computing services for the bulk of the design and engineering of your AI model), what if a small, talented 3-to-6-person company could accomplish something amazing without being tied to the large bureaucracy involved in creating a game in today's industry?
Didn't you just describe indie games, many of them actually amazing?
 
Upvote
10 (10 / 0)

WXW

Ars Scholae Palatinae
1,081
Let AI do the boring stuff. The trivial things. Makes a programmer's job more interesting. Admit it, it can be boring as hell. Of course we need bread on the table... Anyone did a career switch recently that wants to share their story?
It can be boring sometimes, but most of the time I enjoy it. For the trivial things I can just use libraries (mine or 3rd party), code snippets, etc., which I don't need to review as thoroughly as AI's output (that is waaaaaaaaaaay more boring).
 
Upvote
2 (2 / 0)

WXW

Ars Scholae Palatinae
1,081
As a dev, I would have to add that nothing about our line of work is about making more people work in the pipeline to deliver the final output. The brutal reality of it all is, we are always striving to automate, and AI is the ultimate tool of this era for it. I'm thinking of the old days when code integration was a manual task, and how today it's completely appveyor'd (or similar). Not to mention generating unit tests, etc., etc. WHAM is a natural progression of things, and any real developer would understand this. And any developer hankering after old ways of doing things simply reveals how they've become irrelevant to the programming culture as a whole. The day our evolution as programmers stops is the day we stop being programmers.
Maybe I'm not a real developer and should inform the people I work for, but I fail to see how WHAM is a natural progression of anything. As I said before, if you want AI, just use it to create the actual software, not hallucinate it, otherwise at the end of the process you get nothing but fancy videos, not a game.
 
Upvote
5 (5 / 0)

42Kodiak42

Ars Scholae Palatinae
837
(I work in this space on a commercial basis but my clients are now by and large CPG companies. I make interactive 3d stuff and videos, I use Unreal, Unity, 3ds Max, Adobe CC et al. I've been doing stuff in this area since 1989 and have worked in games, TV, online and print. I have no particular axe to grind).

But I think Carmack is right, especially with his farming analogy. It reminds me a bit of a time in the very early '90s when we were doing 3D using programs like Topaz and Linotype image-setters to create colour separations. I remember taking a sample brochure to the Print Industry Research Association in the UK and being told computers would never be up to doing colour; it was too demanding (we were using SGI workstations). When I took out the brochure (I'd actually gone to ask some questions about UCR/GCR) the argument immediately switched to a moral one about how dreadful it all was and how it would take people's jobs. Morality to one side, I think I'm still spinning from that pivot more than 30 years later.

My point is just this. This shit is going to happen, and it had better happen with creative professionals, people who actually know what they're doing, in control, not some social-media sensationalist thinking they can make the next big thing without any effort at all. AI is here. Stop bitching about it, find out what it's good at and what it's shit at (which is a lot), grab it by the balls, and use it like any other tool.

Otherwise we're all fucked.
There are tools that make our work better and easier and allow us to do more useful things faster. The vast majority of deployed AI tools are there to be cheap knock-offs of a job done the right way. They are useful at some things, but most of the time we see AI, it's used to take the actual work of a human being doing their job the right way and replace them with an ill-fitting substitute that has an unpredictable failure rate.

We have new technologies that some companies and people are slow to adopt, but we also have dotcom bubbles and Web3 numbskulls who think that cheap CGI and funny sunglasses are the only things they need to be "The way of the future!" And modern venture capital is no longer satisfied when its shit doesn't stick to the wall; it just doubles down and shoves it down our throats.

There are some people trying to come up with good uses for modern AI tech: enthusiasts and clever hobbyists who've identified ways to incorporate it into their projects. But the headlines are full of greasy salesmen with bad CGI and funny sunglasses throwing AI at everything they can imagine, including Carmack; they don't come out on top unless they get lucky picking a winning team and project.
 
Upvote
5 (6 / -1)

steelcobra

Ars Tribunus Angusticlavius
9,400
There are tools that make our work better and easier and allow us to do more useful things faster. The vast majority of deployed AI tools are there to be cheap knock-offs of a job done the right way. They are useful at some things, but most of the time we see AI, it's used to take the actual work of a human being doing their job the right way and replace them with an ill-fitting substitute that has an unpredictable failure rate.

We have new technologies that some companies and people are slow to adopt, but we also have dotcom bubbles and Web3 numbskulls who think that cheap CGI and funny sunglasses are the only things they need to be "The way of the future!" And modern venture capital is no longer satisfied when its shit doesn't stick to the wall; it just doubles down and shoves it down our throats.

There are some people trying to come up with good uses for modern AI tech: enthusiasts and clever hobbyists who've identified ways to incorporate it into their projects. But the headlines are full of greasy salesmen with bad CGI and funny sunglasses throwing AI at everything they can imagine, including Carmack; they don't come out on top unless they get lucky picking a winning team and project.
I'll note that the AI that's actually doing good things is not Generative. It's the Iterative/Evolutionary algorithms that are doing things that take research money to do.
 
Upvote
7 (7 / 0)

Hyoubu

Ars Scholae Palatinae
682
The argument that it's just a tool is a bit self-serving in this context. Game development is not like working in a mine. It is art. Like most art, it involves a lot of skilled people being paid for their creative passions, all while serving the end goal of making some studios money. I am not in this industry, but I know professional pianists and composers, and I wouldn't want what they do replaced by a computer, and neither would they. Technology entered these industries too, of course, but you can tell when the artists lead the conversation and not the business leaders.

I think I know, though, why many are increasingly skeptical of most new tech in general lately.
The original technocratic pitch sold to the public was that computers would replace the jobs we know are very dangerous or harmful to people and give people safer (and higher-paying) jobs in return, but the perk of things progressing fast is that people can just remember what really happened. Some of that good stuff happened, but many people just lost more creative or intellectual autonomy in their jobs, having to follow increasingly standardized procedures to the point that it took the joy out of the job they wanted in the first place.
 
Upvote
5 (6 / -1)
People didn't want to give up their hard-labor jobs and starve, either.
That's a little more accurate. Go ahead and offer an old timey gypsum miner a daily wage to stay home, see if he'll turn you down. (Move fast; he'll be dead in less than six months.)

It's all exploitation of labor, top to bottom. An accelerating cycle of an elite class replacing humans with mostly-good-enough software faster than our societal structure can adapt, all so line go up these 90 days now now now. As @Legion said, the conversation would be different if this wasn't the case. There would be focus on the tool aspect -- its suitability, efficiency, accuracy. (Right now, consensus? All ass.) I'd absolutely love that. To get there, we have to remember that technology is for humans, tools are for humans, and until we do the things in service of humanity necessary to fill in those bottom bricks of Maslow's pyramid, that's going to be the primary focus.
 
Upvote
7 (7 / 0)
That's a little more accurate. Go ahead and offer an old timey gypsum miner a daily wage to stay home, see if he'll turn you down. (Move fast; he'll be dead in less than six months.)
It's a bit more than that. Men have always taken pride in physical work and in being the biggest and strongest, until the steam engine was invented and it became moot. But yes, it's mostly about money. Watching less common talents become cheap. But it's not all about money. Tell me folks on SSDI are happy.
 
Last edited:
Upvote
0 (1 / -1)
Plenty of uncharitable comments here. Carmack has the right of it, eventually this will become yet another tool in the toolbox, and I expect there will be both good examples, and bad examples, of it being used.

Also, TV example, but ST:TNG's holodeck basically created storylines and events on-the-fly based on the user's request. If we want that (holodecks), we need this (AI game confabulation based on parameters).
Ars editors, please take note of how controversial your "staff pick" comment is here. And please stop being so credulous when it comes to investors' boasting about generative ai.

EDIT: typo
 
Upvote
6 (10 / -4)
It really is worth reading Carmack's full tweet. He watched the thing that interests him most, systems engineering, get completely trivialized in video game development over time. So he's not talking out of his ass here. He's not shilling for anything, just offering a reasonable take based on his experience of living through similar things in the past.

I honestly couldn't think of a better way to write out how I feel about AI in my industry (programming). It's a tool I need to adapt to, it's going to eliminate some jobs like all low-code tools do, and there are some scary things about how people are using it that worry me.
This is a really bad take.

You are resigning yourself to the situation with regard to generative AI. So you are accepting all the vast harms the technology is already causing. But worse than that stain on your soul is the fact that you're resigning yourself to the inevitability of generative AI. It isn't inevitable, not when the large majority of its supporters are those who stand to make money (i.e., steal money) using it.
 
Upvote
0 (4 / -4)
Tell me folks on SSDI are happy.
Tell me folks on SSDI are safe, have ready access to the necessities for a healthy life, and have security that this won't suddenly change for the worse.

We're getting far afield. Carmack lives in a world that doesn't exist anymore. Hell, it's arguable it never existed back then. How much competition did he have from Korean, Chinese, Indian, Indonesian coders?
 
Upvote
4 (4 / 0)
Tell me folks on SSDI are safe, have ready access to the necessities for a healthy life, and have security that this won't suddenly change for the worse.

We're getting far afield. Carmack lives in a world that doesn't exist anymore. Hell, it's arguable it never existed back then. How much competition did he have from Korean, Chinese, Indian, Indonesian coders?
Considering the sorts of people Carmack has been associating with as of late, as much as I appreciate his thoughts on coding, since he's a brilliant coder, I have no inclination to care about his thoughts on public policy.
 
Upvote
6 (6 / 0)

cococarbon

Seniorius Lurkius
8
Subscriptor
Can't disagree with what they're saying, BUT there's also a lot that's not being said.

For example, what about the ethics of training these models on other people's IP without their consent, or the incredibly high resource utilisation/requirements? And the way it facilitates harming real people with advanced deepfaking?

So there are indeed plenty of valid reasons to think that this tech is "disgusting". And honestly, this AI Quake II demo is a cool party trick, but there's nothing usable or helpful in it at all, IMO.
 
Upvote
3 (4 / -1)
Can't disagree with what they're saying, BUT there's also a lot that's not being said.

For example, what about the ethics of training these models on other people's IP without their consent, or the incredibly high resource utilisation/requirements? And the way it facilitates harming real people with advanced deepfaking?

So there are indeed plenty of valid reasons to think that this tech is "disgusting". And honestly, this AI Quake II demo is a cool party trick, but there's nothing usable or helpful in it at all, IMO.
The counterargument seems to point to specific categories of things we "accept" without any notion of degrees of harm.

To those who think degrees don't matter, let me ask if you consider there to be a difference between your thermostat being set to 75 degrees or to 75,000 degrees.

Such is the danger of purely categorical thinking, when in this case holistic thinking leads to a more helpful and meaningful understanding of the issue.
 
Last edited:
Upvote
1 (2 / -1)