DnD, AI & the Prompt Problem

A small chihuahua stands in front of a Dungeon Master's screen. Above its head, an abstract gen-AI prompt box floats with the text 'Prompt me please'

Great RPG systems constantly prompt players. What would happen if generative AI took a page out of the Dungeon Master’s Guide?


Last weekend, I found myself unexpectedly roleplaying a failed Chihuahua romance. For two brief, glorious minutes, I became Bill: anxious, suspicious, refusing to leave his post, not even for the most beautiful lady Chihuahua he had ever seen.

How did I get here? Let's back up.

This summer, I signed up to run a DnD session for teens.

I'd never DM'd for strangers before. To prepare, I play-tested my module with friends. We had… something just short of fun. Some folks struggled to connect with the open-ended story I provided, lost without enough guidance. They hemmed and hawed, discussing plans round and round.

But hey, feedback is what a play test’s all about!

I took their notes and added more structure, breaking down information I needed to prompt the players with in 10-minute increments.

Not what plot I needed to deliver, but what information to prompt the players with, so the players could handle the plot themselves.

And that was how, one week later, I found myself roleplaying the Chihuahua.

The teens were fantastic. I gave them a heist, and they went ham—set hawks loose, lit everything on fire, and had a shapeshifter animal companion transform into a... Chihuahua. It was the kind of rangy, delightful storytelling you can only get with co-creation.

This has me thinking a lot about collaboration, AI, and whether what I think I want is truly what I want.

DnD & The Art of Prompting

In Dungeons and Dragons, the DM walks a tightrope. If they generate too much content, players consume a story rather than co-create it. Too little, and players wander.

In other words, ideally, the DM builds a scaffold, but the players turn it into a home. Or a Zeppelin. It's up to them.

DnD's system doesn't hand you this idea. It's legendary for its thick player handbooks and obscure edge-case rules. But at its core, the mechanics give the DM guidance to non-deterministically prompt the players towards their next action.

So where does AI fit into this?

I want a system that prompts me, as much as I prompt the system.

Mapping RPG Scaffolding to AI Tools

Here's a pass at a scaffold spectrum for RPGs:

  • Too much: The DM completes the plot. The player consumes.

  • Too little: The DM offers minor nudges. The player wanders.

  • Just right: The DM distills relevant information, prompting the player. The player follows prompts that interest them. The DM builds on those. Together, they come up with something neither could have predicted.

Applying it to generative AI:

  • Too much: The AI completes the task. The user consumes. (Midjourney, DALL-E)

  • Too little: The AI offers minor nudges. The user wanders. (Spellcheck)

  • Just right: The AI distills relevant information, prompting the user. The user follows prompts that interest them. The AI builds on those. Together, they come up with something neither could have predicted.

I am, of course, being reductive with this parallel. “Too much” could be a great movie or campfire tale. “Too little” could be an aimless but calming doodle across blank paper.

But what I’m getting at is this: I don’t want to just swallow content. I want to co-create. I want to move from passive consumption to active participation.

"Write an essay for me…" robs me of the guidance to grow as a writer.

"Give me a painting…" leaves me bereft of the satisfaction of a finished work of art.

"Create an app for me…" offers no context on best practices, resulting in a gloopy mess.

AI as Content Factory

Today, many AI workflows can be boiled down to: I prompt, it gives me content. I consume the content.

This has some mundane but solid wins: It can give me middling analysis on my breast cancer risk, aid quick iterative thinking, and find a subject line for that email I didn't want to spend one more second thinking about. These are things I am grateful for.

But here’s the trap:

I ask it to "Write an essay", and it delivers a mediocre essay. I think, “I would like this essay to be written more effectively and efficiently.” So I try to prompt AI better.

I think that’s the wrong instinct. Let's interrogate this.

Co-Creation Matters

First, my full-throated participation in creation makes a better experience: I love the "aha" moment of accomplishment, when I understand for myself. It clicks. This is the difference between running the marathon and hiring a Strava mule—the essence of Type 2 fun.

There are many tasks (such as coding) where, without more aid and context, I am stuck. But with a bit of guidance, I can own them.

I want AI that ushers me into the flow state, not flows for me.

Second, the unexpected is often where the most delight lives: I am not a perfect predictor of what will excite me—new cuisines, new music, new dialogue branches. I live for the ooooh shit moment, where threads collide in a way I could not have predicted.

A second creator, one with access to very different information from me, will continually surprise me—push me, stretch me, and make me giggle. If it simply parrots back the most polished iteration of exactly what I ask for, the mirror image degrades and degrades.

So what?

So why am I—a B2B product designer—spending time on this abstract ramble?

In 1992, with witchy foresight, Mark Weiser argued for moving away from a 'Co-Pilot' model towards a heads-up-display (HUD) model. He used a flight assistant as an example: "The goal of the pilot's assistant is to enhance the ability of the pilot to fly the airplane" — not to fly it for him, "leaving behind just an enhanced ability to act".

His presentation corralled my disparate RPG thoughts.

As I design AI use for software, do I understand the tasks my users face deeply enough to offer them the strategic information they need to problem solve? Can I offer them AI that equips them? Generates with them, not for them?

If the AI generates for users, it limits the scope of my ambition as a designer—AI mediates a collaborative relationship between me and the user.

“…the model can only generate recombinations of existing ideas... Such a model can’t directly generate an image based on new fundamental principles, because such an image wouldn’t look anything like it’s seen in its training data.” - Using Artificial Intelligence to Augment Human Intelligence

Unlike AI, the user is a universe of original thought. A limitless source of learning for me.

If I give the user AI that makes effective and efficient content for the user, however wonderful that content may be, it will reduce the user to remixing existing ideas. And so I am limited as well.

It sounds much more fun to collaborate.
