
I Used AI to Audit My UX. It Can't Feel Frustration.

[Illustration: a neural network brain and a small heart separated by a gap, representing what AI cannot feel in UX design]

I’ve been building projects with Claude Code for the past few months. Many of them started without a plan. I’d describe what I wanted, prompt “build a better version,” and let it run.

After a few days on each project, the UX would collapse. Long lists, cluttered menus, low readability. Interfaces that work but make you want to close the tab. If you’ve built apps fast with AI, you know the pattern.

Yesterday, I tried a different approach.

Cowork Mode as a UX Auditor

I gave Claude Code’s Cowork mode access to the project folder, described my ideal user behavior, and asked it to audit the current UX. Find the gaps. Suggest restructures. Flag improvements.

I also asked it to open the app in the browser, walk the full flow, and test like a real user would.

The audit covered four areas:

| Task | Result |
| --- | --- |
| Identify UX gaps in the current flow | Flagged layout issues, redundant elements, unclear navigation |
| Suggest restructures for readability | Proposed grouping, hierarchy changes, component reordering |
| Flag improvements | Listed accessibility fixes, mobile responsiveness issues, contrast problems |
| Walk through the app as a user | Tested click paths, noted dead ends, checked state transitions |
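Checks like the contrast flags in that audit are mechanical, which is exactly why AI handles them well. As a concrete example (mine, not from Cowork's output), here is the WCAG 2.x contrast-ratio formula in a few lines of Python:

```python
def _linearize(c: float) -> float:
    # sRGB channel value (0-1) converted to linear light, per WCAG 2.x
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    # Weighted sum of linearized R, G, B channels
    r, g, b = (_linearize(v / 255) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    # Ratio of lighter to darker luminance, offset by 0.05; ranges 1:1 to 21:1
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# White on black is the maximum possible contrast, 21:1
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # → 21.0
```

WCAG AA asks for at least 4.5:1 for body text; gray `#777` on white comes out around 4.48:1, which is why a machine catches it and a tired human eye does not.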

The audit was thorough and logical. Each suggestion made sense on paper.

The result was okay. Not good enough.

Cowork gave me the most logical solution. I needed the most intuitive one.

The Emotional Layer AI Misses

Cowork can’t feel the frustration of a confused user. It can’t sense when something feels wrong before you can name why. It can’t tell you that a button placement follows the rules but pushes users away.

Those senses belong to humans. Token prediction doesn’t replicate them.

| Capability | AI (Cowork) | Human Designer |
| --- | --- | --- |
| Structural analysis | Catches hierarchy issues fast | Slower and less systematic |
| Accessibility flags | Thorough: contrast, labels, ARIA roles | Missed unless trained for it |
| Flow testing | Walks paths without fatigue | Skips edge cases, gets bored |
| Emotional resonance | Zero. Can't feel confusion or frustration | Where the real UX lives |
| "Something feels off" instinct | Not possible | The most valuable design sense |
| Visual harmony | Follows rules | Knows when to break them |

AI gives you the most logical answer. You supply the most intuitive one. Good UX needs both.

I’ve spent years building products and shipping apps. The gap between “works right” and “feels right” is where most products win or lose. Cowork closes the first gap. The second one belongs to you.

200,000 Neurons Playing Doom

This connects to something I saw last week that seemed unrelated.

Cortical Labs grew 200,000 living human neurons on a chip and got them to play the original 1993 Doom.

The process:

  1. Visual input from the game converts into electrical signals
  2. The neurons receive those signals on the chip and process them through firing patterns
  3. The firing patterns generate output signals mapped to movement and shooting controls
  4. The game responds, and the loop repeats
Game visuals → electrical signals → neuron chip (200K human neurons) → firing patterns → movement + shooting controls → game responds → loop repeats
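The closed loop above can be sketched in code. To be clear, everything here is a toy stand-in of my own: the real system stimulates living tissue and reads electrode activity, and the `encode`/`fire`/`decode` functions below are assumptions for illustration, not Cortical Labs' interface.

```python
def encode(frame: list[int]) -> list[int]:
    # Step 1: visual input -> electrical stimulation pattern
    # (toy version: threshold each pixel's brightness)
    return [1 if px > 128 else 0 for px in frame]

def fire(stimulus: list[int], state: list[float]) -> tuple[list[int], list[float]]:
    # Step 2: stand-in for the culture's response, modeled here as
    # leaky accumulation of input; units past threshold "spike"
    state = [0.5 * s + x for s, x in zip(state, stimulus)]
    return [1 if s > 0.8 else 0 for s in state], state

def decode(spikes: list[int]) -> dict[str, int]:
    # Step 3: firing pattern -> game controls (arbitrary toy mapping)
    return {"move": spikes[0], "shoot": spikes[1]}

def step(frame: list[int], state: list[float]) -> tuple[dict[str, int], list[float]]:
    # Step 4: one pass of the loop; the game would render the
    # next frame in response, and the cycle repeats
    spikes, state = fire(encode(frame), state)
    return decode(spikes), state

state = [0.0, 0.0]
action, state = step([200, 10], state)  # bright pixel on the "move" channel
print(action)  # → {'move': 1, 'shoot': 0}
```

The interesting part of the real experiment is what this sketch cannot show: the `fire` step is living tissue whose response changes as it learns, through mechanisms no one has fully mapped.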

The neurons play like a beginner who has never touched a computer. But they learn. They form patterns. They improve through biological processes no researcher has mapped yet.

Imagine giving AI a biological layer like that. Not token probability matching, but organic processing that adapts the way living tissue does.

I don’t know how close we are to that. But a dish of neurons learning to navigate a 3D environment and shoot demons makes the distance between silicon logic and human sense look shorter than it did last year.

Building With Both Layers

The best workflow I’ve found:

  • Use AI for the structural audit because it catches what humans miss through fatigue and familiarity
  • Use your own instinct for the emotional layer because no model simulates that yet
  • Combine both and you get UX that is sound in logic and right in feel
| Layer | Who Handles It | Why |
| --- | --- | --- |
| Structure, hierarchy, flow | AI (Cowork) | Systematic, tireless, catches gaps humans skip |
| Feeling, intuition, emotional weight | You | No model replicates the "this feels wrong" instinct |
| Final product | Both, in sequence | AI audits first, you refine with gut sense |

The Systems Before Tools philosophy applies. The system is your design process. AI is one tool inside it. A capable tool. Still a tool.

Let AI audit the logic. Keep the feeling for yourself. The best products come from running both layers.

The Feeling Stays Ours

I’ll keep using Cowork as my UX auditor. It catches structural problems, hierarchy gaps, and broken flows better than I can on a tired Tuesday afternoon.

What it misses is subtler: the quiet frustration, the moment a layout pushes users away despite following every rule. Those I have to catch myself.

Maybe Cortical Labs’ neurons will help close that gap one day. Until then, I’m Shahab Papoon, and I build with AI while trusting my own sense on the parts it can’t reach.
