Who I Am and Why I Built This

March 16, 2026 by Jared
Tags: adhd, add, neurodivergent, auhd, doompile, origin story, aphantasia

So about ten days ago, my wife and son left the house for the evening and I was sitting in my living room feeling that thing. You know the one. Not the urge to clean. The guilt. The heavy, sticky guilt of looking around a room that you know is a mess but also looks exactly like it's always looked your entire life.

I have ADHD. I've had it since I was a kid. Ritalin on, Ritalin off, bouncing off walls. I was the kid who couldn't sit still, couldn't shut up, couldn't stop touching things. That's mellowed out some. I'm 28 now and these days it shows up less as bouncing off the walls and more as staring at them. The mess doesn't bother me until it suddenly does, and then I'm drowning in it and everything needs to be thrown away immediately. Like, my wife has literally had to go through trash bags after one of my panic-cleaning episodes because I'll decide that the solution to a messy room is owning less stuff. Right now. All of it. Gone. She's very patient.

But that night, I didn't go nuclear. I did something weirder. I opened an AI chat app, took a picture of my living room, and typed: "What would this look like clean?"

And here's where I need to go on a tangent. It'll make sense, I promise.

Quick tangent: I can't see things in my head (if you don't care, skip to the good part)

I have aphantasia. Found out about six months ago watching a random video. Aphantasia means when I close my eyes, I see nothing. Black. Every single time.

When you say "picture a beach," I can tell you facts about beaches. Sand. Waves. Blue sky. But there's no image in my head. There has never been an image in my head. I have dreams and I remember them, but even those are more like facts I know happened than pictures I can replay.

I have this memory from when I was a kid at the library. Another kid said "just picture it in your head" and I concentrated really hard and kind of convinced myself I could almost see something. I couldn't. I just didn't know that yet.

When I was little I used to rub my eyes really hard because the fuzzy colors and static were the closest thing I had to seeing something with my eyes closed. That was my visual imagination. Pressing on my eyeballs until colors happened.

I didn't know this was weird. I knew people could "picture things in their mind" and I knew I didn't do that, but I never connected those two facts into "oh, that's a whole condition with a name." It was just... how things were.

So why does this matter for a cleaning app?

Because when I look at my messy living room, I literally cannot picture what it would look like clean. I don't see individual items that need to be moved. I see a room full of stuff. It's all just... room. Somewhere in the back of my brain there's a feeling that something is wrong, but I can't turn that feeling into a picture of what "right" looks like. And if I can't see the destination, I definitely can't see the steps to get there.

It's like colorblindness but for clutter. The mess is there. I just can't see the individual pieces.


Back to the living room

So the AI showed me a cleaned-up version of my room and something clicked. For the first time, I could see it. Not in my head. On my phone. But still. I could see what clean looked like and suddenly the mess made sense. Oh, that doesn't go there. That needs to move. That shouldn't exist in this room at all.

One of the big things it flagged was this easel we had in the corner. It said something like "find a better location for this." And I looked at that easel, the easel I'd been walking past for months, and thought: huh. Yeah. That thing does take up a lot of space. So I moved it under the stairs.

A few days later a friend came over and said, "Dude, I was going to tell you to move that easel. It takes up way too much room. And I see you already put it under the stairs."

I was like, yeah, my app told me to do that.

And it hit me that a human and an AI both saw the same problem in my living room, and I'd been walking past it for months without registering it. Not because I'm lazy or I don't care. Because my brain genuinely does not process visual clutter the way most people's brains do. The executive function that says "that easel is in the wrong spot" just... isn't there for me. It's not a motivation problem. It's a hardware problem.

Why it's an app and not just me yelling into ChatGPT

I told my wife I wanted other people to have this. So I did a bunch of research on how to build it so that as many people as possible could use it for free and it would basically pay for itself. I'm not trying to start a company. I'm a developer with ADHD who couldn't clean his living room and accidentally built something that worked.

Originally I thought the audience would be overwhelmed moms AND neurodivergent people. Cast a wide net, right? So I showed it to some moms.

They looked at it and said, "I know how to clean."

Oh.

Oh.

Right. Most people look at a messy room and see the steps. They might not want to do the steps, might be exhausted, might be busy. But they can see them. They look at a counter with dishes and mail and think "dishes go in the sink, mail goes in the drawer." Automatically. Without effort.

I look at that same counter and see a blob of overwhelm. My brain shuts the door and walks away. But when a picture on my phone says "there are six things here and they all go in the kitchen," suddenly it's easy. Oh. Six things. Kitchen. I can do that.

So I realized pretty fast this isn't for people who are too busy to clean. It's for people whose brains skip the planning step entirely.

So the audience narrowed. ADHD. Autism. Executive dysfunction. People who look at a pile and feel everything and see nothing.

Does it actually work though?

I spent weeks running dozens of test scans just to build the engine and fine-tune the AI. But the first time I actually picked up my phone and used it to genuinely clean my room? It was weird. I didn't have to think about what to grab next. Didn't have to decide where anything goes. Just followed the list.

When my wife came home I told her it actually helped. Which sounds dumb because I built the thing, obviously I think it should help. But knowing something should work and feeling it work are different things. It felt good. Like someone was in the room with me, calmly telling me what to pick up next so I didn't have to figure it out myself.

That's Doompile. You take a picture of your mess. AI figures out what's there, groups it by where it goes, and gives you a walking plan. No decisions. No figuring out where to start. Grab this, walk there, come back.
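If you're curious what that looks like under the hood, the core idea is basically a group-by. This is a hypothetical sketch, not Doompile's actual code; the function and field names (`build_walking_plan`, `detections`, `destination`) are mine, invented for illustration:

```python
# Hypothetical sketch of the "walking plan" idea: take a flat list of
# detected (item, destination) pairs from a photo scan and group them
# into one trip per destination, so you never decide what to grab next.
from collections import defaultdict

def build_walking_plan(detections):
    """detections: list of (item, destination) pairs."""
    trips = defaultdict(list)
    for item, destination in detections:
        trips[destination].append(item)
    # One trip per room: grab everything headed there, walk, come back.
    return [
        {"destination": dest, "grab": items}
        for dest, items in trips.items()
    ]

plan = build_walking_plan([
    ("coffee mug", "kitchen"),
    ("mail stack", "office"),
    ("cereal bowl", "kitchen"),
    ("hoodie", "bedroom"),
])
for trip in plan:
    print(f"Grab {', '.join(trip['grab'])} -> {trip['destination']}")
```

The point of the grouping is that "six things, all kitchen" is one decision instead of six.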

That's pretty much it

I don't know if this is going to be a big thing or a small thing or a nothing thing. I just want to help people who feel the way I felt sitting in that living room. Three free scans a month. No credit card.

Go take a picture of your pile.

Built with love and executive dysfunction.

Ready to destroy your doom pile?

Snap a photo. AI builds the plan. You just follow it.

Get My 3 Free Scans