What shipping a consumer AI product taught us about building for businesses
There's a version of AI consultancy that involves a lot of decks, some workshops, a proof of concept that runs in a notebook, and a handoff document. Nobody is paying for that anymore — or they shouldn't be. The question clients are now rightly asking is: can you actually build something that works?
ioomm.app is our answer to that question. It's an AI product that generates professional corporate portraits from a selfie. It's live, it takes real payments from real users, and it runs without us watching it. We built it, shipped it, and it works. That experience — far more than any consulting engagement — taught us what it actually takes to build AI tools that hold up in production.
The gap between a demo and production
AI demos are easy. You call an API, get a result, show it to someone. It works in the right conditions, with the right input, on the right day. The demo looks impressive. Everyone nods.
Production is different. In production, users upload the wrong file format. They have slow connections. They submit the same job twice. They expect a result in 30 seconds and get frustrated at 45. The AI model returns something unexpected. The payment succeeds but the generation fails. The generation succeeds but the download link breaks.
Every one of those failure modes has to be handled — not because they happen often, but because when they do happen, they happen to a real person who paid real money and has a real expectation. Handling them is not glamorous engineering. It's the difference between a demo and a product.
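To make the "not glamorous" part concrete: a slow or flaky model call is typically handled with bounded retries and exponential backoff before surfacing a graceful error state. A minimal sketch of that pattern, where `call_with_retries` and its parameters are illustrative names, not anything from ioomm's actual codebase:

```python
import random
import time


def call_with_retries(fn, max_attempts=3, base_delay=1.0):
    """Call fn, retrying on failure with exponential backoff plus jitter.

    A sketch of production retry logic around a flaky external call
    (e.g. an AI model API). After the final attempt, the exception is
    re-raised so the caller can show a graceful error state instead of
    silently swallowing the failure.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # let the caller turn this into a user-facing error
            # Back off exponentially; jitter avoids synchronized retries.
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.5)
            time.sleep(delay)
```

The cap on attempts matters as much as the retries themselves: a user waiting on a paid job needs a clear failure message after a few seconds, not an infinite spinner.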
Building ioomm forced us to handle all of it. Async job queues. Retry logic. Graceful error states. Stripe webhooks that have to be idempotent. Image processing that has to work across every phone camera. We didn't build these because they were interesting — we built them because the product broke without them.
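Webhook idempotency is a good example of why these pieces exist: Stripe can deliver the same event more than once, so the handler must record which event IDs it has already processed and acknowledge repeats without re-running side effects (like starting a second paid generation). A minimal sketch of the pattern, assuming an in-memory store for brevity; the names `handle_webhook`, `start_generation`, and `SEEN` are illustrative, not ioomm's actual code, and production would use a database table with a unique constraint:

```python
# Illustrative idempotent webhook handler. In production, SEEN would be
# a persistent store (e.g. a DB table keyed on the Stripe event ID).
SEEN: set[str] = set()


def handle_webhook(event: dict) -> str:
    event_id = event["id"]
    if event_id in SEEN:
        # Duplicate delivery: acknowledge without repeating side effects.
        return "already processed"
    SEEN.add(event_id)
    if event["type"] == "checkout.session.completed":
        start_generation(event["data"]["object"])
    return "processed"


def start_generation(session: dict) -> None:
    # Placeholder for enqueueing the async generation job.
    print(f"enqueue job for session {session.get('id')}")
```

The same event replayed twice triggers the side effect only once; the second delivery is acknowledged and discarded.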
What this means for B2B tool builds
The instinct in B2B is to treat internal tools more loosely than consumer products. Nobody is paying directly, the user is a staff member rather than a customer, and the stakes feel lower. That instinct is usually wrong.
A field engineer whose job reporting tool crashes mid-job doesn't just get annoyed — they fall back to paper, the data never gets captured, and the compliance record has a gap. A hospital administrator whose data interpretation tool returns a wrong answer doesn't just shrug — they escalate, trust is lost, and the tool gets switched off.
The production requirements for B2B tools in regulated industries are often higher than for consumer products, not lower. The tolerance for errors is smaller. The consequences of failure are more visible. And the users are less forgiving, because they didn't choose the tool — it was chosen for them.
Having built something that real consumers use and pay for means we don't romanticise the build. We know what production looks like. We design for it from the start.
Why product instincts matter in B2B
Consultants think about deliverables. Product builders think about outcomes. The difference sounds subtle but it changes almost every decision.
A consultant delivering an AI tool asks: does it meet the specification? A product builder asks: does it actually get used? Does it make the person's job easier in practice, not just in theory? Does it survive contact with real workflows, real data, real edge cases?
Building ioomm gave us product instincts that consulting alone doesn't develop. When we scope a B2B AI tool now, we think about the failure modes before we think about the features. We ask what happens when the input is wrong, when the model is slow, when the user does something unexpected. We design the unhappy path before we polish the happy one.
That's not a methodology we adopted. It's a reflex we developed by shipping something and watching it break.
The honest difference between consulting and shipping
Consulting produces knowledge. Shipping produces capability. Both are valuable, but they're not the same thing, and confusing them is expensive.
A consulting engagement at its best produces a clear understanding of a problem, a well-scoped brief, and enough trust to commission a build. That's genuinely useful. But the consulting engagement is not the build. The document is not the tool. The recommendation is not the solution.
The organisations that get the most out of AI right now are the ones that can move from understanding a problem to having something running in production — without a 12-month gap in between. That requires build capability, not just analytical capability. It requires people who have done it before, not just people who understand how it works in theory.
ioomm exists partly because we wanted to be those people. We wanted to know what it felt like to ship an AI product, take a payment, watch a real user complete a real job, and deal with the things that broke. We know now. And it informs everything we build for clients.
Building AI capability for your business? We help businesses design and deploy AI systems for operations, knowledge, service, and growth. Book a diagnostic at alvento.ltd or email hello@alvento.ltd — first conversation is free.