TL;DR
- 41% of companies take more than four months to build a single static dashboard, and 72% of users export the data to Excel anyway, according to Starburst CEO Justin Borgman.
- A company called Starburst is building an AI chat interface on top of Apache Trino that lets you query all your data sources directly, without ETL-ing everything into a single warehouse first.
- The real value isn't just speed... it's asking questions you didn't know you needed to ask, before a dashboard could have been built for them.
- Hallucination risk is real. If the AI is right 90% of the time, the 10% that's wrong will eventually cost you.
- For most small businesses, the bottleneck isn't visibility into data... it's getting the data in one place at all. Fix the tooling first. Then worry about asking it questions.
On Infacto Daily this week, Jackson and I got into something I've quietly been frustrated with for a while: the gap between "I want to understand what's happening in my business" and "I can actually see what's happening in my business."
That gap has a name. It's called building a dashboard.
Forty-one percent of companies take over four months to build a single dashboard
According to Starburst CEO Justin Borgman, 41% of companies spend more than four months building a single dashboard. And then 72% of the users just export the data to Excel anyway.
Let me sit with that for a second.
You spend four months building something... and the people who asked for it immediately pull it into a spreadsheet the second they get it.
I've felt this firsthand. Right now, email opens and clicks for the Infacto Daily newsletter aren't in a report anywhere. If I needed to make a decision about subject lines today, I'd have to go dig manually, write queries, and piece it together myself. It's a lagging problem. By the time you want the data, you're already behind.
Jackson put it plainly: most dashboards are either out of date by the time you need them, or you need a new one built for the specific decision in front of you. Neither is fast.
What Starburst is actually doing
The traditional path to business intelligence looks roughly like this: you have data in five different places (your CRM, your Shopify, some spreadsheets your team keeps, maybe some API endpoints), and a team of data engineers runs ETL processes to pull it all into a central data lake or warehouse. Then you have tiered "data zones" where the raw data gets cleaned, joined, and massaged into something queryable. Then someone builds a dashboard on top of that. Then someone else exports it to Excel.
Starburst is built on top of Apache Trino, an open-source query engine that lets you query data where it already lives, across multiple sources, without duplicating it into a central warehouse. On top of that, they add a semantic layer (basically a map telling the AI what "revenue" means, which database it lives in, what it might be called elsewhere) and then a chat interface so you can just ask questions in plain language.
No ETL, no four-month dashboard build. You ask: "Which product category has the highest return rate this quarter?" and it goes and gets it.
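To make the semantic-layer idea concrete, here's a minimal sketch in Python. Everything here is hypothetical: the catalog names (`shopify`, `crm`), table names, and columns are stand-ins, and Starburst's actual semantic layer is far richer than a dict. The point is just the shape of the thing: a map from business terms to where the data lives, plus a query that spans sources without any ETL step.

```python
# A toy semantic layer: business terms mapped to the fully qualified
# Trino-style identifier (catalog.schema.table.column) where that data
# actually lives. All names are hypothetical examples.
SEMANTIC_LAYER = {
    "revenue": "shopify.sales.orders.total_price",
    "customer_email": "crm.contacts.people.email",
}

def qualify(term: str) -> str:
    """Resolve a plain-language term to its source column, or fail loudly."""
    try:
        return SEMANTIC_LAYER[term]
    except KeyError:
        raise KeyError(f"Unknown business term: {term!r}")

def federated_query_sql() -> str:
    """Build one query that joins two live sources. In Trino, a single
    SELECT can span catalogs, so the join happens at query time against
    the systems where the data already lives -- no central warehouse."""
    orders = qualify("revenue").rsplit(".", 1)[0]           # shopify.sales.orders
    contacts = qualify("customer_email").rsplit(".", 1)[0]  # crm.contacts.people
    return (
        f"SELECT c.email, SUM(o.total_price) AS revenue\n"
        f"FROM {orders} o\n"
        f"JOIN {contacts} c ON o.customer_id = c.id\n"
        f"GROUP BY c.email"
    )

print(federated_query_sql())
```

The chat interface is essentially the last mile on top of this: the AI uses the semantic map to translate "which customers drive the most revenue?" into a query like the one above.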
Palantir does something similar at the enterprise level, but their involvement tends to be deep and expensive. Starburst looks more like a mid-market layer.
The small business reality check
Here's the thing I kept coming back to during this conversation: most small businesses under $10M don't have a data visibility problem yet. They have a data organization problem.
If you're doing a few million on Shopify, your customer data probably lives across Shopify, your email platform, a spreadsheet or two, and maybe a CRM. Before you worry about an AI chat interface to query all of that... you have to connect all of that.
That's where the value of something like Starburst actually lands: not at the "start building reports" stage, but after you've got the tooling in place and suddenly realize you have mountains of data across five systems with no way to see it all at once.
Jackson pointed out that an IT-capable team could probably build this themselves using Trino (it's open source). A small company like Infacto could build a version that works for its specific needs without paying enterprise prices. The hard part isn't the query layer... it's the semantic mapping and the system prompts that make the AI understand your data model.
For the right business (high-volume e-commerce, field service companies with large CRMs, manufacturers with operational data spread across tools), this is a very real unlock. For a 10-person agency with spreadsheets? You've got earlier problems to solve first. The strategy diagnosis quiz is a good way to figure out which category you're actually in.
The hallucination problem nobody wants to talk about
Jackson brought this up and I think it's the most important consideration before anyone gets excited and deploys a tool like this on real business data.
If the AI is right 90% of the time, you'll start trusting it. You'll make decisions based on it. And then the 10% will catch you.
For people who know what the data looks like under the hood, this is manageable... because you know what "wrong" smells like. You can validate. But someone who doesn't have that technical context, and is leaning on the AI to coach them through their own data, may not even know when to be skeptical.
The semantic layer Starburst builds is partly meant to reduce this risk. If the AI has a clear map of where to look and what terms mean, it hallucinates less. But it's still a real tradeoff, and it's worth building human review into any decision that actually matters.
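One lightweight version of that review step, sketched here with hypothetical table names: before running any SQL the AI generates, check that it only references tables your semantic layer actually knows about, and route anything else to a human. This is not Starburst's mechanism, just an illustration of the kind of guardrail worth having, since a hallucinated table name is a strong signal the whole answer is wrong.

```python
import re

# Tables the semantic layer knows about (hypothetical names).
KNOWN_TABLES = {"shopify.sales.orders", "crm.contacts.people"}

def referenced_tables(sql: str) -> set:
    """Crude extraction of catalog.schema.table identifiers from a query.
    Good enough for a gut check; a real system would use a SQL parser."""
    return set(re.findall(r"\b([a-z_]+\.[a-z_]+\.[a-z_]+)\b", sql))

def needs_human_review(sql: str) -> bool:
    """Flag any query touching a table the semantic layer doesn't map."""
    return bool(referenced_tables(sql) - KNOWN_TABLES)

ok_sql = "SELECT category FROM shopify.sales.orders"
bad_sql = "SELECT total FROM warehouse.facts.revenue_summary"  # hallucinated table
```

Here `needs_human_review(ok_sql)` comes back `False` and `needs_human_review(bad_sql)` comes back `True`. It won't catch an AI misreading a correctly named column, but it cheaply catches the most obvious class of hallucination before it reaches a decision.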
What comes after dashboards
Jackson's take at the end of the episode stuck with me: dashboards are probably going away, at least in their current form. The future isn't a static view that gets stale. It's ad hoc queries on demand... you think of a question, you ask it, you get an answer, and then it goes away. No maintenance. No syncing issues. No report that breaks when someone changes a column name.
The ones you keep? The ones that turn out to be useful on a weekly basis? You schedule them. Claude, ChatGPT, and Cursor already have task scheduling built in if you've connected them to your data via MCP. So this pattern is here now, just not productized the way Starburst is going for.
I'll add one more layer: if you're running AI agents to do work in your business, you're going to need somewhere to see what they're doing. Dashboards will change shape... they'll become agent status pages more than KPI views. That's actually where I think something like Starburst could get really interesting over time.
Start with the question, not the tool
The thing I keep coming back to whenever a new BI or data tool shows up: what question are you actually trying to answer?
If you can't answer that, no tool helps. If you can answer it, start manually. Build the query yourself, even if it takes an hour. Do it a few times until you know exactly what you're looking for and what "good" looks like. Then you'll know what to automate, what to delegate to AI, and where to trust the output.
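Here's what "start manually" can look like in practice, using made-up newsletter events as stand-in data (the addresses and event log are invented for illustration): compute the open rate by hand once, so you know what a sane number looks like before you hand the question to an AI.

```python
from collections import Counter

# Hypothetical send log for one newsletter issue: (email, event) pairs.
events = [
    ("a@example.com", "delivered"), ("a@example.com", "open"),
    ("b@example.com", "delivered"), ("b@example.com", "open"),
    ("c@example.com", "delivered"),
    ("d@example.com", "delivered"), ("d@example.com", "open"),
    ("d@example.com", "click"),
]

# Count each event type, then compute the metric by hand once.
counts = Counter(event for _, event in events)
open_rate = counts["open"] / counts["delivered"]  # 3 opens / 4 delivered
print(f"open rate: {open_rate:.0%}")              # 75%
```

Once you've done this a few times on real exports, you know the baseline, the edge cases (do clicks imply opens? what counts as delivered?), and what a wrong answer from an AI would look like.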
That's the same principle as lead management, agent deployment, or anything else. Manual first. Repeatable second. Automated third.
Starburst is a promising shortcut on the infrastructure side. But the discipline of knowing your data well enough to validate AI answers... that's still on you.
If you want help figuring out where tools like this fit in your specific business, the free AI tools checklist is a good place to start sorting out what's actually useful versus what's just interesting.