What Happens When Designers Can Build, Not Just Design
Most product teams rely on pre-built charting libraries. They're practical, well documented, and they cover the common cases. But they also constrain what you can build. When your data doesn't fit neatly into a bar chart, a line graph, or a standard scatterplot, you're stuck: either you force the data into a format that doesn't serve it, or you write a spec and hope engineering can figure out something custom.
The chart above illustrates what this difference looks like in practice — the same feature built two ways, with the traditional design-development cycle taking roughly twice as long. Most of that extra time is rework caused by discovering problems late.
AI-assisted development tools like Cursor have changed this equation for designers. Not because AI is good at visual problem-solving — it isn't, and that's an important distinction. But because once you know what you want to build, AI can help you build it fast enough to validate the idea before committing engineering resources.
This article walks through two examples from my work at NFTfi, a peer-to-peer NFT lending protocol, where AI-assisted prototyping allowed me to build and validate solutions faster than the traditional design-handoff cycle would have allowed.
Example 1: A bespoke bubble chart — built in under a week
We needed a tool for lenders to explore the active loan landscape: timeline on the x-axis, APR on the y-axis, bubble size representing loan amount. The problem was that collection offers in NFT lending generate many loans at identical coordinates, which stack invisibly on any standard implementation. Pre-built charting libraries can't configure their way out of this; the problem called for something bespoke.
The solution came from research on Observable — D3's documentation and showcase site — where I discovered the packSiblings algorithm: a circle-packing method that forces overlapping circles to spread apart rather than stack. Once I knew what to build, Cursor compressed the implementation from days to hours. The result is a chart where clustered loans spread apart visually, each bubble remains individually inspectable, and the density of clusters communicates where market activity is concentrated — built and validated in under a week. I wrote a separate deep dive on this project — including the D3 research process and why AI couldn't solve the visual problem — here.
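For readers curious what the spreading behaviour looks like in code, here is a deliberately simplified sketch. It is not D3's packSiblings, which uses a much smarter front-chain packing algorithm; it's just a naive collision pass that captures the same idea of nudging coincident bubbles apart until none overlap. All names and values here are illustrative, not taken from the NFTfi codebase.

```javascript
// Simplified sketch of the "spread, don't stack" idea behind d3.packSiblings.
// NOT D3's actual algorithm: just an iterative collision pass that pushes
// coincident bubbles apart until no pair overlaps.

function spreadBubbles(bubbles, iterations = 200) {
  // bubbles: [{ x, y, r }] -- e.g. loans plotted at (date, APR), radius ~ amount.
  // Give perfectly coincident circles a tiny deterministic jitter so each
  // push has a direction to act along.
  bubbles.forEach((b, i) => {
    b.x += Math.cos(i) * 1e-3;
    b.y += Math.sin(i) * 1e-3;
  });

  for (let k = 0; k < iterations; k++) {
    let moved = false;
    for (let i = 0; i < bubbles.length; i++) {
      for (let j = i + 1; j < bubbles.length; j++) {
        const a = bubbles[i], b = bubbles[j];
        const dx = b.x - a.x, dy = b.y - a.y;
        const dist = Math.hypot(dx, dy);
        const minDist = a.r + b.r;
        if (dist < minDist) {
          // Push each circle half the overlap along the line between centres,
          // plus a hair extra so they settle just past tangent.
          const push = (minDist - dist) / 2 + 1e-6;
          const ux = dx / (dist || 1), uy = dy / (dist || 1);
          a.x -= ux * push; a.y -= uy * push;
          b.x += ux * push; b.y += uy * push;
          moved = true;
        }
      }
    }
    if (!moved) break; // every pair is now at least tangent
  }
  return bubbles;
}
```

In production you would reach for d3.packSiblings itself, which solves the placement far more efficiently and compactly; the sketch just makes the behaviour concrete — stacked bubbles become a tangent cluster, and every loan stays individually visible.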
Example 2: A responsive table component — 20 minutes from idea to validated code
Not every prototype needs to be a complex visualisation. Sometimes the value of AI-assisted prototyping is in validating small UX decisions quickly, using the exact tools your development team will use in production.
We had a data-dense table in the NFTfi interface showing loan details — asset name, loan amount, APR, duration, status, and an action button at the end of each row. On a wide monitor, the layout worked fine. On a laptop screen, the action button consumed disproportionate space, forcing either horizontal scroll or cramped columns.
The idea was straightforward: below a certain screen width, replace the full-text action button with a compact dropdown icon that expands into a menu. Standard responsive pattern. But the question was whether it would actually work with our data density and layout — and whether the right component existed in our library.
This is where a specific detail matters: at NFTfi, our front-end is built on MUI Minimal, a component library within the MUI ecosystem. I didn't just ask Cursor to build a generic responsive table; I specified that it should use MUI Minimal, the same library our developers use in production.
This is important because what's easy to implement in one component library can be surprisingly difficult in another. By prototyping with the exact same tools the dev team uses, I could be confident that what I built was actually feasible in production — not a concept that would need to be re-engineered with different components. When I showed the working example to our developers, they could look at the actual code, see which MUI components I'd used, and reference it directly in their implementation.
I pasted a screenshot of the actual table with real loan data into Cursor, asked it to implement the responsive behaviour using MUI Minimal, and had a working example in about twenty minutes. Real column widths, real data density, the correct component identified and implemented.
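Stripped of the MUI specifics, the decision the prototype had to validate is small enough to state as a function. The column widths, button widths, and the resulting threshold below are invented numbers for illustration only; the real table used MUI Minimal components and MUI's breakpoint system rather than hand-rolled arithmetic like this.

```javascript
// Framework-free sketch of the responsive decision. All widths are
// illustrative assumptions, not measurements from the NFTfi table.

const COLUMN_WIDTHS = { asset: 180, amount: 110, apr: 80, duration: 100, status: 90 };
const FULL_BUTTON_WIDTH = 140; // full-text action button at the end of each row
const ICON_BUTTON_WIDTH = 40;  // compact dropdown trigger that replaces it

function actionVariant(viewportWidth) {
  // Total width the data columns need before the action control is placed.
  const columns = Object.values(COLUMN_WIDTHS).reduce((a, b) => a + b, 0);
  // If the full-text button no longer fits alongside the data columns,
  // collapse it into an icon that expands into a menu.
  return viewportWidth - columns >= FULL_BUTTON_WIDTH ? 'full-button' : 'icon-menu';
}
```

In the actual prototype this check is the kind of thing MUI's useMediaQuery hook handles declaratively; the point of building it with real data was to confirm where the threshold actually falls, not the mechanism itself.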
The time saving isn't just in the twenty minutes. It's in what doesn't happen downstream. The developer receiving this doesn't need to guess which component to use, whether the responsive breakpoint works with real data, or whether the layout holds up at different screen widths. Those questions are answered. The handoff friction that normally stretches small UX decisions across multiple review cycles is almost eliminated.
User testing without the development bottleneck
There's another dimension to this that's easy to overlook. In a traditional cycle, it's not just bad data that can send you back to the drawing board — user testing can do the same thing. You design a feature, engineering builds it, you put it in front of users, and they tell you it doesn't solve their problem the way you expected. Now you're back in design, then back in engineering, burning another cycle.
With AI-assisted prototyping, you can test with real users before developers are ever involved. The prototype is functional enough — with real data, real interactions, real components — that meaningful user feedback comes in at the cheapest possible moment. If users don't understand the interface, or want something different, or use it in a way you didn't anticipate, iterating costs you hours, not weeks. And critically, your developers never spent time building something that needed to be rethought. They only start work once the concept has already survived contact with actual users.
For the bubble chart, we shared working versions with Calvin — a power user who runs a serious lending operation on NFTfi. Within minutes of using the prototype with real data, he identified clustering patterns that confirmed the packSiblings approach was working, and flagged edge cases in how expiring loans were displayed that we hadn't considered. That kind of feedback is only possible when someone is interacting with a functional tool using their own data — not reviewing a static mockup.
What this means for teams
For founders and product managers evaluating how to structure teams, this represents a meaningful shift in what a product-oriented designer can deliver.
A designer working with AI tools can now validate concepts against real data before writing a spec, reducing the risk that engineering builds something that doesn't hold up in practice. They can build with the exact component libraries the dev team uses, ensuring compatibility rather than producing mockups that need reinterpretation. They can explore bespoke solutions — beyond what pre-built libraries offer — and validate them in days rather than sprint cycles. And they can deliver working prototypes that developers reference directly, reducing the ambiguity that usually fills the gap between a Figma file and production code.
The financial impact is straightforward. A single wasted sprint cycle — where engineering builds something that doesn't survive contact with real data or real users — can cost a startup tens of thousands in developer time. Multiplied across a product's lifecycle, the cumulative cost of late discovery is one of the biggest drains on early-stage budgets. AI-assisted prototyping doesn't eliminate all risk, but it moves the most expensive discoveries to the cheapest possible moment.
This doesn't replace engineering. Production code needs architecture, testing, and performance work that a prototype doesn't provide. But it means that by the time engineering starts building, the core product questions have already been answered. What to build, which components to use, how it behaves with real data, whether users understand it — all validated before the expensive work begins.
Tools used: D3.js (with packSiblings), Cursor (AI-assisted development), MUI Minimal component library, Observable (D3 showcase and documentation)
For a deeper look at using D3 with AI — including specific prompts, debugging techniques, and lessons from ten D3 projects — see D3 Was for Engineers. AI Changed That.