Stitch
What Stitch is
Basically, we love visual programming languages (VPLs) and wanted a free, open-source version of Origami Studio that runs on modern technology (iPad, AR, LLMs).
Off the top of my head, the most important things about Stitch are:
- We run on iPad, iPhone and Mac — wherever you prototype.
- We support Augmented Reality and have a pretty unique approach to prompting with LLMs (see below).
- We were early adopters of SwiftUI. My favorite improvements were @Observable and .gesture accepting UIGestureRecognizerRepresentable (super helpful for complex touch interfaces). Biggest disappointment was SwiftData.
- We support a huge range of functionality — everything from logic to bezier animation curves to stateful operations like video capture.
- You can build some insane graphs — 3D models moving with music while hundreds of nodes update at 120 fps. Supporting this required digging deep into SwiftUI's Layout protocol and being careful about passing closures into Views.
- UX details matter: once we added the "press number to connect edge" shortcut, I never drew edges manually again.
How Stitch uses LLMs
What we tried that didn't work (at least not at our scale): fine-tuning o4 on custom DSL data
We defined the “steps” to build a graph from scratch:
- Add a node
- Change an input
- Create an edge
- Change the node's type (e.g. to concatenate strings instead of adding numbers)
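The four steps above lend themselves naturally to an enum. Here's a minimal sketch of what such a step DSL could look like — the names and payloads are illustrative, not Stitch's actual types:

```swift
// Hypothetical step DSL for building a graph from scratch.
// Each case mirrors one of the steps listed above.
enum GraphStep {
    case addNode(kind: String, id: Int)
    case setInput(node: Int, port: String, value: String)
    case createEdge(from: Int, to: Int)
    case changeNodeType(node: Int, to: String)
}

// "Make a red rectangle" expressed as a sequence of steps:
let steps: [GraphStep] = [
    .addNode(kind: "rectangle", id: 0),
    .setInput(node: 0, port: "color", value: "red"),
]
```

Fine-tuning data then becomes (prompt, [GraphStep]) pairs, which is what the ~500 reverse-engineered graphs were converted into.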
We turned roughly 500 graphs into our DSL (not manually! we just reverse-engineered existing graphs). We tried to teach the model simple things like "make a red rectangle," "make the rectangle draggable," or "change color when touched."
The problem: the model barely got better even with 10x the data (the first fine-tune used 50 examples, the second 500), and we spent $2,400 (24 hours at $100/hr). Getting to true VPL vibecoding this way would have taken too long and cost too much.
What actually worked: programmatically extending AI-generated code
Today, Stitch's AI is just:
- when you ask the model to make a change: graph -> code
- when the model comes back with more code: code -> graph
Instead of competing with the big models, we leverage them to generate code which we then programmatically turn into a graph.
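The round trip above can be sketched in a few lines. This is a deliberately toy version — the function names and the Graph type are invented for illustration, and the real serialization produces SwiftUI code rather than bare node names:

```swift
// Toy model of the graph <-> code round trip.
struct Graph: Equatable {
    var nodes: [String]
}

// graph -> code: serialize the current graph into source text
// that the LLM can read and extend.
func code(from graph: Graph) -> String {
    graph.nodes.map { "\($0)()" }.joined(separator: "\n")
}

// code -> graph: parse the model's returned code back into nodes.
func graph(from code: String) -> Graph {
    Graph(nodes: code.split(separator: "\n").map { line in
        String(line.prefix { $0 != "(" })
    })
}

let original = Graph(nodes: ["Rectangle", "Text"])
let roundTripped = graph(from: code(from: original))
// roundTripped == original
```

The key property is that the round trip is lossless for anything the LLM didn't touch, so the model only has to reason about code — never about the graph format itself.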
Here’s what our cofounder Adam had to say about it.
We've always been an interface to SwiftUI: a scrollable v-stack in Stitch is just a ScrollView { VStack { ... } } in SwiftUI, and so on.
All we needed was the SwiftSyntax library and a little mapping between Stitch and SwiftUI concepts. (LLMs even wrote most of the mapping!)
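To give a flavor of what that looks like: SwiftSyntax can walk the model's returned code and pull out the view calls, which is the raw material for rebuilding a graph. This is a sketch of the kind of traversal SwiftSyntax enables, not Stitch's actual mapping code:

```swift
import SwiftParser
import SwiftSyntax

// Collect the names of view constructors (ScrollView, VStack, ...)
// from a piece of SwiftUI source.
final class ViewCallCollector: SyntaxVisitor {
    var calls: [String] = []

    override func visit(_ node: FunctionCallExprSyntax) -> SyntaxVisitorContinueKind {
        if let name = node.calledExpression.as(DeclReferenceExprSyntax.self) {
            calls.append(name.baseName.text)
        }
        return .visitChildren
    }
}

let source = Parser.parse(source: #"ScrollView { VStack { Text("hi") } }"#)
let collector = ViewCallCollector(viewMode: .sourceAccurate)
collector.walk(source)
// collector.calls now lists the nested view calls, outermost first
```

From a list like ["ScrollView", "VStack", "Text"] plus the nesting structure, mapping back to Stitch concepts (scrollable v-stack containing a text layer) is mostly a lookup table — the part the LLMs helped write.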