Q1 2021

Crossroads

Rust, WebAssembly, and Choosing a Path

It is a new year, and I have been incommunicado. The project is alive but on the back burner. This crazy pandemic gave me time, and I used it to work out and improve myself rather than grind on code. Instead of committing to expensive execution, the focus is purely on strategy -- thinking this project through for the decades ahead.

And the first thing I thought was: I fucked up by targeting Java.

Cloudflare is investing deeply in WebAssembly with Durable Objects. DFINITY is introducing WebAssembly canisters for decentralized compute. IF ONLY I had had the wisdom to target WebAssembly, I could leverage other people's infrastructure. But WebAssembly feels too low-level for my patience, and targeting it directly brings a swarm of problems. What if I translated to Rust?

I started learning Rust, and I love it. It triggers the special feeling that I'm making something that will last. The borrow checker forces me to plan, think hard, and design the shape of the beast upfront. As an example, I wrote a JSON parser to flex my parser skills. It was not easy, and I like the results. The discipline of Rust is intoxicating -- Java code from a decade ago still works, and I like that, but Rust gives me confidence that we can have good software that will outlast our attention spans.

But following Rust into WebAssembly would be a siren song. The gaps between what the browser offers and where the Rust ecosystem sits are real. I need to find a balance where long-term investments can migrate to Rust over time without getting bogged down in missing features today. The browser is a challenging foe to usurp.

This project is in the dip. I re-read The Dip by Seth Godin -- a small book about the power of quitting early or being strategic to push through. The key is recognizing your situation. There is a long road ahead. I will make catastrophic mistakes. But I must soldier on with a strategy.

Meanwhile, the core architecture was crystallizing. I wrote a manifesto for how the UI should work with Adama. The mental model: you're alone in front of your computer playing a board game with friends online. You interact with a picture. Those interactions become messages sent to the Adama server or update local view state. Other users are sending signals too. The server synchronizes its state to all clients. The DOM combines hidden viewer state with server state to produce the picture. Feedback loop complete.

The network problem required serious thought. You can send the entire state via UDP (tolerates packet loss, but doesn't scale). You can use the command pattern over a TCP-like stream (gives you replay, but creates unbounded queues). Or you can use state synchronization with state differentials -- which is what Adama does.

For board games, state is bounded (there are only so many pieces in the box). With finite state, the worst case under flow control is simply retransmitting the entire state, so blips in the network manifest as momentary jitter rather than catastrophic data loss. This requires a data model that supports both differentiation and integration -- now you know why you studied Calculus.

The pieces fall out naturally: JSON merge becomes the algebraic operator; arrays are problematic, but specialized objects overcome that; an infinite stream of updates collapses into a finite update; and flow control from server to client becomes effective and cheap.
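To make "differentiation" concrete, here is a minimal sketch of computing a merge patch between two states. This assumes object-only state (no arrays), and the game fields are made up for illustration -- it is not Adama's actual code:

```python
import json

def diff(before, after):
    """Compute an RFC 7396-style merge patch turning `before` into `after`.
    Deleted keys become null; unchanged keys are omitted."""
    patch = {}
    for key in before:
        if key not in after:
            patch[key] = None           # deletion encoded as null
    for key, value in after.items():
        if key not in before:
            patch[key] = value          # new key: send the whole value
        elif isinstance(before[key], dict) and isinstance(value, dict):
            child = diff(before[key], value)
            if child:
                patch[key] = child      # recurse; omit empty sub-patches
        elif before[key] != value:
            patch[key] = value          # changed value: send the new one
    return patch

before = {"board": {"turn": 1, "pot": 40}, "done": False}
after  = {"board": {"turn": 2, "pot": 40}}
print(json.dumps(diff(before, after)))  # {"done": null, "board": {"turn": 2}}
```

Integration is the inverse direction: applying that patch to `before` reproduces `after`, which is exactly what the client does with each update off the wire.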

The cost is that you lose convenient change logs. There's no "player X played card Y" narrative for free. Instead you must construct human-friendly descriptions from before and after states. No silver bullets in this life.

JSON itself became an obsession. RFC 7396 merge is an exceptionally powerful idea -- it makes the set of all JSON objects into something very close to an algebraic group. If you interpret null as equivalent to absence, the empty object acts as an identity and deltas compose. As a math guy, group-like properties signal a general substrate for useful stuff.
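One way to read "null as nothing": instead of deleting immediately on application the way RFC 7396 prescribes, keep null as a tombstone and strip it at read time. The function names and example patches below are mine, sketching that interpretation rather than quoting any Adama source:

```python
def merge(target, patch):
    """RFC 7396-style merge, except null is kept as a tombstone rather than
    deleting on the spot; "null means nothing" is applied at read time."""
    if not isinstance(patch, dict):
        return patch                    # a non-object patch replaces wholesale
    result = dict(target) if isinstance(target, dict) else {}
    for key, value in patch.items():
        result[key] = merge(result.get(key), value)
    return result

def normalize(doc):
    """Read-time view: interpret null tombstones as absent keys."""
    if not isinstance(doc, dict):
        return doc
    return {k: normalize(v) for k, v in doc.items() if v is not None}

a = {"x": 1, "sub": {"y": 2}}
p1, p2 = {"sub": {"y": 3}}, {"x": None}

assert merge(a, {}) == a                                   # {} is the identity
assert merge(merge(a, p1), p2) == merge(a, merge(p1, p2))  # composes on these patches
assert normalize(merge(merge(a, p1), p2)) == {"sub": {"y": 3}}
```

The tombstone trick is what lets a stream of patches be pre-combined on the server before transmission, instead of replaying each one on the client.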

The delta log on disk avoids arrays entirely. Adama's data model is a giant object with just objects and values. Collections use tables or maps. This enables efficient replication matching traditional database solutions, plus a superpower: the game's state can be rewound to any point by trimming the head of the log and rebuilding. For board games, this means undo. For applications, this means collaborative undo -- though the algebra of undoing one person's contribution while preserving others' work gets theoretical fast.
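The rewind superpower can be sketched as folding plain RFC 7396 merges over a prefix of the log; the log entries and field names here are invented for illustration:

```python
def merge(target, patch):
    """Plain RFC 7396 apply: patch values overwrite, null deletes, dicts recurse."""
    if not isinstance(patch, dict):
        return patch
    result = dict(target) if isinstance(target, dict) else {}
    for key, value in patch.items():
        if value is None:
            result.pop(key, None)       # null deletes the key
        else:
            result[key] = merge(result.get(key), value)
    return result

# A hypothetical delta log: objects and values only, no arrays.
log = [
    {"players": {"jeff": {"score": 0}}},
    {"players": {"jeff": {"score": 10}}},
    {"players": {"jeff": {"score": 10, "hand": 3}}},
]

def rebuild(log, upto):
    """Replay the first `upto` deltas; dropping the newest entries is undo."""
    state = {}
    for patch in log[:upto]:
        state = merge(state, patch)
    return state

print(rebuild(log, 3))  # current state
print(rebuild(log, 2))  # one step of undo: replay all but the last delta
```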

For client synchronization, arrays of objects get transformed using a special @o field for ordering:

{
  "42": {"name":"Jeff"},
  "50": {"name":"Jake"},
  "@o": [[0,1], 100, [2,4]]
}

Subsequences from the prior ordering are reused as range pairs, while new or moved keys appear inline, saving tremendous space when only a little data changes. This extension to RFC 7396 makes array-heavy client state efficient over the wire.
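My reading of the @o encoding -- an assumption, since the exact semantics aren't spelled out here -- is that each entry is either a [start, end] pair reusing an inclusive index range of the previous ordering, or a key sent inline:

```python
def decode_order(prev_keys, order):
    """Expand an @o field into a full key ordering. Assumed semantics:
    [a, b] reuses prev_keys[a..b] inclusive; anything else is a key."""
    keys = []
    for entry in order:
        if isinstance(entry, list):
            start, end = entry
            keys.extend(prev_keys[start:end + 1])  # reuse a prior subsequence
        else:
            keys.append(entry)                     # new/moved key, sent inline
    return keys

prev = [42, 50, 7, 8, 9]
print(decode_order(prev, [[0, 1], 100, [2, 4]]))
# → [42, 50, 100, 7, 8, 9]
```

Under this reading, an insertion into a thousand-element list costs two range pairs and one key rather than a thousand keys.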

I also started designing a game. A deck builder, co-op against a narrative structure, designed for couples. The entire game mirrors a 50-hour legacy game in reasonable chunks. Since it's co-op, I want it to be a serious challenge -- it should feel like a raid. I can already have AI players play poorly with random decisions for balance testing. The game design is fun. But I need tools, I need UI, and I am stupid enough to be thinking about building yet another cross-platform UX ecosystem... for board games... in Rust.

The quarter ended with clarity on one thing: whether I target Rust or keep Java, the architecture is sound. Reactive state with differential synchronization over WebSocket, backed by a transactional delta log, with privacy baked into the reactive layer. The path forward is less about which language generates the code and more about shipping something people can use.

Is this a silly endeavor? Perhaps.