Q1 2024

Craft

Data Flow, Testing, Math, and Type Systems

I started 2024 with a strong desire to radically improve documentation and ended the quarter designing a usage-cap system so nobody ends up on Serverless Horrors. The throughline was craft -- doing things right at the foundational level so that everything built on top can last.

The happy data flow post came from a local community I was forming. Someone asked "how does data actually flow in Adama?" and I realized I'd never documented it properly. The answer, in thirteen steps:

1. A user's browser connects via WebSocket to a document identified by a space and key.
2. The connection allows sending messages in the language of the product rather than a data mutation language.
3. The message hits a web server.
4. It gets routed to the correct host via the Adama network protocol.
5. It lands on the machine hosting the document.
6. The document runs bytecode from the Adama specification.
7. Channel logic mutates the document.
8. The monitored state yields a JSON delta.
9. The delta appends to a write-ahead log on disk.
10. The write awaits fsync.
11. fsync completes.
12. Every callback succeeds up the entire stack.
13. The user gets an end-to-end commitment that the write was durable.
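The durability tail of that flow -- append, fsync, then callbacks -- can be sketched in a few lines. This is illustrative Python, not Adama's actual internals; the class and callback names are my own.

```python
import json
import os
import tempfile

class WriteAheadLog:
    """Toy write-ahead log: acknowledge a delta only after fsync."""

    def __init__(self, path):
        self.fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND)

    def append(self, delta, on_durable):
        # Append the JSON delta as one line.
        os.write(self.fd, (json.dumps(delta) + "\n").encode())
        # Force the bytes to disk before acknowledging anything.
        os.fsync(self.fd)
        # Only now does the callback fire, so every layer above can
        # relay an end-to-end commitment that the write is durable.
        on_durable(delta)

    def close(self):
        os.close(self.fd)

acks = []
path = os.path.join(tempfile.mkdtemp(), "doc.wal")
wal = WriteAheadLog(path)
wal.append({"score": 42}, acks.append)
wal.close()
print(acks)  # the delta, acknowledged only after fsync
```

The key ordering property is that the success callback sits after os.fsync, never before it.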

In RxHTML, this entire flow starts with an rx:action attribute on a form element. That's it. The framework handles the connection, the routing, the delta synchronization, and the reactive DOM updates. You write HTML and declare what data you want. The platform handles the rest.

The math post was born from a nit-pick about range checks. I told a product engineer that x >= low && x <= high should be written as low <= x && x <= high, then lamented that Adama doesn't yet support low <= x <= high as a chained comparison. That spiraled into a meditation on mathematical education and what practitioners actually need. My answer: beyond arithmetic and basic algebra, it's really about discipline of mind. The careful process of doing math correctly is the proxy skill because you can take things at the appropriate pace with well-defined justifications.
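Python already supports the chained form I lamented was missing from Adama, which makes the readability argument easy to see; in_range here is just an illustrative helper.

```python
def in_range(x, low, high):
    # Chained comparison: reads left-to-right like the math notation
    # low <= x <= high, and is equivalent to low <= x and x <= high.
    return low <= x <= high

print(in_range(5, 1, 10))   # True
print(in_range(11, 1, 10))  # False
```

Writing the bounds in ascending order puts x visually between them, which is the whole point of the nit-pick.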

I shared two stories. In sixth grade, I invented algebra by trying to build an XP table for a QBasic dungeon crawler -- I was solving linear equations without knowing the name for what I was doing. In high school, I derived the quadratic formula from scratch on a road trip by exploiting the symmetry of parabolas. Neither derivation was elegant. Both were the result of curiosity and a lot of paper. The recommendation for practitioners: build your own language, starting with abstract syntax trees. Parse (+ 1 (* 42 13)), compute with it, introduce variables, then go nuts. The boundary between "just a programmer" and a "good engineer" is the discipline of mind to enforce good error handling or to remove errors by simplifying the process.
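The exercise above fits in a page. Here is a toy sketch of it in Python -- tokenize s-expressions, parse them into a nested-list AST, evaluate with an environment for variables -- not a real language implementation, just the starting point I'm recommending.

```python
def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    token = tokens.pop(0)
    if token == "(":
        node = []
        while tokens[0] != ")":
            node.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return node
    try:
        return int(token)
    except ValueError:
        return token  # a symbol, e.g. a variable name

OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def evaluate(node, env=None):
    env = env or {}
    if isinstance(node, int):
        return node
    if isinstance(node, str):
        return env[node]  # variables come from the environment
    op, left, right = node
    return OPS[op](evaluate(left, env), evaluate(right, env))

print(evaluate(parse(tokenize("(+ 1 (* 42 13))"))))        # 547
print(evaluate(parse(tokenize("(* x 2)")), {"x": 21}))     # 42
```

From here, "go nuts" means adding conditionals, functions, and error handling -- and the error handling is where the discipline shows.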

Testing got a proper manifesto. Adama had 6,344 unit tests for the platform, but the RxHTML side had zero because JavaScript is a garbage language and the browser is a dumpster fire. I designed a crawl-mode testing system: take a document snapshot, fire it up in a devbox, crawl the product to find broken pages, take a screenshot per page, and capture the backing data. Every form mutation forks the document into a parallel universe. Each parallel universe gets crawled for differences. Before/after snapshots, both data and visual, give regression protection at scale. For deeper testing, a record mode would let developers initiate sessions and replay interactions.
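The fork-per-mutation idea can be sketched abstractly. This is a hypothetical illustration of the concept, not Adama's implementation: fork a document snapshot, apply one form mutation, and diff the resulting universe against the original.

```python
import copy

def diff(before, after, prefix=""):
    """Flat list of (path, old, new) changes between two JSON-like snapshots."""
    changes = []
    for key in sorted(set(before) | set(after)):
        path = f"{prefix}/{key}"
        old, new = before.get(key), after.get(key)
        if isinstance(old, dict) and isinstance(new, dict):
            changes.extend(diff(old, new, path))
        elif old != new:
            changes.append((path, old, new))
    return changes

def fork_and_mutate(snapshot, mutation):
    """Fork the snapshot into a parallel universe, mutate it, report the delta."""
    universe = copy.deepcopy(snapshot)
    mutation(universe)
    return universe, diff(snapshot, universe)

doc = {"cart": {"items": 1}, "user": {"name": "alice"}}
fork, changes = fork_and_mutate(doc, lambda d: d["cart"].update(items=2))
print(changes)               # [('/cart/items', 1, 2)]
print(doc["cart"]["items"])  # original universe untouched: 1
```

The crawl then repeats this per form on every page, pairing each data diff with a visual diff of the screenshots.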

The Euler post traced my entire journey from QBasic on an IBM PC through elementary school, to C++ algorithms in middle school, through high school game engines, university research, Amazon S3, Meta's real-time infrastructure, and finally to Adama. The point wasn't autobiography -- it was illustrating the path from novice to principal engineer as a sequence of increasingly ambitious projects, each one building on the foundations of the last. Always Be Coding. There is no short path toward excellence.

February's usage limits post tackled a business problem. Since Adama aims to be a next-gen all-in-one serverless platform, I wanted to avoid the criticism of infinite cost. All vendors should provide a maximum usage cap, and I'm shocked it isn't more common (actually, I'm not shocked, since the lack of caps maximizes shareholder value). I designed a three-bucket bandwidth system: maximum hourly crawl bandwidth, public bandwidth, and authenticated bandwidth, each coordinated through token buckets refreshed from the billing document. For compute, the plan was minimum and maximum "bricks" (essentially threads on machines) with soft and hard caps on memory and CPU. Soft caps prevent new documents from loading; hard caps start tearing down existing documents.
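The bandwidth side reduces to three token buckets, each refreshed when the billing document republishes the budget. A minimal sketch, with made-up capacities and method names:

```python
class TokenBucket:
    """Toy bucket: spend from a budget, refuse when the cap is hit."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.tokens = capacity

    def refresh(self):
        # Called when the billing document republishes the hourly budget.
        self.tokens = self.capacity

    def try_consume(self, cost):
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False  # over the cap: reject rather than bill infinitely

buckets = {"crawl": TokenBucket(100),
           "public": TokenBucket(1000),
           "authenticated": TokenBucket(5000)}

print(buckets["crawl"].try_consume(80))  # True
print(buckets["crawl"].try_consume(80))  # False: cap enforced
buckets["crawl"].refresh()
print(buckets["crawl"].try_consume(80))  # True again after refresh
```

The point of the three-way split is that a misbehaving crawler exhausts only its own bucket, never the budget reserved for authenticated users.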

The precision problem is what makes this hard: by the moment you check in with the accountant, the cost has already been incurred. The algorithm divides token budgets across web servers, which means too many servers can starve any individual allocation. There are fun distributed systems techniques to reduce pressure on the coordinating entity, but the initial goal was getting something reasonable shipped.
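The starvation effect is easy to see with arithmetic. Illustrative numbers, not the real algorithm:

```python
def per_server_budget(total_tokens, server_count):
    # Even division of the hourly budget; the remainder goes unspent.
    return total_tokens // server_count

for servers in (2, 10, 100):
    print(servers, per_server_budget(10_000, servers))
# 2 -> 5000, 10 -> 1000, 100 -> 100
```

With 100 servers each host gets 100 tokens, so a burst of 150 requests landing on one host gets rejected while 99 hosts sit on unused budget -- that's the pressure the fancier coordination techniques exist to relieve.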

By the end of Q1, I was forming a local community, helping engineers grow, building a testing philosophy, and designing the billing infrastructure that would let the platform scale without bankrupting anyone. The craft isn't just in the code. It's in every decision about how the system should behave when things go wrong.