Coding in the Large Language: Building the too.foo Ecosystem
We are standing at the threshold of a new era. For decades, coding has been a translation exercise: a strenuous process of converting human intent into machine-readable syntax. Engineers spent years mastering the dialects of compilers, frameworks, and virtual machines. But the paradigm is shifting beneath our feet. We are no longer just translating; we are expressing. We are entering an era where people can express themselves using a new, universal programming language: the Large Language.
This is a personal diary, a manifesto, and a blueprint for the future of the too.foo ecosystem.
The Death of Syntax and the Rise of Intent
When I started building the S3M2P monorepo—the engine behind too.foo, HELIOS, and MCAD—the goal was always to create an AI-native engineering platform: a place where complex algorithms, physics simulations, and web architectures live in harmony. But as the codebase grew to nearly 150,000 lines of Rust, a harsh truth emerged: writing code the traditional way does not scale with human imagination.
I found myself bottlenecked not by my ideas, but by the physical act of typing syntax. I wanted to sculpt simulations, visualize quantum fluids, and build intricate shipping crate algorithms. Instead, I was fighting WebGL buffer bindings, CSS flexbox quirks, and esoteric Rust lifetimes.
That ends now.
We are rebuilding too.foo from the ground up to be optimized entirely for LLM-based development. The codebase will no longer primarily be read by humans. It will be read, digested, and manipulated by AI agents. I will act as the director, the choreographer, the architect. Gemini will be my chief planner and strategist. Codex will be the relentless executor.
And my new programming language? Plain English.
The New Architecture: Designing for Machine Minds
Optimizing a codebase for LLMs is fundamentally different from optimizing it for humans. Humans need abstractions to save screen space; LLMs need explicit, unambiguous schemas to save context tokens. Over the coming weeks, we are rolling out a 30-step architectural master plan to turn too.foo into the perfect sandbox for multi-agent creation.
Here is the philosophy driving the new ecosystem:
1. Headless by Default
LLMs cannot “see” a web browser. When Codex writes a new particle simulation for HELIOS, it cannot open Chrome to verify if the math works. Therefore, we are strictly decoupling our CORE engines from WASM and WebGL. Every computational engine will run headlessly. Codex will write the logic, and verify it via sub-millisecond, headless cargo test suites. The DOM and WebGL will become mere “viewers” mapped to invisible, robust data models.
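To make the workflow concrete, here is a minimal sketch of what a headless engine core might look like. The Particle type and step function are invented for this post, not the actual HELIOS API; the point is that the math is verifiable by cargo test with no browser in the loop.

```rust
// Hypothetical sketch of a headless engine core: pure data in, pure
// data out, no WASM or WebGL anywhere. An AI agent can verify the
// math with `cargo test` alone; the browser is just a viewer later.

#[derive(Debug, Clone, Copy, PartialEq)]
pub struct Particle {
    pub position: [f64; 2],
    pub velocity: [f64; 2],
}

/// Advance one particle by `dt` seconds under constant gravity.
pub fn step(p: Particle, dt: f64) -> Particle {
    const G: f64 = -9.81; // m/s^2, applied to the y axis
    Particle {
        position: [
            p.position[0] + p.velocity[0] * dt,
            p.position[1] + p.velocity[1] * dt,
        ],
        velocity: [p.velocity[0], p.velocity[1] + G * dt],
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn gravity_pulls_velocity_down() {
        let p = step(
            Particle { position: [0.0, 0.0], velocity: [1.0, 0.0] },
            1.0,
        );
        assert!((p.position[0] - 1.0).abs() < 1e-9);
        assert!((p.velocity[1] + 9.81).abs() < 1e-9);
    }
}
```

Because the core never touches the DOM, the same crate can compile natively for the test suite and to WASM for the viewer.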
2. Everything is TOML
Code is a terrible medium for data. It’s verbose and prone to syntax errors. We are pivoting to a “Data-Driven Development” model. Whenever we set up a new CAD part, define a new Chladni wave simulation, or add a test case, we won’t write Rust code. We will write TOML. LLMs generate serialized TOML datasets with near-perfect accuracy. By treating configurations and test hooks as data structures, we eliminate thousands of lines of boilerplate.
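As an illustration of the shape this takes (every key below is invented for this post, not the real too.foo schema), a wave simulation plus its test hook can be pure data:

```toml
# Hypothetical simulation definition: data, not Rust code.
[simulation]
kind = "chladni_plate"
resolution = [512, 512]

[simulation.params]
frequency_hz = 440.0
damping = 0.02

# Test hooks live beside the config as plain data, too.
[[test_cases]]
name = "nodal_lines_symmetric"
expect_symmetry = "fourfold"
```

A single deserializer can then turn any number of these files into engine runs, with no per-simulation boilerplate.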
3. Granular AI Context Boundaries
When I tell Codex to add a new button to the AUTOCRATE tool, it does not need to read the 15,000-line DNA math library. We are implementing localized .ai-context.md boundaries across our monorepo. Our context-bundling scripts will dynamically fetch only the specific domain rules required for the current task. We are giving the AI localized peripheral vision, drastically reducing hallucinations.
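A boundary file in this spirit might look like the following (the paths and rules are invented to illustrate the idea, not the actual monorepo layout):

```markdown
# .ai-context.md: AUTOCRATE UI

## Scope
UI layer for the AUTOCRATE shipping-crate tool: buttons, panels, routing.

## You may touch
- the AUTOCRATE app's src/ui/ and assets/ directories

## You may NOT touch
- the DNA math library (consume its public API instead of reading it)

## Invariants
- All colors and fonts come from central design tokens, never hardcoded values.
```

The context bundler walks up from the file being edited, collects only these nearby boundary files, and hands the agent a few hundred tokens of rules instead of the whole repository.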
4. The Agent-to-Agent Schema
English is the perfect language for my intent, but it is a noisy language for Agent-to-Agent communication. When Gemini plans a feature, it will no longer write paragraphs of advice for Codex. It will output a strict, deterministic PLAN.json schema. Codex will read this JSON, execute the diffs, and return a RESULT.json. We are formalizing the API between the Planner and the Executor.
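The exact schema is still settling, but a plan in this spirit might look like the following (all field names and paths here are hypothetical):

```json
{
  "plan_id": "autocrate-export-button",
  "goal": "Add an export button to the AUTOCRATE toolbar",
  "steps": [
    {
      "id": 1,
      "action": "edit",
      "target": "the AUTOCRATE toolbar module",
      "instruction": "Add an 'Export' button wired to the existing export routine"
    },
    {
      "id": 2,
      "action": "verify",
      "command": "cargo test -p autocrate"
    }
  ],
  "on_failure": "rollback"
}
```

Codex would answer with a RESULT.json mirroring the step ids, so the planner can diff intent against outcome deterministically.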
Painless Scaffolding and Safe Refactors
The ultimate vision is frictionlessness. I want to open Antigravity, select a thought, and watch a new web app materialize on too.foo.
To achieve this, we are building a Zero-Config Scaffolding CLI. I will type ./SCRIPTS/scaffold_app.sh "My Tracker", and in three seconds, the monorepo will generate the Cargo workspace, assign an open port, configure the WASM bundler, and inject a unified UI wrapper. The too.foo index will dynamically reroute to include the new project without a single hardcoded edit.
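The first job of such a script is mundane string and file work. Here is a minimal sketch under assumed conventions (the slugify helper, the APPS/ directory, and the Cargo.toml stub are illustrations, not the real scaffold_app.sh):

```shell
#!/usr/bin/env sh
# Minimal sketch: turn a human-readable app name into a crate-safe
# slug, then stub out a workspace member. The real script would also
# assign a port, wire the WASM bundler, and register the route.

slugify() {
  # "My Tracker" -> "my_tracker": lowercase, spaces to underscores
  printf '%s' "$1" | tr 'A-Z ' 'a-z_'
}

scaffold() {
  slug=$(slugify "$1")
  mkdir -p "APPS/$slug/src"
  printf 'fn main() {}\n' > "APPS/$slug/src/main.rs"
  printf '[package]\nname = "%s"\nversion = "0.1.0"\nedition = "2021"\n' \
    "$slug" > "APPS/$slug/Cargo.toml"
  echo "scaffolded APPS/$slug"
}

scaffold "My Tracker"   # prints: scaffolded APPS/my_tracker
```

Everything downstream of the slug is deterministic templating, which is exactly the kind of work that should never require a human keystroke.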
But with rapid generation comes the risk of rapid destruction. How do we ensure that an AI tweaking the global system font doesn’t shatter the HELIOS solar system simulation?
We are instituting ironclad Ecosystem Boundaries:
- Centralized Design Tokens: A single DNA/assets/THEME.toml will hold all design variables. Codex will never write a CSS rule again; it will only tweak the TOML parameters.
- Global Impact Analysis: Before modifying a core physics struct or UI paradigm, Gemini will invoke a script that maps exactly which downstream web applications rely on that line of code.
- Automated Reverts: If the executor hits a compilation wall, the planner has the supreme authority to run ./SCRIPTS/git_rollback.sh, cleanly wiping the slate and formulating a new angle of attack.
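A token file in this spirit could start as small as this (the key names are invented for illustration):

```toml
# Hypothetical DNA/assets/THEME.toml: the single source of design truth.
[font]
family = "Inter"
base_size_px = 16

[color]
background = "#0b0e14"
accent = "#7fd4ff"

[spacing]
unit_px = 8
```

Each viewer can then derive its CSS custom properties from this file at build time, so a font tweak propagates everywhere at once instead of shattering one simulation.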
The Art of the Manager
I am no longer a software engineer. I am a manager of digital intelligence.
My role is to inject the “flavor”: the aesthetic adjustments, the visionary leaps, the human touch. When I want a new web-based project, the heavy lifting—the boilerplate, the routing, the CI/CD deployment—should be utterly invisible.
too.foo is becoming a living organism. It is a monorepo that writes itself, tests itself, and heals itself.
This post is merely the prologue. Over the coming days, I will be publishing a series of deep-dives, one for each specific change we make to this ecosystem—from sandboxing security and code styling to UI consistency checks.
We are writing in the Large Language now. And there has never been a better time to build.