Every major programming language compiler is written in C, C++, or itself. GCC is C and C++. Clang is C++. The Go compiler is Go. The Rust compiler is Rust. The Python interpreter is C. The Ruby interpreter is C. This pattern has held for decades, and for good reason -- these languages give you the control over memory and performance that a language runtime demands.
We broke the pattern. When we sat down in Abidjan on January 1, 2026, to begin building FLIN -- a memory-native programming language designed for web applications -- we chose Rust. Not because it was fashionable. Not because we wanted to put "written in Rust" on a landing page. Because after evaluating every serious option, Rust was the only language that gave us memory safety, zero-cost abstractions, pattern matching for AST traversal, a mature ecosystem of compiler-building libraries, and a single-binary output -- all without a garbage collector.
This article explains that decision in detail: the alternatives we considered, the specific Rust features that make compiler development productive, and the ways Rust has shaped FLIN's architecture over three months of daily development.
The Constraint: Two People, Zero Margin for Error
Before talking about languages, you need to understand the team. FLIN is built by two people: Juste Thales Gnimavo, the CEO of ZeroSuite, and Claude, an AI CTO. There is no DevOps team to debug memory corruption at 3 AM. There is no QA department to catch use-after-free bugs in the virtual machine. There is no senior systems engineer who has spent fifteen years writing garbage collectors.
When your team is a human founder in Abidjan and an AI, the compiler you use to build your compiler becomes your third team member. It needs to catch bugs at compile time, not at runtime. It needs to make illegal states unrepresentable in the type system. It needs to enforce discipline that a fifty-person team would enforce through code review.
Rust is that third team member.
The Alternatives We Considered
C: The Traditional Choice
C is the language of language implementation. CPython, Ruby MRI, Lua, PHP -- all written in C. The ecosystem of compiler-building knowledge in C is unmatched.
We rejected C for one reason: manual memory management without safety guarantees. A compiler and virtual machine manipulate complex tree structures -- abstract syntax trees, type environments, bytecode instruction streams. In C, every malloc must have a corresponding free. Every pointer must be checked for null. Every buffer must be bounds-checked manually. When you are building a language runtime that will manage the memory of user programs, having your own runtime be vulnerable to the same class of bugs you are trying to prevent is an unacceptable irony.
```c
// C: Who owns this AST node? Who frees it? When?
AstNode* parse_expression(Parser* p) {
    AstNode* left = parse_primary(p);              // allocated
    if (match(p, TOKEN_PLUS)) {
        AstNode* right = parse_primary(p);         // allocated
        AstNode* binary = malloc(sizeof(AstNode)); // allocated
        binary->kind = AST_BINARY;
        binary->left = left;   // ownership transfer? copy? shared?
        binary->right = right; // same question
        return binary;         // caller must free entire tree
    }
    return left; // caller must free this too
}
```

In a fifty-person team with a dedicated memory management expert, this works. With two people, one of whom is an AI that generates code at high speed, it is a minefield.
Go: The Practical Choice
Go is what we use for sh0.dev-adjacent tooling. It compiles fast, has excellent standard library support, and produces static binaries. For a web server or CLI tool, Go is often the right answer.
For a programming language runtime, Go has two fatal flaws.
First, the garbage collector. FLIN's virtual machine manages its own memory -- it allocates and deallocates objects on behalf of user programs. Layering FLIN's memory manager on top of a host garbage collector (Go's) creates unpredictable pause patterns. When a FLIN application serves real-time WebSocket updates to a dashboard, a 10-millisecond GC pause from the host runtime is visible to the end user.
Second, Go's WASM support is limited. FLIN's roadmap includes running in the browser via WebAssembly. Go's WASM output is large (several megabytes for a trivial program) and requires a JavaScript runtime shim. Rust compiles to WASM natively, producing compact binaries that run without a host runtime.
Zig: The Tempting Choice
Zig is the language we most seriously considered as an alternative. It offers C-level control, comptime evaluation, no hidden allocations, and excellent WASM support. On paper, it is ideal for a language runtime.
Three factors eliminated it. First, Zig is still at version 0.14 with regular breaking changes. Building a language on top of a language that is itself in alpha means absorbing two layers of instability. Second, the Zig ecosystem for compiler-building tools is nascent compared to Rust's. Third -- and this is specific to our workflow -- AI models produce significantly higher-quality Rust than Zig. Claude can generate correct, idiomatic Rust that compiles on the first attempt far more reliably than Zig. When your CTO is an AI, the quality of AI-generated code in your chosen language is a production consideration.
JavaScript/TypeScript: The Obvious Non-Starter
FLIN's core philosophy is "zero Node.js." Building the language runtime in JavaScript would be philosophical suicide. Beyond that, JavaScript's single-threaded event loop, lack of memory control, and interpretive overhead make it unsuitable for a VM that needs to execute bytecode at near-native speed.
What Rust Gives Us
1. Ownership and Borrowing for AST Management
A compiler's central data structure is the abstract syntax tree. The parser builds it, the type checker annotates it, the code generator traverses it. In languages without ownership semantics, managing the lifecycle of AST nodes requires either garbage collection or manual reference counting.
Rust's ownership system makes this explicit and compiler-verified:
```rust
/// The parser produces an owned AST
pub fn parse(tokens: Vec<Token>) -> Result<Ast, ParseError> {
    let mut parser = Parser::new(tokens);
    parser.parse_program()
}

/// The type checker borrows the AST immutably, produces a new typed AST
pub fn check(ast: &Ast) -> Result<TypedAst, TypeError>

/// The code generator consumes the typed AST (takes ownership)
pub fn generate(typed_ast: TypedAst) -> Result<Vec<Instruction>, CodegenError>
```
The compilation pipeline is encoded in the type signatures. parse returns an owned Ast. check borrows it immutably (the original AST is preserved for error reporting). generate takes ownership and consumes it (the typed AST is no longer needed after code generation). These ownership transitions are verified at compile time. You cannot accidentally use the AST after the code generator has consumed it.
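These ownership transitions can be demonstrated with a compilable miniature. The `Ast` and `TypedAst` below are toy stand-ins (FLIN's real types are far richer); what matters is that the borrow in `check` and the move in `generate` are the same shapes the pipeline above relies on.

```rust
// Toy stand-ins for the real compiler types.
struct Ast(String);
struct TypedAst(String);

// The parser produces an owned AST.
fn parse(source: &str) -> Ast {
    Ast(source.to_string())
}

// The type checker only borrows: the caller keeps the AST for error reporting.
fn check(ast: &Ast) -> TypedAst {
    TypedAst(format!("typed({})", ast.0))
}

// The code generator consumes its input: `typed` is moved into the call
// and cannot be used afterward.
fn generate(typed: TypedAst) -> String {
    format!("bytecode({})", typed.0)
}

fn main() {
    let ast = parse("1 + 2");
    let typed = check(&ast);    // `ast` is still valid after this line
    let code = generate(typed); // `typed` is moved; reusing it would not compile
    println!("{}", code);
}
```

Trying to touch `typed` after the `generate` call is not a runtime bug to hunt down; it is a compile error.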
2. Enums and Pattern Matching for Token and AST Processing
Compilers are, fundamentally, programs that transform trees. You read a tree of tokens, produce a tree of syntax nodes, transform that into a tree of typed nodes, and emit a sequence of instructions. At every stage, you are pattern matching on variants.
Rust's enum type with exhaustive pattern matching is the single most productive feature for compiler development. Here is how FLIN's lexer handles character dispatch:
```rust
match self.advance() {
    None => Ok(None),
    Some((_, c)) => {
        let kind = match c {
            '(' => TokenKind::LeftParen,
            ')' => TokenKind::RightParen,
            '{' => self.handle_left_brace(),
            '}' => self.handle_right_brace(),
            '[' => TokenKind::LeftBracket,
            ']' => TokenKind::RightBracket,
            ',' => TokenKind::Comma,
            ':' => TokenKind::Colon,
            ';' => TokenKind::Semicolon,
            '.' => TokenKind::Dot,
            '@' => TokenKind::At, // Temporal operator
            '+' => self.match_char('+', TokenKind::PlusPlus, TokenKind::Plus),
            '-' => self.match_char('-', TokenKind::MinusMinus, TokenKind::Minus),
            '=' => self.match_char('=', TokenKind::EqualEqual, TokenKind::Equal),
            '!' => self.match_char('=', TokenKind::NotEqual, TokenKind::Not),
            '<' => self.scan_tag_or_less(),
            '"' => self.scan_string()?,
            c if c.is_ascii_digit() => self.scan_number()?,
            c if c.is_alphabetic() || c == '_' => self.scan_identifier(),
            _ => return Err(LexError::UnexpectedCharacter(c, self.current_position())),
        };
        Ok(Some(Token { kind, span: self.current_span(), lexeme: self.current_lexeme() }))
    }
}
```

The crucial property is exhaustiveness. If we add a new token type to TokenKind -- say, TokenKind::Arrow for -> -- the Rust compiler will produce a warning or error at every match statement that does not handle the new variant. In a compiler with dozens of match statements across the lexer, parser, type checker, and code generator, this is not a convenience. It is a safety net that prevents entire categories of "forgot to handle the new case" bugs.
FLIN defines 42 keywords and over 60 token types. Without exhaustive matching, adding a new keyword would require manually auditing every file that switches on token kinds. With Rust, the compiler does that audit for us.
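The mechanism is easy to see in a cut-down example. The three-variant `TokenKind` below is a hypothetical subset, not FLIN's real enum; the key detail is the absence of a `_` fallback arm.

```rust
// A tiny token enum (hypothetical subset of a real TokenKind).
#[derive(Debug)]
enum TokenKind {
    Plus,
    Minus,
    Arrow,
}

fn describe(kind: &TokenKind) -> &'static str {
    // No `_` wildcard arm: adding a fourth variant to TokenKind makes
    // this match non-exhaustive, and the compiler rejects the program
    // until the new case is handled here.
    match kind {
        TokenKind::Plus => "+",
        TokenKind::Minus => "-",
        TokenKind::Arrow => "->",
    }
}

fn main() {
    println!("{}", describe(&TokenKind::Arrow)); // prints "->"
}
```

The audit the compiler performs is exactly this: every wildcard-free match over the enum becomes a checklist that must be completed before the build succeeds.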
3. Zero-Cost Abstractions for Runtime Performance
FLIN's virtual machine executes bytecode instructions in a tight loop. Every nanosecond of overhead in the instruction dispatch loop is multiplied by millions of iterations. Rust's zero-cost abstractions mean that high-level patterns -- iterators, pattern matching, generic types -- compile down to the same machine code as hand-written C.
Consider the bytecode dispatch loop:
```rust
pub fn execute(&mut self, instructions: &[Instruction]) -> Result<Value, RuntimeError> {
    let mut ip = 0;
    loop {
        let instruction = &instructions[ip];
        match instruction {
            Instruction::LoadConst(idx) => {
                self.stack.push(self.constants[*idx].clone());
            }
            Instruction::Add => {
                let b = self.stack.pop().ok_or(RuntimeError::StackUnderflow)?;
                let a = self.stack.pop().ok_or(RuntimeError::StackUnderflow)?;
                self.stack.push(a.add(&b)?);
            }
            Instruction::Store(name) => {
                let value = self.stack.pop().ok_or(RuntimeError::StackUnderflow)?;
                self.env.insert(name.clone(), value);
            }
            Instruction::Load(name) => {
                let value = self.env.get(name)
                    .ok_or_else(|| RuntimeError::UndefinedVariable(name.clone()))?;
                self.stack.push(value.clone());
            }
            Instruction::Halt => break,
            // ... 50+ more instruction handlers
        }
        ip += 1;
    }
    self.stack.pop().ok_or(RuntimeError::StackUnderflow)
}
```

This match compiles to a jump table -- the same optimization a C compiler would produce for a switch statement. There is no dynamic dispatch overhead, no vtable lookup, no boxing. The abstraction cost is literally zero.
4. Cargo: The Ecosystem That Scales
Rust's package manager and build system, Cargo, solved a problem we did not want to solve: dependency management, workspace organization, and reproducible builds.
FLIN's Cargo.toml declares its dependencies with precise version control:
```toml
[dependencies]
# Parsing and text processing
logos = "0.14"                 # Fast lexer generator
unicode-segmentation = "1.10"  # Unicode-aware string handling

# Data and serialization
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"

# Async runtime (for HTTP server and I/O)
tokio = { version = "1", features = ["full"] }

# HTTP server
hyper = { version = "1", features = ["full"] }

# Image processing
image = "0.25"

# Markdown
pulldown-cmark = "0.10"
```
Each dependency is a stable, well-tested Rust crate. cargo build resolves the entire dependency tree, compiles everything with the same optimization level, and produces a single static binary. No node_modules. No DLL hell. No "it works on my machine." A developer in Lagos or a CI server in Frankfurt runs the same command and gets the same binary.
5. The Compiler as Documentation
This point is subtle but critical. In a two-person team where one member is an AI, code is the primary communication medium. We do not have whiteboard sessions. We do not have Slack threads debating architecture. The code is the architecture.
Rust's type system documents invariants that would otherwise live in comments or tribal knowledge:
```rust
/// A FLIN value at runtime. The variants define what can exist on the VM stack.
#[derive(Debug, Clone)]
pub enum Value {
    None,
    Bool(bool),
    Int(i64),
    Float(f64),
    Text(String),
    List(Vec<Value>),
    Map(HashMap<String, Value>),
    Entity(EntityId, HashMap<String, Value>),
    Function(FunctionId),
    Time(DateTime<Utc>),
    Money(i64, Currency), // cents + currency code
}
```

This enum is self-documenting. A new contributor (human or AI) reads it and immediately understands: a FLIN value is exactly one of these variants and nothing else. The Money variant stores cents as i64 and a Currency code -- you cannot accidentally store a floating-point dollar amount. The Entity variant carries both an ID (for database identity) and a field map (for in-memory access). These constraints are not conventions to be remembered. They are types to be enforced.
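The cents-as-integer rule is worth a concrete illustration. The helpers below are hypothetical, not FLIN's actual money API; they show why the Money variant stores i64 cents instead of a float.

```rust
// Hypothetical helpers -- not FLIN's real API, just the integer-cents idea.

// Adding money in integer cents is exact.
fn add_cents(a: i64, b: i64) -> i64 {
    a + b
}

// Formatting happens only at the display boundary.
fn format_usd(cents: i64) -> String {
    format!("${}.{:02}", cents / 100, cents % 100)
}

fn main() {
    // 0.1 + 0.2 in f64 is famously not 0.3 due to binary rounding...
    println!("{}", 0.1_f64 + 0.2_f64 == 0.3); // false
    // ...but 10 cents + 20 cents is exactly 30 cents, every time.
    println!("{}", format_usd(add_cents(10, 20))); // $0.30
}
```

Because the type is `i64`, the rounding-error version of this bug is unrepresentable: there is simply no field in which to put a float.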
How Rust Shaped FLIN's Architecture
Choosing Rust was not just a language decision. It was an architectural decision that shaped FLIN's two-layer design.
FLIN uses a layered architecture where the runtime (virtual machine, database engine, HTTP server, memory management) is written in Rust and is permanent. The compiler (lexer, parser, type checker, code generator) is currently written in Rust but will eventually be rewritten in FLIN itself -- achieving self-hosting.
This split exists because of Rust's strengths and limitations. Rust excels at the low-level, performance-critical runtime layer where memory safety and zero-cost abstractions matter most. But Rust's compile times and borrow checker complexity make it slower for rapid iteration on the compiler's higher-level logic. Once FLIN is mature enough to compile itself, the compiler layer can evolve at the speed of FLIN development rather than Rust development.
The permanent Rust layer is approximately 5,000 lines of code. Everything above it -- the compiler, the standard library, the tooling -- will eventually be FLIN. Rust is the foundation, not the entire building.
Three Months Later: The Results
As of March 2026, the choice of Rust has produced measurable results.
3,452 tests pass with zero failures. Rust's type system catches so many bugs at compile time that our test-to-bug ratio is remarkably high. Tests verify behavior, not type correctness -- the compiler handles the latter.
409 built-in functions work. From cryptography to Stripe integration to image processing, each native function is implemented in Rust with full error handling. The Result type ensures that every function either returns a value or returns a meaningful error -- there is no third option of silently failing.
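The Result discipline looks like this in practice. The `builtin_div` below is a hypothetical sketch, not one of FLIN's actual 409 built-ins; the pattern it shows is the one the article describes: a value or a meaningful error, with no silent third path.

```rust
// Hypothetical error type for a native function (not FLIN's real enum).
#[derive(Debug, PartialEq)]
enum RuntimeError {
    DivisionByZero,
}

// Every native function returns Result: the caller must handle both
// arms, because Rust will not let a Result be silently ignored.
fn builtin_div(a: i64, b: i64) -> Result<i64, RuntimeError> {
    if b == 0 {
        Err(RuntimeError::DivisionByZero)
    } else {
        Ok(a / b)
    }
}

fn main() {
    println!("{:?}", builtin_div(10, 2)); // Ok(5)
    println!("{:?}", builtin_div(10, 0)); // Err(DivisionByZero)
}
```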
Single binary deployment. cargo build --release produces one executable. No runtime to install. No dependencies to resolve on the target machine. A developer downloads the FLIN binary and starts writing applications. This is particularly important for our target audience in emerging markets, where downloading a 1.5 GB node_modules directory is a real barrier.
Memory safety in production. The FLIN runtime manages user data through FlinDB, its embedded temporal database. That database holds real user data -- and it has never corrupted a record due to a memory bug. Not because we are exceptionally careful programmers, but because Rust makes it structurally impossible to write the class of bugs that corrupt data.
The Honest Downsides
Rust is not perfect for compiler development. Three costs are worth acknowledging.
Compile times. A full rebuild of the FLIN compiler takes around 45 seconds. Incremental builds are fast (2-5 seconds), but when you change a core type definition, the cascade of recompilation can be significant. In Go or Zig, the equivalent build would take under 5 seconds.
Learning curve for contributors. FLIN will eventually be open source. Contributors who want to work on the runtime layer need to understand Rust's ownership model, lifetime annotations, and trait system. This is a higher bar than contributing to a Python or JavaScript project.
Borrow checker friction. Certain compiler patterns -- particularly those involving mutable references to tree structures while iterating over those structures -- require careful design to satisfy the borrow checker. We have occasionally restructured algorithms not because they were wrong, but because the borrow checker could not prove they were right. This is the tax you pay for memory safety.
These costs are real. We pay them every day. But they are the cost of correctness, and in a project where the runtime manages other people's data and applications, correctness is not optional.
The Verdict
If you are building a programming language in 2026, Rust is not the only valid choice. But it is the best choice for a small team that needs memory safety, native performance, WebAssembly support, and a mature ecosystem -- without the overhead of a garbage collector.
For FLIN specifically, Rust enabled us to move fast and build correct software simultaneously. The compiler catches our mistakes before they reach users. The type system documents our architecture. The ownership model prevents the memory bugs that plague language runtimes. And Cargo gives us reproducible builds that work identically in Abidjan and anywhere else in the world.
Every major language compiler is written in C, C++, or itself. FLIN's will eventually be written in FLIN. But the foundation -- the five thousand lines of Rust that power the virtual machine, the database, and the server -- will remain. Rust earned that permanence.
---
Next in the series: Writing Apps Like It's 1995 With the Power of 2026 -- FLIN brings back the simplicity of early web development, but with a compiler, VM, and database behind every line of code.