
The Pipeline Operator: Functional Composition in FLIN

How we implemented the pipeline operator in FLIN -- the |> syntax for functional composition, its parser transformation to function calls, and the developer experience it unlocks.

Thales & Claude | March 25, 2026 | 10 min

Session 150 added a feature that was technically simple -- about 30 lines of code across three files -- but disproportionately impactful for how FLIN programs read. The pipeline operator.

The idea is borrowed from Elixir, F#, and the (eternally) proposed JavaScript pipeline operator: take a value on the left, pass it as the first argument to the function on the right. Chain as many as you want. Read left to right, like prose.

5 |> double |> add_one |> square

This evaluates to square(add_one(double(5))). Same result. Entirely different reading experience. The nested version reads inside-out. The pipeline version reads left-to-right: start with 5, double it, add one, square it.

For a language that values readability above all else, the pipeline operator was not a luxury. It was a necessity.

Why Pipelines Matter

Consider a data transformation chain without pipelines:

result = format(
    sort(
        filter(
            users,
            u => u.active
        ),
        u => u.name
    ),
    "table"
)

Now with pipelines:

result = users
    |> filter(u => u.active)
    |> sort(u => u.name)
    |> format("table")

The pipeline version tells a story: take the users, filter to active ones, sort by name, format as a table. Each step is a line. The data flows down. The transformations read in order.

The nested version tells the same story, but backwards and inside-out. You have to start at the innermost function call and work outwards to understand the execution order. This is a well-known cognitive burden in functional programming, and the pipeline operator eliminates it.

The Syntax

FLIN's pipeline operator uses the |> token, following the convention of F#, Elixir, and OCaml:

// Basic: value |> function
5 |> double              // becomes double(5)

// Chained: a |> f |> g |> h
5 |> add_one |> double |> square
// becomes square(double(add_one(5)))

// With arguments: value |> function(args)
10 |> add(5)             // becomes add(10, 5)

// Multi-argument pipeline
"hello world" |> split(" ") |> join("-")
// becomes join(split("hello world", " "), "-")

The key semantic: value |> function(args) becomes function(value, args). The left-hand value is prepended to the argument list. If the function takes no explicit arguments, value |> function becomes function(value).
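That rewrite rule is small enough to model directly. Here is a minimal Rust sketch of it, using a simplified Expr type as an illustrative stand-in for FLIN's actual AST:

```rust
// Minimal model of the pipeline rewrite rule. `Expr` is a simplified
// stand-in for FLIN's AST, not the real definition.
#[derive(Debug, Clone, PartialEq)]
enum Expr {
    Num(i64),
    Ident(String),
    Call { callee: String, args: Vec<Expr> },
}

// value |> function        becomes  function(value)
// value |> function(args)  becomes  function(value, args...)
fn desugar_pipe(value: Expr, rhs: Expr) -> Expr {
    match rhs {
        Expr::Ident(name) => Expr::Call { callee: name, args: vec![value] },
        Expr::Call { callee, mut args } => {
            args.insert(0, value); // piped value becomes the first argument
            Expr::Call { callee, args }
        }
        _ => panic!("pipeline right-hand side must be a function or call"),
    }
}

fn main() {
    // 10 |> add(5)  desugars to  add(10, 5)
    let call = Expr::Call { callee: "add".to_string(), args: vec![Expr::Num(5)] };
    let result = desugar_pipe(Expr::Num(10), call);
    assert_eq!(
        result,
        Expr::Call { callee: "add".to_string(), args: vec![Expr::Num(10), Expr::Num(5)] }
    );
    println!("{:?}", result);
}
```

The rule is a pure function from (left value, right expression) to a call expression, which is exactly why the parser can apply it on the fly.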

Lexer: The PipeArrow Token

The first step was teaching the lexer to recognize |> as a single token:

// In src/lexer/scanner.rs
fn scan_pipe(&mut self) -> Token {
    if self.peek() == '>' {
        self.advance(); // consume '>'
        Token::PipeArrow
    } else {
        Token::Pipe // existing | token for union types
    }
}

Recognizing the two-character token takes one character of lookahead. When the scanner sees |, it checks whether the next character is >. If so, it produces a PipeArrow token; if not, it produces the existing Pipe token (used for union types like int | text).
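The same dispatch can be sketched as a standalone function over a character stream (a simplified illustration; the real scanner's API differs):

```rust
// Standalone sketch of the | vs |> dispatch; the real scanner's API differs.
#[derive(Debug, PartialEq)]
enum Token {
    Pipe,      // |   (union types like int | text)
    PipeArrow, // |>
}

// Called after a '|' has been consumed; peeks one character ahead.
fn scan_pipe(rest: &mut std::iter::Peekable<std::str::Chars<'_>>) -> Token {
    if rest.peek() == Some(&'>') {
        rest.next(); // consume '>'
        Token::PipeArrow
    } else {
        Token::Pipe
    }
}

fn main() {
    let mut a = "> double".chars().peekable(); // source was "|> double"
    assert_eq!(scan_pipe(&mut a), Token::PipeArrow);

    let mut b = " text".chars().peekable();    // source was "int | text"
    assert_eq!(scan_pipe(&mut b), Token::Pipe);
    println!("ok");
}
```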

The token was added to the token enum:

pub enum Token {
    // ... existing tokens ...
    PipeArrow,  // |>
}

Three lines in the scanner, one line in the token enum. The lexer changes were trivial.

Parser: Desugaring to Function Calls

The parser is where the real work happens. The pipeline operator is syntactic sugar -- it does not create a new AST node. Instead, the parser transforms a |> f(b, c) into f(a, b, c) during parsing.

This desugaring approach has a significant advantage: every compiler pass after the parser (type checker, code generator, optimizer) never sees a pipeline operator. They see a function call. No changes needed anywhere downstream.

The pipeline operator is parsed as a left-associative infix operator with low precedence (just above assignment):

fn parse_pipeline(&mut self) -> Result<Expr, ParseError> {
    let mut expr = self.parse_logical_or()?;

    while self.match_token(&Token::PipeArrow) {
        let right = self.parse_postfix()?;

        expr = match right {
            // value |> function(args) -> function(value, args)
            Expr::Call { callee, mut args, span } => {
                args.insert(0, expr);
                Expr::Call { callee, args, span }
            }
            // value |> function -> function(value)
            Expr::Identifier { name, span } => {
                Expr::Call {
                    callee: Box::new(Expr::Identifier { name, span }),
                    args: vec![expr],
                    span,
                }
            }
            _ => {
                return Err(ParseError::new(
                    "pipeline right-hand side must be a function or function call",
                    self.current_span(),
                ));
            }
        };
    }

    Ok(expr)
}

The parser handles two cases:

1. value |> function(args) -- The right side is already a Call expression. Insert the left value as the first argument.
2. value |> function -- The right side is an identifier. Create a new Call expression with the left value as the sole argument.

Any other right-hand side is an error. You cannot write 5 |> 42 or 5 |> "hello".

Chained Pipelines

Because the parser uses a while loop, chained pipelines work naturally:

5 |> add_one |> double |> square

First iteration: 5 |> add_one becomes add_one(5). Second iteration: add_one(5) |> double becomes double(add_one(5)). Third iteration: double(add_one(5)) |> square becomes square(double(add_one(5))).

The left-associativity of the loop produces the correct nesting. Each pipeline step wraps the previous result.
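The same left fold can be demonstrated over plain strings, showing how each step wraps the accumulated expression (an illustration, not compiler code):

```rust
// Each pipeline step wraps the accumulated expression, exactly like the
// parser's while loop wraps `expr` on every iteration.
fn fold_pipeline(start: &str, steps: &[&str]) -> String {
    steps
        .iter()
        .fold(start.to_string(), |acc, f| format!("{}({})", f, acc))
}

fn main() {
    let out = fold_pipeline("5", &["add_one", "double", "square"]);
    assert_eq!(out, "square(double(add_one(5)))");
    println!("{}", out); // prints square(double(add_one(5)))
}
```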

Multi-Argument Pipelines

The args.insert(0, expr) operation handles multi-argument functions:

10 |> add(5)
// Parsed as: add(10, 5)

"hello" |> replace("l", "r")
// Parsed as: replace("hello", "l", "r")

data |> transform(config, options)
// Parsed as: transform(data, config, options)

The piped value always becomes the first argument. This convention is shared by Elixir and follows a natural pattern: most data transformation functions take the data as their first argument and configuration/options as subsequent arguments.

Type Checking

Because pipelines desugar to function calls, the type checker handles them automatically. When 5 |> double becomes double(5), the type checker:

1. Resolves double as a function fn double(x: int) -> int
2. Checks that the argument 5 (an int) matches the parameter type int
3. Returns the result type int

No special pipeline logic in the type checker. No new type rules. The existing function call checking handles everything.

This is the power of desugaring. By transforming syntax at the parser level, the entire downstream pipeline (type checker, code generator, optimizer, formatter) works without modification.

Real-World Pipeline Patterns

Data Processing

report = transactions
    |> filter(t => t.date > last_month)
    |> group_by(t => t.category)
    |> map_values(sum)
    |> sort_by_value("desc")
    |> take(10)
    |> format_table()

Six transformations, each on its own line, reading top to bottom. Compare this to the nested version and the readability difference is stark.

String Manipulation

slug = title
    |> trim()
    |> lower()
    |> replace(" ", "-")
    |> replace("--", "-")
    |> truncate(50)

Each step transforms the string. The pipeline reads like a recipe: trim it, lowercase it, replace spaces with hyphens, clean up double hyphens, truncate to 50 characters.
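As a sanity check on the recipe, here is the same slug logic in plain Rust, with std string methods standing in for FLIN's builtins (a sketch; the truncation and hyphen-cleanup semantics are assumptions):

```rust
// The slug recipe with Rust's std string methods standing in for FLIN's
// trim/lower/replace/truncate builtins.
fn slugify(title: &str) -> String {
    let mut s = title.trim().to_lowercase().replace(' ', "-");
    // "--" -> "-" repeatedly, so hyphen runs of any length collapse
    while s.contains("--") {
        s = s.replace("--", "-");
    }
    s.chars().take(50).collect() // truncate to 50 characters
}

fn main() {
    assert_eq!(slugify("  Hello   World  "), "hello-world");
    println!("{}", slugify("The Pipeline Operator"));
}
```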

Entity Processing

dashboard_data = User.all
    |> filter(u => u.active)
    |> sort(u => u.last_login, "desc")
    |> take(20)
    |> map(u => { name: u.name, last_seen: u.last_login.format("YYYY-MM-DD") })

Starting from all users, filtering, sorting, limiting, and transforming -- each step clearly visible.

Session 150: The Full Picture

The pipeline operator was one of four features implemented in Session 150. The others were where clause constraint validation, a null coalescing operator fix, and rest parameter verification. The session demonstrates a common pattern in FLIN development: combining a significant new feature with targeted bug fixes and feature verification.

Where Clause Validation

Before Session 150, where clauses were parsed but not validated. The syntax worked:

fn max<T>(a: T, b: T) -> T where T: Comparable {
    if a > b { return a }
    return b
}

But the constraint T: Comparable was not checked when calling max. Session 150 fixed this by extracting and merging constraints from both inline syntax (T: Comparable in the type parameter list) and where clause syntax (where T: Comparable):

fn extract_constraints(
    &self,
    type_params: &[TypeParam],
    where_clauses: &[(String, Vec<String>)],
) -> HashMap<String, Vec<String>> {
    let mut constraints: HashMap<String, Vec<String>> = HashMap::new();

    // Collect inline constraints
    for param in type_params {
        if !param.constraints.is_empty() {
            constraints.insert(param.name.clone(), param.constraints.clone());
        }
    }

    // Merge where clause constraints
    for (name, bounds) in where_clauses {
        constraints.entry(name.clone())
            .or_insert_with(Vec::new)
            .extend(bounds.clone());
    }

    constraints
}

This function merges constraints from both sources. A type parameter can have inline constraints and where clause constraints, and both are validated at call sites.
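A runnable sketch of the merge at work, with simplified stand-ins for the compiler's types and a hypothetical Printable bound added in the where clause for illustration:

```rust
use std::collections::HashMap;

// Simplified stand-in for the compiler's type parameter representation.
struct TypeParam {
    name: String,
    constraints: Vec<String>,
}

fn extract_constraints(
    type_params: &[TypeParam],
    where_clauses: &[(String, Vec<String>)],
) -> HashMap<String, Vec<String>> {
    let mut constraints: HashMap<String, Vec<String>> = HashMap::new();
    // Collect inline constraints
    for param in type_params {
        if !param.constraints.is_empty() {
            constraints.insert(param.name.clone(), param.constraints.clone());
        }
    }
    // Merge where clause constraints onto the same entry
    for (name, bounds) in where_clauses {
        constraints.entry(name.clone()).or_default().extend(bounds.clone());
    }
    constraints
}

fn main() {
    // fn max<T: Comparable>(a: T, b: T) -> T where T: Printable
    let params = vec![TypeParam {
        name: "T".to_string(),
        constraints: vec!["Comparable".to_string()],
    }];
    let wheres = vec![("T".to_string(), vec!["Printable".to_string()])];
    let merged = extract_constraints(&params, &wheres);
    assert_eq!(merged["T"], vec!["Comparable".to_string(), "Printable".to_string()]);
    println!("{:?}", merged);
}
```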

Null Coalescing Fix

A subtle bug in the ?? operator: the JumpIfNone instruction popped the value from the stack, causing a stack underflow when the value was not none. The fix added a Dup instruction before the check:

fn emit_nullish_coalesce(&mut self, left: &Expr, right: &Expr) {
    self.emit_expr(left);
    self.emit_op(OpCode::Dup);           // Keep a copy
    let jump = self.emit_jump_if_not_none();
    self.emit_op(OpCode::Pop);           // Discard the none
    self.emit_expr(right);               // Evaluate right side
    self.patch_jump(jump);
    // If not none: original value remains on stack
    // If none: right side value is on stack
}

This is a classic VM bug -- incorrect stack management in a branching instruction. The fix is two lines (Dup and Pop), but finding the bug required understanding the exact stack state at every point in the instruction sequence.
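The fixed sequence can be traced with a toy stack. The opcode behavior below (the jump consuming the duplicate it tests) is an assumption matching the description above, not FLIN's actual VM:

```rust
// Toy trace of the fixed `??` sequence: left, Dup, JumpIfNotNone, Pop, right.
// The opcode behavior (the jump consumes the duplicate it tests) is an
// assumption matching the surrounding text.
fn eval_coalesce(left: Option<i64>, right: i64) -> i64 {
    let mut stack: Vec<Option<i64>> = Vec::new();
    stack.push(left);                  // emit_expr(left)
    let dup = *stack.last().unwrap();
    stack.push(dup);                   // Dup: keep a copy
    let tested = stack.pop().unwrap(); // JumpIfNotNone pops the copy
    if tested.is_none() {
        stack.pop();                   // Pop: discard the none
        stack.push(Some(right));       // emit_expr(right)
    }
    // jump target: exactly one value remains on the stack
    stack.pop().unwrap().unwrap()
}

fn main() {
    assert_eq!(eval_coalesce(Some(7), 0), 7); // 7 ?? 0
    assert_eq!(eval_coalesce(None, 0), 0);    // none ?? 0
    println!("ok");
}
```

Without the Dup, the jump's pop would leave the stack empty in the not-none case, which is exactly the underflow the fix addresses.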

Test Results

Session 150 added 16 new tests across the four features:

  • 4 where clause constraint tests
  • 4 pipeline operator tests
  • 4 null coalescing fix tests
  • 4 rest parameter verification tests

Total test count reached 1,879 (1,430 library + 449 integration). All passing.

Design Decisions for Pipelines

Left-to-right, value-first. The piped value becomes the first argument, not the last. This follows Elixir's convention and works well with FLIN's function signatures where the data argument typically comes first.

Desugar at parse time. The pipeline operator does not exist in the AST. It is transformed to function calls during parsing. This eliminates complexity in every downstream pass.

No lambda shorthand. Some pipeline proposals allow value |> .method() or value |> x => x.field. We chose not to support these initially. The function call form is clear and sufficient. Lambda shorthands can be added later without breaking existing code.

No await interaction. In languages with async/await, the pipeline operator sometimes interacts with promises (value |> await asyncFn). FLIN defers this interaction, keeping the pipeline purely synchronous for now.

The Bigger Picture

The pipeline operator completed a suite of developer experience features that transformed how FLIN programs look and feel. Destructuring (Session 097-098) made data extraction concise. The Elvis operator (Session 097) made default values clean. The pipeline operator (Session 150) made data transformation chains readable.

Together, these features let developers write code that reads like a description of what it does, rather than a prescription of how the computer should do it. That is the essence of FLIN's design philosophy: express intent, not mechanism.

The next article covers tuples, enums, and structs -- the data structure primitives that these ergonomic features operate on.

---

This is Part 38 of the "How We Built FLIN" series, documenting how a CEO in Abidjan and an AI CTO designed and implemented a programming language from scratch.

Series Navigation:
  • [36] Tagged Unions and Algebraic Data Types
  • [37] Destructuring Everywhere
  • [38] The Pipeline Operator: Functional Composition in FLIN (you are here)
  • [39] Tuples, Enums, and Structs
  • [40] Type Guards and Runtime Type Narrowing
