I deployed my first production function on AWS Lambda in 2016. Immediately, I was hooked on the promise of Function-as-a-Service (FaaS): atomic scaling, granular observability, and zero-downtime updates. It erased the shared-resource risks and complex versioning challenges that came with traditional monolithic deployments. Small, low-traffic services became trivial to manage.
But Lambda had an Achilles' heel: cold starts. The entire function had to be loaded on first invocation, and with Lambda billing tied closely to memory and CPU usage, heavy bundles meant heavy costs. JavaScript was notorious here: while hand-crafted functions could be mere kilobytes, standard bundled deployments often ballooned into megabytes.
I began exploring dependency reduction techniques in deployments, striving for a better balance of developer experience and optimized bundle sizes. My initial efforts revolved around Webpack, Rollup, and Babel transformations.
I released my first transformed application, using Babel to rewrite ASTs (abstract syntax trees) and seamlessly hydrate serverless API facades. Around this time, inspired by C#'s LINQ, I built auto-sql, which translated expressive JavaScript array methods directly into optimized SQL queries using the TypeScript compiler. Building on that, I developed auto-serverless, which embedded static dependency injection right into serverless deployments.
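The array-method-to-SQL mapping behind auto-sql can be sketched with a tiny runtime query object (auto-sql itself performed this rewriting at compile time via the TypeScript compiler; the API below is illustrative, not auto-sql's real surface):

```typescript
// Illustrative sketch only -- not auto-sql's actual API.
// Records filter()/map()-style calls as SQL clauses to show how
// array-method semantics map onto WHERE and SELECT.

class Query {
  private where: string[] = [];
  private select: string[] = ["*"];
  constructor(private table: string) {}

  // Analogue of Array.prototype.filter: predicate becomes a WHERE clause
  filter(column: string, op: string, value: string | number): this {
    const v = typeof value === "string" ? `'${value}'` : String(value);
    this.where.push(`${column} ${op} ${v}`);
    return this;
  }

  // Analogue of Array.prototype.map: projection becomes the SELECT list
  map(...columns: string[]): this {
    this.select = columns;
    return this;
  }

  toSQL(): string {
    const where = this.where.length
      ? ` WHERE ${this.where.join(" AND ")}`
      : "";
    return `SELECT ${this.select.join(", ")} FROM ${this.table}${where}`;
  }
}

const sql = new Query("users").filter("age", ">=", 18).map("id", "email").toSQL();
// sql === "SELECT id, email FROM users WHERE age >= 18"
```

The compile-time version goes one step further: instead of a query object living in the bundle, the generated SQL string is all that ships.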
Driven by these experiments, I set out to generalize the approach. This became bloom, a framework that introduced cascading plugin transforms: plugins could modify the behavior and code generation of subsequent plugins by directly manipulating their ASTs. Another breakthrough was the use of tagged types in TypeScript, enabling compile-time enforcement of data governance rules for sensitive information like PII and PCI.
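Tagged (also called branded) types are a standard TypeScript pattern; the names below are illustrative, not bloom's actual API. A minimal sketch of how a `PII` tag keeps sensitive values from flowing into an untrusted sink at compile time:

```typescript
// Illustrative sketch of tagged types for data governance.
// `brand` exists only at the type level; nothing extra ships at runtime.
declare const brand: unique symbol;
type Tagged<T, Tag extends string> = T & { readonly [brand]: Tag };

type PII<T> = Tagged<T, "PII">;

// Sink that accepts only untagged strings: a PII-tagged value is rejected
// by the type checker because its brand is not assignable to `never`.
function log(message: string & { [brand]?: never }): void {
  console.log(message);
}

function asPII(email: string): PII<string> {
  return email as PII<string>;
}

const email = asPII("alice@example.com");
// log(email);           // compile-time error: PII cannot flow into log()
log("request handled");  // fine: plain string
```

Because the enforcement happens entirely in the type system, governance rules cost nothing at runtime.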
Yet the ambitious bloom was temporarily shelved due to competing demands. It seemed too bold, perhaps ahead of its time.
Then, in 2023, Bun emerged and reignited my excitement. I quickly prototyped a dependency injection transform that slotted effortlessly into Bun's esbuild-compatible plugin pipeline. That prototype was the spark for `bloom v2`, now named TypeFire.
I'm excited to announce that TypeFire 1.0 is nearing release as an open-source framework, and its potential is thrilling.
Historically, JavaScript development has been weighed down by runtime overhead. Even with modern advancements like Zod for validation, lighter frameworks like Hono, and compiled UI approaches, JavaScript applications remain bloated.
TypeFire changes that. It introduces a new model for JavaScript: one where your code is treated as data and evaluated at compile-time to generate highly optimized, platform-specific output.
What does that unlock?
- Validators that compile to minimal, blazing-fast functions indistinguishable from handcrafted code.
- Type-safe APIs that automatically generate OpenAPI specs, runtime validation, and type inference with zero duplication.
- SQL queries authored as expressive, composable JavaScript that compile to raw SQL with full optimization.
- Reactive UI components that compile into framework-free DOM manipulations or into React, Web Components, or anything else.
- Business logic that compiles differently depending on your target (Node, Workers, Lambda, Deno) with no runtime branching or adapter layers.
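To make the first bullet concrete, here is a hypothetical sketch (not TypeFire's real API) of what a compiled validator could reduce to. The idea is that a declarative schema is evaluated at build time and replaced by output like the function below, with no schema library left in the bundle:

```typescript
// What generated output might look like: a flat, monomorphic check
// indistinguishable from handcrafted validation code.

interface User {
  id: number;
  email: string;
}

function isUser(value: unknown): value is User {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as User).id === "number" &&
    typeof (value as User).email === "string"
  );
}

console.log(isUser({ id: 1, email: "a@b.c" })); // true
console.log(isUser({ id: "1" }));               // false
```

The developer writes the schema once; the shipped artifact is only this kind of minimal function.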
It's not just a build step. It's a way to collapse the abstraction gap between DX and performance, turning TypeScript into a living language.
TypeFire transforms JavaScript development into a magical experience, delivering unmatched developer ergonomics without sacrificing performance. Unlike traditional macros, TypeFire harnesses standard, fully type-checked TypeScript, allowing limitless macro extensions and playful experimentation with types.
If you're intrigued, join our early access list, or follow along as we redefine what's possible with JavaScript. And if you're a developer eager to contribute, let's connect. Reach out to [email protected].