Python isn't just glue, it's an implicit JIT ecosystem
Python is known as the glue language.
It's not fast.
It's not magical.
And yes, it always has a breaking point.
But that's exactly what makes it special.
You can:
- write it in your sleep
- `import any_magic` as you need (batteries included or a global `pip install`)
- conjure up some Frankensteinian FortRust++ library written in MMIX assembly from the dawn of the machine age
- execute your code line by line until it (predictably) breaks, leaving you in an interpreter to introspect the ruins
And here's the thing: that's actually a remarkably good process for early development!
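As a rough sketch of that workflow (the script and the `runs.json` file are hypothetical, and `numpy` is standing in for whatever C-backed magic you happen to import):

```python
# explore.py - a throwaway exploratory script
import json                # batteries included
import numpy as np         # a pip install away, backed by C under the hood

records = json.load(open("runs.json"))            # hypothetical data dump
scores = np.array([r["score"] for r in records])  # pull out the bit we care about
print(scores.mean(), scores.std())
```

Run it with `python -i explore.py` and, when a malformed record (predictably) blows it up, you're left sitting in the interpreter with `records` still in scope to poke at.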
The combination is what's most profoundly special, and it's captured in the broader picture of how the Python ecosystem evolves and what directs that evolution.
When you write Python code you're not just writing glue, you're an explorer in Python's implicit just-in-time compilation ecosystem.
Every time a newfound Python code path becomes hot enough, the ecosystem responds by forging a new component from the barest of metals, which is then glued into place.
This glue isn't static. It evolves much as a desire path does, based on the patterns of usage across the ecosystem.
Python's role isn't just connecting components - it's discovering which components need to exist.
The Python performance paradox
Python is slow. It is known.
But maybe, just maybe, we might want it that way?
When a Python code path becomes slow enough to matter, something counter-intuitive happens: the ecosystem doesn't optimize the Python, it glues in something else.
Python is slow to run, but fast to experiment with. You act as a scout, finding a new path and using it enough to show it matters. If it matters, you might just find that dirt path already paved by the time you turn around.
This is an emergent optimization strategy that works better than any planning could hope to.
Python, by itself, is the antithesis of premature optimization. It's all about getting something running and only later deciding to make it work faster (if that even matters).
When new paths are found, the focus is on expressivity, ease of use, and simplicity over performance. You're not going to win the performance battle so you don't even try to fight it.
As we hit friction with Python's speed and the capabilities it offers, we dip into the bucket of Fast™ languages rather than reinventing the wheel.
This Pareto-optimal API might cover 80% of the necessary use cases (the hot path) while only exposing 20% of the fully fledged bare-metal component's capabilities, but that's perfectly fine.
Rather than reinventing the wheel in your preferred Fast™ language, done out of love rather than good sense, and hoping it covers that 80% of necessary cases, we instead sticky tape in the best existing solution to back off to (one you could extend to full functionality if your needs go further). It doesn't matter if it's written in FortRust++, as the Python API bends itself towards being simple and easy to use.
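As a toy illustration of that back-off, here's the kind of swap the ecosystem encourages, with `numpy` standing in for the glued-in bare metal component (the function names are purely illustrative):

```python
import numpy as np

def pairwise_dists_scout(points):
    # The exploratory pure Python version: slow, but trivial to write and rework.
    return [
        [sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5 for q in points]
        for p in points
    ]

def pairwise_dists_paved(points):
    # The same hot path once it proved to matter, backed by C via numpy broadcasting.
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))
```

The scouting version is the one you write first; the paved version only exists because the path turned out to be hot.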
Python continues bouncing along, optimizing for end user capabilities (ease of use, composition, simplicity, ...) rather than underlying magic. Python doesn't get jealous of other languages and try to steal that hard-won library for itself.
The implicit JIT ecosystem made explicit
Whilst Python is my main language and has lived in my head for half my life, it isn't the only language that exhibits this property.
I might argue it's one of the most successful of the implicit JIT ecosystems, but there are many other glue languages with their own claims to fame. Each time the story follows a similar path: start with a "slow" but expressive language, discover the hot paths through real usage, and only then optimize those specific paths with faster components.
We've seen explicit JIT ecosystems forced into being by many startups and big companies enabled by these MVP-optimal languages. As the startup grew (or was acquired into a BigCo) we saw the Death Star of their engineering orgs burn in new hot paths.
- Python had Instagram (Facebook) and YouTube (Google)
- Ruby / Ruby on Rails had GitHub and Shopify
- PHP had Facebook, which spawned HHVM and the Hack language to handle the hundreds of millions of users arriving at their door
We've seen the same story play out in more recent times as well. Figma, who had already pushed C to the browser via WASM, showed that even gluing a Rust process into their Node.js backend via stdin/stdout can be a win. And, separate from any large companies, there's a lovely small analysis of adding ever larger sprinklings of Rust into Javascript to improve performance.
The implicit JIT ecosystem holds true for more than just Python but Python might have been the largest winner.
Those explicit JIT ecosystem examples likely won't work as well as Python's broader implicit JIT ecosystem however. The ecosystem isn't just the individual, it's the collective. A singular sequoia tree sticking out far above the canopy of the dwarfed forest at its feet.
Python as the fastest slow language
The data science and machine learning ecosystem in Python provides perhaps the clearest example of this implicit JIT evolution.
Python's need for numerical processing became `numpy`, `pandas`, `scikit-learn`, and a hundred other base tools. Being written in C and Cython was enough performance to spark the flame. All of these were well optimized and yet we're still seeing further (slow rolled implicit JIT ecosystem) gains. `Polars` tries to see where Rust might take the `pandas` codebase, for example.
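That bending towards ease of use shows up in how similar the two front ends feel, whatever the backend is written in. A minimal sketch (the CSV file is hypothetical, and Polars has shifted method spellings like `group_by` between versions):

```python
import pandas as pd
import polars as pl

# pandas: the C/Cython-backed original
fares_pd = pd.read_csv("trips.csv").groupby("city")["fare"].mean()

# Polars: the Rust-backed rethink, deliberately keeping a familiar feel
fares_pl = pl.read_csv("trips.csv").group_by("city").agg(pl.col("fare").mean())
```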
Even more surprising are the core ML frameworks themselves, `pytorch` and `tensorflow` among them.
Torch started life in Lua until it later became PyTorch. That language shift was motivated not by speed but by simplicity and the ecosystem. Like other libraries it had to find a way to surface itself in Python or risk decreased usage.
A small library called autograd asked "What if we could differentiate native Python and Numpy code?" and became an underlying inspiration for early JAX. Now Python compiles to run on some of Google's most expensive and custom hardware.
The acronym JAX stands for “just after execution”, since to compile a function we first monitor its execution once in Python.
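A minimal sketch of that trace-once-then-compile behaviour, using the public `jax.grad` and `jax.jit` APIs (the toy loss function here is arbitrary):

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Plain-looking Python, written as if it were numpy
    return jnp.sum((x @ w) ** 2)

grad_loss = jax.grad(loss)   # differentiate the native Python function
fast_loss = jax.jit(loss)    # trace its execution once, then run the compiled version

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)
print(fast_loss(w, x), grad_loss(w, x))
```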
Somehow Python, the slow language, is now in charge of most of the FLOPS in the world.
Slow Python code that's important is a structural flaw that the ecosystem works to correct.
The glue language is also the LLM language
Python was optimized to be concise, forgiving, incremental, and (relatively) simple for humans - which turns out to be exactly what LLMs need as well.
For most tasks you likely don't want anything beyond three lines (`import`, setup, execution) if you can help it.
Python is optimally placed for that:
- no overhead to play,
- a full ecosystem of interoperable components to thread together,
- and a set of APIs built to be forgivingly simple to a wayward programmer.
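As an example of that three line shape (the CSV file is hypothetical, and `pandas` is just one of many libraries that fit the pattern):

```python
import pandas as pd                     # import
df = pd.read_csv("experiments.csv")     # setup
print(df.describe())                    # execution
```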
The Python interpreter itself becomes a powerful environment for self-play, suggesting Python's role as a glue language might be more central to the age of AI than giving you a hot path to run the matrix multiplications.
The implicit JIT ecosystem gets a true JIT
After decades of being the catalyst for an implicit JIT ecosystem, Python is finally experimenting with its own JIT in 3.13.
It's a fitting evolution. That slowness we once cursed might well have been a necessary catalyst for the ecosystem to build hot paths that Python alone would never have been capable of.
Python's performance constraints didn't just create an ecosystem of optimized components, they allowed us to see exactly what paths needed to be optimized, no more and no less. Sometimes the best path forward is going both fast and slow.