Most recent update: 7th November 2024 - 06:37:34
Python as Glue: Programming Language and Sticky Ecosystem
- Title could also be "Of Emojis, Python, Rust, and Common Crawl"
- "Gluing Python, Rust, Emojis, and Common Crawl together"
- Python is the magical glue programming language I've been happy with for my entire life
- Low cognitive overhead, rich ecosystem, excellent interoperability, fast development, strong standard library
- Most tasks need only a minimal number of lines of code, written without a single lookup (syntax and the base libraries live in internal knowledge)
- Most of those lines are high efficiency, especially if the task is on a common path
- For the uncommon paths that become common, it's a structural fault to have them slow because ...
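- As a minimal sketch of what "few lines, no lookups" looks like in practice, the snippet below counts the most common words in a text file using only the standard library; the filename is just a placeholder:

```python
from collections import Counter
from pathlib import Path

# "notes.txt" is a placeholder filename
words = Path("notes.txt").read_text().lower().split()
for word, count in Counter(words).most_common(10):
    print(f"{count:6d}  {word}")
```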
- Python is an ecosystem that agglomerates high efficiency libraries together (glue programming language): numpy, pandas, pytorch, tensorflow, scikit-learn, ...
- Even if not everything is exposed to Python it's usually a Pareto optimal selection of the 20% of the API that covers 80% of the tasks
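- As a sketch of that Pareto-selected surface: a few numpy calls cover a lot of day-to-day numeric work, with the actual compute happening in compiled code rather than the Python interpreter (assumes numpy is installed):

```python
import numpy as np

# A million samples; the heavy lifting runs in numpy's compiled core,
# not in a Python-level loop
x = np.random.default_rng(0).normal(size=1_000_000)
normalised = (x - x.mean()) / x.std()
print(normalised.mean(), normalised.std())  # ~0.0 and ~1.0
```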
- Reminded, starkly, of this when writing a Rust tutorial and a companion Python implementation
- The example task was "too simple" in that it fell within Python's hot path (file I/O, decompression, JSON, HTTP requests, ...) and hence was only a tad less efficient than Rust
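- A rough sketch of that hot path, all standard library; the URL is a placeholder rather than a real endpoint:

```python
import gzip
import json
import urllib.request

# Placeholder URL standing in for a gzipped file of JSON lines
url = "https://example.com/records.json.gz"
with urllib.request.urlopen(url) as response:
    lines = gzip.decompress(response.read()).splitlines()

records = [json.loads(line) for line in lines]
print(f"parsed {len(records)} records")
```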
- Using a glue language and passing over to a fast custom language for the compute heavy core task is a winning strategy
- Perhaps reframe as "Reach for Rust less when you're playing with glue"
- Python is the optimal glue language, very much meant to have things glued into it, and Rust is the optimal fast language that's made to be glued in
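- A sketch of the gluing mechanics at their lowest level: Python loads a compiled shared library and calls straight into it. Here it's the system C math library (assuming a Unix-like system); a Rust crate built as a cdylib exposing a C ABI gets loaded and called the same way, and bindings generated via PyO3/maturin make it nicer still:

```python
import ctypes
import ctypes.util

# Load the system C math library (assumes a Unix-like system); a Rust
# cdylib exposing a C ABI would be loaded and called exactly the same way
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0, computed in compiled code, driven from Python glue
```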
- Almost all of my large scale work has been Python first, efficiency when needed, and Python gets you most of the way most of the time
- Python's "slow" reputation misses the point
- Glue languages are also likely the optimal target for LLMs
- Whilst Google might have started with a Python based web crawler, I'd have transitioned to Rust at this stage - and indeed I know that my robots.txt Rust library, which I wrote about creating and testing against 34 million robots.txt files, has been used by at least one Fortune 500 company for their large scale web work