On 2023-05-02, the tech sphere buzzed with the release of Mojo 🔥, a new programming language developed at Modular under Chris Lattner, renowned for his work on Clang, LLVM, and Swift. Billed as "Python's faster cousin" and "the programming language for all AI developers", Mojo promised speedups of up to 68,000x over Python alongside a familiar Pythonic syntax.

As Mojo reaches its first anniversary, we unpack its journey towards that ambitious promise. This talk draws on practical experience developing a Large Language Model interpretation library in Mojo as part of an AI Safety Camp project. We cast a critical eye over Mojo's performance, evaluate its usability, and explore its potential as a Python superset. Against a backdrop where alternatives like Rust, PyPy, and Julia dominate performant programming for AI, we ask whether Mojo can carve out its niche or whether it will languish as another "could-have-been" in the programming language pantheon.

Jamie Coombes

Affiliation: Coefficient.ai | Coefficient Systems Limited

I'm a Machine Learning Engineer with 3 years of Python and PyTorch development experience. I've provided ML expertise to startups and the UK government, and I am interested in beneficial AI applications.

I spoke at EuroPython Prague this summer, and I have further speaking experience from my prior role as a Science Teacher with TeachFirst. My background is in Physics and then Atmospheric Physics at Imperial College London, where I interpreted large tropical cyclone datasets.

Visit the speaker at: GitHub • Homepage