Tom MacWright

tom@macwright.com

Rust is a hard way to make a web API

Rust is an amazing language. It has enabled excellent CLI tools like ripgrep and exa. Companies like Cloudflare are using Rust for their own systems and encouraging people to write Rust to run microservices. Rust makes it possible to write really fast software that’s secure, tiny, and more concise than C++ or C.

If I were writing a geocoder, a routing engine, a real-time messaging platform, a database, or a CLI tool, Rust would be at the top of the list.

But last year, I spent some time trying to make Rust work for a plain-vanilla API to power a normal website. It wasn’t a very good fit.

Lots of missing pieces

Rust has a fair number of web server frameworks, database connectors, and parsers. But building authentication? You have only very low-level parts. Where Node.js will give you passport and Rails has devise and Django gives you an auth model out of the box, in Rust you’re going to build this system by learning how to shuttle a shared vec into low-level crypto libraries. There are libraries trying to fix this, like libreauth, but they’re nascent and niche. Repeat for plenty of other web framework problem areas.
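To make the "very low-level parts" point concrete, here's a minimal sketch of what password hashing looks like when you wire it together yourself with the ring crate's pbkdf2 module - the kind of glue that passport or devise would otherwise own. The parameters and salt handling here are illustrative, not a vetted recipe.

```rust
// Cargo.toml (illustrative): ring = "0.16"
use ring::rand::{SecureRandom, SystemRandom};
use ring::pbkdf2;
use std::num::NonZeroU32;

const CREDENTIAL_LEN: usize = 32;   // SHA-256 output size
const ITERATIONS: u32 = 100_000;    // illustrative; tune for your hardware

fn hash_password(password: &str) -> ([u8; 16], [u8; CREDENTIAL_LEN]) {
    // Generate a random per-user salt; storing it is your problem.
    let mut salt = [0u8; 16];
    SystemRandom::new().fill(&mut salt).expect("rng failure");

    // Derive the credential bytes; storing these is also your problem.
    let mut hash = [0u8; CREDENTIAL_LEN];
    pbkdf2::derive(
        pbkdf2::PBKDF2_HMAC_SHA256,
        NonZeroU32::new(ITERATIONS).unwrap(),
        &salt,
        password.as_bytes(),
        &mut hash,
    );
    (salt, hash)
}

fn verify_password(password: &str, salt: &[u8], stored_hash: &[u8]) -> bool {
    pbkdf2::verify(
        pbkdf2::PBKDF2_HMAC_SHA256,
        NonZeroU32::new(ITERATIONS).unwrap(),
        salt,
        password.as_bytes(),
        stored_hash,
    )
    .is_ok()
}
```

None of this is rocket science, but it's exactly the layer a passport or devise user never has to think about, and it's an easy place to get parameters, storage, or comparison subtly wrong.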

How about SDKs? In mainstream languages, you’ll be able to plug into Google Cloud services, AWS, or Stripe by bringing in an official library. Those libraries are mostly great. The aws-sdk-js and Stripe libraries, for example, are incredibly well-designed and maintained.

Not so with Rust. There are a few third-party libraries trying to fill in the blanks, which is great, but with the sheer velocity of those services, will they really be able to give a quality experience?

Some people will say well, X language is so good you can just write an SDK yourself in a weekend! To which I must reply, no.

Rust’s ecosystem is rich in other domains. The crates for building CLIs, managing concurrency, doing really impressive operations with binary data and low-level parsers - they’re spectacular.

Rust’s compiler is faster than it was, but still slow

I’ve been reading Nicholas Nethercote’s excellent blog for years now, in which he describes how the Rust team has made the compiler faster. And they certainly have made it faster!

But compared to other languages you build websites with, it’s slow. It’s much slower than the Go compiler and much, much slower than the startup time for interpreted languages like JavaScript, Ruby, and Python.

Once your code is compiled, everything's amazing! But in my case, this basic API - which wasn't even feature-complete and was by no means a complex system - took more than ten minutes to compile. On the weak hardware of Google Cloud Build, it would time out, every time. We couldn't build anything.

Caching helps as long as you don’t have to rebuild cached dependencies. And, I don’t know, maybe slimming down dependencies would help Rust projects compile faster. But serde, for example - the JSON and other-format serializer/deserializer that nearly everyone uses - takes up a huge chunk of compile time. Should we replace serde with something that compiles faster but lacks great documentation and ecosystem support? It’s a bad trade.
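For context, serde is everywhere because a couple of derive lines buy you all of your JSON encoding and decoding - a sketch like this is what nearly every endpoint ends up containing, and generating that code is where the derive macros spend their compile time:

```rust
// Cargo.toml: serde = { version = "1", features = ["derive"] }, serde_json = "1"
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
struct Photo {
    id: i64,
    title: String,
    author_id: i64,
}

fn main() -> Result<(), serde_json::Error> {
    // Decode a request body, re-encode a response: the bread and butter
    // of a CRUD API, all generated by the derive macros above.
    let photo: Photo = serde_json::from_str(r#"{"id": 1, "title": "Dunes", "author_id": 7}"#)?;
    println!("{}", serde_json::to_string(&photo)?);
    Ok(())
}
```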

Rust is complicated

Rust makes you think about dimensions of your code that matter tremendously for systems programming. It makes you think about how memory is shared or copied. It makes you think about real but unlikely corner cases and make sure that they’re handled. It helps you write code that’s incredibly efficient in every possible way.

These are all valid concerns. But for most web applications, they’re not the most important concerns. And buzzword-oriented thinking around them leads to some incorrect assumptions.

Take, for example, Rust's safety. This is a big part of the marketing, and it's absolutely correct: Rust's main promise is to be both safe and low-level - it works without a garbage collector while protecting against memory-based exploits. When you read "safety", think about Rust competing with C. Code in C can reference arbitrary memory, and can easily overflow and segfault. Rust code can be just as fast as that C code while protecting that memory access, and without the cost of a garbage collector or some other kind of runtime checking.
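As a small illustration of what that safety means relative to C: an out-of-range read simply isn't expressible in safe Rust without either a panic or an explicit Option to handle.

```rust
fn main() {
    let bytes = [1u8, 2, 3];
    // Pretend this index arrives at runtime from somewhere else.
    let i = std::env::args().count() + 10;

    // In C, `bytes[i]` compiles happily and reads whatever memory sits
    // past the end of the array. In Rust, the indexing operator is
    // bounds-checked and would panic here, and the safe accessor turns
    // the failure into a value you have to deal with:
    match bytes.get(i) {
        Some(b) => println!("byte: {}", b),
        None => println!("index {} is out of bounds", i),
    }
}
```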

But Rust’s memory rules aren’t more secure than Node.js’s or Python’s. Your web application written in Rust isn’t going to be systematically more or less secure than an application in Python or Ruby. High-level languages with garbage collectors pay a performance penalty in exchange for generally dodging this whole class of exploits and bugs. You can’t reference uninitialized memory in JavaScript because you simply can’t reference memory-as-memory in JavaScript.

Sidenote: this describes the design goal of Node.js and other such systems - bugs do occasionally creep into this problem area. The previous behavior of Node.js's Buffer object, which could expose uninitialized memory, is a good read.

Heck, if you ask some people, Rust is less secure than a GC'ed language for web apps if you use any crates that have unsafe code - which includes Actix, the most popular web framework - because unsafe code allows things like dereferencing raw pointers.
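For context, `unsafe` is the escape hatch that switches those guarantees off inside a block - a tiny sketch of the kind of thing it permits:

```rust
fn main() {
    let x: u32 = 42;
    let p = &x as *const u32; // creating a raw pointer is safe...

    // ...but dereferencing one is not: the compiler can't prove the
    // pointer is valid, so the deref has to live in an `unsafe` block,
    // and correctness becomes the programmer's responsibility again.
    let y = unsafe { *p };
    println!("{}", y);
}
```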

If you’re writing a video game, a pause to run garbage collection is bad. If you’re writing code for a microcontroller, any memory “overhead” or waste is really bad. But most web applications can spare a little memory overhead in exchange for productivity.

This argument is pretty much the same for the other attributes of Rust. Its concurrency primitives are amazing if you're doing something complicated and need blistering-fast performance. But if you aren't? The Rust async ecosystem is challenging, to say the least: there are different sorts of async to choose between, and projects like tokio that sprawl across domains to provide async implementations of otherwise unrelated things.

It feels less like Node.js, which had a good async story but ugly syntax, and more like Python's Tornado or Twisted, which had a weird async story and ugly syntax.
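To give a flavor of the ceremony: unlike Node, where the event loop ships with the runtime, in Rust you pick a runtime and wire it in yourself before a single `.await` will run. A minimal sketch, assuming tokio:

```rust
// Cargo.toml: tokio = { version = "1", features = ["full"] }
use std::time::Duration;

// This attribute macro stands up tokio's executor. Swap runtimes
// (async-std, smol, ...) and this line changes - along with the
// feature flags of any async dependencies you pull in.
#[tokio::main]
async fn main() {
    println!("{}", slow_hello().await);
}

async fn slow_hello() -> String {
    // Even sleeping is runtime-specific: this is tokio's sleep, not std's.
    tokio::time::sleep(Duration::from_millis(50)).await;
    "hello".to_string()
}
```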

Async, I’m sure, will stabilize and homogenize and be a lot easier to do in the future. But I was working in the present.

The Rust ecosystem is not web-centric

There are many people currently learning Rust, writing CLI apps or low-level code in Rust, and having an extremely fun time. There are dramatically fewer people using Rust to write plain-vanilla web applications.

This is an important part of the equation for technology choices: are there people working with the tool and are they roughly in the same domain? Unfortunately, a lot of the incredibly exciting work in the Rust ecosystem has nothing to do with web application servers. There are some promising web frameworks - even a somewhat higher-level framework - but they’re undoubtedly in a niche. Even Actix, the main web framework, has a very top-heavy set of contributors.

If Rust grows at its current rate, the web portion of the community will reach a sort of critical mass, but right now I don't think there are enough people using Rust for websites for it to be a practical tool for that purpose. Compare that to other communities, in which there are entire companies dedicated to building web applications with existing tools - not cutting-edge work, but the kind of stuff that differentiates a mature technology from a new one.

The Juniper crate invites n+1s

This part isn’t just about Rust, it’s about the GraphQL ecosystem and Rust’s involvement in that ecosystem is one example.

The n+1 problem is something that everyone building web applications should understand. The gist: you have a page of photos (1 query) and you want to show the author of each photo. How many queries do you end up with? One, joining the photos and their authors; a query per photo to fetch its author after retrieving the photos - that's the n+1; or two queries, where the second uses something like user.id IN (...) to fetch all the authors in a single pass and then reconnect them to their photos.

n+1 queries are usually the highest-priority database fixes: they're high-impact, and collapsing an n+1 into a single query is usually a big win. And we have lots of ways to try to resolve them: you can write SQL and try to get a lot done in a single query using CTEs and JOINs, like we did at Observable, or use an ORM layer like ActiveRecord that has quick ways to turn n+1 queries into predictable ones.
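In code, the difference looks something like the sketch below. The `Db` type and its methods are hypothetical stand-ins for whatever database layer you're using; the point is the shape of the queries, not the API.

```rust
use std::collections::HashMap;

// Hypothetical types standing in for your database layer.
struct Db;
struct Photo { id: i64, author_id: i64 }
#[derive(Clone)]
struct User { id: i64, name: String }

impl Db {
    fn photos_page(&self) -> Vec<Photo> { unimplemented!() }
    fn user_by_id(&self, _id: i64) -> User { unimplemented!() }
    // e.g. SELECT * FROM users WHERE id = ANY($1)
    fn users_by_ids(&self, _ids: &[i64]) -> Vec<User> { unimplemented!() }
}

// The n+1 shape: one query for the page, then one more query per photo.
fn authors_n_plus_one(db: &Db) -> Vec<(Photo, User)> {
    db.photos_page()                                  // 1 query
        .into_iter()
        .map(|p| {
            let author = db.user_by_id(p.author_id);  // +1 query per photo
            (p, author)
        })
        .collect()
}

// The two-query fix: gather the author ids, fetch them all in one pass,
// then stitch them back onto their photos in memory.
fn authors_batched(db: &Db) -> Vec<(Photo, User)> {
    let photos = db.photos_page();                                      // query 1
    let ids: Vec<i64> = photos.iter().map(|p| p.author_id).collect();
    let authors: HashMap<i64, User> =
        db.users_by_ids(&ids).into_iter().map(|u| (u.id, u)).collect(); // query 2
    photos
        .into_iter()
        .filter_map(|p| {
            let author = authors.get(&p.author_id)?.clone();
            Some((p, author))
        })
        .collect()
}
```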

We were using Juniper, a GraphQL server for Rust applications. GraphQL basically lets your frontend application define queries, instead of the backend. You give it a range of things it could query, and the application - React or something else - sends arbitrary queries to the backend.

This makes things hard for the backend. Any sort of SQL-level optimization is impossible - your server is writing dynamic SQL, so you rely on the intelligence of your GraphQL server, which is not always high. Juniper, for example: n+1 queries by default. The workaround - a dataloader - is rough and independently maintained. So at the end of the day, you’re going to have a blisteringly-fast application layer that’s spending all of its time inefficiently querying your database.
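The structural reason: a GraphQL server resolves field by field, one object at a time. Roughly - and this is a sketch with hypothetical types, not Juniper's actual macros or API - every photo's author field runs its own lookup, and nothing at this layer ever sees the whole page of photos at once:

```rust
// Hypothetical context and types; a real Juniper setup wraps these in its
// attribute macros, but the per-object resolver shape is the same.
struct Context { /* database handle lives here */ }
struct User { name: String }
struct Photo { author_id: i64 }

impl Photo {
    // Resolver for the `author` field: called once per photo in the result
    // set. Fifty photos on the page means fifty lookups, unless something
    // like a dataloader batches them behind the scenes.
    fn author(&self, ctx: &Context) -> User {
        load_user(ctx, self.author_id)
    }
}

// Hypothetical single-row lookup; this is where the n+1 comes from.
fn load_user(_ctx: &Context, _id: i64) -> User {
    unimplemented!("SELECT * FROM users WHERE id = $1")
}
```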

The word is that GraphQL works really well with non-SQL databases that can serve these sorts of requests fast. I'm sure there's some special database used internally at Facebook that's incredible in combination with GraphQL, but the rest of the industry is pretty attached to Postgres and its ilk, for good reason.

Let’s have some caveats!

So I tried to lead with the main caveat: this isn't about Rust in general. It's about using the language and its ecosystem for a particular goal - simple web APIs.

The caveat to that: in the general sense, you can build a website with anything and be successful. Remember that OkCupid was implemented in C++. There's a popular astrology app, Co-Star, that's all Haskell. If you're great at writing some language and you can hire other people with lots of talent, you can do it and be heroes.

Another caveat: what I was trying to build was a CRUD-heavy web application API for a website. It wasn’t a web “service” as you might call them nowadays, something that did one operation very fast and millions of times, but a web “application” - something that did quite a few different operations and had a fair bit of domain logic in it. If you’re not building that kind of thing, this advice might not apply! If what you need is to do one or two things at hyper-fast speed, like if you’re writing a payment gateway or voice messaging application, Rust’s tradeoffs might work a lot better.

Here’s another caveat: I’m writing this in January 2021. Assuming that society continues to function, Rust will evolve and will probably get a lot better, and it may become really easy to use for web application development.


All said, I really enjoyed working with Rust. It’s a beautiful language with a lot of cool ideas, and I hope that soon I’ll be thinking about something I want to build and Rust will be the right tool. As it is now, though, a lot of the things I want to build are better served by languages that have different priorities.