I could also have called this article "How to Speed Up Rust Compilation on Google Cloud Build", but that wasn't as fun. The setup for this article is quite straightforward. Unlike in previous projects of mine, I've been diving much deeper into Google Cloud Build recently. I've been building a collection of Rust microservices that will run on Cloud Run, and as such I've been dealing with incredibly slow build times with some regularity. Rust is notorious for extremely slow compilation, trading build-time speed for efficiency and safety at run time. As such, I've taken steps to decrease the build time as much as possible - because sitting there waiting for a build is not how I like to spend my evenings.
By default, Cloud Build runs on a machine with a single CPU core and 4 GB of memory. It costs $0.003 (0.3 cents) per minute of build time, but the first 120 minutes each day are free. If you work that out, it means I can probably do 4.8ish of the above Rust builds per day at this timing before I'll incur any cost. This isn't bad from a cost perspective, as it's still basically nothing, and early on I was fine with this limitation as I did local testing first and only had a couple of services. We'll come back to this later though, as you may have guessed from the title of the article.
Step 1 - Dependency Caching
One of the first things I attempted in order to speed up the builds was dependency caching. This is a natural first step for any developer who is used to working in NodeJS; downloading dependencies can be a huge time sink in any CI pipeline, in any language. The Rust build takes place inside a Docker container, as that's what we need to deploy to Cloud Run. Given this info, you might infer I had a dockerfile that looked something like this.
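A minimal sketch of what such a multi-stage dockerfile might look like - the image tags and the `my-service` binary name are assumptions, not the original file:

```dockerfile
# Build stage: full Rust toolchain (image tag is an assumption)
FROM rust:1.70 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

# Runtime stage: slim image containing only the compiled binary
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/my-service /usr/local/bin/my-service
CMD ["my-service"]
```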
In this dockerfile, you can see that we're building the Rust application and then copying it into a slimmed-down executable container. This just means that we don't have to deploy all of our build dependencies, as those can sometimes be quite chunky. Before we can cache the dependencies, though, we first need to separate downloading the dependencies from the actual build step. This can be done by adding a single line to our dockerfile just before running the build.
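As a sketch of the idea (the exact files copied depend on your project layout): copy only the Cargo manifests and fetch dependencies before the source code is copied in, so the download lands in its own cacheable layer that is only invalidated when the dependencies themselves change.

```dockerfile
# Copy only the manifests, so this layer's cache survives source edits
COPY Cargo.toml Cargo.lock ./
# Download (but don't compile) dependencies in their own layer
RUN cargo fetch

# Now copy the source and build as before
COPY src ./src
RUN cargo build --release
```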
Now, that won't improve speed on its own. We'll also need to update our cloudbuild.yml file to use something called Kaniko rather than using Docker directly. Let's start by looking at a normal Docker-based cloudbuild file.
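Something like the following, where the image name is an assumption:

```yaml
steps:
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service', '.']
  # Push it to the container registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/my-service']
images:
  - 'gcr.io/$PROJECT_ID/my-service'
```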
As you can see, it's quite simple: it runs docker build and then pushes the container to the container registry. I've elected not to include the deploy step here as it has a lot of arguments and it's not really the point of the article (even at worst, this step takes a single minute).
Kaniko is a Docker build caching tool which caches the layer produced by each command of the dockerfile as it completes, and then allows us to reuse that cache for future builds if there aren't any changes up until that command. It also takes care of a lot of the cloudbuild steps for us: we no longer need to pass images in explicitly, and it takes care of pushing the image to the registry. As such, we can replace all of the above cloudbuild steps with the following:
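A sketch of the replacement step, with the destination image name assumed:

```yaml
steps:
  - name: 'gcr.io/kaniko-project/executor:latest'
    args:
      # The image we'd previously have listed under `images`
      - '--destination=gcr.io/$PROJECT_ID/my-service'
      # Enable layer caching and keep cached layers for 48 hours
      - '--cache=true'
      - '--cache-ttl=48h'
    env:
      - 'DOCKER_BUILDKIT=1'
```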
The main change is that we've switched the builder from the standard Docker image to gcr.io/kaniko-project/executor:latest. The args that we've passed have also changed. I personally think the Kaniko arguments are really clear: we pass a destination (the image that we previously would have included at the end of the cloudbuild file), we enable caching, and we give it a time for which to retain each cached layer (in my case 48 hours). We still pass the env variable to use the newer Docker BuildKit, but that's more dependent on your build.
Step 2 - Just Pay For The Faster Server Stupid
I think I gave away the answer with the title of the article, so let's re-cover some ground here. Cloud Build is running on a machine with a single CPU core and 4 GB of memory. My builds are free for the first 120 minutes of build time each day, and on average I'm doing 6-7 builds each evening as I'm still in active development. This has meant that I've been paying for about 43 minutes of build time each day I work on the project. At $0.003 per minute, that works out to 12.9 US cents per day, or $2.97 a month given that I don't work on the project every day. Google Cloud Build does offer a better machine for builds, though, at a little bit of extra cost.
Let's look at the e2-highcpu-8. It has 8 virtual CPU cores, as opposed to the 1 we're currently working with, and an additional 4 GB of memory. Rust's compiler does something quite interesting when you give it multiple cores: cargo walks the dependency graph and compiles independent crates in parallel, meaning that with 8 cores we'll be building up to 8 dependencies at a time. For smaller projects this probably won't speed up the build too much, but for a project with a few dependencies, or even one super chunky dependency, it will show a marked improvement. To do a build with this better machine, all we do is add the following code to the end of our cloudbuild file.
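The two lines in question set the machine type in the build options:

```yaml
options:
  machineType: 'E2_HIGHCPU_8'
```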
With that tiny 2-line change, we take the build down to 6 minutes 17 seconds, with only 4:42 for the actual Rust compilation step. However, as the free build minutes don't apply and it costs $0.016 per minute, each build will cost just under 10 US cents to perform. If you're doing hundreds of builds an hour this is likely to add up, but personally, for my usage, I think that's a justifiable cost. Even at our current poor exchange rate, it's only £0.08 GBP to save ourselves 74.75% of the original build time.
Assume that time is valuable and that you value yours at £15 an hour. By spending £0.08 to save those 18 and a half minutes, you're not so much spending £0.08 as you are saving £4.65. As developers, especially when learning, we can be averse to spending money - no matter how little it actually ends up being - so in the nicest possible way (and this is a message to myself as well): "Just Pay For The Faster Server Stupid!"