At Unleashed Technologies we use GitLab CI with Docker runners for our continuous integration testing. We've put significant effort into speeding up build execution, and one of the optimizations we made was to share a cache volume across all CI jobs, allowing them to share files like package download caches.
Configuring the Docker runner was simple - we just dropped volumes = ["/srv/cache:/cache:rw"] into our config.toml file:
concurrent = 6
check_interval = 0

[[runners]]
  name = "ut-ci01"
  url = "https://gitlab.example.com/"
  token = "xxxxxxxxxxxxx"
  executor = "docker"
  [runners.docker]
    tls_verify = false
    image = "unleashed/php:7.1"
    privileged = false
    disable_cache = false
    volumes = ["/srv/cache:/cache:rw"]
  [runners.cache]
As a result, all CI jobs have a /cache directory available (which is mapped to /srv/cache on the Docker host).
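For example, a minimal job like this hypothetical one can inspect the shared directory - its contents persist across jobs and even across pipelines, because they live on the host:

# .gitlab-ci.yml - a hypothetical job demonstrating the shared mount.
# Every job on this runner sees the same /cache contents, since the
# directory is bind-mounted from /srv/cache on the Docker host.
inspect-cache:
  script:
    - ls -la /cache
    - du -sh /cache/* || true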
The next step was making the package managers use this cache directory whenever jobs run commands like composer install or yarn install. Luckily, these package managers allow us to configure their cache directories using environment variables:
- Composer: COMPOSER_CACHE_DIR
- Yarn: YARN_CACHE_FOLDER
- npm: NPM_CONFIG_CACHE
- Bower: bower_storage__packages
- RubyGems: GEM_SPEC_CACHE
- pip: PIP_CACHE_DIR
So we simply added these ENV directives to the Dockerfiles for our base images:
ENV COMPOSER_CACHE_DIR /cache/composer
ENV YARN_CACHE_FOLDER /cache/yarn
ENV NPM_CONFIG_CACHE /cache/npm
ENV bower_storage__packages /cache/bower
ENV GEM_SPEC_CACHE /cache/gem
ENV PIP_CACHE_DIR /cache/pip
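If rebuilding the base images isn't an option, the same effect can be achieved per project instead, since GitLab CI exports everything under variables: in .gitlab-ci.yml into the job's environment. A minimal sketch (not part of our setup):

# .gitlab-ci.yml - equivalent configuration without touching the Dockerfiles.
# GitLab CI exports these as environment variables in every job.
variables:
  COMPOSER_CACHE_DIR: /cache/composer
  YARN_CACHE_FOLDER: /cache/yarn
  NPM_CONFIG_CACHE: /cache/npm
  bower_storage__packages: /cache/bower
  GEM_SPEC_CACHE: /cache/gem
  PIP_CACHE_DIR: /cache/pip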
Now, whenever a job needs a package installed, it'll pull from our local cache instead of downloading from a remote server! This provides a noticeable speed improvement for our builds.
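For instance, a typical PHP job now only hits the network the first time any pipeline on this runner needs a given package (hypothetical project, standard Composer flags):

# .gitlab-ci.yml - the first run populates /cache/composer; later runs,
# in any project on this runner, install the same packages from the cache.
test:
  script:
    - composer install --prefer-dist --no-progress
    - vendor/bin/phpunit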