Optimizing the CI/CD process of a PHP application

Mateusz Piatkowski
Apr 17, 2021

It doesn’t matter whether you use Jenkins, CircleCI, GitLab CI, or Bitbucket Pipelines: the CI/CD process is one of the most important steps in software development, and if it takes a long time it can frustrate the developers. It’s good practice to optimize these processes so they execute faster and don’t keep your team waiting. In this article, I will present some basic steps that can improve your pipeline’s speed and don’t require many changes to your project.

Of course, each technology requires its own steps in the CI/CD process, so this article can’t be universal. As an example, I will take a PHP application written in Symfony and running in Docker. Some of the tips can be applied to any stack, though.

The example application

As mentioned above, the steps described below are based on a PHP application with the following specification: it is written in Symfony, runs in Docker, and manages its dependencies with Composer.

I will give all the examples for Bitbucket Pipelines, but most likely they can be applied in any CI/CD tool.

Improvements step by step

Use a custom Docker image

This is probably the step that will help you the most if you are not doing it yet. Building the environment can take time: depending on how many dependencies you have, between tens of seconds and a few minutes for a PHP application. Normally, a PHP application needs specific PHP extensions, libraries for image manipulation, compression, and so on.

A naive approach is to use the official PHP Docker image and install all the needed dependencies in the first step of the pipeline. Something like this:
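
A minimal sketch of such a pipeline (the specific packages and extensions are only illustrative; adjust them to what your application needs):

    image: php:8.0

    pipelines:
      default:
        - step:
            name: Build and test
            script:
              # System libraries and PHP extensions installed on every single run
              - apt-get update && apt-get install -y libzip-dev libpng-dev libicu-dev
              - docker-php-ext-install zip gd intl pdo_mysql
              # The official php image does not ship Composer, so install it too
              - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
              - composer install
              - vendor/bin/phpunit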

This is the wrong way to install dependencies in a CI/CD pipeline

This way, every time you run your pipeline you have to install all the dependencies. It takes time and is error-prone, as you have to remember to update your pipeline configuration every time you need a new PHP extension or library. You can enable caching so that the apt installation will be shorter, but it’s still a bad approach.

A much better way is to have your own CI/CD Docker image that you build, push to Docker Hub, and use in your pipeline. This way, you execute the build step once on your computer, and the pipeline just pulls the image with all the necessary dependencies already installed. In this case, your pipeline will look something like this:
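
A sketch of the same pipeline with a prebuilt image (yourcompany/php-ci is a hypothetical image name; use your own repository):

    # All extensions, libraries, and Composer are already baked into this image
    image: yourcompany/php-ci:8.0

    pipelines:
      default:
        - step:
            name: Build and test
            script:
              - composer install
              - vendor/bin/phpunit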

CI/CD pipeline using a custom Docker image

This image can be the same as, or very similar to, your production image. This way, you manage your dependencies in one place: the Dockerfile.
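
A minimal sketch of such a Dockerfile (again, the extension list is illustrative and should match whatever your application actually needs):

    FROM php:8.0-cli

    # System libraries required by the PHP extensions below
    RUN apt-get update && apt-get install -y --no-install-recommends \
            libzip-dev libpng-dev libicu-dev \
        && rm -rf /var/lib/apt/lists/*

    # PHP extensions the application needs
    RUN docker-php-ext-install zip gd intl pdo_mysql

    # Composer, copied from the official Composer image
    COPY --from=composer:2 /usr/bin/composer /usr/bin/composer

You build and push it once, for example with docker build -t yourcompany/php-ci:8.0 . followed by docker push yourcompany/php-ci:8.0, and rebuild only when the dependencies change.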

Use caches

Even if you are using a custom Docker image, there are some dependencies that you have to install on every build. All the Composer dependencies (or those of whichever package manager your application uses) have to be installed according to the current lockfile. In a mature application, they don’t change every day; most likely you update some packages every few weeks. It’s a waste to download all the dependencies on every build. This is where caches come in.

Bitbucket has pre-defined caches for the most popular package managers. To enable the Composer cache, add the following lines to the step that installs the Composer dependencies:

    - step: &composer
        caches:
          - composer

This way, Bitbucket will cache all the downloaded packages, and if the next pipeline run installs the same version of the same package, it will be loaded from the cache instead of downloaded.

You can also create custom caches for any dependencies you download during the build process. Be careful, however: the cache is saved in Bitbucket at the end of a successful pipeline and is not updated automatically on every build. That means you can’t cache, for example, a PHPStan cache, as it normally changes on every commit.

If, however, you have to download some external files on every build, it’s good practice to cache them like so:
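
A sketch of a custom cache definition (the cache name, path, and download script are hypothetical):

    definitions:
      caches:
        # Reuse downloaded GeoIP data between builds
        geoip: data/geoip

    pipelines:
      default:
        - step:
            caches:
              - geoip
            script:
              # The script can skip the download when the cached file is present
              - ./scripts/download-geoip.sh
              - vendor/bin/phpunit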

Creating a custom cache in a Bitbucket pipeline

Before using this option, I recommend reading the Bitbucket documentation (or the documentation of the tool you are using) to learn the limitations and detailed behavior of caches.

Parallelize pipeline steps

Another improvement you can introduce is step parallelization. Some of the checks you run, like various types of tests or linting, do not have to run sequentially. These tasks can be executed at the same time. To achieve this in a Bitbucket pipeline, use the parallel directive:
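
A sketch of such a pipeline (step names and paths are illustrative):

    pipelines:
      default:
        # First, install the dependencies and share them with the later steps
        - step:
            name: Composer install
            caches:
              - composer
            script:
              - composer install
            artifacts:
              - vendor/**
        # Then run the independent checks at the same time
        - parallel:
            - step:
                name: PHPStan
                script:
                  - vendor/bin/phpstan analyse src
            - step:
                name: PHPUnit
                script:
                  - vendor/bin/phpunit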

Example of parallel steps in a Bitbucket pipeline

In the example above, the first step executes the Composer install and generates artifacts that can be used in the subsequent steps. Then PHPStan and PHPUnit run in parallel. Since many of the checks you would normally run in a build process don’t depend on each other, this method can significantly reduce your pipeline times. It may also be a good idea to run various test suites and test types (unit/integration/acceptance) in parallel.

Parallelize tests and linters

This post is sponsored by the letter P and the word “Parallelization”.

As I mentioned above, you can run several steps of your pipeline in parallel, but you can also speed up the execution of specific tasks. I’m not sure about other CI/CD tools, but Bitbucket provides multi-core containers. Thanks to this feature, tools that support multi-threading can run faster. Not all tools provide this out of the box, but there are options out there. I will give a few examples that I found especially useful in a PHP project (a sketch of the corresponding commands follows the list), but I’m sure you can find equivalents in other languages.

  • Parallel PHPUnit testing with Paratest. The installation and usage are trivial. In my case, the unit test execution time in the pipeline dropped by around 30%. The speed-up will never match the number of cores you have, because the process still has to wait for the slowest test, but the change is significant.
  • If you use another tool for testing, you can use a nifty little tool called Fastest. It’s a bit trickier to use, but you can run virtually any command in parallel.
  • Run PHP Code Sniffer in parallel. To enable this option, just add <arg name="parallel" value="8"/> to your configuration file. The value represents the number of parallel processes you want to run.
  • Run Psalm checks with the --threads=X option.
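
For reference, the invocations might look like this (process and thread counts are illustrative):

    # Paratest: run the PHPUnit suite across multiple processes
    vendor/bin/paratest --processes=4

    # Fastest: pipe in a list of test files and run any command on them in parallel
    find tests/ -name "*Test.php" | vendor/bin/fastest "vendor/bin/phpunit {};"

    # Psalm: static analysis with multiple threads
    vendor/bin/psalm --threads=4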

Check the capabilities of your tools. I am sure you will manage to take advantage of multi-threading in most of them.

Use a specific environment setup

This point is maybe not a life-changer, but it can be useful in some cases. Most of the time, you won’t need the full production setup to run your tests. It’s recommended to have an environment as close to production as possible so you can catch issues before the deploy; however, some things can be safely disabled. That depends on the project, but you can think of the following things (a Symfony-flavored sketch follows the list):

  • Sending emails — instead of connecting to a real SMTP server, run a mock such as MailHog or MailCatcher.
  • Debug mode — remember to turn it off (by the way, also have it turned off in production!). With debug mode enabled, applications always have some overhead from profiling the endpoints, additional logging, etc.
  • Logging in databases — things like query logging may slow down your tests.
  • Other external services — if the tests connect to some slow external services, think about mocking them.
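
For a Symfony application, the test environment tweaks could look roughly like this (a sketch; the exact options depend on your Symfony and bundle versions):

    # .env.test: use the null mailer transport instead of a real SMTP server
    MAILER_DSN=null://null

    # config/packages/test/framework.yaml: skip profiler data collection in tests
    framework:
        profiler:
            collect: false

    # config/packages/test/doctrine.yaml: disable SQL logging and profiling
    doctrine:
        dbal:
            logging: false
            profiling: false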

Try to profile your application and see what takes the most time during your tests and builds. You may be surprised by what you find.

Update your tools

This part is just a reminder: it’s always good to keep the tools and libraries you use up to date. New versions are more secure, but they also often have improved performance. Let me give two examples that have helped us a lot in our build processes:

  • Composer 2.0 — the new release introduced parallel package downloads. The update was fast and easy, and package installation is now faster than ever.
  • PHPStan 0.12 — with this huge update, the parallel check became the default option. It sped up the analysis “from minutes to seconds”, as they claim.

But there are more examples; PHP itself, to give just one. The 7.x versions brought an amazing performance boost, and 8.x promises even more.

Summary

As you can see, there are dozens of possibilities for speeding up your continuous deployment process. This post shows just some examples, and only for one specific development stack. Look through your tools, check the possibilities, and optimize. These changes will pay off in the long run, and it’s not only about time: I am sure the development team will be happier too.

