Running Pre-Hook Scripts Before Each Container Startup and Ensuring Persistent Volume Mounting for Caching

Title: Optimizing Docker Container Setup for Buildkite Agents with Pre-Hooks and Persistent Caching

Context

In our CI/CD setup, we are utilizing Buildkite with multiple agents (4 agents) inside a single instance. Our workflow involves distributing jobs among these agents, where each agent independently builds its own Docker container to execute the required steps.

Challenge

  1. Authentication Before Container Startup: We have a scenario where authentication needs to be performed before each container starts. This step is crucial to ensure proper mounting for all agents.

  2. Persistent Caching with Bazel: Our projects are built inside Docker containers using Bazel. To optimize build times, we need to persist the .cache directory across builds.

Current Pipeline Steps

steps:
  - label: ':bazel: Lint Check'
    command: '.buildkite/steps/compile.sh x86_64 //tools:buildifier.check'
    agents:
      os: amazon-linux
      arch: x86_64
    plugins:
      - 'docker-compose#v4.15.0':
          run: protoverse_x86_64
          workdir: /app
          config:
            - .buildkite/docker-compose.yaml
          volumes:
            - 'protoverse-build-cache-x86_64:/home/developer/.cache'
  - label: ':proto: Lint Check'
    command: '.buildkite/steps/compile.sh x86_64 //tools:v1_proto_lint'
    agents:
      os: amazon-linux
      arch: x86_64
    plugins:
      - 'docker-compose#v4.15.0':
          run: protoverse_x86_64
          workdir: /app
          config:
            - .buildkite/docker-compose.yaml
          volumes:
            - 'protoverse-build-cache-x86_64:/home/developer/.cache'
volumes:
  protoverse-build-cache-x86_64:
  protoverse-build-cache-arm64:

Seeking Solutions

  1. Implementing Pre-Hooks: Is there a recommended approach to execute a pre-hook script for authentication before each container starts?

  2. Optimizing Pipeline Configuration: Our pipeline.yaml is becoming lengthy due to repeated plugin configurations. Is there a way to make plugins variables or use some form of template to shorten and simplify the pipeline configuration?

Any insights or suggestions on how to address these challenges effectively would be greatly appreciated!

Thank you!

Hey @oh-tarnished

Thanks for getting in touch. There are a few options available for each solution you are looking for, so I’ll list some for each to see if they work for you before exploring other ways.

This sounds like a good case for the use of the pre-bootstrap Agent Hook or pre-command Hook.

The former runs before the Job begins and could be used to stop Jobs if authentication isn’t possible. The downside is that, as this is an Agent Hook, all Builds/Jobs that run on that Agent would run it - which might be fine based on your description of your Agent setup.

The latter can be configured as an Agent or Repository Hook, offering a bit more flexibility if it is only required for certain Repositories. The slight difference is that the Job would start running before checking whether authentication is possible, but still before Containers are created and commands are run.
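As a minimal sketch of what such a hook could look like: a pre-command hook is just an executable script placed at the hook path. The example below assumes the authentication is a Docker registry login to AWS ECR - the registry, region, and environment variable name are placeholders, not taken from your setup:

```shell
#!/bin/bash
# Repository hook:  .buildkite/hooks/pre-command
# Agent hook:       /etc/buildkite-agent/hooks/pre-command
set -euo pipefail

echo "--- :lock: Authenticating before container startup"

# Hypothetical example: log in to a registry so the docker-compose
# plugin can pull/build images. Swap in whatever auth your setup needs.
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin "${ECR_REGISTRY:?ECR_REGISTRY must be set}"
```

Because the hook runs with `set -euo pipefail`, a failed login exits non-zero and the Job fails before any container is created.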

YAML anchors and aliases are available for defining common Plugin configuration for easier re-use in your Pipeline Configuration(s). This helps reduce the amount of code when a Plugin configuration is used multiple times.
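For example, your pipeline above could be refactored along these lines - define the shared docker-compose plugin configuration once under an anchor, then alias it in each step (the key name `common_plugins` is arbitrary; everything else is taken from your pipeline):

```yaml
common_plugins: &docker_x86_64
  - docker-compose#v4.15.0:
      run: protoverse_x86_64
      workdir: /app
      config:
        - .buildkite/docker-compose.yaml
      volumes:
        - 'protoverse-build-cache-x86_64:/home/developer/.cache'

steps:
  - label: ':bazel: Lint Check'
    command: '.buildkite/steps/compile.sh x86_64 //tools:buildifier.check'
    agents:
      os: amazon-linux
      arch: x86_64
    plugins: *docker_x86_64
  - label: ':proto: Lint Check'
    command: '.buildkite/steps/compile.sh x86_64 //tools:v1_proto_lint'
    agents:
      os: amazon-linux
      arch: x86_64
    plugins: *docker_x86_64
```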

Alternatively, if the Plugin configuration does need to change for different use cases, it might be best to make use of the Dynamic Pipelines functionality of Buildkite. For example, you could have a script that takes in a few parameters with a template which generates the YAML that is then uploaded and run as the new Step(s).
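A sketch of that generator approach, assuming the arch and Bazel target are the parameters that vary (the function and file names here are illustrative, not a Buildkite convention):

```shell
#!/bin/bash
# Hypothetical dynamic-pipeline generator: emits one step per
# (arch, target) pair, then uploads the result as new steps.
set -euo pipefail

generate_step() {
  local arch="$1" target="$2"
  cat <<EOF
  - label: ":bazel: ${target} (${arch})"
    command: ".buildkite/steps/compile.sh ${arch} ${target}"
    plugins:
      - docker-compose#v4.15.0:
          run: "protoverse_${arch}"
          workdir: /app
          config:
            - .buildkite/docker-compose.yaml
          volumes:
            - "protoverse-build-cache-${arch}:/home/developer/.cache"
EOF
}

{
  echo "steps:"
  generate_step x86_64 //tools:buildifier.check
  generate_step x86_64 //tools:v1_proto_lint
} # | buildkite-agent pipeline upload   # uncomment when running in CI
```

The script would be invoked by a single static step, so the checked-in pipeline stays short while the generated one can grow with your matrix of architectures and targets.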

Hope that helps, but let us know.

Cheers!

Hey @tomwatt,

Thank you for your help. The pre-command hook works! Right now I am trying to use a common attribute for the docker-compose setup, but I'm facing issues with it.

env:
  PROJECT_ENVIRONMENT: buildkite 
  DRY_RUN: false

common:
 - env_x86_64: &env_x86_64_var
    docker-compose#v4.16.0:
      workdir: /app
      config:
        - .buildkite/docker-compose.yaml

 - env_arm64: &env_arm64_var
    docker-compose#v4.16.0:
      workdir: /app
      config:
        - .buildkite/docker-compose.yaml

steps:

  - label: ":sweating: Random Check ✨"
    command: .buildkite/steps/blah.sh
    plugins:
       *env_x86_64_var

Error :

Docker Compose plugin error
[2023-12-19T11:26:11Z] No build or run options were specified
[2023-12-19T11:26:11Z] 🚨 Error: The command exited with status 1
[2023-12-19T11:26:11Z] user command error: The plugin docker-compose command hook exited with status 1

I am not sure if my configuration is wrong. Can you please help me set it up? Also, I see that I need to set the workdir for the plugin to specify the working directory. Is there any way to change it per step while keeping the common attribute?

Hey!

So it seems from the error message that you don’t have a run or build configuration for the plugin on either of the anchors. You need to add a run element to the plugin’s configuration with the name of the service you want to run the script on.
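One way to structure that (a sketch - the service name `protoverse_x86_64` is taken from your earlier pipeline, and this relies on YAML merge keys, which most YAML 1.1 parsers support) is to anchor only the truly shared keys, then merge them into each step and add `run` - or override `workdir` - alongside the merge:

```yaml
common:
  - compose_base: &compose_base
      workdir: /app
      config:
        - .buildkite/docker-compose.yaml

steps:
  - label: ":sweating: Random Check ✨"
    command: .buildkite/steps/blah.sh
    plugins:
      - docker-compose#v4.16.0:
          <<: *compose_base
          run: protoverse_x86_64   # per-step service name
          # workdir: /other/dir    # keys here override the anchored ones
```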

Related to your other question, it seems it was answered on the issue you created :slight_smile:

Hey @paula, thank you. I’ve fixed it.