Persist Files for Plugin

Hi! I’ve been using the ECS Deploy plugin and Elastic Agents for a while now, but we’ve changed our workflow and it’s now desirable to keep the associated files (containers.json etc.) outside of the repository. I wrote a little bash script to download them from the EA secrets bucket, but the files don’t persist between steps, even if I put them in a dedicated folder. Can I get some help as to what I’m missing?

This is the snippet in question of the config download and deployment

  - group: "Deploy :rocket:"
    key: "Deploy"
    steps:
      - label: ":hammer: Get Infrastructure Config"
        command: ".buildkite/"
      - label: ":ecs: Deployment!"
        plugins:
          - ecs-deploy#v2.1.0:
              cluster: "sandbox-cluster"
              service: "Service"
              service-definition: "infra/service.json"
              desired-count: 1
              container-definitions: "infra/containers.json"
              task-definition: "infra/task-definition.json"
              task-family: "taskapplication"
              target-container-name: "Container"
              target-container-port: 3000
              target-group: "${elb_target_group}"
              image: "${ecr_image_repository}:ci-${BUILDKITE_BUILD_NUMBER}"

And the bash script:

#!/usr/bin/env bash
set -eo pipefail

AWS_CLI_VERSION=$(aws --version 2>&1 | cut -d " " -f1 | cut -d "/" -f2)

download_object_from_bucket() {
  local bucket_name=$1
  local object_name=$2
  local destination_file_name=$3
  local response

  response=$(aws s3api get-object \
    --bucket "$bucket_name" \
    --key "$object_name" \
    "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    printf 'ERROR: AWS reports get-object operation failed.\n%s\n' "$response" >&2
    return 1
  fi
}

mkdir -p infra

download_object_from_bucket "secretsbucket" "task-definition.json" "infra/task-definition.json"
download_object_from_bucket "secretsbucket" "service.json" "infra/service.json"
download_object_from_bucket "secretsbucket" "containers.json" "infra/containers.json"

I can confirm that the download is working as expected, but I’m imagining there’s some sort of environment cleaning happening in between?

Hey @lachl :wave:

Many thanks for the question - and a warm welcome to the Buildkite Forum!!

That’s a good question - I don’t see any specific agent targeting for that group per se (I assume there is targeting occurring, or the steps go to the default queue). But the first premise here: are there multiple agents polling for these two steps in your Buildkite tenancy (and potentially separate agent instances altogether, not sharing a host)?

Builds are kept in a per-agent builds directory on the host (the exact path depends on your install - on Ubuntu, for example, the package uses /var/lib/buildkite-agent/builds/), so when said Get Infrastructure Config job runs, the files should end up in that working directory (though it’s a little different for plugins). I’m assuming your agent setup involves multiple agents (let me know otherwise!), which might mean these files (even though downloaded in the first step) are not present on the agent that picks up the plugin job.
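A quick way to confirm this is a throwaway diagnostic step in the pipeline - something like the sketch below (the label is just illustrative) - then compare the agent name and working directory it reports against the deploy job’s log:

```yaml
# Hypothetical diagnostic step: print which agent picked the job up
# and whether the downloaded files are visible in the working directory.
- label: ":mag: Where am I?"
  command: |
    echo "agent: $BUILDKITE_AGENT_NAME"
    echo "cwd: $PWD"
    ls -la infra/ || echo "infra/ not present here"
```

If the agent name or working directory differs between the two jobs, that would explain the files going missing.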


Thanks for the response - that’s the weird bit. As far as our pipeline’s needs are concerned it’s very simple; in broad strokes: a dependencies and test step, build and push, then deploy. All happens on the same agent.

No worries!

In that case, if those jobs are being run on the same agent, I believe the directory the plugin’s hooks run from is the thing to look at.

For example (on macOS with Apple Silicon), build directories for command jobs live under /opt/homebrew/var/buildkite-agent/builds/, while any plugin used in your builds is checked out (at the specific version) into another directory on the host: /opt/homebrew/var/buildkite-agent/plugins.

What I believe is happening here is that the Get Infrastructure Config job is downloading all of the infra/___.json files from S3 into the build directory, and when the plugin’s command hook runs, it will not find said files.

I’d suggest either downloading those files directly into that specific directory (the ecs-deploy plugin’s first job is to move to the working directory) - or moving them there before invoking the plugin. Additionally, it might be worth raising this as a GitHub issue as a potential feature request (it’s a similar case to the remote task fetching issue currently open on said repository).
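One way to sketch the “move them there first” approach is to run the download in the same job as the deploy, so the files are already in place on the same agent by the time the plugin looks for them. The script name below is hypothetical (the original path was truncated), and whether a sibling `command` coexists with the plugin depends on which hook the plugin uses - if it supplies its own command hook, the download would need to move into a pre-command hook instead, so treat this as an assumption to verify:

```yaml
# Hypothetical: fetch the config files and deploy in a single job, so
# both happen on the same agent in the same working directory.
# ".buildkite/download-config.sh" is an assumed name for the S3 script.
- label: ":ecs: Deployment!"
  command: ".buildkite/download-config.sh"
  plugins:
    - ecs-deploy#v2.1.0:
        cluster: "sandbox-cluster"
        service: "Service"
        service-definition: "infra/service.json"
        container-definitions: "infra/containers.json"
        task-definition: "infra/task-definition.json"
```

That keeps the download and the deploy from ever being split across jobs (or agents) in the first place.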

Cheers :slightly_smiling_face: