Hi! I’ve been using the ECS Deploy plugin and Elastic Agents for a while now, but we’ve changed our workflow and it’s now desirable to keep the associated files (containers.json, etc.) outside of the repository. I wrote a little bash script to download them from the EA secrets bucket, but they don’t persist between steps, even if I put them in a dedicated folder. Can I get some help as to what I’m missing?
This is the relevant snippet of the pipeline, covering the config download and the deployment:
- group: "Deploy :rocket:"
key: "Deploy"
steps:
- label: ":hammer: Get Infrastructure Config"
command: ".buildkite/get_infra.sh"
- label: ":ecs: Deployment!"
plugins:
- ecs-deploy#v2.1.0:
cluster: "sandbox-cluster"
service: "Service"
service-definition: "infra/service.json"
desired-count: 1
container-definitions: "infra/containers.json"
task-definition: "infra/task-definition.json"
task-family: "taskapplication"
target-container-name: "Container"
target-container-port: 3000
target-group: "${elb_target_group}"
image: "${ecr_image_repository}:ci-${BUILDKITE_BUILD_NUMBER}"
And the bash script, get_infra.sh:
#!/usr/bin/env bash
set -eo pipefail

AWS_CLI_VERSION=$(aws --version 2>&1 | cut -d " " -f1 | cut -d "/" -f2)

# Download a single object from the bucket to a local file.
download_object_from_bucket() {
  local bucket_name=$1
  local object_name=$2
  local destination_file_name=$3
  local response
  if ! response=$(aws s3api get-object \
    --bucket "$bucket_name" \
    --key "$object_name" \
    "$destination_file_name"); then
    echo -e "ERROR: AWS reports get-object operation failed.\n$response" >&2
    return 1
  fi
}

# -p so re-runs of the step don't fail if the folder already exists
mkdir -p infra
download_object_from_bucket "secretsbucket" "task-definition.json" "infra/task-definition.json"
download_object_from_bucket "secretsbucket" "service.json" "infra/service.json"
download_object_from_bucket "secretsbucket" "containers.json" "infra/containers.json"
All three files show up there, so I can confirm the download is working as expected, but they’re gone by the time the deploy step runs. I’m imagining there’s some sort of environment cleaning in between that I’m missing?