Unable to upload docker volume as artifact

Using default elastic-ci settings as far as permissions go, and I have a docker-compose file which looks roughly like:

version: "3"
services:
  database:
    image: "public.ecr.aws/docker/library/postgres:13.7-bullseye"
    command: "postgres"
    ports:
      - "5432:5432"
    volumes:
      - ./database-data:/var/lib/postgresql/data/

and a pipeline.yml step that looks like:

  - label: "upload volume"
    plugins:
      - docker-compose#v5.2.0:
          build: database
          config: docker/buildkite/docker-compose.yml
          run: database
      - artifacts#v1.9.3:
          upload:
            - "./docker/buildkite/database-data/**/*"

This gives me the error: "fatal: failed to upload artifacts: collecting artifacts: resolving glob: permission denied"

Oddly enough, if I set the upload path one level higher, e.g. "./docker/buildkite/**/*", then I can see all the uploaded artifacts in the pipeline, including the files in the volume, but ideally I'd like to upload only the files I actually care about.
Even then, the re-downloaded volume doesn't have the correct permissions to be used by another step in the pipeline.
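For context on the permissions: the official postgres image runs as the postgres user (uid 999), and initdb forces the data directory to mode 0700, which is presumably why the glob can't be resolved by the agent user and why the re-downloaded copy isn't usable. I suspect a workaround would need something like the following step before reusing the volume (an untested sketch only; the path matches my compose file above, and it assumes the agent user has sudo):

```yaml
  - label: "restore volume permissions"
    command: |
      # Artifact download does not preserve ownership, so hand the data
      # directory back to the postgres user (uid/gid 999 in the official
      # image) and restore the 0700 mode that postgres insists on.
      sudo chown -R 999:999 docker/buildkite/database-data
      sudo chmod -R 700 docker/buildkite/database-data
```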

Or better yet, is there any built-in way to reuse a volume across different agents? The use case is a single Postgres database: run migrations and seed data once, then copy and reuse that same volume across parallel steps, each independent of the others.

Hey @jasonx!

Welcome to the Buildkite Forum! :wave:

Certainly an interesting question! The top-of-mind solutions would involve the cache plugin, or configuring the agents' build path to use a shared volume; once configured, either should see this through.
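As a rough illustration of the cache-plugin direction, a seed step could save the data directory and later parallel steps could restore it. This is a sketch only, reusing the paths and docker-compose plugin config from your example; the cache plugin version and option keys should be checked against the plugin's README for your setup:

```yaml
  - label: "migrate and seed"
    plugins:
      - docker-compose#v5.2.0:
          run: database
          config: docker/buildkite/docker-compose.yml
      - cache#v1.5.0:
          path: docker/buildkite/database-data
          save: pipeline

  - wait

  # %n is replaced with the parallel job index at runtime
  - label: "test %n"
    parallelism: 5
    plugins:
      - cache#v1.5.0:
          path: docker/buildkite/database-data
          restore: pipeline
```

Note that the same ownership caveat you ran into with artifacts would apply once the cache is restored on a fresh agent, since the data directory must belong to the postgres user inside the container.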

Given you mention this is using the Elastic CI Stack, there might be more we can offer after seeing your builds, the specific stack version you're running, and the stack parameters. Would you be able to send any examples to support@buildkite.com so we can take a closer look?

Cheers!