I’m a bit new to Buildkite, so I apologize if this seems rudimentary.
I’m running into some trouble designing a process for cross-pipeline artifact sharing. I have two pipelines, one for each of two applications, A and B. In pipeline A, I build the application into a binary and upload it to artifact storage. In pipeline B, during my testing step, I need to download the newest binary uploaded by pipeline A. Since this is in a running build, I’m using
buildkite-agent artifact download. However, since pipeline B doesn’t have the artifact, the command can’t find it unless I provide the build ID, which changes every build.
I know I can use the REST API to query the builds, select the one I want, use its ID to get the artifacts, etc. But I’m unsure whether this is “idiomatic Buildkite”. It also adds extra code to our process and requires creating and managing an API access token (it doesn’t look like BUILDKITE_AGENT_ACCESS_TOKEN works for the API).
I guess my question is: what is the best solution here? Am I missing some kind of feature that allows me to do something like
buildkite-agent artifact download --from pipelineA release.tar.gz . and then sort by date? Or am I simply abusing the artifact system here?
First of all, welcome to the Buildkite community, and thank you for reaching out with your question.
One way to approach your use case is to have pipeline A trigger pipeline B. In pipeline A, you would perform the artifact upload before triggering pipeline B.
Pipeline B’s build will then have the following environment variables available to it:

BUILDKITE_TRIGGERED_FROM_BUILD_ID
BUILDKITE_TRIGGERED_FROM_BUILD_NUMBER
BUILDKITE_TRIGGERED_FROM_BUILD_PIPELINE_SLUG

So you will have the build ID and build number of the pipeline A build where the artifact upload happened. Using these environment variables, you can then call buildkite-agent artifact download in pipeline B to fetch the artifacts uploaded by pipeline A.
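As a rough sketch of that trigger-based setup (pipeline names, `build.sh`, `test.sh`, and `release.tar.gz` are placeholder names, not from the thread):

```yaml
# pipeline A: build, upload the artifact, then trigger pipeline B
steps:
  - label: ":hammer: Build"
    command: |
      ./build.sh
      buildkite-agent artifact upload release.tar.gz
  - wait
  - trigger: "pipeline-b"
```

```yaml
# pipeline B: pull the artifact from the triggering build of pipeline A
steps:
  - label: ":test_tube: Test"
    command: |
      buildkite-agent artifact download release.tar.gz . \
        --build "$BUILDKITE_TRIGGERED_FROM_BUILD_ID"
      ./test.sh
```

The `--build` flag on `buildkite-agent artifact download` is what lets pipeline B look up artifacts stored against a different build.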
Please let us know if you have any follow-up questions. Thank you, and once again, welcome to the Buildkite community.
Thanks for the response!
That would be perfect, but unfortunately applications A and B are on different development cadences. So although B depends on A, B needs to run independently (and will run much more often than a release of A).
Application A is really an infrastructure tool that creates cloud resources etc which application B (and others) need to test their features against.
When B runs independently, does it still need to fetch artifacts uploaded by A? If so, will it fetch the artifacts uploaded by the last build of A?
Yes, B always needs to fetch A’s latest artifacts.
@jackbischoff for this scenario, one other option I can think of is having pipeline A also upload its artifacts to a specific S3 bucket of your own (if you are using AWS), and then having pipeline B fetch the artifacts from that common S3 bucket.
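A minimal sketch of that shared-bucket approach, assuming the agents have AWS credentials and using a placeholder bucket name (`my-shared-artifacts`) and a fixed "latest" key so pipeline B never needs to know A's build ID:

```yaml
# pipeline A: upload to Buildkite storage as usual, and also push
# the binary to a well-known "latest" key in a shared bucket
steps:
  - label: ":s3: Publish"
    command: |
      buildkite-agent artifact upload release.tar.gz
      aws s3 cp release.tar.gz \
        s3://my-shared-artifacts/pipeline-a/latest/release.tar.gz
---
# pipeline B: fetch whatever is currently at the "latest" key
steps:
  - label: ":test_tube: Test"
    command: |
      aws s3 cp \
        s3://my-shared-artifacts/pipeline-a/latest/release.tar.gz .
      ./test.sh
```

Overwriting a fixed key sidesteps any "sort by date" logic: the newest successful upload from A is always at the same path.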
The other option is to have a script in pipeline B’s build that queries for the latest build of pipeline A, so that it has the necessary information to call buildkite-agent artifact download to fetch the artifacts uploaded by pipeline A.
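One way that script could look, as a hedged sketch: the org and pipeline slugs (`my-org`, `pipeline-a`), the artifact name, and the `BUILDKITE_API_TOKEN` secret are placeholders; the token needs a scope that can read builds. The `$$` escapes stop Buildkite interpolating the script-local variable at pipeline-upload time.

```yaml
# pipeline B: look up pipeline A's newest passed build via the REST API,
# then download its artifact by build ID
steps:
  - label: ":test_tube: Test against latest A"
    command: |
      BUILD_ID=$(curl -sf \
        -H "Authorization: Bearer $$BUILDKITE_API_TOKEN" \
        "https://api.buildkite.com/v2/organizations/my-org/pipelines/pipeline-a/builds?state=passed&per_page=1" \
        | jq -r '.[0].id')
      buildkite-agent artifact download release.tar.gz . \
        --build "$$BUILD_ID"
      ./test.sh
```

Filtering on state=passed means B only ever tests against a build of A that actually succeeded.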
Please let me know if either of these options are something that might work for you.
@suma option 1 is similar to what we do now with build assets, particularly for releases to customers. But we wanted to explore the possibility of simplifying the CI process for internal tools by just pushing them to Buildkite storage, essentially to have better demarcation between internal and external releases.
Option 2 would solve our use case here, if we didn’t already have code to push/pull to our hosted repository.
I just wanted to make sure I wasn’t missing a magic solution with the buildkite-agent artifact command that would make this trivial.
Thank you for your assistance!