I need to build binaries in one queue and use them in another. How do I do this?

#1

I am trying to create a workflow using Buildkite where I cross-compile binaries on EC2 instances and then load and run them on specialized test workers running Buildkite agents in my lab. I have all the pieces built and working, but I’m having trouble coordinating the handoff of the binaries from the EC2 instances to the test workers.

Attempt 1 - Queue Handoff

My first attempt was to use different queues in a single pipeline but I quickly realized there is no way to have one queue wait on another. Am I correct that this is not possible?


Attempt 2 - Pipeline Trigger

My second attempt was to use triggers where the first pipeline would trigger the second:

The problem is that the triggered build needs to know the correct build number to pull the artifacts from. I tried to pass this build number from the EC2 build as an environment variable in the trigger step, but I got this error:

Interpolating “BUILDKITE_BUILD_NUMBER” is currently not supported. Please contact hello@buildkite.com if you need this added.

So I’m at an impasse now. Can you offer any guidance on how I can enable this workflow?

Details

Pipeline 1 -> https://buildkite.com/uavcan/libuavcan-v1
Pipeline 2 -> https://buildkite.com/uavcan/libuavcan-v1-ontarget

Steps in Pipeline 1:

steps:
  - label: ':hammer: native build'
    command: "./ci/native-build.sh"
    artifact_paths: 
      - "build_ci_native/libuavcan"
      - "build_ci_native/docs/**/*"
    plugins:
      - docker#v3.1.0:
          workdir: /repo
          image: "uavcan/libuavcan:latest"
    agents:
      queue: 'default'
  - wait
  - label: ':hammer: s32k build'
    command: "./ci/ontarget-s32k-build.sh"
    artifact_paths: 
      - "build_ci_ontarget_s32k/**/*.log"
      - "build_ci_ontarget_s32k/**/*.elf"
      - "build_ci_ontarget_s32k/**/*.hex"
      - "build_ci_ontarget_s32k/**/*.bin"
      - "build_ci_ontarget_s32k/**/*.jlink"
    plugins:
      - docker#v3.1.0:
          workdir: /repo
          image: "uavcan/libuavcan:latest"
    agents:
      queue: 'default'
  - wait
  - label: ':hammer: native tests'
    command: "./ci/native-test.sh"
    plugins:
      - docker#v3.1.0:
          workdir: /repo
          image: "uavcan/libuavcan:latest"
    agents:
      queue: 'default'
  - trigger: "libuavcan v1 ontarget"
    label: ":mag: ontarget testing"
    build:
      message: "${BUILDKITE_MESSAGE}"
      commit: "${BUILDKITE_COMMIT}"
      branch: "${BUILDKITE_BRANCH}"
      env:
        LIBUAVCAN_ARTIFACTS_BUILD: "${BUILDKITE_BUILD_NUMBER}"

Steps in pipeline 2:

steps:
  - label: ':mag: ontarget-s32k'
    command: "./ci/ontarget-s32k-test.sh"
    agents:
      queue: 'ontarget-s32k'

This pipeline’s command script includes this line:

buildkite-agent artifact download "build_ci_ontarget_s32k/*.hex" .
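(As an aside for anyone who still wants the cross-pipeline trigger approach: `buildkite-agent artifact download` accepts a `--build` flag to pull artifacts from a build other than the current one. A hedged sketch, assuming the triggered build can identify the triggering build via the `BUILDKITE_TRIGGERED_FROM_BUILD_ID` variable that Buildkite sets in triggered builds:)

```shell
# Download the .hex artifacts uploaded by the build that triggered this one.
# BUILDKITE_TRIGGERED_FROM_BUILD_ID is only set in builds created by a trigger step.
buildkite-agent artifact download \
  "build_ci_ontarget_s32k/*.hex" . \
  --build "${BUILDKITE_TRIGGERED_FROM_BUILD_ID}"
```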

[image: Prototype Pi Agent]

#2

:wave: Welcome! Awesome looking project, we love seeing folks use Buildkite for hardware related things.

Let me see if I can assist!

My first attempt was to use different queues in a single pipeline but I quickly realized there is no way to have one queue wait on another. Am I correct that this is not possible?

It’s absolutely possible to use different queues in the same pipeline; we do it all the time! Let’s see if we can figure out why your Attempt 1 isn’t working.
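For reference, here’s a minimal sketch of a single pipeline that hands off between two queues with a `wait` step in between (the script names are illustrative; the queue names match your setup):

```yaml
steps:
  - label: ":hammer: build on EC2"
    command: "./ci/build.sh"
    agents:
      queue: "default"
  - wait
  - label: ":mag: test on hardware"
    command: "./ci/test.sh"
    agents:
      queue: "ontarget-s32k"
```

Artifacts uploaded before the `wait` are available to any later step, regardless of which queue runs it.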

The first thing I’ve noticed is that you’re defining your entire pipeline in the pipeline settings. While that’s something you can absolutely do, we recommend keeping the pipeline alongside your code in .buildkite/pipeline.yml and defining a single step in the settings that uploads that file:

steps:
  - label: ":pipeline:"
    command: "buildkite-agent pipeline upload"

That makes it easier to test different pipeline configurations on different branches, and you also have variables like BUILDKITE_BUILD_NUMBER available for interpolation.
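For example, the trigger step from your Attempt 2 should interpolate fine once it lives in .buildkite/pipeline.yml. A sketch reusing your env variable name (the pipeline slug here is an assumption based on your pipeline URL):

```yaml
steps:
  - trigger: "libuavcan-v1-ontarget"
    label: ":mag: ontarget testing"
    build:
      message: "${BUILDKITE_MESSAGE}"
      commit: "${BUILDKITE_COMMIT}"
      branch: "${BUILDKITE_BRANCH}"
      env:
        # The triggered build can pass this to `buildkite-agent artifact download --build`.
        LIBUAVCAN_ARTIFACTS_BUILD: "${BUILDKITE_BUILD_NUMBER}"
```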

I actually suspect that you may have bumped into a bug with defining wait steps in that pipeline settings YAML format beta.

Are you able to swap to the standard approach I suggested above? I suspect your wait steps will work as expected then.

#3

Ah, okay. Let me try the experiment again using a .buildkite/pipeline.yml. If the wait works, then this is the ideal solution for me. I’ll have time to play with this later tonight (I’m in PST). Thanks for getting back to me. Your support has been just fantastic so far.

#4

No problems! We’re around for the rest of the day! We’re investigating why it didn’t work in the Pipeline YAML Settings too :thinking:

#5

Brilliant! It works:

2019-05-10 04:53:24 INFO   Successfully downloaded "build_ci_ontarget_s32k/test_time.hex" 664769 bytes
2019-05-10 04:53:24 INFO   Successfully downloaded "build_ci_ontarget_s32k/test_build_config.hex" 636869 bytes
2019-05-10 04:53:24 INFO   Successfully downloaded "build_ci_ontarget_s32k/test_bus.hex" 636419 bytes
2019-05-10 04:53:24 INFO   Successfully downloaded "build_ci_ontarget_s32k/test_math_saturation.hex" 1125649 bytes
total 3.0M

That output was from the Pi. It waited on the EC2 build and pulled the hex files successfully.

So I noticed, after I switched to my VS Code editor with YAML linting, that the YAML I had in the pipeline settings was malformed. It seems you have a bug in your validator for this input?

This proves out the entire workflow as I proposed it to the rest of my team. I just need to write some python to control the firmware upload and UART monitoring and I’ll have a full CI pipeline with real microcontrollers attached. This is huge. Thanks again.

#6

Great to hear! Any idea what the linting error was?

#7

I had this:

env:
      - VERBOSE: 1

which is invalid. Should be:

env:
      VERBOSE: 1