I have a scenario where I want a task to run on all available hosts. To describe it: let's say I have 3 hosts, each running 4 agents using Docker, so in total I have 12 agents.
My RSpec tests run in parallel across all 12 agents.
Before running the RSpec tests (which use a Docker image), I build a test Docker image. This build task needs to run on all 3 hosts, because the test image is required on every host when the tests run.
Right now I have three tasks in my pipeline to build the test image, one per host, to make sure the image is available everywhere. But when I horizontally scale and add more hosts (and agents), I will have to add extra steps to pipeline.yml to build the image on those hosts as well.
Is there a better way to handle this, so that the test-image build task runs on all available hosts?
Hmm, I think you may want to try an alternative approach, so you don’t have to build on all the different hosts.
You could build the Docker image in the first step, push it to a registry, and then use that image in all subsequent steps (so you're not paying the Docker build cost on each of your hosts). The docker-compose plugin's docs have a good example of what that could look like.
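As a rough sketch, a pipeline.yml using the docker-compose plugin might look like the following. The plugin version, registry path, service name (`app`), and parallelism are placeholders to adapt to your own setup:

```yaml
steps:
  # Build the test image once and push it to a registry.
  # The plugin records the pushed image in build meta-data,
  # so later steps pull it instead of rebuilding it.
  - label: ":docker: Build test image"
    plugins:
      - docker-compose#v4.16.0:
          build: app
          image-repository: index.docker.io/myorg/my-test-image

  - wait

  # Run the RSpec suite in parallel; each job pulls the
  # pre-built image rather than building it on its host.
  - label: ":rspec: RSpec"
    command: bundle exec rspec
    parallelism: 12
    plugins:
      - docker-compose#v4.16.0:
          run: app
```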
A further optimisation on top of that is to set aside a pool of agents that only build Docker images (and don't autoscale them as you would your other agents). By having long-lived Docker build agents, you can utilise the local image cache to keep your builds fast.
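If you go that route, the build step from the sketch above just needs to target that dedicated queue (the queue name here is only an example):

```yaml
steps:
  - label: ":docker: Build test image"
    agents:
      queue: docker-builders   # long-lived agents reserved for image builds
    plugins:
      - docker-compose#v4.16.0:
          build: app
          image-repository: index.docker.io/myorg/my-test-image
```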
Thanks for the response. I had this in mind, but I was kind of trying to avoid pushing to Docker Hub.
But I will try this and see how much time it takes; if it doesn't add much, I can go with this solution.
I was wondering if there is some option I am missing.
Another option that came to mind was using dynamic pipeline generation, but I'm not sure if that is a good approach?
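For context, what I was imagining with dynamic pipeline generation was a small script that emits one image-build step per host queue and pipes it to `buildkite-agent pipeline upload`; the queue names below are just placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Placeholder list of per-host queues; in practice this could come from
# an environment variable or whatever tracks the current set of hosts.
HOST_QUEUES=("host-1" "host-2" "host-3")

{
  echo "steps:"
  for queue in "${HOST_QUEUES[@]}"; do
    cat <<EOF
  - label: ":docker: Build test image on ${queue}"
    command: "docker build -t my-test-image ."
    agents:
      queue: "${queue}"
EOF
  done
} | buildkite-agent pipeline upload
```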