I'm using Docker Swarm to test services on AWS. I recently applied an update to the service like this:
docker service update --image TestImage:v2 --update-parallelism 2 \
--update-delay 10s TestService2
The update worked as intended, and the service updated its task containers to v2. However, a quick
docker service ps TestService2 | grep "v1"
reveals a bunch of shut-down TestImage:v1 tasks:
a0w77kj0k6jfg4r9g4nz47zzg \_ TestService2.1 TestImage:v1 W1 Shutdown Shutdown 36 minutes ago
2of4mc63ekzbib01w3x7q6sdm \_ TestService2.2 TestImage:v1 W2 Shutdown Shutdown 37 minutes ago
495frrpza5pxt205o1594x54a \_ TestService2.3 TestImage:v1 W1 Shutdown Shutdown 36 minutes ago
57l0gsqd26u2e5gdj30w8mcn9 \_ TestService2.4 TestImage:v1 M1 Shutdown Shutdown 36 minutes ago
baoe1i79fswb34ydwbpafg6tm \_ TestService2.5 TestImage:v1 M3 Shutdown Shutdown 35 minutes ago
3uxi7kwxb73z69km6s17son58 \_ TestService2.6 TestImage:v1 M2 Shutdown Shutdown 37 minutes ago
99cg4arnt1y52nd8d422bdu49 \_ TestService2.7 TestImage:v1 M3 Shutdown Shutdown 36 minutes ago
cq5716jqp40h6jugo1j9ilzwp \_ TestService2.8 TestImage:v1 M1 Shutdown Shutdown 35 minutes ago
awlz1kxbrjk51dey7frm14d8u \_ TestService2.9 TestImage:v1 W3 Shutdown Shutdown 35 minutes ago
4xdi9a1jweyehfqlt76uynf3i \_ TestService2.10 TestImage:v1 M2 Shutdown Shutdown 36 minutes ago
eo4t6i0gaj5i294fcdnb3qncq \_ TestService2.11 TestImage:v1 W3 Shutdown Shutdown 35 minutes ago
3ydeuxdjquulv5xj94b9ovuwu \_ TestService2.12 TestImage:v1 W1 Shutdown Shutdown 36 minutes ago
How can I remove these without going to each individual swarm node and running docker rm on the exited containers? I don't think there's a docker service command for it (I've looked through the docs), but does anyone know of a hack or script I can run on a Swarm Manager to clean up the nodes?
Thanks!
The containers for those tasks are removed after a rolling update; what you're seeing is just the task history, a log of the tasks that were shut down. You can limit how many old tasks the swarm retains per service with
docker swarm update --task-history-limit 5
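If you also want to clear the stopped containers themselves from each node, a rough sketch, assuming you have SSH access to the nodes and that the hostnames reported by docker node ls are resolvable from the manager:

```shell
# Run on a Swarm Manager: keep only the 5 most recent tasks per service
# in the history shown by `docker service ps`.
docker swarm update --task-history-limit 5

# Hypothetical cleanup loop (not a built-in Swarm feature): SSH to each
# node and remove all stopped containers there. Assumes your user can
# run docker on each node without sudo.
for node in $(docker node ls --format '{{.Hostname}}'); do
  ssh "$node" docker container prune -f
done
```

docker container prune requires Docker 1.25 or later; on older engines you can substitute docker rm $(docker ps -aq -f status=exited) inside the loop.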