Wondering what kinds of projects are/are not suitable for this. The only context I have is from working at open-source devtool companies that provide Docker builds for people to pull down, where it might speed up the release process slightly. I suspect my company https://github.com/airbytehq/airbyte/ could benefit, but is it also useful for internal usage?
Our biggest strength is building images, internal or not, in generic CI providers like GitHub Actions, CircleCI, etc. Depot excels in these environments because it addresses their limitations with persistent SSD caching, native multi-architecture builds, and faster compute resources.
Where Depot may not be a good fit is when you only ever need Docker images for local development. In that situation the network becomes the bottleneck, because you have to pull down the built image each time. That said, if you need to build multi-architecture images locally, Depot becomes useful again: it builds native images for each architecture, no emulation needed.
If you're entirely on-prem, we don't currently support that, so Depot isn't a fit there.
CircleCI's remote Docker has a restriction that only one job can access a given remote Docker engine at a time. Say job A builds an image, then jobs B and C try to use the same remote Docker engine, but only one of them gets the cache.
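For reference, this is roughly what the setup looks like in a CircleCI config (job names and image tags below are placeholders I made up); each job asks for its own remote Docker engine via `setup_remote_docker`, so layers built in one job aren't automatically available to the others:

```yaml
version: 2.1
jobs:
  build-a:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      # Provisions a remote Docker engine for this job only;
      # layers built here aren't shared with build-b by default.
      - setup_remote_docker
      - run: docker build -t example/app:a .
  build-b:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - setup_remote_docker
      - run: docker build -t example/app:b .
workflows:
  build:
    jobs: [build-a, build-b]
```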
Yup, the caches for each architecture are available in parallel, and multiple builds for a single architecture can simultaneously use the same build machine for a given project. So we don't limit concurrency.
I believe Cloud Build has no persistent caching, so you're forced to save and load a remote cache, which incurs network latency that can slow the build to some extent. Cloud Build with Kaniko also expires the layer cache after 6 hours by default.
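As a point of reference, a Cloud Build step using the Kaniko builder looks roughly like this (the project and image names are placeholders); the `--cache-ttl` flag is the 6-hour default mentioned above, which you can raise:

```yaml
steps:
  - name: 'gcr.io/kaniko-project/executor:latest'
    args:
      - --destination=gcr.io/my-project/app:latest  # placeholder image name
      - --cache=true        # push the layer cache to the registry
      - --cache-ttl=48h     # defaults to 6h if not set
```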
GitHub Actions is similar, except that you can store Docker layer cache in GitHub's Cache API via the `cache-from: type=gha` and `cache-to: type=gha` options. However, this has limitations, like only being able to store a total of 10GB of cache per repository. You also incur network latency loading and saving that cache.
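For concreteness, here's a minimal sketch of the `gha` cache backend with docker/build-push-action (the image tag is a placeholder):

```yaml
name: build
on: push
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/build-push-action@v5
        with:
          push: false
          tags: example/app:latest      # placeholder tag
          cache-from: type=gha          # load layer cache from GitHub's Cache API
          cache-to: type=gha,mode=max   # save all layers back (counts toward the 10GB repo limit)
```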
With Depot, the cache is kept on a persistent disk, so there's no need to save/load it or incur network latency doing so. It's there, ready to be used by any builds that come in for a given project.
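If I'm reading the docs right, the GitHub Actions setup is roughly a drop-in swap of the build step; the project ID and image tag below are placeholders, and note there are no `cache-from`/`cache-to` lines because the layer cache stays on the persistent disk:

```yaml
steps:
  - uses: actions/checkout@v4
  - uses: depot/setup-action@v1           # installs the depot CLI
  - uses: depot/build-push-action@v1      # drop-in replacement for docker/build-push-action
    with:
      project: abc123xyz                  # placeholder Depot project ID
      platforms: linux/amd64,linux/arm64  # native builders per architecture, no emulation
      push: false
      tags: example/app:latest            # placeholder tag
```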