Does using the Docker approach mean downloading a very large image on every deploy?

I need to use fontforge, libreoffice, and texlive, amongst other OS-level packages, so it looks like I have to use Docker.

Does this mean that every time I want to deploy, I have to press the manual deploy button (or trigger a deploy via the REST API), AND the system has to DOWNLOAD the entire Docker image on every deploy? My Docker image is over 2 GB (because of the OS dependencies), so would that mean 10+ minutes between deploys? With Vercel, by contrast, a new deploy via git hooks only takes about 45 seconds.

In addition, I need to upload my image to the GitHub registry, so is that another $20/month and double the time between deploys?

Basically, I am creating one “wrapper” image which has all the OS packages, and that will rarely, if ever, change. My “app” image will then be built on top of that wrapper image, and it will change every day. So how does the deploy process work here in detail: what is refetched, and what is cached? Ideally the wrapper/base image would stay cached, so only my app image needs to be downloaded on each deploy. Something like that.
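For context, the layering I have in mind is roughly the following (the image name and tag are placeholders, not my actual registry paths):

```dockerfile
# "wrapper" base image: OS packages only, rebuilt rarely
FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y --no-install-recommends \
        fontforge libreoffice texlive \
    && rm -rf /var/lib/apt/lists/*
```

```dockerfile
# "app" image: changes daily, built FROM the wrapper
# (ghcr.io/me/wrapper:latest is a hypothetical tag)
FROM ghcr.io/me/wrapper:latest
WORKDIR /app
COPY . .
CMD ["./start.sh"]
```

My assumption is that since only the app image's top layers change between builds, a pull should only need to fetch those layers if the wrapper's layers are already cached on the host.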

Thanks for the clarification.

Hello again,

I’ve replied to the other tickets/topics you’ve opened recently and this seems to be a related question.

Auto-deploy is not available on image-based services; the alternative options are noted in the docs:

Looking at the service linked in the ticket you opened, I think the long deploys were caused by a mismatch between the PORT env var you set and the port the service was actually using. Your most recent deploy appears to have taken under 60 seconds.


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.