Next.js app with 13MB image: "Server failed: Out of memory"

A Next.js app that uses the `next/image` component with a 13MB PNG image: pages/index.js

Loading the site, the image fails to appear. The "Events" tab shows "Server Error: Out of memory (used over 512MB)".

  • This happens on the 1GB RAM plan too.
  • This doesn’t happen when I run locally: NODE_OPTIONS=--max_old_space_size=64 yarn run start.
  • This doesn’t happen on Vercel.

Full repro steps:

Hi @cakoose , thanks for reaching out, and thank you for providing such an excellent, minimal repro repository and steps.

I was able to repro using a Starter service. The server is able to serve the image on the 2GB RAM plan. The root cause is that the next/image component does image optimization on the fly. My theory is that when the first request for the image is made, the Next.js app uses a lot of memory to compress the 13MB image; the image loaded into my browser is 194 KB. I've found an open GitHub issue that describes the problem.

This does not happen on Vercel because Vercel does image optimization separately from Next.js' built-in Image Optimization. Feel free to submit a request at https://feedback.render.com/ for this. We'd love to be able to do this automatically for our users in the future!

I believe you can work around this issue by doing one of the following:

  • Pin the Next.js version to 10.0.7. I was able to deploy a fork of your repo with Next.js pinned to 10.0.7, and the Starter service served the image.

  • Optimize your images as part of your build step, so that when an image is first requested, the Next.js app does not need as much memory to serve it.

  • Set the component's unoptimized prop to true (docs). However, this might degrade the experience for your visitors.
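For reference, the last option is just a prop on the component. A minimal sketch of what pages/index.js might look like (the image path and dimensions here are assumptions, not taken from the repro):

```javascript
// pages/index.js — sketch only; src/width/height are placeholder assumptions
import Image from 'next/image';

export default function Home() {
  return (
    // `unoptimized` serves the source file as-is, skipping Next.js's
    // on-the-fly optimization (and its memory cost) entirely.
    <Image
      src="/large.png"
      alt="Large PNG"
      width={4000}
      height={3000}
      unoptimized
    />
  );
}
```

The trade-off is that visitors download the full 13MB source file instead of a compressed, appropriately-sized variant.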


Thanks for investigating! I’ll follow that GitHub issue for updates.

I mistakenly thought the issue was Render-specific because I tried --max_old_space_size=64 locally and the issue did not repro. But that flag only caps the V8 old-generation heap, not total process memory. I used Docker to more reliably limit memory, and the issue reproduced locally.
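For anyone else trying to reproduce this locally: since the image work happens largely in native memory outside the V8 heap, a container memory limit is a more faithful simulation than --max_old_space_size. A sketch, assuming a Dockerfile that builds and starts the app (the image tag and port are placeholders):

```shell
# Build the Next.js app image (Dockerfile assumed to run `yarn build` / `yarn start`)
docker build -t next-image-repro .

# Cap total memory at 512 MB to match the Starter plan, and set the swap limit
# to the same value so the cap is hard. The first request for the 13MB PNG
# should then OOM the container.
docker run --rm -p 3000:3000 --memory=512m --memory-swap=512m next-image-repro
```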
