pg_restore issue migrating from Heroku to Render

I am running into a timeout when restoring database tables with more than 300k rows on Render. My database is about 300 MB in total. When Render imports one of these large tables, the restore fails with:

    pg_restore: error: error returned by PQputCopyData: could not receive data from server: Operation timed out
    SSL SYSCALL error: Operation timed out

I have tried restoring with both psql and pg_restore, using compressed and uncompressed dump files, but the restore still fails whenever a table is too large.
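For reference, here is a sketch of the kinds of commands I have been running (the connection string is a placeholder for my Render external database URL):

    # custom-format dump (placeholder connection string)
    pg_restore --verbose --no-owner --no-acl \
      -d "postgres://USER:PASSWORD@HOST:5432/DBNAME" latest.dump

    # plain SQL dump
    psql "postgres://USER:PASSWORD@HOST:5432/DBNAME" < latest.sql

Both variants fail with the timeout above once they hit the large table.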

I have tried the solution outlined here, but I am still running into the same issue.

Am I missing a configuration somewhere that would allow pg_restore to finish? Is there anything that can be done on Render's side to help with these large datasets?

Hey @jmarsh24,

Are you running the restore locally? Also, how long does the command run before giving you this error?

To speed things up, it might help to create a Docker service from this repo: https://github.com/render-examples/pg-toolkit. You can copy the SQL dump to the service, SSH into it, and then run pg_restore from within the service using the Render internal connection string. That may resolve the issue you are facing.
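A rough sketch of that flow, assuming the service is reachable over SSH (the SSH address, paths, and INTERNAL_DATABASE_URL are placeholders; the dashboard shows the real SSH command and internal connection string for your services):

    # copy the dump to the toolkit service (placeholder SSH address)
    scp latest.dump SERVICE@SSH_HOST:/tmp/

    # from a shell inside the service, restore over the internal network
    pg_restore --verbose --no-owner --no-acl \
      -d "$INTERNAL_DATABASE_URL" /tmp/latest.dump

The idea is that running the restore from inside Render's network avoids the long-lived external connection that appears to be timing out.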

Let me know if you’re able to try this!

I was able to use wormhole to transfer my dump file and then restore from it manually. That solved the issue.
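For anyone who finds this later, a sketch of what that looked like (assuming the magic-wormhole CLI; the receive code is generated per transfer, and the connection string is a placeholder):

    # on my machine: send the dump
    wormhole send latest.dump

    # on the receiving host: accept it with the code printed by the sender
    wormhole receive

    # then restore using the internal connection string (placeholder)
    pg_restore --verbose --no-owner --no-acl \
      -d "$INTERNAL_DATABASE_URL" latest.dump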

Glad to hear you were able to resolve the issue. Our Datastores team will take a closer look at your migration experience in the separate feedback email thread.
