I am experiencing a timeout issue when attempting to restore database tables with more than 300k rows on Render. My database is 300 MB in size. When Render imports such a table, it responds with
pg_restore: error: error returned by PQputCopyData: could not receive data from server: Operation timed out
SSL SYSCALL error: Operation timed out
I have tried both psql and pg_restore, with compressed and uncompressed files, but the restore still breaks when a data table is too large.
I have tried the solution outlined below, but I am still running into the same issue.
Am I missing a configuration somewhere that would allow pg_restore to finish? Is there anything that can be done on Render's side to help with these large datasets?
Are you running the restore locally? Also, how long does the command run before giving you this error?
To speed things up, it might help to create a Docker service from this repo: https://github.com/render-examples/pg-toolkit. You can copy the SQL dump to the service, SSH into it, and then run pg_restore from within the service using the Render internal connection string. Because the restore then runs over the internal network rather than your local connection, this may avoid the timeout you are seeing.
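The workflow above might look roughly like the sketch below. Note that the service name, dump filename, and connection string are placeholders I have made up for illustration; substitute your own values from the Render dashboard.

```shell
# Sketch of the suggested restore workflow. All names below are
# placeholder assumptions, not real values.
DUMP_FILE="mydb.dump"                                        # a pg_restore-compatible custom-format dump
INTERNAL_URL="postgres://user:pass@internal-host:5432/mydb"  # Render *internal* connection string

# 1. Copy the dump to the toolkit service (run from your machine):
#      scp "$DUMP_FILE" pg-toolkit-service:/tmp/
# 2. SSH into the service:
#      ssh pg-toolkit-service
# 3. From inside the service, restore over the internal network:
RESTORE_CMD="pg_restore --verbose --no-owner --clean --if-exists -d $INTERNAL_URL /tmp/$DUMP_FILE"
echo "$RESTORE_CMD"
```

The key point is step 3: because pg_restore runs next to the database and uses the internal connection string, the long-running COPY does not have to survive an external SSL connection.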
Glad to hear you were able to resolve your issue. Our Datastores team will be taking a closer look at your experience migrating in the separate feedback email thread.