How to insert large .sql files without running out of memory #2256

Answered by jackc
natjw asked this question in Q&A

There's no way to stream the value of a single conn.Exec in pgx / pgconn. And if these files are large enough to cause a problem, then streaming the file probably wouldn't be a good solution anyway, as PostgreSQL would buffer the entire file on its side.

The solution is to send one SQL statement at a time. One of my other projects includes an internal parser designed to split statements: https://pkg.go.dev/github.com/jackc/tern/v2/migrate/internal/sqlsplit. However, it only works on string inputs, whereas you would need it to work on an io.Reader. But perhaps it can be a starting place for you.
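
For illustration, here is a minimal sketch of the statement-at-a-time approach, assuming the simple case where every statement ends with a semicolon at the end of a line. A real splitter such as sqlsplit also has to handle semicolons inside string literals, comments, and dollar-quoted blocks, which this naive version does not. The file name and the DATABASE_URL environment variable are placeholders:

```go
package main

import (
	"bufio"
	"context"
	"log"
	"os"
	"strings"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()
	conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close(ctx)

	f, err := os.Open("big.sql") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Accumulate lines until a statement-terminating semicolon, then
	// execute that single statement. This naive splitter assumes a
	// statement ends with ";" at the end of a line; it will split
	// incorrectly if a semicolon ends a line inside a string literal
	// or dollar-quoted block (cases sqlsplit is designed to handle).
	var sb strings.Builder
	scanner := bufio.NewScanner(f)
	scanner.Buffer(make([]byte, 0, 64*1024), 1024*1024) // allow long lines
	for scanner.Scan() {
		line := scanner.Text()
		sb.WriteString(line)
		sb.WriteByte('\n')
		if strings.HasSuffix(strings.TrimSpace(line), ";") {
			if _, err := conn.Exec(ctx, sb.String()); err != nil {
				log.Fatal(err)
			}
			sb.Reset()
		}
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}

	// Execute any trailing statement without a terminating semicolon.
	if strings.TrimSpace(sb.String()) != "" {
		if _, err := conn.Exec(ctx, sb.String()); err != nil {
			log.Fatal(err)
		}
	}
}
```

Because only one statement is held in memory at a time, the size of the file no longer matters. If the file must apply atomically, wrapping the loop in a transaction is an option.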

Answer selected by natjw