Go's standard library has no built-in 'bulk insert' function for databases; you must use your driver's batch API or transaction support. For example, with the database/sql package and PostgreSQL, you typically prepare an INSERT statement on a transaction and call Exec on it once per row, so all rows commit together.
import (
	"database/sql"

	_ "github.com/lib/pq" // registers the "postgres" driver
)

type Row struct {
	Name, Email string
}

func bulkInsert(db *sql.DB, rows []Row) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	// Rollback is a no-op once Commit has succeeded.
	defer tx.Rollback()

	// Preparing on the transaction parses the statement once,
	// so each row only pays for the Exec round trip.
	stmt, err := tx.Prepare("INSERT INTO users (name, email) VALUES ($1, $2)")
	if err != nil {
		return err
	}
	defer stmt.Close()

	for _, r := range rows {
		if _, err := stmt.Exec(r.Name, r.Email); err != nil {
			return err
		}
	}
	return tx.Commit()
}
Note: For true bulk loading (e.g., 10k+ rows), use driver-specific COPY support — pq.CopyIn with lib/pq, or CopyFrom with pgx — since looping Exec still costs one network round trip per row.
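A middle ground between per-row Exec and COPY is packing many rows into a single multi-row INSERT. Below is a minimal sketch of generating the positional placeholders for such a statement; the Row struct matches the example above, while buildBulkInsert is a hypothetical helper name, and in real code you would pass the returned query and args to db.Exec.

```go
package main

import (
	"fmt"
	"strings"
)

// Row mirrors the struct used in the bulkInsert example above.
type Row struct {
	Name, Email string
}

// buildBulkInsert builds one multi-row INSERT with PostgreSQL-style
// positional placeholders: ($1, $2), ($3, $4), ... Executing a single
// statement per batch avoids a network round trip per row.
func buildBulkInsert(rows []Row) (string, []interface{}) {
	placeholders := make([]string, 0, len(rows))
	args := make([]interface{}, 0, 2*len(rows))
	for i, r := range rows {
		placeholders = append(placeholders, fmt.Sprintf("($%d, $%d)", 2*i+1, 2*i+2))
		args = append(args, r.Name, r.Email)
	}
	query := "INSERT INTO users (name, email) VALUES " + strings.Join(placeholders, ", ")
	return query, args
}

func main() {
	q, args := buildBulkInsert([]Row{{"ann", "ann@example.com"}, {"bob", "bob@example.com"}})
	fmt.Println(q)
	fmt.Println(len(args))
}
```

Mind PostgreSQL's limit of 65535 bind parameters per statement: with two columns, batches must stay under ~32k rows, so chunk larger inputs.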