Use the `os.Open` function to get a file handle, then read its contents using either `io.ReadAll` for the entire file or `bufio.Scanner` for line-by-line processing. Always remember to close the file, or use `defer` to ensure resources are released properly.
Here is the most common pattern for reading an entire file into memory:
```go
package main

import (
	"fmt"
	"io"
	"os"
)

func main() {
	// Open the file
	file, err := os.Open("data.txt")
	if err != nil {
		fmt.Println("Error opening file:", err)
		return
	}
	defer file.Close() // Ensure the file is closed when the function returns

	// Read all content
	content, err := io.ReadAll(file)
	if err != nil {
		fmt.Println("Error reading file:", err)
		return
	}

	fmt.Println(string(content))
}
```
If you need to process large files line-by-line to avoid loading everything into RAM, use `bufio.Scanner`:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
)

func main() {
	file, err := os.Open("large_log.txt")
	if err != nil {
		fmt.Println("Error opening file:", err)
		return
	}
	defer file.Close()

	scanner := bufio.NewScanner(file)
	for scanner.Scan() {
		line := scanner.Text()
		fmt.Println("Processing:", line)
	}

	if err := scanner.Err(); err != nil {
		fmt.Println("Error during scanning:", err)
	}
}
```
Key points to remember:

- Always check the errors returned by `os.Open` and the reading functions; ignoring them can lead to silent failures.
- Use `defer file.Close()` immediately after a successful open to guarantee cleanup, even if the function returns early due to an error.
- `io.ReadAll` is convenient for small files but will consume significant memory for large datasets; prefer `bufio.Scanner` or manual buffering for large files.
- If you need to read binary data, `io.ReadAll` works fine, but you can also use `io.Copy` to stream data directly to another writer without intermediate buffering.