Turning SQL tables into Go structs

2019-07-18

Recently I had to create some scripts in Go for manipulating data in a MySQL database. Along the way I found some great open-source libraries that made the whole process easier.

The basic way to use SQL in Go is through the database/sql package. Results of SQL statements are returned as sql.Rows objects, and you have to iterate over those to build up a slice of whatever you want to get. You also have to provide a parameter for each column of the returned result set, which is frustrating when you are dealing with tables that have a large number of columns.
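To see why, here is roughly what fetching a couple of columns looks like with plain database/sql (a sketch, assuming a customer table like the one used later in this post and an open *sql.DB called db):

type customer struct {
	ID        int
	FirstName string
}

var customers []customer
rows, err := db.Query("SELECT id, first_name FROM customer")
if err != nil {
	// handle error
}
defer rows.Close()
for rows.Next() {
	var c customer
	// One destination argument per selected column, which gets unwieldy
	// for wide tables.
	if err := rows.Scan(&c.ID, &c.FirstName); err != nil {
		// handle error
	}
	customers = append(customers, c)
}
if err := rows.Err(); err != nil {
	// handle error
}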

To make this more convenient, you can use the sqlx library, which lets you just provide a struct for your result set to be unpacked into.

You could also use a more fully-featured ORM of course, like GORM or xorm, but I like sqlx because it’s more lightweight and requires less setup if all you want to do is use SQL query strings.

The result is unpacked into a slice of structs without you having to iterate over Rows yourself. The struct needs an exported field for every column you expect back, tagged with that column's name. If a column is nullable, the field has to use the NullString, NullInt64 etc. types from the database/sql package, because a NULL comes back from the DB as nil and you can't store nil in a plain string or int in Go.

type MyTable struct {
	MyIDColumn int            `db:"id"`
	FirstName  string         `db:"first_name"`
	Nickname   sql.NullString `db:"nickname"`
}

Making structs for every table in your DB is a slow and dull process to do manually. It’s also easy to make little spelling errors that slow you down.

Luckily, Frank Meyer has created a great little tool called tables-to-go, which can automatically create a struct for every table in your schema.

Install it with:

go get github.com/fraenky8/tables-to-go

Call it with:

tables-to-go -t mysql -h <host> -d <schema> -u <user> -p <password> -pn <packagename>

Most of the flags are the same as you would use with the mysql CLI.

And voila! A struct for each table in your DB, one file per table. For example, this table:

CREATE TABLE customer (
  id INT PRIMARY KEY,
  first_name VARCHAR(20),
  last_name  VARCHAR(20) NOT NULL,
  balance DECIMAL
);

becomes

package mypkg

import (
	"database/sql"
)

type Customer struct {
	ID        int             `db:"id"`
	FirstName sql.NullString  `db:"first_name"`
	LastName  string          `db:"last_name"`
	Balance   sql.NullFloat64 `db:"balance"`
}

And you can then query it like so (this assumes the go-sql-driver/mysql driver is imported for its side effects, since sqlx sits on top of database/sql drivers):

db, err := sqlx.Connect("mysql", "user:password@tcp(host)/schema")

// Get for single values
var maxId int
err = db.Get(&maxId, "SELECT MAX(id) FROM customer")

// Get for single rows
bestCustomer := Customer{}
err = db.Get(&bestCustomer, "SELECT * FROM customer ORDER BY balance DESC LIMIT 1")

// Select for multiple values
others := []Customer{}
err = db.Select(&others, "SELECT * FROM customer WHERE id != ?", bestCustomer.ID)

Working with the NullInt64 etc. types is still a bit annoying: you have to check myNullInt64.Valid before you know whether there is anything in the myNullInt64.Int64 field. But maybe there is a handy library out there for that too!
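For example, reading the nullable balance column from the Customer struct above looks something like this:

// bestCustomer is the Customer value queried above; Balance is an
// sql.NullFloat64 because the column is nullable.
if bestCustomer.Balance.Valid {
	fmt.Printf("balance: %.2f\n", bestCustomer.Balance.Float64)
} else {
	fmt.Println("customer has no balance recorded")
}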