If you’ve ever had to write a database layer in Go, Python, or TypeScript, you can understand the pain of writing and maintaining code like this:
func (r *RepoLayer) Create(ctx context.Context) error {
	_, err := r.DB.ExecContext(ctx,
		`INSERT INTO user (username, created_at) VALUES (?, ?)`,
		"Username", "CreatedAt")
	return err
}
While this example is easy to understand (create a user with a username and creation timestamp), hand-written queries become hard to write and maintain once you introduce joins and query parameters (for example, finding all users with a specific zip code when the data is split across a users table and an addresses table).
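To make the pain concrete, here is what that zip-code lookup looks like when written by hand. The table and column names are illustrative, not from a real schema; the point is that the query string, the argument wiring, and the row-scanning loop all have to be kept in sync manually:

```go
package main

import (
	"context"
	"database/sql"
	"fmt"
)

// usersByZip is the kind of hand-maintained query string sqlc can generate for you.
// Table and column names here are hypothetical.
const usersByZip = `
SELECT u.id, u.username
FROM users u
JOIN addresses a ON a.user_id = u.id
WHERE a.zip_code = $1`

type User struct {
	ID       int64
	Username string
}

// FindUsersByZip shows the manual plumbing: execute, iterate, scan, check errors.
func FindUsersByZip(ctx context.Context, db *sql.DB, zip string) ([]User, error) {
	rows, err := db.QueryContext(ctx, usersByZip, zip)
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var out []User
	for rows.Next() {
		var u User
		if err := rows.Scan(&u.ID, &u.Username); err != nil {
			return nil, err
		}
		out = append(out, u)
	}
	return out, rows.Err()
}

func main() {
	fmt.Println(len(usersByZip) > 0)
}
```

Every new column or join means editing the SQL, the struct, and the Scan call in lockstep; that is exactly the busywork SQLC removes.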
With SQLC, creating and integrating a database layer into your project becomes simple, so long as you have a working understanding of SQL queries.
In this article I’m going to outline the work necessary to create a robust and more maintainable ORM layer for Go. We’ll be using some command line tools, so make sure you can compile Go, run a docker container, and run SQLC. To run SQLC, follow the installation steps outlined at SQLC’s installation page.
Step 0: Creating the Go project
This step is the same for every Go project: create a directory for your project to reside in (in my example, I'm working on an API) and, using the CLI, initialize your Go module:
{project-root} > go mod init github.com/username/project-name
Afterwards you can start coding in Go. If you create a “main.go” file in your base directory, you should be able to run the following code:
package main

import "fmt"

func main() {
	fmt.Println("testing")
}
Back in the CLI, run the code above and you should see an output of "testing":
{project-root} > go run main.go
testing
{project-root} >
Once you've set up the project, you can add things like a LICENSE file, a .gitignore, a README, etc. and start using source control if you want.
Step 1: Necessary Setup
First, create a ‘database’ folder, and inside that folder, a ‘migrations’ folder and a ‘queries’ folder such that the directory structure looks like this:
project-root/
└── database/
├── migrations/
└── queries/
Next, create the following files:
- database/migrations/000-bootstrap-extensions.sql
- database/migrations/001-pg-config-schemas.sql
- database/migrations/002-create-trigger-updated-at.sql
- database/migrations/NNN-migration-template.sql
- database/docker-build-run.sh
- database/Dockerfile
- database/sqlc.yaml
After, your directory structure should look something like:
project-root/
├── database/
│   ├── migrations/
│   │   ├── 000-bootstrap-extensions.sql
│   │   ├── 001-pg-config-schemas.sql
│   │   ├── 002-create-trigger-updated-at.sql
│   │   └── NNN-migration-template.sql
│   ├── queries/
│   ├── docker-build-run.sh
│   ├── Dockerfile
│   └── sqlc.yaml
├── .gitignore
├── LICENSE
├── go.mod
├── main.go
└── README.md
What does each file here do?
- 000-bootstrap-extensions.sql enables the pgcrypto extension, which provides the cryptographic functions we'll need for security purposes (e.g. password hashing).
- 001-pg-config-schemas.sql creates the mechanism for keeping track of schema versions, including a validation step that guarantees migrations execute in order (no skipped or repeated version numbers).
- 002-create-trigger-updated-at.sql creates a function we can call to track when a row has been updated (if the table has an "updated_at" column); this is a useful function and can also serve auditing purposes.
- NNN-migration-template.sql gives a template for adding a migration to your directory. You can copy and paste this when creating a new migration file.
- docker-build-run.sh is a bash script that removes any existing running container for this database, builds a new Docker image, and runs a fresh container from it. If you pass the "--debug" flag, you can see all database statements as well.
- Dockerfile is based on the database engine you want to use and copies the migrations folder into the necessary location. We’re using Postgres for this example.
- sqlc.yaml defines the database engine, queries and migrations locations, and the package output.
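To make the version-tracking idea in 001-pg-config-schemas.sql concrete, here is a minimal sketch. This is not the repo's actual file; I'm only inferring the shape from the db_config table and start_schema_update procedure that the user migration calls later, so treat the details as assumptions:

```sql
-- Sketch only; see the linked repo for the real 001-pg-config-schemas.sql.
CREATE TABLE db_config
(
    schema_version integer not null
);

-- Seed with this file's own version number.
INSERT INTO db_config (schema_version) VALUES (1);

-- Each migration calls this first; it fails loudly when run out of order.
CREATE OR REPLACE PROCEDURE start_schema_update(next_version integer)
LANGUAGE plpgsql
AS $$
DECLARE
    current_version integer;
BEGIN
    SELECT schema_version INTO current_version FROM db_config;
    IF next_version <> current_version + 1 THEN
        RAISE EXCEPTION 'expected migration %, got %',
            current_version + 1, next_version;
    END IF;
    UPDATE db_config SET schema_version = next_version;
END;
$$;
```

Because every migration must bump the version by exactly one, a migration applied out of order aborts its transaction instead of silently corrupting the schema.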
You can copy and paste the values from my project to get running quickly; however, I recommend reading the files to truly understand what's happening. You will also have to adjust some values for your project (names, directories, ports, etc.).
For reference, my project name is ‘go-auth-api’, the database name is ‘go-auth-db’, and the username is ‘goauth’. There is also more information in the README file.
https://github.com/eplewis89/go-auth-api/tree/main/database
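For orientation, a minimal database/sqlc.yaml matching this layout could look like the sketch below (sqlc v2 config format; the package and path names are assumptions based on this article's directory structure, so compare against the repo's actual file):

```yaml
version: "2"
sql:
  - engine: "postgresql"
    # both paths are relative to this file's directory
    schema: "migrations"
    queries: "queries"
    gen:
      go:
        package: "repo"
        out: "repo"
```

The schema entry points SQLC at the migrations so it can infer table shapes, and the gen section controls the name and location of the generated Go package.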
Step 2: Migrations
Migrations define how a database is organized: what tables and views exist, and how tables relate to each other. SQLC uses this information to create models and their relationships for the ORM layer. When our database container starts up, it executes the files in the migrations folder (in simplistic terms). Using this knowledge, we can start structuring a simple database.
Note: at the moment, we will focus on creating users.
In order to create our first migration, copy the "NNN-migration-template.sql" file and rename the copy "003-create-users.sql". Notice that we increment the migration version number by one; this keeps the migrations listed in order for easier reference. You can name these migrations whatever you want, but for ease of maintenance, stick with a common theme (number first, then operation, then table name).
Within the create user migration file, we should write the migration to include whatever is necessary for our database structure. In my project, I’m going to have a way for a user to login, reset their password, delete their profile, etc. which is why the extra data is included. My migration for the user table looks like this:
BEGIN TRANSACTION;

LOCK TABLE db_config IN EXCLUSIVE MODE;
CALL start_schema_update(3);

create table Users
(
    id serial not null
        constraint Users_pkey
            primary key,
    first_name varchar(100) not null default '',
    last_name varchar(100) not null default '',
    email varchar(100) unique not null default '',
    email_activated boolean not null default FALSE,
    encr_password varchar(100),
    salt_password varchar(100),
    temp_password varchar(100),
    created_at timestamp default (now() at time zone 'utc') not null,
    updated_at timestamp default null,
    deleted_at timestamp default null,
    is_deleted boolean not null default FALSE
);

alter table Users owner to goauth;

-- create updated_at trigger
create trigger users_trigger_updated_at
    before update on Users for each row
    execute procedure trigger_updated_at();

COMMIT;
This creates a Users table with an auto-incrementing id (so we never have to supply that value ourselves), the user and password columns we need, and some bookkeeping columns for tracking changes.
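The users_trigger_updated_at trigger above calls the trigger_updated_at() function defined back in migration 002. That file isn't reproduced in this article, but a minimal version of such a function would look something like this sketch (the real 002-create-trigger-updated-at.sql may differ in detail):

```sql
-- Runs before each UPDATE on any table that attaches this trigger,
-- stamping the row's updated_at column with the current UTC time.
CREATE OR REPLACE FUNCTION trigger_updated_at()
RETURNS trigger AS $$
BEGIN
    NEW.updated_at = now() at time zone 'utc';
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;
```

Defining it once in an early migration means every later table with an updated_at column can reuse it with a one-line CREATE TRIGGER statement.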
At this point, you can test that SQLC works by opening a terminal in the "database" folder and running "sqlc generate". If you wrote the migrations properly you should see no terminal output, and a newly created "repo" folder will appear inside the "database" folder.
{project-root} > cd database/
{project-root}/database > sqlc generate
...
{project-root}/database >
After generation is done, have a look around at the repo folder to see how SQLC builds the database layer automatically; at this point there should be “db.go” and “models.go” files.
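For the Users table from our migration, the generated struct in models.go will look roughly like the sketch below. The exact field types depend on your sqlc version and any type overrides (nullable columns typically map to sql.Null* types), so treat this as an approximation rather than the real generated file:

```go
package main

import (
	"database/sql"
	"fmt"
	"time"
)

// User approximates the struct sqlc generates in models.go from
// 003-create-users.sql; nullable columns become sql.Null* types.
type User struct {
	ID             int32
	FirstName      string
	LastName       string
	Email          string
	EmailActivated bool
	EncrPassword   sql.NullString
	SaltPassword   sql.NullString
	TempPassword   sql.NullString
	CreatedAt      time.Time
	UpdatedAt      sql.NullTime
	DeletedAt      sql.NullTime
	IsDeleted      bool
}

func main() {
	u := User{FirstName: "Ada", LastName: "Lovelace", Email: "ada@example.com"}
	fmt.Println(u.Email)
}
```

Notice that sqlc converts the snake_case column names into exported CamelCase fields automatically, so the struct is immediately usable from the rest of your Go code.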
Step 3: Queries
If everything is working up to this point, we now need functions to interact with the data in the database. A query describes something you do with that data, and SQLC compiles each named query into a Go function we can call.
In the queries folder, create a file named "users.sql". Personally, I like to keep one query file per table: if a query's functionality mostly concerns that table, that's the file it goes in. Once the file exists, we can start populating queries, such as "get all users with a gmail domain in their email" or "get all users created before a specific date". For SQLC to compile these queries into functions, each one must be annotated with a name and a result type. My users.sql query file looks like this:
-- name: CreateUser :one
INSERT INTO Users
    (first_name, last_name, email, encr_password, salt_password)
VALUES
    ($1, $2, $3, $4, $5)
RETURNING *;

-- name: FindUserByAccessToken :one
SELECT *
FROM Users u
WHERE u.id IN
(
    SELECT act.user_id
    FROM AccessTokens act
    WHERE act.token = $1
);

-- name: FindUserById :one
SELECT *
FROM Users
WHERE id = $1;

-- name: FindUserByEmail :one
SELECT *
FROM Users
WHERE email = $1;

-- name: GetTempPasswordForUser :one
SELECT temp_password
FROM Users
WHERE id = $1;

-- name: UpdateUserProfile :exec
UPDATE Users
SET
    first_name = $1,
    last_name = $2
WHERE id = $3;

-- name: UpdateUserPassword :exec
UPDATE Users
SET encr_password = $1
WHERE id = $2;

-- name: UpdateUserTempPassword :exec
UPDATE Users
SET temp_password = $1
WHERE id = $2;

-- name: UpdateUserActivation :exec
UPDATE Users
SET email_activated = $1
WHERE id = $2;

-- name: SetUserDeleted :exec
UPDATE Users
SET
    deleted_at = now() at time zone 'utc',
    is_deleted = TRUE
WHERE id = $1;

-- name: ForceDeleteUser :exec
DELETE FROM Users
WHERE id = $1;

-- name: GetUserWithAccessToken :one
SELECT
    u."id",
    u."first_name",
    u."last_name",
    u."email",
    u."email_activated",
    u."encr_password",
    u."salt_password",
    u."temp_password",
    a."id" AS "access_token_id",
    a."token" AS "access_token_value",
    a."expiration_time" AS "access_token_expiration",
    a."user_id" AS "access_token_user_id"
FROM Users AS u
INNER JOIN AccessTokens AS a
    ON u.id = a.user_id
    AND a.token = $1;
Once you've written queries you're content with, go back to the CLI and run "sqlc generate" once more. Again, there shouldn't be any issues; if there are, fix them before moving on.
After generation, there should be a file in the repo folder named "users.sql.go". Take a look around this file to see how SQLC interprets the queries and generates functions automatically.
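To give a feel for what's inside, the sketch below approximates the scaffolding sqlc generates: a DBTX interface in db.go, a Queries handle returned by New, and one params struct per named query. The field types are my guesses from the schema, not the actual generated file:

```go
package main

import (
	"context"
	"database/sql"
	"fmt"
)

// DBTX mirrors the interface sqlc generates in db.go; anything that can
// run queries satisfies it.
type DBTX interface {
	ExecContext(context.Context, string, ...interface{}) (sql.Result, error)
	QueryContext(context.Context, string, ...interface{}) (*sql.Rows, error)
	QueryRowContext(context.Context, string, ...interface{}) *sql.Row
}

// CreateUserParams is roughly what sqlc derives from the five $n
// placeholders in the CreateUser query.
type CreateUserParams struct {
	FirstName    string
	LastName     string
	Email        string
	EncrPassword sql.NullString
	SaltPassword sql.NullString
}

// Queries is the handle repo.New returns; every "-- name:" comment in
// users.sql becomes a method on it, e.g. (signature approximated):
//   func (q *Queries) CreateUser(ctx context.Context, arg CreateUserParams) (User, error)
type Queries struct {
	db DBTX
}

func New(db DBTX) *Queries { return &Queries{db: db} }

func main() {
	p := CreateUserParams{FirstName: "Ada", Email: "ada@example.com"}
	fmt.Println(p.Email)
}
```

Because *sql.DB and *sql.Tx both satisfy DBTX, the same generated queries can run standalone or inside a transaction without any code changes.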
Step 4: Integration
At this point, we can wire the generated code into the application. Back in our main.go file, using the SQLC functions from our repo layer looks something like this:
package main

import (
	"context"
	"log"

	"github.com/eplewis89/go-auth-api/database/repo"
	_ "github.com/jackc/pgx/v5/stdlib" // registers the "pgx" driver (use the pgx version your project depends on)
	"github.com/jmoiron/sqlx"
)

func main() {
	if err := run(); err != nil {
		log.Fatal(err)
	}
}

func run() error {
	ctx := context.Background()
	connStr := "some connection string"

	conn, err := sqlx.Open("pgx", connStr)
	if err != nil {
		return err
	}
	defer conn.Close()

	db := repo.New(conn)

	// look up a user by email through the generated repo layer
	user, err := db.FindUserByEmail(ctx, "some email")
	if err != nil {
		return err
	}

	log.Println(user)
	return nil
}
As you can see, this example opens a connection through the pgx driver, creates a repo layer (our ORM) on top of that connection, and uses the SQLC-generated functions to interact with the database.
If you’ve made it this far, congratulations! You can now use this baseline to extend your application.