Database Migrations with Liquibase and Docker

The long story

Today I want to share an experience I had in an open source project I work on.
For database migrations we were used to Flyway, and it works pretty well for our Postgres databases.
A new story came up in our sprint: apply migrations to a SQL Server database.
As usual, we prepared a docker-compose to run Flyway locally and test the flow… and BOOM!! Bad news. The SQL Server we were using is an old one, 2014, and "luckily" for us, the Flyway community edition doesn't support it.

Our team started looking for Flyway alternatives and we found Liquibase.
I made a proof of concept for this guy and it's very similar to the way we usually work.
So let’s stop talking and see some code.

The Files

All those files can be found on my GitHub.

In our POC project structure, we have the changelog.xml inside the config folder. This is the main configuration file.
Inside scripts (kind of obvious) we have the .sql scripts that we want to execute on the target database. Let's talk about each of them.
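As a sketch, the layout looks roughly like this (the script file name is just illustrative):

```
.
├── config/
│   └── changelog.xml
├── scripts/
│   └── 001-create-table.sql
├── docker-compose.database.yml
└── docker-compose.liquibase.yml
```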

Liquibase config files (changelogs) accept several formats, like JSON, SQL, XML, and YAML. In this example, I used XML.
In line 8, I added the includeAll node with the path "scripts". This config tells Liquibase to execute all scripts inside the scripts folder.
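A minimal changelog along these lines would look like this (the exact schema version and attributes may differ from the real file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.3.xsd">

    <!-- Runs every script found in the scripts folder -->
    <includeAll path="scripts"/>

</databaseChangeLog>
```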

The scripts are very simple, for test purposes, and they only add a new table.
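One of those scripts could look like the sketch below. The table name teste_liquibase comes from the output later in the post; the columns and the changeset author are just placeholders. The header comments are Liquibase's "formatted SQL" syntax, which lets includeAll pick up plain .sql files as changesets:

```sql
--liquibase formatted sql
--changeset lobo:1
CREATE TABLE teste_liquibase (
    id INT PRIMARY KEY,
    name VARCHAR(100)
);
```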

The docker-compose.database.yml gives us a Postgres Alpine database (very light) to run our test scripts on with Liquibase.
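A sketch of that compose file, using the container name lobo-db and network lobo-net mentioned below (credentials and database name are placeholders):

```yaml
version: "3"

services:
  lobo-db:
    image: postgres:alpine
    container_name: lobo-db
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: lobo
    ports:
      - "5432:5432"
    networks:
      - lobo-net

networks:
  lobo-net:
    name: lobo-net
```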

In docker-compose.liquibase.yml we use the Liquibase Docker image.
For the command, we use the "url" option with our connection string, using the Postgres JDBC driver. In this example I'm using the host "lobo-db", which is the container name we set in the database compose file (line 8). Since the containers are on the same network (lobo-net), they can talk to each other by container name.
Another option we pass in the command is "changeLogFile", pointing to the file from the config folder, our "changelog.xml".
And as the final argument, we use update.
The volumes section is very important because it maps our local files into the Docker image.
On line 9 it maps our config/changelog.xml to liquibase/changelog.xml.
The liquibase folder is the main folder where Liquibase searches for changelog files.
On line 10, we map our scripts folder to liquibase/scripts. Again, liquibase is the default folder, and in our changelog file we assigned scripts as the folder to execute, so Liquibase will search inside liquibase/scripts.
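Putting the command options and the two volume mappings together, the Liquibase compose file looks roughly like this (credentials, database name, and the external-network setup are assumptions, not the exact file from the repo):

```yaml
version: "3"

services:
  liquibase:
    image: liquibase/liquibase
    command: >
      --url=jdbc:postgresql://lobo-db:5432/lobo
      --username=postgres
      --password=postgres
      --changeLogFile=changelog.xml
      update
    volumes:
      - ./config/changelog.xml:/liquibase/changelog.xml
      - ./scripts:/liquibase/scripts
    networks:
      - lobo-net

networks:
  lobo-net:
    external: true
```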

Action, please…

Let's execute our database compose first.
From a command line, inside our project folder:
docker-compose -f docker-compose.database.yml up

If you connect to it, you can see that we have an empty database.

Then, let's execute our Liquibase compose.
From a command line, inside our project folder:
docker-compose -f docker-compose.liquibase.yml up

Anddd voilà! If we check our database again, we can see new tables there: our two teste_liquibase tables from the scripts, plus the databasechangelog and databasechangeloglock tables that Liquibase uses to control migrations.

That’s all?

With this proof of concept we can test our scripts locally and then use them in our pipelines! Sweet!

Hope it can help someone ;)