When developing an application, integration tests are an important part of our test coverage, since they help us ensure that the different pieces work well together. In our case, the integration tests we wanted to write were related to PostgreSQL. Specifically, we wanted to test our partial indexes: this feature is not available in H2, so how can we make sure that everything works correctly and nothing breaks in production? Integration tests and Docker come to the rescue.
The MySQL / PSQL literature on integration tests is pretty sparse, but there are some GitHub repos that can help us get started. In this post we're going to use docker-it-scala and, as Docker API client, docker-java (we could also have used Spotify's docker-client instead). Also, if you run MySQL instead of PSQL, this is a good repo where you can find lots of examples.
The first thing we need to do is define the configuration settings to run our PSQL Docker image.
import scala.concurrent.duration._

import com.whisk.docker._

trait DockerPostgresService extends DockerKit {

  // host port we expose PSQL on, and the standard PSQL port inside the container
  val internalPort = 44444
  val externalPort = 5432
  val user = "user"
  val password = "safepassword"
  val database = "mydb"
  val dbUrl = s"jdbc:postgresql://localhost:$internalPort/$database?autoReconnect=true&useSSL=false"
  val driver = "org.postgresql.Driver"
  val dockerImage = "postgres:9.6.2"

  val postgresContainer = DockerContainer(dockerImage)
    .withPorts((externalPort, Some(internalPort)))
    .withEnv(s"POSTGRES_USER=$user", s"POSTGRES_PASSWORD=$password", s"POSTGRES_DB=$database")
    .withReadyChecker(
      new PostgresReadyChecker(dbUrl, user, password, driver).looped(15, 1.second)
    )

  // adds our container to the DockerKit's list
  abstract override def dockerContainers: List[DockerContainer] =
    postgresContainer :: super.dockerContainers
}
The piece of code above creates our new PSQL Docker container and adds it to the list of containers whose images DockerKit will pull. For the purposes of this post every value is hard-coded to keep things simple, but it would be better to define everything in a config file.
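For instance, with Typesafe Config the values could live in an application.conf under the integration tests' resources. This fragment is just an illustrative sketch (the key names are my own, not from the post):

```hocon
# hypothetical src/it/resources/application.conf
postgres {
  internal-port = 44444
  external-port = 5432
  user          = "user"
  password      = "safepassword"
  database      = "mydb"
  image         = "postgres:9.6.2"
}
```

The trait could then load these keys with ConfigFactory.load() instead of hard-coding them.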
To verify that our connection to the db is established, we can define a PostgresReadyChecker that will keep trying to connect to the database. We can reuse the one defined in the docker-it-scala examples. It's pretty noisy while it retries the connection, but I didn't find any way to make it less verbose.
import java.sql.DriverManager

import scala.concurrent.ExecutionContext
import scala.util.Try

class PostgresReadyChecker(url: String,
                           user: String,
                           password: String,
                           driver: String)
    extends DockerReadyChecker {

  override def apply(container: DockerContainerState)(
      implicit docker: DockerCommandExecutor,
      ec: ExecutionContext) =
    container.getPorts().map { _ =>
      Try {
        // the container is ready once we can open (and close) a connection
        Class.forName(driver)
        Option(DriverManager.getConnection(url, user, password))
          .map(_.close)
          .isDefined
      }.getOrElse(false)
    }
}
Great, everything is in place, now we can have fun (for a given value of _) with the integration tests!!
As mentioned before, we wanted to test our partial indexes. Let's say that our (really simplified) db tracks the users' quests in an MMORPG. A user can pick up and abandon the same quest multiple times, and we want to keep track of all these actions, like a log db. A user can have only one quest active at a time, but many in the Cancelled state. Our quests table is defined as follows:
CREATE TABLE quests (
  user_id  SERIAL  NOT NULL,
  quest_id SERIAL  NOT NULL,
  status   VARCHAR NOT NULL
);

CREATE UNIQUE INDEX quests_partial_idx
  ON quests (user_id, quest_id, status)
  WHERE NOT (status = 'Cancelled');
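To see what the partial index buys us, here is a quick psql sketch of the inserts it allows and rejects (against the schema above):

```sql
INSERT INTO quests VALUES (1, 10, 'Active');     -- ok
INSERT INTO quests VALUES (1, 10, 'Active');     -- fails: duplicate key value
                                                 -- violates "quests_partial_idx"
INSERT INTO quests VALUES (1, 10, 'Cancelled');  -- ok
INSERT INTO quests VALUES (1, 10, 'Cancelled');  -- ok: 'Cancelled' rows are
                                                 -- excluded from the index
```

These are exactly the cases we want our integration tests to cover.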
Our schema will be created using a Flyway migration, so the code above will live in src/main/resources/db/migration/V001__db.sql.
We can then write our integration test:
import com.whisk.docker.impl.dockerjava.DockerKitDockerJava
import com.whisk.docker.scalatest.DockerTestKit
import com.zaxxer.hikari.{HikariConfig, HikariDataSource}
import org.flywaydb.core.Flyway
import org.scalatest.{Matchers, WordSpec}
import slick.jdbc.PostgresProfile.api._

class QuestDaoTest
    extends WordSpec
    with Matchers
    with DockerTestKit
    with DockerKitDockerJava
    with DockerPostgresService {

  trait QuestFixture {
    // db config
    val config = new HikariConfig
    config.setJdbcUrl(dbUrl)
    config.setUsername(user)
    config.setPassword(password)

    // our Flyway migration
    val flyway = new Flyway()
    flyway.setDataSource(new HikariDataSource(config))
    flyway.migrate()

    // Slick apis
    val db = Database.forURL(url = dbUrl, user = user, password = password, driver = driver)
    val questDao = new QuestDao(db)
  }

  "QuestDao - Docker" should {

    "Not insert multiple quests with Active status" in new QuestFixture {
      // insert an Active quest
      questDao.insert(quest) shouldBe 'right
      // insert another Active quest
      questDao.insert(anotherActiveQuest) shouldBe 'left
    }

    "Insert multiple quests with Cancelled status" in new QuestFixture {
      // insert a Cancelled quest
      questDao.insert(cancelledQuest) shouldBe 'right
      // insert another Cancelled quest
      questDao.insert(anotherCancelledQuest) shouldBe 'right
    }
  }
}
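The post doesn't show QuestDao or the quest fixtures it uses. Purely as an assumption of what they might look like, here is a minimal Slick sketch where insert surfaces a constraint violation as a Left:

```scala
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.util.Try

import slick.jdbc.PostgresProfile.api._

case class Quest(userId: Int, questId: Int, status: String)

class Quests(tag: Tag) extends Table[Quest](tag, "quests") {
  def userId  = column[Int]("user_id")
  def questId = column[Int]("quest_id")
  def status  = column[String]("status")
  def *       = (userId, questId, status) <> (Quest.tupled, Quest.unapply)
}

class QuestDao(db: Database) {
  private val quests = TableQuery[Quests]

  // synchronous for brevity: a duplicate "active" insert hits the partial
  // index, the driver throws, and Try turns the exception into a Left
  def insert(quest: Quest): Either[Throwable, Int] =
    Try(Await.result(db.run(quests += quest), 5.seconds)).toEither
}
```

The fixtures (quest, anotherActiveQuest, cancelledQuest, …) would then just be Quest instances with the appropriate statuses.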
Et voilà, our integration test is ready to be run!
All these classes are defined in an it directory that contains only the integration test code, keeping it separate from the unit tests. To run our integration tests we can just run sbt it:test, but what if we want to run them together with our unit tests?
Easy peasy, we just have to create two objects in the project folder:
// Configs.scala
import sbt._

object Configs {
  val IntegrationTest = config("it") extend Test
  val all = Seq(IntegrationTest)
}
// Testing.scala
import sbt._
import sbt.Keys._
import Dependencies.docker
import com.lucidchart.sbt.scalafmt.ScalafmtCorePlugin.autoImport.scalafmtSettings

object Testing {
  import Configs._

  lazy val testAll = TaskKey[Unit]("test-all")

  private lazy val itSettings =
    inConfig(IntegrationTest)(Defaults.testSettings) ++
      Seq(
        fork in IntegrationTest := false,
        parallelExecution in IntegrationTest := false,
        scalaSource in IntegrationTest := baseDirectory.value / "src/it/scala",
        libraryDependencies ++= docker.all
      )

  lazy val settings = itSettings ++
    inConfig(IntegrationTest)(scalafmtSettings) ++
    Seq(
      testAll := (test in IntegrationTest).dependsOn(test in Test).value
    )
}
In this way, sbt test-all will run all the unit and integration tests together.
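These objects still need to be wired into the build. A hypothetical build.sbt fragment, assuming the two objects above live under project/, could look like:

```scala
// build.sbt (sketch)
lazy val root = (project in file("."))
  .configs(Configs.all: _*)
  .settings(Testing.settings: _*)
```

Registering the custom "it" configuration with .configs is what makes the it:test and test-all tasks visible to sbt.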
Conclusion
This post is meant as a brief introduction to integration testing with PSQL and Docker, and to their setup. In our case, database integration testing was fundamental, since unit tests alone couldn't help us test partial indexes.
But never underestimate unit testing. As when building a house, we need to start from the foundations and work up to the roof: if unit tests help our house stand solid on the ground, integration tests shield it from bad weather as much as possible.
Happy coding! ☔️