Database testing with JavaScript, running both locally and in a GitLab pipeline: an example with a JS lambda, dockerized Postgres, and Vitest

Elvis Ciotti
5 min read · Jan 29, 2024


If you want to test JavaScript code that runs queries against a database, my recommendation is to run a dockerized database for testing, so you can easily
- drop and recreate the schema before the tests start
- truncate all the tables (and optionally add some common fixed data used by all tests) before each test. Each test then adds its own data (fixtures), calls your logic (e.g. a lambda function) and makes its assertions, without any need for cleanup.

Dockerised database

Here is the docker-compose.yml to run a local Postgres.

Place it at tests/functional/db/docker-compose.yml. Keep the paths I indicate so the following scripts can find the files; once everything works, you can easily move and rename them with your IDE's help.

Uncomment the command line if you want the container to log all the queries when you watch the docker container logs.

version: '3'

services:
  db:
    image: postgres:latest
    container_name: postgresql-sql-test
    ports:
      - "5433:5432"
    environment:
      POSTGRES_USER: vitest
      POSTGRES_PASSWORD: vitest.pass
      POSTGRES_DB: mydb # the tests connect to this database name
    restart: always
    # command: ["postgres", "-c", "log_statement=all", "-c", "log_destination=stderr"]

Vitest and db connection config files

Have a look at the files; explanations are in the comments.

vitest.functional.config.js

A simple config file that includes the next file as setup.

import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      exclude: [
        '**/data/**',
        'docs/**',
        'tests/db',
        'tests/src/fixtures'
      ],
    },
    globals: true,
    include: ['tests/functional/**/*.test.js'],
    reporters: ['verbose'],
    root: './',
    setupFiles: 'tests/functional/setup.js'
  },
});

tests/functional/setup.js

This file defines the environment variables used to connect to the db. If the TEST_RUNNER variable is not defined (local usage), you connect to the docker container running on port 5433; the setup then starts docker and drops the schema.

import { dropSchemaIfExistsSync, startDockerAndWaitToComeUpSync } from './src/fixtures/db.js';

// variables read by the app when executed by vitest
process.env.POSTGRES_USER = 'vitest';
process.env.POSTGRES_PASSWORD = 'vitest.pass';
process.env.POSTGRES_DB = 'mydb';

if (process.env.TEST_RUNNER === 'gitlab') {
  console.log('Setting Vitest config for gitlab runner');
  process.env.POSTGRES_HOST = 'postgres';
  process.env.POSTGRES_PORT = '5432';
} else {
  console.log('Setting Vitest config for local runner');
  process.env.POSTGRES_HOST = 'localhost';
  process.env.POSTGRES_PORT = '5433';

  startDockerAndWaitToComeUpSync();
  dropSchemaIfExistsSync();
}
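If you want to unit-test this branching, it can be factored into a pure helper. This is a hypothetical refactor (the name resolveDbConfig is not in the original setup file), sketched under the same values used above:

```javascript
// Hypothetical sketch: resolve connection settings from the TEST_RUNNER value
// instead of mutating process.env inline, so the logic is unit-testable.
const resolveDbConfig = (testRunner) => {
  const base = { user: 'vitest', password: 'vitest.pass', database: 'mydb' };
  return testRunner === 'gitlab'
    ? { ...base, host: 'postgres', port: '5432' }   // GitLab service container
    : { ...base, host: 'localhost', port: '5433' }; // local docker-compose port mapping
};
```

The setup file would then copy the resolved values onto process.env.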

Since your tests exercise the source code, you need to make sure the code connects to Postgres by taking the host, user, port and password from those variables in every environment. Here is an example using the postgres npm package (not tested as-is, since my real code does something more complicated, using IAM tokens to connect to RDS).

import postgres from 'postgres';

export const sql = postgres({
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB
});
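A connection option that is undefined may silently fall back to the client's defaults, which produces confusing errors far from the real cause. A tiny guard can make a missing variable fail fast instead; requireEnv below is a hypothetical helper, not part of the original code:

```javascript
// Hypothetical helper: throw immediately if any required env var is missing,
// rather than letting the client connect with default values.
const requireEnv = (names, env = process.env) => {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(', ')}`);
  }
  // return only the requested variables as a plain object
  return Object.fromEntries(names.map((name) => [name, env[name]]));
};
```

You would call requireEnv(['POSTGRES_HOST', 'POSTGRES_PORT', 'POSTGRES_USER', 'POSTGRES_PASSWORD', 'POSTGRES_DB']) before building the connection.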

tests/functional/src/fixtures/db.js

This file defines the functions imported above. I use node to launch docker processes, and I set the PG* variables so the psql command doesn't need any parameters to import SQL files.

The truncate function will be used later.

import { execSync } from 'child_process';
import { dirname } from 'path';
import { fileURLToPath } from 'url';

// __dirname is not available in ES modules, so derive it from import.meta.url
const __dirname = dirname(fileURLToPath(import.meta.url));

const cliOptions = {
  cwd: `${__dirname}/../../db`,
  encoding: 'utf-8',
};

const psqlExecOptions = {
  ...cliOptions,
  env: {
    ...process.env, // keep PATH etc. so psql can be found
    PGPASSWORD: 'vitest.pass',
    PGUSER: 'vitest',
    PGDATABASE: 'mydb',
    ...(process.env.TEST_RUNNER === 'gitlab' ? {
      PGHOST: 'postgres',
      PGPORT: '5432',
    } : {
      PGHOST: 'localhost',
      PGPORT: '5433',
    }),
  },
};

export const startDockerAndWaitToComeUpSync = () => {
  execSync('docker-compose up -d db', cliOptions);
  while (!isDbRunning()) {
    console.log('wait for postgres to come up');
    execSync('sleep 1'); // avoid a busy loop while polling
  }
};

const isDbRunning = () => {
  try {
    execSync("psql -c 'select now()'", psqlExecOptions);
    return true;
  } catch (e) {
    return false;
  }
};

export const dropSchemaIfExistsSync = () => {
  execSync('psql < schema-drop-recreate.sql', psqlExecOptions);
  console.log('Recreated schema.');
};

export const truncateAllTablesAndAddInitDataSync = () => {
  execSync('psql < truncate-and-initial-data.sql', psqlExecOptions);
  console.log('All tables truncated and initial data added.');
};
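An unbounded poll can hang the whole test run if the container never becomes healthy. The wait can be sketched as a small bounded helper; waitFor below is hypothetical (not in the original file) and assumes a synchronous probe such as isDbRunning above:

```javascript
// Hypothetical sketch: poll a synchronous probe a limited number of times,
// sleeping between attempts, and fail loudly if it never succeeds.
const waitFor = (probe, { attempts = 30, delayMs = 1000 } = {}) => {
  for (let i = 0; i < attempts; i += 1) {
    if (probe()) return;
    // synchronous sleep; Atomics.wait is permitted on the main thread in Node
    Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, delayMs);
  }
  throw new Error(`service did not come up after ${attempts} attempts`);
};
```

startDockerAndWaitToComeUpSync could then simply call waitFor(isDbRunning) after docker-compose up.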

tests/functional/db/schema-drop-recreate.sql

Change this to match your schema, and add the queries that create your table structure.

You can generate it from an existing database with pg_dump -h localhost -U user -d dbname --schema-only > schema-only.sql and then clean up what's not needed.

SET client_min_messages TO WARNING;

DROP SCHEMA IF EXISTS public CASCADE;
CREATE SCHEMA public;

-- create schema queries

tests/functional/db/truncate-and-initial-data.sql

Simply add the tables you need to truncate, and the initial data.

\set QUIET 1
SET client_min_messages TO WARNING;

truncate table table1, table2;

-- initial data here

\set QUIET 0

How tests should be written using those helpers

This is an example of a lambda function test, where all tables are truncated before each test. Each test adds its initial SQL data/fixtures, calls the handler, then asserts on the response.

import { beforeEach, describe, expect, it } from 'vitest';
import { handler } from '../../../../../src/lambdas/yourLambda.js';
import { truncateAllTablesAndAddInitDataSync } from '../../fixtures/db.js';


describe('src/lambdas/transfer/getAccountHolderTransferMethodsByCurrency', () => {
  beforeEach(() => {
    truncateAllTablesAndAddInitDataSync();
  });

  it('...', async () => {
    // add test-specific data using sql, the same way you do in your code
    const requestContext = {}; // fill with whatever your handler expects
    const response = await handler({ requestContext, pathParameters: { currencyCode: 'CFH' } });

    expect(response.statusCode).toEqual(200);
    const bodyDecoded = JSON.parse(response.body);

    const expectedData = {}; // fill this
    expect(bodyDecoded).toEqual(expectedData);
  });

  // more tests here
});

Node package.json

Make sure you have these scripts:

{
  ...
  "type": "module",
  "scripts": {
    "test:ci": "TEST_RUNNER=gitlab vitest --config vitest.functional.config.js",
    "test": "TEST_RUNNER=local vitest run --config vitest.functional.config.js --coverage.enabled --coverage.provider=v8 --coverage.all",
    ...
  }
}

How to run on GitLab

Create a .gitlab-ci.yml file with the content below.

I define a service named postgres (the host) that runs on the default port 5432 (this cannot be changed), then I define the variables POSTGRES_USER and POSTGRES_PASSWORD that are read by the tests, plus the other PG* variables needed by the job.

The functional_tests job runs the test:ci npm script above (which basically sets the TEST_RUNNER=gitlab variable used by the code). When this variable is set to gitlab, the docker launch and schema creation are skipped, as on GitLab it's simpler to launch them via scripts. The job runs on a node image, so I first install the Postgres client, then import the same SQL files directly to drop and recreate the schema.

image: node:20

services:
  - name: postgres:latest
    alias: postgres

stages:
  - test

variables:
  # service
  POSTGRES_USER: vitest
  POSTGRES_PASSWORD: vitest.pass
  POSTGRES_DB: mydb # database created by the postgres service
  # vars for psql command in script
  PGHOST: postgres
  PGUSER: ${POSTGRES_USER}
  PGPASSWORD: ${POSTGRES_PASSWORD}
  PGDATABASE: ${POSTGRES_DB}

functional_tests:
  stage: test
  before_script:
    - env
    - apt-get update -yqq || true
    - apt-get install -y postgresql-client-15 > /dev/null
    - psql < tests/functional/db/schema-drop-recreate.sql
    - psql < tests/functional/db/truncate-and-initial-data.sql
    - npm ci
  script:
    - npm run test:ci

Conclusions

With the config above you can run the tests locally and on GitLab against a dockerized database, minimising config repetition and using the same approach everywhere: drop the schema at the beginning, truncate all the tables before each test.

An alternative approach is using testcontainers (see its docs) together with GitLab docker-in-docker (dind), as I normally do with Java applications, but I haven't had time to try it with JavaScript yet. The approach described here seems simple enough; moreover, with a continuously running Postgres it's easier locally to tail the SQL statements and open a Postgres shell, since testcontainers (at least with Java) may run on a random port.

Clap if useful

Written by Elvis Ciotti

Software Contractor — Java, Spring, k8s, AWS, Javascript @ London - hire me at https://elvisciotti.github.io/
