Django tests with Gitlab CI & Postgres

When I moved my small Django project from SQLite to Postgres, I dockerised it, set up VS Code so I could run my pytest tests inside the Docker container, made a couple of test tweaks where Postgres output differs from SQLite output, and was pretty happy.

Then I tried to get the tests running in GitLab CI. This made me less happy.

In the end, though, after many struggles and an inbox full of failed-pipeline mails, I found the solution.

So what was the problem, and what can you do to avoid it?

Well, first, I tried to use the recommended GitLab settings. They even have a sample .gitlab-ci.yml file just for Django!

The salient points of this sample file are:

services:
  - postgres:latest

variables:
  POSTGRES_DB: database_name

to pull in the Postgres Docker container, and set up the DB name. Then:

test:
  variables:
    DATABASE_URL: "postgresql://postgres:postgres@postgres:5432/$POSTGRES_DB"

to trigger the test runs.

My problem: I got about 1000 lines of stack trace, because none of the tests that required the DB could run. There were repeated errors for each test, with a warning at the end that looked like this:

  /usr/local/lib/python3.8/site-packages/django/db/backends/postgresql/base.py:304: RuntimeWarning: Normally Django will use a connection to the 'postgres' database to avoid running initialization queries against the production database when it's not needed (for example, when running tests). Django was unable to create a connection to the 'postgres' database and will use the first PostgreSQL database instead.
    warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/warnings.html

Hmm ...

Working on the assumption that the GitLab documentation wasn't completely wrong, I looked for a cause other than the database simply not being available. That pointed to a connection problem: the Docker container I was pushing with my Django project in it couldn't connect to the GitLab Postgres service I defined above.

Let's have a look at my Django DATABASES settings ...

DATABASES={
   'default':{
      'ENGINE':'django.db.backends.postgresql',
      'NAME':'postgres',
      'USER':'postgres',
      'PASSWORD':'postgres',
      'HOST':'db',
      'PORT':'5432',
   }
}

That looks right, no?

Ah, but wait one minute. I'm connecting to host db, because that's how I've set up my docker-compose.yml:

  db:
    image: postgres:13
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres

So, locally I'm connecting to the db container, but in CI I'm still trying to connect to a host called db, which doesn't exist there, because the service is called postgres.

The solution is to make the DATABASES config in settings.py a little more dynamic. I did this using a package called dj-database-url. This may be overkill, as I could have done it with a simple os.environ.get, but I rather like the self-contained simplicity of this package, which makes it clear what I'm doing, what the default is, and how long a connection can remain open.
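
For comparison, the plain-environment-variable version might have looked something like this sketch (the DB_HOST variable name here is just an example, not something from my actual setup):

import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "postgres",
        "USER": "postgres",
        "PASSWORD": "postgres",
        # "db" for local docker-compose, "postgres" for the GitLab CI service
        "HOST": os.environ.get("DB_HOST", "db"),
        "PORT": "5432",
    }
}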

So now, in my settings.py I have:

import dj_database_url
...
DATABASES = {
    "default": dj_database_url.config(
        default="postgres://postgres:postgres@db:5432/postgres", conn_max_age=600
    )
}

In .env I added my connection string for local dev (dj-database-url picks up the DATABASE_URL environment variable by default). This is unnecessary, as the default above already works locally, but it's nice to have as a reminder:

DATABASE_URL=postgres://postgres:postgres@db:5432/postgres

And in my .gitlab-ci.yml I added a similar setting:

.test_template: &test_definition
  stage: test
  services:
    - postgres:13
  variables:
    POSTGRES_DB: postgres
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    PYTHONPATH: ./src
  before_script:
    - pip install pipenv
    - pipenv install --system --dev
    - pytest --version

test:unit:
  <<: *test_definition
  script:
    - export DATABASE_URL=postgresql://$POSTGRES_USER:$POSTGRES_PASSWORD@${POSTGRES_HOST}:5432/$POSTGRES_DB
    - pytest . --cov=src -vvv

And now my pipeline (and my inbox!) is happy again!

Note on migrations

If you've moved from another database (e.g. SQLite, like I did), you will need to run your migrations again and create a new superuser. You can do this in one of two ways:

  1. Log into the running container with docker exec -it <container name> bash and run python manage.py migrate as you would normally.

  2. Do it via docker compose using docker-compose exec <service name> python manage.py migrate (see the sketch below).
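
For example, with docker compose the whole thing might look something like this (just a sketch; web here is a placeholder for whatever your Django service is called in docker-compose.yml):

# apply the migrations to the new Postgres database
docker-compose exec web python manage.py migrate

# the old superuser lived in the SQLite file, so create a new one
docker-compose exec web python manage.py createsuperuser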
