My world of technology


Creating a development environment using Docker Compose

Hello and welcome to the next entry in this series. Today I will cover creating a development environment on a local machine using Docker Compose. This way we can set up both the API application and the database server quickly and effortlessly.

For the purpose of demonstration we can reuse the Dockerfiles created in previous entries in this series (for the Postgres server and the API app). We were already able to set up and run these containers separately, but with Docker Compose we can create both of them at the same time with just one command.

The important part is networking: we need to make sure that the containers can "talk" to each other. For that, let's create a separate network called "docker_network" that uses the bridge driver. We also need to think about ports, and one thing changes compared to the previous setup: we no longer need to publish the database server's port on the host, because the API container can talk to the database directly on port 5432 (the Postgres default) over the shared network. We still need to forward requests to the API app on a host port different from the one exposed by the Flask container (5000).

I've also added a safety mechanism in the form of a healthcheck for the database and a "depends_on" clause for the API, which makes the API container start only when the database container is up and ready.

Putting this all together, the Compose file looks like this:

services:
  db:
    container_name: postgres
    build: ./db/
    expose:
      - "5432"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgrespassword
      - POSTGRES_DB=postgres
    networks:
      - docker_network
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $$POSTGRES_USER -d $$POSTGRES_DB"]
      interval: 1s
      timeout: 5s
      retries: 10
  api:
    container_name: rpgapp
    build: .
    ports:
      - "5005:5000"
    networks:
      - docker_network
    depends_on:
      db:
        condition: service_healthy
networks:
  docker_network:
    driver: bridge

Let's give it a spin and start the environment with the following command:

docker compose up

Afterwards we can see in the Docker Desktop app that both containers are running as part of a single Compose stack.

To confirm that our app works, let's try to create a new character via an API call.
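One way to send such a request from Python is sketched below. The endpoint path and field names are assumptions based on the /character/create endpoint built later in this series, and port 5005 is the host port published in the Compose file; only the standard library is used.

```python
# Hypothetical request builder -- the endpoint path and field names are
# assumptions based on the /character/create endpoint from a later post.
import json
import urllib.request

def build_create_request(base_url: str = "http://localhost:5005") -> urllib.request.Request:
    """Build a POST request that would create a character via the API."""
    payload = {"id": 1, "name": "Geralt", "level": 10}
    return urllib.request.Request(
        f"{base_url}/character/create",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the stack running, the request could be sent like this:
# with urllib.request.urlopen(build_create_request()) as resp:
#     print(resp.status, resp.read())
```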

We have successfully created an environment consisting of a database and an application server with just one command!

That's it for today. In my next entry I will cover creating a Prometheus container for monitoring our application.

Making the application interact with a database

Hello and welcome to the next entry in the series. Now that we have a properly configured database, we can rewrite our application to interact with it. For this we will be using SQLAlchemy, a powerful toolkit dedicated to working with databases. You can find the documentation of this library under this link.

The list of libraries necessary for this project has grown a lot due to the need for SQLAlchemy, as well as a driver dedicated to PostgreSQL. Thankfully Python allows us to install the requirements with a single command. By the way, you can find the requirements.txt file in my repository.

To install the needed libraries, download the requirements.txt file and run the following command in the directory where it is present:

python -m pip install -r requirements.txt

Now that that's sorted out, we can get to work. We have a working database, but how do we tell our application how to connect to it? SQLAlchemy uses an engine object, which describes how to "talk" to a specific database. It requires info on the type of database used, username, password, host, port and database name. I've used a .env file to store all of this information separately and only import the values in the codebase. Creation of the engine looks like this:
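A minimal sketch of such engine creation is shown below. The environment variable names and the fallback values are assumptions for illustration (the real values live in the .env file); the connection details match the output discussed next.

```python
# Sketch of engine creation from environment variables (e.g. loaded from a
# .env file); the variable names here are assumptions, not the original code.
import os
from sqlalchemy import create_engine
from sqlalchemy.engine import URL

url = URL.create(
    drivername="postgresql",                       # dialect; psycopg2 is the default driver
    username=os.getenv("DB_USER", "rpgapp_user"),
    password=os.getenv("DB_PASSWORD", "changeme"), # placeholder fallback
    host=os.getenv("DB_HOST", "127.0.0.1"),
    port=int(os.getenv("DB_PORT", "5455")),
    database=os.getenv("DB_NAME", "rpgapp"),
)

def make_engine():
    # The engine is lazy: no connection is opened until it is first used.
    return create_engine(url)

# Debug print -- repr() of a URL masks the password automatically.
print(repr(url))
```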

I added a print at the end for debugging purposes (the password information is hidden). Let's run this code and see if we connected successfully.

The engine information is there: we are using postgresql, connected as rpgapp_user to host 127.0.0.1 on port 5455, to a database called "rpgapp".

Before we get to coding the functions, we also need to specify the structure of the data. In SQLAlchemy this is done with models; a model defines both the Python object and the SQL database table that we are going to interact with.

For our purposes I've created a simple model called CharacterModel. We are going to store character data in a table called "characters" which has three columns: character id (which is also the primary key), name (a string) and character level (an integer). Putting this all together, the model definition looks like this:
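A sketch of such a model is shown below; the exact attribute names are my assumption and may differ from the repository, but the table and column layout follow the description above.

```python
# Sketch of the model described above; attribute names are assumptions.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class CharacterModel(Base):
    """Maps Python objects to rows of the "characters" table."""
    __tablename__ = "characters"

    character_id = Column(Integer, primary_key=True)  # primary key
    name = Column(String)                             # character name
    level = Column(Integer)                           # character level

    def __repr__(self):
        return f"Character(id={self.character_id}, name={self.name}, level={self.level})"
```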

Finally we can get to the main dish and start coding the functions for our API to retrieve and store data in the database. Let's start with the create function so we can populate the database.

I've created a separate endpoint called "/character/create" which only allows POST requests. It takes three arguments matching the fields of the characters table: name, level and id. A CharacterModel object is then created based on the input.

We use the existing engine and establish a session. A session object in SQLAlchemy is a way to handle transactions with the database. We add the created character object to the session and commit it to the database. If everything goes right, we respond to the query with the character data and status code 200. In case of any issues we catch the exception and return 500. Below is the code snippet:
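A self-contained sketch of this endpoint is shown below. To keep the snippet runnable on its own it uses an in-memory SQLite engine as a stand-in for the Postgres engine created earlier, and it assumes the request body arrives as JSON; names may differ from the repository.

```python
# Sketch of the create endpoint; an in-memory SQLite engine stands in for
# the Postgres engine so the snippet runs on its own.
from flask import Flask, jsonify, request
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class CharacterModel(Base):
    __tablename__ = "characters"
    character_id = Column(Integer, primary_key=True)
    name = Column(String)
    level = Column(Integer)

app = Flask(__name__)
engine = create_engine("sqlite://")   # stand-in for the Postgres engine
Base.metadata.create_all(engine)      # create the "characters" table

@app.route("/character/create", methods=["POST"])
def create_character():
    data = request.get_json()
    character = CharacterModel(
        character_id=data["id"], name=data["name"], level=data["level"]
    )
    try:
        # A session wraps the transaction: add the object, then commit it.
        with Session(engine) as session:
            session.add(character)
            session.commit()
        return jsonify({"id": data["id"], "name": data["name"]}), 200
    except Exception as exc:
        return jsonify({"error": str(exc)}), 500
```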

Let's try to raise a request and test the newly created endpoint.

Indeed, it works: in return we receive the name and id of the character as well as a 200 return code. Now let's code a GET /character endpoint which will return a list of all our characters:
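The listing logic behind that endpoint can be sketched as follows. As before, an in-memory SQLite engine stands in for the Postgres one so the snippet runs standalone, and a couple of rows are seeded purely for illustration; the actual names may differ from the repository.

```python
# Sketch of the logic behind GET /character, shown against an in-memory
# SQLite engine (a stand-in for the Postgres engine) so it runs on its own.
import json
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class CharacterModel(Base):
    __tablename__ = "characters"
    character_id = Column(Integer, primary_key=True)
    name = Column(String)
    level = Column(Integer)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

# Seed a couple of illustrative rows so the query has something to return.
with Session(engine) as session:
    session.add_all([
        CharacterModel(character_id=1, name="Geralt", level=10),
        CharacterModel(character_id=2, name="Yennefer", level=12),
    ])
    session.commit()

def list_characters() -> str:
    # .query(...).all() fetches every row mapped to CharacterModel, then
    # json.dumps turns the list of dicts into the response body.
    with Session(engine) as session:
        characters = session.query(CharacterModel).all()
        return json.dumps([
            {"id": c.character_id, "name": c.name, "level": c.level}
            for c in characters
        ])

print(list_characters())
```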

Once again we are using a session object, but this time with the .query method, looking for all objects that match CharacterModel. Before we return the data to the user we prepare a JSON object out of the list of characters using the json.dumps method. Let's run a GET request to this endpoint.

I've created quite a few characters during testing, and we get the entire list of them.

The endpoints could use some more work in regards to validating the input data, auto-numbering the ID in the database and so on, but this is something I will do at a later date. In my next entry I will cover how to create an Azure PostgreSQL Flexible Server using Terraform. Thank you for reading and see you next time!

© 2025 KW Digital
