Deploying your Python Data Science project with Production-Grade Quality has never been easier!

A project template for deploying your Python application via FastAPI and NGINX in a Docker container

R&D Labs
4 min read · Jun 16, 2021

As software developers, we develop tools as stand-alone applications or as part of a larger product. Often, these tools can be useful in other applications too, whether they are small, such as automatic styling of company documents, or large: in fact, this template is based on our work for the Science Based Targets initiative's temperature scoring tool. Eventually, odds are that you would like to integrate your application into a product, a process, or another application.

At the same time, we do not always know in advance how or where our application will be used, and the target application might even be written in a different language. Therefore, we want to deploy our app in an independent, scalable, secure, and easily maintainable manner. In short, we want to be ready to deploy and integrate our application with production-grade quality.

The question is how to achieve that goal: How can we easily deploy and integrate our Python application into other applications with production-grade quality?

This article, together with our open-source template GitHub repository, provides an answer to that question and will help you to get up and running.

Photo by Markus Distelrath on Pixabay

Why Python?

We can keep this short. We have focused our template on Python applications because Python has a number of attributes that make it suitable for an application that we want to integrate elsewhere as a service.

  • It is one of the most widely used programming languages, has become the go-to language for data science and machine learning projects, and is one of the most popular languages for open-source development.
  • It is a great language to prototype with.
  • Many useful (open-source) libraries are available.

Microservices

The solution our template provides is to offer your application as a self-contained microservice that is accessible through an API. The microservice architecture is growing rapidly as a paradigm. For example, the global market for cloud microservices is predicted to increase at a compound annual growth rate of 21.37% between 2019 and 2026 according to Verified Market Research.

The main advantages of a microservice architecture, as opposed to a monolithic architecture, stem from its modularity and include:

  1. Small components offer an easier understanding of functionality and easier maintenance.
  2. The independence of microservices means that they can be monitored independently and scaled on demand.
  3. Integration in projects is easy with APIs.
  4. Microservices can be developed independently by different development teams, which offers flexibility in terms of technology and time.

As a result, microservices enjoy a shorter time-to-market. In addition, the microservice architecture is highly compatible with the recent movement to cloud-native technologies, so your app will be future-proof!

Containers

If microservices are the conceptual solution to our problem, the vehicle for its implementation is the container. In our template, we host the application’s API in a Docker container, which is currently the industry standard. (Docker) containers are extremely versatile and offer the following benefits:

  1. Containers can be run anywhere Docker is installed: locally, on on-premise servers, or in the cloud.
  2. Version and dependency consistency is ensured, so a container behaves the same on any system that runs it.
  3. Consumers can spin up your application on their premises, reducing complications regarding data security.
  4. Containers are one of the building blocks of a cloud-native architecture.
  5. Containers are simple to get started with.

The Template

The example application in our template is a simple machine learning model to predict the type of flower, based on its sepal and petal widths and lengths. We used scikit-learn’s decision tree classifier and trained it on scikit-learn’s iris data set.

The repository is structured as follows:

.
├── app                # Python application
│   ├── model          # App engine
│   └── main.py        # Definition of the HTTP methods
├── config             # Configuration of NGINX in Docker
└── Dockerfile         # Constructs the Docker image

The app directory contains our example application and API endpoints. We trained the model with train_model.py in the app/model directory and subsequently stored it as a pickle. The app/main.py file defines the HTTP methods of our API and uses our trained model to make predictions.
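For reference, a minimal sketch of what such a training script could look like is shown below. The exact contents of train_model.py in the template may differ, and the output filename model.pkl is an assumption made for this example:

# train_model.py -- illustrative sketch: train a decision tree on the
# iris data set and store it as a pickle next to this script.
import pickle
from pathlib import Path

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
model = DecisionTreeClassifier()
model.fit(iris.data, iris.target)  # features: sepal/petal lengths and widths

# Persist the trained classifier so the API can load it at startup.
with open(Path(__file__).parent / "model.pkl", "wb") as f:
    pickle.dump(model, f)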

For our API, we make use of the FastAPI library in Python. FastAPI is straightforward to use, fast (as the name indicates) and offers Swagger API documentation out of the box. For development purposes, you can serve the API on your local machine by running the main.py file from your IDE.
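To give an impression of how this fits together, here is a stripped-down sketch of what app/main.py could contain. The endpoint name, the response model, and the model.pkl filename are illustrative assumptions, not necessarily what the template uses:

# main.py -- illustrative sketch of a FastAPI app serving the pickled model.
import pickle
from pathlib import Path

import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Iris classifier")

# Load the trained model once, at startup.
with open(Path(__file__).parent / "model" / "model.pkl", "rb") as f:
    model = pickle.load(f)

IRIS_CLASSES = ["setosa", "versicolor", "virginica"]

class Prediction(BaseModel):
    flower_type: str

@app.get("/predict", response_model=Prediction)
def predict(sepal_length: float, sepal_width: float,
            petal_length: float, petal_width: float) -> Prediction:
    features = [[sepal_length, sepal_width, petal_length, petal_width]]
    label = int(model.predict(features)[0])
    return Prediction(flower_type=IRIS_CLASSES[label])

if __name__ == "__main__":
    # Development only: serve the API directly from your IDE.
    uvicorn.run(app, host="0.0.0.0", port=8000)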

The config directory contains the configuration files to set up NGINX in our Docker container. NGINX offers a broad set of functionality, such as web serving, reverse proxying, caching, load balancing, and more. In this case, it efficiently takes care of all logistics of handling requests that are sent to our endpoints, including logging.
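As an illustration of what such a reverse-proxy configuration might look like (the file name and port numbers below are assumptions rather than the template's exact settings), NGINX simply forwards incoming traffic to the app server inside the container:

# nginx.conf -- illustrative sketch: forward incoming traffic to the
# FastAPI app served by Uvicorn inside the same container.
server {
    listen 80;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}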

By building the Dockerfile, you create a Docker image, which serves as a blueprint for spinning up containers. You can spin up containers from a locally built image or from an image stored on Docker Hub, whose builds you can automate from your GitHub repository.
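To give an impression, a simplified Dockerfile could look roughly like the sketch below. The base image, the requirements.txt file, the ports, and the start-up command are assumptions; the template's actual Dockerfile may be organized differently:

# Dockerfile -- illustrative sketch: package the app, NGINX, and their
# dependencies into a single image.
FROM python:3.9-slim

# Install NGINX and drop the distribution's default site.
RUN apt-get update && apt-get install -y nginx \
    && rm -rf /var/lib/apt/lists/* \
    && rm -f /etc/nginx/sites-enabled/default
COPY config/nginx.conf /etc/nginx/conf.d/default.conf

WORKDIR /srv
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app ./app

EXPOSE 80
# Start NGINX in the background and serve the API with Uvicorn.
CMD service nginx start && uvicorn app.main:app --host 127.0.0.1 --port 8000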

Implement your application

To get started with your Python application, implement it in the app folder and adjust the endpoints to your needs. Please note that FastAPI expects your response objects to inherit from pydantic's BaseModel class, which helps you with type safety.
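For example, a response model for your own application could be defined along these lines (the endpoint and field names here are purely illustrative):

# Illustrative only: a custom response model for your own endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ScoreResponse(BaseModel):
    company_id: str
    score: float

@app.get("/score", response_model=ScoreResponse)
def score(company_id: str) -> ScoreResponse:
    # Replace this stub with your application's own logic.
    return ScoreResponse(company_id=company_id, score=1.5)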

Navigate to your project folder in the command-line interface and run the following command:

docker-compose up --build

After the build is finished, you should be able to see your API documentation at http://localhost:5000/docs.
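The docker-compose command relies on a compose file in the project root (not shown in the directory listing above). A minimal sketch of what such a file might look like, assuming NGINX listens on port 80 inside the container and host port 5000 is mapped to it, is:

# docker-compose.yml -- illustrative sketch, assuming NGINX listens on
# port 80 inside the container and the host maps port 5000 to it.
version: "3.8"
services:
  api:
    build: .
    ports:
      - "5000:80"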

Conclusion

We hope this generic setup to integrate Python applications through APIs hosted in Docker containers will serve as a useful tool in your arsenal, and allow you to successfully move your applications from the lab to the factory!
