Friday, July 21, 2023

Python FastAPI Microservices with Polylith

I like FastAPI. It's a good choice for developing Python APIs. FastAPI has a modern feel to it and is also easy to learn. When I think about it, I could say the same about Polylith.

Modern Python Microservices

Building Python services with FastAPI and Polylith is a pretty good combination. With FastAPI, you'll have a modern framework including tools like Pydantic. The simplicity of the Polylith Architecture - using bases & components - helps us build simple, maintainable, testable, and scalable services.

You will have a developer experience without the common Microservice struggles, such as code duplication or maintaining several different (possibly diverging) repositories.

Getting started

Okay, let's build something. I will write a simple CRUD service that handles messages. Since I'm starting from a clean sheet, I can choose where to begin the actual coding. I'd like to start with writing a function in a message module, and will figure out the solution while coding. The Polylith tooling support (totally optional) will help me add new code according to the structure of the architecture. I already have a Workspace prepared; have a look at the docs for how to set up a Polylith Workspace. Full example at: python-polylith-example
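Creating the component with the tooling could look something like this:

poetry poly create component --name message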

Great! Now I have a namespace package that is ready for coding. I'll continue with writing a create function. While working on it, I realize I need persistent storage. I decide to try out SQLAlchemy and its Session API.

from my_namespace.database import crud  # module paths are assumptions about the database component below
from my_namespace.database.core import Session


def create(content: str) -> int:
    with Session.begin() as session:
        data = crud.create(session, content)
        return data.id

Again, I will use the poly tool to create a database component, where I will put all the SQLAlchemy setup along with a model.
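Creating it follows the same pattern:

poetry poly create component --name database

In the new component, the function where data is added to the DB: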

from sqlalchemy.orm import Session

from my_namespace.database.models import Message  # assuming the model lives in a models module


def create(session: Session, content: str) -> Message:
    data = Message(content=content)

    session.add(data)
    session.flush()

    return data

In this example, I will go for SQLite to keep things simple. Use your favorite data storage here.

from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

# SQLite for simplicity; swap the URL for your data storage of choice
DATABASE_URL = "sqlite:///./sql_app.db"
# check_same_thread is needed when a SQLite connection is shared across threads
DATABASE_ARGS = {"check_same_thread": False}

engine = create_engine(DATABASE_URL, connect_args=DATABASE_ARGS)
Session = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
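The database component also needs the Message model used by the crud function. A minimal sketch, assuming the model lives in a models module of the database component and uses a single table with an id and a content column:

from sqlalchemy import Column, Integer, String

from my_namespace.database.core import Base  # assuming the setup above lives in core.py


class Message(Base):
    __tablename__ = "messages"

    id = Column(Integer, primary_key=True, index=True)
    content = Column(String, nullable=False)

For SQLite, you will also need to create the table (for example with Base.metadata.create_all(bind=engine)) before handling the first request.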

I now have the basics done and am ready to write the endpoint.

I could (of course) also have started from the endpoint, if I wanted to. Polylith allows you to postpone design decisions until you are ready to make them.

Before continuing, I will add the components to the packages section of the workspace pyproject.toml:

packages = [
    {include = "my_namespace/database", from = "components"},
    {include = "my_namespace/message", from = "components"},
]

The Base Plate

In Polylith, API endpoints are recommended to live in what is called a base. The idea is very much like LEGO: you build things with bricks and place them on a base plate. Yes, the FastAPI endpoint is the base plate. I will create a new base with the poly tool, and add the FastAPI code in there. It will import the message component I wrote before, and consume the create function.
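Creating the base follows the same pattern as for the components:

poetry poly create base --name message_api

In the base, I'll set up the FastAPI app and import the message component (the import path assumes the layout from earlier):

from fastapi import Body, FastAPI

from my_namespace import message

app = FastAPI()

And the endpoint itself: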

@app.post("/message/", response_model=int)
def create(content: str = Body()):
    return message.create(content)

Just as with the components, I will add the base to the packages section of the pyproject.toml:

{include = "my_namespace/message_api", from = "bases"},

I can now try out the API in the browser (I have already added the FastAPI dependency). Developing & trying out code is normally done from the development area (i.e. the root of the workspace). This is where everything is set up: all code and all dependencies in one single virtual environment. That makes for a really nice Developer Experience.
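Running the service locally could be done with uvicorn from the workspace root, assuming the FastAPI app object lives in the base's core module:

uvicorn my_namespace.message_api.core:app --reload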

Reusing existing code

I realize that I need something that will log what is happening in the service. Luckily, there's already a log component there that was added before, for a different service. I'll go ahead and import the logger into my endpoint and start logging.
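A minimal sketch of what that could look like in the endpoint module (the component name and its interface are assumptions here):

from my_namespace import log  # an existing component in the workspace (name assumed)

logger = log.get_logger(__name__)  # hypothetical interface of the log component


@app.post("/message/", response_model=int)
def create(content: str = Body()):
    logger.info("Creating a new message")
    return message.create(content)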

✨ This is where Polylith really shines. ✨ The components that you develop are ready to be re-used right from the start. There's no need to extract code into libraries or do anything extra. All previously written components are already there, and you just add them to your project. You might think that the logger is a silly little thing? Yes, it is. But it is only a simple example of code that is shared between projects in a Polylith Workspace.

The Project

There's one important part missing, and that is the project. A project is the artifact that will be deployed. It is a package of all the project-specific code, along with project-specific configuration. I will create it using the poly tool and then add the base to the project-specific pyproject.toml. Note the relative path to the bases folder below.
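Creating the project could look something like this:

poetry poly create project --name message_fastapi_project

And the packages section of the project's pyproject.toml: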

packages = [
    {include = "my_namespace/message_api", from = "../../bases"},
]

I'm feeling lazy and will use the poly tool to add the other bricks needed for this specific project (components and bases are often referred to as bricks). The poly tool will analyze the imports and add only the ones that are needed when running the poetry poly sync command. There's also a check command available that will report any missing bricks.
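From the workspace root, that looks like this:

poetry poly sync
poetry poly check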

You still need to add the third party dependencies, though. There are limits to the magic of the poly tool 🪄 🧙. To keep track of what is needed by the bricks, you can run poetry poly check again. It will notify you about the usage of SQLAlchemy in the bricks. You can use the poly check command during development to guide you towards the dependencies actually needed for this specific project.

Add sqlalchemy to the [tool.poetry.dependencies] section manually. You can update the lock-file for the project by running the builtin Poetry command:

poetry lock --directory projects/message_fastapi_project
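The project-specific dependencies section could then end up looking something like this (the version specifiers are just examples):

[tool.poetry.dependencies]
python = "^3.10"
fastapi = "*"
sqlalchemy = "*"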

All project-specific dependencies should now be set up properly! When packaging the project, simply run poetry build-project (with --directory projects/message_fastapi_project if you run the command from the workspace root). You can use the built wheel to deploy the FastAPI service. It contains all the needed code, packaged and re-arranged without any relative paths.
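From the workspace root, that would be:

poetry build-project --directory projects/message_fastapi_project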

What's in my Workspace?

Check the state of the workspace with the poetry poly info command. You will get a nice overview of the added projects and bricks, and the usage of bricks in each project.

In this article, I have written about the usage of Polylith when developing services. Adding new services is a simple thing when working in a Polylith workspace, and the tooling is there for a nice Developer Experience. Even if it sometimes might feel like a superpower, it's basically only about keeping things simple. Don't hesitate to reach out if you have feedback or questions.

Additional info

Full example at: python-polylith-example
Docs about Polylith: Python tools for the Polylith Architecture

Top photo by Yulia Matvienko on Unsplash

Monday, July 17, 2023

GCP Cloud Functions with Python and Polylith

Running a full-blown 24/7 service is sometimes not a good fit for a feature. Serverless functions (also known as Lambdas or Cloud Functions) are an alternative for those single-purpose kinds of features. The major cloud providers have their own flavors of Serverless, and Cloud Functions is the one on Google Cloud Platform (GCP).

GCP Cloud Functions & Polylith

Polylith is a good choice for this. The main use case for Polylith is to support having one or more Microservices, Serverless functions or apps, and being able to share code & deploy in a simple and developer-friendly way.

One thing to notice is that GCP Cloud Functions for Python expects a certain file structure when packaging the thing to deploy. There should be a main.py in the top directory and it should contain the entry point for the function. Here's an example, using the functions-framework from GCP to define the entry point.

  import functions_framework
  

  @functions_framework.http
  def handler(request):
      return "hello world"
  

You'll find more about Python Cloud Functions on the official GCP docs page.

In addition to the main.py, there should also be a requirements.txt in the top folder, defining the dependencies of the Cloud function. Other than that, it's up to you how to structure the code.

   /
   main.py
   requirements.txt
  

How does this fit in a Polylith workspace?

I would recommend creating a base containing the entry point.

    poetry poly create base --name my_gcp_function
  
Never heard about Polylith? Have a look at the documentation.

Polylith will create a namespace package in the bases directory. Put the handler code in the already created core.py file.

Make sure to export the handler function in the __init__.py of the base. We will use this one later, when packaging the cloud function.

    from my_namespace.my_gcp_function.core import handler

    __all__ = ["handler"]
  

If you haven’t already, create a project:

    poetry poly create project --name my_gcp_project
  

Just as with any other Polylith project, add the needed bricks to the packages section.

 [tool.poetry]
 packages = [{include = "my_namespace/my_gcp_function", from = "../../bases"}]

 # this is specific for GCP Cloud Functions:
 include = ["main.py", "requirements.txt"]
  

Packaging a Cloud Function

To package the project as a GCP Cloud Function, we will use the include property to make sure the needed main.py and requirements.txt files are included. These files will be generated by a deploy script (see further down in this article).

Just as we normally would when packaging a Polylith project, we will use the poetry build-project command. Before running that command, we need to perform a couple of additional tasks.

Polylith lives in a Poetry context, using a pyproject.toml to define dependencies, but we can export it to the requirements.txt format. In addition to that, we are going to copy the "interface" for the base (__init__.py) into a main.py file. This file will be put in the root of the built package. GCP will then find the entry point handler function.

To summarize:

  • Export the dependencies into a requirements.txt format
  • Copy the interface into a main.py
  • Run poetry build-project

Depending on how you actually deploy the function to GCP, it is probably a good idea to also rename the built wheel to "function.zip". A wheel is already a zipped folder, but with a specific file extension.

 # export the dependencies
 poetry export -f requirements.txt --output requirements.txt
   
 # copy the interface into a new main.py file
 cp ../../bases/my_namespace/my_gcp_function/__init__.py ./main.py

 # build the project
 poetry build-project
   
 # rename the built wheel
 mv dist/my_gcp_project-0.1.0-py3-none-any.whl dist/function.zip
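How you actually deploy the zip is up to you. As a rough sketch, assuming a first-generation HTTP-triggered function and that the zip has been uploaded to a Cloud Storage bucket first (the function name, region and bucket below are placeholders), a gcloud command could look something like this:

 gcloud functions deploy my-gcp-function \
   --region europe-west1 \
   --runtime python311 \
   --entry-point handler \
   --trigger-http \
   --source gs://my-bucket/function.zip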
 

That's all you need. You are now ready to write & deploy GCP Cloud Functions from your Polylith workspace. 😄

Additional info

Full example at: python-polylith-example
Docs about Polylith: Python tools for the Polylith Architecture

Top photo by Rodion Kutsaiev on Unsplash