Building a rate limiter in Python 3 with Falcon and running it via Docker


Having your REST services exposed publicly (and sometimes internally) can lead to bots or people abusing them, essentially performing a denial of service (whether intentional or not) on your application. Rate limiting is a fact of life in the world of REST APIs and microservices. If you're not familiar with it, rate limiting lets you control how often end users or systems can hit your service endpoints: for example, allowing them 30 calls per minute but no more.

In this article we will build a simple rate limiter in Python 3 and will utilize the Falcon framework.

Note: The rate limiting code here is mostly borrowed from the Falcon-PostgreSQL-API-Seed project (the link is in the source below). I've made a few modifications to the file.

For this article you will need Python 3.4+ installed as well as the latest Docker and Docker Compose. I’m using the Docker for Mac beta.

I created a virtual environment for my project and have my project files structured as follows (using PyCharm):

[Image: project structure in PyCharm]

As you can see from the image above (a plain-text version is sketched just below), you will need to add some files and a docker folder if you want to follow the same structure.
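If you don't have the screenshot to go by, a layout along these lines matches the file names used throughout the article (the top-level folder name is just whatever you called your project; requirements.txt is optional, see the note after the Dockerfile):

project/
    app.py
    rate_limiter.py
    requirements.txt
    docker/
        Dockerfile
        docker-compose.yml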

In the docker folder, create a Dockerfile and add the following:

FROM python:3.4.4

RUN pip install -U pip
RUN pip install -U falcon redis

EXPOSE 8080

COPY . /falconratelimiter
WORKDIR /falconratelimiter

CMD ["python", "/falconratelimiter/app.py"]

Note: You can also add a requirements.txt and use the following base image instead if you want to avoid the RUN commands that install the two packages.

FROM python:3.4.4-onbuild
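If you go that route, a minimal requirements.txt in the project root (the build context) would just list the two dependencies; the onbuild image picks it up and runs pip install against it for you:

falcon
redis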

In the same folder add a docker-compose.yml file with the following:

version: "2"
services:
  redis:
    image: redis
    ports:
      - "6379:6379"
  app:
    build:
      context: ../
      dockerfile: docker/Dockerfile
    depends_on:
      - redis
    ports:
      - "8080:8080"
    links:
      - redis

Above we pull the Redis image from Docker Hub and expose its default port. The app service builds from the Dockerfile we just created (using the project root as the build context), links to Redis, and exposes port 8080.

Let’s add the code starting with app.py:

import json
from wsgiref import simple_server

import falcon
from rate_limiter import RateLimiter


class HelloResource(object):
    def on_get(self, req, resp):
        resp.content_type = 'application/json'
        resp.status = falcon.HTTP_200
        resp.body = json.dumps({'message': 'hello'})

app = falcon.API(middleware=[RateLimiter(limit=2)])
hello = HelloResource()
app.add_route('/hello', hello)


if __name__ == '__main__':
    httpd = simple_server.make_server('0.0.0.0', 8080, app)
    httpd.serve_forever()

This is a very simple Falcon application and uses the wsgiref server, which is fine for testing and lightly loaded applications. For real-world apps you may want to check out Gunicorn or uWSGI (or others); a quick sketch follows below.
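As a rough sketch (and an assumption, since this article sticks with wsgiref), running the same app under Gunicorn only needs the package installed and the app object from app.py bound to port 8080:

$ pip install gunicorn
$ gunicorn --bind 0.0.0.0:8080 app:app

In the Docker setup, the equivalent change would be adding gunicorn to the pip install line in the Dockerfile and swapping the CMD for the gunicorn invocation.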

In the simple app I'm mapping just one route: /hello

I’ve set the default rate limit on the hello call to just 2 calls per minute. This makes it easy to test.

Time to add the code for the rate_limiter.py file:

# Most of this code is from:
# https://github.com/projectweekend/Falcon-PostgreSQL-API-Seed/blob/master/app/middleware/rate_limit.py
from time import time

import falcon
import redis


class RateLimiter(object):

    def __init__(self, limit=100, window=60):
        self.limit = limit
        self.window = window
        self.redis = redis.StrictRedis(host='redis', port=6379)

    def process_request(self, req, res):
        requester = req.env['REMOTE_ADDR']

        # un-comment if you want to ignore calls from localhost
        # if requester == '127.0.0.1':
        #     return

        key = "{0}: {1}".format(requester, req.path)
        print('Key: {0}'.format(key))

        try:
            remaining = self.limit - int(self.redis.get(key))
        except (ValueError, TypeError):
            remaining = self.limit
            self.redis.set(key, 0)

        expires_in = self.redis.ttl(key)

        if expires_in == -1:
            self.redis.expire(key, self.window)
            expires_in = self.window

        # Standard rate limit headers: calls remaining, the limit, and the reset time (epoch seconds)
        res.append_header('X-RateLimit-Remaining', str(remaining - 1))
        res.append_header('X-RateLimit-Limit', str(self.limit))
        res.append_header('X-RateLimit-Reset', str(time() + expires_in))

        if remaining > 0:
            self.redis.incr(key, 1)
        else:
            raise falcon.HTTPTooManyRequests(
                title='Rate Limit Hit',
                description='Blocked: Too many requests'
            )

A few things to note about the code above. The rate limit is stored per user (or service) IP address; this can easily be changed to key on a username, a token, etc. (a sketch follows below). The counter lives in Redis, its expiry is set with the EXPIRE command, and the time remaining in the window for any further rate limited calls is read back with the TTL command.
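For example, a minimal sketch of keying on a token instead of the address would swap out the requester lookup at the top of process_request; the X-Api-Token header name here is purely a hypothetical for illustration:

        # Prefer a (hypothetical) X-Api-Token header; fall back to the client IP address
        requester = req.get_header('X-Api-Token') or req.env['REMOTE_ADDR']

        key = "{0}: {1}".format(requester, req.path)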

Rate limit headers are returned with useful information: the number of calls remaining, the total limit you can make (per minute in this case), and the reset time after which you can call again without being rate limited.

Try it out with Docker Compose from a terminal in the docker folder:

$ docker-compose up

In another terminal (or tab) run cURL against the running service:

$ curl -X GET http://localhost:8080/hello -v

Response:

* Connected to localhost (::1) port 8080 (#0)
> GET /hello HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.43.0
> Accept: */*
> 
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Date: Wed, 22 Jun 2016 03:47:13 GMT
< Server: WSGIServer/0.2 CPython/3.4.4
< x-ratelimit-limit: 2
< content-length: 20
< x-ratelimit-reset: 1466567293.11422
< x-ratelimit-remaining: 1
< content-type: application/json
< 
{"message": "hello"}

Run the cURL command a couple more times until you get rate limited:

* Connected to localhost (::1) port 8080 (#0)
> GET /hello HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.43.0
> Accept: */*
> 
* HTTP 1.0, assume close after body
< HTTP/1.0 429 Too Many Requests
< Date: Wed, 22 Jun 2016 03:48:11 GMT
< Server: WSGIServer/0.2 CPython/3.4.4
< content-length: 82
< content-type: application/json; charset=UTF-8
< x-ratelimit-limit: 2
< x-ratelimit-reset: 1466567293.4633517
< x-ratelimit-remaining: -1
< vary: Accept
< 
{
    "title": "Rate Limit Hit",
    "description": "Blocked: Too many requests"
}

Wait until the reset time and try again, and you'll see you can once again hit the endpoint.
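If you'd rather script the wait than watch the clock, here's a small sketch using the requests library (an extra dependency, not used elsewhere in this article) that reads X-RateLimit-Reset and sleeps until the window rolls over:

import time

import requests

URL = 'http://localhost:8080/hello'

resp = requests.get(URL)
if resp.status_code == 429:
    # X-RateLimit-Reset holds a Unix timestamp for when the window expires
    reset_at = float(resp.headers.get('X-RateLimit-Reset', time.time()))
    time.sleep(max(0, reset_at - time.time()))
    resp = requests.get(URL)

print(resp.status_code, resp.json())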


