
How to connect Django to SingleStore DB

SingleStore is a high-performance database that is perfect for data-intensive applications. This article will show you how to configure Django to use SingleStore as a database.

Themba Mahlangu


If you're looking for a high-performance database for your Django application, you should consider SingleStore DB: a powerful, scalable database that is perfect for applications that need to handle large amounts of data. Some of the benefits of using SingleStore include:

  • Excellent performance: SingleStore is designed for high performance, and it delivers.

  • Scalability: SingleStore is highly scalable, so it can grow with your application.

  • Easy to use: it is (mostly) compatible with MySQL client software, which means you can use it with the Django ORM without modifying your code.

Prerequisites

  • SingleStore.com account and a license, or alternatively access to a SingleStore cluster.

  • Basic knowledge of Django

This post focuses on the basics of getting up and running with SingleStore and Django.

Installing SingleStore

SingleStore DB can be deployed on bare metal, VMs, or in the cloud using SingleStore tools (preferred) or configuration management tools. In this section, we will briefly look at two methods of deploying a cluster. You can find more deployment instructions here.

If you just want to give it a spin, sign up for the free trial on SingleStore. To try the self-hosted version on a VM with enough power, set it up on DigitalOcean. You can also run a cluster locally using Docker. Unfortunately, the Docker image currently does not work on ARM chips; you will have to use one of the other methods to create your cluster.

If you already have your cluster up and running you can skip ahead to the Configuring Django for SingleStore section.

Docker Compose

  • compose.yml

    yaml

version: '3'

volumes:
  sdb: {}
  pg: {}

services:
  django:
    build: .
    depends_on:
      - db
    volumes:
      - .:/app
    env_file:
      - .env
    ports:
      - "8000:8000"
    command: python3 manage.py runserver 0.0.0.0:8000

  db:
    image: postgres
    volumes:
      - pg:/var/lib/postgresql/data
    env_file:
      - .env

  singlestore:
      image: 'memsql/cluster-in-a-box'
      ports:
        - 3306:3306
        - 8080:8080
      env_file:
        - .env
      volumes:
        - sdb:/var/lib/memsql

Notes

Notice we are using both Postgres and SingleStore DB. More on that later.

env_file: .env - this refers to the environment variables file below. Do not commit this file to your repository.

  • .env

    text

# postgres
POSTGRES_DB=postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres

# Single store
LICENSE_KEY=your-license-key-here-from-theconsole==
SINGLESTORE_DB_USER=root
SINGLESTORE_DB_NAME=analytics_db
SINGLESTORE_DB_PASSWORD=password
SINGLESTORE_DB_HOST=your-host-db
ROOT_PASSWORD=password
START_AFTER_INIT=Y

Notes

LICENSE_KEY - this is the 'on-prem' license that can be retrieved from the SingleStore console.

Create a SingleStore Cluster on DigitalOcean

Create a droplet on DigitalOcean. Your droplet should have at least 4 cores and 8 GB of RAM. If you don't already have an account, you can sign up and get up to $100 in free credits.

After creating the droplet, log into it via SSH and continue with the installation. This section assumes you have knowledge of Linux.

  • shell

    shell

$ wget -O - 'https://release.memsql.com/release-aug2018.gpg' \
  2>/dev/null | sudo apt-key add - && apt-key list

# verify apt-transport-https is installed
$ apt-cache policy apt-transport-https

# if not installed, you will have to install it.
$ sudo apt -y install apt-transport-https

# add the repo to retrieve packages
$ echo "deb [arch=amd64] https://release.memsql.com/production/debian \
  memsql main" | sudo tee /etc/apt/sources.list.d/memsql.list
  
# install packages. 
$ sudo apt update && sudo apt -y install singlestoredb-toolbox  \
  singlestore-client singlestoredb-studio

Notes

The shell commands will install the management tools, client application, and SingleStore DB studio. The full instructions for installing SingleStore on Linux are on this page.

  • cluster.yaml

    yaml

license: your key==
high_availability: false
optimize: true
memsql_server_version: 7.8.5
hosts:
  - hostname: 127.0.0.1
    nodes:
      - role: Master
        config:
          port: 3306
      - role: Leaf
        config:
          port: 3307
    localhost: true

Notes

license: - retrieve this from your SingleStore account dashboard.

high_availability: false - we have set this to false; if you are running production workloads, see the docs for more info.

  • shell

    shell

$ sdb-deploy setup-cluster --cluster-file cluster.yaml

# enable and start services
$ sudo systemctl enable singlestoredb-studio.service
$ sudo systemctl start singlestoredb-studio

# open firewall ports
$ ufw allow 8080
$ ufw allow 3306
$ ufw enable

Notes

sdb-deploy setup-cluster --cluster-file cluster.yaml - this deploys a cluster using the configuration we defined in cluster.yaml.

sudo systemctl commands - enable and start the singlestoredb-studio service.

ufw allow 8080, ufw allow 3306 - open firewall ports 8080 and 3306 to allow incoming traffic; see this article for more information about using ufw.

Go to your browser: the UI should be running on your-droplet-ip:8080, or localhost:8080 if you are using Docker for your cluster. You should see the cluster setup screen.

Notes

Set the port to 3306 and the host to 127.0.0.1.

Add the username (root/admin) and the password you used when creating the cluster.

Add the cluster name and a description.

Configuring Django for SingleStore

SingleStore does not support foreign keys. This means that you cannot (easily) migrate some of Django's native apps, e.g. the Django admin, Authentication, and Contenttypes, without heavily modifying the code. A good workaround is to use SingleStore for your data-intensive apps/work and let Postgres handle the rest. Fortunately, Django can readily handle just this scenario.

Installing Dependencies

  • requirements.txt

    text

django==4.0
django_tables2==2.4.1
django-htmx==1.9.0
mysqlclient==2.1.0
django_mysql==4.7.0
psycopg2==2.9.3

Notes

django_mysql extends Django's built-in MySQL support with features not available on other databases. mysqlclient is the Python interface for MySQL.

psycopg2 is the driver for Postgres.

  • Dockerfile

    dockerfile

FROM python:3.9-buster

RUN apt-get update && \
    apt-get install python3-dev \
    default-libmysqlclient-dev build-essential -y

ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY requirements.txt /app/
RUN pip install -r requirements.txt
COPY . /app/

Notes

include default-libmysqlclient-dev in your Dockerfile

  • settings.py

    python

DATABASES = {
    "default": {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    },
    "analytics_db": {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': os.getenv("SINGLESTORE_DB_HOST", 'singlestore-host'),
        'NAME': os.getenv("SINGLESTORE_DB_NAME", 'analytics_db'),
        'USER': os.getenv("SINGLESTORE_DB_USER", 'root'),
        'PASSWORD': os.getenv("SINGLESTORE_DB_PASSWORD", 'your-db-password'),
        'PORT': '3306',
        'OPTIONS': {
            'auth_plugin': 'mysql_native_password',
        }
    }
}

DATABASE_ROUTERS = [
    'apps.analytics.router.AnalyticsRouter',
]

Notes

the DATABASES setting defines the configuration for the default postgres db and the analytics database.

the DATABASE_ROUTERS list includes a router for the analytics app which we will add next. The router determines which database to use when performing a query.
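To make the routing behaviour concrete, here is a simplified, standalone sketch of how Django walks the DATABASE_ROUTERS chain. This is not Django's actual implementation; the toy router and stand-in model objects are hypothetical.

```python
from types import SimpleNamespace

def resolve_db_for_read(routers, model):
    """Mimic Django's router chain: the first router that returns a
    non-None alias wins; if no router answers, fall back to 'default'."""
    for router in routers:
        db = router.db_for_read(model)
        if db is not None:
            return db
    return 'default'

class ToyAnalyticsRouter:
    """Simplified stand-in for the analytics router."""
    def db_for_read(self, model):
        if model._meta.app_label == 'analytics':
            return 'analytics_db'
        return None

# Stand-in model objects: only _meta.app_label matters here.
analytics_model = SimpleNamespace(_meta=SimpleNamespace(app_label='analytics'))
auth_model = SimpleNamespace(_meta=SimpleNamespace(app_label='auth'))

routers = [ToyAnalyticsRouter()]
print(resolve_db_for_read(routers, analytics_model))  # analytics_db
print(resolve_db_for_read(routers, auth_model))       # default
```

Queries against analytics models end up on analytics_db; everything else falls through to the default Postgres database.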

Create the database router

  • apps/analytics/router.py

    python

class AnalyticsRouter:
    """
    A router to control all database operations on models in the
    analytics application.
    """
    route_app_labels = {'analytics'}

    def db_for_read(self, model, **hints):
        """
        Attempts to read analytics models go to analytics_db.
        """
        if model._meta.app_label in self.route_app_labels:
            return 'analytics_db'
        return None

    def db_for_write(self, model, **hints):
        """
        Attempts to write analytics models go to analytics_db.
        """
        if model._meta.app_label in self.route_app_labels:
            return 'analytics_db'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        """
        Disallow relations when a model in the analytics app is
        involved.
        """
        if (
            obj1._meta.app_label in self.route_app_labels
            or obj2._meta.app_label in self.route_app_labels
        ):
            return False
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        Make sure the analytics app only appears in the
        'analytics_db' database.
        """
        if app_label in self.route_app_labels:
            return db == 'analytics_db'
        return None

Notes

The database router class provides up to four methods.

db_for_read - the database to use for read operations on models in the analytics app.

db_for_write - the database to use for write operations; here, any model in the analytics app goes to analytics_db.

allow_relation - blocks relations involving the analytics app, since SingleStore does not support foreign key relationships.

allow_migrate - determines whether a migration should run on the database with the alias db.
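Because the router methods are plain Python, you can sanity-check them without touching a database. The sketch below re-declares a trimmed copy of the router's allow_migrate so it runs standalone; it is illustrative, not the full router from this article.

```python
class AnalyticsRouter:
    """Trimmed copy of the router, kept standalone for testing."""
    route_app_labels = {'analytics'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # analytics migrations may only run against analytics_db;
        # for every other app we return None (no opinion).
        if app_label in self.route_app_labels:
            return db == 'analytics_db'
        return None

router = AnalyticsRouter()
print(router.allow_migrate('analytics_db', 'analytics'))  # True
print(router.allow_migrate('default', 'analytics'))       # False
print(router.allow_migrate('default', 'auth'))            # None
```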

Create the database in SingleStore

The last step before you can start developing is to create your database in SingleStore. You can do this from the SingleStore UI. This is the only bit of SQL you have to write.

sql

CREATE DATABASE analytics_db;

Start Developing

  • apps/analytics/models.py

    python

from django.db import models

class CityTemperature(models.Model):
    region = models.CharField(max_length=300)
    country = models.CharField(max_length=300)
    state = models.CharField(max_length=300)
    city = models.CharField(max_length=300)
    day = models.CharField(max_length=3)
    month = models.CharField(max_length=20)
    year = models.CharField(max_length=4)
    avg_temperature = models.DecimalField(max_digits=10, decimal_places=2)

  • shell

    shell

$ docker-compose run --rm django python3 manage.py makemigrations
$ docker-compose run --rm django python3 manage.py migrate
$ docker-compose run --rm django python3 manage.py migrate \
 analytics --database=analytics_db

Notes

create a model in your analytics app (this can be anything you want).

makemigrations and migrate are the usual management commands for creating and applying migrations.

the last command, migrate analytics --database=analytics_db, will apply only the analytics app's migrations to the analytics_db database.

A note on running migrations

And that's it, you can now continue developing your Django app as usual. Just remember to specify the database if you want to apply migrations to the SingleStore database.
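As an example of the day-to-day workflow, the sketch below parses a small CSV of temperature readings into dicts matching the CityTemperature fields. The sample data and the parse_rows helper are hypothetical, and the final bulk_create call is shown as a comment because it needs a running Django project.

```python
import csv
import io
from decimal import Decimal

SAMPLE = """region,country,state,city,day,month,year,avg_temperature
Africa,South Africa,Gauteng,Johannesburg,01,January,1995,67.80
Africa,South Africa,Gauteng,Johannesburg,02,January,1995,68.40
"""

def parse_rows(fileobj):
    """Parse a CSV file object into dicts matching the model fields."""
    rows = []
    for rec in csv.DictReader(fileobj):
        # DecimalField expects Decimal, not str.
        rec['avg_temperature'] = Decimal(rec['avg_temperature'])
        rows.append(rec)
    return rows

rows = parse_rows(io.StringIO(SAMPLE))
print(len(rows))        # 2
print(rows[0]['city'])  # Johannesburg

# In a Django shell, these dicts could then be inserted with:
# CityTemperature.objects.bulk_create(CityTemperature(**r) for r in rows)
```

Because of the router, the bulk_create would be written to analytics_db without any extra arguments.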

Further reading

Article from usefathom.com, the reason we went down this road. The company uses SingleStore in production; we highly recommend the blog.

A post from Synaptic, an interesting article with insights into why they chose SingleStore.

Django docs on multiple databases

Check out the Pipelines feature on SingleStore; you can use it to load the city temperatures data into the database.