The steps below will get you up and running with a local development environment. All of these commands assume you are in the root of your generated project.
If you’re new to Docker, please be aware that some resources are cached system-wide and might reappear if you generate a project multiple times with the same name.
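If stale volumes from a previously generated project with the same name cause trouble, you can remove the stack together with its named volumes. A minimal sketch, assuming this project's local.yml compose file (the prune step is optional and affects every Docker project on your machine):

```shell
# Stop the stack and remove its containers plus named volumes
# (this deletes your local database data).
docker-compose -f local.yml down --volumes

# Optional, system-wide: remove dangling images and build cache.
# Note: this affects all Docker projects on the machine.
docker system prune
```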
Docker & Docker Compose
If you don’t have them yet, follow the installation instructions for Docker and Docker Compose.
For development without Docker, see this page.
The project includes a Makefile with a set of convenience commands to help you get started.
Build the Stack
This can take a while, especially the first time you run this particular command on your development system:
$ make build

# the command above is a shortcut for
$ docker-compose -f local.yml build
This brings up Django, a worker, PostgreSQL, Redis and Mailhog. The first time it is run it might take a while to get started, but subsequent runs will occur quickly.
Run the Stack
Open a terminal at the project root and run the following for local development:

$ docker-compose -f local.yml up

# or use the shortcut
$ make run
Go to http://localhost:8000 and you should see the default landing page!
Create your first user
The Vanty Starter Kit sign-up flow works out of the box: by default you can already register and log in as a user.
However, for development you need to create a superuser that can access the Django admin and the Control Panel.
We added a modified createsuperuser command that will also create a tenant for the new superuser.
# use the shortcut
$ make verified-superuser
The command above will prompt you for a password and automatically create a superuser using the email you specified when generating the project. To use another email, run the command manually as shown below.
$ docker-compose -f local.yml run --rm django python manage.py create_verified_superuser email@example.com
Docker Tips & Tricks
Compose file and environments
By default, Docker Compose will use a docker-compose.yml or compose.yml file in the root of your folder. You can change this by setting the COMPOSE_FILE environment variable to point to another file, for example:
$ export COMPOSE_FILE=local.yml
You can now run the following and Docker Compose will use the correct file:
$ docker-compose up
To run in detached (background) mode:
$ docker-compose up -d
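Once the stack is running in the background you will eventually want to inspect or stop it. Two common follow-ups, assuming the same local.yml compose file:

```shell
# Follow the logs of the django service
docker-compose -f local.yml logs -f django

# Stop and remove the containers (named volumes are preserved)
docker-compose -f local.yml down
```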
Executing Management Commands
As with any shell command that we wish to run in our container, this is done using the docker-compose -f local.yml run --rm prefix:

$ docker-compose -f local.yml run --rm django python manage.py migrate
$ docker-compose -f local.yml run --rm django python manage.py createsuperuser

Here, django is the target service we are executing the commands against.
Docker Compose file
The local environment brings up several containers that are essential for local development: cache, web app, database, worker and mail server. That is a lot, but Docker Compose abstracts it away so that you don’t have to manage each one manually.
We will include a few excerpts from your project’s local.yml and highlight the important bits.
Database service (excerpt local.yml)
postgres:
  build:
    context: .
    dockerfile: ./compose/production/postgres/Dockerfile
  volumes:
    - local_postgres_data:/var/lib/postgresql/data
    - local_postgres_data_backups:/backups
  env_file:
    - ./.envs/.local/.postgres
Let’s look at the env_file section. Generally, the stack’s behavior is governed by a number of environment variables (envs, for short) residing in .envs/. For instance, this is what is generated for you:
.envs
├── .local
│   ├── .django
│   └── .postgres
└── .production
    ├── .django
    └── .postgres
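To see how these files are consumed, here is a small sketch that emulates what Compose's env_file option does: it makes each KEY=VALUE line available in the service's environment. The file contents below are placeholders, not your generated values:

```shell
# Sketch: emulate what docker-compose's env_file option does.
# Values below are placeholders, not your generated secrets.
cd "$(mktemp -d)"
mkdir -p .envs/.local
cat > .envs/.local/.postgres <<'EOF'
POSTGRES_HOST=postgres
POSTGRES_DB=my_project
POSTGRES_USER=debug
POSTGRES_PASSWORD=debug
EOF

# Export every variable defined in the file, as the container would see it.
set -a
. ./.envs/.local/.postgres
set +a

echo "POSTGRES_DB is $POSTGRES_DB"
```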
By convention, for any service in a given environment (you can tell an environment exists when there is a matching compose file, such as local.yml, in the project root), if that service requires configuration, a corresponding service configuration file exists, for example .envs/.local/.postgres for the postgres service in the local environment.
Consider the aforementioned .envs/.local/.postgres:
# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_HOST=postgres
POSTGRES_DB=<your project slug>
POSTGRES_USER=XgOWtQtJecsAbaIyslwGvFvPawftNaqO
POSTGRES_PASSWORD=jSljDz4whHuwO3aJIgVBrqEml5Ycbghorep4uVJ4xjDYQu0LfuTZdctj7y0YcCLu
The envs we are presented with here are POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER and POSTGRES_PASSWORD (by the way, their values have also been generated for you). You might have figured out already where these definitions will end up; it’s all the same with the django service container envs.
One final touch: should you ever need to merge your production envs into a single file, run:

$ python merge_production_dotenvs_in_dotenv.py

A .env file will then be created, with all your production envs residing beside each other.
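If you are curious what the merge amounts to, it is roughly a concatenation of the production env files into one .env. A minimal shell sketch with sample values (not the script's actual implementation):

```shell
# Sketch of what merging the production dotenvs amounts to:
# concatenating .envs/.production/* into a single .env file.
cd "$(mktemp -d)"
mkdir -p .envs/.production
printf 'DJANGO_SECRET_KEY=sample\n' > .envs/.production/.django
printf 'POSTGRES_DB=my_project\n'   > .envs/.production/.postgres

cat .envs/.production/.django .envs/.production/.postgres > .env

cat .env
```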
Django and worker service
django:
  build:
    context: .
    dockerfile: ops/compose/local/django/Dockerfile
  image: advantchdemo_local_django
  restart: unless-stopped
  depends_on:
    - postgres
    - mailhog
  volumes:
    - .:/app:z
  env_file:
    - ./.envs/.local/.django
    - ./.envs/.local/.postgres
  ports:
    - "8000:8000"
  command: /start

worker:
  image: advantchdemo_local_django
  command: python manage.py run_huey -w3
  env_file:
    - ./.envs/.local/.django
    - ./.envs/.local/.postgres
  depends_on:
    - postgres
    - django
    - mailhog
  volumes:
    - .:/app:z
In addition to the web container, the compose file will also bring up a worker service for running async tasks. The worker container is essentially a copy of the django service, except that instead of running a server, it runs a worker process. If this is consuming too many system resources, you can also use a Procfile-based approach to run all of this in one container.
When developing locally you can use MailHog for email testing, provided the corresponding option was enabled on setup. To proceed, make sure the mailhog container is up and running, then open the MailHog web interface (by default at http://localhost:8025) to view outgoing emails.