{% extends 'workflows/base_shell.html' %} {% load static %} {% block title %}Developer Handbook{% endblock %} {% block extra_css %} {% endblock %} {% block shell_body %} {% include 'workflows/includes/app_header.html' with header_show_home=1 header_inside_shell=1 %}

Developer Handbook

Project Wiki

Engineering runbook for developing, deploying, maintaining, and extending the current company portal installation.

Overview Structure Workflow Local Dev Docker Database Guidelines Translations PDF Pipeline Email Pipeline Nextcloud Builders Testing Backup Hosts & Domains CI/CD Deployment Commands Troubleshooting Security

1) Overview

This handbook is for developers and maintainers. It documents the actual engineering workflow of the standalone product repository.

2) Repository Structure

3) Working Model

Branch strategy

Normal change flow

  1. start from develop
  2. implement the change
  3. run validation
  4. update docs if architecture or workflow changed
  5. push to GitHub and let CI run
  6. deploy from the Mac if test-server verification is needed
  7. promote develop into main when stable

4) Local Development Workflow

Start

cd /path/to/workdock-platform
docker compose up -d --build

Main URLs

Bootstrap users

5) Docker Operations

docker compose up -d --build
docker compose restart web
docker compose restart worker
docker compose logs --no-color --tail=120 web
docker compose logs --no-color --tail=120 worker
docker compose down
docker compose down -v
The source code is bind-mounted into the container, so most template, view, and static-file changes only require a web restart, not a full rebuild. Image-level changes such as new system packages require docker compose up -d --build. Be aware that docker compose down -v also removes named volumes and therefore any data stored in them.

6) Database and Migrations

docker compose exec -T web python manage.py makemigrations
docker compose exec -T web python manage.py migrate
docker compose exec -T web python manage.py check

Role and Permission Model

7) Engineering Guidelines

Core rules

  1. Preserve behavior while refactoring.
  2. Prefer shared components over page-local special cases.
  3. Do not overwrite environment-specific runtime config as a side effect of code deploys.
  4. Keep code-driven behavior and data-driven behavior mentally separate.
  5. Update documentation in the same branch when operational workflow changes.

Code vs data

8) Translation Workflow

Standard Django i18n path

make i18n-update-en
make i18n-compile

Equivalent raw commands:

docker compose exec -T web django-admin makemessages -l en
docker compose exec -T web django-admin compilemessages

9) PDF Pipeline

xhtml2pdf is sensitive to layout complexity. Keep print templates conservative and verify every structural change with a real generated PDF.
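A minimal render smoke check can catch layout regressions before they reach a real document. The sketch below is illustrative, not code from the repository: build_print_html and render_pdf are hypothetical names, and render_pdf assumes xhtml2pdf is installed.

```python
# Hypothetical smoke check for print templates: render conservative HTML to a
# PDF in memory and fail loudly if xhtml2pdf reports errors.
import io

def build_print_html(title, rows):
    # Keep the layout conservative: plain tables, no floats, no flexbox.
    cells = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in rows)
    return f"<html><body><h1>{title}</h1><table>{cells}</table></body></html>"

def render_pdf(html):
    from xhtml2pdf import pisa  # assumption: xhtml2pdf is available
    buf = io.BytesIO()
    result = pisa.CreatePDF(html, dest=buf)
    if result.err:
        raise RuntimeError("xhtml2pdf reported %d error(s)" % result.err)
    return buf.getvalue()
```

Running such a check after every structural template change is cheaper than discovering a broken layout in a generated onboarding PDF.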

10) Email Pipeline

11) Nextcloud Integration

12) Branding

12b) App Registry

12c) Trial Lifecycle

13) Builder Architecture

Form Builder

Intro Builder

Dynamic content should use explicit DE/EN fields with German fallback, not machine translation at runtime.

Audit Trail

14) Testing and Validation

docker compose exec -T web python manage.py check
docker compose exec -T web python manage.py test
docker compose exec -T web python manage.py run_staging_e2e_check

15) Backup and Restore

make backup-create
make backup-verify BACKUP_DIR=backups/backup_YYYYmmdd_HHMMSS

16) Host and Domain Configuration

17) CI/CD

Current operating model

Why deploy is manual right now

The test server sits inside the local network on the private IP address 192.168.2.55. GitHub-hosted runners on the public internet cannot reliably reach that target. Because of that, the correct deployment path today is:

  1. push code to GitHub
  2. let GitHub run CI
  3. deploy from the Mac on the same LAN

Automatic CD from GitHub becomes appropriate only after moving to a public server or using a self-hosted runner inside the LAN.

What to do for normal work

  1. Start from develop.
  2. Do the implementation work.
  3. Push to GitHub.
  4. Let CI finish.
  5. Run the local test deployment helper from the Mac.
  6. Verify the updated version in the browser.
  7. When the integration line is stable, merge develop into main.

One-command test deployment

From the Mac on the same network:

git checkout develop
./scripts/deploy_test_from_mac.sh

This helper script does all of the following:

  1. checks that the current branch is develop
  2. fast-forwards from origin/develop
  3. checks that the server env file exists
  4. syncs the repository to /opt/workdock with rsync
  5. preserves server-local env files like .env.test and .env.prod
  6. runs the remote deployment script
  7. waits for the health endpoint
  8. prints the deployed commit and branch
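Steps 4 and 5 hinge on rsync protecting server-local env files while still deleting stale repository files. A sketch of how such an argument list could be assembled (host, paths, and excludes are assumptions, not the actual helper script):

```python
# Illustrative rsync invocation builder: sync the repo with --delete while
# protecting server-local env files from both deletion and overwrite.
def rsync_argv(src, host, dest, protected=(".env.test", ".env.prod")):
    argv = ["rsync", "-az", "--delete"]
    for name in protected:
        # --filter "P name" protects existing receiver-side files from
        # --delete; --exclude keeps local copies from being pushed over them.
        argv += ["--filter", f"P {name}", "--exclude", name]
    argv += [f"{src}/", f"root@{host}:{dest}/"]
    return argv
```

Building the argv as a list (rather than a shell string) avoids quoting bugs when the command is handed to a process runner.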

Current test server values

GitHub Actions status

If the local deploy helper fails

  1. Check whether /opt/workdock/.env.test still exists on the server.
  2. Check SSH access from the Mac:
    ssh -4 root@192.168.2.55
  3. Check server health directly:
    curl -I http://192.168.2.55:8088/healthz/
  4. Check container status:
    ssh root@192.168.2.55 "cd /opt/workdock && docker compose --env-file .env.test -f docker-compose.prod.yml ps"
The current LAN test deployment intentionally uses DJANGO_DEBUG=1 in .env.test: with DEBUG=0, the security checks correctly reject insecure cookie settings, and this deployment still serves plain HTTP in a local test topology. This is acceptable for the test box only. Production must run with HTTPS and DEBUG=0.
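One way to express that split is to derive the hardening flags from the debug flag itself. This is a settings sketch under stated assumptions, not the repository's actual settings module:

```python
# Sketch: derive cookie/HTTPS hardening from DJANGO_DEBUG so the LAN test box
# (DEBUG=1, plain HTTP) boots cleanly while production (DEBUG=0, HTTPS) gets
# secure defaults. Real settings modules may gate these differently.
import os

DEBUG = os.environ.get("DJANGO_DEBUG", "0") == "1"

SESSION_COOKIE_SECURE = not DEBUG
CSRF_COOKIE_SECURE = not DEBUG
SECURE_SSL_REDIRECT = not DEBUG
```

With this pattern there is a single switch to flip per environment, which reduces the risk of shipping a half-hardened .env file.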

18) Deployment

Test server stack

What the deploy script does

  1. Validate env file presence
  2. Build web, worker, and caddy
  3. Start db and redis
  4. Initialize writable volume ownership for media/static/backups
  5. Run migrations
  6. Run bootstrap_initial_users
  7. Run collectstatic
  8. Optionally run manage.py check
  9. Start web, worker, and caddy
  10. Wait until /healthz/ becomes healthy
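Step 10 is a bounded polling loop. The sketch below makes the probe injectable so the loop can be exercised without a running stack; in real use the probe would be an HTTP GET against /healthz/. Function and parameter names are illustrative.

```python
# Sketch of the health wait: poll until the probe succeeds or the deadline
# passes. clock and sleep are injectable for testing.
import time

def wait_for_health(probe, timeout=60.0, interval=2.0,
                    clock=time.monotonic, sleep=time.sleep):
    deadline = clock() + timeout
    while clock() < deadline:
        if probe():
            return True
        sleep(interval)
    return False
```

Returning False instead of raising lets the deploy script decide whether a failed health check should roll back or merely warn.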

Manual deploy

The preferred current test-deployment path is the local helper script from a Mac or another LAN-connected workstation:

./scripts/deploy_test_from_mac.sh

This script fast-forwards develop, checks that the remote env file exists, syncs the repo to the server with rsync, runs the remote deployment, verifies the health endpoint, and prints the deployed commit hash.

The script explicitly preserves server-local env files such as .env.test and .env.prod so deployment does not wipe machine-specific secrets.

Direct server-side deploy is still available if the code is already on the server:

cd /opt/workdock
RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" ./scripts/deploy_stack.sh .env.test docker-compose.prod.yml

Validation after deploy

curl -I http://192.168.2.55:8088/healthz/
ssh root@192.168.2.55 "cd /opt/workdock && docker compose --env-file .env.test -f docker-compose.prod.yml ps"

Runtime config sync

Deployment updates code. It does not automatically overwrite runtime database configuration. Use an explicit sync when local configuration should be compared with, or applied to, the server.
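The dry-run-then-apply contract behind the import commands can be sketched as follows. The flat dict model is an assumption for illustration; the real management commands operate on database rows.

```python
# Sketch of dry-run config import: report what would change, and only mutate
# the current configuration when apply=True.
import json

def import_config(current, incoming_json, apply=False):
    incoming = json.loads(incoming_json)
    changes = {k: (current.get(k), v)
               for k, v in incoming.items()
               if current.get(k) != v}
    if apply:
        current.update(incoming)
    return changes

cfg = {"portal_name": "WorkDock", "locale": "de"}
payload = json.dumps({"portal_name": "WorkDock", "locale": "en"})
```

A dry run (apply=False) returns the pending old/new pairs without touching cfg, which mirrors inspecting the --dry-run output before the real import.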

Supported sync scopes:

Export locally:

docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json
docker compose cp web:/tmp/portal-app-config.json /tmp/portal-app-config.json
docker compose cp web:/tmp/portal-deployment-config.json /tmp/portal-deployment-config.json

Copy the JSON files to the server host:

scp -4 /tmp/portal-app-config.json /tmp/portal-deployment-config.json root@192.168.2.55:/opt/workdock/

Because the server runs baked container images instead of a bind-mounted app tree, copy the files into the running web container before importing:

ssh -4 root@192.168.2.55 '
docker cp /opt/workdock/portal-app-config.json workdock-web-1:/tmp/portal-app-config.json &&
docker cp /opt/workdock/portal-deployment-config.json workdock-web-1:/tmp/portal-deployment-config.json
'

Dry-run the import first:

ssh -4 root@192.168.2.55 '
docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run &&
docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run
'

Only apply the import after the dry run looks correct:

ssh -4 root@192.168.2.55 '
docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json &&
docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json
'
Uploaded branding assets such as logo, favicon, and PDF letterhead are intentionally not included in deployment-config sync. They remain explicit media assets.

Proxmox / LXC requirement

The current server is an Ubuntu CT on Proxmox running Docker inside the container. The CT required Proxmox-side configuration before Docker containers could start correctly.

features: nesting=1,keyctl=1
lxc.apparmor.profile: unconfined
lxc.mount.entry: /dev/null sys/module/apparmor/parameters/enabled none bind 0 0

Those lines belong in /etc/pve/lxc/<CTID>.conf on the Proxmox host, followed by pct restart <CTID>.

Production expectations

Release checklist

  1. Run manage.py check
  2. Run tests or targeted verification
  3. Run translation compile step
  4. Rebuild containers if Python dependencies changed, then verify python -c "import requests" does not emit a compatibility warning
  5. Generate at least one onboarding/offboarding PDF if PDF templates changed
  6. Verify MailHog or SMTP path if email behavior changed
  7. Verify Nextcloud upload if integration behavior changed
  8. Update Project Wiki and Developer Handbook if architecture or operational workflow changed
  9. Take a snapshot commit before major next-phase work

19) Command Reference

Local development

docker compose up -d --build

Start or rebuild the local stack.

docker compose restart web
docker compose restart worker

Restart app services after code or template changes.

Validation

docker compose exec -T web python manage.py check

Run Django system checks.

docker compose exec -T web python manage.py test

Run the full test suite.

Local test deployment

./scripts/deploy_test_from_mac.sh

Sync the current develop checkout to the LAN test server and deploy it.

Direct server deployment

cd /opt/workdock
RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" ./scripts/deploy_stack.sh .env.test docker-compose.prod.yml

Deploy when code is already present on the server.

Config sync

docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json

Export runtime configuration from local.

docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run
docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run

Validate server-side config import before applying it.

Backup

make backup-create
make backup-verify BACKUP_DIR=backups/backup_YYYYmmdd_HHMMSS

Create and verify backup bundles.

20) Troubleshooting

21) Security and Maintenance Notes

For the next coder

{% endblock %}