diff --git a/DEPLOYMENT.md b/DEPLOYMENT.md
index 4d45bcd..f398469 100644
--- a/DEPLOYMENT.md
+++ b/DEPLOYMENT.md
@@ -211,6 +211,67 @@
 cd /opt/workdock
 RUN_DJANGO_CHECK=1 ./scripts/deploy_stack.sh .env.prod docker-compose.prod.yml
 ```
+## Runtime config sync
+Deployment updates code. It does not automatically overwrite runtime database configuration.
+
+Use explicit sync when you want local configuration to be compared or applied to another environment.
+
+### Supported sync scopes
+- `PortalAppConfig`
+- `PortalBranding`
+- `PortalCompanyConfig`
+
+### Step 1: export locally
+```bash
+docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
+docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json
+docker compose cp web:/tmp/portal-app-config.json /tmp/portal-app-config.json
+docker compose cp web:/tmp/portal-deployment-config.json /tmp/portal-deployment-config.json
+```
+
+### Step 2: copy JSON files to the server host
+```bash
+scp -4 /tmp/portal-app-config.json /tmp/portal-deployment-config.json root@192.168.2.55:/opt/workdock/
+```
+
+### Step 3: copy JSON files into the running web container
+The server uses baked images, not a bind-mounted app tree. Because of that, the running `web` container cannot automatically read arbitrary files from `/opt/workdock`.
+
+Use:
+```bash
+ssh -4 root@192.168.2.55 '
+  docker cp /opt/workdock/portal-app-config.json workdock-web-1:/tmp/portal-app-config.json &&
+  docker cp /opt/workdock/portal-deployment-config.json workdock-web-1:/tmp/portal-deployment-config.json
+'
+```
+
+### Step 4: dry-run the import on the server
+```bash
+ssh -4 root@192.168.2.55 '
+  docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run &&
+  docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run
+'
+```
+
+### Step 5: apply the import on the server
+Only do this if the dry run looks correct.
+
+```bash
+ssh -4 root@192.168.2.55 '
+  docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json &&
+  docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json
+'
+```
+
+### Notes
+- `PortalAppConfig` covers app order, section, visibility, and overrides.
+- Deployment-config sync covers branding/company text and metadata.
+- Uploaded branding files are intentionally excluded:
+  - logo
+  - favicon
+  - PDF letterhead
+- Use dry-run first. Treat config sync as an explicit operator action, not something hidden inside deploy.
+
 ## GitHub Actions workflows
 
 ### Test deployment workflow
 
 File:
@@ -406,6 +467,65 @@ For production, you may later want image-tag based rollback. That is not necessa
 - test deployment is intentionally weaker than production on transport security
 - production should not reuse the test env model
+## Command reference
+Use this as the short operational index.
+
+### Local development
+```bash
+docker compose up -d --build
+```
+Start or rebuild the local stack.
+
+```bash
+docker compose restart web
+docker compose restart worker
+```
+Restart the app services after code/template changes.
+
+### Validation
+```bash
+docker compose exec -T web python manage.py check
+```
+Run Django system checks.
+
+```bash
+docker compose exec -T web python manage.py test
+```
+Run the full test suite.
+
+### Local test deployment
+```bash
+./scripts/deploy_test_from_mac.sh
+```
+Sync the current `develop` checkout to the LAN test server and deploy it.
+
+### Direct server deployment
+```bash
+cd /opt/workdock
+RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" ./scripts/deploy_stack.sh .env.test docker-compose.prod.yml
+```
+Deploy when code is already present on the server.
+
+### Config export/import
+```bash
+docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
+docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json
+```
+Export runtime configuration from local.
+
+```bash
+docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run
+docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run
+```
+Validate server-side config import before applying it.
+
+### Backup
+```bash
+make backup-create
+make backup-verify BACKUP_DIR=backups/backup_YYYYmmdd_HHMMSS
+```
+Create and verify backup bundles.
+
 ## Current known-good state
 
 Validated manually:
 - repository pushed to private GitHub
diff --git a/backend/workflows/templates/workflows/developer_handbook.html b/backend/workflows/templates/workflows/developer_handbook.html
index e3b3c98..1668ffe 100644
--- a/backend/workflows/templates/workflows/developer_handbook.html
+++ b/backend/workflows/templates/workflows/developer_handbook.html
@@ -38,6 +38,7 @@
 Hosts & Domains
 CI/CD
 Deployment
+Commands
 Troubleshooting
 Security
@@ -484,6 +485,39 @@ RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" ./scripts/
curl -I http://192.168.2.55:8088/healthz/
ssh root@192.168.2.55 "cd /opt/workdock && docker compose --env-file .env.test -f docker-compose.prod.yml ps"
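For scripting, the health probe above can be wrapped in a small retry loop so a post-deploy check waits for the stack instead of failing on the first request. A minimal sketch; the attempt count and delay are assumptions, not values taken from `deploy_stack.sh`:

```bash
# wait_for <attempts> <delay_seconds> <command...>
# Retries a command until it succeeds or attempts run out (sketch only).
wait_for() {
  attempts=$1; delay=$2; shift 2
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  return 1
}

# Example: give the test stack up to ~30s after a deploy.
# wait_for 10 3 curl -fsS -I http://192.168.2.55:8088/healthz/
```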
+ Deployment updates code. It does not automatically overwrite runtime database configuration. Use explicit sync when you want local configuration compared or applied to the server.
+Supported sync scopes:
+PortalAppConfig
+PortalBranding
+PortalCompanyConfig
+Export locally:
+docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
+docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json
+docker compose cp web:/tmp/portal-app-config.json /tmp/portal-app-config.json
+docker compose cp web:/tmp/portal-deployment-config.json /tmp/portal-deployment-config.json
+ Copy the JSON files to the server host:
+scp -4 /tmp/portal-app-config.json /tmp/portal-deployment-config.json root@192.168.2.55:/opt/workdock/
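To confirm the copy arrived intact, the checksums on both sides can be compared. A sketch, assuming `sha256sum` is available on both ends (on macOS, `shasum -a 256` is the equivalent); the `same_checksum` helper is hypothetical:

```bash
# same_checksum <local_file> <expected_sha256>
# Succeeds when the file's SHA-256 matches the expected digest (sketch only).
same_checksum() {
  local_hash=$(sha256sum "$1" | awk '{print $1}')
  [ "$local_hash" = "$2" ]
}

# Usage: fetch the server-side digest, then compare against the local file.
# remote=$(ssh -4 root@192.168.2.55 'sha256sum /opt/workdock/portal-app-config.json' | awk '{print $1}')
# same_checksum /tmp/portal-app-config.json "$remote" && echo "copy verified"
```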
+ Because the server runs baked container images instead of a bind-mounted app tree, copy the files into the running web container before importing:
+ssh -4 root@192.168.2.55 '
+docker cp /opt/workdock/portal-app-config.json workdock-web-1:/tmp/portal-app-config.json &&
+docker cp /opt/workdock/portal-deployment-config.json workdock-web-1:/tmp/portal-deployment-config.json
+'
+ Dry-run the import first:
+ssh -4 root@192.168.2.55 '
+docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run &&
+docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run
+'
+ Only apply the import after the dry run looks correct:
+ssh -4 root@192.168.2.55 '
+docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json &&
+docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json
+'
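The copy, docker-cp, and dry-run steps above can be collected into one operator script. This is a sketch rather than a shipped tool: the `run` and `sync_config` helpers are hypothetical, the host and container name mirror the commands above, and `DRY_RUN=1` prints each command instead of executing it:

```bash
HOST="root@192.168.2.55"
CONTAINER="workdock-web-1"
FILES="portal-app-config.json portal-deployment-config.json"

# Print the command under DRY_RUN=1, otherwise execute it.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# Copy both exports to the host, into the container, then dry-run the imports.
# Applying for real stays a separate, deliberate step.
sync_config() {
  for f in $FILES; do
    run scp -4 "/tmp/$f" "$HOST:/opt/workdock/$f"
    run ssh -4 "$HOST" "docker cp /opt/workdock/$f $CONTAINER:/tmp/$f"
  done
  run ssh -4 "$HOST" "docker exec $CONTAINER python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run"
  run ssh -4 "$HOST" "docker exec $CONTAINER python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run"
}

# Usage: DRY_RUN=1 sync_config   # review the output, then rerun without DRY_RUN
```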
+ The current server is an Ubuntu CT on Proxmox running Docker inside the container. The CT required Proxmox-side configuration before Docker containers could start correctly.
features: nesting=1,keyctl=1
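Taken together, the Proxmox-side settings amount to a couple of lines in the CT's config file. A sketch, where the CT ID `101` is a placeholder and the apparmor mount entry is the one used on this host:

```
# /etc/pve/lxc/101.conf  (101 is a placeholder CT ID)
features: nesting=1,keyctl=1
lxc.mount.entry: /dev/null sys/module/apparmor/parameters/enabled none bind 0 0
```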
@@ -511,7 +545,50 @@ lxc.mount.entry: /dev/null sys/module/apparmor/parameters/enabled none bind 0 0<
Take a snapshot commit before major next-phase work
- 19) Troubleshooting
+ 19) Command Reference
+
+ Local development
+ docker compose up -d --build
+ Start or rebuild the local stack.
+ docker compose restart web
+docker compose restart worker
+ Restart app services after code or template changes.
+
+
+ Validation
+ docker compose exec -T web python manage.py check
+ Run Django system checks.
+ docker compose exec -T web python manage.py test
+ Run the full test suite.
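The two validation commands are normally run as a gate, stopping at the first failure. A minimal sketch; the `run_gate` helper is hypothetical, not part of the repo:

```bash
# run_gate <cmd> [<cmd>...]
# Runs each command in order and stops at the first failure (sketch only).
run_gate() {
  for cmd in "$@"; do
    if ! sh -c "$cmd"; then
      echo "gate failed: $cmd" >&2
      return 1
    fi
  done
  echo "gate passed"
}

# Usage:
# run_gate \
#   "docker compose exec -T web python manage.py check" \
#   "docker compose exec -T web python manage.py test"
```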
+
+
+ Local test deployment
+ ./scripts/deploy_test_from_mac.sh
+ Sync the current develop checkout to the LAN test server and deploy it.
+
+
+ Direct server deployment
+ cd /opt/workdock
+RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" ./scripts/deploy_stack.sh .env.test docker-compose.prod.yml
+ Deploy when code is already present on the server.
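Before a direct deployment, a quick pre-flight check can confirm the env file and compose file are actually present in `/opt/workdock`. A sketch; the `preflight` helper is hypothetical:

```bash
# preflight <dir> <file> [<file>...]
# Fails fast if any required deployment file is missing (sketch only).
preflight() {
  dir=$1; shift
  for f in "$@"; do
    if [ ! -f "$dir/$f" ]; then
      echo "missing: $dir/$f" >&2
      return 1
    fi
  done
}

# Usage:
# preflight /opt/workdock .env.test docker-compose.prod.yml && \
#   (cd /opt/workdock && RUN_DJANGO_CHECK=0 DEPLOY_HEALTH_URL="http://127.0.0.1:8088/healthz/" \
#     ./scripts/deploy_stack.sh .env.test docker-compose.prod.yml)
```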
+
+
+ Config sync
+ docker compose exec -T web python manage.py export_portal_app_config --output /tmp/portal-app-config.json
+docker compose exec -T web python manage.py export_portal_deployment_config --output /tmp/portal-deployment-config.json
+ Export runtime configuration from local.
+ docker exec workdock-web-1 python manage.py import_portal_app_config /tmp/portal-app-config.json --dry-run
+docker exec workdock-web-1 python manage.py import_portal_deployment_config /tmp/portal-deployment-config.json --dry-run
+ Validate server-side config import before applying it.
+
+
+ Backup
+ make backup-create
+make backup-verify BACKUP_DIR=backups/backup_YYYYmmdd_HHMMSS
+ Create and verify backup bundles.
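Because `backup-verify` takes a timestamped directory, a quick name check catches typos before the Makefile target runs. A sketch; the `valid_backup_dir` helper is hypothetical and only encodes the `backups/backup_YYYYmmdd_HHMMSS` pattern shown above:

```bash
# valid_backup_dir <path>
# Accepts names of the form backups/backup_YYYYmmdd_HHMMSS (sketch only).
valid_backup_dir() {
  case "$1" in
    backups/backup_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]_[0-9][0-9][0-9][0-9][0-9][0-9]) return 0 ;;
    *) return 1 ;;
  esac
}

# Usage:
# valid_backup_dir "$BACKUP_DIR" && make backup-verify BACKUP_DIR="$BACKUP_DIR"
```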
+
+
+ 20) Troubleshooting
- Page looks stale: restart web and hard-refresh browser
- Second request hangs: inspect web logs and verify health endpoint
@@ -522,7 +599,7 @@ lxc.mount.entry: /dev/null sys/module/apparmor/parameters/enabled none bind 0 0<
- Requests dependency warning appears: verify chardet==5.2.0 is installed in the rebuilt image and restart web/worker
- 20) Security and Maintenance Notes
+ 21) Security and Maintenance Notes
- Containers run as non-root app user.
- Keep secrets in .env, not in tracked files.