
Compare revisions

Changes are shown as if the source revision was being merged into the target revision.
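The "merged into target" wording corresponds to git's three-dot diff: the source tip is compared against its merge base with the target, not against the target tip. A scratch-repo sketch of the difference (branch names `main`/`feature` are placeholders for the revisions listed below):

```shell
#!/bin/sh
# Demo: two-dot vs three-dot diff in a throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email demo@example.com
git config user.name demo
echo base > file.txt
git add file.txt
git commit -qm base
git checkout -qb feature
echo feature-change >> file.txt
git commit -qam feature-change
git checkout -q main
echo target-only > other.txt
git add other.txt
git commit -qm target-only

# Two-dot: everything that differs between the two tips (includes target-only commits).
git diff --name-only main..feature
# Three-dot: only changes on feature since the merge base -- what this page shows.
git diff --name-only main...feature
```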

Source

  • 299-delete-planed-assemblies
  • 302-habitat-info
  • 607-schedule-versions
  • 720-schedule_source
  • add-django-ninja
  • andi/develop
  • andi/schedule-api
  • andi/speaker_import
  • apiv2-errors
  • apiv2-init
  • camp23-prod
  • chore/backoffice-list
  • chore/conference-singleton
  • chore/event-views
  • chore/singleton/badge
  • chore/singleton/markdown
  • chore/update-rooms
  • cyroxx/add_edit_links
  • cyroxx/bulletin_description
  • develop
  • editMail
  • feat/dynamic-link-forms
  • feat/unit-integration-tests
  • feature/RegisterSpeaker
  • feature/audit_log
  • feature/bg-eyecandy
  • feature/conference-query-set
  • feature/event_import_slugs_of_serial_event
  • feature/mqtt
  • feature/parallax-css-testpage
  • feature/pypy
  • feature/scheduleimport_skippedrooms
  • feature/show_vods
  • production
  • room-docu
  • stable-38c3
  • camp23-prod
  • camp23-prod-archive
  • prod-2024-10-14_22-49
  • prod-2024-10-20_23-27
  • prod-2024-10-29_09-45
  • prod-2024-10-31_13-17
  • prod-2024-11-01_11-14
  • prod-2024-11-02_21-16
  • prod-2024-11-03_01-42
  • prod-2024-11-03_12-10
  • prod-2024-11-16_03-41
  • prod-2024-12-04_00-57
  • prod-2024-12-05_00-48
  • prod-2024-12-05_10-09
  • prod-2024-12-10_00-17
  • prod-2024-12-10_07-23
  • prod-2024-12-10_23-04
  • prod-2024-12-14_03-24
  • prod-2024-12-16_02-27
  • prod-2024-12-17_15-05
  • prod-2024-12-19_02-32
  • prod-2024-12-20_12-25
  • prod-2024-12-21_10-44
  • prod-2024-12-21_13-42
  • prod-2024-12-22_00-55
  • prod-2024-12-22_01-34
  • prod-2024-12-22_17-25
  • prod-2024-12-22_21-12
  • prod-2024-12-23_23-39
  • prod-2024-12-24_14-48
  • prod-2024-12-25_01-29
  • prod-2024-12-25_15-54
  • prod-2024-12-25_21-04
  • prod-2024-12-26_00-21
  • prod-2024-12-26_13-12
  • prod-2024-12-26_21-45
  • prod-2024-12-27_00-34
  • prod-2024-12-27_13-29
  • prod-2024-12-27_16-01
  • prod-2024-12-27_16-37
  • prod-2024-12-27_20-15
  • prod-2024-12-27_21-15
  • prod-2024-12-28_02-32
  • prod-2024-12-28_12-24
  • prod-2024-12-28_18-32
  • prod-2024-12-29_02-25
  • prod-2024-12-29_02-55
  • prod-2024-12-29_03-20
  • prod-2024-12-29_03-32
  • prod-2024-12-29_20-35
  • prod-2024-12-30_03-16
  • prod-2024-12-30_12-40
  • prod-2024-12-31_09-54
  • prod-2025-01-07_13-15
  • prod-2025-01-20_00-20
  • prod-2025-01-21_22-00
  • prod-2025-01-21_22-46
  • prod-2025-04-18_22-42
94 results

Target

  • 299-delete-planed-assemblies
  • 302-habitat-info
  • 445-schedule-redirects
  • 511-schedule-foo-fixed
  • 607-schedule-versions
  • 623-wiki-im-baustellenmodus-sollte-mal-als-wiki-admin-trotzdem-seiten-anlegen-bearbeiten-konnen
  • 720-schedule_source
  • andi/develop
  • andi/schedule-api
  • andi/speaker_import
  • badge-redeem-404
  • camp23-prod
  • chore/event-views
  • cyroxx/add_edit_links
  • cyroxx/bulletin_description
  • deploy/curl-verbose
  • develop
  • editMail
  • feat/dynamic-link-forms
  • feat/unit-integration-tests
  • feature/568-habitatmanagement
  • feature/RegisterSpeaker
  • feature/audit_log
  • feature/bg-eyecandy
  • feature/conference-query-set
  • feature/mqtt
  • feature/parallax-css-testpage
  • feature/pypy
  • feature/scheduleimport_skippedrooms
  • fix/index
  • fix/public-badge-access-rights
  • fix/registration_mail_subject
  • ical-export
  • production
  • room-docu
  • camp23-prod
  • camp23-prod-archive
  • prod-2024-10-14_22-49
  • prod-2024-10-20_23-27
  • prod-2024-10-29_09-45
  • prod-2024-10-31_13-17
  • prod-2024-11-01_11-14
  • prod-2024-11-02_21-16
  • prod-2024-11-03_01-42
  • prod-2024-11-03_12-10
  • prod-2024-11-16_03-41
  • prod-2024-12-04_00-57
  • prod-2024-12-05_00-48
  • prod-2024-12-05_10-09
  • prod-2024-12-10_00-17
  • prod-2024-12-10_07-23
  • prod-2024-12-10_23-04
  • prod-2024-12-14_03-24
  • prod-2024-12-16_02-27
  • prod-2024-12-17_15-05
  • prod-2024-12-19_02-32
  • prod-2024-12-20_12-25
  • prod-2024-12-21_10-44
  • prod-2024-12-21_13-42
  • prod-2024-12-22_00-55
  • prod-2024-12-22_01-34
  • prod-2024-12-22_17-25
  • prod-2024-12-22_21-12
  • prod-2024-12-23_23-39
  • prod-2024-12-24_14-48
  • prod-2024-12-25_01-29
  • prod-2024-12-25_15-54
  • prod-2024-12-25_21-04
  • prod-2024-12-26_00-21
  • prod-2024-12-26_13-12
  • prod-2024-12-26_21-45
  • prod-2024-12-27_00-34
  • prod-2024-12-27_13-29
  • prod-2024-12-27_16-01
  • prod-2024-12-27_16-37
  • prod-2024-12-27_20-15
76 results

Commits on Source: 715

615 additional commits have been omitted to prevent performance issues.
1000 files changed: +39954 −29207

Files

+6 −0
@@ -10,3 +10,9 @@ src/hub/.settings.secret

# local media files
src/media/

.tools/
.vscode/
.venv*/

**/*_cache
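The new ignore entries can be sanity-checked with `git check-ignore`; the sample paths below are made up for the demo:

```shell
#!/bin/sh
# Demo: the four newly added .gitignore patterns, checked against sample paths.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
printf '%s\n' '.tools/' '.vscode/' '.venv*/' '**/*_cache' > .gitignore
# -v prints which pattern matched which path
git check-ignore -v .tools/report.txt .vscode/settings.json .venv311/bin/python src/app/ruff_cache
```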
+7 −9
@@ -2,18 +2,16 @@
root = true

[*]
indent_style = space
indent_size = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
end_of_line = lf
indent_size = 2
indent_style = space
max_line_length = 120
trim_trailing_whitespace = true

[*.md]
trim_trailing_whitespace = false

[*.{css,html,js,scss,j2}]
indent_size = 2

[*.yml]
indent_size = 2
[*.py, *.toml]
indent_size = 4
# Reformat with ruff
cad2f7373a716750d234842d93849af3014b72ad
# Reformat commits
dc0e95b225cf4db7aa4ff67a4bcd3572625eba56
14b7cb96e61e576ea62f5ee1b98d8bf4337895a8
## ruff
cad2f7373a716750d234842d93849af3014b72ad
## prettier
0bb342a92c644a25ca11b549c7a94c996c591279
## djLint
31839261af33014cbc7d5af93fcd1189d91d4755
# Split PlainUI views
caeff36b26f18a0732ddfcc752752529787a0632

.gitattributes (new file, 100644)
+4 −0
/.yarn/**            linguist-vendored
/.yarn/releases/*    binary
/.yarn/plugins/**/*  binary
/.pnp.*              binary linguist-generated
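`git check-attr` shows how the new .gitattributes rules resolve; the file names below are illustrative:

```shell
#!/bin/sh
# Demo: resolving the new .gitattributes entries for sample paths.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
printf '%s\n' \
  '/.yarn/**            linguist-vendored' \
  '/.yarn/releases/*    binary' \
  '/.yarn/plugins/**/*  binary' \
  '/.pnp.*              binary linguist-generated' > .gitattributes
git check-attr binary linguist-vendored -- .yarn/releases/yarn-4.6.0.cjs
git check-attr binary linguist-generated -- .pnp.cjs
```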
+10 −9
@@ -83,15 +83,6 @@ db.sqlite3-journal
profile_default/
ipython_config.py

# pdm
#   Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
#   pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
#   in version control.
#   https://pdm.fming.dev/#use-with-ide
.pdm.toml
.pdm-python

# Environments
.env
.venv
@@ -108,3 +99,13 @@ dmypy.json

# Cython debug symbols
cython_debug/

# yarn
.pnp.*
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
# SDKS are excluded as we do not want them in the repository
# !.yarn/sdks
!.yarn/versions
+150 −109
@@ -5,6 +5,7 @@ stages:
  - container-test
  - publish
  - deploy
  - cleanup

workflow:
  rules:
@@ -28,15 +29,20 @@ workflow:
# Use build cache to speed up CI
default:
  cache:
    - key:
        files:
          - uv.lock
      paths:
        - $UV_CACHE_DIR
    - key: "python-default"
      paths:
        - .cache/pip
        - .cache/pdm
    - key:
        files:
          - src/plainui/yarn.lock
          - yarn.lock
      paths:
        - .yarn-cache/
        - .yarn/cache/

.django_runner_settings:
  before_script:
@@ -46,20 +52,27 @@ default:
        "tag": "$CI_COMMIT_TAG",
        "commit": "$CI_COMMIT_SHA",
        "branch": "$CI_COMMIT_BRANCH",
        "ci": true
        "ci": true,
        "pipeline": {
          "id": "$CI_PIPELINE_ID",
          "url": "$CI_PIPELINE_URL",
          "date": "$CI_PIPELINE_CREATED_AT"
        }

      }
      EOF
  services:
    - name: postgis/postgis:15-3.3
      alias: db_server
  variables:
    ALLOWED_HOSTS: "127.0.0.1,hubapp"
    ALLOWED_HOSTS: "127.0.0.1,hubapp,hubapp-${CI_JOB_ID}"
    DJANGO_DEBUG: "I_KNOW_WHAT_I_AM_DOING"
    POSTGRES_HOST_AUTH_METHOD: trust
    POSTGRES_PASSWORD: runner
    POSTGRES_USER: ci
    SSO_SECRET_GENERATE: "True"
    STORAGE_TYPE: local
    UV_CACHE_DIR: .uv-cache

# Kaniko build setup
.build:
@@ -69,15 +82,13 @@ default:
    name: gcr.io/kaniko-project/executor:debug
    entrypoint: [""]
  variables:
    KANIKO_CACHE_ARGS:
      --cache=true
    KANIKO_CACHE_ARGS: --cache=true
      --cache-copy-layers=true
      --cache-run-layers=true
      --cache-ttl=24h
      --cache-dir=$CI_PROJECT_DIR/.cache/kaniko
      --cache-repo=$CI_REGISTRY_IMAGE/cache
    KANIKO_ARGS:
      --skip-unused-stages=true
    KANIKO_ARGS: --skip-unused-stages=true
      --context $CI_PROJECT_DIR
      --build-arg REGISTRY=git.cccv.de/crews/hub/dependency_proxy/containers/
  before_script:
@@ -105,45 +116,45 @@ default:
        "ci": true
      }
      EOF
  after_script:
    - uv cache prune --ci


generate_css:
  extends:
    - .default-rules
  image: node:20-bookworm
  stage: prepare
  needs: []
  script:
    - cd src/plainui/
    - echo 'yarn-offline-mirror ".yarn-cache/"' >> .yarnrc
    - echo 'yarn-offline-mirror-pruning true' >> .yarnrc
    - yarn install --non-interactive --no-progress --frozen-lockfile
    - yarn run build
  artifacts:
    paths:
      - src/plainui/static/plainui/rc3*.css*
# Crane setup
.crane:
  image:
    name: gcr.io/go-containerregistry/crane:debug
    entrypoint: [""]
  variables:
    GIT_STRATEGY: none
  before_script:
    - '[ -n "$DOCKER_CONFIG" ] || export DOCKER_CONFIG=$HOME/.docker'
    - crane auth login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY

meta_build:
  stage: prepare
  extends: .build
  variables:
    PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
    UV_CACHE_DIR: .uv-cache
  script:
    - /kaniko/executor
      $KANIKO_ARGS
      $KANIKO_CACHE_ARGS
      --dockerfile $CI_PROJECT_DIR/Dockerfile
        --target base
        --destination $CI_REGISTRY_IMAGE/build_image:$CI_PIPELINE_ID
      --target meta
      --destination $CI_REGISTRY_IMAGE/meta_image:mi-$CI_PIPELINE_ID
  rules:
    - when: always

.test:
  extends:
    - .django_runner_settings
  image: $CI_REGISTRY_IMAGE/build_image:$CI_PIPELINE_ID
  image: $CI_REGISTRY_IMAGE/meta_image:mi-$CI_PIPELINE_ID
  variables:
    UV_CACHE_DIR: .uv-cache
  needs:
    - meta_build
  after_script:
    - uv cache prune --ci

app_version:
  stage: test
@@ -154,8 +165,8 @@ app_version:
    DATABASE_URL: "postgis://ci:runner@db_server/migrations"
  script:
    - python3 -V
    - pdm sync -d --no-editable -G dev
    - pdm app_version
    - uv sync
    - uv run task app_version
  allow_failure: true
  rules:
    - when: always
@@ -165,8 +176,7 @@ lint:
  extends: .test
  script:
    - python3 -V
    - pdm sync -d --no-editable -G dev
    - pdm run lint
    - tox -e py-lint,html-lint
  allow_failure: true
  rules:
    - when: always
@@ -176,8 +186,24 @@ format:
  extends: .test
  script:
    - python3 -V
    - pdm sync -d --no-editable -G dev
    - pdm run format
    - tox -e py-format,html-format
    - git diff --exit-code -- . ':!src/version.json'
  allow_failure: true
  rules:
    - when: always

format_prettier:
  image: node:22-alpine
  stage: test
  needs: []
  before_script:
    - apk add --no-cache git
    - corepack enable
    - 'echo "cacheFolder: \".yarn/cache/\"" >> .yarnrc.yml'
    - yarn install --immutable
  script:
    - yarn format
    - git diff --exit-code -- . ':!src/version.json' ':!.yarnrc.yml'
  allow_failure: true
  rules:
    - when: always
@@ -191,8 +217,8 @@ migration_check:
    DATABASE_URL: "postgis://ci:runner@db_server/migrations"
  script:
    - python3 -V
    - pdm sync -d --no-editable -G dev
    - pdm run check-migrations
    - uv run task manage makemigrations --check
    - uv run task manage makemigrations --dry-run

translations_check:
  stage: test
@@ -204,8 +230,8 @@ translations_check:
  allow_failure: true
  script:
    - python3 -V
    - pdm install --no-editable --prod
    - pdm manage makemessages
    - uv sync
    - uv run task manage makemessages
    - git diff --exit-code -- . ':!src/version.json'

requirements_export:
@@ -213,16 +239,16 @@ requirements_export:
  extends: .test
  script:
    - python3 -V
    - pdm lock --check
    - pdm export --no-hashes -o requirements.txt --prod
    - pdm export --no-hashes -o requirements.dev.txt --de
    - uv lock --check
    - uv export --frozen -o=requirements.dev.txt --no-hashes --all-groups
    - uv export --frozen -o=requirements.txt --no-hashes --no-dev
    - git diff --exit-code -- . ':!src/version.json'
  rules:
    - changes:
        - requirements.txt
        - requirements.dev.txt
        - project.toml
      - pdm.lock
        - uv.lock

django-tests:
  stage: test
@@ -238,22 +264,24 @@ django-tests:
    DATABASE_URL: "postgis://ci:runner@db_server/self_test"
  script:
    - python3 -V
    - pdm sync -d --no-editable -G dev
    - pdm run test
    - pdm run coverage xml -i
    - tox -e django-test
    - tox -e coverage-report
  coverage: '/(?i)total(?:\s+\d+){4}\s+(\d+)%/'
  artifacts:
    reports:
      coverage_report:
        coverage_format: cobertura
        path: .tools/coverage/coverage.xml
        path: .tools/coverage/report.xml
    paths:
      - .tools/coverage/html_report/
    expire_in: 1 week
    expose_as: "HTML Coverage Report"

build_nginx:
  stage: build
  extends:
    - .build
  needs:
    - generate_css
    - migration_check
    - django-tests
  script:
@@ -269,7 +297,6 @@ build_release:
  extends:
    - .build
  needs:
    - generate_css
    - migration_check
    - django-tests
  script:
@@ -315,12 +342,12 @@ build_test:
    SERVE_API: "no"
    SERVE_BACKOFFICE: "no"
    SERVE_FRONTEND: "no"
    BASE_URL: http://hubapp:8000
    BASE_URL: http://hubapp-${CI_JOB_ID}:8000
  services:
    - name: postgis/postgis:15-3.3
      alias: db_server
    - name: "$CI_REGISTRY_IMAGE/ci/hub:ci-$CI_PIPELINE_ID-test"
      alias: hubapp
      alias: hubapp-${CI_JOB_ID}
      variables:
        DJANGO_CREATE_ADMIN_PASSWORD: "Test1234"
        DJANGO_DEBUG: "no"
@@ -330,6 +357,8 @@ build_test:
        SELECTED_CONFERENCE_ID: "017c0749-a2ea-4f86-92cd-e60b4508dd98"
  before_script:
    - pip3 install requests
    # TODO: remove this sleep once we have debugged the case for the 0 ms failure
    - sleep 30
    - echo "testing on $BASE_URL"
    - curl --max-time 8 --retry 3 --fail ${BASE_URL}/.well-known/version
    - curl --max-time 3 --retry 3 --fail ${BASE_URL}/.well-known/health
@@ -381,47 +410,19 @@ test_nginx_static:

.publish-image:
  stage: publish
  extends: .crane
  needs:
    - build_release
    - test_image_frontend
    - test_image_api
    - test_nginx_static

  image:
    name: gcr.io/go-containerregistry/crane:debug
    entrypoint: [""]
  variables:
    GIT_STRATEGY: none
  before_script:
    - '[ -n "$DOCKER_CONFIG" ] || export DOCKER_CONFIG=$HOME/.docker'
    - crane auth login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY

.publish-clean:
  extends:
    - .publish-image
  after_script:
    - crane delete $CI_REGISTRY_IMAGE/ci/hub:ci-${CI_PIPELINE_ID}-test
    - crane delete $CI_REGISTRY_IMAGE/ci/hub:ci-${CI_PIPELINE_ID}
    - crane delete $CI_REGISTRY_IMAGE/ci/nginx:ci-${CI_PIPELINE_ID}

publish-commit:
  extends:
    - .publish-clean
  script:
    - crane tag $CI_REGISTRY_IMAGE/ci/hub:ci-${CI_PIPELINE_ID} ${CI_COMMIT_SHORT_SHA}
    - crane tag $CI_REGISTRY_IMAGE/ci/nginx:ci-${CI_PIPELINE_ID} ${CI_COMMIT_SHORT_SHA}
  rules:
    - if: '$CI_COMMIT_BRANCH == "develop" ||  $CI_PIPELINE_SOURCE == "merge_request_event" || $CI_COMMIT_TAG'
    - when: never

publish-mr:
  extends:
    - .publish-image
  needs:
    - publish-commit
  script:
    - crane tag $CI_REGISTRY_IMAGE/ci/hub:${CI_COMMIT_SHORT_SHA} mr-${CI_MERGE_REQUEST_ID}
    - crane tag $CI_REGISTRY_IMAGE/ci/nginx:${CI_COMMIT_SHORT_SHA} mr-${CI_MERGE_REQUEST_ID}
    - crane tag $CI_REGISTRY_IMAGE/ci/hub:ci-$CI_PIPELINE_ID mr-${CI_MERGE_REQUEST_ID}
    - crane tag $CI_REGISTRY_IMAGE/ci/nginx:ci-$CI_PIPELINE_ID mr-${CI_MERGE_REQUEST_ID}
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - when: never
@@ -429,11 +430,9 @@ publish-mr:
publish-develop:
  extends:
    - .publish-image
  needs:
    - publish-commit
  script:
    - crane copy $CI_REGISTRY_IMAGE/ci/hub:${CI_COMMIT_SHORT_SHA} $CI_REGISTRY_IMAGE:development
    - crane copy $CI_REGISTRY_IMAGE/ci/nginx:${CI_COMMIT_SHORT_SHA} $CI_REGISTRY_IMAGE/nginx:development
    - crane copy $CI_REGISTRY_IMAGE/ci/hub:ci-$CI_PIPELINE_ID $CI_REGISTRY_IMAGE:development
    - crane copy $CI_REGISTRY_IMAGE/ci/nginx:ci-$CI_PIPELINE_ID $CI_REGISTRY_IMAGE/nginx:development
  rules:
    - if: $CI_COMMIT_BRANCH == 'develop'
    - when: never
@@ -442,17 +441,33 @@ publish-develop:
publish-production:
  extends:
    - .publish-image
  needs:
    - publish-commit
  script:
    - crane copy $CI_REGISTRY_IMAGE/ci/hub:$CI_COMMIT_SHORT_SHA $CI_REGISTRY_IMAGE:production
    - crane copy $CI_REGISTRY_IMAGE/ci/hub:ci-$CI_PIPELINE_ID $CI_REGISTRY_IMAGE:production
    - crane tag $CI_REGISTRY_IMAGE:production latest
    - crane copy $CI_REGISTRY_IMAGE/ci/nginx:$CI_COMMIT_SHORT_SHA $CI_REGISTRY_IMAGE/nginx:production
    - crane copy $CI_REGISTRY_IMAGE/ci/nginx:ci-$CI_PIPELINE_ID $CI_REGISTRY_IMAGE/nginx:production
    - crane tag $CI_REGISTRY_IMAGE/nginx:production latest
  rules:
    - if: '$CI_COMMIT_TAG =~ /^prod-\d{4}-\d{2}-\d{2}_\d{2}-\d{2}$/'
    - when: never

publish-sentry-release:
  stage: publish
  image: python:3.11-bookworm
  needs:
    - publish-production
  before_script:
    - pip3 install sentry-cli
  script:
    - sentry-cli releases new $CI_COMMIT_TAG
    - sentry-cli releases set-commits $CI_COMMIT_TAG --commit "hub / hub@$CI_COMMIT_SHA"
    - sentry-cli releases finalize $CI_COMMIT_TAG
  rules:
    # skip if SENTRY URL is not set
    - if: $SENTRY_URL == null
      when: never
    - if: '$CI_COMMIT_TAG =~ /^prod-\d{4}-\d{2}-\d{2}_\d{2}-\d{2}$/'
    - when: never

deploy_develop:
  stage: deploy
  allow_failure: true
@@ -482,6 +497,7 @@ deploy_production:
  allow_failure: true
  needs:
    - publish-production
    - publish-sentry-release
  image: python:3.11-bookworm
  script:
    - 'curl -X POST "$DEPLOYMENT_SERVICEWEBHOOK_URL_PRODUCTION"'
@@ -500,3 +516,28 @@ deploy_production:

    # otherwise, skip this
    - when: never

cleanup-ci-images:
  stage: cleanup
  extends:
    - .crane
  script:
    - crane delete $CI_REGISTRY_IMAGE/ci/hub:ci-${CI_PIPELINE_ID}
    - crane delete $CI_REGISTRY_IMAGE/ci/nginx:ci-${CI_PIPELINE_ID}
    - crane delete $CI_REGISTRY_IMAGE/ci/hub:ci-${CI_PIPELINE_ID}-test
    - crane delete $CI_REGISTRY_IMAGE/meta_image:mi-$CI_PIPELINE_ID
  needs:
    - job: publish-production
      optional: true
    - job: publish-develop
      optional: true
    - job: publish-mr
      optional: true
  rules:
    - if: $CI_COMMIT_BRANCH == "develop"
    - if: $CI_COMMIT_TAG
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
    - if: $FORCE_PIPELINE_RUN == 'true'
      when: on_success
    - when: never
  allow_failure: true
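Several jobs above (`format`, `format_prettier`, `translations_check`, `requirements_export`) share one idiom: run the tool, then let `git diff --exit-code` fail the job if it rewrote anything. A scratch-repo sketch of that check:

```shell
#!/bin/sh
# Demo: the "fail if the formatter changed files" check used in CI.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
printf 'x=1\n' > app.py
git add app.py
git commit -qm init
# stand-in for "tox -e py-format" or "yarn format" rewriting a file
printf 'x = 1\n' > app.py
if git diff --exit-code -- . ':!src/version.json'; then
  echo 'tree clean: job passes'
else
  echo 'formatter changed files: job fails'
fi
```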
@@ -2,14 +2,14 @@ exclude: ^.*.min.*|migrations|yarn.lock|venv$

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
      rev: v2.3.0
    rev: v5.0.0
    hooks:
      - id: check-yaml
      - id: check-toml
      - id: check-merge-conflict
      - id: check-ast
  - repo: https://github.com/astral-sh/ruff-pre-commit
      rev: v0.1.5
    rev: v0.9.7
    hooks:
      - id: ruff
        args: [--fix]
@@ -18,31 +18,47 @@ repos:
    rev: v1.0.0
    hooks:
      - id: check-json5
    - repo: https://github.com/pdm-project/pdm
      rev: 2.19.2
  - repo: https://github.com/djlint/djLint
    rev: v1.36.4
    hooks:
          - name: check production requirements
            id: pdm-export
            args: ['-o', 'requirements.txt', '--without-hashes', '--prod']
            files: ^pdm.lock$
    - repo: https://github.com/pdm-project/pdm
      rev: 2.19.2
      - id: djlint-reformat-django
      - id: djlint-reformat-jinja
      - id: djlint-django
      - id: djlint-jinja
  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: 0.6.2
    hooks:
          - name: check development requirements
            id: pdm-export
            args: ['-o', 'requirements.dev.txt', '--without-hashes', '--dev']
            files: ^pdm.lock$
    - repo: https://github.com/pdm-project/pdm
      rev: 2.19.2
      - id: uv-lock
  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: 0.6.2
    hooks:
           - id: pdm-lock-check
      - name: uv-export-prod
        id: uv-export
        args:
          - --frozen
          - -o=requirements.txt
          - --no-hashes
          - --no-dev
      - name: uv-export-dev
        id: uv-export
        args:
          - --frozen
          - -o=requirements.dev.txt
          - --no-hashes
          - --all-groups
  - repo: local
    hooks:
      - name: Check for uncreated migrations.
        id: migrations-check
        language: system
            entry: sh -c "src/manage.py makemigrations --check --dry-run"
        entry: sh -c "uv run task manage makemigrations --check --dry-run"
        files: "models/.*.py$"
        types:
          - python
            stages: [pre-commit]
        stages:
          - pre-commit
      - name: prettier
        id: prettier
        entry: yarn prettier --write --ignore-unknown
        language: node
        "types": [text]

.prettierignore (new file, 100644)
+15 −0
**/*.html
**/*.j2
.yarn
.yarnrc.yml
.venv*

static.dist/
**/vendor/**/*
**/list_script.js

src/core/fixtures/local/**/*

src/plainui/static/plainui/hub.*

src/**/templates/**/*.js

.prettierrc (new file, 100644)
+1 −0
{}
+61 −0
activitypub
Andi
ASGI
backoffice
blocktranslate
clonbares
CLUBFRIENDS
Conferencemember
csrf
csrfmiddlewaretoken
datatables
dect
derefer
dereferrer
Dereferrer
Disclaimern
Einlöseseite
emph
endblocktranslate
endfor
endspaceless
engelsystem
exneuland
favorited
forloop
gettz
Habitatsbeitritt
Habitatseinladung
Habitatszuordnung
Historieneintrag
htmlhead
jitsi
JITSI
keepalive
markdownify
merch
Merch
metanav
mgmt
msgid
msgstr
naturaltime
nplurals
orga
Orga
pentabarf
plainui
pois
pretix
Registrierungs
Registrierungsinformationen
Registrierungsstart
renderable
Roang
Screensharing
Shadowbanned
Shibboleet
unhide
Vorlesungsraum
Vorlesungssaal
yesno

.pylintrc.toml (new file, 100644)
+231 −0
[tool.pylint.main]
# Specify a score threshold under which the program will exit with error.
fail-under = 8.0

# Files or directories to be skipped. They should be base names, not paths.
ignore = ["CVS", ".git", "__pycache__", ".tools", "node_modules", "migrations"]

# Add files or directories matching the regular expressions patterns to the
# ignore-list. The regex matches against paths and can be in Posix or Windows
# format. Because '\\' represents the directory delimiter on Windows systems, it
# can't be used as an escape character.
# ignore-paths =

# Files or directories matching the regular expression patterns are skipped. The
# regex matches against base names, not paths. The default value ignores Emacs
# file locks
ignore-patterns = ["^\\.#"]

# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use, and will cap the count on Windows to
# avoid hangs.
#jobs = 0

# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins = ["pylint_django"]

# Pickle collected data for later comparisons.
persistent = true

# Resolve imports to .pyi stubs if available. May reduce no-member messages and
# increase not-an-iterable messages.
prefer-stubs = true

# Minimum Python version to use for version dependent checks. Will default to the
# version used to run pylint.
py-version = "3.13"

[tool.pylint.basic]

# Good variable names which should always be accepted, separated by a comma.
good-names = ["i", "j", "k", "ex", "Run", "_"]

[tool.pylint.classes]
# Warn about protected attribute access inside special methods
# check-protected-access-in-special-methods =

# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods = [
  "__init__",
  "__new__",
  "setUp",
  "asyncSetUp",
  "__post_init__",
]


[tool.pylint.design]
# List of regular expressions of class ancestor names to ignore when counting
# public methods (see R0903)
# exclude-too-few-public-methods =

# List of qualified class names to ignore when counting class parents (see R0901)
# ignored-parents =

# Maximum number of arguments for function / method.
max-args = 5

# Maximum number of attributes for a class (see R0902).
max-attributes = 7

# Maximum number of boolean expressions in an if statement (see R0916).
max-bool-expr = 5

# Maximum number of branch for function / method body.
max-branches = 12

# Maximum number of locals for function / method body.
max-locals = 15

# Maximum number of parents for a class (see R0901).
max-parents = 7

# Maximum number of positional arguments for function / method.
max-positional-arguments = 5

# Maximum number of public methods for a class (see R0904).
max-public-methods = 20

# Maximum number of return / yield for function / method body.
max-returns = 6

# Maximum number of statements in function / method body.
max-statements = 50

# Minimum number of public methods for a class (see R0903).
# min-public-methods =


[tool.pylint.format]
# Maximum number of characters on a single line.
max-line-length = 160

# Maximum number of lines in a module.
max-module-lines = 1000

[tool.pylint.imports]
# List of modules that can be imported at any level, not just the top level one.
allow-any-import-level = ["core.templatetags.hub_absolute"]

[tool.pylint.logging]
# The type of string formatting that logging methods do. `old` means using %
# formatting, `new` is for `{}` formatting.
logging-format-style = "new"

[tool.pylint."messages control"]
# Disable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times (only on the command line, not in the configuration file where
# it should appear only once). You can also use "--disable=all" to disable
# everything first and then re-enable specific checks. For example, if you want
# to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W".
disable = [
  "raw-checker-failed",
  "bad-inline-option",
  "locally-disabled",
  "file-ignored",
  "suppressed-message",
  "useless-suppression",
  "deprecated-pragma",
  "use-implicit-booleaness-not-comparison-to-string",
  "use-implicit-booleaness-not-comparison-to-zero",
  "use-symbolic-message-instead",
  "missing-module-docstring",                         # we did not write docstrings for all modules - Can be enabled with time
  "missing-class-docstring",                          # we did not write docstrings for all classes - Can be enabled with time
  "missing-function-docstring",                       # we did not write docstrings for all functions - Can be enabled with time
  "too-many-ancestors",                               # we do not want to limit the number of ancestors to have full use of django features
  "wrong-import-position",                            # import position is enforced by isort/ruff
  "wrong-import-order",                               # import order is enforced by isort/ruff
  "logging-too-many-args",                            # The way most of our logging is done, would trigger this. Enable with logging statement updates
  # Let's ignore design issues for now
  "too-few-public-methods",
  "too-many-ancestors",
  "too-many-arguments",
  "too-many-boolean-expressions",
  "too-many-branches",
  "too-many-function-args",
  "too-many-instance-attributes",
  "too-many-locals",
  "too-many-nested-blocks",
  "too-many-positional-arguments",
  "too-many-public-methods",
  "too-many-return-statements",
  "too-many-statements",

  "unused-argument",
  "protected-access",
]


[tool.pylint.miscellaneous]
# List of note tags to take in consideration, separated by a comma.
notes = ["FIXME", "XXX", "TODO"]


[tool.pylint.refactoring]
# Maximum number of nested blocks for function / method body
max-nested-blocks = 5

[tool.pylint.reports]

# Set the output format. Available formats are: text, parseable, colorized, json2
# (improved json format), json (old json format) and msvs (visual studio). You
# can also give a reporter class, e.g. mypackage.mymodule.MyReporterClass.
output-format = "colorized"

# Tells whether to display a full report or only the messages.
reports = true

# Activate the evaluation score.
score = true

[tool.pylint.typecheck]
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators = ["contextlib.contextmanager"]

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
# generated-members =

# Tells whether missing members accessed in mixin class should be ignored. A
# class is considered mixin if its name matches the mixin-class-rgx option.
# Tells whether to warn about missing members when the owner of the attribute is
# inferred to be None.
ignore-none = true

# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference can
# return multiple potential results while evaluating a Python object, but some
# branches might not be evaluated, which results in partial inference. In that
# case, it might be useful to still emit no-member and other checks for the rest
# of the inferred objects.
ignore-on-opaque-inference = true

# List of symbolic message names to ignore for Mixin members.
ignored-checks-for-mixins = [
  "no-member",
  "not-async-context-manager",
  "not-context-manager",
  "attribute-defined-outside-init",
]

# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes = [
  "optparse.Values",
  "thread._local",
  "_thread._local",
  "argparse.Namespace",
]

[tool.pylint.variables]
# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables = true
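`logging-format-style = "new"` declares that the project's log messages use `{}` placeholders. On the handler side, the stdlib equivalent is a formatter with `style="{"`; argument interpolation at call sites stays %-based unless a wrapper is used. A minimal, self-contained sketch (the logger name is made up, not from the hub codebase):

```python
import io
import logging

# "{"-style formatter, matching the logging-format-style = "new" setting above.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("{levelname}:{name}:{message}", style="{"))

logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
# message arguments are still interpolated %-style by the stdlib
logger.info("loaded %s fixtures", 3)
print(stream.getvalue().strip())  # → INFO:demo:loaded 3 fixtures
```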

.yarnrc.yml (new file, 100644)
+1 −0
yarnPath: .yarn/releases/yarn-4.6.0.cjs
+94 −50
@@ -50,12 +50,11 @@ To load a different demo file, the environment variable `DJANGO_LOAD_FIXTU
   - Linux: packages `python3`, `postgresql`, and `gettext`
   - Mac: `brew install python3 postgresql gettext` or https://postgresapp.com/
   - Windows: [latest stable Python 3 release](https://www.python.org/downloads/windows/) plus [PostgreSQL Installer](https://www.postgresql.org/download/windows/) and [gettext binaries](https://mlocati.github.io/articles/gettext-iconv-windows.html)
    - Python package manager: `pdm`:
      Can be installed via OS packaging tools or with `pip install pdm`.
   - Python package manager: `uv`:
     Can be installed via OS packaging tools, with `pip install uv`, or with `pipx install uv`. (See the [official documentation](https://docs.astral.sh/uv/getting-started/installation/).)
1. Clone this repository to any location.
1. Create the virtual environment with `pdm info`
1. Install the Python dependencies with `pdm install`
6. (if not already present) create a PostgreSQL user (`createuser -P hub_app`) and a database (`createdb -O hub_app hub`) - on Linux possibly via `sudo -u postgres`
1. Install the Python dependencies with `uv sync`
1. (if not already present) create a PostgreSQL user (`createuser -P hub_app`) and a database (`createdb -O hub_app hub`) - on Linux possibly via `sudo -u postgres`
1. Configure your instance via:
   - environment variables (e.g. direnv or an env file)
   - or a `local_settings.py` in `src/hub/` (database configuration is documented in the [Django documentation](https://docs.djangoproject.com/en/3.1/ref/settings/#databases)):
@@ -67,8 +66,9 @@ SESSION_COOKIE_SECURE = False
SESSION_COOKIE_DOMAIN = None
IS_API = True
IS_FRONTEND = True
STORAGE_TYPE = 'local'
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'  # "send" emails as console output
SELECTED_CONFERENCE_ID = '40ba6cda-1970-409f-81ef-efb87ef09d95' # change this to the id of your conference you want to display in frontend (matches the one from rc3_2020.json)
SELECTED_CONFERENCE_ID = '40ba6cda-1970-409f-81ef-efb87ef09d95' # change this to the id of your conference you want to display in frontend (matches the one from rc3_2021.json)
METRICS_SERVER_IPS = ['127.0.0.1']   # Change this if you want to test the Prometheus / Grafana metrics endpoint (http://127.0.0.1:8000/metrics/)

DATABASES = {
@@ -83,22 +83,28 @@ DATABASES = {
}
```

8. Create the database tables: `pdm manage migrate`
9. Set up an admin user (for the application): `pdm manage createsuperuser`
10. optional: import demo data: `pdm manage.py loaddata .src/core/fixtures/rc3_2021.json`
8. Create the database tables: `uv run task manage migrate`
9. Set up an admin user (for the application): `uv run task manage createsuperuser`
10. optional: import demo data: `uv run task manage loaddata src/core/fixtures/rc3_2021.json`
11. optional: create a `ConferenceMember` for your admin user via the [admin page](http://localhost:8000/c3admin/) to enable the user for the frontend

## Usage

1. Activate the virtual env: `pdm venv activate`
2. Apply any pending DB migrations (if you have just updated from git): `pdm manage migrate`
3. Make sure all translations are up to date: `pdm manage compilemessages`
4. Collect all staticfiles: `pdm manage collectstatic`
5. Start the dev server: `pdm manage runserver`
2. Apply any pending DB migrations (if you have just updated from git): `uv run task manage migrate`
3. Make sure all translations are up to date: `uv run task manage compilemessages`
4. Collect all staticfiles: `uv run task manage collectstatic --noinput`
5. Start the dev server: `uv run task manage runserver`
6. Visit the local instance: [admin page](http://localhost:8000/c3admin/), [API](http://localhost:8000/api/) and [plainui frontend](http://localhost:8000/)

## PlainUI development tips

### Creating a theme

All hub components must define their own variables. All variables are contained in [`_variables-hub.scss`](./src/plainui/styles/_variables-hub.scss).
New variables may need to be added. A new theme can then override variables from Bootstrap or the hub.

[`hub-high-contrast.scss`](./src/plainui/styles/themes/hub-high-contrast.scss) can be used as an example.

### Compiling the CSS (PlainUi)

1. Go to the directory `src/plainui/`
@@ -106,6 +112,20 @@ DATABASES = {
3. Run `yarn build` to compile the CSS
4. To recompile the CSS automatically while developing, there is `yarn watch`

#### Serving compiled CSS with a local docker-compose setup

To serve the compiled CSS from a local instance started e.g. with `docker compose up`, an additional folder must be mounted into the `nginx` container in the `docker-compose.yml`:

```
volumes:
  - ./src/plainui/static:/www/static
  ...
```

#### CSS watch in the container

1. To compile the styles and start a watch, use `docker compose --profile build up -d local-static`

### Components

Reusable elements can be integrated as components. There are two "components" folders for this. The HTML components live in `plainui/jinja2/plainui/components`. If additional styles are needed, they can be added in `plainui/styles/components`, which are then integrated into `rc3.scss`. It is important to create a **separate file per component**. For Jinja, [macros](https://jinja.palletsprojects.com/en/2.10.x/templates/#import) or [includes](https://jinja.palletsprojects.com/en/2.10.x/templates/#include) are suitable for this. Additionally, the component has to be added to the overview - locally available at <http://localhost:8000/component_gallery>.
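As a minimal illustration of the macro approach, a component can be defined once in its own file and imported wherever it is needed. The component name `badge` and the template contents below are hypothetical examples, not actual files from this repository:

```python
# Sketch of the "one macro component per file" pattern described above,
# using plain jinja2 with an in-memory loader. The "badge" component and
# its markup are invented for illustration only.
from jinja2 import DictLoader, Environment

templates = {
    # components/badge.j2 - a separate file per component, as recommended
    'components/badge.j2': (
        '{% macro badge(label, level="info") %}'
        '<span class="hub-badge hub-badge--{{ level }}">{{ label }}</span>'
        '{% endmacro %}'
    ),
    # a page template that imports and uses the component
    'page.j2': (
        '{% from "components/badge.j2" import badge %}'
        '{{ badge("Angel", level="success") }}'
    ),
}

env = Environment(loader=DictLoader(templates), autoescape=True)
print(env.get_template('page.j2').render())
# → <span class="hub-badge hub-badge--success">Angel</span>
```

With `autoescape=True`, the `label` and `level` arguments are escaped while the macro's own markup is emitted as-is, so component callers cannot inject HTML.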
@@ -114,25 +134,26 @@ Reusable elements can be integrated as components. There are two

### Dependency management

The [PDM package manager](https://pdm-project.org/en/latest/) is used to manage the python dependencies.
The [uv package manager](https://docs.astral.sh/uv/) is used to manage the python dependencies.
It is used to generate the `requirements.txt` and the `requirements.dev.txt`.

#### Installing the current dependencies

To install the current python dependencies, the command `pdm install` can be used.
With the command `pdm sync --clean` you can make sure that exactly the versions from the pdm lock file are installed.  
*Caution*: **Python packages not contained in the lock file will be uninstalled!**
To install the current python dependencies, the command `uv sync` can be used.
With the command `uv sync --frozen` you can make sure that exactly the versions from the uv lock file are installed.
_Caution_: **Python packages not contained in the lock file will be uninstalled!**

#### Adding new dependencies

To add new dependencies, the command `pdm add` can be used. With the parameter `--dev` a dependency can be declared as a development dependency. It will then not be installed on production instances.
To add new dependencies, the command `uv add` can be used. With the parameter `--dev` a dependency can be declared as a development dependency. It will then not be installed on production instances.

Afterwards the `requirements.txt` and `requirements.dev.txt` have to be updated.
If `pre-commit` is used, this happens automatically on commit.
Alternatively, the following commands can be used:

```bash
pdm export --no-hashes -o requirements.txt --prod
pdm export --no-hashes -o requirements.dev.txt --dev
uv export --format requirements-txt --no-hashes -o requirements.txt --no-dev
uv export --format requirements-txt --no-hashes -o requirements.dev.txt --all-groups
```

### Debugging DJANGO
@@ -154,9 +175,9 @@ or:

### Extracting & compiling translations

-   `pdm manage makemessages`
- `uv run task manage makemessages`
- The translation files in `<app>/locale/<language>/LC_MESSAGES/django.po` have been extended with all new translation strings. Edit them and insert the translations at each `msgstr`!
-   `pdm manage compilemessages`
- `uv run task manage compilemessages`
- To try them out, you have to restart django so the new translation files are loaded

### Defining translations
@@ -166,7 +187,7 @@ or:

### Collecting static files

-   `pdm manage collectstatic`
- `uv run task manage collectstatic`
- without this, unit tests (among others) abort with errors like "Missing staticfiles manifest entry"

### Tests
@@ -174,7 +195,6 @@ or:

There are Django (unit) tests for the individual apps. They can be found in each app's `tests` folder.
Additionally, there are also simple integration tests in the repository root under [tests](tests).


#### Tests with translated content

To make sure that tests with translations work in all environments,
@@ -182,6 +202,7 @@ the translation functions have to be mocked.
For this, `core.tests.utils` provides a `mocktrans` function that simply appends `_trans` to every text (e.g. "Login" becomes "Login_trans").

The function can then be used as follows:

```python
with patch('core.models.ticket._', side_effect=mocktrans):
    translation = funktions_aufruf()

self.assertEqual(translation, "NoTIcket_trans")
```
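Based on the description above, the helper itself boils down to appending the suffix. A minimal sketch of what `core.tests.utils.mocktrans` does (the real implementation lives in the repository):

```python
def mocktrans(text):
    # Sketch of the mocktrans helper described above: mark a string as
    # "translated" by appending the _trans suffix, so tests see
    # deterministic output regardless of the active locale.
    return f'{text}_trans'


assert mocktrans('Login') == 'Login_trans'
```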
To be able to run the tests, the database user must have the right to create new databases.
To do so, start the database console with `psql postgres`. There, run `ALTER USER hub_app CREATEDB;` (if necessary, replace `hub_app` with the chosen username). Finally, close the console again with `Ctrl-D`.

To run the Django tests, the command `pdm test` can be used. \
To run only the tests of a single app, run `pdm test -- <app>.tests` instead. \
To run the Django tests, the command `uv run tox -e django-test` can be used. \
To run only the tests of a single app, run `uv run tox -e django-test -- <app>.tests` instead. \
Helpful arguments are `-v 2` to show the tests being run, `--failfast` to abort after the first failure, and `--keepdb` to avoid having to run the migrations every time. \
For more information on the command, https://docs.djangoproject.com/en/3.1/ref/django-admin/#django-admin-test is helpful.

@@ -221,8 +242,21 @@ This provides the following commands:
- `ruff check`: This command checks for common errors in python code and can also partially fix them
- `ruff format`: This command re-formats all python files according to a predefined code style

### djLint

**Unfortunately it is not possible to configure djLint so that it automatically uses the correct profiles.**
**If integration into an editor is desired, this has to be taken into account!**

It can be run manually as follows:

- `djlint --profile django --extension html --reformat .` to reformat all .html files
- `djlint --profile jinja --extension j2 --reformat .` to reformat all .j2 files
- `djlint --lint .` to find potential errors via linting

## Common errors

**Database migration fails with "'DatabaseOperations' object has no attribute 'geo_db_type'"**

```
$ ./manage.py migrate
<snip>
@@ -231,6 +265,16 @@ AttributeError: 'DatabaseOperations' object has no attribute 'geo_db_type'

This error occurs when trying to create [PostGIS](https://postgis.net/) fields with the regular Django Postgres backend. Use `django.contrib.gis.db.backends.postgis` as the engine instead.
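A corresponding `DATABASES` entry might look like the sketch below; the database name, user, and password are placeholders, only the `ENGINE` value is the point:

```python
# local_settings.py - sketch only; NAME/USER/PASSWORD/HOST are placeholders
DATABASES = {
    'default': {
        # the GIS backend instead of 'django.db.backends.postgresql',
        # so PostGIS fields can be migrated
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'hub',
        'USER': 'hub_app',
        'PASSWORD': 'change-me',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
```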

**Docker build fails with "error creating zfs mount"**

This is an issue with Docker multi-stage builds and the ZFS storage driver.

Since the problem only occurs when individual image layers are not in the cache, it can be worked around by re-running the build until it succeeds.

As of version 22 of `docker-ce`, the problem should no longer occur.

See also the [related issue on GitHub](https://github.com/moby/buildkit/issues/1758).

## Docker image dependencies

The image shows the current Docker image dependencies from the multi-stage Dockerfile
ARG REGISTRY=""
FROM ${REGISTRY}python:3.13-bookworm as base

ARG DEVELOPMENT=False
FROM ${REGISTRY}python:3.13-bookworm AS base
COPY --from=ghcr.io/astral-sh/uv:0.5.21 /uv /uvx /bin/

ENV DEBIAN_FRONTEND=noninteractive
ENV PDM_CHECK_UPDATE=false
# Set up uv environment
ENV UV_LINK_MODE=copy UV_COMPILE_BYTECODE=1 UV_TOOL_BIN_DIR=/opt/uv-bin/

RUN echo 'APT::Install-Recommends "false";' > /etc/apt/apt.conf.d/99-unattended-minimal && \
    echo 'APT::Install-Suggests "false";' >> /etc/apt/apt.conf.d/99-unattended-minimal && \
@@ -12,28 +12,30 @@ RUN echo 'APT::Install-Recommends "false";' > /etc/apt/apt.conf.d/99-unattended-

RUN --mount=target=/var/lib/apt/lists/,type=cache,sharing=locked \
    --mount=target=/var/cache/apt/archives/,type=cache,sharing=locked \
    --mount=target=/root/.cache/pip,type=cache,sharing=locked \
    # Prevent apt cache cleaning
    rm -f /etc/apt/apt.conf.d/docker-clean && \
    apt-get update && \
    apt-get install \
        gdal-bin  \
        gettext \
        locales \
        rsync && \
        locales && \
    dpkg-reconfigure locales && \
       locale-gen C.UTF-8 && \
       /usr/sbin/update-locale LANG=C.UTF-8 && \
    pip install -U pdm && \
    # Fix caching issue with kaniko see: https://github.com/GoogleContainerTools/kaniko/issues/3246
    mkdir -p /app/plainui

# Add path for future pdm location
ENV PATH="/install/.venv/bin:$PATH"
    # Add path for future uv location
    ENV PATH="/install/.venv/bin:/opt/uv-bin:$PATH"

######################################### [meta] #############################
FROM base AS meta

RUN uv tool install tox --with tox-uv

######################################### [build] #############################

FROM base as build
FROM base AS build

ENV LC_ALL=C.UTF-8

@@ -41,87 +43,53 @@ RUN --mount=target=/var/lib/apt/lists/,type=cache,sharing=locked \
    --mount=target=/var/cache/apt/archives/,type=cache,sharing=locked \
    apt-get update && \
    apt-get install \
        build-essential \
        yarnpkg && \
        build-essential && \
    mkdir /install

WORKDIR /install
COPY pyproject.toml pdm.lock README.md /install/
RUN --mount=target=/root/.cache/pdm,type=cache,sharing=locked \
    pdm install --check --prod --no-editable
COPY pyproject.toml uv.lock README.md /install/
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-install-project --no-editable

######################################### [build-dev] #########################

FROM build as build-dev
FROM build AS build-dev

WORKDIR /install
RUN --mount=target=/root/.cache/pdm,type=cache,sharing=locked \
    pdm install --check --dev --no-editable
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-install-project --no-editable

######################################### [build-static] ######################
######################################### [node-build] ########################

FROM build as build-static
FROM ${REGISTRY}node:20-alpine AS node-base

# Only copy over the requirements files, use cache if they have not changed.
RUN mkdir -p /app/plainui/
COPY src/plainui/package.json src/plainui/yarn.lock /app/plainui/
COPY src/plainui/package.json src/plainui/yarn.lock src/plainui/.yarnrc.yml /app/plainui/
COPY src/plainui/.yarn/releases/yarn-4.5.1.cjs /app/plainui/.yarn/releases/

WORKDIR /app/plainui
RUN /usr/bin/yarnpkg

COPY src/ /app/

RUN /usr/bin/yarnpkg build
WORKDIR /app
RUN export DJANGO_SETTINGS_MODULE='hub.settings.build' && \
    python3 /app/manage.py collectstatic --noinput && \
    python3 /app/manage.py compilemessages && \
    unset DJANGO_SETTINGS_MODULE


######################################### [nginx] #############################

FROM ${REGISTRY}nginx:1.25-alpine-slim as nginx-forwarder
ENV APP_SOCKET="/run/hub/app.sock"
ENV SCRIPT_NAME=""

VOLUME /www/media

COPY deployment/docker/index.html deployment/docker/error_*.html /www/default/
COPY deployment/docker/nginx.conf /etc/nginx/templates/default.conf.template

FROM nginx-forwarder as nginx

COPY --from=build-static /app/static.dist /www/static
RUN corepack enable && \
    /usr/local/bin/yarnpkg set version stable && \
    /usr/local/bin/yarnpkg install

######################################### [webworker-base] ####################

FROM base as webworker-base
ARG DEVELOPMENT
FROM base AS webworker-base

ENV APP_HOME=${APP_HOME:-/app_home}

VOLUME /data /app/media

ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV LC_ALL=C.UTF-8
ENV DJANGO_SETTINGS_MODULE=hub.settings.default
ENV DOCKER_UID=1000

RUN --mount=target=/var/lib/apt/lists/,type=cache,sharing=locked \
    --mount=target=/var/cache/apt/archives/,type=cache,sharing=locked \
    if [ "$DEVELOPMENT" = "True" ]; then\
        apt-get install \
            yarnpkg; \
    fi && \
    useradd -u $DOCKER_UID -ms /bin/bash -d /app_home appuser
RUN useradd -u $DOCKER_UID -ms /bin/bash -d /app_home appuser

COPY deployment/docker/app.sh /usr/local/bin/app
COPY deployment/docker/check_django.sh /usr/local/bin/hub_healthcheck
COPY deployment/docker/check_psql.py /usr/local/bin/postgres_healthcheck

COPY --from=build /install/.venv /install/.venv
ENV PATH="/install/.venv/bin:$PATH"

RUN install -d -m 0755 -o appuser -g appuser /app/hub /data /app/media /run/hub && \
@@ -131,17 +99,53 @@ RUN install -d -m 0755 -o appuser -g appuser /app/hub /data /app/media /run/hub
ENTRYPOINT ["/usr/local/bin/app"]
CMD ["webworker"]

######################################### [node-build] ######################

FROM node-base AS node-build

WORKDIR /app/plainui

COPY src/plainui/styles/ /app/plainui/styles
RUN /usr/local/bin/yarnpkg build

ENTRYPOINT ["/usr/local/bin/yarnpkg"]
CMD ["watch"]

######################################### [build-static] ######################

FROM build AS build-static

COPY --from=node-build /app/plainui/static/plainui/*.css* /app/plainui/static/plainui/
COPY /src/ /app

WORKDIR /app
RUN export DJANGO_SETTINGS_MODULE='hub.settings.build' && \
    python3 /app/manage.py collectstatic --noinput && \
    python3 /app/manage.py compilemessages && \
    unset DJANGO_SETTINGS_MODULE

######################################### [nginx] #############################

FROM ${REGISTRY}nginx:1-alpine-slim AS nginx
ENV APP_SOCKET="/run/hub/app.sock"
ENV SCRIPT_NAME=""

COPY deployment/docker/index.html deployment/docker/error_*.html /www/default/
COPY deployment/docker/nginx.conf /etc/nginx/templates/default.conf.template

COPY --from=build-static /app/static.dist /www/static

######################################### [webworker] #########################

FROM webworker-base as webworker
FROM webworker-base AS webworker

COPY --from=build /install/.venv /install/.venv
COPY --from=build-static --chown=appuser /app /app
USER appuser

######################################### [dev] ###############################

FROM webworker-base as dev
FROM webworker-base AS dev

ENV DJANGO_SETTINGS_MODULE=hub.settings.dev
ENV SERVE_API=yes
@@ -154,11 +158,14 @@ RUN install -o appuser -g appuser -m 774 /dev/null /data/django.log

# Copy additional dev dependencies
COPY --from=build-dev /install/.venv /install/.venv
# Copy plainui styles
COPY --from=node-build /app/plainui/static/*.css* /app/plainui/static/
ENV PATH="/install/.venv/bin:$PATH"

USER appuser
COPY --from=build-static --chown=appuser /app /app
WORKDIR /app
RUN export DJANGO_SETTINGS_MODULE='hub.settings.build' && \
    app build && \
    python3 /app/manage.py collectstatic --noinput && \
    python3 /app/manage.py compilemessages && \
    unset DJANGO_SETTINGS_MODULE
@@ -31,17 +31,26 @@ as PDF: [Grobes Datenmodell](docs/Grobes Datenmodell.pdf) and [automatisch expo

## REST API

Using the conference slug "camp23" as an example; in principle, all endpoints listed here can be queried via GET (restrictions for non-public events etc. are possible).
In principle, all endpoints listed here can be queried via GET (restrictions for non-public events etc. are possible).
Some endpoints are additionally "writable" and can be used to create or edit the respective data.

Test instance: <https://hub.test.c3voc.de/api/> / <https://staging.hub.c3events.de/api/> \
Production instance: <https://api.events.ccc.de/congress/2023/>
Production instance: <https://api.events.ccc.de/congress/2024/>

| Category   | Endpoint                                 | GET | POST | PUT | DEL | Description                                           |
| ---------- | ---------------------------------------- | --- | ---- | --- | --- | ----------------------------------------------------- |
| Auth       | `/auth/get-token`                        |     | x    |     |     | issue an API token                                    |
| Personal   | `/me`                                    | x   |      | x   |     | own profile / settings                                |
| Personal   | `/me/friends`                            | x   |      | x   | x   | list of buddies                                       |
| Personal   | `/me/badges`                             | x   |      |     |     | list of all badges/achievements                       |
| Personal   | `/me/events`                             | x   |      |     |     | favorited events                                      |
| Personal   | `/me/events/<uuid>/`                     |     |      | x   | x   | (un)favorite events                                   |
| Personal   | `/me/friends`                            | x   |      | x   | x   | list of buddies                                       |
| Personal   | `/me/received-messages/`                 | x   |      |     |     | overview of received PMs                              |
| Personal   | `/me/received-messages/<uuid>`           | x   |      |     |     | details of a received PM                              |
| Personal   | `/me/send-message`                       |     | x    |     |     | send a new PM                                         |
| Personal   | `/me/sent-messages/`                     | x   |      |     |     | overview of sent PMs                                  |
| Personal   | `/me/sent-messages/<uuid>`               | x   |      |     |     | details of a sent PM                                  |
| Personal   | `/me/delete-message/<uuid>`              | x   |      |     |     | delete a PM                                           |
| Conference | `/`                                      | x   |      |     |     | metadata of the conference                            |
| Conference | `/tags`                                  | x   |      |     |     | list of all tags at the conference                    |
| Conference | `/tracks`                                | x   | x    |     |     | list of tracks                                        |
@@ -63,3 +72,21 @@ Production instance: <https://api.events.ccc.de/congress/2023/>

New entries are created via POST, existing ones are modified via PUT.
Details on the individual endpoints will follow shortly™.

### API example with cURL

First, generate a token:

```bash
curl https://{API_URL}/api/auth/get-token -H "Content-Type: application/json" -X POST --data '{"username": "{USERNAME}", "password": "{PASSWORD}"}'
```

This token can then be used to call endpoints that require authentication:

```bash
curl https://{API_URL}/api/me -H "Content-Type: application/json" -H "Authorization: Token {API_TOKEN}"
```
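The same flow can be sketched in Python with `requests`. The base URL and credentials are placeholders, and the `'token'` field name in the response is an assumption, not confirmed by this document; the endpoint paths are the ones from the cURL example above:

```python
import requests

# Placeholder: replace with your instance's API base URL
BASE_URL = 'https://hub.test.c3voc.de/api'


def get_token(username, password):
    """Request an API token from /auth/get-token.
    Assumes the response body contains a 'token' field."""
    response = requests.post(
        f'{BASE_URL}/auth/get-token',
        json={'username': username, 'password': password},
    )
    response.raise_for_status()
    return response.json()['token']


def auth_headers(token):
    """Build the headers expected by authenticated endpoints."""
    return {'Content-Type': 'application/json', 'Authorization': f'Token {token}'}


def get_profile(token):
    """Fetch the own profile from /me using the token."""
    response = requests.get(f'{BASE_URL}/me', headers=auth_headers(token))
    response.raise_for_status()
    return response.json()
```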

## Development

see [Development.md](./Development.md)

cspell.config.yaml

---
$schema: https://raw.githubusercontent.com/streetsidesoftware/cspell/main/cspell.schema.json
version: "0.2"

import:
  - "@cspell/dict-python/cspell-ext.json"
  - "@cspell/dict-de-de/cspell-ext.json"

dictionaryDefinitions:
  - name: project-words
    path: "./.project-dictionary.txt"
    addWords: true

dictionaries:
  - python
  - html
  - project-words
  - en_US
  - de-de

ignorePaths:
  - "node_modules"
  - ".venv"
  - ".tools"
  - "/.project-dictionary.txt"
  - "src/**/migrations"
  - "src/**/fixtures"
  - "src/**/vendor"
  - "src/**/tests"
  - "src/static.dist"
  - "src/hub/settings"
  - "*.svg"
  - "tox.ini"

languageSettings:
  - languageId: markdown
    caseSensitive: true
  - languageId: python
    ignoreRegExpList:
      - /from .* import .*/g
      - /import .*/g
Deployment
===========
# Deployment

# Overview

@@ -12,6 +11,7 @@ Alternatively, the configuration can be done as before with a `local_settings.py`
No matter which of the methods is chosen, you must at least specify which functions are to be deployed and how the database can be reached.
For the environment variable configuration, the `docker-compose.yml` can be consulted as an example.
Example of a `local_settings.py`:

```python
IS_API = False
IS_BACKOFFICE = False
@@ -30,7 +30,6 @@ DATABASES = {
}
```


# Tips & Tricks

## Backup
@@ -40,6 +39,7 @@ In general: for a normal backup, the database should simply be backed up using built-in
Django also has a built-in function for exporting data, but this variant is not recommended for production use.
The problem here lies in the structure of the data on restore, which is not guaranteed with `dumpdata`/`loaddata`.
Basically it works as follows:

1. Start the Docker container with the "debug" command
2. Run `./manage.py dumpdata` with natural keys and without Django internals

@@ -9,7 +9,6 @@ if [ "$1" == "version" ]; then
    exec python3 $APP_HOME/manage.py appversion
fi

if [[ "$1" != "build" ]]; then
RETRIES=10
until postgres_healthcheck; do
if [[ ${RETRIES} -eq 0 ]]; then
@@ -20,11 +19,10 @@ if [[ "$1" != "build" ]]; then
sleep 1
RETRIES=$((RETRIES-1))
done
fi

cd $APP_HOME
CONTAINER_MIGRATED="CONTAINER_MIGRATED_FLAG"
if [[ ! -e "$HOME/$CONTAINER_MIGRATED" && ("$DJANGO_MIGRATE" == "yes" || "$STARTMODE" == "init") ]]; then
if [[ ! -e "$HOME/$CONTAINER_MIGRATED" && "$DJANGO_MIGRATE" == "yes"  ]]; then
    python3 $APP_HOME/manage.py migrate --noinput
    touch $HOME/$CONTAINER_MIGRATED
fi
@@ -47,34 +45,7 @@ if [ ! -e "$HOME/$ADMIN_CREATED" -a -n "$DJANGO_CREATE_ADMIN_PASSWORD" ]; then
    touch $HOME/$ADMIN_CREATED
fi

if [ "$1" == "init" -o "$STARTMODE" == "init" ]; then
    # activate provided SSH key, if any
    if [ -n "${INIT_STATICFILES_SSH_KEY_SECRET}" ]; then
        cat "/run/secrets/${INIT_STATICFILES_SSH_KEY_SECRET}" > "$HOME/.ssh/id_rsa"
        echo >> "$HOME/.ssh/id_rsa"  # ensure newline at the end of the key file -.-
        chmod 600 "$HOME/.ssh/id_rsa"
    fi

    # build gzip files for static assets for better performance
    find /app/static.dist -type f -not -name "*.png" -not -name "*.jpg" -exec gzip -k {} +

    # copy static files to destination
    if [ -n "${INIT_STATICFILES_RSYNC_TARGET}" ]; then
        echo -n "rsyncing static files to \"${INIT_STATICFILES_RSYNC_TARGET}\" ... "
        rsync -az -e "ssh -o StrictHostKeyChecking=accept-new" --stats /app/static.dist/ "${INIT_STATICFILES_RSYNC_TARGET}"
        echo "done"
    else
        echo "WARNING: not copying static files anywhere because you did not provide a INIT_STATICFILES_RSYNC_TARGET"
    fi

    # django migrations will be run automatically already if STARTMODE=init
    [ "$STARTMODE" == "init" ] || python3 $APP_HOME/manage.py migrate --noinput

    # idle forever (docker swarm does not have a restart-policy)
    echo "Initialization complete."
    while true; do sleep 5; done
    exit 0
fi
touch $HOME/PREPARATION_DONE

if [ "$1" == "webworker" -o "$STARTMODE" == "webworker" ]; then
    # start gunicorn (might be called indirectly via supervisor when started with 'all')
@@ -106,25 +77,10 @@ if [ "$1" == "shell" -o "$1" == "createsuperuser" -o "$1" == "test" ]; then
fi

if [ "$1" == "housekeeping" ]; then
    interval="${HOUSEKEEPING_SLEEP_SECONDS:-300}"
    if [ "$interval" -gt 0 ]; then
        python3 $APP_HOME/manage.py housekeeping --forever --forever-delay="$interval"
    else
        python3 $APP_HOME/manage.py housekeeping
    fi
fi

if [ "$1" == "build" ]; then
    cd $APP_HOME/plainui
    /usr/bin/yarnpkg
    /usr/bin/yarnpkg build
    cd $APP_HOME
    python3 $APP_HOME/manage.py collectstatic --no-input
    exec python3 $APP_HOME/manage.py compilemessages
    python3 $APP_HOME/manage.py housekeeping --forever
fi

cat <<EOD
Specify argument: init|webworker|shell|createsuperuser
Additional development image arguments: build
Specify argument: webworker|shell|createsuperuser
EOD
exit 1
@@ -9,11 +9,6 @@ if [[ -z "$TARGET" ]]; then
	exit 1;
fi

if [ "$STARTMODE" == "init" ]; then
    # the init container must always be considered 'healthy'
    exit 0;
fi

response="$(curl --silent --show-error --fail-with-body --unix-socket /run/hub/app.sock --resolve "${TARGET}:80:127.0.0.1" "http://${TARGET}${SCRIPT_NAME}/.well-known/health")"
echo "$response"

<!DOCTYPE html><html><body>You don't have access to this location.</body></html>
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>403 - Forbidden</title>
  </head>
  <body>You don't have access to this location.</body>
</html>
<!DOCTYPE html><html><body>This is not the page you're looking for.</body></html>
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>404 - Not Found</title>
  </head>
  <body>This is not the page you're looking for.</body>
</html>
<!DOCTYPE html><html><body>Please stand by, application will be available again shortly.</body></html>
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>500 - Internal Server Error</title>
  </head>
  <body>Please stand by, application will be available again shortly.</body>
</html>
<!DOCTYPE html><html><body>There is nothing to be seen here.</body></html>
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Invalid Path</title>
  </head>
  <body>There is nothing to be seen here.</body>
</html>
@@ -6,7 +6,6 @@ IS_BACKOFFICE = True
IS_FRONTEND = False

ALLOWED_HOSTS = ['hub.rc3.world']
WORKADVENTURE_URL_SCHEME = 'https://play.{assembly_slug}.at.rc3.world/'  # other available placeholders are {room_id} and {username}

LOGGING = {
    'version': 1,
"""
Simple script to clean up old pipelines from the GitLab project.
"""

from datetime import UTC, datetime, timedelta

import requests
from environ import environ
from rich.progress import Progress

env = environ.FileAwareEnv(
    API_TOKEN=(str, None),
    THRESHOLD_ALL=(int, 52),
    THRESHOLD_FAILED=(int, 26),
    CI_API_V4_URL=(str, None),
    CI_PROJECT_ID=(int, None),
)


TOKEN = env('API_TOKEN')
PER_PAGE = 100
DELETE_ALL = datetime.now(tz=UTC) - timedelta(weeks=env('THRESHOLD_ALL'))
DELETE_FAILED = datetime.now(tz=UTC) - timedelta(weeks=env('THRESHOLD_FAILED'))
BASE_URL = f'{env("CI_API_V4_URL")}/projects/{env("CI_PROJECT_ID")}'

if TOKEN is None:
    raise ValueError('API_TOKEN is required')

def delete_pipelines(updated_before, status=None):
    """Delete all pipelines updated before the given timestamp, optionally filtered by status."""
    params = f'per_page={PER_PAGE}&updated_before={updated_before.strftime("%Y-%m-%dT%H:%M:%SZ")}&sort=asc&order_by=updated_at&page=1'
    if status is not None:
        params += f'&status={status}'
    while True:
        response = requests.get(f'{BASE_URL}/pipelines?{params}', headers={'PRIVATE-TOKEN': TOKEN})
        pipelines = response.json()
        if not pipelines:
            break
        with Progress() as progress:
            task = progress.add_task('[red]Deleting...', total=len(pipelines))
            for pipeline in pipelines:
                delete_response = requests.delete(f'{BASE_URL}/pipelines/{pipeline["id"]}', headers={'PRIVATE-TOKEN': TOKEN})
                assert delete_response.status_code == 204
                progress.update(task, advance=1)


# Delete every pipeline older than THRESHOLD_ALL, then failed pipelines older than THRESHOLD_FAILED.
delete_pipelines(DELETE_ALL)
delete_pipelines(DELETE_FAILED, status='failed')
@@ -4,8 +4,6 @@ services:
    build:
      context: .
      target: "${BUILD_TARGET:-dev}"
      args:
        DEVELOPMENT: "True"
    volumes:
      - ./src:/app
      - hubrun:/run/hub/
@@ -36,6 +34,18 @@ services:
    profiles:
      - test

  local-static:
    image: node-builder:latest
    build:
      context: .
      target: node-build
    volumes:
      # Mount only the static files, as we need the node_modules to be installed from build
      - ./src/plainui/styles:/app/plainui/styles
      - ./src/plainui/static:/app/plainui/static
    profiles:
      - build

  nginx:
    image: hubnginx:latest
    build:

pdm.lock

deleted 100644 → 0
+0 −1916

File deleted.


+238 −48
@@ -2,9 +2,7 @@
name = "hub"
dynamic = ["version"]
description = "Management tool for chaos events."
authors = [
    {name = "Roang", email = "lucas@brandstaetter.tech"},
]
authors = [{ name = "Hub Team", email = "hub@cccv.de" }]
dependencies = [
    "babel~=2.16",
    "bleach~=6.1",
@@ -12,15 +10,15 @@ dependencies = [
    "django-bootstrap5~=24.3",
    "django-cors-headers~=4.5",
    "django-debug-toolbar~=4.4.6",
    "django-environ~=0.11.2,<1",
    "django-modeltranslation~=0.18.9",
    "django-environ~=0.11.2,<0.12",
    "django-modeltranslation>=0.19.0,<0.20",
    "django-oauth-toolkit~=3.0.1",
    "django-ratelimit~=4.1.0",
    "django-redis~=5.4.0",
    "django-storages~=1.14.4",
    "django-timezone-field~=7.0.0",
    "django-widget-tweaks~=1.5.0",
    "Django==5.1.2,<6",
    "Django>=5.1.2,<5.2.0",
    "djangorestframework~=3.15.2",
    "freezegun~=1.5.1",
    "Jinja2~=3.1.4,<4",
@@ -31,89 +29,281 @@ dependencies = [
    "openpyxl~=3.1.2",
    "ordered-set~=4.1.0",
    "pandas~=2.2.3",
    "Pillow~=11.0",
    "Pillow<11.0",
    "Pygments~=2.18",
    "pyjwt~=2.9.0",
    "requests-file~=1.5.1,<2",
    "requests~=2.31.0",
    "requests-file>=2.0.0,<3",
    "requests>=2.31.0,<3",
    "segno~=1.6.1",
    "sentry-sdk~=2.17.0",
    "sentry-sdk>=2.17.0,<3",
    "tzdata~=2024.2",
    "pdm>=2.11.2",
    "psycopg[binary,pool]>=3.2.3",
    "gunicorn>=23.0.0",
    "pydantic>=2.9.2",
    "django-rich>=1.13.0",
    "django-csp>=3.8",
    "rules>=3.5",
    "django-ical>=1.9.2",
    "django-stubs-ext>=5.1.2",
    "ipython>=8.31.0",
    "django-extensions>=3.2.3",
    "python-magic>=0.4.27",
]
requires-python = "==3.13.*"
readme = "README.md"
license = { text = "MIT" }

[tool.pdm]
distribution = false

[tool.pdm.version]
source = "scm"
write_to = "version.txt"

[tool.pdm.dev-dependencies]
lint = [
    "ruff>=0.1.11",
    "djlint>=1.34.1",
]
local = [
    "debugpy>=1.8.0",
    "icecream>=2.1.3",
]
[dependency-groups]
dev = [
    "tox>=4.11.4",
    "coverage>=7.4.0",
    "tox-pdm>=0.7.2",
    "pre-commit>=3.6.0",
    "docutils>=0.21.2",
    "pre-commit>=3.6.0",
    "tblib>=3.0.0",
    "tox-uv>=1.19.0",
    "tox>=4.11.4",
    "taskipy>=1.14.1",
]
lint = ["ruff>=0.1.11", "djlint>=1.35.4"]
local = ["debugpy>=1.8.0", "icecream>=2.1.3"]
typing = [
    "django-stubs[mypy-compatible]>=5.1.1",
    "pyright>=1.1.392.post0",
    "djangorestframework-stubs[compatible-mypy]>=1.4.0",
    "django-stubs>=5.1.2",
]
static-analysis = ["pylint>=3.3.1", "pylint-django>=2.6.1"]
watchfiles = ["django-watchfiles>=1.0.0"]

[tool.pdm.scripts]
# Generic scripts
whoami = { shell = "echo `{pdm} -V` was called as '{pdm} -V'" }
# uv currently has no task runner, use taskipy instead.
# Ref: https://github.com/astral-sh/uv/issues/5903
[tool.taskipy.tasks]
# Application version
app_version = "./src/manage.py appversion"
# Django scripts
start = "./src/manage.py runserver"
shell = "./src/manage.py shell"
manage = "./src/manage.py"
start = { cmd = "./src/manage.py runserver", cwd = "." }
shell = { cmd = "./src/manage.py shell_plus", cwd = "." }
manage = { cmd = "./src/manage.py", cwd = "." }
compilemessages = { cmd = "./manage.py compilemessages", cwd = "./src" }
# TOX scripts
all = "tox"
check-migrations = "tox -e migrations"
coverage-report = "tox -e coverage-report"
style-check = {composite = ["format", "lint"]}
format = {composite = ["py-format"]}
lint = {composite = ["py-lint"]}
style-check = "task format && task lint"
format = "tox -e py-format,html-format"
lint = "tox -e py-lint,html-lint"
py-format = "tox -e py-format"
py-lint = "tox -e py-lint"
html-format = "tox -e html-format"
html-lint = "tox -e html-lint"
test = "tox -e django-test"
live-test = "tox -e live-test"

[tool.coverage.run]
data_file = ".tools/coverage/coverage"
dynamic_context = "test_function"
omit =[
    "*/migrations/*"]
omit = ["*/migrations/*"]
branch = true

[tool.coverage.report]
show_missing = true
exclude_also = [
    "def __repr__",
    "if settings.DEBUG",
    "if TYPE_CHECKING:",
    "raise NotImplementedError",
]

[tool.coverage.xml]
output = ".tools/coverage/coverage.xml"
output = ".tools/coverage/report.xml"

[tool.coverage.html]
directory = ".tools/coverage/html_report"

[tool.djlint]
indent = 2
ignore = "H006,H013,H021,H022,H030,H031"
linter_output_format = "{filename}:{line}: {code} {message} {match}"
preserve_blank_lines = true
# We need to override the default settings for the GitLab CI
# Ref: https://github.com/djlint/djLint/issues/1028
exclude = ".venv"
extend_exclude = "__pypackages__,_build,.bzr,.direnv,.eggs,.git,.git-rewrite,.hg,.ipynb_checkpoints,.mypy_cache,.nox,.pants.d,.pytest_cache,.pytype,.ruff_cache,.svn,.tox,.venv,.vscode,buck-out,dist,node_modules,venv,.tools"

[tool.mypy]
plugins = ["mypy_django_plugin.main"]

[tool.django-stubs]
django_settings_module = "hub.settings.base"

[tool.tox]
requires = ["tox>=4"]
env_list = [
    "py-lint",
    "py-format",
    "html-lint",
    "html-format",
    "django-test",
    "coverage-report",
]
work_dir = ".tools/tox"

[tool.tox.env_run_base]
runner = "uv-venv-lock-runner"
skip_install = true

[tool.tox.env.hub.set_env]
ALLOWED_HOSTS = { replace = "env", name = "ALLOWED_HOSTS" }
DATABASE_URL = { replace = "env", name = "DATABASE_URL" }
DJANGO_DEBUG = "I_KNOW_WHAT_I_AM_DOING"
DJANGO_SETTINGS_MODULE = "hub.settings.test"
DISABLE_REQUEST_LOGGING = "True"
SERVE_API = "True"
SERVE_BACKOFFICE = "True"
SERVE_FRONTEND = "True"
SSO_SECRET_GENERATE = "True"
STORAGE_TYPE = "local"
USE_PLAIN_STATICFILES = "True"
METRICS_SERVER_IPS = "127.0.0.1,localhost"

[tool.tox.env.py-lint]
uv_sync_flags = ["--only-group=lint"]
commands = [["ruff", "check", "."]]

[tool.tox.env.py-format]
uv_sync_flags = ["--only-group=lint"]
commands = [["ruff", "format", ".", "--diff"]]

[tool.tox.env.html-lint]
uv_sync_flags = ["--only-group=lint"]
commands = [
    [
        "djlint",
        "--profile",
        "jinja",
        "-e",
        "j2",
        "--lint",
        ".",
    ],
    [
        "djlint",
        "--profile",
        "django",
        "-e",
        "html",
        "--lint",
        ".",
    ],
]

[tool.tox.env.html-format]
uv_sync_flags = ["--only-group=lint"]
commands = [
    [
        "djlint",
        "--profile",
        "jinja",
        "-e",
        "j2",
        "--reformat",
        ".",
    ],
    [
        "djlint",
        "--profile",
        "django",
        "-e",
        "html",
        "--reformat",
        ".",
    ],
]

[tool.tox.env.django-test]
set_env = [
    { replace = "ref", of = [
        "tool",
        "tox",
        "env",
        "hub",
        "set_env",
    ] },
    { PYTHONWARNINGS = "always" },
]
allowlist_externals = ["coverage"]
extras = ["dev"]
commands = [
    [
        "coverage",
        "run",
        "--concurrency=multiprocessing",
        "{toxinidir}/src/manage.py",
        "test",
        "-v",
        "2",
        "--parallel=8",
        'src',
        { replace = "posargs", extend = true },
    ],
    [
        "coverage",
        "combine",
    ],
]

[tool.tox.env.django-test-single]
set_env = [
    { replace = "ref", of = [
        "tool",
        "tox",
        "env",
        "hub",
        "set_env",
    ] },
    { PYTHONWARNINGS = "always" },
]
extras = ["dev"]
change_dir = "{toxinidir}/src"
allowlist_externals = ["coverage"]
commands = [
    [
        "coverage",
        "run",
        "./manage.py",
        "test",
        "-v",
        "2",
        { replace = "posargs", extend = true },
    ],
]

[tool.tox.env.live-test]
groups = "dev"
change_dir = "{toxinidir}"
commands = [["python", "-m", "unittest", "{posargs}"]]

[tool.tox.env.live-test.set_env]
PYTHONWARNINGS = "always"
SERVE_API = "yes"

[tool.tox.env.coverage-report]
deps = ["coverage"]
commands = [["coverage", "report"], ["coverage", "xml"], ["coverage", "html"]]

[tool.pyright]

include = ["src"]
exclude = [
    "**/node_modules",
    "**/dist",
    "**/build",
    "**/.venv",
    "**/.vscode",
    "**/.tools",
]
venvPath = "."
venv = ".venv"
verboseOutput = true

reportMissingTypeArgument = "information"
reportPrivateUsage = "information"
typeCheckingMode = "off"             # "off", "basic", "standard", "strict"
# This file is @generated by PDM.
# Please do not edit it manually.

anyio==4.6.2.post1
# This file was autogenerated by uv via the following command:
#    uv export --frozen -o=requirements.dev.txt --no-hashes --all-groups
annotated-types==0.7.0
anyio==4.8.0
asgiref==3.8.1
asttokens==2.4.1
babel==2.16.0
beautifulsoup4==4.12.3
bleach==6.1.0
blinker==1.8.2
boto3==1.35.44
botocore==1.35.44
cachetools==5.5.0
certifi==2024.8.30
cffi==1.17.1; platform_python_implementation != "PyPy"
astroid==3.3.8
asttokens==3.0.0
babel==2.17.0
beautifulsoup4==4.13.3
bleach==6.2.0
boto3==1.36.26
botocore==1.36.26
cachetools==5.5.2
certifi==2025.1.31
cffi==1.17.1 ; platform_python_implementation != 'PyPy'
cfgv==3.4.0
chardet==5.2.0
charset-normalizer==3.4.0
click==8.1.7
charset-normalizer==3.4.1
click==8.1.8
colorama==0.4.6
coverage==7.6.3
cryptography==43.0.3
cssbeautifier==1.15.1
debugpy==1.8.7
coverage==7.6.12
cryptography==44.0.1
cssbeautifier==1.15.3
debugpy==1.8.12
decorator==5.2.0
defusedxml==0.7.1
dep-logic==0.4.9
dill==0.3.9
distlib==0.3.9
django==5.1.2
django==5.1.6
django-bootstrap5==24.3
django-cors-headers==4.5.0
django-cors-headers==4.7.0
django-csp==3.8
django-debug-toolbar==4.4.6
django-environ==0.11.2
django-modeltranslation==0.18.13
django-extensions==3.2.3
django-ical==1.9.2
django-modeltranslation==0.19.12
django-oauth-toolkit==3.0.1
django-ratelimit==4.1.0
django-recurrence==1.11.1
django-redis==5.4.0
django-storages==1.14.4
django-stubs-ext==5.1.1
django-stubs[mypy-compatible]==5.1.1
django-rich==1.14.0
django-storages==1.14.5
django-stubs==5.1.3
django-stubs-ext==5.1.3
django-timezone-field==7.0
django-watchfiles==1.1.0
django-widget-tweaks==1.5.0
djangorestframework==3.15.2
djlint==1.35.2
djangorestframework-stubs==3.15.3
djlint==1.36.4
docutils==0.21.2
editorconfig==0.12.4
et-xmlfile==1.1.0
executing==2.1.0
filelock==3.16.1
findpython==0.6.2
editorconfig==0.17.0
et-xmlfile==2.0.0
executing==2.2.0
filelock==3.17.0
freezegun==1.5.1
gunicorn==23.0.0
h11==0.14.0
hishel==0.0.33
html-tag-names==0.1.2
html-void-elements==0.1.0
httpcore==1.0.6
httpx[socks]==0.27.2
icecream==2.1.3
identify==2.6.1
icalendar==6.1.1
icecream==2.1.4
identify==2.6.7
idna==3.10
installer==0.7.0
jinja2==3.1.4
ipython==8.32.0
isort==6.0.0
jedi==0.19.2
jinja2==3.1.5
jmespath==1.0.1
jsbeautifier==1.15.1
json5==0.9.25
jsbeautifier==1.15.3
json5==0.10.0
jwcrypto==1.5.6
lxml==5.3.0
lxml==5.3.1
markdown-it-py==3.0.0
markdownify==0.13.1
markupsafe==3.0.2
matplotlib-inline==0.1.7
mccabe==0.7.0
mdurl==0.1.2
mistletoe==1.4.0
msgpack==1.1.0
mslex==1.3.0 ; sys_platform == 'win32'
mypy==1.15.0
mypy-extensions==1.0.0
nodeenv==1.9.1
numpy==2.1.2; python_version >= "3.12"
numpy==2.2.3
oauthlib==3.2.2
odfpy==1.4.1
openpyxl==3.1.5
ordered-set==4.1.0
packaging==24.1
packaging==24.2
pandas==2.2.3
parso==0.8.4
pathspec==0.12.1
pbs-installer==2024.10.16
pdm==2.19.3
pillow==11.0.0
pexpect==4.9.0 ; sys_platform != 'emscripten' and sys_platform != 'win32'
pillow==10.4.0
platformdirs==4.3.6
pluggy==1.5.0
pre-commit==4.0.1
psycopg-binary==3.2.3; implementation_name != "pypy"
psycopg-pool==3.2.3
psycopg[binary,pool]==3.2.3
pycparser==2.22; platform_python_implementation != "PyPy"
pygments==2.18.0
pre-commit==4.1.0
prompt-toolkit==3.0.50
psutil==6.1.1
psycopg==3.2.4
psycopg-binary==3.2.4 ; implementation_name != 'pypy'
psycopg-pool==3.2.5
ptyprocess==0.7.0 ; sys_platform != 'emscripten' and sys_platform != 'win32'
pure-eval==0.2.3
pycparser==2.22 ; platform_python_implementation != 'PyPy'
pydantic==2.10.6
pydantic-core==2.27.2
pygments==2.19.1
pyjwt==2.9.0
pyproject-api==1.8.0
pyproject-hooks==1.2.0
pylint==3.3.4
pylint-django==2.6.1
pylint-plugin-utils==0.8.2
pyproject-api==1.9.0
pyright==1.1.394
python-dateutil==2.9.0.post0
python-dotenv==1.0.1
pytz==2024.2
python-magic==0.4.27
pytz==2025.1
pyyaml==6.0.2
redis==5.1.1
regex==2024.9.11
requests==2.31.0
requests-file==1.5.1
resolvelib==1.0.1
rich==13.9.2
ruff==0.7.0
s3transfer==0.10.3
redis==5.2.1
regex==2024.11.6
requests==2.32.3
requests-file==2.1.0
rich==13.9.4
ruff==0.9.7
rules==3.5
s3transfer==0.11.2
segno==1.6.1
sentry-sdk==2.17.0
shellingham==1.5.4
six==1.16.0
sentry-sdk==2.22.0
six==1.17.0
sniffio==1.3.1
socksio==1.0.0
soupsieve==2.6
sqlparse==0.5.1
sqlparse==0.5.3
stack-data==0.6.3
taskipy==1.14.1
tblib==3.0.0
tomli==2.2.1
tomlkit==0.13.2
tox==4.23.0
tox-pdm==0.7.2
tqdm==4.66.5
truststore==0.9.2; python_version >= "3.10"
types-pyyaml==6.0.12.20240917
tox==4.24.1
tox-uv==1.25.0
tqdm==4.67.1
traitlets==5.14.3
types-pyyaml==6.0.12.20241230
types-requests==2.32.0.20241016
typing-extensions==4.12.2
tzdata==2024.2
unearth==0.17.2
urllib3==2.2.3
virtualenv==20.27.0
urllib3==2.3.0
uv==0.6.2
virtualenv==20.29.2
watchfiles==1.0.4
wcwidth==0.2.13
webencodings==0.5.1
+63 −62
# This file is @generated by PDM.
# Please do not edit it manually.

anyio==4.6.2.post1
# This file was autogenerated by uv via the following command:
#    uv export --frozen -o=requirements.txt --no-hashes --no-dev
annotated-types==0.7.0
asgiref==3.8.1
babel==2.16.0
beautifulsoup4==4.12.3
bleach==6.1.0
blinker==1.8.2
boto3==1.35.44
botocore==1.35.44
certifi==2024.8.30
cffi==1.17.1; platform_python_implementation != "PyPy"
charset-normalizer==3.4.0
cryptography==43.0.3
asttokens==3.0.0
babel==2.17.0
beautifulsoup4==4.13.3
bleach==6.2.0
boto3==1.36.26
botocore==1.36.26
certifi==2025.1.31
cffi==1.17.1 ; platform_python_implementation != 'PyPy'
charset-normalizer==3.4.1
colorama==0.4.6 ; sys_platform == 'win32'
cryptography==44.0.1
decorator==5.2.0
defusedxml==0.7.1
dep-logic==0.4.9
distlib==0.3.9
django==5.1.2
django==5.1.6
django-bootstrap5==24.3
django-cors-headers==4.5.0
django-cors-headers==4.7.0
django-csp==3.8
django-debug-toolbar==4.4.6
django-environ==0.11.2
django-modeltranslation==0.18.13
django-extensions==3.2.3
django-ical==1.9.2
django-modeltranslation==0.19.12
django-oauth-toolkit==3.0.1
django-ratelimit==4.1.0
django-recurrence==1.11.1
django-redis==5.4.0
django-storages==1.14.4
django-rich==1.14.0
django-storages==1.14.5
django-stubs-ext==5.1.3
django-timezone-field==7.0
django-widget-tweaks==1.5.0
djangorestframework==3.15.2
et-xmlfile==1.1.0
filelock==3.16.1
findpython==0.6.2
et-xmlfile==2.0.0
executing==2.2.0
freezegun==1.5.1
gunicorn==23.0.0
h11==0.14.0
hishel==0.0.33
httpcore==1.0.6
httpx[socks]==0.27.2
icalendar==6.1.1
idna==3.10
installer==0.7.0
jinja2==3.1.4
ipython==8.32.0
jedi==0.19.2
jinja2==3.1.5
jmespath==1.0.1
jwcrypto==1.5.6
lxml==5.3.0
lxml==5.3.1
markdown-it-py==3.0.0
markdownify==0.13.1
markupsafe==3.0.2
matplotlib-inline==0.1.7
mdurl==0.1.2
mistletoe==1.4.0
msgpack==1.1.0
numpy==2.1.2; python_version >= "3.12"
numpy==2.2.3
oauthlib==3.2.2
odfpy==1.4.1
openpyxl==3.1.5
ordered-set==4.1.0
packaging==24.1
packaging==24.2
pandas==2.2.3
pbs-installer==2024.10.16
pdm==2.19.3
pillow==11.0.0
platformdirs==4.3.6
psycopg-binary==3.2.3; implementation_name != "pypy"
psycopg-pool==3.2.3
psycopg[binary,pool]==3.2.3
pycparser==2.22; platform_python_implementation != "PyPy"
pygments==2.18.0
parso==0.8.4
pexpect==4.9.0 ; sys_platform != 'emscripten' and sys_platform != 'win32'
pillow==10.4.0
prompt-toolkit==3.0.50
psycopg==3.2.4
psycopg-binary==3.2.4 ; implementation_name != 'pypy'
psycopg-pool==3.2.5
ptyprocess==0.7.0 ; sys_platform != 'emscripten' and sys_platform != 'win32'
pure-eval==0.2.3
pycparser==2.22 ; platform_python_implementation != 'PyPy'
pydantic==2.10.6
pydantic-core==2.27.2
pygments==2.19.1
pyjwt==2.9.0
pyproject-hooks==1.2.0
python-dateutil==2.9.0.post0
python-dotenv==1.0.1
pytz==2024.2
redis==5.1.1
requests==2.31.0
requests-file==1.5.1
resolvelib==1.0.1
rich==13.9.2
s3transfer==0.10.3
python-magic==0.4.27
pytz==2025.1
redis==5.2.1
requests==2.32.3
requests-file==2.1.0
rich==13.9.4
rules==3.5
s3transfer==0.11.2
segno==1.6.1
sentry-sdk==2.17.0
shellingham==1.5.4
six==1.16.0
sniffio==1.3.1
socksio==1.0.0
sentry-sdk==2.22.0
six==1.17.0
soupsieve==2.6
sqlparse==0.5.1
tomlkit==0.13.2
truststore==0.9.2; python_version >= "3.10"
sqlparse==0.5.3
stack-data==0.6.3
traitlets==5.14.3
typing-extensions==4.12.2
tzdata==2024.2
unearth==0.17.2
urllib3==2.2.3
virtualenv==20.27.0
urllib3==2.3.0
wcwidth==0.2.13
webencodings==0.5.1

src/.pdm-python

0 → 100644
+1 −0
/install/.venv/bin/python
 No newline at end of file
+3 −0
@@ -91,7 +91,10 @@ section-order = [
    "django-storages",
    "django-widget-tweaks",
    "django_bootstrap5",
    "django_rich",
    "djangorestframework",
    "csp",
    "modeltranslation",
    "ratelimit",
    "rules",
]

src/api/ical.py

0 → 100644
+75 −0
from typing import Optional

from django_ical.views import ICalFeed

from django.utils.text import slugify

from core.models import (
    Conference,
    Event,
)


class EventICalFeed(ICalFeed):
    """
    EventICalFeed serializes Events to an iCal response.
    """

    def __init__(self, events: list[Event], name: Optional[str] = None, conference: Optional[Conference] = None):
        super().__init__()
        self.name = name
        self.conference = conference
        self.events = events

    def product_id(self) -> str:
        issuer = self.conference.name if self.conference else 'Hub'
        product = self.name if self.name else 'iCal export'
        return f'-//{issuer}//{product}'

    def title(self) -> str:
        if self.name:
            return self.name
        return 'Hub iCal export'

    def items(self) -> list[Event]:
        return self.events

    def item_title(self, item) -> str:
        return item.name

    def item_description(self, item) -> str:
        return item.description

    def item_start_datetime(self, item):
        return item.schedule_start

    def item_end_datetime(self, item):
        return item.schedule_end

    def item_location(self, item) -> str:
        if item.room:
            return item.room.name
        if item.location:
            return item.location
        return ''

    def item_guid(self, item) -> str:
        return item.pk

    def item_categories(self, item) -> list[str]:
        categories = []
        if item.recording == Event.Recording.YES:
            categories.append('recorded')
        elif item.recording == Event.Recording.NO:
            categories.append('not recorded')

        if item.kind == Event.Kind.OFFICIAL or item.assembly.is_official:
            categories.append('official')

        if item.track:
            categories.append(item.track.name)

        return categories

    def file_name(self) -> str:
        return slugify(self.title()) + '.ics'
@@ -18,7 +18,7 @@ msgstr ""
"Plural-Forms: nplurals=2; plural=(n != 1);\n"

msgid "SSO__authorize"
msgstr "Authorisieren von"
msgstr "Autorisieren von"

msgid "SSO__authorize_scopes"
msgstr "Die Seite möchte folgendes wissen:"
@@ -33,7 +33,7 @@ msgid "SSO__authorize_accept"
msgstr "akzeptieren"

msgid "SSO__out_of_band_noscript"
msgstr "Bitte kopiere nun das Token in das Program, welches die Authentifikation angefragt hat. Du kannst das Token in der Adress Zeile zwischen '#code=' und '&' finden."
msgstr "Bitte kopiere nun das Token in das Program, welches die Authentifikation angefragt hat. Du kannst das Token in der Adress-Zeile zwischen '#code=' und '&' finden."

msgid "SSO__out_of_band"
msgstr "Bitte kopiere nun das Token in die Anwendung, welche die Authentifikation angefragt hat."
+0 −36
# Generated by Django 3.1.2 on 2020-10-31 14:30

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('core', '0005_remove_children_and_fsk'),
    ]

    operations = [
        migrations.CreateModel(
            name='StickerToken',
            fields=[
                ('token', models.CharField(max_length=50, primary_key=True, serialize=False)),
                ('permanent', models.BooleanField(default=False)),
                ('issued', models.DateTimeField(auto_now_add=True)),
                ('sticker', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='+', to='core.sticker')),
            ],
        ),
        migrations.CreateModel(
            name='RoomLinkUserId',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
                ('roomlink', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='+', to='core.roomlink')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='+', to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]
# Generated by Django 3.1.2 on 2020-11-09 10:36

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0007_rename_stickers_to_badges'),
        ('api', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='StickerToken',
            name='sticker',
            field=models.ForeignKey(on_delete=models.deletion.CASCADE, related_name='+', to='core.badge'),
        ),
        migrations.RenameModel(old_name='StickerToken', new_name='BadgeToken'),
        migrations.RenameField(model_name='BadgeToken', old_name='sticker', new_name='badge'),
    ]
+0 −16
# Generated by Django 3.1.4 on 2020-12-25 21:21

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('api', '0002_rename_stickers_to_badges'),
    ]

    operations = [
        migrations.DeleteModel(
            name='BadgeToken',
        ),
    ]
# Generated by Django 3.2.10 on 2022-01-26 21:50

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('api', '0003_delete_badgetoken'),
    ]

    database_operations = [
        migrations.AlterModelTable('RoomLinkUserId', 'core_roomlinkuserid')
    ]

    state_operations = [
        migrations.DeleteModel('RoomLinkUserId')
    ]

    operations = [
        migrations.SeparateDatabaseAndState(
            database_operations=database_operations,
            state_operations=state_operations)
    ]
@@ -74,43 +74,63 @@ class IsConferenceService(ConferencePermission):


class AssemblyPermission(ConferencePermission):
    _assembly: Assembly

    def get_assembly(self, *, view, obj=None) -> Assembly:
        if hasattr(self, '_assembly'):
            return self._assembly
        if hasattr(view, 'assembly'):
            assembly = view.assembly
        elif slug := view.kwargs.get('assembly', None):
            assembly = Assembly.objects.get(slug=slug)
        elif isinstance(obj, Assembly):
            assembly = obj
        elif hasattr(obj, 'assembly'):
        elif obj and hasattr(obj, 'assembly'):
            assembly = obj.assembly
        elif obj and hasattr(obj, 'issuing_assembly'):
            assembly = obj.issuing_assembly
        else:
            raise ObjectDoesNotExist('Assembly for this view not found')
            raise Assembly.DoesNotExist
        self._assembly = assembly
        return assembly
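The `_assembly` attribute above memoizes the lookup so repeated permission checks during one request resolve the assembly at most once. The pattern in isolation, with stub classes standing in for the real models (none of these names are from the hub codebase):

```python
class Resource:
    """Stand-in for the Assembly model; real code would query the ORM."""
    def __init__(self, slug):
        self.slug = slug

class CachedLookup:
    # Cache the resolved resource on the instance so repeated calls
    # perform the (potentially expensive) resolution only once.
    def __init__(self):
        self.resolutions = 0

    def get_resource(self, slug):
        if hasattr(self, '_resource'):
            return self._resource
        self.resolutions += 1          # stands in for a database query
        self._resource = Resource(slug)
        return self._resource

lookup = CachedLookup()
first = lookup.get_resource('example-assembly')
second = lookup.get_resource('example-assembly')
print(lookup.resolutions)  # 1 — the second call is served from the cache
```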


class IsPublicAssemblyReadOnly(AssemblyPermission):
    def has_permission(self, request, view):
        try:
            self.get_assembly(view=view)
            return self.has_object_permission(request, view)
        except Assembly.DoesNotExist:
            return bool(getattr(view, 'fallback_to_object', False))

    def has_object_permission(self, request, view, obj=None):
        try:
            assembly = self.get_assembly(view=view)
        except ObjectDoesNotExist:
            assembly = self.get_assembly(view=view, obj=obj)
        except Assembly.DoesNotExist:
            return False
        return request.method in permissions.SAFE_METHODS and assembly.is_public


class IsAssemblyService(AssemblyPermission):
    def has_permission(self, request, view):
        try:
            self.get_assembly(view=view)
            return self.has_object_permission(request, view)
        except Assembly.DoesNotExist:
            return bool(getattr(view, 'fallback_to_object', False))

    def has_object_permission(self, request, view, obj=None):
        assembly = self.get_assembly(view=view, obj=obj)

        return assembly.technical_user == request.user


class IsAssemblyManager(AssemblyPermission):
    def has_permission(self, request, view):
        try:
            self.get_assembly(view=view)
            return self.has_object_permission(request, view)
        except Assembly.DoesNotExist:
            return bool(getattr(view, 'fallback_to_object', False))

    def has_object_permission(self, request, view, obj=None):
        user = request.user
@@ -120,20 +140,23 @@ class IsAssemblyManager(AssemblyPermission):
            return False

        try:
            assembly = self.get_assembly(view=view)
            assembly = self.get_assembly(view=view, obj=obj)
        except ObjectDoesNotExist:
            return False
        query_set = Assembly.objects.filter(pk=assembly.id)
        return query_set.filter(members__member=user, members__can_manage_assembly=True).exists()
        return request.user.has_perm('core.change_assembly', assembly)


class HasIssuingToken(AssemblyPermission):
    def has_permission(self, request, view):
        try:
            self.get_assembly(view=view)
            return self.has_object_permission(request, view)
        except Assembly.DoesNotExist:
            return bool(getattr(view, 'fallback_to_object', False))

    def has_object_permission(self, request, view, obj=None):
        try:
            assembly = self.get_assembly(view=view)
            assembly = self.get_assembly(view=view, obj=obj)
        except ObjectDoesNotExist:
            return False
        if not (issuing_token := view.kwargs.get('issuing_token', None)):
@@ -4,7 +4,7 @@ import logging
import re
from collections import OrderedDict
from datetime import datetime, timedelta
from typing import Optional, TYPE_CHECKING
from typing import TYPE_CHECKING, Any
from uuid import UUID

from lxml import etree as ET
@@ -18,6 +18,7 @@ from core.models.conference import Conference, ConferenceMember
from core.models.events import Event, EventParticipant
from core.models.rooms import Room
from core.models.users import PlatformUser
from core.markdown import normalize_markdown

if TYPE_CHECKING:
    from django.db.models import QuerySet
@@ -90,8 +91,14 @@ class RoomDay:

class ScheduleEncoder(json.JSONEncoder):
    tz = None
    conference: Conference

    def __init__(self, *args, conference: Conference = None, **kwargs):
        assert conference is not None
        self.conference = conference
        super().__init__(*args, **kwargs)

    def encode_duration(self, duration: Optional[timedelta]) -> Optional[str]:
    def encode_duration(self, duration: timedelta | None) -> str | None:
        """converts a python `timedelta` to the schedule xml timedelta string that represents this timedelta. ([d:]HH:mm)"""

        if duration is None:
@@ -109,6 +116,7 @@ class ScheduleEncoder(json.JSONEncoder):
            return {'id': None, 'name': p, 'public_name': p}

        if isinstance(p, PlatformUser):
            # TODO: Update after deciding on one or more conferences in #648
            member: ConferenceMember = p.conferences.first()  # TODO search for correct conference
            name = p.get_display_name()

@@ -117,7 +125,7 @@ class ScheduleEncoder(json.JSONEncoder):
                'name': name,
                'public_name': name,
                'avatar': p.avatar_url,
                'biography': member.description if member else None,
                'biography': normalize_markdown(self.conference, member.description) if member and member.description else None,
                # 'links': p.links,  # TODO
                'url': p.get_absolute_url(),
            }
@@ -131,7 +139,7 @@ class ScheduleEncoder(json.JSONEncoder):
                'name': name,
                'public_name': name,
                'avatar': p.participant.avatar_url,
                'biography': member.description if member else '',
                'biography': normalize_markdown(self.conference, member.description) if member and member.description else '',
                # 'links': p.participant.links,  # TODO
                'url': p.participant.get_absolute_url(),
            }
@@ -162,7 +170,6 @@ class ScheduleEncoder(json.JSONEncoder):
        start = event.schedule_start.astimezone(tz or self.tz) if event.schedule_start is not None else None
        additional_data = event.additional_data or {}
        legacy_id = additional_data.get('id') or int(re.sub('[^0-9]+', '', str(event.id))[0:6])
        slug = f'{event.conference.slug}-{legacy_id}-{event.slug}'

        if event.streaming == Event.Streaming.NO:
            additional_data['do_not_stream'] = True
@@ -177,9 +184,9 @@ class ScheduleEncoder(json.JSONEncoder):
            **additional_data,
            # ATTENTION: if the description also exists in additional_data it is overwritten
            'abstract': event.abstract,
            'description': event.description,
            'description': normalize_markdown(self.conference, event.description) if event.description else event.description,
            'persons': self.collect_persons(event),
            'slug': slug,
            'slug': event.voc_slug,
            'url': event.get_absolute_url(),
            'guid': event.id,
            'date': start.isoformat() if start is not None else None,
@@ -215,8 +222,8 @@ class ScheduleEncoder(json.JSONEncoder):
            'type': obj.room_type,
            'stream_id': obj.voc_stream_id,
            'capacity': obj.capacity,
            'description_en': obj.description_en,
            'description_de': obj.description_de,
            'description_en': normalize_markdown(self.conference, obj.description_en) if obj.description_en else obj.description_en,
            'description_de': normalize_markdown(self.conference, obj.description_de) if obj.description_de else obj.description_de,
            'features': features,
            'assembly': obj.assembly if obj.assembly else None,
            # Future TODO 'url': obj.get_absolute_url(),
@@ -227,7 +234,7 @@ class ScheduleEncoder(json.JSONEncoder):
            'name': obj.name,
            'slug': obj.slug,
            'guid': obj.id,
            # Future TODO: # 'type': obj., # channel vs. cluster vs. virtual vs. ...?
            # Future TODO: # 'type': obj., # cluster vs. virtual vs. ...?
            # 'description_en': obj.description_en,
            # 'description_de': obj.description_de,
            # Future TODO  'url': obj.get_absolute_url(),
@@ -251,9 +258,9 @@ class ScheduleEncoder(json.JSONEncoder):
            return str(obj)
        return None

    def default(self, obj):
        r = self.transform(obj)
        return json.JSONEncoder.default(self, obj) if r is None else r
    def default(self, o: Any):
        r = self.transform(o)
        return json.JSONEncoder.default(self, o) if r is None else r
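The `conference=` keyword passed to `json.dumps` further down works because `json.dumps` forwards any unknown keyword arguments to the encoder class constructor. A minimal sketch of that mechanism, with a toy `ConferenceAwareEncoder` standing in for `ScheduleEncoder` (whose real `__init__` is not shown in this hunk):

```python
import json

class ConferenceAwareEncoder(json.JSONEncoder):
    """json.dumps(cls=..., conference=...) forwards the extra keyword
    to this constructor; that is how the encoder instance gets state."""

    def __init__(self, *args, conference=None, **kwargs):
        super().__init__(*args, **kwargs)
        self.conference = conference

    def default(self, o):
        # Tag otherwise non-serializable objects with the conference slug.
        return {'conference': self.conference, 'repr': repr(o)}

print(json.dumps({'x': object()}, cls=ConferenceAwareEncoder, conference='camp23'))
```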


class Schedule:
@@ -356,15 +363,15 @@ class Schedule:
        raise Warning('  illegal start time: ' + start_time.isoformat())

    def __str__(self):
        return json.dumps(self, indent=2, cls=ScheduleEncoder)
        return json.dumps(self, indent=2, cls=ScheduleEncoder, conference=self.conference)

    def json(self):
        return json.dumps(self, cls=ScheduleEncoder)
        return json.dumps(self, cls=ScheduleEncoder, conference=self.conference)

    # dict_to_etree from http://stackoverflow.com/a/10076823
    def xml(self):
        root_node = None
        encoder = ScheduleEncoder()
        encoder = ScheduleEncoder(conference=self.conference)

        def _set_attrib(tag, k, v):
            if isinstance(v, str):
@@ -11,10 +11,12 @@ from core.models.assemblies import Assembly
from core.models.badges import Badge, BadgeToken, BadgeTokenTimeConstraint
from core.models.conference import Conference, ConferenceMember, ConferenceTrack
from core.models.events import Event
from core.models.links import Link
from core.models.messages import DirectMessage
from core.models.metanavi import MetaNavItem
from core.models.rooms import Room
from core.models.users import PlatformUser, UserTimelineEntry
from core.templatetags.hub_absolute import hub_absolute


class ParameterisedHyperlinkedIdentityField(HyperlinkedIdentityField):
@@ -33,7 +35,7 @@ class ParameterisedHyperlinkedIdentityField(HyperlinkedIdentityField):
        self.lookup_fields = kwargs.pop('lookup_fields', self.lookup_fields)
        super().__init__(*args, **kwargs)

    def get_url(self, obj, view_name, request, format):
    def get_url(self, obj, view_name, request, format):  # pylint: disable=redefined-builtin
        """
        Given an object, return the URL that hyperlinks to the object.

@@ -50,16 +52,64 @@ class ParameterisedHyperlinkedIdentityField(HyperlinkedIdentityField):
        return reverse(view_name, kwargs=kwargs, request=request, format=format)


class HubHyperlinkedIdentityField(HyperlinkedIdentityField):
    """
    Represents the instance, or a property on the instance, using hyperlinking.

    lookup_fields is a tuple of tuples of the form:
        ('model_field', 'url_parameter')

    taken from https://github.com/encode/django-rest-framework/issues/1024
    """

    lookup_fields = (('pk', 'pk'),)

    def __init__(self, *args, **kwargs):
        self.lookup_fields = kwargs.pop('lookup_fields', self.lookup_fields)
        super().__init__(*args, **kwargs)

    def get_url(self, obj, view_name, request, format):  # pylint: disable=redefined-builtin
        """
        Given an object, return the URL that hyperlinks to the object.

        May raise a `NoReverseMatch` if the `view_name` and `lookup_field`
        attributes are not configured to correctly match the URL conf.
        """
        kwargs = {}
        for model_field, url_param in self.lookup_fields:
            attr = obj
            for field in model_field.split('.'):
                attr = getattr(attr, field)
            kwargs[url_param] = attr

        # TODO: add request=request, format=format,
        return hub_absolute(view_name, **kwargs, i18n=False)
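The dotted-path traversal in `get_url` can be exercised on its own. A sketch of the same loop, using a hypothetical room object with a nested assembly (the names `build_url_kwargs`, `room-1`, and `asmbly` are illustrative, not from the codebase):

```python
from types import SimpleNamespace

def build_url_kwargs(obj, lookup_fields):
    """The traversal get_url() performs: each 'a.b.c' model_field is
    resolved with chained getattr and mapped to its URL parameter."""
    kwargs = {}
    for model_field, url_param in lookup_fields:
        attr = obj
        for field in model_field.split('.'):
            attr = getattr(attr, field)
        kwargs[url_param] = attr
    return kwargs

# Hypothetical object graph, as a Room serializer might see it.
room = SimpleNamespace(slug='room-1', assembly=SimpleNamespace(slug='asmbly'))
print(build_url_kwargs(room, (('slug', 'slug'), ('assembly.slug', 'assembly_slug'))))
# → {'slug': 'room-1', 'assembly_slug': 'asmbly'}
```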


class LinkRelatedField(serializers.RelatedField):
    """
    A read only field that represents its targets using their
    plain string representation.
    """

    def __init__(self, **kwargs):
        kwargs['read_only'] = True
        super().__init__(**kwargs)

    def to_representation(self, value: Link):
        return value.to_dict()


class ValidatingModelSerializer(serializers.ModelSerializer):
    def validate(self, data):
        instance = self.Meta.model(**{field: value for field, value in data.items() if field in self.Meta.model._meta.fields})
        for f, v in {field: value for field, value in data.items() if field in self.Meta.model._meta.fields}.items():
    def validate(self, attrs):
        instance = self.Meta.model(**{field: value for field, value in attrs.items() if field in self.Meta.model._meta.fields})
        for f, v in {field: value for field, value in attrs.items() if field in self.Meta.model._meta.fields}.items():
            getattr(instance, f).set(v)
        try:
            instance.clean()
        except ValidationError as err:
            raise serializers.ValidationError(err.args[0])
        return data
        return attrs
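The shape of `ValidatingModelSerializer.validate` is: build an unsaved model instance from the incoming attrs, run model-level `clean()`, and re-raise any model `ValidationError` as a serializer error. A framework-free sketch of that flow, with a hypothetical `Talk` model stand-in:

```python
class ValidationError(Exception):
    """Stand-in for django.core.exceptions.ValidationError."""

class SerializerValidationError(Exception):
    """Stand-in for rest_framework.serializers.ValidationError."""

class Talk:
    """Hypothetical model with a clean() hook, as Django models have."""
    def __init__(self, title=''):
        self.title = title

    def clean(self):
        if not self.title:
            raise ValidationError('title must not be empty')

def validate(attrs):
    # Same shape as ValidatingModelSerializer.validate(): construct an
    # unsaved instance, run model clean(), translate the error type.
    instance = Talk(**attrs)
    try:
        instance.clean()
    except ValidationError as err:
        raise SerializerValidationError(err.args[0])
    return attrs

print(validate({'title': 'Opening'}))
# → {'title': 'Opening'}
```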


class HubModelSerializer(ValidatingModelSerializer):
@@ -147,8 +197,7 @@ class AssemblySerializer(HubModelSerializer):
            'slug',
            'id',
            'name',
            'state_assembly',
            'state_channel',
            'state',
            'hierarchy',
            'parent',
            'assembly_location',
@@ -158,7 +207,7 @@ class AssemblySerializer(HubModelSerializer):
            'rooms_url',
            'badges_url',
        ]
        staff_only_fields = ['state_assembly', 'state_channel', 'hierarchy']
        staff_only_fields = ['state', 'hierarchy']

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
@@ -178,7 +227,7 @@ class BadgeSerializer(HubModelSerializer):
    )

    def create(self, validated_data):
        issuing_assembly = Assembly.objects.filter(slug=self.context['assembly'])
        issuing_assembly = Assembly.objects.filter(slug__iexact=self.context['assembly'])
        conference = Conference.objects.filter(slug=self.context['conference'])
        validated_data.update({'conference': conference, 'issuing_assembly': issuing_assembly})
        return super().create(validated_data)
@@ -218,26 +267,29 @@ class BadgeTokenUpdateSerializer(BadgeTokenSerializer):
class RoomSerializer(HubModelSerializer):
    assembly = serializers.SlugRelatedField(read_only=True, slug_field='slug')

    links = serializers.StringRelatedField(
    links = LinkRelatedField(
        many=True,
        read_only=True,
    )

    public_url = HubHyperlinkedIdentityField(view_name='plainui:room', lookup_fields=(('slug', 'slug'),))

    class Meta:
        model = Room
        read_only_fields = ['id']
        fields = [
            'id',
            'name',
            'slug',
            'blocked',
            'room_type',
            'capacity',
            'links',
            'assembly',
            'links',
            'backend_link',
            'public_url',
        ]
        staff_only_fields = ['blocked', 'backend_link']
        staff_only_fields = ['blocked']


class EventSerializer(HubModelSerializer):
@@ -258,6 +310,7 @@ class EventSerializer(HubModelSerializer):
            'track',
            'assembly',
            'room',
            'location',
            'language',
            'description',
            'is_public',
@@ -375,3 +428,34 @@ class DirectMessageSendSerializer(HubModelSerializer):
        write_only_fields = [f for f in fields if f != 'id']

    recipient = PlatformUserByUsernameFieldSerializer()


class BadgeRewardSerializer(serializers.Serializer):
    """
    A serializer for awarding a badge to a user.

    The abstract methods are unused; this class only documents the expected fields.
    """

    username = serializers.CharField()
    dect = serializers.CharField()


class DectNumberSerializer(serializers.Serializer):
    """
    A serializer for checking if a DECT number exists.

    The abstract methods are unused; this class only documents the expected fields.
    """

    dect = serializers.IntegerField()


class DectVerificationTokenSerializer(serializers.Serializer):
    """
    A serializer for verifying a DECT number.

    The abstract methods are unused; this class only documents the expected fields.
    """

    token = serializers.CharField()
from .assemblies import *  # noqa: F401, F403
from .badges import *  # noqa: F401, F403
from .bbb import *  # noqa: F401, F403
from .engelsystem import *  # noqa: F401, F403
from .events import *  # noqa: F401, F403
from .map import *  # noqa: F401, F403
from .messages import *  # noqa: F401, F403
from .metrics import *  # noqa: F401, F403
from .rooms import *  # noqa: F401, F403
from .schedule import *  # noqa: F401, F403
from .workadventure import *  # noqa: F401, F403

__all__ = ('*',)  # noqa: F405
from api.tests.badges.create_redeem_token import *  # noqa: F401, F403
from api.tests.badges.award import AwardTestCase
from api.tests.badges.create_redeem_token import CreateRedeemTokenTests

__all__ = (
    'AwardTestCase',
    'CreateRedeemTokenTests',
)
import uuid
from datetime import datetime
from http import HTTPStatus
from zoneinfo import ZoneInfo

from rest_framework.authtoken.models import Token
from zoneinfo import ZoneInfo

from django.test import TestCase, override_settings
from django.urls import reverse
@@ -26,7 +26,7 @@ class CreateRedeemTokenTests(TestCase):
            end=datetime(2020, 12, 30, 23, 45, 00, tzinfo=tz),
        )
        self.conf.save()
        self.assembly = Assembly(name='TestAssembly', slug='asmbly', conference=self.conf, state_assembly=Assembly.State.PLACED)
        self.assembly = Assembly(name='TestAssembly', slug='asmbly', conference=self.conf, state=Assembly.State.PLACED)
        self.assembly.save()

        self.badge = Badge(

src/api/tests/bbb.py

deleted 100644 → 0
+0 −48
import uuid
from uuid import uuid4

from django.test import TestCase, override_settings
from django.urls import reverse

from core.models import Assembly, Conference, Room

TEST_CONF_ID = uuid.uuid4()


@override_settings(SELECTED_CONFERENCE_ID=TEST_CONF_ID, INTEGRATIONS_BBB=True)
class BBBTest(TestCase):
    def test_MeetingEnded(self):
        conf = Conference(id=TEST_CONF_ID, slug='conf', name='TestConf')
        conf.save()
        assembly = Assembly(name='TestAssembly', slug='asmbly', conference=conf)
        assembly.save()
        room = Room(
            conference=conf,
            assembly=assembly,
            name='Room 1',
            room_type=Room.RoomType.BIGBLUEBUTTON,
            backend_status=Room.BackendStatus.ACTIVE,
            backend_link=str(uuid4()),
            backend_data={'close_secret': 'asdf'},
        )
        room.save()

        self.client.get(
            reverse('api:bbb_meeting_end'),
            {
                'meetingID': room.backend_link,
                'close_secret': 'invalid',
            },
        )
        room.refresh_from_db()
        self.assertEqual(room.backend_status, Room.BackendStatus.ACTIVE)

        self.client.get(
            reverse('api:bbb_meeting_end'),
            {
                'meetingID': room.backend_link,
                'close_secret': 'asdf',
            },
        )
        room.refresh_from_db()
        self.assertEqual(room.backend_status, Room.BackendStatus.INACTIVE)
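The deleted test above exercises an endpoint whose handler is not shown in this diff. As a hypothetical sketch of the behavior the assertions imply (only a matching `close_secret` deactivates the room; the names and dict layout here are illustrative):

```python
def end_meeting(rooms, meeting_id, close_secret):
    """Hypothetical sketch of the bbb_meeting_end handler the test
    exercised: deactivate the room only when close_secret matches."""
    room = rooms.get(meeting_id)
    if room is None or room['backend_data'].get('close_secret') != close_secret:
        return False  # wrong secret: leave backend_status untouched
    room['backend_status'] = 'INACTIVE'
    return True

rooms = {'m1': {'backend_data': {'close_secret': 'asdf'}, 'backend_status': 'ACTIVE'}}
print(end_meeting(rooms, 'm1', 'invalid'), rooms['m1']['backend_status'])  # → False ACTIVE
print(end_meeting(rooms, 'm1', 'asdf'), rooms['m1']['backend_status'])     # → True INACTIVE
```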

File changed. Preview size limit exceeded, changes collapsed.

src/api/tests/rooms.py

0 → 100644
+72 −0

File added. Preview size limit exceeded, changes collapsed.

+13 −26

File changed. Preview size limit exceeded, changes collapsed.

@@ -8,7 +8,7 @@ __all__ = [


@api_view(['GET'])
def api_root(request, format=None):
def api_root(request, format=None):  # pylint: disable=redefined-builtin
    links = {
        'conference': {
            'info': reverse('api:conference-detail', request=request, format=format),