- About Kultus
- Service architecture
- Development with Docker
- Development without Docker
- Configuration
- API Documentation
- Audit logging
- Keeping Python requirements up to date
- Code format
- Commit message format
- Issues board
- Maintaining
- Releases, changelogs and deployments
What is it? A service for teachers to find and book culture and leisure sector activities for their student groups.
Why does it exist? To make booking culture and leisure activities easy for all participants:
- help teachers find & enrol their student groups in activities
- help providers (the Culture and Leisure Division and, in part, the metropolitan-area ecosystem) show activities to teachers & their student groups
- gather data (equal access and usage across all groups, use of KUVA's work, resource needs, etc.) for service development
Who uses the service?
- Teacher
- A school or kindergarten teacher who seeks educational activities for student groups.
- Provider
A third-party user or a Culture and Leisure Division (i.e. KUVA) services user who wants to provide activities for teachers & their student groups
- Admin
- A user who manages the system's rights, pages etc.
The Kultus service consists of:
- Kultus API: The API backend service and the primary source of data. Integrates with the LinkedEvents API and extends its features.
- Providers' UI: A restricted UI where the events are maintained and published. Often called the "Admin UI".
- Teachers' UI: The frontend service where the groups can view and enrol in events.
- Headless CMS: Content Management Service that provides dynamic pages and dynamic content for the teachers' UI. It also provides content for the header and the footer. A React component library can be found at https://github.com/City-of-Helsinki/react-helsinki-headless-cms.
- LinkedEvents API: A City of Helsinki centralized API for events.
- Notification Service API: A service used by the Kultus API to send SMS messages.
- Unified Search: Provides a list of divisions.
- Helsinki Profile (Keycloak): Authorization service
- Mailer: A service used by the Kultus API to send emails.
- PowerBI: Data visualization.
- Digia Iiris: Web analytics (a Matomo service).
- Sentry: A monitoring service.
- Gruppo: Email newsletter service (https://api.createsend.com/api/v3.2/).
```mermaid
flowchart LR
    subgraph ExternalServices["Third party services"]
        direction LR
        Gruppo["Gruppo"]
        Matomo["Matomo"]
        Sentry["Sentry"]
        Mailer["Mailer"]
    end
    subgraph HelsinkiGraph["City of Helsinki services"]
        direction LR
        PowerBI["PowerBI"]
        Keycloak["Keycloak"]
        NotificationService["Notification Service"]
        LinkedEvents["Linked Events"]
        HeadlessCMS["HeadlessCMS"]
        UnifiedSearch["UnifiedSearch"]
    end
    subgraph KultusBackend["Kultus Backend"]
        direction LR
        KultusAPI["Kultus API"]
    end
    subgraph PublicFrontend["Public Frontend"]
        direction LR
        KultusUI["Teachers UI"]
    end
    subgraph AdminInterface["Admin Interface"]
        direction LR
        KultusAdminUI["Providers UI"]
    end
    subgraph KultusInternal["Kultus"]
        direction LR
        KultusBackend
        PublicFrontend
        AdminInterface
    end
    KultusAPI == Syncs events data with ==> LinkedEvents
    KultusAPI -- Authenticates via --> Keycloak
    KultusAPI -- Sends SMS via --> NotificationService
    KultusAPI -. Sends emails via .-> Mailer
    KultusAPI -- Provides data for --> PowerBI
    KultusAPI -. Reports errors to .-> Sentry
    KultusUI == Depends on ==> KultusAPI
    KultusUI == Uses content from ==> HeadlessCMS
    KultusUI -- Fetches divisions from --> UnifiedSearch
    KultusUI -. Tracks usage via .-> Matomo
    KultusUI -. Reports errors to .-> Sentry
    KultusUI -. Subscribes to newsletter via .-> Gruppo
    KultusAdminUI == Depends on ==> KultusAPI
    KultusAdminUI -- Authenticates with --> Keycloak
    KultusAdminUI == Uses content from ==> HeadlessCMS
    KultusAdminUI -. Tracks usage via .-> Matomo
    KultusAdminUI -. Reports errors to .-> Sentry
    style KultusInternal fill:#757575
```
Kultus API environments (this service):
- Production environment: https://kultus.api.hel.fi/graphql
- Staging environment: https://kultus.api.stage.hel.ninja/graphql
- Testing environment: https://kultus.api.test.hel.ninja/graphql
Teachers UI (the public UI) environments:
- Production environment: https://kultus.hel.fi/
- Staging environment: https://kultus-ui.stage.hel.ninja/
- Testing environment: https://kultus-ui.test.hel.ninja/
Providers UI (the admin client) environments:
- Production environment: https://kultus-admin.hel.fi/
- Staging environment: https://kultus-admin-ui.stage.hel.ninja/
- Testing environment: https://kultus-admin-ui.test.hel.ninja/
Headless CMS environments:
- Production environment: https://kultus.content.api.hel.fi/graphql
- Testing environment: https://kultus.app-staging.hkih.hion.dev/graphql
LinkedEvents
- Production environment: https://api.hel.fi/linkedevents/v1/
- Testing environment: https://linkedevents.api.test.hel.ninja/v1/
Notification service
- Production environment: https://kuva-notification-service.api.hel.fi/v1/
- Testing environment: https://kuva-notification-service.api.stage.hel.ninja/v1/
Unified Search
- Production environment: https://kuva-unified-search.api.hel.fi/search
- Testing environment: https://kuva-unified-search.api.stage.hel.ninja/search
This API leverages the following key frameworks and libraries:
- Django: A high-level Python Web framework that encourages rapid development and clean, pragmatic design. It provides a robust foundation for building web applications.
- Graphene-Django: A library that seamlessly integrates GraphQL with Django, allowing you to build powerful and efficient APIs.
- Django REST framework: A powerful and flexible toolkit for building Web APIs with Django.
- Django-helusers: A set of Django utilities and extensions used by the City of Helsinki, providing user management and authentication features specific to their ecosystem.
Prerequisites:
- Docker
- Docker Compose
- Python 3.12 for running pre-commit hooks (see .pre-commit-config.yaml)
- Copy `.env.example` to `.env`
- Configure settings, see Configuration
- Run `docker compose up`

The project is now running at http://localhost:8081
Prerequisites (defined by Dockerfile and compose.yaml):
- PostgreSQL 17
- Python 3.12
Steps:
- Copy `.env.example` to `.env`
- Install Python requirements, see Installing Python requirements
- Setup database, see Database
- Configure settings, see Configuration
- Run the server, see Daily running, Debugging
- Run `pip install -r requirements.txt`
- Run `pip install -r requirements-dev.txt` (development requirements)
To set up a database compatible with the default database settings:
Create user and database:

```shell
sudo -u postgres createuser -P -R -S palvelutarjotin # use password `palvelutarjotin`
sudo -u postgres createdb -O palvelutarjotin palvelutarjotin
```

Allow user to create test database:

```shell
sudo -u postgres psql -c "ALTER USER palvelutarjotin CREATEDB;"
```
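Assuming the project reads its database settings from a single connection-string variable (a common django-environ convention; check `.env.example` for the variable name this project actually uses), the user and database created above would combine into something like:

```shell
# Hypothetical example; verify the variable name against .env.example
DATABASE_URL=postgres://palvelutarjotin:palvelutarjotin@localhost:5432/palvelutarjotin
```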
- Create a `.env` file: `touch .env`, or make a copy of `.env.example`
- Set the `DEBUG` environment variable to `1`
- Run `python manage.py migrate`
- Run `python manage.py runserver localhost:8081`
- The project is now running at http://localhost:8081
- Set a value for `SECRET_KEY` in `.env`, see the Generating secret key for Django instructions.
- You must configure the Kultus API to integrate with the LinkedEvents API. Add the following lines to your local `.env` (take a look at `.env.example` to see the list of required variables):

  ```shell
  LINKED_EVENTS_API_ROOT=<your_linked_event_api_url> # e.g. http://localhost:8000/v1/
  LINKED_EVENTS_API_KEY=<your_linked_event_api_key> # value from the Api key field in the LinkedEvents data source
  LINKED_EVENTS_DATA_SOURCE=<your_linked_event_data_source> # e.g. local-kultus
  ```
- If you are not using a local LinkedEvents instance, contact the LinkedEvents team to get this information.
- Alternatively, you may find the values on Azure DevOps if you have access to Kultus, in the Kultus API testing variables:
  - LINKED_EVENTS_API_ROOT=https://linkedevents.api.test.hel.ninja/v1/
  - LINKED_EVENTS_DATA_SOURCE=kultus
  - the LINKED_EVENTS_API_KEY secret
-
If you installed LinkedEvents yourself, you can create API_KEY and DATA_SOURCE from your local LinkedEvents admin interface at http://path_to_your_linked_events/admin/events/datasource/add/
- Create a superuser:
  - If you run the Kultus API using Docker, first enter the backend container using `docker exec -it kukkuu-backend bash` and run the next command inside the container.
  - Run this command from the project root to create a superuser: `python manage.py add_admin_user -u <username> -p <password> -e <email-address>`
  - You can then use this account to log in to the Kultus API admin interface, for example at http://path_to_your_kultus_api/admin
- Create a Provider Organisation:
- At least a single organisation is required to be present in LinkedEvents and in Kultus.
- This will be used in the Provider UI, where the user can pick their organisation after login.
- If you don't have an organisation in LinkedEvents yet you should create one.
- In case you're using an existing LinkedEvents testing environment, you can just pick one of the `id` values from the organisation list, e.g. ahjo:u4804001010
- In case you've set up LinkedEvents locally and don't have an existing organisation:
- If you ran the default importer in LinkedEvents, some organisations will already exist; you can use one of them instead of creating your own, but it's recommended to create a new one
- To create a new organisation in LinkedEvents, visit: http://path_to_your_linked_events/admin/django_orghierarchy/organization/add/
- Before adding an organization, you must first create a data source as instructed earlier. You must also enable the "objects may be edited by users" option so that the data source can be selected for the organization.
- After you have an organisation in LinkedEvents, create a similar one in Kultus at http://localhost:8081/admin/organisations/organisation/add/
  - Name: <name of the organisation in LinkedEvents>, e.g. Kaupunginkirjasto
  - Phone number: Can be left empty
  - Type: Provider
  - Persons: Can be left empty
  - Publisher id: <id of the organisation in LinkedEvents>, e.g. ahjo:u4804001010
- Create/update event permissions
  - If you only want to work with the GraphQL API without using the UIs (Teacher UI and Provider UI): when running the API in debug mode, a GraphQL client is available at http://path_to_your_kultus_api/graphql where you can run your GraphQL queries and mutations. Note that executing mutations and some queries requires authentication. In that case, log in to the admin interface at the beginning of your session; after that you can use that session to run GraphQL mutations at http://path_to_your_kultus_api/graphql.
  - To be able to manage events via the Provider UI, you have to log in to the Provider UI first and create a user there. You'll have to fill in some information and select the organisation you created in step 2 from the organisation list. After that, log in to the Kultus API admin interface using the superuser account, find the new user, and assign staff permission to that user. The user can then create and edit events from the Provider UI.
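As a quick smoke test of the GraphQL endpoint, note that any GraphQL server answers a standard introspection query, so you can paste the following into the client at /graphql:

```graphql
{
  __schema {
    queryType {
      name
    }
  }
}
```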
- Configuration needed to use the Provider UI and Teacher UI locally:
  - These keyword set variables need to be configured in order to populate the dropdown boxes' data in the UI:
    - KEYWORD_SET_CATEGORY_ID
    - KEYWORD_SET_TARGET_GROUP_ID
    - KEYWORD_SET_ADDITIONAL_CRITERIA_ID
  - In case you're using an existing LinkedEvents testing environment, you can just use the existing keyword sets, i.e. kultus:categories, kultus:target_groups and kultus:additional_criteria, by setting the following variables in your `.env` file:

    ```shell
    KEYWORD_SET_CATEGORY_ID=kultus:categories
    KEYWORD_SET_TARGET_GROUP_ID=kultus:target_groups
    KEYWORD_SET_ADDITIONAL_CRITERIA_ID=kultus:additional_criteria
    ```

  - In case you've set up LinkedEvents locally and don't have existing keyword sets, you'll have to create the keyword sets in LinkedEvents, add some keywords to them, and then set the KeywordSet ids in `.env`:
    - Create three keyword sets in LinkedEvents at http://path_to_your_linked_event/admin/events/keywordset/ with the following names: Kultus Target Groups, Kultus Additional Criteria, Kultus Categories
    - Add some keywords to all of the above keyword sets. There should be some keywords already available in the system if you ran the required importers in LinkedEvents, or you can create new keywords yourself.
    - Get the IDs of those keyword sets and put them in `.env`:

      ```shell
      KEYWORD_SET_CATEGORY_ID=kultus:categories
      KEYWORD_SET_ADDITIONAL_CRITERIA_ID=kultus:additional_criteria
      KEYWORD_SET_TARGET_GROUP_ID=kultus:target_groups
      ```
- (Optional) To use the SMS notification functionality, you have to acquire the API key from the Notification Service API and then add these lines to your local `.env`:

  ```shell
  NOTIFICATION_SERVICE_API_TOKEN=your_api_key
  NOTIFICATION_SERVICE_API_URL=notification_service_end_point
  ```
- (Optional) The notification templates can be imported via
  - a) Google sheet importer
  - b) Template file importer

  The importer can be used to create and update the notification templates, or to check whether they are in sync. The importer can be used via Django management commands (in the notification_importers app) or admin site tools.

  To enable admin site tools, some configuration is needed. To enable a selected importer (`NotificationFileImporter` or `NotificationGoogleSheetImporter`):

  ```python
  NOTIFICATIONS_IMPORTER = (
      "notification_importers.notification_importer.NotificationFileImporter"
  )
  ```

  If the Google sheet importer is used, `NOTIFICATIONS_SHEET_ID` is also needed:

  ```python
  NOTIFICATIONS_SHEET_ID = "1234"
  ```

  If the file importer is used, files should be stored in the notification_importers app, in the notification_importers/templates/sms and notification_importers/templates/email folders. There is also a naming convention: the file name must follow the pattern [notification_type]-[locale].[html|j2].
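As an illustration of the naming convention, here is a small sketch that parses such filenames. The exact character sets allowed for the notification type and locale are assumptions for illustration, not rules taken from the importer's code:

```python
import re

# [notification_type]-[locale].[html|j2]; the character classes below are
# assumptions for illustration, not the importer's actual validation rules.
FILENAME_RE = re.compile(
    r"^(?P<notification_type>[a-z0-9_]+)-(?P<locale>[a-z]{2})\.(?P<ext>html|j2)$"
)

def parse_template_filename(name: str) -> tuple[str, str, str]:
    """Split a template filename into (notification_type, locale, extension)."""
    match = FILENAME_RE.match(name)
    if match is None:
        raise ValueError(f"not a valid template filename: {name!r}")
    return match.group("notification_type"), match.group("locale"), match.group("ext")

print(parse_template_filename("enrolment_approved-fi.html"))
```

The notification type name `enrolment_approved` is likewise a hypothetical example, not necessarily one of the project's real template names.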
- (Optional) To offer kindergartens, schools and colleges from the Servicemap of Helsinki, the Servicemap API needs to be configured. By default it uses the open data from https://www.hel.fi/palvelukarttaws/rest/v4/unit/ and should work out of the box.

  ```python
  env = environ.Env(
      SERVICEMAP_API_ROOT=(str, "https://www.hel.fi/palvelukarttaws/rest/v4/unit/"),
  )
  SERVICEMAP_API_CONFIG = {"ROOT": env.str("SERVICEMAP_API_ROOT")}
  ```

Django needs a value for SECRET_KEY to start.
For production, you should use a strong, long, randomly generated key.
For local development, if you prefer, you can alternatively use a shorter, manually generated key.
Here's how you can generate a value for SECRET_KEY using Python (based on Django v5.1.6's get_random_secret_key & get_random_string):

```python
import secrets, string
allowed_chars = string.ascii_lowercase + string.digits + "!@#$%^&*(-_=+)"
print("".join(secrets.choice(allowed_chars) for i in range(50)))
```

To view the API documentation, in DEBUG mode visit http://localhost:8081/graphql and check out the Documentation Explorer section.
Audit logging is implemented with django-auditlog, but it has some extended features applied with auditlog_extra -app.
The configuration that defines which models are in the scope of the audit logging can be found in auditlog_settings.py.
The GraphQL query/mutation and admin site views can be logged by using the mixins and decorators that auditlog_extra provides.
References:
- Django-auditlog
  - PyPI: https://pypi.org/project/django-auditlog/
  - GitHub: https://github.com/jazzband/django-auditlog
  - Docs: https://django-auditlog.readthedocs.io/en/latest/index.html
- Install pip-tools: `pip install pip-tools`
- Add new packages to `requirements.in` or `requirements-dev.in`
- Update the `.txt` file for the changed requirements file:
  - `pip-compile requirements.in`
  - `pip-compile requirements-dev.in`
- If you want to update dependencies to their newest versions, run: `pip-compile --upgrade requirements.in`
- To install Python requirements, run: `pip-sync requirements.txt`
This project uses Ruff for code formatting and quality checking.
Basic Ruff commands:
- lint: `ruff check`
- apply safe lint fixes: `ruff check --fix`
- check formatting: `ruff format --check`
- format: `ruff format`
pre-commit can be used to install and run all the formatting tools as git hooks automatically before a commit.
New commit messages must adhere to the Conventional Commits specification, and line length is limited to 72 characters.
When pre-commit is in use, commitlint
checks new commit messages for the correct format.
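For illustration, here is a minimal sketch of the kind of check commitlint performs on the commit header. The 72-character limit comes from the text above; the type list is an assumed common subset, and the real commitlint ruleset may differ:

```python
import re

# Conventional Commits header: type, optional (scope), optional !, then ": description".
# The accepted type list below is an assumption, not commitlint's exact configuration.
HEADER_RE = re.compile(
    r"^(build|chore|ci|docs|feat|fix|perf|refactor|revert|style|test)"
    r"(\([\w-]+\))?!?: .+$"
)

def is_valid_commit_header(header: str) -> bool:
    """Check the first line of a commit message against the convention."""
    return len(header) <= 72 and HEADER_RE.match(header) is not None

print(is_valid_commit_header("feat(api): add enrolment report sync"))  # True
print(is_valid_commit_header("Updated some stuff"))                    # False
```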
https://helsinkisolutionoffice.atlassian.net/projects/PT/issues
Enrolment report instances are for data utilization. They are provided through a JSON view used by external parties. The external parties will need credentials that have the enrolment report view permission to use the view!
Enrolment reports should maintain themselves automatically via nightly cronjobs, but sometimes some manual syncing might be needed. There are some tools for that on the enrolment reports admin page:
- The Sync unsynced enrolment reports button can be used to create all the missing enrolment reports and to sync all enrolment reports that have gone out of sync since the date of the last sync. If the date of the last sync is greater than the updated_at field's value of an instance that needs the sync, the sync must be done by selecting the instance in the admin list view and using the rehydrate sync actions.
- The Rehydrate the enrolment report instances with LinkedEvents data action can be used to sync an enrolment report instance with its related enrolment instance. This action also fetches data from the LinkedEvents API, which can lead to heavy API usage, so please use it carefully. All the selected enrolment report instances will be affected.
- The Rehydrate the enrolment report instances without LinkedEvents data action can be used to sync an enrolment report instance with its related enrolment instance without fetching any data from the LinkedEvents API. This action should be used when the sync needs no data from LinkedEvents, for example when only the enrolment status needs to be updated.
Enrolment reports can be initialized with the same management command that the cronjob runs: sync_enrolment_reports. It will create the missing enrolment reports and sync the enrolment report instances that are out of sync with the related enrolment instance. The sync_enrolment_reports command takes two optional parameters:
- --sync_from, which can be used to set the date of the updated_at -field that will be used to fetch the enrolments being handled in the sync process.
- --ignore_linkedevents, which can be used to prevent data fetching from LinkedEvents API.
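A hypothetical sketch of the option handling described above, using plain argparse (the real sync_enrolment_reports is a Django management command, not a standalone script):

```python
import argparse

# Illustrative stand-in for the sync_enrolment_reports options.
parser = argparse.ArgumentParser(prog="sync_enrolment_reports")
parser.add_argument(
    "--sync_from",
    help="updated_at date from which enrolments are picked up for syncing",
)
parser.add_argument(
    "--ignore_linkedevents",
    action="store_true",
    help="prevent data fetching from the LinkedEvents API",
)

args = parser.parse_args(["--sync_from", "2024-01-01", "--ignore_linkedevents"])
print(args.sync_from, args.ignore_linkedevents)  # 2024-01-01 True
```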
The used environments are listed in Service environments.
The application uses automatic semantic versions and is released using Release Please.
Release Please is a GitHub Action that automates releases for you. It will create a GitHub release and a GitHub Pull Request with a changelog based on conventional commits.
Each time you merge a "normal" pull request, the release-please-action will create or update a "Release PR" with the changelog and the version bump related to the changes (they're named like release-please--branches--master--components--palvelutarjotin).
To create a new release for an app, this release PR is merged, which creates a new release with release notes and a new tag. This tag will be picked by Azure pipeline and trigger a new deployment to staging. From there, the release needs to be manually released to production.
When merging release PRs, make sure to use the "Rebase and merge" (or "Squash and merge") option, so that GitHub doesn't create a merge commit. All the commits must follow the conventional commits format. This is important because the release-please-action does not work correctly with merge commits (there's an open issue you can track: Chronological commit sorting means that merged PRs can be ignored).
See Release Please Implementation Design for more details.
And all docs are available here: release-please docs.
Use Conventional Commits to ensure that the changelogs are generated correctly.
Release Please goes through commits and tries to find "releasable units" using commit messages as guidance. It then adds these units to their respective release PRs and figures out the version number from the types: `fix` for patch, `feat` for minor, `feat!` for major. None of the other types will be included in the changelog, so you can use for example `chore` or `refactor` for work that does not need to be included in the changelog and won't bump the version.
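The bump rules above can be sketched as follows (illustrative only; release-please's real logic also handles things like pre-1.0 versions and BREAKING CHANGE footers differently):

```python
def next_version(current: str, commit_types: list[str]) -> str:
    """Compute the next semantic version from conventional commit types.

    Simplified sketch: `!` on any type means a major bump, `feat` means
    minor, `fix` means patch; other types do not bump the version.
    """
    major, minor, patch = (int(part) for part in current.split("."))
    if any(t.endswith("!") for t in commit_types):
        return f"{major + 1}.0.0"
    if "feat" in commit_types:
        return f"{major}.{minor + 1}.0"
    if "fix" in commit_types:
        return f"{major}.{minor}.{patch + 1}"
    return current

print(next_version("1.4.2", ["fix", "chore"]))  # 1.4.3
print(next_version("1.4.2", ["feat", "fix"]))   # 1.5.0
print(next_version("1.4.2", ["feat!"]))         # 2.0.0
```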
The release-please workflow is located in the release-please.yml file.
The configuration for release-please is located in the release-please-config.json file. See all the options here: release-please docs.
The manifest file is located in the release-please-manifest.json file.
When adding a new app, add it to both the release-please-config.json and release-please-manifest.json file with the current version of the app. After this, release-please will keep track of versions with release-please-manifest.json.
If you were expecting a new release PR to be created or an old one to be updated, but nothing happened, probably one of the older release PRs is in a pending state, or the action didn't run.
- Check if the release action ran for the last merge to main. If it didn't, run the action manually with a label.
- Check if there's any open release PR. If there is, the work is now included on this one (this is the normal scenario).
- If you do not see any open release PR related to the work, check if any of the closed PRs are labeled with `autorelease: pending`, i.e. someone might have closed a release PR manually. Change the closed PR's label to `autorelease: tagged`, then re-run the last merge workflow to trigger the release action; a new release PR should now appear.
- Finally, check the output of the release action. Sometimes the bot can't parse the commit message, and there is a notification about this in the action log. If this happens, the work won't be included in the changelog either. You can fix this by changing the commit message to follow the Conventional Commits format and rerunning the action.
Important! If you have closed a release PR manually, you need to change the label of closed release PR to autorelease: tagged. Otherwise, the release action will not create a new release PR.
Important! An extra label will force release-please to regenerate the PRs. This is done when the action is run manually with the prlabel option.
Sometimes there might be a merge conflict in the release PR; this should resolve itself on the next push to main. It is possible to run the release-please action manually with a label, which should recreate the PRs. You can also resolve it manually by updating the release-please-manifest.json file.
- Open release-please github action
- Click Run workflow
- Check that Branch is master
- Leave the label field empty; a new label is not needed to fix merge issues
- Click the Run workflow button
There's also a CLI for debugging and manually running releases available for release-please: release-please-cli
When a Release-Please pull request is merged and a version tag is created (or a proper tag name for a commit is manually created), this tag will be picked by Azure pipeline, which then triggers a new deployment to staging. From there, the deployment needs to be manually approved to allow it to proceed to the production environment.
The tag name is defined in the azure-pipelines-release.yml.