Welcome to the Divekit documentation. Choose a section to get started.
Divekit Documentation
- 1: Quick Start
- 1.1: Hello there 👋
- 1.2: Installation
- 1.3: 🚧 Individualization
- 1.4: Distribution
- 1.5: Glossary
- 2: Divekit CLI
- 2.1: divekit install
- 2.2: divekit init
- 2.3: divekit doctor
- 2.4: divekit distribute
- 2.5: divekit patch
- 2.6: divekit overview
- 2.7: divekit config
- 2.8: divekit update
- 3: Development
- 3.1: Architecture
- 3.1.1: Overview
- 3.1.2: Components
- 3.1.3: Configuration
- 3.2: Contributing
- 3.2.1: Development Setup
- 3.2.2: Error Handling
- 3.2.3: Contributing Guidelines
- 3.3: Work in Progress
- 3.3.1: Notes
- 3.3.2: Config Redesign
- 3.3.3: Deployment
- 3.4:
- 3.4.1: Go Testing Guide
- 3.4.2: Testrepo
- 4: Archive
- 4.1: Access Manager
- 4.2: Access Manager 2.0
- 4.3: Automated Repo Setup
- 4.4: Divekit Language Plugin
- 4.5: Divekit Language Server
- 4.6: Evaluation Processor
- 4.7: Operator
- 4.8: Passchecker
- 4.9: Plagiarism Detector
- 4.10: Repo Editor
- 4.11: Report Mapper
- 4.12: Report Visualizer
- 4.13: Test Library
- 4.14: Test page generator
1 - Quick Start
Divekit is a command-line tool for managing individualized programming assignments at scale. It helps educators create, distribute and evaluate programming exercises for large groups of students.
Key Features
- Assignment Individualization: Generate unique variations of programming assignments for each student
- Automated Distribution: Create and manage GitLab repositories for students automatically
- Test Integration: Built-in support for automated testing and evaluation
- Bulk Operations: Efficiently manage assignments for large classes
- Access Control: Manage repository access rights and permissions
Benefits
- Prevent Plagiarism: Each student receives a slightly different version of the assignment
- Save Time: Automate repetitive tasks like repository setup and access management
- Ensure Fairness: Standardized testing and evaluation across all variations
- Scale Easily: Handle large classes with minimal additional effort
Use Cases
Divekit is primarily used in educational settings where:
- Programming assignments need to be distributed to many students
- Each student should receive an individualized version
- Automated testing and evaluation is desired
- Manual administrative overhead should be minimized
The tool consolidates functionality that was previously spread across multiple separate tools into a single, easy-to-use CLI application.
1.1 - Hello there 👋
What is Divekit?
Divekit is a command-line tool for managing individualized programming assignments at scale. It helps educators create, distribute, and evaluate programming exercises for large groups of students.
Who needs Divekit?
Divekit is designed for different stakeholders in programming education:
Course Instructors
Primary users who:
- Create and manage programming assignments
- Need to distribute exercises to large groups of students
- Want to prevent plagiarism through individualization
- Need efficient ways to evaluate student submissions
Benefits
- Automated repository management
- Built-in individualization system
- Integrated testing capabilities
- Bulk operations for large classes
Teaching Assistants
Support staff who:
- Help manage student repositories
- Assist with assignment evaluation
- Provide technical support to students
Benefits
- Standardized repository structure
- Automated access management
- Consistent testing environment
- Clear overview of student progress
Students
End users who:
- Receive individualized assignments
- Submit their solutions through Git
- Get automated feedback on their work
Benefits
- Personal GitLab repositories
- Immediate feedback through automated tests
- Clear assignment structure
- Consistent submission process
1.2 - Installation
Prerequisites
Before installing Divekit, ensure your system meets the following requirements:
System Requirements
- Operating System: Linux, macOS, or Windows
- GitLab: Access to a GitLab instance with API permissions
GitLab Setup
- Access to a GitLab instance
- Admin rights to create groups and repositories
- Personal Access Token with `api` scope
Creating a Personal Access Token
- Navigate to your GitLab profile settings
- Go to “Access Tokens”
- Create a new token with required scopes
- Save the token securely - you’ll need it during installation
Storage Requirements
- Minimum 1GB free disk space
- Additional space for repositories (varies by project size)
Network Requirements
- Stable internet connection
- Access to GitLab API endpoints
- No blocking firewalls for HTTP requests
Optional Requirements
- Docker: For running tests in containers
- Maven/Gradle: For Java project support
- IDE: Any Git-compatible IDE for development
Installation
Download the latest release:
- Navigate to Divekit CLI Releases
- Download a built version for your operating system
Install Divekit:
Navigate to the download directory and run the installation script:
./divekit.exe install # On Windows
./divekit install # On Linux/macOS
Environment Setup
After installation, you need to configure your GitLab token in your system’s environment variables. There are two methods available:
Recommended Method: System Environment Variables
Windows
- Open System Properties > Advanced > Environment Variables
- Add a new User Variable:
  - Variable name: `GITLAB_TOKEN`
  - Variable value: `<YOUR_TOKEN>`
- Restart your terminal
Linux
Add to your `~/.bashrc` or `~/.zshrc`:
export GITLAB_TOKEN="<YOUR_TOKEN>"
Then run:
source ~/.bashrc # or source ~/.zshrc
macOS
Add to your `~/.zshrc`:
export GITLAB_TOKEN="<YOUR_TOKEN>"
Then run:
source ~/.zshrc
Alternative Method (Not Recommended)
You can store the token in a `.env` file in the Divekit configuration directory:
# In ~/.divekit/.env
GITLAB_TOKEN=<YOUR_TOKEN>
⚠️ Warning: This method is not recommended as the token is stored unencrypted and poses a potential security risk. Prefer using system environment variables instead.
To improve security when using this method:
# Restrict file permissions (Linux/macOS)
chmod 600 ~/.divekit/.env
This ensures that only the file owner (you) can read and write the file.
Verify Installation
Run the doctor command to verify your setup:
divekit doctor
This will check if:
- Divekit is properly installed
- Required environment variables are set
- System requirements are met
Troubleshooting
If you encounter any issues:
- Check if the GitLab token is correctly set in your environment variables
- Run `divekit doctor` for detailed diagnostics
- 🚧 Check the logs in `~/.divekit/logs`
- Ensure you have the correct permissions for the installation directory
First Steps After Installation
Create Your First Assignment
- Create and navigate to a new directory:
mkdir my-assignment
cd my-assignment
- Initialize a new Divekit project:
divekit init
Assignment Content Creation
- Add your assignment files to the repository
- Mark solution parts in your code:
public class Example {
    public int solve() {
        //unsup
        return actualSolution;
        //unsup
    }
}
- Add variables for individualization:
public class $EntityClass$ {
// ...
}
Assignment Distribution
- Verify your setup:
divekit doctor
- Distribute the repositories:
divekit distribute
Next Steps
For more detailed information, please refer to:
- Configuration options in the Configuration section
- Detailed system requirements in the Prerequisites section
1.3 - 🚧 Individualization
Overview
Divekit allows you to create individualized programming assignments for each student. This is done by defining variables that are populated with random values during the repository generation.
Variable Types
There are three types of variables:
1. Object Variables
Object variables are used to randomize entities and value objects. They are defined in the configuration file `{ORIGIN_REPO}/.divekit/variables/variations.json`:
{
"ids": "Vehicle",
"objectVariations": [
{
"id": "Car",
"Class": "Car",
"RepoClass": "CarRepository",
"SetToOne": "setCar",
"SetToMany": "setCars"
},
{
"id": "Truck",
"Class": "Truck",
"RepoClass": "TruckRepository",
"SetToOne": "setTruck",
"SetToMany": "setTrucks"
}
],
"variableExtensions": ["Getter"]
}
2. Relation Variables
Relation variables define relationships between entities. They are defined by two components:
- Relationship types in `{ORIGIN_REPO}/.divekit/variables/relations.json`:
{
"id": "OneToOne",
"Umlet": "lt=-\nm1=1\nm2=1",
"Short": "1 - 1",
"Description": "one to one"
}
- Concrete relationships in the variations.json:
{
"relationShips": [
{
"id": "Rel1",
"relationType": "OneToOne"
}
],
"relationObjects": [
{
"id": "RelVehicleWheel",
"Obj1": "Vehicle",
"Obj2": "Wheel"
}
]
}
3. Logic Variables
Logic variables allow the definition of different business logic variants. Files can be suffixed with `_LogicId` and will only be included if this logic variant is selected:
{
"id": "VehicleLogic",
"logicVariations": [
{
"id": "VehicleCrash",
"Description": "Implementieren Sie die Crash-Logik..."
},
{
"id": "VehicleShop",
"Description": "Implementieren Sie die Shop-Logik..."
}
]
}
Using Variables
Variables can be referenced in files with a configurable delimiter (default: `$`):
public class $VehicleClass$ {
// ...
}
For each variable, three variants are automatically generated:
- Original: `VehicleClass` -> `MonsterTruck`
- First letter lowercase: `vehicleClass` -> `monsterTruck`
- All lowercase: `vehicleclass` -> `monstertruck`
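For illustration, the derivation of these three variants can be sketched in Go. This is not Divekit's actual implementation; the function name is made up for this example.

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// variantsOf returns the three automatically generated forms of a
// variable value: original, first-letter-lowercase, and all-lowercase.
func variantsOf(value string) [3]string {
	lowerFirst := value
	if r := []rune(value); len(r) > 0 {
		r[0] = unicode.ToLower(r[0])
		lowerFirst = string(r)
	}
	return [3]string{value, lowerFirst, strings.ToLower(value)}
}

func main() {
	fmt.Println(variantsOf("MonsterTruck")) // prints [MonsterTruck monsterTruck monstertruck]
}
```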
Persistence
The generated individual variables are stored in the distribution under `{ORIGIN_REPO}/.divekit/{DISTRIBUTION_NAME}/individual_repositories.json` and can be reused if needed.
Quality Assurance
Divekit can issue warnings if suspicious values remain after variable replacement (e.g., “Car” in a Truck variant). This helps to identify accidentally unmodified variables.
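A minimal sketch of such a check, assuming the warning simply scans generated content for values belonging to variations that were not selected. The names and the approach are assumptions for illustration, not Divekit's actual code.

```go
package main

import (
	"fmt"
	"strings"
)

// findLeftovers reports which values from non-selected variations still
// appear in the generated content.
func findLeftovers(content string, unselected []string) []string {
	var hits []string
	for _, v := range unselected {
		if strings.Contains(content, v) {
			hits = append(hits, v)
		}
	}
	return hits
}

func main() {
	// The "Truck" variant was selected, so "Car"/"CarRepository" should not remain.
	content := "public class TruckRepository { Car car; }"
	fmt.Println(findLeftovers(content, []string{"Car", "CarRepository"})) // prints [Car]
}
```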
1.4 - Distribution
Overview
Divekit can distribute your assignment to multiple repositories on GitLab, creating individualized versions for each student or group. This process includes:
- Creating code repositories for each student/group
- Optionally creating test repositories
- Assigning the correct permissions to students
- Individualizing the content based on your configuration
Distribution Guide
Simply use the distribute command to start the distribution process:
divekit distribute
The command will:
- Ask you to select a distribution if multiple are configured
- Check if all configured members exist on GitLab
- Show you a summary of what will be created
- Create the repositories after your confirmation
Example Flow
$ divekit distribute
? Found several distributions. Please choose one:
[ ] local
[x] supervisor
[ ] student
Checking members:
[✓] 2 users available
[X] 3 users not found:
- ada
- charles
- jobs
Would create 2 repositories with name "ST2-2024-{uuid}" and assign 2 members.
? Continue? [Y/n]: y
Creating main repositories at #234567:
[██████████████████████████████████████████████████] 100% (2/2)
Creating test repositories at #345678:
[██████████████████████████████████████████████████] 100% (2/2)
Assigning members:
[█████████████████████████                         ] 50% (1/2)
What Happens During Distribution?
- Divekit creates a new repository for each student/group
- If configured, test repositories are created separately
- Repository contents are individualized based on your configuration
- Students are assigned with appropriate permissions
- Each repository gets a unique identifier
Next Steps
- Learn more about configuration options
- Understand how to individualize assignments
- Check the CLI commands reference for advanced options
1.5 - Glossary
This glossary provides definitions for terms used throughout the Divekit documentation.
Term | Definition |
---|---|
Divekit | A command-line toolkit for creating and managing individualized programming assignments at scale. |
CLI | Command Line Interface |
Origin Repository | The repository containing the master version of an assignment, from which the Distributed Repositories are generated. |
Group | A collection of 1..* students. |
Student | A member of a group. |
Instructor | A member of a group with additional privileges. |
Distributed Repository | A repository on GitLab that is assigned to a group. |
Distribution | A set of Distributed Repositories defined in the Origin Repository. |
Divekit Instance | The whole structure of one Divekit Origin: the Distributed Repositories from all Distributions. |
2 - Divekit CLI
2.1 - divekit install
Installs the divekit CLI and the required modules.
Example
$ divekit install
Installing divekit in home directory...
[✓] divekit installed in home directory
[✓] divekit executable added to PATH
2.2 - divekit init
[!WARNING] Not yet implemented
Initializes a new Divekit origin repository by creating the necessary configuration files.
(`npm init` and `git init` provide the expected functionality)
Example
$ divekit init
This utility will walk you through creating the
necessary configuration files to turn this current
folder into an `origin` Divekit repository.
It only covers the most common items, and tries to
guess sensible defaults.
Press ^C at any time to quit.
? Repository name: ST2-{{now "2006"}}-{{uuid}}
? Distribution [milestone]: test
? Use default structure [Y/n]: n
? Repository target group id: 234567
? Repository test group id: 345678
? Repository members csv [members.csv]: ./../members.csv
Repository configuration created at `./.divekit_norepo/distribution/test/`
$ ls -a .divekit_norepo/distribution/test/
repositoryConfig.json
2.3 - divekit doctor
[!WARNING] Was originally called `divekit init` and should be renamed, because other functionality might be expected due to `git init` and `npm init`.
`divekit doctor` is an in-depth diagnosis of the entire environment. It provides a comprehensive analysis and detailed information about potential issues.
It is a kind of “first-aid tool” that offers concrete solutions or even automatically fixes problems.
Flutter uses `flutter doctor` for similar functionality, so `divekit doctor` could be appropriate.
[!NOTE] Works, only token verification is still missing.
divekit doctor (some checks failed)
$ divekit doctor
System:
[✓] git available
[✓] npm available
[✓] Modules installed
  • 'Automated Repo Setup' available
  • 'Repo Editor' available
[X] Token not found
  • Please provide a GitLab API token via `divekit doctor --glpat <YOUR_TOKEN>`
  • You can create a new token at https://gitlab.git.nrw/-/user_settings/personal_access_tokens?name=git.nrw/divekit&scopes=api
Origin:
[✓] Config available and valid
[✓] No orphan variables found
[✓] No hardcoded variations found
Distribution:
• supervisor:
  [✓] All remotes are reachable
  [✓] All NoChange files are equal to local files
• students:
  [!] No remotes found
  • Run `divekit distribute --distribution students` to distribute the repositories
divekit doctor (all checks passed)
$ divekit doctor
System:
[✓] git available
[✓] npm available
[✓] Modules installed
  • 'Automated Repo Setup' available
  • 'Repo Editor' available
[✓] Token is valid and has the necessary permissions
Origin:
[✓] Config available and valid
[✓] No orphan variables found
[✓] No hardcoded variations found
Distribution:
• supervisor:
  [✓] All remotes are reachable
  [✓] All NoChange files are equal to local files
• students:
  [✓] All remotes are reachable
  [✓] All NoChange files are equal to local files
divekit doctor list
List all available checks with short explanations:
$ divekit doctor list
You can call single checks or check groups by calling
`$ divekit doctor check <comma-separated-dot-notated-paths>`
Example:
`$ divekit doctor check system.token
[✓] Token is valid and has the necessary permissions
`
system - checks all children
git - checks if `git` is accessible
npm - checks if `npm` is accessible
modules - checks if module dependencies are accessible
token - checks if the token is accessible and valid
origin - checks all children
config - checks if the origin config is valid
orphan_variables - checks if orphan variable names were found
hardcoded_variations - checks if hardcoded variations were found
distribution - checks all children
<distribution_name> - checks all children
remotes_reachable - checks if all configured remotes are reachable
no_change_files - checks if files not to be changed were changed
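A sketch of how such dot-notated selection could be resolved against a flat check registry. The registry contents and function names here are hypothetical; the real CLI would build the registry from its registered checks.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// checks maps dot-notated paths to check descriptions (hypothetical data).
var checks = map[string]string{
	"system.git":    "checks if `git` is accessible",
	"system.token":  "checks if the token is accessible and valid",
	"origin.config": "checks if the origin config is valid",
}

// resolve expands a comma-separated list of paths: an exact path selects
// one check, while a prefix like "system" selects all of its children.
func resolve(arg string) []string {
	var selected []string
	for _, p := range strings.Split(arg, ",") {
		for path := range checks {
			if path == p || strings.HasPrefix(path, p+".") {
				selected = append(selected, path)
			}
		}
	}
	sort.Strings(selected)
	return selected
}

func main() {
	fmt.Println(resolve("system.token,origin.config")) // prints [origin.config system.token]
	fmt.Println(resolve("system"))                     // prints [system.git system.token]
}
```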
divekit doctor check
execute specific checks:
$ divekit doctor check system.token
[✓] Token is valid and has the necessary permissions
$ divekit doctor check origin.config
[✓] Config is available and valid
$ divekit doctor check system.token,origin.config
System:
[✓] Token is valid and has the necessary permissions
Origin:
[✓] Config is available and valid
2.4 - divekit distribute
[!WARNING] Was originally called `divekit setup` and was renamed, because "setup" sounds like local preparation and not like distribution across multiple repositories.
Creates multiple repositories on GitLab based on the configurations in `repositoryConfig.json`.
[!NOTE] Only partially functional - style still different
- Only creates repos with members
- no test repos
- no overview
- Members are assigned directly
- Members are checked, but the result is simply ignored
Example Flow
$ divekit distribute
? Found several distributions. Please choose one:
[ ] local
[x] supervisor
[ ] student
Checking members:
[✓] 2 users available
[X] 3 users not found:
- ada
- charles
- jobs
Would create 2 repositories with name "ST2-2024-{uuid}" and assign 2 members.
? Continue? [Y/n]: y
Creating main repositories at #234567:
[██████████████████████████████████████████████████] 100% (2/2)
Creating test repositories at #345678:
[██████████████████████████████████████████████████] 100% (2/2)
Assigning members:
[█████████████████████████                         ] 50% (1/2)
2.5 - divekit patch
Patch one or several files in all the repos of a certain distribution of the origin repo.
Usage:
divekit patch [flags] [files...]
Flags:
-d, --distribution string name of the repo-distribution to patch
-h, --help help for patch
e.g.:
$ divekit patch --distribution "supervisor" E2WhateverTests.java pom.xml
Example Flow (first draft)
$ divekit patch --distribution "supervisor" E2WhateverTests.java pom.xml
? Please type your commit message [Patch applied on 2024-10-04 08:42]: make some tests optional
Following repositories will be updated:
[✓] (215x) supervisor::ST2-2024-{uuid}
[✓] (215x) supervisor::ST2-2024-{uuid}-test
? Continue? [Y/n]: y
Updating repositories:
[█████████████████████                             ] 42% (90/215)
$ divekit patch E2WhateverTests.java pom.xml
? Found several distributions. Please choose one:
[x] local
[ ] supervisor
[ ] student
Following repositories will be updated:
[✓] (215x) local::ST2-2024-{uuid}
[✓] (215x) local::ST2-2024-{uuid}-test
? Continue? [Y/n]: y
Updating repositories:
[█████████████████████                             ] 42% (90/215)
2.6 - divekit overview
$ divekit overview --distribution students
Using:
- {DIVEKIT_HOME}/members/divekit-members-8125814e-01da-42dd-8be3-29df5dcd760e.json
Serving on http://localhost:8080
Opening browser...
Depending on how the current overview file is used, the HTML could be created dynamically, or it could be stored as HTML and/or Markdown in the origin repo, next to the files that store the members.
2.7 - divekit config
$ divekit config
Usage:
divekit config <command> <flags>
Available Commands:
list List configuration values
get Get configuration values
set Set configuration values
unset Unset configuration values
Flags:
-h, --help help for config
-g, --global use global configuration
Examples
Listing all configuration values
divekit config list
You can get single values by calling
`$ divekit config get <dot-notated-path>`
Example:
`$ divekit config get origin.remotes.name
STM2-{{uuid}}
`
origin - the origin configuration
version - the version of the origin configuration
remotes - information about the distributed remotes
name - the name of the remote with template variables
groupIds - the group IDs of the remote with template variables
main - the main group ID
test - the test group ID
membersPath - the path to the members file
Importing members from a CSV file
divekit config set origin.membersPath --import /path/to/members.csv
This command imports the members from the specified CSV file. The CSV should contain usernames (e.g., campusIDs) in the first column.
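A minimal sketch of the first-column extraction, under the assumption that the importer skips a header row. The function name and details are hypothetical, not the CLI's actual implementation.

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strings"
)

// readUsernames extracts the first column of a members CSV,
// skipping the header row.
func readUsernames(csvData string) ([]string, error) {
	records, err := csv.NewReader(strings.NewReader(csvData)).ReadAll()
	if err != nil {
		return nil, err
	}
	var users []string
	for i, rec := range records {
		if i == 0 || len(rec) == 0 { // skip header and empty rows
			continue
		}
		users = append(users, rec[0])
	}
	return users, nil
}

func main() {
	users, _ := readUsernames("username\nada\ncharles\n")
	fmt.Println(users) // prints [ada charles]
}
```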
Import latecomers from a CSV file
divekit config set origin.membersPath --import /path/to/members.csv
Import and overwrite existing members
divekit config set origin.membersPath --import /path/to/members.csv --replace
Adding individual members
divekit config set origin.membersPath --add john.doe,jane.smith
This command adds two new members (John Doe and Jane Smith) to the existing list of members.
Adding a group of members
divekit config set origin.membersPath --add-groups team-a:alice.jones,bob.wilson
This command adds a new group named “team-a” with two members (Alice Jones and Bob Wilson).
Distributing repositories for latecomers
divekit distribute
This command distributes repositories for members who were added after the initial distribution.
(`--sync`, `--refresh`, or just no flags if we know which ones are missing)
2.8 - divekit update
Checks for a newer version and starts an update.
Example Process
$ divekit update -y
Checking for updates...
Current version: 0.0.1
Latest version: 0.0.2
Downloading update...
Applying update...
Update applied successfully
New version: 0.0.2
$ divekit update -y
Checking for updates...
Current version: 0.0.2
Latest version: 0.0.2
Already up to date
3 - Development
Resources for developers who want to contribute to Divekit.
3.1 - Architecture
This section covers Divekit’s technical architecture:
The architecture documentation helps developers understand how Divekit works internally.
Components
- Core Components: Detailed documentation of core components and their interactions
3.1.1 - Overview
Divekit is a tool that helps instructors to create and distribute repositories to students.
High-Level Overview
graph TB
    INST((Instructors))
    ORIGIN[Origin Repository]
    CLI[Divekit CLI]
    DIST[Distribution]
    REPOSTUDENT[Student Repositories]
    REPOTEST[Test Repositories]
    STUDENTS((Students))
    TPAGE[Test Pages]

    INST -->|Develop| ORIGIN
    INST -->|Use| CLI
    ORIGIN -->|Input| CLI
    CLI -->|Generate| DIST
    DIST --- REPOTEST
    DIST --- REPOSTUDENT
    STUDENTS -->|Work on| REPOSTUDENT
    TPAGE -->|Get feedback| STUDENTS
    REPOSTUDENT --->|Update| REPOTEST
    REPOTEST --->|Update| TPAGE

    style CLI fill:#42b050,stroke:#333
    style ORIGIN fill:#fcf,stroke:#333
    style DIST fill:#a3e87e,stroke:#333
    style INST fill:#ff9,stroke:#333
    style STUDENTS fill:#ff9,stroke:#333
    style REPOSTUDENT fill:#6fc5ff,stroke:#333
    style REPOTEST fill:#6fc5ff,stroke:#333
Component Details
Divekit CLI
The CLI serves as the central interface for instructors. It controls the entire process of task distribution and management. All necessary commands for creating, distributing, and managing repositories are executed through the CLI.
Origin Repository
The Origin Repository contains the initial version of assignments and tests. It serves as a master template from which individualized versions for students are generated. This is where the original assignments, code scaffolds, and test cases are maintained.
Distribution
A Distribution is the result of the distribution process and consists of two main components:
Student Repositories
Individualized repositories for each student or group, containing:
- Personalized assignments
- Adapted code scaffolds
- Specific resources
Test Repositories
Separate repositories containing test cases and evaluation criteria:
- Automated tests
- Assessment metrics
- Feedback mechanisms
Test Page
A page where students can get feedback on their work.
Students
Students are the users who are working on the repositories. They can be individuals or groups.
Instructor
Instructor is the user who is creating the repositories and distributing them to the students.
3.1.2 - Components
This document describes the core components of Divekit and how they interact.
Components Overview
graph TB
    subgraph interfaces
        CLI[CLI Interface]
        WebUI[Web Interface]
    end
    style WebUI stroke-dasharray: 5 5

    subgraph core[Modules]
        ModuleEntry(( ))
        style ModuleEntry fill:none,stroke:none
        Config[Configuration Manager]
        GitAdapter[GitLab Adapter]
        Indiv[Individualization]
        Pass[Passchecker]
        Plag[Plagiarism Checker]
        User[Usermanagement]
    end

    CLI --> ModuleEntry
    WebUI -.-> ModuleEntry
    Pass --> GitAdapter
    Plag --> GitAdapter
    User --> GitAdapter
    GitAdapter --> GitLab[GitLab API]
Interfaces
- CLI Interface: Central command-line interface for all user interactions
- Web Interface (planned): Alternative user interface that uses the same modules as the CLI
Modules
- Configuration Manager: Manages all configuration files and user settings
- GitLab Adapter: Central component for all GitLab interactions
- 🚧 Individualization: Handles the individualization of tasks
- 🚧 Passchecker: Checks submissions and communicates with GitLab
- 🚧 Plagiarism Checker: Detects possible plagiarism and interacts with GitLab
- 🚧 Usermanagement: Manages users and their permissions through GitLab
3.1.3 - Configuration
Divekit uses a hierarchical configuration system with both global and project-specific settings.
Configuration Levels
Divekit uses a multi-level configuration system based on the frequency of changes:
[0] Installation
Configurations that are set once during Divekit installation and rarely changed afterwards. These contain global defaults and environment settings.
~
└── .divekit/
    ├── .env                         # Environment variables
    ├── hosts.json                   # Hosts configuration
    ├── members                      # Members configuration
    │   ├── 2025-01-21_12-28-15_pear_members.json
    │   ├── 2025-01-27_12-29-00_raspberry_members.json
    │   └── 2025-01-27_12-40-02_sandwich_members.json
    ├── origin.json                  # Origin configuration
    └── variation                    # Variation configuration (not finalized)
        ├── relations.json           # Relations configuration
        ├── variableExtensions.json  # Variable extensions configuration
        └── variations.json          # Variations configuration
Environment Configuration
`~/.divekit/.env`:
API_TOKEN=YOUR_ACCESS_TOKEN
DEFAULT_BRANCH=main
Remotes
Default:
`~/.divekit/hosts.json`:
{
"version": "1.0",
"hosts": {
"default": {
"host": "https://gitlab.git.nrw/",
"token": "DIVEKIT_API_TOKEN"
}
}
}
Example:
`~/.divekit/hosts.json`:
{
"version": "1.0",
"hosts": {
"default": {
"host": "https://gitlab.git.nrw/",
"tokenAt": "DIVEKIT_API_TOKEN"
},
"archilab": {
"host": "https://gitlab.archi-lab.io/",
"tokenAt": "DIVEKIT_API_TOKEN_ARCHILAB"
},
"gitlab": {
"host": "https://gitlab.com/",
"tokenAt": "DIVEKIT_API_TOKEN_GITLABCOM"
}
}
}
[1] Semester
Configurations that are typically set at the beginning of each semester. These define course-wide settings and distribution templates.
{ORIGIN_DIR}
└── .divekit/                  # Project configuration
    └── distributions/
        ├── ST1-M1/            # Sandbox environment config
        │   └── config.json    # Distribution settings
        └── ST1-M2/            # Student environment config
            └── config.json    # Distribution settings
Distribution Configuration (Example)
`{ORIGIN}/.divekit/distributions/<distribution>/config.json`:
{
"version": "2.0",
"targets": {
"default": {
"remote": "default", // optional
"groupId": 12345, // optional (if set in global config)
"name": "ST1-M1-{{uuid}}",
"members": {
"path": "$DIVEKIT_MEMBERS/2025-01-25_13-37_ST1-M1_members.json",
"rights": "reporter"
}
},
"test": {
"remote": "gitlab",
"groupId": 67890, // optional (if set in global config)
"name": "ST1-M1-{{uuid}}_test",
"members": {
"path": "$DIVEKIT_MEMBERS/2025-01-25_13-37_ST1-M1_members.json",
"rights": null
}
}
}
}
[2] Milestone
Configurations that change with each milestone or assignment. These include specific repository settings and member assignments.
{ORIGIN_DIR}
└── .divekit/
    └── distributions/
        └── <distribution>/    # e.g. ST1-M1
            └── config.json    # Milestone-specific settings
Members Configuration
`members.csv`:
username
tbuck
ada
charles
jobs
woz
generates:
`~/.divekit/members/2025-01-25_13-37_ST1-M1_members.json`:
{
"version": "2.0",
"groups": [ // ? rename to "members"?
{
"uuid": "4a28af44-f2cd-4a9e-a93f-2f4c29d6dfc0",
"members": [ // ? rename to "group"?
"torben.buck"
]
},
{
"uuid": "3dc6bbc1-a4eb-44fd-80fc-230bea317bc1",
"members": [
"ada"
]
},
{
"uuid": "1fe6aa82-e04b-435f-8023-10104341825d",
"members": [
"charles"
]
},
{
"uuid": "eb64c6af-67da-4f55-ae3a-d4b2a02baae6",
"members": [
"jobs"
]
},
{
"uuid": "ade17515-bdb9-4398-90c1-cfc078f5ec36",
"members": [
"woz"
]
}
]
}
[3] 🚧 Call
Configurations that can be overridden during command execution. Any configuration value from the previous levels can be overridden using command-line arguments.
Examples:
# Specify individual files for patching
divekit patch --distribution="sandbox" src/main/java/Exercise.java src/test/java/ExerciseTest.java
# set debug loglevel
divekit patch --loglevel=debug
3.2 - Contributing
Learn how to contribute to the Divekit project.
3.2.1 - Development Setup
This guide will help you set up your development environment for contributing to Divekit.
Prerequisites
- Command Line access
- Internet connection
- Go 1.23 or higher
- GitLab
  - Access Token
  - Group IDs
- (Git)
- (npm)
Setting Up the Development Environment
- Clone the repository:
git clone https://gitlab.git.nrw/divekit/tools/divekit-cli.git
- Navigate to the project directory:
cd divekit-cli
- Install the required dependencies:
go mod download
Install local modules (later possibly optional, but a huge help during development):
mkdir pkg
cd pkg
git clone https://gitlab.git.nrw/divekit/modules/gitlab-adapter
git clone https://gitlab.git.nrw/divekit/modules/config-management
cd ..
go work init
go work use ./pkg/gitlab-adapter
go work use ./pkg/config-management
- Build the CLI:
chmod +x build.sh
./build.sh
Then answer the questions or just press Enter for the default values (windows, amd64).
This will create a `divekit` executable in the `bin` directory. You can run this executable from the command line to use the CLI, or run `install` on it to install it globally.
For Example:
./bin/divekit_windows_amd64.exe install
This will install the `divekit` command globally on your system. You can now run `divekit` from any directory.
- Run the CLI:
./bin/divekit_windows_amd64.exe
# or
divekit
…or if you want to execute directly from the source code:
go run cmd/divekit/main.go
- Run the tests:
go test ./...
- Make your changes and submit a merge request.
3.2.2 - Error Handling
The project implements a structured error handling system that distinguishes between critical and non-critical errors. This pattern is currently implemented in the `distribute` package and can serve as a template for other packages.
Error Pattern
Each package can define its own error types and handling behavior. The pattern consists of:
- A custom error type that implements the `error` interface
- Specific error types as constants
- Methods to determine error severity and behavior
Example from the distribute package:
// Custom error type
type CustomError struct {
ErrorType ErrorType
Message string
Err error
}
// Error types
type ErrorType int

const (
	// Critical errors that lead to termination
	ErrConfigLoad ErrorType = iota // Configuration loading errors
	ErrWorkingDir                  // Working directory access errors
	// Non-critical errors that trigger warnings
	ErrMembersNotFound // Member lookup failures
)
Example Implementation
Here’s how to implement this pattern in your package:
// Create a new error
if err := loadConfig(); err != nil {
return NewCustomError(ErrConfigLoad, "failed to load configuration", err)
}
// Handle non-critical errors
if err := validateData(); err != nil {
	var cerr *CustomError
	if errors.As(err, &cerr) && !cerr.IsCritical() {
		log.Warn(cerr.Error())
		// Continue execution...
	} else {
		return err
	}
}
Error Behavior
Each package can define its own error behavior, but should follow these general principles:
- Critical Errors: Should terminate the current operation
- Non-Critical Errors: Should generate warnings but allow continuation
- Wrapped Errors: Should preserve the original error context
Each error should include:
- An error type indicating its severity
- A descriptive message
- The original error (if applicable)
- A method to determine if it’s critical
This pattern provides consistent error handling while remaining flexible enough to accommodate different package requirements. The `distribute` package provides a reference implementation of this pattern.
3.2.3 - Contributing Guidelines
Thank you for considering contributing to Divekit! This document outlines our contribution process and guidelines.
Code of Conduct
- Be respectful and inclusive
- Follow professional standards
- Help others learn and grow
- Report unacceptable behavior
Getting Started
- Fork the repository
- Set up your development environment
- Create a feature branch
- Make your changes
- Submit a pull request
Development Process
Branching Strategy
- `main`: Production-ready code
- `develop`: Integration branch
- Feature branches: `feature/your-feature`
- Bugfix branches: `fix/issue-description`
Commit Messages
Follow conventional commits:
type(scope): description
[optional body]
[optional footer]
The commit message header consists of three parts:
- `type`: Categorizes the type of change (see below)
- `scope`: Indicates the section of the codebase being changed (e.g. `cli`, `core`, `config`, `parser`)
- `description`: Brief description of the change in imperative mood
Examples:
feat(cli): add new flag for verbose output
fix(parser): handle empty config files correctly
docs(readme): update installation instructions
test(core): add tests for user authentication
Types:
- `feat`: New feature or functionality
- `fix`: Bug fix
- `docs`: Documentation changes
- `style`: Formatting, missing semicolons, etc. (no code changes)
- `refactor`: Code restructuring without changing functionality
- `test`: Adding or modifying tests
- `chore`: Maintenance tasks, dependencies, etc.
The body should explain the “why” of the change, while the description explains the “what”.
Pull Requests
- Update documentation
- Add/update tests
- Ensure CI passes
- Request review
- Address feedback
Code Style
- Follow Go best practices and idioms
- Use `gofmt` for consistent formatting
- Follow the official Go Code Review Comments
- Use `golint` and `golangci-lint`
- Write clear, idiomatic Go code
- Keep functions focused and well-documented
Testing
- Write unit tests using the standard `testing` package
- Use table-driven tests where appropriate
- Aim for good test coverage
- Write integration tests for complex functionality
- Use `go test` for running tests
- Consider using testify for assertions
Documentation
- Write clear godoc comments
- Update README.md and other documentation
- Include examples in documentation
- Document exported functions and types
- Keep documentation up to date with changes
Review Process
- Automated checks (golangci-lint, tests)
- Code review
- Documentation review
- Final approval
- Merge
Release Process
- Version bump
- Changelog update
- Tag release
- Documentation update
3.3 - Work in Progress
3.3.1 - π Notes
2024-10-01 Stefan, Torben (via Discord)
divekit patch
- Individual files are passed to the command
- Local testing is important for verification
- Variables are also replaced during patching
- Files are currently patched individually (can also be done in one commit)
divekit distribute
- Not `push`, because `git push`:
  - Performs consistency checks (merge needed, missing pulls)
  - There are differences between Origin and Remote (variables)
  - The target is not the client but a creation operation within the server (?)
2024-09-12 Stefan, Fabian, Torben (in Person)
config
- Distribution “test” -> “supervisor” -(later)-> “sandbox”
- Distribution “code” -> “student”
divekit doctor
- move to another “error control” command?
- execute before other appropriate commands and possibly abort
divekit install
- Possibly look into open source to see how others do it
- Offer executables; `divekit install` copies the/an executable into the home directory and writes the path to the divekit executable into the PATH (and maybe an update executable, and perhaps already prepares all the `doctor` preparations)
divekit init
- Merge with members for latecomers
- Also update overview (new members missing)
- Re-running ensures everything is in place
divekit distribute
- `push` -> `create`?
- `push` -> `distribute`! (favorite)
3.3.2 - Config Redesign
Current State
ARS
- `{ARS}/.env` » INIT
- `{ARS}/originRepositoryConfig.json` » INIT
- `{ARS}/relationsConfig.json` » INIT
- `{ARS}/variationsConfig.json` » SEMESTER
- `{ARS}/repositoryConfig.json` » MILESTONE
- `{ARS}/variableExtensionsConfig.json` ( » INIT )
  - `$.[i].variableExtensions.ClassPath.preValue` » SEMESTER
RepoEditor (-> PatchTool)
OriginRepo
`{OriginRepo}/repositoryConfig.json`
- `$.general`
- `$.repository`
  - `repositoryName` » CALL
  - `repositoryCount` » INIT
  - `repositoryMembers` » MILESTONE
- `$.individualRepositoryPersist`
- `$.local`
  - `originRepositoryFilePath` » MILESTONE
  - `subsetPaths` » CALL
- `$.remote`
  - `originRepositoryId` » MILESTONE
  - `codeRepositoryTargetGroupId` » MILESTONE
  - `testRepositoryTargetGroupId` » MILESTONE
  - `deleteExistingRepositories` » CALL
  - `addUsersAsGuests` » CALL
- `$.overview`
  - `generateOverview` » INIT
  - `overviewRepositoryId` » SEMESTER
  - `overviewFileName` » MILESTONE
Assigned Configurations
[0] INIT
Configurations that typically only need to be defined once during installation.
Optimally in: {$HOME}/.divekit/
- `{ARS}/.env` » INIT
- `{ARS}/originRepositoryConfig.json` » INIT
- `{ARS}/relationsConfig.json` » INIT
- `{ARS}/variableExtensionsConfig.json` ( » INIT )
  - `$.[i].variableExtensions.ClassPath.preValue` » SEMESTER
- `{OriginRepo}/repositoryConfig.json`
[1] SEMESTER
Configurations that typically only need to be defined once per semester. They are best stored in the OriginRepo.
Optimally in: {OriginRepo}/.divekit_norepo/{distribution}/
- `{ARS}/variationsConfig.json` » SEMESTER
- `{OriginRepo}/repositoryConfig.json`
  - `$.overview.overviewRepositoryId` » SEMESTER
[2] MILESTONE
Configurations that typically only need to be defined once per milestone. They are best stored in the OriginRepo.
Optimally in: {OriginRepo}/.divekit_norepo/{distribution:{milestone}}/
- `{ARS}/repositoryConfig.json` » MILESTONE
- `{OriginRepo}/repositoryConfig.json`
  - `$.repository.repositoryMembers` » MILESTONE
  - `$.local.originRepositoryFilePath` » MILESTONE
  - `$.remote`
    - `originRepositoryId` » MILESTONE
    - `codeRepositoryTargetGroupId` » MILESTONE
    - `testRepositoryTargetGroupId` » MILESTONE
  - `$.overview.overviewFileName` » MILESTONE
[3] CALL
Configurations that must be defined with each call.
Optimally in: CLI flags
Future
[0] INIT
`{ARS}/.env` will be stored in `{$HOME}/.divekit/`:
ACCESS_TOKEN=YOUR_ACCESS_TOKEN
HOST=https://git.st.archi-lab.io
BRANCH=main
`{ARS}/originRepositoryConfig.json` -> `{$HOME}/.divekit/origin.json`
Will be stored here during installation and then copied to the new Origin Repos during `divekit init`.
{
"variables": {
"variableDelimiter": "$"
},
"solutionDeletion": {
"deleteFileKey": "//deleteFile",
"deleteParagraphKey": "//delete",
"replaceMap": {
"//unsup": "throw new UnsupportedOperationException();",
"//todo": "// TODO"
}
},
"warnings": {
"variableValueWarnings": {
"typeWhiteList": ["json", "java", "md"],
"ignoreList": ["name", "type"]
}
}
}
Suggested change:
{
"version": "2.0",
"variables": {
"delimiter": "$"
},
"solutionCleanup": {
"deleteFile": "//deleteFile",
"replaceParagraph": {
"//unsup": "throw new UnsupportedOperationException();",
"//todo": "// TODO",
"//delete": null
}
},
"warnings": {
"variation": {
"fileTypes": ["json", "java", "md"],
"ignore": ["name", "type"]
}
}
}
`{ARS}/relationsConfig.json` -> `{$HOME}/.divekit/variation/relations.json`
Will be stored here during installation and then copied to the new Origin Repos during `divekit init`.
[!NOTE]
I don’t fully understand what this is for - it may remain here forever and not need to be copied to the Origin Repo?
(what is UmletRev? What does the star mean?)
[
{
"id": "OneToOne",
"Umlet": "lt=-\nm1=1\nm2=1",
"UmletRev": "lt=-\nm1=1\nm2=1",
"Short": "1 - 1",
"Description": "one to one"
},
{
"id": "OneToMany",
"Umlet": "lt=-\nm1=1\nm2=*",
"UmletRev": "lt=-\nm1=*\nm2=1",
"Short": "1 - n",
"Description": "one to many"
},
{
"id": "ManyToOne",
"Umlet": "lt=-\nm1=*\nm2=1",
"UmletRev": "lt=-\nm1=1\nm2=*",
"Short": "n - 1",
"Description": "many to one"
},
{
"id": "ManyToMany",
"Umlet": "lt=-\nm1=*\nm2=*",
"UmletRev": "lt=-\nm1=*\nm2=*",
"Short": "n - m",
"Description": "many to many"
}
]
Suggested change:
`id` -> `key`?
{
"version": "2.0",
"relations": [
{
"id": "OneToOne",
"umlet": "lt=-\nm1=1\nm2=1",
"umletRev": "lt=-\nm1=1\nm2=1",
"short": "1 - 1",
"description": "one to one"
},
{
"id": "OneToMany",
"umlet": "lt=-\nm1=1\nm2=*",
"umletRev": "lt=-\nm1=*\nm2=1",
"short": "1 - n",
"description": "one to many"
},
{
"id": "ManyToOne",
"umlet": "lt=-\nm1=*\nm2=1",
"umletRev": "lt=-\nm1=1\nm2=*",
"short": "n - 1",
"description": "many to one"
},
{
"id": "ManyToMany",
"umlet": "lt=-\nm1=*\nm2=*",
"umletRev": "lt=-\nm1=*\nm2=*",
"short": "n - m",
"description": "many to many"
}
]
}
`{ARS}/variableExtensionsConfig.json` -> `{$HOME}/.divekit/variation/variableExtensions.json`
Will be stored here during installation and then copied to the new Origin Repos during `divekit init`.
[
{
"id": "Basic",
"variableExtensions": {
"": {
"preValue": "",
"value": "id",
"postValue": "",
"modifier": "NONE"
},
"Class": {
"preValue": "",
"value": "id",
"postValue": "",
"modifier": "NONE"
},
"Package": {
"preValue": "",
"value": "Class",
"postValue": "",
"modifier": "ALL_LOWER_CASE"
},
"ClassPath": {
"preValue": "thkoeln.st.st2praktikum.racing.", // ??? deprecated ???
"value": "Class",
"postValue": ".domain",
"modifier": "ALL_LOWER_CASE"
}
}
},
{
"id": "Getter",
"variableExtensions": {
"GetToOne": {
"preValue": "get",
"value": "Class",
"postValue": "",
"modifier": "NONE"
},
"GetToMany": {
"preValue": "get",
"value": "s",
"postValue": "",
"modifier": "NONE"
}
}
}
]
Questions
From my notes
I thought I had written this somewhere already, but I can’t find it anymore.
- [0] INIT -> “Installation” exists twice
- Once during DiveKit installation
- Once during DiveKit initialization in a new OriginRepo
So what should go where (I have some ideas)?
- Is the `preValue` still needed?
I unfortunately don’t remember exactly what/why, but this was causing some significant issues.
3.3.3 - Deployment
[!WARNING]
Not implemented this way yet - the current process is shown in the gif below.
This guide covers the process of deploying and releasing new versions of Divekit.
Version Management
Semantic Versioning
Divekit follows Semantic Versioning:
- MAJOR version for incompatible API changes
- MINOR version for new functionality
- PATCH version for bug fixes
Version Tagging
# Current version is v2.0.0
# Bump patch version (e.g., v2.0.0 -> v2.0.1)
./deploy.sh patch
# Bump minor version (e.g., v2.0.0 -> v2.1.0)
./deploy.sh minor
# Bump major version (e.g., v2.0.0 -> v3.0.0)
./deploy.sh major
# Create alpha/beta versions
./deploy.sh minor -alpha.1 # Creates v2.1.0-alpha.1
./deploy.sh patch -beta.2 # Creates v2.0.1-beta.2
# Rollback options
./deploy.sh rollback # Removes current tag and returns to previous version
./deploy.sh rollback v2.1.0 # Removes specific version tag
Example (current state)
Release Process
- Update version using deploy.sh:
./deploy.sh <patch|minor|major> [-alpha.N|-beta.N]
- Update CHANGELOG.md:
## [2.0.1] - YYYY-MM-DD
### Added
- New feature X
- Command Y support
### Changed
- Improved Z performance
### Fixed
- Bug in command A
- Create release branch:
git checkout -b release/v2.0.1
- Build and test locally:
go test ./...
go build
- Create GitLab release:
- Tag version is created automatically
- Changelog from CHANGELOG.md is included automatically
- CI pipeline automatically:
- Runs all tests
- Builds binaries for all supported platforms
- Creates release artifacts
- Uploads binaries to the release
Deployment Checklist
- All tests passing locally (`go test ./...`)
- Documentation updated
- CHANGELOG.md updated
- Version tagged using `deploy.sh`
- GitLab CI/CD Pipeline completed successfully:
- Binaries built successfully
- Release artifacts generated
- Release created and verified in GitLab
- Generated binaries tested on sample installation
Rollback Procedure
If issues are found:
- Execute rollback using deploy.sh:
./deploy.sh rollback [version] # Version is optional
This automatically executes the following steps:
- Deletes the specified tag (or current tag if no version specified) locally and remote
- Reverts to the previous version
- Creates a new hotfix branch if desired
Examples:
./deploy.sh rollback # Removes the most recent tag
./deploy.sh rollback v2.1.0 # Removes specific version v2.1.0
./deploy.sh rollback v2.0.0-alpha.1 # Removes a specific alpha version
If manual rollback is necessary:
git tag -d v2.0.1
git push origin :refs/tags/v2.0.1
git checkout -b hotfix/2.0.2
3.4 -
3.4.1 - Go Testing Guide
What should be tested in this project?
Given that this CLI is the entry point for the user to interact with Divekit, it is essential to test all commands.
Currently, there is only one command, `patch`, but all commands should be tested with the following aspects in mind:
- Command Syntax: Verify that the command syntax is correct
- Command Execution: Ensure that executing the command produces the expected behavior or output
- Options and Arguments: Test each option and argument individually to ensure they are processed correctly and test various combinations of options and arguments
- Error Handling: Test how the command handles incorrect syntax, invalid options, or missing arguments
Additionally, testing the utility functions is necessary, as they are used throughout the entire project. For that the following aspects should be considered:
- Code Paths: Every possible path through the code should be tested, which should include “happy paths” (expected input and output) as well as “edge cases” (unexpected inputs and conditions).
- Error Conditions: Check that the code handles error conditions correctly. For example, if a function is supposed to handle an array of items, what happens when it's given an empty array? What about an array with only one item, or an array with the maximum number of items?
How should something be tested?
Commands should be tested with integration tests since they interact with the entire project. Integration tests are utilized to verify that all components of this project work together as expected in order to test the mentioned aspects.
To detect early bugs, utility functions should be tested with unit tests. Unit tests are used to verify the behavior of specific functionalities in isolation. They ensure that individual units of code produce the correct and expected output for various inputs.
How are tests written in Go?
Prerequisites
It’s worth mentioning that the following packages are utilized in this project for testing code.
The testing package
The standard library provides the testing package, which is required to support testing in Go. It offers different types from the testing library [1, pp. 37-38]:
- `testing.T`: To interact with the test runner, all tests must use this type. It contains methods for declaring failing tests, skipping tests, and running tests in parallel.
- `testing.B`: Similar to the test runner, this type is a benchmark runner. It shares the same methods for failing tests, skipping tests and running benchmarks concurrently. Benchmarks are generally used to determine the performance of written code.
- `testing.F`: This type generates a randomized seed for the testing target and collaborates with the `testing.T` type to provide test-running functionality. Fuzz tests are unique tests that generate random inputs to discover edge cases and identify bugs in written code.
- `testing.M`: This type allows for additional setup or teardown before or after tests are executed.
The testify toolkit
The testify toolkit provides several packages to work with assertions, mock objects and testing suites [4]. Primarily, the assertion package is used in this project for writing assertions more easily.
Test signature
To write unit or integration tests in Go, it is necessary to construct test functions following a particular signature:
func TestName(t *testing.T) {
// implementation
}
This test signature highlights the following requirements [1, p. 40]:
- Exported functions with names starting with “Test” are considered tests.
- Test names can have an additional suffix that specifies what the test is covering. The suffix must also begin with a capital letter. In this case, “Name” is the specified suffix.
- Tests are required to accept a single parameter of the `*testing.T` type.
- Tests should not include a return type.
Unit tests
Unit tests are small, fast tests that verify the behavior of specific functionalities in isolation. They ensure that individual units of code produce the correct and expected output for various inputs.
To illustrate unit tests, a new file named `divide.go` is created with the following code:
package main
func Divide(a, b int) float64 {
return float64(a) / float64(b)
}
By convention, tests are located in the same package as the function being tested.
It's important that all test files end with the `_test.go` suffix so they are detected by the test runner.
Accordingly, `divide_test.go` is also created within the main package:
package main
import (
"github.com/stretchr/testify/assert"
"testing"
)
func TestDivide(t *testing.T) {
// Arrange
should, a, b := 2.5, 5, 2
// Act
is := Divide(a, b)
// Assert
assert.Equal(t, should, is, "Got %v, want %v", is, should)
}
Writing unit or integration tests in the Arrange-Act-Assert (AAA) pattern is a common practice. This pattern establishes a standard for writing and reading tests, reducing the cognitive load for both new and existing team members and enhancing the maintainability of the code base [1, p. 14].
In this instance, the test is formulated as follows:
Arrange: All preconditions and inputs get set up.
Act: The Act step executes the actions outlined in the test scenario, with the specific actions depending on the type of test. In this instance, it calls the Divide function and utilizes the inputs from the Arrange step.
Assert: During this step, the precondition from the Arrange step is compared with the output. If the output does not match the precondition, the test is considered failed, and an error message is displayed.
It’s worth noting that the Act and Assert steps can be iterated as many times as needed, proving beneficial, particularly in the context of table-driven tests.
Table-driven tests for unit and integration tests
To cover all test cases it is required to call Act and Assert multiple times. It would be possible to write one test per case, but this would lead to a lot of duplication, reducing the readability. An alternative approach is to invoke the same test function several times. However, in case of a test failure, pinpointing the exact point of failure may pose a challenge [2]. Instead, in the table-driven approach, preconditions and inputs are structured as a table in the Arrange step.
As a consequence, `divide_test.go` gets adjusted in the following steps [1, pp. 104-109]:
Step 1 - Create a structure for test cases
In the first step a custom type is declared within the test function. As an alternative the structure could be declared outside the scope of the test function. The purpose of this structure is to hold the inputs and expected preconditions of the test case.
The test cases for the previously mentioned Divide
function could look like this:
package main
import (
"math"
"testing"
)
func TestDivide(t *testing.T) {
// Arrange
testCases := []struct {
name string // test case name
dividend int // input
divisor int // input
quotient float64 // expected
}{
{"Regular division", 5, 2, 2.5},
{"Divide with negative numbers", 5, -2, -2.5},
{"Divide by 0", 5, 0, math.Inf(1)},
}
	_ = testCases // silence the unused-variable error; the table is consumed in step 2
}
The `struct` type wraps `name`, `dividend`, `divisor` and `quotient`. `name` describes the purpose of a test case and can be used to identify a test case in case an error occurs.
Step 2 - Executing each test and assert it
Each test case from the table will be executed as a subtest. To achieve this, the `testCases` are iterated over, and each `testCase` is executed in a separate goroutine [3] with `t.Run()`.
The purpose of this is to individually fail tests without concerns about disrupting other tests.
Within `t.Run()`, the Act and Assert steps are performed:
package main
import (
"github.com/stretchr/testify/assert"
"math"
"testing"
)
func TestDivide(t *testing.T) {
// Arrange
testCases := []struct {
name string // test case name
dividend int // input
divisor int // input
quotient float64 // expected
}{
{"Regular division", 5, 2, 2.5},
{"Divide with negative numbers", 5, -2, -2.5},
{"Divide by 0", 5, 0, math.Inf(1)},
}
for _, testCase := range testCases {
t.Run(testCase.name, func(t *testing.T) {
// Act
quotient := Divide(testCase.dividend, testCase.divisor)
// Assert
assert.Equal(t, testCase.quotient, quotient)
})
}
}
Setup and teardown
Setup and teardown before and after a test
Setup and teardown are used to prepare the environment for tests and clean up after tests have been executed.
In Go, the type `testing.M` from the testing package fulfills this purpose and is used as a parameter for the `TestMain` function, which controls the setup and teardown of tests.
To use this function, it must be included within the package alongside the tests, as the scope of functions is limited to the package in which they are defined. This implies that each package can only have one `TestMain` function; consequently, it is called only when a test is executed within the package [5].
The following example illustrates how it works [1, p. 51]:
package main

import (
	"log"
	"os"
	"testing"
)

func TestMain(m *testing.M) {
// setup statements
setup()
// run the tests
e := m.Run()
// cleanup statements
teardown()
// report the exit code
os.Exit(e)
}
func setup() {
log.Println("Setting up.")
}
func teardown() {
log.Println("Tearing down.")
}
`TestMain` runs before any tests are executed and defines the `setup` and `teardown` functions. The `Run` method from `testing.M` is used to invoke the tests and returns an exit code that is used to report the success or failure of the tests.
Setup and teardown before and after each test
To tear down after each test, the `t.Cleanup` function provided by the testing package can be used [2].
Since there is no dedicated hook for per-test setup, the `setup` function is simply called at the start of each test.
This example shows how this can be used:
package main

import "testing"

func TestWithSetupAndCleanup(t *testing.T) {
	setup() // per-test setup (defined elsewhere in the package)
t.Cleanup(func() {
// cleanup logic
})
// more test code here
}
Write integration tests
Integration tests are used to verify the interaction between different components of a system. However, the mentioned principles for writing unit tests also apply to integration tests. The only difference is that integration tests involve a greater amount of code, as they encompass multiple components.
How to run tests?
To run tests from the CLI, the `go test` command is used, which is part of the Go toolchain [6].
The list shows some examples of how to run tests:
- To run a specific test, the `-run` flag can be used. For example, to run the `TestDivide` test from the `divide_test.go` file, the following command can be used: `go test -run TestDivide`. Note that the argument for `-run` is a regular expression, so it is possible to run multiple tests at once.
- To run all tests in a package, run `go test <packageName>`. Note that the package name should include a relative path if the package is not in the working directory.
- To run all tests in a project, run `go test ./...`. The argument is a wildcard matching all subdirectories; therefore, it is crucial for the working directory to be set to the root of the project to recursively run all tests.
Additionally, tests can be run from the IDE. For example, in GoLand, the IDE will automatically detect tests and provide a gutter icon to run them [7].
How is the `patch` command tested?
Prerequisites
Before `patch` can be tested, it is necessary to do the following:
- Replace the placeholders in the file `.env.example` and rename it to `.env`. If you have no API token, you can generate one here.
- Run the script `setup.ps1` as administrator. This script will install all necessary dependencies and initialize the ARS, Repo-Editor, and Test-Origin repositories.
Test data
To test `patch`, it was necessary to use a test origin repository as test data. In this context, the test origin repository is a repository that contains all the necessary files and configurations from ST1 to test different scenarios.
Additionally, a test group was created to test if the Repo-Editor-repository actually pushes the generated files to remote repositories. Currently, the test group contains the following repositories:
coderepos:
ST1_Test_group_8063661e-3603-4b84-b780-aa5ff1c3fe7d
ST1_Test_group_86bd537d-9995-4c92-a6f4-bec97eeb7c67
ST1_Test_group_8754b8cb-5bc6-4593-9cb8-7c84df266f59
testrepos:
ST1_Test_tests_group_446e3369-ed35-473e-b825-9cc0aecd6ba3
ST1_Test_tests_group_9672285a-67b0-4f2e-830c-72925ba8c76e
Structure of a test case
`patch` is tested with a table-driven test, which is located in the file `patch_test.go`.
The following example shows the structure of a test case:
package patch
func TestPatch(t *testing.T) {
testCases := []struct {
name string
arguments PatchArguments // input
generatedFiles []GeneratedFile // expected
error error // expected
}{
{
"example test case",
PatchArguments{
dryRun: true | false,
logLevel: "[empty] | info | debug | warning | error",
originRepo: "path_to_test_origin_repo",
home: "[empty] | path_to_repositories",
distribution: "[empty] | code | test",
patchFiles: []string{"patch_file_name"},
},
[]GeneratedFile{
{
RepoName: "repository_name",
RelFilePath: "path_to_the_generated_file",
Distribution: Code | Test,
Include: []string{"should_be_found_in_the_generated_file"},
Exclude: []string{"should_not_be_found_in_the_generated_file"},
},
},
error: nil | errorType,
},
}
// [run test cases]
}
The `name` field is the name of the test case and is used to identify the test case in case of an error.
The struct `PatchArguments` contains all the necessary arguments to run the `patch` command:
- `dryRun`: If true, generated files will not be pushed to a remote repository.
- `logLevel`: The log level of the command.
- `originRepo`: The path to the test origin repository.
- `home`: The path to the divekit repositories.
- `distribution`: The distribution to patch.
- `patchFiles`: The patch files to apply.
The struct `GeneratedFile` is the expected result of the `patch` command and contains the following properties:
- `RepoName`: The name of the generated repository.
- `RelFilePath`: The relative file path of the generated file.
- `Distribution`: The distribution of the generated file.
- `Include`: Keywords that should be found in the generated file.
- `Exclude`: Keywords that should not be found in the generated file.

The `error` field is the expected error of the `patch` command. It can be `nil` when no error is expected, or contain a specific error type if an error is expected.
Process of a test case
The following code snippet shows how test cases are processed:
package patch
func TestPatch(t *testing.T) {
// [define test cases]
for _, testCase := range testCases {
t.Run(testCase.name, func(t *testing.T) {
generatedFiles := testCase.generatedFiles
dryRunFlag := testCase.arguments.dryRun
distributionFlag := testCase.arguments.distribution
deleteFilesFromRepositories(t, generatedFiles, dryRunFlag) // step 1
_, err := executePatch(testCase.arguments) // step 2
checkErrorType(t, testCase.error, err) // step 3
if err == nil {
matchGeneratedFiles(t, generatedFiles, distributionFlag) // step 4
checkFileContent(t, generatedFiles) // step 5
checkPushedFiles(t, generatedFiles, dryRunFlag) // step 6
}
})
}
}
Each test case runs the following sequence of steps:
1. `deleteFilesFromRepositories` deletes the specified files from their respective repositories. Prior to testing, it is necessary to delete these files to ensure that they are actually pushed to the repositories, given that they are initially included in the repositories.
2. `executePatch` executes the patch command with the given arguments and returns the output and the error.
3. `checkErrorType` checks if the expected error type matches the actual error type.
4. `matchGeneratedFiles` checks if the found file paths match the expected files and throws an error when there are any differences.
5. `checkFileContent` checks if the content of the files is correct.
6. `checkPushedFiles` checks if the generated files have been pushed correctly to the corresponding repositories.
References
[1] A. Simion, Test-Driven Development in Go. Packt Publishing Ltd, 2023.
[2] “Comprehensive Guide to Testing in Go | The GoLand Blog," The JetBrains Blog (accessed Jan. 29, 2024).
[3] “Goroutines in Golang - Golang Docs," (accessed Jan. 29, 2024).
[4] “Using the Testify toolkit | GoLand," GoLand Help. (accessed Jan. 29, 2024).
[5] “Why use TestMain for testing in Go?" (accessed Jan. 29, 2024).
[6] “Go Toolchain - Go Wiki” (accessed Jan. 29, 2024).
[7] “Run tests | GoLand," GoLand Help. (accessed Jan. 29, 2024).
3.4.2 - Testrepo
The documentation is not yet written. Feel free to add it yourself ;)
Testing Package structure
static final String PACKAGE_PREFIX = "thkoeln.divekit.archilab.";
@Test
public void testPackageStructure() {
try {
Class.forName(PACKAGE_PREFIX + "domainprimitives.StorageCapacity");
Class.forName(PACKAGE_PREFIX + "notebook.application.NotebookDto");
Class.forName(PACKAGE_PREFIX + "notebook.application.NotebookController");
Class.forName(PACKAGE_PREFIX + "notebook.domain.Notebook");
// using individualization and the variableExtensionConfig.json this could be simplified to
// Class.forName("$entityPackage$.domain.$entityClass$");
// ==> Attention: If used, the test can't be tested in the origin repo itself
} catch (ClassNotFoundException e) {
Assertions.fail("At least one of your entities is not in the right package, or has a wrong name. Please check package structure and spelling!");
}
}
Testing REST Controller
@Autowired
private MockMvc mockMvc;
@Test
public void notFoundTest() throws Exception {
mockMvc.perform(get("/notFound")
.accept(MediaType.APPLICATION_JSON))
.andDo(print())
.andExpect(status().isNotFound());
}
@Transactional
@Test
public void getPrimeNumberTest() throws Exception {
final Integer expectedPrimeNumber = 13;
mockMvc.perform(get("/primeNumber")
.accept(MediaType.APPLICATION_JSON))
.andDo(print())
.andExpect(status().isOk())
.andExpect(jsonPath("$", Matchers.is(expectedPrimeNumber))).andReturn();
}
Testing …
4 - Archive
Archive
Documentation for legacy tools that are being replaced by the CLI.
4.1 - Access Manager
The documentation is not yet written. Feel free to add it yourself ;)
4.2 - Access Manager 2.0
Setup & Run
- Install Python 3 or higher
- Install python-gitlab using
pip install python-gitlab
- Check the file
config.py
to configure the tool
- Run AccessManager.py using
python AccessManager.py
Configuration
Option | Purpose |
---|---|
GIT_URL | URL of your GitLab Server |
AUTH_TOKEN | Your personal GitLab access token |
GROUP_ID | Id of the GitLab Group you want to modify |
ACCESS_LEVEL | Access level you want to provide. 1 for Maintainer, 0 for Guest |
STUDENTS | List of users to modify. Users not in this List will be ignored. |
4.3 - Automated Repo Setup
Setup & Run
Install Node.js (version >= 12.0.0), which is required to run this tool. Node.js can be downloaded from nodejs.org.
To use this tool you have to clone the repository to your local drive.
This tool uses several libraries, e.g. to access the GitLab API. Install these libraries by running the command
npm install
in the root folder of this project.
Local/GitLab usage
- For local use only
- Copy the origin repository into the folder resources/test/input. If this folder does not exist, create the folder test inside resources and then the folder input inside the newly created test folder
- The generated repositories will be located under resources/test/output after running the tool
- For use with Gitlab:
- Navigate to https://git.st.archi-lab.io/profile/personal_access_tokens (if you are using the GitLab instance git.st.archi-lab.io) and generate an Access Token / API token in order to get access to the GitLab API
- Copy the Access Token
- Rename the file .env.example to .env
- Open .env and replace YOUR_API_TOKEN with the token you copied.
- Configure the source repository and target group in the config
Before you can configure or run this tool you have to copy all the example config files from the resources/examples/config folder to the resources/config folder in order to create your own config files. If you want to change the standard behaviour you can configure this tool by editing these configs.
To run the application navigate into the root folder of this tool and run
npm start
. The repositories will now be generated.
Configuration
Before you can configure this tool you have to copy all the relevant example config files from the resources/examples/config folder to the resources/config folder in order to create your own config files. If you want to change the standard behaviour you can configure this tool by editing these configs.
The Divekit uses two types of configs: technical configs and domain-specific configs. The contents of technical configs often change each time repositories are generated, which is why these configs are located in the resources/config folder of the Divekit. Domain configs do not change each time new repositories are generated because they depend on the type of exercise and the corresponding domain. As a result these configs should be contained in the Origin Project (though they don't have to be). The following lists the different configs, their purpose and their type:
Config | Purpose | Type |
---|---|---|
repositoryConfig | Configure the process of repository generation | Technical Config |
originRepositoryConfig | Configure solution deletion and variable warnings | Domain Config |
variationsConfig | Configure different types of variations | Domain Config |
variableExtensionsConfig | Configure different extensions which are used to generate derived variables | Domain Config |
relationsConfig | Configure properties of relations which are used to generate relation variables | Domain Config |
Features
Repository Generation
When the Divekit is run, the tool will generate repositories based on the options defined in the repositoryConfig. The following example shows the relevant options, each explained in short:
{
"general": {
# Decide whether you just want to test locally. If set to false the GitLab API will be used
"localMode": true,
# Decide whether test repositories should be generated as well. If set to false only one code repository will be generated for each learner
"createTestRepository": true,
# Decide whether the repositories should be randomized using the *variationsConfig.json*
"variateRepositories": true,
# Decide whether the existing solution should be deleted using the SolutionDeleter
"deleteSolution": false,
# Activate warnings which will warn you if there are suspicious variable values remaining after variable placeholders have been replaced
"activateVariableValueWarnings": true,
# Define the number of concurrent repository generation processes. Keep in mind that high numbers can overload the Gitlab server if localMode is set to false
"maxConcurrentWorkers": 1
# Optional flag: set the logging level. Valid values are "debug", "info", "warn", "error" (case insensitive). Default value is "info".
"globalLogLevel": "debug"
},
"repository": {
# The Name of the repositories. Multiple repositories will be named <repositoryName>_group_<uuid>, <repositoryName>_tests_group_<uuid> ...
"repositoryName": "st2-praktikum",
# The number of repositories which will be created. Only relevant if there were no repositoryMembers defined
"repositoryCount": 0,
# The user names of the members which get access to repositories
"repositoryMembers": [
["st2-praktikum"]
]
},
"local": {
# The file path to an origin repository which should be used for local testing
"originRepositoryFilePath": ""
},
"remote": {
# Id of the repository you want to clone
"originRepositoryId": 1012,
# The ids of the target groups where all repositories will be located
"codeRepositoryTargetGroupId": 161,
"testRepositoryTargetGroupId": 170,
# If set to true all existing repositories inside the defined groups will be deleted
"deleteExistingRepositories": false,
# Define whether users are added as maintainers or as guests
"addUsersAsGuests": false
}
}
If localMode is set to true the application will only generate possible variable variations and randomize files based on a folder which contains the origin repository. This folder should be located under resources/test/input. If resources/test/input does not exist, create it within the root folder of this tool or run the tool once in test mode, which will generate the folder automatically. This can be used to get an idea of which repositories will result from the configs. The following example shows the location of the origin folder:
root_of_tool
├── build
├── node_modules
├── src
├── .gitignore
├── .Readme
└── resources
    └── test
        └── input
            └── origin-folder
                ├── src
                ├── .gitignore
                └── .Readme
If you don't want to copy the origin repository each time you test a new version, specify the file path to the origin repository in the config under local.originRepositoryFilePath.
Partial repository generation
While running the automated-repo-setup in local mode you have the option to partially generate repositories. To do so, configure the repositoryConfig.json* as follows:
{
"general": {
"localMode": true
},
"local": {
"subsetPaths": [
"README.md",
"path/to/malfunction/file.eof"
]
}
}
*only partially shown
Start generation
npm start
Generated files are located under: resources/output/
File Assignment
Although code and test files are separated into two repositories, the exercise itself only consists of one repository, called the origin. It would be troublesome if you had to update two repositories all the time while creating a new exercise. Because of that, there has to be a way to determine whether a file should be copied to the code project, the test project or both. If you want some files to only be copied to a specific repository you can express this behaviour in the filename.
- If the filename contains the string _coderepo the file will only be copied to the code repository.
- If the filename contains the string _testrepo the file will only be copied to the test repository.
- If the filename contains the string _norepo the file will not be copied to the repositories. This can be used to store config files from this tool directly in the origin repository.
- If the filename contains none of those the file will be copied to both repositories.
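The suffix rules above can be sketched as a small helper. This is a minimal illustrative sketch in TypeScript; the function name and types are assumptions, not the actual Divekit code:

```typescript
// Decide which generated repositories a file belongs to, based on the
// marker strings in its filename (illustrative sketch, not Divekit's API).
type TargetRepo = "code" | "test";

function targetRepos(fileName: string): TargetRepo[] {
  if (fileName.includes("_norepo")) return [];          // stays in the origin only
  if (fileName.includes("_coderepo")) return ["code"];  // code repository only
  if (fileName.includes("_testrepo")) return ["test"];  // test repository only
  return ["code", "test"];                              // default: copied to both
}
```

For example, a file named VehicleTest_testrepo.java would end up only in the test repository, while Readme.md is copied to both.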
File Converter
If you want to convert or manipulate certain repository files during the repository generation process, File Converters (File Manipulators) can be used. Currently there is only one type of File Manipulator available; additional converters can easily be added by extending the codebase of the Divekit. The existing File Manipulator is called UmletFileManipulator. This manipulator is used to convert the individualized XML representations of Umlet diagrams to image formats. This conversion step cannot be skipped because it is not possible to replace variables in image representations of Umlet diagrams. Therefore the process of individualizing UML diagrams created with Umlet is as follows:
UML diagram with placeholder variables (xml) -> UML diagram with already replaced content (xml) -> UML diagram with already replaced content (image file format)
Test Overview
To give an overview of passed and failed tests of a repository, a test overview page will be generated using the projects report-mapper and report-visualizer. The tools are called within the .gitlab-ci.yml file in the deploy stage.
Repository Overview
If the generation of the overview table is enabled in the repositoryConfig, the destination and the name of the overview table can be defined in that file as well:
{
"overview": {
"generateOverview": true,
"overviewRepositoryId": 1018,
"overviewFileName": "st2-praktikum"
}
}
Given the config shown above a markdown file will be generated which includes a summary of all generated repositories and their members. After that the file will be uploaded to the configured repository:
Solution Deletion
If you want solutions contained in your origin project to be removed while creating the code and test repositories, enable solution deletion in the repositoryConfig. The originRepositoryConfig specifies the keywords which are used to either
- delete a file
- delete a paragraph
- replace a paragraph
This can be shown best with an example:
// TODO calculate the sum of number 1 and number 2 and return the result
public static int sumInt(int number1, int number2) {
//unsup
return number1 + number2;
//unsup
}
// TODO calculate the product of number 1 and number 2 and return the result
public static int multiplyInt(int number1, int number2) {
//delete
return number1 * number2;
//delete
}
will be changed to:
// TODO calculate the sum of number 1 and number 2 and return the result
public static int sumInt(int number1, int number2) {
throw new UnsupportedOperationException();
}
// TODO calculate the product of number 1 and number 2 and return the result
public static int multiplyInt(int number1, int number2) {
}
The corresponding config entry in the originRepositoryConfig would be:
{
"solutionDeletion": {
"deleteFileKey": "//deleteFile",
"deleteParagraphKey": "//delete",
"replaceMap": {
"//unsup": "throw new UnsupportedOperationException();"
}
}
}
A file containing the string “//deleteFile” would be deleted.
Individualization
If you want your project to be randomized slightly, use the configuration files variationsConfig.json, variableExtensionsConfig and relationsConfig to create variables. Variables can be referenced later by their name, encapsulated in configured delimiter signs, e.g. $ThisIsAVariable$.
Variable Generation
There are three types of variables:
Object Variables
Object Variables are used to randomize Entities and Value Objects. Such variables are created by defining one or multiple ids and an array of possible object variations. Object variations can contain attributes which will later be transformed into a variable. An example attribute could be Class, which contains the class name of an entity. Keep in mind that attributes are not limited to a single primitive value but can also be expressed as a nested object inside the JSON. The following JSON shows a possible declaration of two object variable definitions inside the variationsConfig:
{
"ids": "Vehicle",
"objectVariations": [
{
"id": "Car",
"Class": "Car",
"RepoClass": "CarRepository",
"SetToOne": "setCar",
"SetToMany": "setCars"
},
{
"id": "Truck",
"Class": "Truck",
"RepoClass": "TruckRepository",
"SetToOne": "setTruck",
"SetToMany": "setTrucks"
},
{
"id": "Train",
"Class": "Train",
"RepoClass": "TrainRepository",
"SetToOne": "setTrain",
"SetToMany": "setTrains"
}
],
"variableExtensions": [
"Getter"
]
},
{
"ids": ["Wheel1", "Wheel2"],
"objectVariations": [
{
"id": "FrontWheel",
"Class": "FrontWheel",
"RepoClass": "FrontWheelRepository",
"SetToOne": "setFrontWheel",
"SetToMany": "setFrontWheels"
},
{
"id": "BackWheel",
"Class": "BackWheel",
"RepoClass": "BackWheelRepository",
"SetToOne": "setBackWheel",
"SetToMany": "setBackWheels"
}
],
"variableExtensions": ["Getter"]
}
The defined object variations are now randomly assigned to the variables Vehicle, Wheel1 and Wheel2. The following dictionary shows variables which result from the above declaration:
VehicleClass: 'Truck',
VehicleRepoClass: 'TruckRepository',
VehicleGetToOne: 'getTruck',
VehicleGetToMany: 'getTrucks',
VehicleSetToOne: 'setTruck',
VehicleSetToMany: 'setTrucks',
Wheel1Class: 'BackWheel',
Wheel1RepoClass: 'BackWheelRepository',
Wheel1GetToOne: 'getBackWheel',
Wheel1GetToMany: 'getBackWheels',
Wheel1SetToOne: 'setBackWheel',
Wheel1SetToMany: 'setBackWheels',
Wheel2Class: 'FrontWheel',
Wheel2RepoClass: 'FrontWheelRepository',
Wheel2GetToOne: 'getFrontWheel',
Wheel2GetToMany: 'getFrontWheels',
Wheel2SetToOne: 'setFrontWheel',
Wheel2SetToMany: 'setFrontWheels'
In the example above you can see that some variables could be derived from already existing variables. The setter variables are a perfect example for this. Such variables can also be defined through variable extensions. This is done for the getter variables in the example. Two steps are required to define such derived variables:
- Define a rule for a variable extension in the config variableExtensionsConfig.json:
{
"id": "Getter",
"variableExtensions": {
"GetToOne": {
"preValue": "get",
"value": "CLASS",
"postValue": "",
"modifier": "NONE"
},
"GetToMany": {
"preValue": "get",
"value": "PLURAL",
"postValue": "",
"modifier": "NONE"
}
}
}
The value attribute references an already existing variable which is modified through the given modifier. Valid modifiers can for example convert the given variable to an all lower case variant.
The resulting value is then concatenated with the preValue and postValue like so: preValue + modifier(value) + postValue.
- Define a certain variable extension for an object by adding the id of the variable extension to the list of variable extensions of an object (see example above).
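The rule preValue + modifier(value) + postValue can be sketched as follows. This is a minimal TypeScript sketch; the modifier set and type names are assumptions, not the tool's actual identifiers:

```typescript
// Apply a variable extension rule to the attributes of an object variation
// (illustrative sketch; the "LOWER" modifier name is assumed).
interface ExtensionRule {
  preValue: string;
  value: string;      // references an existing attribute, e.g. "CLASS" or "PLURAL"
  postValue: string;
  modifier: "NONE" | "LOWER";
}

function applyExtension(rule: ExtensionRule, attributes: Record<string, string>): string {
  const base = attributes[rule.value] ?? "";
  const modified = rule.modifier === "LOWER" ? base.toLowerCase() : base;
  return rule.preValue + modified + rule.postValue;
}
```

With the "GetToOne" rule from the config above and the object variation "Truck", this would concatenate "get" + "Truck" + "" into "getTruck".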
Relation Variables
Relation Variables are used to randomize relations between entities. They are defined by declaring an array of relationShips and an array of relationObjects inside the variationsConfig. Both arrays must be of equal length because each set of relationObjects will be assigned to a relationShip.
In order to define a relationShip you have to provide an id and a reference to a relationShip type. These types are defined in the file relationsConfig and can contain any kind of attributes:
{
"id": "OneToOne",
"Umlet": "lt=-\nm1=1\nm2=1",
"Short": "1 - 1",
"Description": "one to one"
}
In order to define a set of relationObjects you have to provide an id and two object references. The following json shows an example definition for relations:
{
"relationShips": [
{
"id": "Rel1",
"relationType": "OneToOne"
},
{
"id": "Rel2",
"relationType": "OneToMany"
}
],
"relationObjects": [
{
"id": "RelVehicleWheel1",
"Obj1": "Vehicle",
"Obj2": "Wheel1"
},
{
"id": "RelVehicleWheel2",
"Obj1": "Vehicle",
"Obj2": "Wheel2"
}
]
}
For each relationship two kinds of variables will be generated.
One kind clarifies which objects belong to a certain relationship. These variables start with, for example, Rel1 as defined in the section relationShips.
The other kind clarifies which relationship belongs to a set of objects. These variables start with, for example, RelVehicleWheel1 as defined in the section relationObjects.
For each of these two kinds a set of variables will be generated. The first set contains attributes of the relation types defined in the relationsConfig. The other set contains attributes of the objects defined in the variationsConfig.
The following json shows a set of variables which will be generated for a single relationship:
Rel1_Umlet: 'lt=-\nm1=1\nm2=1',
Rel1_Short: '1 - 1',
Rel1_Description: 'one to one',
Rel1_Obj1Class: 'Truck',
Rel1_Obj1RepoClass: 'TruckRepository',
Rel1_Obj1GetToOne: 'getTruck',
Rel1_Obj1GetToMany: 'getTrucks',
Rel1_Obj1SetToOne: 'setTruck',
Rel1_Obj1SetToMany: 'setTrucks',
Rel1_Obj2Class: 'BackWheel',
Rel1_Obj2RepoClass: 'BackWheelRepository',
Rel1_Obj2GetToOne: 'getBackWheel',
Rel1_Obj2GetToMany: 'getBackWheels',
Rel1_Obj2SetToOne: 'setBackWheel',
Rel1_Obj2SetToMany: 'setBackWheels',
RelVehicleWheel1_Umlet: 'lt=-\nm1=1\nm2=1',
RelVehicleWheel1_Short: '1 - 1',
RelVehicleWheel1_Description: 'one to one',
RelVehicleWheel1_Obj1Class: 'Truck',
RelVehicleWheel1_Obj1RepoClass: 'TruckRepository',
RelVehicleWheel1_Obj1GetToOne: 'getTruck',
RelVehicleWheel1_Obj1GetToMany: 'getTrucks',
RelVehicleWheel1_Obj1SetToOne: 'setTruck',
RelVehicleWheel1_Obj1SetToMany: 'setTrucks',
RelVehicleWheel1_Obj2Class: 'BackWheel',
RelVehicleWheel1_Obj2RepoClass: 'BackWheelRepository',
RelVehicleWheel1_Obj2GetToOne: 'getBackWheel',
RelVehicleWheel1_Obj2GetToMany: 'getBackWheels',
RelVehicleWheel1_Obj2SetToOne: 'setBackWheel',
RelVehicleWheel1_Obj2SetToMany: 'setBackWheels',
Logic variables
Logic Variables are used to randomize logic elements of an exercise. The idea behind this concept is that you can define multiple groups of business logic, but only one group of business logic is assigned to each individual exercise. Logic variables can also be used to define text which describes a certain business logic. Here is an example of the definition of logic variables:
{
"id": "VehicleLogic",
"logicVariations": [
{
"id": "VehicleCrash",
"Description": "Keep in mind that this text is just an example. \nThis is a new line"
},
{
"id": "VehicleShop",
"Description": "The Vehicle Shop exercise was selected"
}
]
}
The above example will generate only one variable, called VehicleLogicDescription. The interesting part of the logic variations are the ids: if you add an underscore followed by such an id to the end of a file name, the file is only inserted into an individual repository if that id was selected during the randomization.
e.g.: The file VehicleCrashTest_VehicleCrash.java is only inserted if the logic VehicleCrash was selected. The file VehicleShopTest_VehicleShop.java is only inserted if the logic VehicleShop was selected.
This can be used to dynamically insert certain test classes which test a specific business logic. If a test class was not inserted into an individual repository, the person solving this exercise does not have to implement the corresponding business logic.
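The suffix-based file selection described above can be sketched like this. This is a hypothetical helper, not the actual Divekit implementation:

```typescript
// Keep a file if it carries no logic-id suffix, or if its suffix matches
// the logic id selected for this individual repository (illustrative sketch).
function includeFile(fileName: string, allLogicIds: string[], selectedId: string): boolean {
  const base = fileName.replace(/\.[^.]+$/, ""); // strip the file extension
  const suffixId = allLogicIds.find(id => base.endsWith("_" + id));
  return suffixId === undefined || suffixId === selectedId;
}
```

With the logic ids VehicleCrash and VehicleShop from the example, VehicleCrashTest_VehicleCrash.java is kept only when VehicleCrash was selected, while a file without any suffix is always kept.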
Variable Post Processing
Often variable values are needed not only with capital letters but also in lower-case format. Therefore, for each generated variable three different variants are generated:
The first variant is the variable itself without further changes, e.g. VehicleClass -> MonsterTruck
The second variant sets the first char to lower case, e.g. vehicleClass -> monsterTruck
The third variant sets all chars to lower case, e.g. vehicleclass -> monstertruck
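The three variants can be sketched as follows, assuming (as an illustration, not as the tool's actual code) that variable names and values are transformed the same way:

```typescript
// For one variable, produce the three lookup variants described above
// (illustrative sketch).
function casingVariants(name: string, value: string): Record<string, string> {
  const lowerFirst = (s: string) => s.charAt(0).toLowerCase() + s.slice(1);
  return {
    [name]: value,                             // VehicleClass -> MonsterTruck
    [lowerFirst(name)]: lowerFirst(value),     // vehicleClass -> monsterTruck
    [name.toLowerCase()]: value.toLowerCase(), // vehicleclass -> monstertruck
  };
}
```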
Variable Replacement
In the process of repository individualization all defined variables in all origin repository files will be replaced with their corresponding value. Typically every variable which should be replaced is decorated with a specific string at the start and the end of the variable, e.g. $VehicleClass$ or xxxVehicleClassxxx. This string helps to identify variables. If needed this string can be set to an empty string; in that case the variable name can be inserted in specific files without further decoration. This can lead to problems with variable replacement, so the Divekit takes certain measures to ensure that all variables are replaced correctly. The decoration string can be configured in the originRepositoryConfig:
{
"variables": {
"variableDelimeter": "$"
}
}
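With the delimiter from the config above, the replacement step can be sketched as follows. This is a minimal sketch; the real tool performs additional safeguards beyond plain string substitution:

```typescript
// Replace every $Name$ placeholder with its generated value
// (illustrative sketch of the replacement step).
function replaceVariables(text: string, vars: Record<string, string>, delimiter = "$"): string {
  let result = text;
  for (const [name, value] of Object.entries(vars)) {
    result = result.split(delimiter + name + delimiter).join(value);
  }
  return result;
}
```

For example, replaceVariables("repo: $VehicleRepoClass$", { VehicleRepoClass: "TruckRepository" }) yields "repo: TruckRepository".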
Variable Value Warnings
If this feature is activated within the repositoryConfig, the tool emits warnings which inform you if there are suspicious variable values remaining after variable placeholders have been replaced. If for example a learner has to solve an exercise which contains Trucks instead of Cars (see config above), then the solution of this learner should not contain variable values like "Car", "CarRepository", "setCar" or "setCars". In the originRepositoryConfig you can define a whitelist of file types which should be included in the warning process.
Additionally an ignoreList can be configured. If a variable value is contained in one of the values defined inside the ignoreList, this specific variable value will not trigger a warning. In addition, the ignoreFileList can contain filenames which should be completely excluded from the warning process.
The following json is an example for the discussed configurable options:
{
"warnings": {
"variableValueWarnings": {
"typeWhiteList": [
"json",
"java",
"md"
],
"ignoreList": [
"name",
"type"
],
"ignoreFileList": [
"individualizationCheck_testrepo.json",
"variationsConfig_testrepo.json",
"IndividualizationTest_testrepo.java"
]
}
}
}
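Given the config above, the warning check could look roughly like this. This is a hypothetical sketch; the function and parameter names are not the tool's actual API:

```typescript
// Return the suspicious unused variable values found in one generated file,
// honoring typeWhiteList, ignoreList and ignoreFileList (illustrative sketch).
function findSuspiciousValues(
  fileName: string,
  content: string,
  unusedValues: string[],   // values of variations NOT selected, e.g. ["Car", "setCar"]
  typeWhiteList: string[],
  ignoreList: string[],
  ignoreFileList: string[]
): string[] {
  const extension = fileName.split(".").pop() ?? "";
  if (!typeWhiteList.includes(extension) || ignoreFileList.includes(fileName)) {
    return []; // file type not checked, or file explicitly excluded
  }
  return unusedValues.filter(value =>
    content.includes(value) &&
    !ignoreList.some(entry => entry.includes(value))
  );
}
```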
Individual Repository Persist
When you run the tool, the default behaviour is that it generates individual variables for each repository specified in the repositoryConfig. If you want to reuse already generated variables you can set "useSavedIndividualRepositories" to "true" and define a file name under "savedIndividualRepositoriesFileName". The file name is relative to the folder "resources/individual_repositories". These options are defined in the repositoryConfig:
{
"individualRepositoryPersist": {
"useSavedIndividualRepositories": true,
"savedIndividualRepositoriesFileName": "individual_repositories_22-06-2021 12-58-31.json"
}
}
A single entry in such an individual repositories file can be edited with a normal text editor and could look like this:
{
"id": "67e6be38-ae36-4fbf-9d03-0993d97f7559",
"members": [
"user1"
],
"individualSelectionCollection": {
"individualObjectSelection": {
"Vehicle": "Truck",
"Wheel1": "BackWheel",
"Wheel2": "FrontWheel"
},
"individualRelationSelection": {
"Rel1": "RelVehicleWheel2",
"Rel2": "RelVehicleWheel1"
},
"individualLogicSelection": {
"VehicleLogic": "VehicleCrash"
}
}
}
Components
The component diagram above shows the components of the Divekit which are used in the process of generating and individualizing repositories. In the following the repository generation process will be explained step by step and the components relevant in each step are described:
The Repository Creator delegates most of the tasks involved in the repository generation process to other components. Before repositories are generated the Repository Creator calls the Repository Adapter to prepare the environment. This includes for example creating empty folders for repositories or deleting previous data which is contained in the destination folder. A Repository Adapter functions like an interface to the environment in which new repositories are being generated. At the moment there are two kinds of Repository Adapters: One for the local file system and one for Gitlab.
The Content Retriever retrieves all files from the configured origin repository. In order to access the origin repository the component uses a Repository Adapter. If solution deletion is activated, the solution contained inside the origin repository is deleted from the retrieved origin files (not from the origin repository itself).
For each configured repository or learner a specific configuration is generated by the Individual Repository Manager. This configuration is used by other components while generating repositories and contains, for example, a unique id and the usernames of learners. If individualization is activated, specific variations and corresponding variables are generated for each configuration by the Variation Generator. These variations and variables are also contained in the separate configurations generated by the Individual Repository Manager.
For each repository configuration generated by the Individual Repository Manager in the previous step a Content Provider is instantiated. After varying the content using the randomly generated variations from the previous step, the defined File Manipulators (File Converters) are executed. Finally the resulting files are pushed to a new repository using a Repository Adapter.
After all Content Providers are finished with generating each corresponding repository the Overview Generator collects basic information from the Content Providers and generates an overview of all links leading to Code Projects, Test Projects and Test Pages.
The following table lists the relevant packages inside the codebase for each component:
Component | Relevant Packages |
---|---|
Repository Creator | repository_creation |
Individual Repository Manager | repository_creation |
Variation Generator | content_variation |
Content Retriever | content_manager, solution_deletion |
Content Provider | content_manager, content_variation, file_manipulator |
Overview Generator | generate_overview |
Repository Adapter | repository_adapter |
Design-Decisions
Design-Decision | Explanation |
---|---|
TypeScript chosen as programming language | Easy handling of dynamic JSON structures; good API support for GitLab; platform independent; can be executed locally with Node.js |
4.4 - Divekit Language Plugin
The documentation is not yet written. Feel free to add it yourself ;)
4.5 - Divekit Language Server
The documentation is not yet written. Feel free to add it yourself ;)
4.6 - Evaluation Processor
The documentation is not yet written. Feel free to add it yourself ;)
4.7 - Operator
The documentation is not yet written. Feel free to add it yourself ;)
Developed in a “Praxisprojekt” and not yet tested in practice.
4.8 - Passchecker
The documentation is not yet written. Feel free to add it yourself ;)
4.9 - Plagiarism Detector
The documentation is not yet written. Feel free to add it yourself ;)
4.10 - Repo Editor
The documentation is in a very early stage and some parts might be outdated.
The divekit-repo-editor allows the subsequent adjustment of individual files over a larger number of repositories.
The editor has two different functionalities: one is to adjust a file identically in all repositories, the other is to adjust individual files per repository based on the project name.
Setup & Run
Install Node.js (version >= 12.0.0), which is required to run this tool. Node.js can be downloaded from nodejs.org.
To use this tool you have to clone this repository to your local drive.
This tool uses several libraries, e.g. to access the GitLab API. Install these libraries by running the command
npm install
in the root folder of this project.
Configure Token
- Navigate to your profile and generate an Access Token / API token in order to get access to the GitLab API
- Copy the Access Token
- Rename the file .env.example to .env
- Open .env and replace YOUR_API_TOKEN with the token you copied.
Configure the application via
src/main/config/
and add files to
assets/
, see below for more details. To run the application navigate into the root folder of this tool and run
npm start
. All assets will be updated. Use
npm run useSetupInput
if you want to use the latest output of the automated-repo-setup as input for the edit.
Configuration
Place all files that should be edited in the corresponding directories:
input
└── assets
    ├── code
    │   ├── PROJECT-NAME-WITH-UUID
    │   │   └── <add files for a specific student here>
    │   └── ...
    ├── test
    │   ├── PROJECT-NAME-WITH-UUID
    │   │   └── <add files for a specific student here>
    │   └── ...
    └── <add files for ALL repos here>
src/main/config/editorConfig.json
: Configure which groups should be updated and define the commit message:
{
"onlyUpdateTestProjects": false,
"onlyUpdateCodeProjects": false,
"groupIds": [
1862
],
"logLevel": "info",
"commitMsg": "individual update test"
}
Changelog
1.0.0
- add individual updates per project
0.1.1
- add feature to force create/update
0.1.0
- add feature to update or create files based on given structure in
asset/*/
for all repositories
0.0.1
- initialize project based on the divekit-evaluation-processor
4.11 - Report Mapper
Architecture overview
Usage in the pipeline
For usage in the pipeline you just need node
as a prerequisite; then install and use the report-mapper as follows:
npm install @divekit/report-mapper
npx report-mapper
Keep in mind to provide the needed input data based on your configuration.
Complete sample test-repo pipeline-script
image: maven:3-jdk-11
stages:
- build
- deploy
build: # Build test reports
stage: build
script:
- chmod ugo+x ./setup-test-environment.sh
- ./setup-test-environment.sh # copy code from code repo and ensure that test are NOT overridden
- mvn pmd:pmd # build clean code report
- mvn verify -fn # always return status code 0 => Continue with the next stage
allow_failure: true
artifacts: # keep reports for the next stage
paths:
- target/pmd.xml
- target/surefire-reports/TEST-*.xml
pages: # gather reports and visualize via gitlab-pages
image: node:latest
stage: deploy
script:
- npm install @divekit/report-mapper
- npx report-mapper # run generate unified.xml file
- npm install @divekit/report-visualizer
- npx report-visualizer --title $CI_PROJECT_NAME # generate page
artifacts:
paths:
- public
only:
- master
Configuration
The report mapper is configurable in two main ways:
- By defining which inputs are expected and therefore should be computed. This is configurable via parameters. You can choose from the following: pmd, checkstyle* and surefire. If none are provided it defaults to surefire and pmd.
npx report-mapper [surefire pmd checkstyle]
- The second option is specific to PMD. PMD for itself has a configuration-file
pmd-ruleset.xml
which configures which PMD rules should be checked. The report mapper also reads from this file and will design the output based on available rules.
Note: The assignment of PMD rules to clean code and solid principles is as of now hardcoded and not configurable.
*The checkstyle-mapper is currently not included in the testing and therefore should be used with caution.
Example simplified pmd-ruleset.xml
:
<?xml version="1.0"?>
<ruleset name="Custom Rules"
xmlns="http://pmd.sourceforge.net/ruleset/2.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://pmd.sourceforge.net/ruleset/2.0.0 https://pmd.sourceforge.io/ruleset_2_0_0.xsd">
<description>
Clean Code Rules
</description>
<!-- :::::: CLEAN CODE :::::: -->
<!-- Naming rules -->
<rule ref="category/java/codestyle.xml/ClassNamingConventions"/>
<rule ref="category/java/codestyle.xml/FieldNamingConventions"/>
<!-- :::::::: SOLID :::::::: -->
<!-- SRP (Single Responsibility Principle) rules -->
<rule ref="category/java/design.xml/TooManyFields"/> <!-- default 15 fields -->
<rule ref="category/java/design.xml/TooManyMethods"> <!-- default is 10 methods -->
<properties>
<property name="maxmethods" value="15" />
</properties>
</rule>
</ruleset>
Getting started
Install
Clone the repository and install everything necessary:
# HTTP
git clone https://github.com/divekit/divekit-report-mapper.git
# SSH
git clone git@github.com:divekit/divekit-report-mapper.git
cd ./divekit-report-mapper
npm ci # install all dependencies
npm test # check that everything works as intended
Provide input data
The input data should be provided in the following structure:
divekit-report-mapper
├── target
│   ├── surefire-reports
│   │   ├── fileForTestGroupA.xml
│   │   ├── fileForTestGroupB.xml
│   │   └── ...
│   ├── checkstyle-result.xml
│   └── pmd.xml
└── ...
You can find examples of valid and invalid inputs in the tests: src/test/resources
npm run dev
Understand the Output
The result of the divekit-report-mapper is an XML file (target/unified.xml).
It contains the results of all input sources in a uniform format. This also includes errors if some or all inputs
provided invalid or unexpected data.
Example with only valid data:
<?xml version="1.0" encoding="UTF-8"?>
<suites>
<testsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation=""
name="E2CleanCodeSolidManualTest" failures="0" type="JUnit" status="failed">
<testcase name="testCleanCodeAndSolidReview" status="failed" hidden="false">
<error message="-%20break%20pipeline%20%3C--%0A" type="java.lang.Exception"><![CDATA[java.lang.Exception:
- break pipeline <--
at thkoeln.st.st2praktikum.exercise1.E2CleanCodeSolidManualTest.testCleanCodeAndSolidReview(E2CleanCodeSolidManualTest.java:13)]]>
</error>
</testcase>
</testsuite>
<testsuite name="Clean-Code-Principles by PMD" status="failed" type="CleanCode">
<testcase name="Keep it simple, stupid" status="passed" hidden="false"></testcase>
<testcase name="Meaningful names" status="failed" hidden="false">
<error type="LocalVariableNamingConventions" location="Line: 90 - 90 Column: 13 - 22"
file="C:\work\gitlab-repos\ST2MS0_tests_group_d5535b06-ae29-4668-8ad9-bd23b4cc5218\src\main\java\thkoeln\st\st2praktikum\bad_stuff\Robot.java"
message="The local variable name 'snake_case' doesn't match '[a-z][a-zA-Z0-9]*'"></error>
</testcase>
</testsuite>
</suites>
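Note that the message attribute in the example above is URL-encoded ("-%20break%20pipeline%20%3C--%0A"), while the CDATA section carries the decoded text. A consumer of unified.xml can decode the attribute with the standard decodeURIComponent; a minimal sketch:

```javascript
// The message attribute in unified.xml is URL-encoded; decode it before display.
const encoded = "-%20break%20pipeline%20%3C--%0A";
const decoded = decodeURIComponent(encoded);
console.log(JSON.stringify(decoded));
// -> "- break pipeline <--\n"
```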
For further examples, see the tests in src/test/resources.
Deployment
All pipeline scripts normally use the latest version from npmjs.com.
The repository is set up with three GitHub Actions workflows which trigger on pushes to the branches main, stage and development.
- main: Build, run tests and publish a new npm package. Fails if the build/tests fail, the version is a beta version, or the version has not been updated
- stage: Same as main, but the version must be a beta version and the package is tagged as beta
- development: Build and run all tests
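The main-branch workflow described above could look roughly like the following sketch. The file name, job name and step details are assumptions for illustration, not the repository's actual workflow:

```yaml
# .github/workflows/main.yml (hypothetical sketch, not the actual workflow)
name: publish
on:
  push:
    branches: [main]
jobs:
  build-test-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
          registry-url: https://registry.npmjs.org
      - run: npm ci
      - run: npm test
      - run: npm publish   # fails if the version was not bumped
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```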
Version
Complete packages are available at npmjs.com. The versioning mostly follows semantic versioning.
1.1.2
- fixed a bug which caused packages to fail if they were built through the automated GitHub Actions workflow
1.1.1
- moved docs from readme to divekit-docs
- add continuous delivery pipeline
- switch to eslint
- add configurability of pmd principles
- add surefire parsing error flag
- update scripts according to new report-visualizer naming
1.0.8
- Added parameter processing, which allows restricting the mappers used
- Error handling: If a mapper does not deliver a valid result, an error is indicated in the unified.xml.
4.12 - Report Visualizer
Architecture overview
Usage in the pipeline
For usage in the pipeline you only need node
as a prerequisite, and you must provide the input data: target/unified.xml
.
Install and use the report-visualizer as follows:
npm install @divekit/report-visualizer
npx report-visualizer --title PROJECT_NAME
Complete sample test-repo pipeline-script
image: maven:3-jdk-11
stages:
- build
- deploy
build: # Build test reports
stage: build
script:
- chmod ugo+x ./setup-test-environment.sh
- ./setup-test-environment.sh # copy code from code repo and ensure that tests are NOT overridden
- mvn pmd:pmd # build clean code report
- mvn verify -fn # always return status code 0 => Continue with the next stage
allow_failure: true
artifacts: # keep reports for the next stage
paths:
- target/pmd.xml
- target/surefire-reports/TEST-*.xml
pages: # gather reports and visualize via gitlab-pages
image: node:latest
stage: deploy
script:
- npm install @divekit/report-mapper
- npx report-mapper # generate the unified.xml file
- npm install @divekit/report-visualizer
- npx report-visualizer --title $CI_PROJECT_NAME # generate page
artifacts:
paths:
- public
only:
- master
Getting started
Install
Clone the repository and install everything necessary:
# HTTP
git clone https://github.com/divekit/divekit-report-visualizer.git
# SSH
git clone git@github.com:divekit/divekit-report-visualizer.git
cd ./divekit-report-visualizer
npm ci # install all dependencies
Provide input data
The input data should be provided in the following structure:
divekit-report-visualizer
├── target
│   └── unified.xml
└── ...
Run it
Directly with the provided input target/unified.xml:
node bin/report-visualizer
Use the predefined input assets/xml-examples/unified.xml:
npm run dev
Or use the divekit-report-mapper result*:
npm run dev++
*This requires that the divekit-report-visualizer is located in the same directory as the divekit-report-mapper.
Output (GitLab Pages)
The output is written to the /public directory, which is used for GitLab Pages or could be mounted anywhere.
divekit-report-visualizer
├── target
│   └── unified.xml
├── public
│   ├── index.html
│   └── style.css
└── ...
The following picture shows an example output with passed tests (green), test failures (orange),
errors (red) and a note (gray).
Deployment
Deployment is currently done completely manually. In the future it will be handled similarly to the report-mapper.
All pipeline scripts normally use the latest version from npmjs.com.
Version
Complete packages are available at npmjs.com. The versioning mostly follows semantic versioning.
1.0.3
- Updated naming: from divekit-new-test-page-generator to divekit-report-visualizer
1.0.2
- Added hidden metadata in the header indicating the number of failed tests.
- Added possibility to pass a special ‘NoteTest’ test case which is displayed separately.
- Updated the error message for generation problems so that it is displayed even if only parts of the test page could not be generated.
- Fixed an error where the test page could not be generated if there was no input.
4.13 - Test Library
The documentation is not yet written. Feel free to add it yourself ;)
4.14 - Test page generator
The documentation is not yet written. Feel free to add it yourself ;)