SustainKieker / moobench
Repository graph at commit 7339ecba588d2020e788d9897dd1f0941e5a6fba
Branches (12):
KIEKER-1983-MooBench-Kieker-Python
KIEKER-1990
UpdateDockerfile
addCloudprofiler
addCloudprofiler2
addCloudprofiler3
cp
main (default, protected)
testCP
testCP2
testCP3
testNewDockerfile
Commits (newest first):
Remove external file
Add token
Enable auto-push
Debug output for GH action
Put workflow_dispatch first, hoping this will allow manual triggering
Remove debug output
Fix folder for formatting
More debug output
Add MOOBENCH_CONFIGURATIONS to create correct JSON
Allow writing results
Output JSON for debugging
Move result file to main folder
Fix naming: customSmallerIsBetter
Execute benchmarking with GH benchmark-action
OpenTelemetry usually only has 3 configurations, so 30 values are
Don't fail immediately after benchmarking error (but still fail the
Also print warmed-up values for intermediary results
Check whether specified configurations are legal
Don't execute Prometheus by default
Also output configurations
Adapt to updated results file names
Refactor OpenTelemetry: Use case instead of multiple ifs
Update to changed default opentelemetry port
Fail in Jenkins after failure instead of continuing to run and fail
Also fix workflows
Store all results in results-$FRAMEWORK_NAME to make copying between
Unify Kieker and OpenTelemetry benchmark.sh structure
Don't store downloaded python files
Check for right result count after no default logging execution
Also adapt inspectIT
Use correct python configurations
Also apply MOOBENCH_CONFIGURATIONS in OpenTelemetry
Also set MOOBENCH_CONFIGURATIONS in python
Create all labels, even if they are not used, to avoid error
Handle changed indices in R script
Correctly get labels
Only create R labels for selected configurations
Fix OT config
Remove Java 8 for inspectIT
Also update OpenTelemetry and Kieker-python to Java 21 and remove Java 8
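Several commits above ("Execute benchmarking with GH benchmark-action", "Fix naming: customSmallerIsBetter", "Output JSON for debugging", "Add MOOBENCH_CONFIGURATIONS to create correct JSON") concern emitting results in the JSON format that benchmark-action/github-action-benchmark reads for its customSmallerIsBetter tool. A minimal sketch of producing that format from a bash script follows; the configuration names, result values, and the RESULTS lookup are illustrative assumptions, not MooBench's actual data or code:

```shell
#!/bin/bash
# Hypothetical sketch: write benchmark results as a JSON array in the
# "customSmallerIsBetter" format used by benchmark-action/github-action-benchmark.
# Each entry needs "name", "unit", and a numeric "value" (smaller = better).
MOOBENCH_CONFIGURATIONS="NoInstrumentation DeactivatedProbe TextLogging"

# Illustrative per-configuration results; real scripts would read measured values.
declare -A RESULTS=( [NoInstrumentation]=0.85 [DeactivatedProbe]=1.02 [TextLogging]=4.71 )

{
  echo "["
  first=true
  for config in $MOOBENCH_CONFIGURATIONS; do
    $first || echo ","   # comma-separate all entries after the first
    first=false
    printf '  {"name": "%s", "unit": "µs/call", "value": %s}' \
      "$config" "${RESULTS[$config]}"
  done
  echo
  echo "]"
} > results.json

cat results.json
```

Pointing the benchmark action at a file in this shape (via its `output-file-path` input) lets it track the values across commits and alert on regressions, which matches the "Move result file to main folder" and "Enable auto-push" commits above.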