KEcoLab/Developer Docs/CI CD Pipeline
<h1> Introduction </h1>
Continuous Integration and Continuous Deployment (CI/CD) pipelines automate the process of building, testing, and deploying software. This project leverages GitLab's CI/CD pipelines to automatically measure and analyze a software product's energy consumption through a defined set of steps. The pipeline not only ensures the application's functionality but also provides valuable insight into its energy footprint under various usage scenarios.
<h1> Pipeline Triggers and Rules </h1>
The execution of a GitLab CI/CD pipeline is typically triggered by specific events. The configuration is written in a <code>.yaml</code> file. For this project, the <code>rules</code> keyword is used in each stage (<code>build</code>, <code>energy_measurement</code>, and <code>result</code>) with the following condition:
<pre>
rules:
  - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
</pre>
This rule specifies that the stage (and all jobs within it) should only be executed when the pipeline is triggered by a '''merge request event'''.
<h1> Setup & Configuration </h1>
This CI/CD pipeline remotely executes the actual measurement program running in a lab in Germany. To execute the pipeline, certain configurations need to be in place.
<h2> a. Environment Variables </h2>
Environment variables are key-value pairs that configure how the pipeline runs. Two environment variables need to be set.
<ol>
<li> '''LABPC_IP'''</li>
<ul>
<li> IP address of the machine where the application will be tested. </li>
<li> It is set to <code>192.168.170.23</code>. </li>
</ul>
<li> '''PM_IP''' </li>
<ul>
<li> IP address of the power meter used for energy measurements. </li>
<li> It is set to <code>192.168.170.22</code>. </li>
</ul>
</ol>
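Because every later stage depends on these two variables, a pre-flight sanity check can fail the pipeline early with a clear message instead of a cryptic SSH error. The sketch below is illustrative and not part of the actual pipeline; the <code>is_ipv4</code> and <code>check_vars</code> helper names are assumptions.

```shell
#!/bin/sh
# Sketch: sanity-check LABPC_IP and PM_IP before any SSH or measurement
# work starts. Helper names are illustrative, not part of the pipeline.

is_ipv4() {
  # Accept only dotted-quad addresses with each octet in the range 0-255.
  printf '%s\n' "$1" | grep -Eq '^([0-9]{1,3}\.){3}[0-9]{1,3}$' || return 1
  for octet in $(printf '%s\n' "$1" | tr '.' ' '); do
    [ "$octet" -le 255 ] || return 1
  done
}

check_vars() {
  # ${VAR:?} aborts with an error message if the variable is unset or empty.
  : "${LABPC_IP:?LABPC_IP is not set}"
  : "${PM_IP:?PM_IP is not set}"
  is_ipv4 "$LABPC_IP" && is_ipv4 "$PM_IP"
}
```

Running <code>check_vars</code> as the first line of a job's <code>script</code> would make a misconfigured runner fail immediately.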
<h2> b. Tags</h2>
Tags are keywords that specify which runner (a physical or virtual machine) should execute the pipeline's jobs. For this pipeline, the <code>EcoLabWorker</code> tag is used.
<h1> Pipeline Stages</h1>
Different stages of the pipeline are defined so that the process of measuring and analyzing energy consumption runs smoothly and error-free. For this pipeline, we defined three stages: Build, Energy Measurement, and Result.
<ol>
<li> <h2>Build stage</h2> </li>
The first stage of the pipeline is Build. In this stage, the application that needs to be tested is installed on the LABPC.
<ol>
<li> '''Docker image''' - The build stage uses the <code>alpine</code> Docker image. Alpine Linux is a lightweight, security-focused distribution, making it an ideal choice for CI/CD environments where minimizing image size and maximizing efficiency are crucial. </li>
<li> '''Before script''' - The <code>before_script</code> section contains commands that are executed before the main <code>script</code> section. Here, <code>echo $CI_MERGE_REQUEST_TITLE</code> prints the title of the merge request, which names the application to be installed.</li>
<li> '''Script''' - The core of the build stage lies within the <code>script</code> section, which defines the commands that perform the actual application installation. The script connects to the LABPC over SSH and uses the <code>flatpak remote-add</code> command to add the Flathub remote repository, which allows Flatpak to download and install applications from Flathub. The <code>flatpak install</code> command then installs the application specified by the merge request title.</li>
<li> '''Rules''' - The <code>rules</code> section defines when this stage should be executed. In this case, it ensures that the build stage only runs when the pipeline is triggered by a merge request event.</li>
</ol>
<pre>
# Build stage
build:
  stage: build
  image: alpine
  tags:
    - EcoLabWorker
  before_script:
    - echo $CI_MERGE_REQUEST_TITLE
  script:
    # Flatpak command for installing the test application from Flathub, based on the merge request title
    - ssh -o StrictHostKeyChecking=no -i ~/.ssh/kecolab kecolab@$LABPC_IP "
      flatpak remote-add --user --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo &&
      flatpak install --user $CI_MERGE_REQUEST_TITLE -y "
  rules:
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
</pre>
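Since the merge request title is passed straight into <code>flatpak install</code>, a rough pre-flight check could reject obviously malformed titles before the SSH step runs. The sketch below assumes titles are plain Flatpak application IDs such as <code>org.kde.kate</code>; the <code>looks_like_app_id</code> helper is illustrative and somewhat stricter than Flatpak's own ID rules.

```shell
#!/bin/sh
# Sketch: reject merge request titles that do not look like a Flatpak
# application ID (reverse-DNS style, at least three dot-separated parts).
# This helper is illustrative; Flatpak's actual ID rules differ slightly.

looks_like_app_id() {
  printf '%s\n' "$1" |
    grep -Eq '^[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_-]*){2,}$'
}
```

Such a check would let the build job fail with a readable message instead of a confusing Flatpak error on the LABPC.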
<li> <h2>Energy measurement stage</h2> </li>
Once the application is installed, the energy measurement stage commences. Its purpose is to quantify the energy consumption of the application under various usage scenarios.
<ol>
<li> '''Timeout''' - Energy measurements can take a long time, especially for complex applications or extended test scenarios. The <code>timeout</code> (set to <code>12h</code> here) prevents the pipeline from getting stuck if a test runs longer than expected.</li>
<li> '''Before script''' - Before the actual measurements begin, the <code>before_script</code> section prepares the LABPC for the tests by copying the test scripts from the GitLab runner to the <code>/tmp</code> directory on the LABPC. If present, the <code>configuration.sh</code> file performs application-specific configuration on the LABPC before the actual test scenarios are executed.</li>
<li> '''Script''' - The script executes three test scenarios: baseline, suspended, and idle. For each scenario it performs power meter readings, hardware readings, scenario execution, process termination, and data export.</li>
<li> '''Artifacts section''' - The <code>artifacts</code> section defines which files generated during this stage should be passed on to the next stage (the result stage).</li>
</ol>
<pre>
# Energy measurement stage
energy_measurement:
  stage: energy_measurement
  image: alpine
  timeout: 12h
  tags:
    - EcoLabWorker
  before_script:
    # Copy usage scenario scripts from the test_scripts dir to the LABPC
    - scp -o StrictHostKeyChecking=no -r -i ~/.ssh/kecolab scripts/test_scripts/$CI_MERGE_REQUEST_TITLE/* kecolab@$LABPC_IP:/tmp/
    # Check for a configuration script for the application under test
    - ssh -o StrictHostKeyChecking=no -i ~/.ssh/kecolab kecolab@$LABPC_IP 'export DISPLAY=:0 && export TERM=xterm && cd /tmp/ && if [ -f "configuration.sh" ]; then chmod +x configuration.sh; fi; exit'
  script:
    - export CURRENT_DATE=$(date +%Y%m%d)
    # Start taking PM readings (script 1)
    - cd /home/gitlab-runner/GUDEPowerMeter && nohup python3 check_gude_modified.py -i 1 -x 192.168.170.22 >> ~/testreadings1.csv 2>/dev/null &
    # Start taking hardware readings using collectl (for script 1)
    . . .
</pre>
The full code is available [https://invent.kde.org/teams/eco/remote-eco-lab/-/blob/master/pipelines/.energy_measurement.yaml?ref_type=heads here].
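To make the readings concrete: with the power meter sampled once per second (<code>-i 1</code>), total energy in joules is approximately the sum of the per-second power readings in watts. The sketch below assumes a simple <code>timestamp,watts</code> CSV layout, which may not match the real <code>check_gude_modified.py</code> output format; the <code>energy_joules</code> helper is illustrative.

```shell
#!/bin/sh
# Sketch: estimate total energy from 1 Hz power readings. Assumes a
# "timestamp,watts" CSV, which may differ from the real meter output.

energy_joules() {
  # At 1 sample/second, joules ~= sum of the watt readings (W * 1 s each).
  awk -F, '{ sum += $2 } END { printf "%.1f\n", sum }' "$1"
}
```

For example, three one-second readings of 10 W, 12 W, and 11 W sum to roughly 33 J.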
<li> <h2>Result stage</h2> </li>
The result stage is the final stage in the energy measurement pipeline. Its primary function is to process the raw data collected in the energy_measurement stage and generate meaningful reports that summarize the application's energy consumption characteristics.
The script section defines the steps involved in analyzing the data and creating the reports:
<ol>
<li> '''Data extraction:''' The compressed raw data from the measurement stage is extracted using <code>gunzip</code>.</li>
<li> '''Data processing:''' The data is preprocessed using the R script <code>~/Preprocessing.R</code>, which performs the necessary data cleaning, transformation, and aggregation.</li>
<li> '''Report generation:''' A set of R scripts is executed to generate a report for each scenario, such as <code>~/sus_analysis_script.R</code> for the suspended scenario and <code>~/idle_analysis_script.R</code> for the idle scenario.</li>
</ol>
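The extraction step works because both stages derive filenames from the same date convention (<code>CURRENT_DATE=$(date +%Y%m%d)</code>). A small sketch of that naming scheme, with an illustrative <code>dated_name</code> helper that is not part of the pipeline:

```shell
#!/bin/sh
# Sketch: reproduce the dated artifact naming used by the pipeline,
# e.g. test1.csv-kecolab-20250401.tab.gz. Helper name is illustrative.

dated_name() {
  # $1 = test name (test1/test2/test3), $2 = date in %Y%m%d form
  printf '%s.csv-kecolab-%s.tab.gz\n' "$1" "$2"
}
```

Because both stages compute the date independently, a measurement run that crosses midnight could produce filenames the result stage does not expect; passing the date between stages would avoid that.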
<h3> Artifacts section:</h3>
The artifacts section specifies which files generated in this stage should be made available for download after the pipeline completes. This includes all the generated reports (<code>SUS_Report.pdf</code>, <code>Idle_Report.pdf</code>), LaTeX files, graphics directories, and supporting files. By defining these files as artifacts, they can be easily downloaded from the GitLab CI/CD interface, allowing developers to review the energy analysis results.
<pre>
# Result stage (to generate the energy measurement report)
result:
  stage: result
  image: invent-registry.kde.org/sysadmin/ci-images/kecolab-analysis:latest
  dependencies:
    # Use artifacts from the previous stage
    - energy_measurement
  script:
    - export CURRENT_DATE=$(date +%Y%m%d)
    - gunzip test1.csv-kecolab-$CURRENT_DATE.tab.gz
    - gunzip test2.csv-kecolab-$CURRENT_DATE.tab.gz
    - gunzip test3.csv-kecolab-$CURRENT_DATE.tab.gz
    # Preprocess raw data for the OSCAR script
    - Rscript ~/Preprocessing.R test1.csv-kecolab-$CURRENT_DATE.tab test2.csv-kecolab-$CURRENT_DATE.tab test3.csv-kecolab-$CURRENT_DATE.tab $CI_PROJECT_DIR
    # Run the OSCAR analysis script to generate the SUS report
    - Rscript ~/sus_analysis_script.R
    - cp -r ~/SUS_Report.pdf ~/SUS_Report.tex ~/sus_graphics ~/SUS_Report_files $CI_PROJECT_DIR/
    # Run the OSCAR analysis script to generate the idle mode report
    - Rscript ~/idle_analysis_script.R
    - cp -r ~/Idle_Report.pdf ~/Idle_Report.tex ~/idle_graphics ~/Idle_Report_files $CI_PROJECT_DIR/
  artifacts:
    paths:
      - SUS_Report.pdf
      - SUS_Report.tex
      - SUS_Report_files
      - sus_graphics
      - Idle_Report.pdf
      - Idle_Report.tex
      - Idle_Report_files
      - idle_graphics
  rules:
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
</pre>
</ol>
<h1> Conclusion</h1>
In conclusion, this GitLab CI/CD configuration effectively automates measuring and analyzing an application's energy consumption by leveraging Flatpak for streamlined application installation, SSH for secure remote execution on the LABPC, and R scripts for automated report generation.
Latest revision as of 12:24, 1 April 2025