Integrate Test Plans and CI Systems

Integrating test plans with your CI system can simplify:

  • managing the selection of tests that should be executed for each CI run
  • balancing the testing load across multiple instances
  • integrating result uploads

This guide shows how to integrate Test Center and GitLab when using Squish for testing, but the general concepts and scripts should transfer easily to other CI systems that support similar features.

Managing Tests

The base assumption is that you manage test plan templates manually within the Test Plan Templates view. There you can maintain a single variation, or multiple variations if you want to execute different sets of tests depending on the development branch or the type of test run (e.g. nightly, regression, performance, ...).

Pipeline Setup

For this guide we will specifically look at the testing stage, and assume that in one of the previous stages your application was built and made available for testing.

The testing stage will be separated into:

  1. Executing the test plan template
  2. Spawning testing agents
  3. Transferring the test results back to GitLab

The following snippets assume that these variables are defined in the GitLab CI/CD settings of the group or project:

  • TESTCENTER_URL pointing to your test center instance
  • TESTCENTER_TOKEN containing a valid Test Center upload/access token

Furthermore, this guide assumes that you make the following files, which ship with Test Center, available on your job instances:

  • testcenter/examples/squishrunner-wrapper.py
  • testcenter/examples/testplanclient.py

Executing the Test Plan Template

This example job:

  • looks up the ID of the test plan template named CI Tests main
  • executes the test plan template
  • sets the batch and test plan execution name based on GitLab-specific variables:
    • $CI_COMMIT_REF_NAME
    • $CI_PIPELINE_IID
  • transfers the ID of the created test plan execution as $TESTPLAN_ID to subsequent jobs via the testplanid.env file artifact.

execute-testplan:
  stage: test
  extends: .docker_runner_config
  image: python:3.9
  script:
    - template_id=$(python3 testcenter/examples/testplanclient.py lookup $TESTCENTER_URL $TESTCENTER_TOKEN "example-project" "CI Tests main") || exit 1
    - python3 testcenter/examples/testplanclient.py execute $TESTCENTER_URL $TESTCENTER_TOKEN $template_id "CI Tests $CI_COMMIT_REF_NAME-$CI_PIPELINE_IID" "$CI_COMMIT_REF_NAME-$CI_PIPELINE_IID" > testplanid.env || exit $?
  allow_failure: true
  artifacts:
    reports:
      dotenv: testplanid.env

The example uses the testplanclient.py script, which ships with Test Center in the testcenter/examples folder. What it does is straightforward, so you can easily replicate the behavior by using the Execution API directly.
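
As a rough illustration, here is a minimal sketch of that lookup/execute flow using the requests library. The endpoint paths and JSON field names below are invented placeholders rather than the documented Execution API routes; consult the Execution API reference for the actual ones.

import sys
import requests

# NOTE: placeholder routes and field names; the real Execution API may differ.
def lookup_template(url, token, project, name):
    resp = requests.get(f"{url}/api/testplan/templates",  # placeholder route
                        headers={"Authorization": f"Token {token}"},
                        params={"project": project})
    resp.raise_for_status()
    matches = [t["id"] for t in resp.json() if t["name"] == name]
    if not matches:
        sys.exit(f"no test plan template named {name!r}")
    return matches[0]

def execute_template(url, token, template_id, name, batch):
    resp = requests.post(f"{url}/api/testplan/templates/{template_id}/execute",  # placeholder route
                         headers={"Authorization": f"Token {token}"},
                         json={"name": name, "batch": batch})
    resp.raise_for_status()
    return resp.json()["id"]

if __name__ == "__main__":
    url, token = sys.argv[1], sys.argv[2]
    template_id = lookup_template(url, token, "example-project", "CI Tests main")
    execution_id = execute_template(url, token, template_id,
                                    "CI Tests main-1", "main-1")
    # dotenv format, so GitLab exposes $TESTPLAN_ID to later jobs
    print(f"TESTPLAN_ID={execution_id}")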

If you have multiple templates, you can conditionally adjust the test plan template lookup string, which in this example is set to CI Tests main but could just as well be derived from the branch name or other pipeline properties.
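
For example, branch-based selection could look like the following small sketch; every template name except CI Tests main is hypothetical.

import os

# Map the current branch to a test plan template name. Only
# "CI Tests main" appears in this guide; the other names are examples.
branch = os.environ.get("CI_COMMIT_REF_NAME", "")
if branch == "main":
    template_name = "CI Tests main"
elif branch.startswith("release/"):
    template_name = "CI Tests release"
else:
    template_name = "CI Tests feature"
print(template_name)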

Testing Agents

Depending on how many different configurations you want to test on, or how many parallel executions you are targeting, you can now spawn the workers for test execution. For this example we will do parallel testing on 3 instances.

To execute our Squish-based tests we rely on the helper script squishrunner-wrapper.py, which ships with Test Center and can be found in the testcenter/examples folder.

The wrapper takes care of the communication with Test Center: it reserves tests for execution from your Test Center instance and then calls squishrunner for each reserved test until the test plan is finished.
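
Conceptually, the wrapper's main loop looks roughly like the sketch below; reserve_next_test stands in for the actual Test Center communication, which the shipped script implements for you.

import subprocess

def run_reserved_tests(squishrunner, reserve_next_test):
    """Keep reserving tests from Test Center until the plan is done."""
    while True:
        test = reserve_next_test()   # ask Test Center for the next test
        if test is None:             # nothing left: the test plan is finished
            return
        # --testsuite/--testcase are regular squishrunner options
        subprocess.run([squishrunner,
                        "--testsuite", test["suite"],
                        "--testcase", test["case"]],
                       check=False)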

squish-tests:
  stage: test
  image: docker.example.com:8080/test-aut
  needs:
    - execute-testplan
  allow_failure: true
  parallel: 3
  script:
    - |
      python "testcenter/examples/squishrunner-wrapper.py" \
        "${SQUISH_PREFIX}/bin/squishrunner" \
        --tc-url "$TESTCENTER_URL" \
        --tc-api-token "$TESTCENTER_TOKEN" \
        --cap OS=Linux \
        --suite-dir "$SUITE_DIR" \
        --testplan "$TESTPLAN_ID" \
        --exitCodeOnFail 234 \
        --local \
        --reportgen "stdout" \
        --scriptargs "example_argument" \
        --label branch=$CI_COMMIT_REF_NAME \
        --label .git.branch=origin/$CI_COMMIT_REF_NAME \
        --label .git.revision=$CI_COMMIT_SHA \
        --label host=$CI_RUNNER_SHORT_TOKEN \
        --label .reference.url="[GitLab]($CI_PIPELINE_URL)" || exit $?

The first parameter to squishrunner-wrapper.py is optional and contains the path to the actual squishrunner of your Squish installation. If you don't provide it, the SQUISH_PREFIX environment variable needs to be set, or the bin directory of your Squish installation needs to be on the PATH.
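
In other words, the resolution order is roughly the following sketch; the shipped wrapper's actual code may differ in detail.

import os
import shutil

def locate_squishrunner(explicit_path=None):
    # Resolution order as described above: explicit argument first,
    # then $SQUISH_PREFIX/bin, then the PATH.
    if explicit_path:
        return explicit_path
    prefix = os.environ.get("SQUISH_PREFIX")
    if prefix:
        return os.path.join(prefix, "bin", "squishrunner")
    found = shutil.which("squishrunner")
    if found:
        return found
    raise FileNotFoundError("squishrunner not found; set SQUISH_PREFIX or extend PATH")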

The other parameters are described in the following table:

Parameter          Description
--tc-url           The URL of your Test Center instance. Note that
                   squishrunner will create a WebSocket connection
                   towards that instance.
--tc-api-token     A valid upload/access token.
--testplan         A reference to the test plan execution that the
                   agent should be working on.
--suite-dir        When reserving executions from the Test Center
                   instance, Test Center responds with a test suite
                   name and a test case name; squishrunner-wrapper.py
                   will try to locate the test within the current
                   working directory, or within the provided
                   --suite-dir.
--cap              Provides labels as key=value pairs that describe
                   the capabilities of the test instance. These could
                   be things like the operating system (e.g. OS=Linux),
                   which Squish edition is configured for testing, or
                   which AUT should be tested.
--label            Extra labels that are not part of the capabilities,
                   but should be added to the uploaded results.
--exitCodeOnFail   A squishrunner option that is used to make test
                   failures visible on the command line and helps
                   squishrunner-wrapper.py return the correct status
                   code as well. (234 in the example is mostly
                   arbitrary; the primary requirement is that it is
                   not used for any other differentiation.)

Server Assignment

If neither the --local nor the --host Squish command line option is provided to squishrunner-wrapper.py, it will try to request a matching server from the Test Center instance. See the Test Center documentation for how to manage this list of servers.

Execute Tests on a Test Center Instance

If your tests are set up to work with a remote squishserver, you can alternatively use the squishserver-wrapper.py script, which can also be found in the testcenter/examples folder. That script launches a squishserver and keeps it running for as long as there are unexecuted tests within the specified test plan execution that match the capabilities of the squishserver.
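
Conceptually, that script behaves like the sketch below; plan_has_matching_tests stands in for the actual Test Center query of the test plan execution.

import subprocess
import time

def serve_while_tests_remain(plan_has_matching_tests):
    """Keep a squishserver alive while matching unexecuted tests remain."""
    server = subprocess.Popen(["squishserver"])  # default host/port
    try:
        while plan_has_matching_tests():   # poll the Test Center instance
            time.sleep(30)
    finally:
        server.terminate()
        server.wait()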

This can be helpful when orchestrating the test execution from within Test Center; see the Test Center documentation on test execution for more details.

The advantages are a tighter coupling with Test Center's repository integration and not having to distribute test scripts across your testing agents.

Report Results

Once all test scripts have finished executing, we can report the test results back to GitLab. For this, we simply download and store a GitLab-specific JUnit export from your Test Center instance.

upload-results:
  stage: test
  extends: .docker_runner_config
  image: python:3.9
  needs:
    - squish-tests
  script:
    - 'curl -H "Authorization: Token $TESTCENTER_TOKEN" -o junit.xml "$TESTCENTER_URL/upload/results?project=example-project&batch=$CI_COMMIT_REF_NAME-$CI_PIPELINE_IID&format=gitlab.junit&option=backlinks"'
  allow_failure: true
  artifacts:
    reports:
      junit: junit.xml

© 2024 The Qt Company Ltd. Documentation contributions included herein are the copyrights of their respective owners.
The documentation provided herein is licensed under the terms of the GNU Free Documentation License version 1.3 as published by the Free Software Foundation.
Qt and respective logos are trademarks of The Qt Company Ltd. in Finland and/or other countries worldwide. All other trademarks are property of their respective owners.
