Validation

Run-Time Integration Tests

The run-time integration tests are a mechanism for validating the Reference Stack’s core functionalities.

The tests are run using the oeqa test framework. Please refer to OEQA FVP for more information on this framework. Some tests are also built as a Yocto Package Test (ptest) and implemented using the Bash Automated Test System (BATS).

The integration tests run on an image and depend on its target Reference Stack architecture (Baremetal or Virtualization). Individual tests may also adapt their behaviour to the targeted architecture.

To run the tests via kas, please refer to the section Validation. Details about the Arm SystemReadyTM IR ACS testing are also included in that section. This section gives more details on the structure, implementation and debugging of the tests.

Testing Structure

The Compute Elements and Components of the Reference Stack that are tested by the framework are detailed below. The testing scripts can be found in yocto/meta-rd-n2-automotive/lib/oeqa/runtime/cases/.

All of the Compute Elements and Components have their terminal output logged for debugging. These logs can be found in build/tmp_baremetal/work/fvp_rd_n2_automotive-poky-linux/baremetal-image/1.0-r0/testimage/ for the Baremetal architecture and in build/tmp_virtualization/work/fvp_rd_n2_automotive-poky-linux/virtualization-image/1.0-r0/testimage/ for the Virtualization architecture.

  • RSS

    The script that implements the test is rss.py. The test waits for the RSS to log that it is releasing the SCP. This is its last action as part of the RSS-oriented Boot Flow.

  • SCP

    The script that implements the test is scp_firmware.py. The test waits for the SCP to log that it has successfully initialized and started all of its internal modules. It also checks whether the SCP has logged any errors, in which case the test fails.

  • Safety Island

    The script that implements the corresponding tests is safety_island.py.

    • test_boot

      The test waits for the Safety Island to log that the Zephyr OS is booting.

    • test_shell

      The test waits for the Safety Island shell prompt after booting (this test depends on test_boot).

    • test_ping

      The test pings the Safety Island from the Primary Compute and vice versa and checks that an answer is received (this test depends on test_shell).

    • test_hipc

      The test verifies Heterogeneous InterProcessor Communication (HIPC) between the Safety Island (using zperf) and the Primary Compute (using iperf). The tested configurations are:

      • The Safety Island as a zperf server (UDP/TCP) and the Primary Compute as an iperf client (UDP/TCP).

      • The Safety Island as a zperf client (UDP/TCP) and the Primary Compute as an iperf server (UDP/TCP).

      This test depends on test_ping.

  • Primary Compute
    • TF-A

      The script that implements the test is trusted_firmware_a.py. The test waits for the Primary Compute to log that it is entering the normal world as defined in RSS-oriented Boot Flow.

    • Linux shell login

The script that implements the test is linuxlogin.py. The test waits for the Primary Compute to log the Linux login prompt.

    • Reset

The script that implements the test is reset.py. The test sends a reboot command to the Linux console and then sends several reset commands to U-Boot, checking that the system effectively resets.

    • Baremetal

      The entry point to these tests is ptest_ssh.py, which triggers the ptest-runner to launch the system-tests implemented via BATS. To find out more about the applicable tests, please refer to System Tests.

    • Dom0 (Virtualization architecture)

The entry point to these tests is ptest_ssh.py; hence, it runs the same tests as Baremetal. To find out more about the applicable tests, please refer to System Tests.

    • DomU (Virtualization architecture)

The entry point to these tests is ptest_domu.py. The test enters the DomU console via xl console domu from Dom0 and runs the ptest-runner command, executing those System Tests, implemented via BATS, that are applicable to DomU. To find out more about the applicable tests, please refer to System Tests.
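The zperf/iperf pairing used by test_hipc can be illustrated with the following sketch. The port, address placeholder and exact command forms below are illustrative assumptions, not strings taken from the test scripts; the snippet only prints the command pairing for one of the tested configurations.

```shell
#!/bin/sh
# Illustrative only: pairing of zperf (Zephyr shell, Safety Island) and
# iperf (Linux, Primary Compute) for a UDP throughput check. The port
# and address below are hypothetical placeholders.
hipc_pairing() {
    cat <<'EOF'
Safety Island (zperf server):   zperf udp download 5001
Primary Compute (iperf client): iperf -u -c <safety-island-ip> -p 5001 -t 10
EOF
}
hipc_pairing
```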
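Most of the cases above follow the same wait-for-log pattern: scan the console output of a Compute Element for an expected marker string. A minimal plain-shell sketch of that pattern follows; the sample line is a typical TF-A boot message used here as a stand-in, not necessarily the exact string the scripts match.

```shell
#!/bin/sh
# Sketch of the wait-for-log pattern used by the run-time cases. In the
# real tests the console output is streamed from the FVP; here a canned
# line stands in for it.
wait_for_marker() {
    marker="$1"
    printf 'NOTICE:  BL31: Preparing for EL3 exit to normal world\n' \
        | grep -q "$marker"
}
if wait_for_marker 'normal world'; then
    echo "marker found"
else
    echo "marker missing"
fi
```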

System Tests

The System Tests are available for both the Baremetal and Virtualization architecture images and consist of a series of BATS tests that can be found in yocto/meta-rd-n2-automotive/recipes-test/system-tests/files.

Detailed below is information on which test is applicable to which architecture and a brief description of what each test does:

  • Baremetal architecture:
    • rtc

      The BATS script implementing the test is 01-rtc.bats. Checks that the rtc (real-time clock) device and its correct driver are available and accessible via the filesystem and verifies that the hwclock command runs successfully.

    • watchdog

      The BATS script implementing the test is 02-watchdog.bats. Checks that the watchdog device and its correct driver are available and accessible via the filesystem.

    • networking

      The BATS script implementing the test is 03-networking.bats. Checks that the network device and its correct driver are available and accessible via the filesystem and that outbound connections work (invoking wget).

    • smp

      The BATS script implementing the test is 04-smp.bats. Checks for CPU availability and that basic functionality works, like enabling and stopping CPUs and preventing all of them from being disabled at the same time.

    • virtiorng

The BATS script implementing the test is 05-virtiorng.bats. Checks that the virtio-rng device is available through the filesystem and that it is able to generate random numbers when required.

  • Virtualization architecture:
    • Dom0:
      • Same tests as Baremetal.

      • xendomains

        The BATS script implementing the test is 06-xendomains.bats. Checks that restarting Xen domains is possible and verifies the MPAM configuration.

    • DomU:
      • networking and smp, as described for the Baremetal architecture.
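To illustrate the style of these checks, here is a minimal plain-shell sketch of a device check similar in spirit to 01-rtc.bats. The helper name and messages are hypothetical, and the real test is written in BATS syntax and targets the rtc device; /dev/null stands in for it here so the sketch is self-contained.

```shell
#!/bin/sh
# Plain-shell sketch of a BATS-style device check (hypothetical helper;
# the real 01-rtc.bats checks the rtc device and runs hwclock).
check_char_device() {
    dev="$1"
    if [ -c "$dev" ]; then
        echo "present: $dev"
        return 0
    fi
    echo "missing: $dev"
    return 1
}
check_char_device /dev/null   # /dev/null stands in for the rtc node
```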

The System Tests are built and installed in the image according to the following BitBake recipe: yocto/meta-rd-n2-automotive/recipes-test/system-tests/system-tests.bb.

Integration Tests Implementation

This section gives a high-level description of how the integration testing logic is implemented.

To enable the integration tests, the testimage.bbclass is used. This class supports running automated tests against images. The class handles loading the tests and starting the image.

The Writing New Tests section of the Yocto Manual explains how to write new tests when using the testimage.bbclass. These are placed under yocto/meta-rd-n2-automotive/lib/oeqa/runtime/cases and are selected by the different machines/configurations via the TEST_SUITES variable. For example, yocto/meta-rd-n2-automotive/conf/machine/fvp-rd-n2-automotive.conf adds the linuxboot test to TEST_SUITES, and fvp-rd-n2-automotive-xen.inc appends ptest_domu so that the DomU tests are run only when Xen is included.
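A TEST_SUITES selection of this kind looks roughly as follows in BitBake configuration. This is a sketch of the mechanism only, not the verbatim contents of the files named above.

```
# Machine configuration (sketch): add a run-time case to the suite
TEST_SUITES:append = " linuxboot"

# Xen include (sketch): add the DomU tests only when Xen is present
TEST_SUITES:append = " ptest_domu"
```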

For the System Tests, system-tests.bb inherits the ptest framework/class and installs the BATS test files located under yocto/meta-rd-n2-automotive/recipes-test/system-tests/files. Each image recipe (Baremetal or Virtualization) includes system-tests.bb.
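A ptest-enabled recipe of this shape typically contains fragments like the following. This is a generic sketch of the ptest pattern, not the actual contents of system-tests.bb; the install paths are illustrative.

```
inherit ptest

# Install the BATS files into the ptest directory so that ptest-runner
# can find and execute them (sketch; paths are illustrative)
do_install_ptest() {
    install -m 0755 ${WORKDIR}/*.bats ${D}${PTEST_PATH}/
}
```

A real ptest recipe also ships a run-ptest script, which is the entry point that ptest-runner invokes.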

Arm SystemReadyTM IR Tests Implementation

The section Validation describes how to run the Arm SystemReadyTM IR ACS tests on the Reference Stack. This section describes the internal workings of the Yocto testing setup.

The Reference Stack has baseline files for the results of the ACS tests. These files are stored under yocto/meta-rd-n2-automotive/lib/acs. When the Validation tests are executed, new result files are produced and compared to the baseline. If any change is detected, the Arm SystemReadyTM tests fail and HTML diff files containing the differences from the baseline are generated in the path printed on the console. These diff files can be used to identify the tests whose results changed.

The logic that compares the newly generated result files with the baseline can be found in yocto/meta-rd-n2-automotive/lib/acs/analyze.py. This file also contains the logic that generates the HTML diff files.
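The comparison itself can be pictured with this plain-shell sketch. analyze.py implements the check in Python and emits HTML diffs; the result lines below are invented for illustration.

```shell
#!/bin/sh
# Conceptual sketch of the baseline check performed by analyze.py:
# compare new ACS results against a stored baseline and flag any
# difference. Result lines are invented for illustration.
compare_results() {
    if diff -u "$1" "$2" > /dev/null; then
        echo "results match baseline"
    else
        echo "results differ from baseline"
    fi
}
baseline=$(mktemp); new=$(mktemp)
printf 'PASS test_a\nPASS test_b\n' > "$baseline"
printf 'PASS test_a\nFAIL test_b\n' > "$new"
compare_results "$baseline" "$new"
rm -f "$baseline" "$new"
```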

The Yocto recipes that implement the tests are:

 yocto/meta-rd-n2-automotive/recipes-test/systemready-acs/edk2-test-parser.bb
 yocto/meta-rd-n2-automotive/recipes-test/systemready-acs/systemready-scripts.bb
 yocto/meta-rd-n2-automotive/recipes-test/systemready-acs/systemready-ir-acs.bb

The recipe systemready-ir-acs.bb inherits from the class:

yocto/meta-rd-n2-automotive/classes/systemready-acs.bbclass

Both edk2-test-parser.bb and systemready-scripts.bb fetch code that is used to parse the results of the ACS tests and produce report files.

The recipe systemready-ir-acs.bb fetches the bootable prebuilt ACS image needed to run the ACS tests; the rest of the testing logic is implemented through its inheritance from systemready-acs.bbclass.

systemready-acs.bbclass in turn inherits from testimage and hence executes the tests in the prebuilt ACS image via the do_testimage task. The function executed after this task is acs_logs_handle, which uses the scripts fetched by edk2-test-parser.bb and systemready-scripts.bb to generate the reports. It then calls yocto/meta-rd-n2-automotive/lib/acs/analyze.py to compare against the baseline and to generate the diff files in case any difference is found.