This is the page for the Quality Assurance team. It includes links to QA services, documentation, and tests, along with any other information related to the different QA processes in the project.

Documentation

Weekly test reports

The images are tested weekly using both manual and automated tests, and a report is generated on the lavaphabbridge. Historical test reports (from before lavaphabbridge) are still available.

Services

LAVA (Linaro Automated Validation Architecture)

Tools

Tools used for QA tasks and infrastructure.

  • lqa: Submits the automated test jobs to LAVA. It also offers a LAVA API that can be used by other scripts.
  • phab-handler: Used by the B&I infrastructure to update the status of Phabricator tasks.
  • phab-tasks: Helps check the status of Phabricator tasks. It is mainly used to check for new bugs during the weekly testing rounds and to help generate the email report.
  • lwr: The LAVA weekly round tool fetches the automated test results and generates the wiki report page for the weekly testing round.

QA ImmutableRootfsTests

Testing on an immutable rootfs. Tests should be self-contained and not require changes to the rootfs: any change to the rootfs causes the tested system to diverge from the one used in production, reducing the value of testing. Changing the rootfs is also impossible or very cumbersome with some deployment systems, such as dm-verity or OSTree. Other systems may simply not ship package management systems like apt/dpkg due to size constraints, making package dependencies not viable. [Read More]
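The self-containment described above can be sketched as a test script that stages everything it needs in a throwaway directory instead of installing packages onto the rootfs. This is a minimal illustration, not an Apertis test: the `check-feature` helper is a hypothetical stand-in for a real test payload.

```shell
# Minimal sketch of a self-contained test: nothing is written to the
# (possibly immutable) rootfs; all helpers live in a temporary directory.
WORKDIR=$(mktemp -d)

# Stand-in for unpacking helpers bundled with the test
# ("check-feature" is a hypothetical test entry point):
mkdir -p "$WORKDIR/bin"
printf '#!/bin/sh\necho PASS\n' > "$WORKDIR/bin/check-feature"
chmod +x "$WORKDIR/bin/check-feature"

# Run the check entirely from the temporary directory:
PATH="$WORKDIR/bin:$PATH"
RESULT=$(check-feature)
echo "$RESULT"

# Clean up so no trace is left behind:
rm -rf "$WORKDIR"
```

Because the payload is unpacked under `mktemp -d` and removed afterwards, the same script works unchanged on dm-verity or OSTree deployments.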

QA LQA

LQA is both a tool and an API for LAVA quality assurance tasks. It stands for LAVA Quality Assurance, and its features range from submitting LAVA jobs and collecting test results to querying most of the LAVA metadata. It is developed by Collabora and is the standard way to submit test jobs for Apertis images. Installation: fetch the latest code from the git repository and install it using python pip, which automatically handles the required dependencies: [Read More]
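The fetch-and-install step mentioned above can be sketched as follows. Note this is an assumption-laden sketch: the repository URL below is a placeholder, not the real lqa location.

```shell
# Hypothetical repository URL -- substitute the actual lqa git location.
git clone https://example.com/path/to/lqa.git
cd lqa
# pip resolves and installs the declared dependencies automatically:
pip install .
```

Installing with pip (rather than copying the scripts) is what pulls in the required dependencies automatically, as the teaser notes.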

QA Test Cases Guidelines

The following are guidelines for creating and editing test cases. Please follow them unless there is a compelling reason to make an exception. Workflow: before developing a new test case, make sure the test case doesn't already exist by checking the Test Cases list; make sure nobody is already working on it by checking the Phabricator tasks; and determine the main feature the test should focus on, setting the test case identifier accordingly. [Read More]

QA Personal LAVA Tests

This tutorial explains how to submit a personal LAVA job for an Apertis test, which is very useful during development, either to debug a test or to check that everything works as expected before its final integration. Running a personal test basically consists of adding the LAVA test definition file to a personal repository and submitting a LAVA job that fetches this file from there to execute the test. [Read More]
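The test definition file that the LAVA job fetches is a small YAML document. A minimal sketch following the "Lava-Test Test Definition 1.0" format is shown below; the test name, description, and step are illustrative, not an actual Apertis test.

```yaml
metadata:
  name: sample-personal-test            # hypothetical test name
  format: "Lava-Test Test Definition 1.0"
  description: "Sketch of a personal test definition"
run:
  steps:
    # Each step runs on the device; lava-test-case records a pass/fail result.
    - lava-test-case check-hostname --shell hostname
```

Once this file is pushed to a personal repository, the submitted LAVA job points at it so the test can be iterated on without touching the shared test repositories.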