Assignment directive
Execution of HIL regression test suites
As part of the release process for engine and aftertreatment SW systems, regression test suites are performed as part of quality assurance.
Types of tests vary and include, but are not limited to, protocol, OBD, cybersecurity, functionality, and CPU load tests. Tests are carried out mostly in a HIL environment, that is, usually one or two ECUs with simulated sensors/actuators.
Some tests, for example cybersecurity, are carried out in a bench environment. These regression suites are performed in a semi-automatic process. Setting up the HIL environment and getting it to a testable state is currently done manually. Once the system is set up, the regression test suite is started and test execution runs automatically. Upon test suite conclusion, results are uploaded to a test management system and analyzed further.
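A minimal sketch of the automated part of this flow, assuming the suite is launched from a small Python wrapper: it runs the suite with pytest and writes a JUnit XML report that can later be uploaded to the test management system. The directory paths and the "hil_regression" marker are hypothetical placeholders, not the project's actual layout.

```python
# Sketch only: launch the regression suite once the HIL rig is in a testable
# state and produce a machine-readable report for later upload.
import subprocess
from datetime import datetime
from pathlib import Path

RESULTS_DIR = Path("results")          # assumed output location
TEST_DIR = Path("tests/regression")    # assumed checkout of the test cases branch

def run_regression_suite() -> Path:
    """Run the regression suite and return the path to the JUnit XML report."""
    RESULTS_DIR.mkdir(exist_ok=True)
    report = RESULTS_DIR / f"regression_{datetime.now():%Y%m%d_%H%M%S}.xml"
    # --junitxml is a standard pytest option; -m selects tests by marker.
    subprocess.run(
        ["pytest", str(TEST_DIR), "-m", "hil_regression", f"--junitxml={report}"],
        check=False,  # a non-zero exit only means some tests failed; the report is still written
    )
    return report

if __name__ == "__main__":
    print(f"Report written to {run_regression_suite()}")
```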
Requirements
Typical setup tasks:
- Test run planning meeting
- Booking HIL rigs
- Creating test runs in the Test Management System
- Exporting test cases from the Test Management System
- Creating a test cases branch in GIT
- Configuring SW and flashing the ECU
- Verifying that the HIL environment is in a testable state
- Test run follow-up meetings
- Starting test runs
- Checking on progress
- Restarting/debugging if test execution crashes
- Getting all tests to execute

Typical post-processing tasks:
- Uploading test results to the Test Management System
- Analyzing tests that did not pass and classifying them by type of failure (see the sketch after this list)
- Rerunning test cases manually for further analysis and debugging
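As an illustration of the classification step, the sketch below parses the JUnit XML report produced by pytest and groups non-passing test cases. The failure/error/skipped buckets are an assumed starting point rather than the team's actual failure categories, and the report path is hypothetical.

```python
# Sketch only: group non-passing test cases from a pytest JUnit XML report
# as a starting point for manual failure classification.
import xml.etree.ElementTree as ET
from collections import defaultdict
from pathlib import Path

def classify_failures(report: Path) -> dict[str, list[str]]:
    """Return a mapping of failure category -> list of test case ids."""
    buckets: dict[str, list[str]] = defaultdict(list)
    root = ET.parse(report).getroot()
    for case in root.iter("testcase"):
        test_id = f"{case.get('classname')}::{case.get('name')}"
        if case.find("failure") is not None:    # assertion failed during the test body
            buckets["failure"].append(test_id)
        elif case.find("error") is not None:    # error in setup/execution, e.g. rig or environment issue
            buckets["error"].append(test_id)
        elif case.find("skipped") is not None:
            buckets["skipped"].append(test_id)
    return dict(buckets)

if __name__ == "__main__":
    for category, tests in classify_failures(Path("results/regression.xml")).items():
        print(f"{category}: {len(tests)} test case(s)")
        for t in tests:
            print(f"  {t}")
```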
Tools used:
- In-house tools for flashing and configuration of SW
- Polarion for test case specification and test run result database
- Jira for project and issue tracking
- GIT for version control of test cases
- HIL rigs are ETAS based, with models developed in Simulink
- Vision/Inca for online calibration and data acquisition
- Pycharm/Python for test cases
- Pytest as test framework (a minimal test case sketch follows this list)
- Various batch scripts
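Since the test cases are written in Python with Pytest as the framework, the sketch below shows how one might be structured. The HilRig class, the hil fixture, and the signal names are hypothetical stand-ins for the in-house HIL/Simulink interface; only the pytest mechanics (fixtures, markers, assertions) are standard.

```python
# Sketch only: a pytest-style HIL test case with a placeholder rig interface.
import pytest

class HilRig:
    """Placeholder for the real HIL rig interface (simulated sensors/actuators)."""
    def __init__(self):
        self._signals: dict[str, float] = {}
    def connect(self): pass
    def disconnect(self): pass
    def set_signal(self, name: str, value: float):
        self._signals[name] = value
        # Placeholder behaviour: pretend the ECU reports the simulated value back.
        self._signals["ecu_coolant_temp"] = value
    def read_signal(self, name: str) -> float:
        return self._signals.get(name, 0.0)

@pytest.fixture
def hil():
    rig = HilRig()
    rig.connect()       # bring the environment to a testable state
    yield rig
    rig.disconnect()    # always release the rig, even if the test fails

@pytest.mark.hil_regression  # custom marker; would be registered in pytest.ini
def test_coolant_temp_plausibility(hil):
    # Drive a simulated sensor and check the reported value stays plausible.
    hil.set_signal("coolant_temp_sensor", 90.0)
    assert 85.0 <= hil.read_signal("ecu_coolant_temp") <= 95.0
```

The fixture pattern keeps rig setup and teardown out of the individual test bodies, which mirrors the goal of getting the environment to a testable state before execution and releasing it cleanly afterwards.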