Verification and Validation

Last modified on February 22nd, 2017.


Welcome to the Verification and Validation guide!

This guide is meant to demonstrate how to use Innoslate for verification and validation. If you haven’t already, we recommend reading our Requirements Management and Analysis guide before proceeding.

According to the IEEE Standard for Systems and Software, Verification and Validation (V&V) “are used to determine whether the development products of a given activity conform to the requirements of that activity and whether the product satisfies its intended use and user needs.” The goal of V&V is to ensure the product, service, or system being developed meets or exceeds customer expectations. Without proper V&V procedures during systems engineering, the product, service, or system may never be successfully built at all, let alone function properly.

Innoslate facilitates these lifecycle procedures with integrated tools and a purpose-built user interface called ‘Test Center’.  Integrating these tools decreases the time it takes to complete the verification and validation processes and, in turn, encourages those processes to occur more frequently throughout the systems engineering lifecycle.

Validation

Before trying to verify that the product, service, or system meets the design requirements and system models, it is important to first validate that those requirements and models are correct. Begin the validation process early in the lifecycle to enhance the probability of success: errors are identified early in the development phase, and the groundwork is laid for the verification process later in the lifecycle.

Requirements Validation

Requirements validation is an iterative process which should be done throughout the systems engineering lifecycle.  Regularly questioning and analyzing the quality of each requirement ensures that, at the end of your requirements gathering and capture efforts, you are left with well-written and verifiable requirements.

Innoslate provides integrated tools to help automate requirements validation, such as the “Quality Check” feature of ‘Requirements View’.  The “Quality Check” feature is used to assess a requirement’s clarity, completeness, consistency, design implications, traceability, and verifiability.  If you haven’t already, read the “Analyze Requirements” section of our Requirements Management and Analysis guide for more information on the “Quality Check” feature of ‘Requirements View’ and instructions on how to run it.
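
To make the idea of automated requirement quality checks concrete, here is a minimal, illustrative Python sketch of the kind of heuristics such a check can apply. The vague-word list and rules below are assumptions for illustration only; they are not Innoslate’s actual “Quality Check” algorithm.

    # Illustrative heuristics only -- not Innoslate's actual "Quality Check" rules.
    VAGUE_TERMS = {"user-friendly", "fast", "efficient", "as appropriate", "etc."}

    def check_requirement(text: str) -> list[str]:
        """Return a list of potential quality issues found in a requirement statement."""
        issues = []
        lowered = text.lower()
        if not any(word in lowered for word in ("shall", "must", "will")):
            issues.append("no imperative ('shall'/'must'/'will'); may not be verifiable")
        if any(term in lowered for term in VAGUE_TERMS):
            issues.append("contains vague or unverifiable wording")
        if " and " in lowered and " or " in lowered:
            issues.append("may combine multiple requirements; consider splitting")
        return issues

    print(check_requirement("The vehicle shall be fast and user-friendly."))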

During this process it is important to double-check that each requirement is traceable back to its origin and that every change has been tracked throughout the life of the requirement.  Innoslate takes care of most of this by automatically creating relationships between entities and the original uploaded artifact and by tracking changes to maintain a version history on every entity.

Model Validation

Model validation is another iterative process which should also be done throughout the systems engineering lifecycle. Regularly questioning and analyzing the quality of each model ensures that, at the end of your behavioral and physical modeling efforts, you are left with a model complete enough to be executable.

Innoslate provides integrated tools to help automate model validation as well, such as the ‘Intelligence’ tool, which assesses a model’s traceability, construction, naming conventions, and more.  If you haven’t already, please read the “Ensuring Overall Model Quality” section of our Model-Based Systems Engineering guide for more information on the ‘Intelligence’ tool and instructions on how to run it.  Other built-in tools include the ‘Discrete Event Simulator’ and the ‘Monte Carlo Simulator’, which help determine whether or not a model is executable and help reduce uncertainty in a product, service, or system’s cost, schedule, and performance.  If you haven’t already, read the “Executing the Model” section of our Model-Based Systems Engineering guide for more information on the simulators and instructions on how to run them.
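
For context on what a Monte Carlo analysis adds, the short Python sketch below estimates the uncertainty in a total schedule by sampling task durations many times. The task list and triangular distributions are made-up assumptions for illustration; they are not inputs to, or outputs of, Innoslate’s ‘Monte Carlo Simulator’.

    # Illustrative only: sampling task durations to estimate schedule uncertainty.
    import random

    tasks = {  # (optimistic, most likely, pessimistic) durations in days -- assumed values
        "Design review": (2, 3, 6),
        "Steering integration": (5, 8, 14),
        "Track testing": (3, 5, 10),
    }

    def simulate_total_duration(trials: int = 10_000) -> list[float]:
        """Return one simulated total duration per trial."""
        totals = []
        for _ in range(trials):
            totals.append(sum(random.triangular(low, high, mode)
                              for low, mode, high in tasks.values()))
        return totals

    totals = sorted(simulate_total_duration())
    print(f"median: {totals[len(totals) // 2]:.1f} days, "
          f"80th percentile: {totals[int(len(totals) * 0.8)]:.1f} days")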

Verification

Now that you are sure the right requirements and correct system models are captured in Innoslate, you can begin verifying that the system as built meets the previously validated design requirements and system models.  In Innoslate this process is primarily accomplished using ‘Test Center’, a hierarchical, document-like display of Test Case entities with collapsible sections, status roll-up, and inline entity editing.
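
As a rough mental model of the status roll-up ‘Test Center’ displays, the sketch below aggregates child test case statuses up a hierarchy, with the worst status winning. The class and precedence rule are assumptions for illustration, not Innoslate’s internal implementation.

    # Illustrative only: rolling up test case statuses through a hierarchy.
    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        name: str
        status: str = "Not Run"    # "Not Run", "In Progress", "Passed", "Failed"
        children: list["TestCase"] = field(default_factory=list)

        def rolled_up_status(self) -> str:
            statuses = [child.rolled_up_status() for child in self.children] or [self.status]
            for level in ("Failed", "In Progress", "Not Run"):  # assumed: worst status wins
                if level in statuses:
                    return level
            return "Passed"

    suite = TestCase("Autonomous Vehicle Test Suite", children=[
        TestCase("Vehicle Steering Test", status="Passed"),
        TestCase("Vehicle Braking Test", status="Not Run"),
    ])
    print(suite.rolled_up_status())  # -> "Not Run"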

Add Verification Method Labels

Next, identify which method or methods of verification are most appropriate for verifying each design requirement. To capture this information in Innoslate, follow the instructions below:

  1. Click the ‘Database’ button on the top navigation bar to navigate to ‘Database View’.

    Click the Database Button
  2. Once in ‘Database View’, click the table icon (the ‘Table Mode’ button) on the toolbar to switch the view into “Table” mode.

    Click Table Mode Button

  3. To filter the view to only display Requirement entities, click ‘Requirement’ in the “By Class” section of the left sidebar.
  4. Using the horizontal scrollbar on the bottom, scroll to the right until the “Labels” column is visible.
  5. Identify which method of verification is appropriate to verify the first requirement in the table.
  6. Innoslate’s default database schema provides five verification method labels: Analysis, Demonstration, Inspection, Modeling & Simulation, and Test.  Type the name of the appropriate verification method label into the first cell in the “Labels” column and press the Enter key.  (See the sketch after these steps for a conceptual view of this mapping.)

    Add Verification Method Label

  7. Repeat step 6 if needed to add more than one verification method label to your requirement.
  8. Press the Enter key again to save your changes.

    Save Verification Method Labels

  9. Continue identifying appropriate verification methods and adding the corresponding labels until each requirement has a verification method specified.
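
Conceptually, the steps above attach a mapping like the one sketched below to your project: each requirement carries one or more of the five verification method labels. The requirement IDs and assignments are made-up examples for illustration, not part of Innoslate’s schema.

    # Illustrative only: requirements mapped to verification method labels.
    VERIFICATION_METHODS = {"Analysis", "Demonstration", "Inspection",
                            "Modeling & Simulation", "Test"}

    requirement_labels = {        # made-up requirement IDs and assignments
        "REQ-001": {"Test"},
        "REQ-002": {"Analysis", "Modeling & Simulation"},
        "REQ-003": {"Inspection"},
    }

    # Sanity check: every assigned label is one of the five default methods.
    for req, labels in requirement_labels.items():
        unknown = labels - VERIFICATION_METHODS
        if unknown:
            print(f"{req}: unrecognized label(s): {sorted(unknown)}")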

Download a VCRM Report

Double-check that each requirement has a verification method label by downloading a Verification Cross Reference Matrix (VCRM) report. Follow the instructions below to download a VCRM report:

  1. Click the ‘Requirements’ button on the top navigation bar to navigate to ‘Requirements View’.

    Click the Requirements Button

  2. Once in ‘Requirements View’, click on the ‘Report’ button.

    Click the Report Button

  3. The ‘Download a Report’ panel will appear. For ‘Report Type’, select the ‘VCRM Output’ menu item.

    Select the VCRM Output Menu Item

  4. Click the ‘Create’ button to generate and automatically download a VCRM report.

    Click the Create Button

  5. Once the report has finished downloading, click on the downloaded file to open the VCRM in Word.  An example of the VCRM report is shown below:

    Example VCRM Report
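
The coverage check the VCRM report supports can be expressed in a few lines: confirm that every requirement has at least one verification method assigned. The mapping below repeats the made-up example from the earlier sketch so this snippet stands alone; it is not how Innoslate generates the report.

    # Illustrative only: checking that every requirement has a verification method.
    requirement_labels = {
        "REQ-001": {"Test"},
        "REQ-002": {"Analysis", "Modeling & Simulation"},
        "REQ-003": set(),          # deliberately missing a verification method
    }

    uncovered = sorted(req for req, methods in requirement_labels.items() if not methods)
    if uncovered:
        print("Requirements without a verification method:", ", ".join(uncovered))
    else:
        print("Every requirement has at least one verification method.")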

Create a Test Suite

The first step in the verification process is to create a suite of tests to eventually run against the product, system, or service. Follow the instructions below to use ‘Test Center’ to create a test suite:

  1. Click to open the ‘MENU’ drop-down on the top navigation bar and select the ‘Test Center’ menu item under the “General” column.

    Select the Test Center Menu Item

  2. A ‘Create New Test Suite’ dialog will appear. Type in at least a name for the root Artifact entity of your new test suite, and then click the ‘Create’ button.  In this case, we used the name “Autonomous Vehicle Test Suite”.

    Create New Test Suite Dialog

  3. ‘Test Center’ will refresh to display your newly created, empty test suite.

    New Empty Test Suite

  4. Click the ‘New Test Case’ button.

    Click the New Test Case Button

  5. This will navigate you to the end of the test suite where a new row has been added.  Give your new Test Case entity a meaningful name and be sure to include the procedures necessary to perform the test in the entity’s description or decomposition.  In this case, we used the name “Vehicle Steering Test”.

    Vehicle Steering Test

  6. Fill in the ‘Expected Result’ attribute with what conditions are necessary for the test to pass.

    Fill in Expected Result

  7. Continue adding test cases until you are confident you have identified a procedure to verify each design requirement.
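
At its core, each Test Case entity created above pairs a procedure with an expected result. The sketch below captures that minimal structure and flags any test case missing an expected result; the field names and example values are assumptions for illustration, not Innoslate’s schema.

    # Illustrative only: a test suite as a list of test cases with expected results.
    test_suite = [
        {
            "name": "Vehicle Steering Test",
            "procedure": "Command a lane change at 30 mph and observe the steering response.",
            "expected_result": "Lane change completes smoothly without oscillation.",
        },
        {
            "name": "Vehicle Braking Test",
            "procedure": "Apply full braking from 30 mph on dry pavement.",
            "expected_result": "",   # missing -- will be flagged below
        },
    ]

    for case in test_suite:
        if not case["expected_result"]:
            print(f"{case['name']}: expected result is missing")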

Trace Test Cases to Requirements

It is important to maintain traceability throughout the verification process so any engineer reviewing your work months or even years down the road can clearly understand what was tested and why. Follow the instructions below to trace your test cases to your design requirements:

  1. Within ‘Test Center’, click to open the ‘Open’ drop-down and select the ‘Hierarchical Comparison’ menu item under the “Matrices” section.

    Select the Hierarchical Comparison Matrix Menu Item

  2. For ‘Target Entity’ in the left sidebar, select from the drop-down the name of your requirements document’s root Artifact entity: “SAE-Level-5-Automation-Requirements.csv”.  Search or scroll within the drop-down to find your desired entity.

    Select Target Requirements Document

  3. For ‘Target Relationship’ in the left sidebar, select from the drop-down the name of the relation to compare.  In this case, select “verifies”.  Search or scroll within the drop-down to find the desired relationship.

    Select Verifies Target Relationship

  4. Click on the blue ‘Generate’ button.

    Click the Generate Button

  5. Trace which individual Test Case entity from your test suite verifies which Requirement entity from your requirements document by clicking the cell in the matrix where the two intersect.  An ‘X’ will appear in the cell indicating a verifies/verified by relationship has been added linking the two entities.  You should end up with a matrix similar to the one pictured below:

    Example Hierarchical Comparison Matrix

  6. Click the ‘Save’ button to persist your changes to your project’s database.
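
The matrix you just filled in amounts to a set of “verifies” relationships between test cases and requirements. The sketch below represents those relationships as a simple mapping and checks that every requirement is verified by at least one test case; the IDs and names are made-up examples for illustration.

    # Illustrative only: "verifies" relationships and a coverage check.
    verifies = {                       # test case -> requirements it verifies (made-up IDs)
        "Vehicle Steering Test": {"REQ-001", "REQ-002"},
        "Vehicle Braking Test": {"REQ-003"},
    }
    requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

    verified = set().union(*verifies.values())
    unverified = sorted(requirements - verified)
    print("Requirements not yet verified by any test case:", unverified)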

Download an RVTM Report

You have created traceability between your test cases and requirements and added verification method labels to those requirements. You can now download a Requirements Verification Traceability Matrix (RVTM) report as proof of traceability and test coverage. Follow the instructions below to download an RVTM report:

  1. Click the ‘Requirements’ button on the top navigation bar to navigate to ‘Requirements View’.

    Click the Requirements Button

  2. Once in ‘Requirements View’, click on the ‘Report’ button.

    Click the Report Button

  3. The ‘Download a Report’ panel will appear. For ‘Report Type’, select the ‘New RVTM Output’ menu item.

    Select the RVTM Output Menu Item

  4. Click the ‘Create’ button to generate and automatically download an RVTM report.

    Click the Create Button

  5. Once the report has finished downloading, click on the downloaded file to open the RVTM in Excel.
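
An RVTM row essentially joins the two pieces of information captured so far: each requirement’s verification method(s) and the test case(s) tracing to it. The sketch below performs that join on the same made-up example data used in the earlier sketches; it is not how Innoslate generates the report.

    # Illustrative only: joining verification methods and "verifies" traces into RVTM-style rows.
    requirement_labels = {"REQ-001": {"Test"}, "REQ-002": {"Analysis"}}
    verifies = {"Vehicle Steering Test": {"REQ-001", "REQ-002"}}

    for test_case, reqs in verifies.items():
        for req in sorted(reqs):
            for method in sorted(requirement_labels.get(req, {"(no method)"})):
                print(f"{req} | {method} | verified by | {test_case}")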

Run a Test Cycle

Now that you have demonstrated traceability and test coverage, it is time to perform the procedures and run the tests against your product, service, or system. Follow the instructions below to run your first test cycle:

  1. Click to open the ‘MENU’ drop-down on the top navigation bar and select the ‘Test Center’ menu item under the “General” column.

    Select the Test Center Menu Item

  2. Once in ‘Test Center’, click on the ‘New Test Cycle’ button.

    Click the New Test Cycle Button

  3. A ‘Start New Test Cycle’ dialog will appear. For ‘Name’, type in “First”.

    Start New Test Cycle Dialog

  4. Click on the blue ‘Start’ button.

    Click the Start Button

    This sets the ‘Actual Result’ and ‘Status’ attributes of every test case in your suite back to blank and “Not Run”, respectively (see the sketch after these steps). ‘Test Center’ will refresh to display the changes, as shown below:

    Example Test Cycle Started

  5. Click anywhere within the first test case’s displayed row to enter edit mode of that row.

    Test Case in Edit Mode

  6. Change the ‘Status’ attribute of that test case to “In Progress”.

    Change Status to In Progress

  7. Click the green check-mark ‘Save Changes’ button to save your change.

    Click the Save Changes Button

  8. Using the verification method specified by the test case, perform the test procedure on the product, service, or system.
  9. Fill in the ‘Actual Result’ attribute with your observations of what actually happened as a result of the test procedure.

    Fill In Actual Result

  10. Determine if the test has passed, failed, or otherwise, and then change the ‘Status’ attribute to reflect your decision.

    Change Status to Passed

  11. Click the green check-mark ‘Save Changes’ button to save your changes.
  12. Continue testing until every test in the suite has been performed.
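
Conceptually, starting a new test cycle resets each test case before testing begins, and each test run then fills in an actual result and a status. The sketch below mirrors that flow with made-up field names and values; it is not Innoslate’s implementation.

    # Illustrative only: resetting test cases for a new cycle, then recording a result.
    test_suite = [
        {"name": "Vehicle Steering Test", "status": "Passed",
         "actual_result": "Lane change completed smoothly."},
        {"name": "Vehicle Braking Test", "status": "Failed",
         "actual_result": "Stopping distance exceeded the limit."},
    ]

    def start_new_test_cycle(cases):
        """Clear each case's actual result and set its status back to 'Not Run'."""
        for case in cases:
            case["status"] = "Not Run"
            case["actual_result"] = ""

    start_new_test_cycle(test_suite)

    # After performing a test procedure, record what actually happened.
    test_suite[0]["actual_result"] = "Lane change completed smoothly."
    test_suite[0]["status"] = "Passed"
    print(test_suite[0])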

Download a Test Cases Report

You have executed each test procedure and determined whether each test passed, failed, or otherwise using that test’s specified verification method. You can now download a Test Cases report as proof of verification of your product, service, or system. Follow the instructions below to download a Test Cases report:

  1. Within ‘Test Center’, click on the ‘Report’ button.

    Click the Report Button

  2. The ‘Download a Report’ panel will appear. For ‘Report Type’, select the ‘Test Cases Output’ menu item.

    Select the Test Cases Output Menu Item

  3. Click the ‘Create’ button to generate and automatically download a Test Cases report.

    Click the Create Button

  4. Once the report has finished downloading, click on the downloaded file to open the Test Cases report in Word.  An example of the Test Cases report is shown below:

    Example Test Cases Report
