Module 18 - T313

T313: Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard

HTML of the Student Supplement

(Note: This document has been converted from the Student Supplement to 508-compliant HTML. The formatting has been adjusted for 508 compliance, but all the original text content is included, plus additional text descriptions for the images, photos and/or diagrams have been provided below.)

 

T313 - Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard. See extended text description below.

(Extended Text Description: Large graphic cover page with dark blue background with the title in white letters “T313 - Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard.” At the middle left is the “Standards ITS Training” logo with a white box and letters in blue and green. The words “Student Supplement” and “RITA Intelligent Transportation Systems Joint Program Office” in white lettering are directly underneath the logo. Three light blue lines move diagonally across the middle of the blue background.)

 

T313 Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard

 

Table of Contents

 

Purpose

NTCIP 1204 Environmental Sensor Station History

Presentation Example

Test Plan from a Sample Project

Sample Test Design Specification

Sample Test Case Specification

Glossary

References

 

PURPOSE

This supplement provides additional information for the Professional Capacity Building (PCB) Module T313, Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard.

Module T313 provides participants with the information needed to assist agencies in creating a test plan specific to their Environmental Sensor Station (ESS) needs, based on the NTCIP 1204 standard.

This module helps the participant understand which elements of the ESS standard are needed to apply test plans that verify the agency's ESS system meets its design specifications and conforms to the NTCIP 1204 standard, while following standard testing methodologies. An example is provided in the module.

 

NTCIP 1204 Environmental Sensor Station History


Presentation Example

The functional requirement “Retrieve Wind Data” is an example of how to use the NTCIP 1204 v03 standard to test that a functional requirement has been fulfilled. The full details of this requirement, including the description, implementation, and selection of the requirement, along with the test design, test case, and test procedure specifications for this requirement, are provided below.

The functional requirement is:

3.5.2.3.2.2         Retrieve Wind Data

Upon request, the ESS shall return the current wind speed and direction for each wind sensor connected to the ESS.

 

In the Requirements Traceability Matrix (RTM), this functional requirement traces to the following design:

Table 29 Requirements Traceability Matrix (RTM)

Req ID | Dialog | Requirement | Object ID | Add'l Requirements/Object
3.5.2.3.2.2 | F.4.6 | Retrieve Wind Data | 5.6.8 | windSensorTableNumSensors
 | | | 5.6.10.1 | windSensorIndex
 | | | 5.6.10.4 | windSensorAvgSpeed
 | | | 5.6.10.5 | windSensorAvgDirection
 | | | 5.6.10.6 | windSensorSpotSpeed
 | | | 5.6.10.7 | windSensorSpotDirection
 | | | 5.6.10.8 | windSensorGustSpeed
 | | | 5.6.10.9 | windSensorGustDirection
 | | | 5.6.10.10 | windSensorSituation

 

The dialog referenced in the RTM is used to retrieve objects that are in tabular form:

F.4.6 Generic Retrieve Table Dialog

NOTE—This is a generic dialog that is referenced by requirements in the RTM with specific object names.

The list of objects provided by the specific dialog shall include:

  1. An object that indicates the number of rows in the table;
  2. The object(s) that serve as the index field of the table row; and
  3. The list of columnar objects to be retrieved from the table.

The standardized dialog for a management station to retrieve a table shall be as follows:

  1. The management station shall GET the number of rows in the table; and
  2. For each row of the table, the management station shall GET all objects referenced by the specific dialog that references this generic dialog, except for the number of rows object and the index object(s).
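The two-step dialog above can be sketched in code. This is a minimal illustration, not part of the standard: the snmp_get helper is hypothetical, stubbed here with a dictionary simulating an ESS agent that has two wind sensors.

```python
# Simulated agent MIB: 2 wind sensors, spot speeds in tenths of m/s.
# The object names come from the RTM; the instance values are made up.
_AGENT = {
    "windSensorTableNumSensors.0": 2,
    "windSensorSpotSpeed.1": 52,
    "windSensorSpotDirection.1": 270,
    "windSensorSpotSpeed.2": 48,
    "windSensorSpotDirection.2": 265,
}

def snmp_get(name):
    """Hypothetical stand-in for an SNMP GET against the device under test."""
    return _AGENT[name]

def retrieve_table(columns):
    """F.4.6: GET the row count, then GET each columnar object per row."""
    num_rows = snmp_get("windSensorTableNumSensors.0")   # dialog step 1
    rows = []
    for n in range(1, num_rows + 1):                     # dialog step 2
        rows.append({col: snmp_get(f"{col}.{n}") for col in columns})
    return rows

table = retrieve_table(["windSensorSpotSpeed", "windSensorSpotDirection"])
```

Note that, as the dialog requires, the row-count object and the index object are not re-fetched for each row; the index is implied by the instance suffix.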


A sample object is windSensorSpotSpeed. The object definition is as follows:

5.6.10.6 Wind Sensor Spot Speed

windSensorSpotSpeed OBJECT-TYPE
SYNTAX INTEGER (0..65535)
ACCESS read-only
STATUS mandatory
DESCRIPTION "<Definition>The wind speed in tenths of meters per second measured by the wind sensor. For mobile platforms, the wind speed shall be corrected for vehicle movement.
<SetConstraint>read-only
<DescriptiveName>WindSensor.spotSpeed:quantity
<Valid Value Rule>The value of 65535 shall indicate an error condition or missing value.
<Data Concept Type>Data Element
<Unit>tenths of meters per second"
::= { windSensorEntry 6 }
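A management station consuming this object must account for both the tenths-of-meters-per-second unit and the 65535 error value. A minimal decoding sketch (the function name and sample values are illustrative, not from the standard):

```python
ERROR_VALUE = 65535  # per the object's Valid Value Rule

def decode_spot_speed(raw):
    """Convert a raw windSensorSpotSpeed value (tenths of m/s) to m/s.

    Returns None for the error/missing-value sentinel 65535.
    """
    if not 0 <= raw <= 65535:
        raise ValueError("outside the SYNTAX range INTEGER (0..65535)")
    if raw == ERROR_VALUE:
        return None
    return raw / 10.0

print(decode_spot_speed(52))     # -> 5.2 (m/s)
print(decode_spot_speed(65535))  # -> None (error or missing value)
```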

The Retrieve Wind Data requirement can be selected by the agency by means of the Protocol Requirements List, as follows:

Please see the Extended Text Description below.

(Extended Text Description: An excerpt from the NTCIP 1204 PRL shows a table with the content as shown below. User Need IDs 2.5.2, 2.5.2.1 and 2.5.2.1.2 all have Project Requirements “Yes” highlighted in red boxes. User Need ID 2.5.2.1.1 has Project Requirement “No” highlighted in a red box. FR IDs 3.5.2.3.2.1 and 3.6.1 have Project Requirements “NA” highlighted in red boxes.

User Need ID | User Need | FR ID | Functional Requirement | Conformance | Project Requirement | Additional Project Requirements
2.5.2 | Sensor Manager Features | | | O.1 (1..*) | Yes / No |
2.5.2.1 (Weather) | Monitor Weather Conditions | | | O.2 (1..*) | Yes / No / NA |
2.5.2.1.1 (Pressure) | Monitor Atmospheric Pressure | | | O.3 (1..*) | Yes / No |
 | | 3.5.2.3.2.1 | Retrieve Atmospheric Pressure | M | Yes / NA |
 | | 3.6.1 | Required Number of Atmospheric Pressure Sensors | M | Yes / NA | The ESS shall support at least ___ atmospheric pressure sensors.
2.5.2.1.2 (Wind) | Monitor Winds | | | O.3 (1..*) | Yes / No / NA |
 | | 3.5.2.3.2.2 | Retrieve Wind Data | M | Yes / NA |
 | | 3.6.2 | Required Number of Wind Sensors | M | Yes / NA | The ESS shall support at least X wind sensors.)

 

The test cases that are required to show conformance to NTCIP can be found indexed by the Functional Requirement ID in the Requirements to Test Case Traceability Matrix (RTCTM):

Please see the Extended Text Description below.

(Extended Text Description: An excerpt from the NTCIP 1204 PRL shows a table with the content as shown below. FR ID 3.5.2.3.2.2 is highlighted in a red box with an arrow leading to an excerpt from the NTCIP 1204 Requirements to Test Cases Traceability Matrix. The RTCTM has superheadings of Requirement and Test Cases, each with subheadings of ID and Title. Requirement ID 3.5.2.3.2.2 is highlighted in a red box, and pointed to by the arrow from the PRL table. Test Case ID C.2.3.3.3 is highlighted in a red box.

NTCIP 1204 v03 - Protocol Requirements List (PRL)

User Need ID | User Need | FR ID | Functional Requirement | Conformance | Project Requirement | Additional Project Requirements
2.5.2.1.2 (Wind) | Monitor Winds | | | O.3 (1..*) | Yes / No / NA |
 | | 3.5.2.3.2.2 | Retrieve Wind Data | M | Yes / NA |
 | | 3.6.2 | Required Number of Wind Sensors | M | Yes / NA | The ESS shall support at least 1 wind sensor.

NTCIP 1204 v03 - Requirements to Test Case Traceability Matrix (RTCTM)

Requirement ID | Requirement Title | Test Case ID | Test Case Title
 | | C.2.3.3.2 | Retrieve Atmospheric Pressure
3.5.2.3.2.2 | Retrieve Wind Data | C.2.3.3.3 | Retrieve Wind Data
3.5.2.3.2.3 | Retrieve Temperature | C.2.3.3.4 | Retrieve Temperature)

 

Test Plan from a Sample Project

Test plan identifier

TP-C-ESS-1

Introduction

This test plan has been developed to define the process that the agency will use for the design approval test to ensure that the ESS provided by the manufacturer fulfills all project requirements related to NTCIP 1204.

Test Items

This test plan will test the NTCIP-related operation of an ESS. The version and revision of the equipment to be tested shall be recorded on the test item transmittal.

Features to be tested

All requirements selected in the NTCIP 1204 Final Completed Protocol Requirements List (PRL) shall be tested.

Features not to be tested

Features that are not defined in NTCIP 1204 are not directly covered by this test plan. These features typically include, but are not limited to:

While some aspects of these features may be tested (e.g., all NTCIP 1204 communications rely upon the basic operation of lower-layer protocols; tests may include verification of those performance requirements defined in NTCIP 1204; etc.), this test plan does not focus on these types of requirements because they are not the focus of NTCIP 1204.

Approach

This test is to be performed on a single item and will be valid for the remaining items submitted under contract with the agency that have the same firmware version number.

The test analyst will perform each selected test case from the ESS test procedures. A test case shall be deemed to be selected if it traces from a requirement selected in the final completed PRL. The tracing of requirements to test cases is provided in Appendix A of the sample test case specification, which follows from the Requirements to Test Cases Traceability Matrix (RTCTM) of NTCIP 1204 v03.

Item Pass/fail criteria

In order to pass the test, the ESS shall pass all test procedures included in this test plan without demonstrating any characteristic that fails to meet project requirements.

Suspension criteria and resumption requirements

The test may be suspended, at the convenience of test personnel, between the performances of any two test procedures. The test shall always resume at the start of a selected test procedure.

If any modifications are made to the ESS, a complete regression test may be required in order to pass this test plan.

Test deliverables

The test manager will ensure that the following documents are developed and entered into the configuration management system upon their completion:

All test documentation will be made available to both the agency and the developer. All test documentation will be made available in a widely recognized computer file format such as Microsoft Word or Adobe Acrobat. In addition, the files from the test software shall be provided in their native file format as defined by the test software.

Testing tasks

Table A-1: Testing Tasks

Task # | Task Name | Predecessor | Responsibility | NTCIP Knowledge Level (low = 1 to high = 5)
1 | Finalize Test Plan | Finalize Completed PRL | Test Manager | 2
2 | Complete the Test Item Transmittal Form and transmit the component to the Test Group | Implement ESS Standard | Developer | 1
3 | Perform Tests and produce Test Log and Test Incident Reports | 2 | Test Analyst | 5
4 | Resolve Test Incident Reports | 3 | Developer, Test Manager | 2
5 | Repeat Steps 2–4 until all test procedures have succeeded | 4 | N/A | N/A
6 | Prepare the Test Summary report | 5 | Test Analyst | 2
7 | Transmit all test documentation to the Agency Project Manager | 6 | Test Manager | 1
 

Environmental needs

All test cases covered by this test plan require the device under test to be connected to a test application as depicted in Figure A-1. A data analyzer may also be used to capture the data exchanged between the two components. The test environment should be designed to minimize any complicating factors that may result in anomalies unrelated to the specific test case. Failure to isolate such variables may produce false test results. For example, the device may be conformant with the standard, but communication delays could result in timeouts and be misinterpreted as failures.

Please see the Extended Text Description below.

(Extended Text Description: Figure A-1: Field Device Test Environment, showing the Device Under Test at the top, double-headed arrows to a Communications Cloud in the middle, and a double-headed arrow from there to a laptop running Test Application software shown at the bottom. To the right, a laptop running Data Analyzer software is labeled as Optional.)


Figure A-1:  Field Device Test Environment

The specific test software and data analyzer to be used are identified in the tools clause of the approach section of this test plan.

The tests will be performed at the agency’s facility. This location will provide the following:

Tools

The following software will be used for the testing:

For testing the wind sensor, a means of simulating wind speed will be required.

Communications environment

All tests shall be performed using the following communications environment, unless otherwise defined in the specific test procedure.

Connection Type: RJ-45 Ethernet

Subnet Profile: NTCIP 2104 – Ethernet

Transport Profile: NTCIP 2202 – Internet

Read Community Name:  public

Write Community Name: administrator

Timeout Value:  200 ms
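For automated test runs, the parameters above might be collected into a single configuration record. A minimal sketch in Python (the dictionary layout is an assumption; the values are taken from the list above):

```python
# Communications environment for the test runs, as specified in the test plan.
# The record structure is illustrative; a test tool would define its own format.
COMMS_ENV = {
    "connection": "RJ-45 Ethernet",
    "subnet_profile": "NTCIP 2104 - Ethernet",
    "transport_profile": "NTCIP 2202 - Internet",
    "read_community": "public",
    "write_community": "administrator",
    "timeout_ms": 200,
}
```

A test procedure that defines a different environment would override these defaults for its own steps only.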

Responsibilities

The following roles are defined in this test plan:

Staffing and training needs

The following staffing is expected for this test plan:

If the agency project manager is not familiar with NTCIP testing, he or she should become familiar with NTCIP 9012 and FHWA guidance on the procurement of ITS systems. The test manager and test analyst must be familiar with how to use the test software. Many software systems come with extensive online help, but the test personnel may also need detailed knowledge of the NTCIP standards to fully perform their duties.

Schedule

Testing will commence within two weeks of the receipt of the hardware from the manufacturer. The testing is expected to take one week followed by one additional week of work to prepare the report.

Risks and contingencies

If the ESS repeatedly fails the testing procedures, it may be returned to the manufacturer for repair. The decision to return the ESS is at the discretion of the project committee. The developer of the ESS shall correct any problems identified with the ESS. Upon completion of the modifications, the developer shall resubmit the component for another complete test consisting of all test cases.

Approvals

______________________________________
Test Manager

_________________
Date

______________________________________
Agency Project Manager

_________________
Date

______________________________________
Developer

_________________
Date

 

Sample Test Design Specification

Test Design Specification Identifier

TDS-C-ESS-1

This Test Design Specification follows from Test Plan TP-C-ESS-1.

Features to be tested

The features selected for testing are in accordance with the completed project Protocol Requirements List (PRL). An excerpt showing the requirements for wind data is shown below.

Please see the Extended Text Description below.

(Extended Text Description: An excerpt from the NTCIP 1204 PRL shows a table with the content as shown below. User Need IDs 2.5.2, 2.5.2.1 and 2.5.2.1.2 all have Project Requirements “Yes” highlighted in red boxes. User Need ID 2.5.2.1.1 has Project Requirement “No” highlighted in a red box. FR IDs 3.5.2.3.2.1 and 3.6.1 have Project Requirements “NA” highlighted in red boxes.

User Need ID | User Need | FR ID | Functional Requirement | Conformance | Project Requirement | Additional Project Requirements
2.5.2 | Sensor Manager Features | | | O.1 | Yes / No |
2.5.2.1 (Weather) | Monitor Weather Conditions | | | O.2 | Yes / No / NA |
2.5.2.1.1 (Pressure) | Monitor Atmospheric Pressure | | | O.3 (1..*) | Yes / No / NA |
 | | 3.5.2.3.2.1 | Retrieve Atmospheric Pressure | M | Yes / NA |
 | | 3.6.1 | Required Number of Atmospheric Pressure Sensors | M | Yes / NA | The ESS shall support at least ____ atmospheric pressure sensors.
2.5.2.1.2 (Wind) | Monitor Winds | | | O.3 | Yes / No / NA |
 | | 3.5.2.3.2.2 | Retrieve Wind Data | M | Yes / NA |
 | | 3.6.2 | Required Number of Wind Sensors | M | Yes / NA | The ESS shall support at least 1 wind sensor.)

 

Approach Refinements

Wind speed is to be simulated by rotating the propeller of the wind sensor with a calibrated rotating device.

Test Identification

For testing wind speed, two test cases are to be performed: 3.5.2.3.2.2-1 Retrieve Wind Data - Calm, and 3.5.2.3.2.2-2 Retrieve Wind Data - Hurricane.

Feature Pass/Fail Criteria

In order to pass, the ESS shall pass all test cases included in this test design without demonstrating any characteristic that fails to meet project requirements.


Sample Test Case Specification

Test Case Specification Identifier

TCS-C-ESS-1

Objectives

These test cases are to be used to verify conformance of the ESS with NTCIP 1204 v03. They specify the input and output parameters for use with the NTCIP 1204 v03 test procedures.

Test Items

These test cases will provide details of the tests to be performed to verify conformance of the ESS with NTCIP 1204 v03. The items selected for testing are in accordance with the completed project Protocol Requirements List (PRL).

Input Specifications

The individual input specifications for each test case are given in Appendix A.

Output Specifications

The individual output specifications for each test case are given in Appendix A.

Environmental Needs

All Test Cases require the device under test to be connected to a test application as depicted in Figure A-1. A data analyzer may also be used to capture the data exchanged between the two components. The test environment should be designed to minimize any complicating factors that may result in anomalies unrelated to the specific test case. Failure to isolate such variables may produce false test results. For example, the device may be conformant with the standard, but communication delays could result in timeouts and be misinterpreted as failures.

Please see the Extended Text Description below.

(Extended Text Description: Figure A-1: Field Device Test Environment, showing the Device Under Test at the top, double-headed arrows to a Communications Cloud in the middle, and a double-headed arrow from there to a laptop running Test Application software shown at the bottom. To the right, a laptop running Data Analyzer software is labeled as Optional.)

Figure A-1:  Field Device Test Environment

The specific test software and data analyzer to be used are identified in the tools clause of the approach section of the test plan.

The tests will be performed at the ESS manufacturer’s facility. This location will provide the following:

For testing the wind sensor, a means of simulating wind speed will be required.

Special Procedural Requirements

No special procedural requirements exist.

Intercase Dependencies

Intercase dependencies are as detailed in NTCIP 1204 v03.

 


Appendix A:

The table below provides the inputs and expected results for two conditions of wind speed.

Functional Requirement 3.5.2.3.2.2 - Retrieve Wind Data

Wind speeds are to be simulated by means of rotating the propeller at a variable calibrated rate.

NTCIP 1204 Test Case | Simulated Wind Speed | windSensorSpotSpeed.n | windSensorSituation.n
C.2.3.3.3 | 0 km/h | 0 | 3 (calm)
C.2.3.3.3 | > 118 km/h | > 118 | 11 (hurricaneForceWinds)

From NTCIP 1204 v03, Test Case C.2.3.3.3 is used to test the requirement 3.5.2.3.2.2 - Retrieve Wind Data. Note that this Test Case will be performed twice: once simulating calm conditions (Test Case 3.5.2.3.2.2-1) and once simulating hurricane-force wind conditions (Test Case 3.5.2.3.2.2-2).
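The expected results above can be expressed as a small consistency check; a sketch assuming hypothetical captured readings from each run (the function and condition names are illustrative, not from the standard):

```python
# Expected results for the two runs of Test Case C.2.3.3.3, per the table
# above: calm (0 km/h -> speed 0, situation 3) and hurricane-force winds
# (> 118 km/h -> speed > 118, situation 11).
EXPECTED = {
    "calm":      {"speed_ok": lambda v: v == 0,  "situation": 3},
    "hurricane": {"speed_ok": lambda v: v > 118, "situation": 11},
}

def check_run(condition, spot_speed, situation):
    """Compare one run's captured readings against the expected results."""
    exp = EXPECTED[condition]
    return exp["speed_ok"](spot_speed) and situation == exp["situation"]
```

For example, check_run("calm", 0, 3) passes, while a nonzero spot speed under simulated calm conditions fails.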

Test Case: 3.3

Title: Retrieve Wind Data

Description: This test case verifies that the ESS allows a management station to determine current wind information.

Variables: Required_Wind_Sensors (PRL 3.6.2)

Pass/Fail Criteria: The device under test (DUT) shall pass every verification step included within the Test Case to pass the Test Case.


Step | Test Procedure | Results
1 | CONFIGURE: Determine the number of wind sensors required by the specification (PRL 3.6.2). RECORD this information as: »Required_Wind_Sensors |
2 | GET the following object(s): »windSensorTableNumSensors.0 | Pass / Fail (Sec. 3.5.2.3.2.2)
3 | VERIFY that the RESPONSE VALUE for windSensorTableNumSensors.0 is greater than or equal to Required_Wind_Sensors. | Pass / Fail (Sec. 3.6.2)
4 | Determine the RESPONSE VALUE for windSensorTableNumSensors.0. RECORD this information as: »Supported_Wind_Sensors |
5 | FOR EACH value, N, from 1 to Supported_Wind_Sensors, perform Steps 5.1 through 5.22. |
5.1 | GET the following object(s): »windSensorAvgSpeed.N »windSensorAvgDirection.N »windSensorSpotSpeed.N »windSensorSpotDirection.N »windSensorGustSpeed.N »windSensorGustDirection.N »windSensorSituation.N | Pass / Fail (Sec. 3.5.2.3.2.2)
5.2 | VERIFY that the RESPONSE VALUE for windSensorAvgSpeed.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.4)
5.3 | VERIFY that the RESPONSE VALUE for windSensorAvgSpeed.N is less than or equal to 65535. | Pass / Fail (Sec. 5.6.10.4)
5.4 | VERIFY that the RESPONSE VALUE for windSensorAvgSpeed.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.4)
5.5 | VERIFY that the RESPONSE VALUE for windSensorAvgDirection.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.5)
5.6 | VERIFY that the RESPONSE VALUE for windSensorAvgDirection.N is less than or equal to 361. | Pass / Fail (Sec. 5.6.10.5)
5.7 | VERIFY that the RESPONSE VALUE for windSensorAvgDirection.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.5)
5.8 | VERIFY that the RESPONSE VALUE for windSensorSpotSpeed.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.6)
5.9 | VERIFY that the RESPONSE VALUE for windSensorSpotSpeed.N is less than or equal to 65535. | Pass / Fail (Sec. 5.6.10.6)
5.10 | VERIFY that the RESPONSE VALUE for windSensorSpotSpeed.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.6)
5.11 | VERIFY that the RESPONSE VALUE for windSensorSpotDirection.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.7)
5.12 | VERIFY that the RESPONSE VALUE for windSensorSpotDirection.N is less than or equal to 361. | Pass / Fail (Sec. 5.6.10.7)
5.13 | VERIFY that the RESPONSE VALUE for windSensorSpotDirection.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.7)
5.14 | VERIFY that the RESPONSE VALUE for windSensorGustSpeed.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.8)
5.15 | VERIFY that the RESPONSE VALUE for windSensorGustSpeed.N is less than or equal to 65535. | Pass / Fail (Sec. 5.6.10.8)
5.16 | VERIFY that the RESPONSE VALUE for windSensorGustSpeed.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.8)
5.17 | VERIFY that the RESPONSE VALUE for windSensorGustDirection.N is greater than or equal to 0. | Pass / Fail (Sec. 5.6.10.9)
5.18 | VERIFY that the RESPONSE VALUE for windSensorGustDirection.N is less than or equal to 361. | Pass / Fail (Sec. 5.6.10.9)
5.19 | VERIFY that the RESPONSE VALUE for windSensorGustDirection.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.9)
5.20 | VERIFY that the RESPONSE VALUE for windSensorSituation.N is greater than or equal to 1. | Pass / Fail (Sec. 5.6.10.10)
5.21 | VERIFY that the RESPONSE VALUE for windSensorSituation.N is less than or equal to 12. | Pass / Fail (Sec. 5.6.10.10)
5.22 | VERIFY that the RESPONSE VALUE for windSensorSituation.N is APPROPRIATE. | Pass / Fail (Sec. 5.6.10.10)
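The GET/VERIFY loop above maps naturally to an automated harness. The sketch below is illustrative only: snmp_get is a hypothetical helper, stubbed here with one simulated wind sensor, and the APPROPRIATE judgments (steps 5.4, 5.7, and so on) are left to the test analyst rather than automated.

```python
# Simulated agent with one wind sensor; values are made up but in range.
_AGENT = {
    "windSensorTableNumSensors.0": 1,
    "windSensorAvgSpeed.1": 50, "windSensorAvgDirection.1": 270,
    "windSensorSpotSpeed.1": 52, "windSensorSpotDirection.1": 268,
    "windSensorGustSpeed.1": 80, "windSensorGustDirection.1": 275,
    "windSensorSituation.1": 5,
}

def snmp_get(name):
    """Hypothetical stand-in for an SNMP GET against the device under test."""
    return _AGENT[name]

# Allowed ranges per VERIFY steps 5.2-5.21 of the test procedure.
RANGES = {
    "windSensorAvgSpeed": (0, 65535), "windSensorAvgDirection": (0, 361),
    "windSensorSpotSpeed": (0, 65535), "windSensorSpotDirection": (0, 361),
    "windSensorGustSpeed": (0, 65535), "windSensorGustDirection": (0, 361),
    "windSensorSituation": (1, 12),
}

def run_test_case(required_wind_sensors):
    """Automatable portion of Test Case C.2.3.3.3 (range checks only)."""
    supported = snmp_get("windSensorTableNumSensors.0")   # steps 2 and 4
    if supported < required_wind_sensors:                 # step 3
        return False
    for n in range(1, supported + 1):                     # step 5
        for obj, (lo, hi) in RANGES.items():              # steps 5.1-5.21
            if not lo <= snmp_get(f"{obj}.{n}") <= hi:
                return False
    return True
```

With the stub above, run_test_case(1) passes, while run_test_case(2) fails step 3 because only one sensor is supported.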


Test Case Results

Tested By:

Date Tested:

Pass/Fail

Test Case Notes:

 

Glossary

The following is a glossary of terms that are used throughout the module.

Test Case

A document that specifies the actual inputs, predicted results, and set of execution conditions for a test. It also identifies constraints on the test procedures resulting from use of that specific test case.

NOTE—See IEEE 829 for a more detailed discussion of test cases.

Test Design Specification

Per IEEE 829, “A document specifying the details of the test approach for a ... feature or combination of ... features and identifying the associated tests.”  For testing NTCIP conformance, this document includes the completed Protocol Requirements List and Requirements to Test Cases Traceability Matrix.

Test Plan

A document that prescribes the scope, approach, resources, and schedule of the testing activities. It identifies the items to be tested, the features to be tested, the testing tasks to be performed, the personnel responsible for each task, and the risks associated with the plan.

Test Procedure

A document that specifies a sequence of actions for the execution of a test. The test procedures test the implementation of the requirement. Test procedures are separated from test design as they are intended to be followed step by step and should not have extraneous detail.

 

References

Environmental Sensor Stations

Systems Engineering

Testing