Module 41 - T203

T203: How to Develop Test Cases For an ITS Standards-Based Test Plan

HTML of the PowerPoint Presentation

(Note: This document has been converted from a PowerPoint presentation to 508-compliant HTML. The formatting has been adjusted for 508 compliance, but all the original text content is included, plus additional text descriptions for the images, photos and/or diagrams have been provided below.)

 

Slide 1:

Welcome - Graphic image of introductory slide. Please see the Extended Text Description below.

(Extended Text Description: Welcome - Graphic image of introductory slide. A large dark blue rectangle with a wide, light grid pattern at the top half and bands of dark and lighter blue bands below. There is a white square ITS logo box with words "Standards ITS Training" in green and blue on the middle left side. The word "Welcome" in white is to the right of the logo. Under the logo box is the logo for the U.S. Department of Transportation, Office of the Assistant Secretary for Research and Technology.)

 

Slide 2:

Welcome slide with Ken Leonard and screen capture of home webpage. Please see the Extended Text Description below.

(Extended Text Description: This slide, entitled "Welcome" has a photo of Ken Leonard, Director, ITS Joint Program Office, on the left hand side, with his email address, Ken.Leonard@dot.gov. A screen capture snapshot of the home webpage is found on the right hand side - for illustration only - from August 2014. Below this image is a link to the current website: www.pcb.its.dot.gov - this screen capture snapshot shows an example from the Office of the Assistant Secretary for Research and Development - Intelligent Transportation Systems Joint Program Office - ITS Professional Capacity Building Program/Advanced ITS Education. Below the main site banner, it shows the main navigation menu with the following items: About, ITS Training, Knowledge Exchange, Technology Transfer, ITS in Academics, and Media Library. Below the main navigation menu, the page shows various content of the website, including a graphic image of professionals seated in a room during a training program. A text overlay has the text Welcome to ITS Professional Capacity Building. Additional content on the page includes a box entitled What's New and a section labeled Free Training. Again, this image serves for illustration only. The current website link is: http://www.pcb.its.dot.gov.)

 

Slide 3:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 4:

T203 Part 1 of 2: How to Develop Test Cases for an ITS Standards-based Test Plan, Part 1 of 2

 

Slide 5:

Instructor

Headshot photo of Manny Insignares, Vice President Technology, Consensus Systems Technologies, New York, NY, USA

Manny Insignares

Vice President Technology

Consensus Systems Technologies

New York, NY, USA

 

Slide 6:

Target Audience

  • Traffic management and engineering staff
  • Maintenance staff
  • System developers
  • Testing personnel
  • Private and public sector users including manufacturers

 

Slide 7:

Recommended Prerequisite(s)

  • T101: Introduction to ITS Standards Testing
  • T201: How to Write a Test Plan
  • T202: Overview of Test Design Specifications, Test Case Specifications, and Test Procedures

 

Slide 8:

Curriculum Path (Testing)

Curriculum Path: A graphical illustration indicating the sequence of training modules that lead up to and follow each course. Please see the Extended Text Description below.

(Extended Text Description: Curriculum Path: A graphical illustration indicating the sequence of training modules that lead up to and follow each course. Each module is represented by a box, labeled with that module’s name. The first box is light blue and is labeled "T101 Introduction to ITS Standards Testing" followed by a line pointing right that connects to a light blue box labeled "T201 How to Write a Test Plan" followed by a line that moves down and to the right and connects to a box labeled "T202 Overview of Test Design Specifications, Test Cases and Test Procedures." This box, colored dark blue to indicate that it represents this module, in turn connects to a box with an arrow pointing right and is labeled "T203 Part 1 of 2 How to Develop Test Cases for an ITS Standards-based Test Plan, Part 1 of 2.")

 

Slide 9:

List of Abbreviations Used in this Module

ASN.1 Abstract Syntax Notation 1
C2C Center-to-Center (Information Exchange)
C2F Center-to-Field (NTCIP Devices)
CCTV Closed Circuit Television
DMS Dynamic Message Sign
ESS Environmental Sensor Station
MIB Management Information Base
NRTM Needs to Requirements Traceability Matrix
NTCIP National Transportation Communications for ITS Protocol
PRL Protocol Requirements List
PDU Protocol Data Unit
RTM Requirements Traceability Matrix
RTCTM Requirements to Test Case Traceability Matrix

 

Slide 10:

List of Abbreviations Used in this Module (cont.)

TMDD Traffic Management Data Dictionary
TDS Test Design Specification
TCS Test Case Specification
SE System Engineering
SEP System Engineering Process
XML Extensible Markup Language

 

Slide 11:

Learning Objectives

Part 1 of 2:

  1. Review the role of test cases within the overall testing process.
  2. Discuss ITS data structures used in NTCIP and Center-to-Center standards (TMDD) and provide examples.
  3. Find information needed to develop a test case.
  4. Explain test case development.

Part 2 of 2:

  1. Handle standards with and without test documentation.
  2. Develop a Requirements to Test Case Traceability Matrix (RTCTM).
  3. Identify types of testing.
  4. Recognize the purpose of test logs and test anomaly reports.

 

Slide 12:

Learning Objective 1: Review the Role of Test Cases Within the Overall Testing Process

  • Review test documentation as defined in IEEE Std 829
  • Show test cases in relationship to test plans, test designs, and test procedures
  • Review ITS standards testing approaches and advantages of IEEE Std 829-based testing

 

Slide 13:

Learning Objective #1

Brief Review of Module T202

  • Module T202 provided the context of the testing life cycle: what to test and when to test during the System Engineering life cycle:
    • Provided an overview of testing documentation, including Test Design Specifications, Test Cases, and Test Procedures
    • Introduced IEEE Std 829-2008, a Standard for Software and System Test Documentation that provides guidance on formats
    • This module teaches how to use the IEEE approach (users can customize testing documentation for their specification)

 

Slide 14:

Learning Objective #1

This slide shows the systems engineering life cycle in the form of a Vee diagram. Please see the Extended Text Description below.

(Extended Text Description: This slide shows the systems engineering life cycle in the form of a Vee diagram. The diagram is labeled "Beginning of a Project Level Test Process". The last two words are red, and the rest are blue. The life-cycle steps include the following: Regional Architectures, Feasibility Study / Concept Exploration, Concept of Operations, System Requirements, High-level Design, Detailed Design, Software / Hardware Development Field Installation, Unit Device Testing, Subsystem Verification, System Verification & Deployment, System Validation, Operations and Maintenance, Changes and Upgrades, Retirement / Replacement. An arrow runs down the bottom left of the Vee and is labeled "Decomposition and Definition". The bottom of the Vee has a label that says "Implementation" and a timeline arrow running from left to right is labeled "Development Process". Another arrow runs up the right underside of the Vee and is labeled "Integration and Recomposition". A red arrow, underlining the phrase "Test Process" at the top of the diagram, runs down to a rectangle made of red dotted lines. The rectangle highlights the area between System Requirements and High Level Design indicates that the test documentation development process begins there.)

 

Slide 15:

Learning Objective #1

What is a Testing Process?

  • The purpose of software and software-based systems testing is:
    • To help build quality into the software and system during the life cycle processes and to validate that the quality was achieved
    • To determine whether the products of a given life cycle activity conform to the requirements of that activity, and whether the product satisfies its intended use and user needs
    • To perform inspection, demonstration, analysis, and testing of software and software-based system products
    • To perform test activities in parallel with development efforts, not just at the conclusion of the development effort

 

Slide 16:

Learning Objective #1

What is a Test Case?

  • A test case specifies the inputs, outcomes, and conditions for execution of a test
  • A test case is identified and included in a Test Case Specification (TCS) as part of an ITS project's overall Test Plan
  • This module teaches agencies how to prepare test case documentation
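The three elements a test case specifies (inputs, outcomes, and execution conditions) can be sketched as a simple record. This is a minimal illustration with hypothetical field and message names, not a format defined by IEEE Std 829:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """Illustrative test case record; field names are hypothetical."""
    case_id: str                 # identifier used in the Test Case Specification
    objective: str               # what the test case verifies
    inputs: dict                 # data sent to the system under test
    expected_outcomes: dict      # predicted results
    execution_conditions: list = field(default_factory=list)

# Example instance (message names borrowed from the TMDD dialog shown later):
tc = TestCase(
    case_id="TC-001",
    objective="Verify link status request/response",
    inputs={"message": "linkStatusRequestMsg"},
    expected_outcomes={"message": "linkStatusMsg"},
    execution_conditions=["Owner Center responds upon receipt of the request"],
)
```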

 

Slide 17:

Learning Objective #1

Approaches to Preparing Project Testing Documentation

  1. ITS Standards Approach
  2. IEEE Std 829 Approach

Various cover pages. Please see the Extended Text Description below.

(Extended Text Description: This slide shows cover pages for: 1) IEEE standard 829-2008 IEEE Standard for Software and System Test Documentation, 2) NTCIP (National Transportation Communications for ITS Protocol) 8007 Testing and Conformity Assessment Documentation within NTCIP Standards Publications, 3) NTCIP 1204 Environmental Sensor Station (ESS) Interface Protocol. These are examples of project testing documentation.)

 

Slide 18:

Learning Objective #1

IEEE Std 829 Testing Approach

  • IEEE approach is applicable to all devices
    • Separates test cases from test procedures, which allows re-use of procedures
    • Includes a test plan and a method to split testing into test designs
    • Includes test reports
  • IEEE approach can be more broadly applied (common format) across ITS, including center-to-center and center-to-field standards

 

Slide 19:

Learning Objective #1

What does IEEE Std 829 Provide?

  • Guidance and formats for preparing testing documentation:
    • Test Plan
    • Test Design Specification
    • Test Case Specification
    • Test Procedure Specification
    • Test Reports
  • Test Logs
  • Test Anomaly Report
  • Test Report
  • Testing professionals across ITS are familiar with these definitions and formats

This slide shows the cover page of the IEEE Standard for Software and System Test Documentation. For illustration only.

 

Slide 20:

Learning Objective #1

Testing Documentation Structure (IEEE Std 829)

This slide shows a diagram with test documentation in a layered diagram. Please see the Extended Text Description below.

(Extended Text Description: This slide shows a diagram with test documentation in a layered diagram. Three boxes made of dashed lines run from top to bottom along the left side. The first box contains the text – Test Plan describes the overall approach to testing. The second box contains the text – Test Design Specification describes which requirements are to be tested and associated test cases. The third box contains the text – Test Case Specification identifies objectives and inputs, outcomes, and conditions for execution of a test. Each box points to its corresponding place on the diagram to the right. The layers from top to bottom are labeled: Test Plan, Test Design Specification, Test Case Specification, Test Procedure Specification, Test Execution, Test Reports.
The Test Plan layer contains a large dashed rectangle surrounding four other rectangles. Each rectangle is labeled differently. They are, from left to right, Unit Test, Integration Test, System Acceptance Test, and Periodic Maintenance Test. The Test Design Specification layer contains four boxes, each connecting to a corresponding box on the previous layer. They are labeled, from left to right, Test Design Unit Test, Test Design Integ. Test, Test Design Sys. Acceptance, and Test Design Periodic Mtce. The Test Case Specification layer contains five boxes that are labeled Test Case 001, Test Case 002, Test Case 003, Test Case 004, and Test Case N. Each Test Case is connected to multiple boxes in the previous layer. Test Case 001 is connected to Test Design Unit Test and Test Design Integ. Test. Test Case 002 is connected to Test Design Unit Test, Test Design Integ. Test, and Test Design Sys. Acceptance. Test Case 003 is connected to Test Design Integ. Test and Test Design Sys. Acceptance. Test Case 004 is connected to Test Design Sys. Acceptance and Test Design Periodic Mtce. Test Case N is connected to Test Design Periodic Mtce. The Test Procedure Specification layer contains two rectangles containing the text Test Procedure 001 and Test Procedure 002. Test Procedure 001 links back to Test Cases 001 – 003. Test Procedure 002 links back to Test Case 004 and N. The Test Execution layer has a single dashed oval labeled Test Plan Execution (Process) and is linked back to both Test Procedures. The final layer is Test Reports and contains three text boxes. The top two are labeled Test Logs and Test Incident Reports and they connect back to Test Plan Execution. These boxes also lead down to the final box via arrows and this final box is labeled Test Plan Execution Summary Report.)

 

Slide 21:

Learning Objective #1

Testing Documentation Structure (cont.)

This slide shows a diagram with test documentation in a layered diagram. Please see the Extended Text Description below.

(Extended Text Description: This slide contains the same layered diagram as the previous slide, but adds additional definitions of two layers. A text box containing the text - Test Procedure Specification defines the steps to execute a test. Multiple Test Cases may reference a single Test Procedure. Test Procedures may be more costly to develop than Test Cases. – points to the Test Procedure Specification layer. A text box with the text - Test Reports: Test Logs, Test Anomaly Reports, Test Report – points to the Test Reports layer.)

 

Slide 22:

Learning Objective #1

Key Differences Between the Two Approaches

  • IEEE standard approach is applicable to all ITS standards including C2C and C2F
  • IEEE standard approach separates test cases from test procedures, while previous efforts, such as the NTCIP 8007 information report, combined both
  • IEEE standard approach allows re-use of test procedures, where agencies typically place more effort
  • IEEE standard approach includes a test plan and method to split testing into test designs, and includes test reports

 

Slide 23:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 24:

Learning Objective #1

Which of the following IEEE Std 829-based components describes data inputs and outputs to be tested?

Answer Choices

  1. Test Plan
  2. Test Case Specification
  3. Test Design Specification
  4. Test Procedure Specification

 

Slide 25:

Learning Objective #1

Review of Answers

A small graphical red and yellow X representing incorrect. a) Test Plan
Incorrect. The Test Plan describes the overall approach to testing.

A small graphical green and yellow check mark representing correct. b) Test Case Specification
Correct! The Test Case Specification focuses on data input and output requirements to be tested.

A small graphical red and yellow X representing incorrect. c) Test Design Specification
Incorrect. The Test Design Specification specifies the requirements to be tested and which test cases are associated with which requirements.

A small graphical red and yellow X representing incorrect. d) Test Procedure Specification
Incorrect. The Test Procedure Specification outlines the steps to execute a test.

 

Slide 26:

Learning Objective #1

Summary of Learning Objective #1

Review the Role of Test Cases Within the Overall Testing Process

  • Reviewed test documentation structure as defined in IEEE Std 829
  • Discussed test cases in relationship to test plans, test designs, and test procedures
  • Reviewed ITS standards testing approaches and the IEEE Std 829-based testing approach, including key differences

 

Slide 27:

Learning Objective #2: Discuss ITS Data Structures Used in NTCIP and Center-to-Center Standards (TMDD) and Provide Examples

  • Review data structure of ITS information and provide examples
  • Discuss how a test case verifies the correct structure of the data as specified in the standards
  • Discuss how a test case verifies the correct value of the data (range-syntax) and data types to conform to the standards

 

Slide 28:

Learning Objective #2

Example: Information Exchange Between ITS Centers

  • Centers exchange information using Dialogs
  • Dialogs contain Messages
  • Messages are formed with Data Frames and Data Elements

Example: Information Exchange Between ITS Centers. Please see the Extended Text Description below.

(Extended Text Description: This slide, entitled Example: Information Exchange Between ITS Centers, shows a diagram at left that will be described in detail in future slides. The graphic shows a message exchange sequence at top, referred to as a ‘Dialog’. At top, a line with an arrow at the tip labeled ‘Message A’ goes from an External Center to an Owner Center. Message A is a request message. Another line with an arrow at the tip labeled ‘Message B’ is directly below the previous line described goes between the Owner Center and the External Center. Message B is a response message. The top part of the graphic shows that a dialog consists of a request message followed by a response message. Below the dialogs are a series of different colored ovals. The first set of two orange ovals are labeled Message A and Message B. Lines lead from them to aqua colored ovals, all labeled Data Frame. Lines lead from the Data Frame ovals to blue Data Element ovals.)

 

Slide 29:

Learning Objective #2

What does Testing Verify? (Information Exchange Standards)

  • Testing verifies the correct sequence of information being exchanged:
    • Standardized Dialogs specify the correct sequence of information exchanges

What does Testing Verify? Please see the Extended Text Description below.

(Extended Text Description: This slide, entitled What does Testing Verify? (Information Exchange Standards), has a larger version of the top portion of the graphic from the previous slide. It shows the same External Center and Owner Center in silver boxes, side by side. An arrow labeled Message A connects the boxes from External Center to Owner Center. An arrow below that labeled Message B connects the boxes from Owner Center to External Center.)

 

Slide 30:

Learning Objective #2

Verifying the Correct Structure of Information

  • ITS standards specify the exact, tree-like structure of information

This slide shows a graphic that illustrates the inverted tree-like structure of information contained in messages that are exchanged between systems. Please see the Extended Text Description below.

(Extended Text Description: This slide shows a graphic that illustrates the inverted tree-like structure of information contained in messages that are exchanged between systems. The top of the graphic shows a set of two orange ovals are labeled Message A and Message B. Text to the left indicates that this is the "Root" of the Message. Lines lead from them to aqua colored ovals, all labeled Data Frame. This level is labeled "Branches". Lines lead from the Data Frame ovals to blue Data Element ovals. These Data Elements are the "Leaves".)

 

Slide 31:

Learning Objective #2

Data Structure Is Tree-Like (Hierarchical)

  • Messages (Root level)
    • Root element in the hierarchy of data exchanged between centers
    • A message is made up of data frames and data elements
  • Data Frames (Branch level)
    • Reusable bundles of data elements and other data frames
  • Data Elements (Leaf level)
    • Leaves in the hierarchy of data structure
    • Provide value constraints for data content
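The Message/Data Frame/Data Element hierarchy above can be illustrated with nested structures. This is a sketch with hypothetical frame and element names, not taken from any standard:

```python
# Root level: the message. Branch level: data frames. Leaf level: data
# elements carrying the actual values. All names here are illustrative.
dialog = {
    "request": {                         # Message A (root)
        "header": {                      # data frame (branch)
            "organization-id": "EC-01",  # data element (leaf)
        },
    },
    "response": {                        # Message B (root)
        "status": {                      # data frame (branch)
            "link-id": "L-100",          # data elements (leaves)
            "travel-time": 95,
        },
    },
}

# Walking the tree from root (message) through a branch (frame) to a leaf:
travel_time = dialog["response"]["status"]["travel-time"]
```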

 

Slide 32:

Learning Objective #2

How?

  • Test Case Specification (TCS) Identifies:
    • Inputs (Message A)
    • Outcomes-Predicted results (Message B)
    • Execution conditions (sequence): Owner Center responds with Message B upon receipt of Message A from External Center

This slide has the same graphic from slide twenty-nine. Please see the Extended Text Description below.

(Extended Text Description: This slide has the same graphic from slide twenty-nine. It shows the same External Center and Owner Center in silver boxes, side by side. An arrow labeled Message A connects the boxes from External Center to Owner Center. An arrow below that labeled Message B connects the boxes from Owner Center to External Center.)

 

Slide 33:

Learning Objective #2

Example: Center-to-Center Dialog

  • Only specified sequence of messages and combinations are valid
    • linkStatusRequestMsg is used to make the request
    • linkStatusMsg contains the response

This slide replaces the generic terms shown in the previous slide, namely the Dialog, message A, and message B, with actual dialog and messages from the TMDD standard. Please see the Extended Text Description below.

(Extended Text Description: This slide replaces the generic terms shown in the previous slide, namely the Dialog, message A, and message B, with actual dialog and messages from the TMDD standard. The graphic shows an illustration labelled "Dialog: linkStatusRequest". The box on the left, "External Center" is connected via an arrow labeled linkStatusRequestMsg to the "Owner Center" box on the right. An arrow labeled linkStatusMsg goes back from "Owner Center" to "External Center".)

 

Slide 34:

Learning Objective #2

Example: Center-to-Center Data Structure of linkStatusRequestMsg

  • linkStatusRequestMsg is of type TrafficNetworkInformationRequest

This slide contains a graphic showing the hierarchy of the linkStatusRequest message. Please see the Extended Text Description below.

(Extended Text Description: This slide contains a graphic showing the hierarchy of the linkStatusRequest message. A Legend in the bottom right shows that elements drawn with a solid outline represents mandatory elements, while a dashed outline indicates an optional element. A box with the text "TrafficNetworkInformationReque" starts the process. Text underneath this box says "<objectClass>TransportationNetwork</objectClass>". A line from the right of that box branches out to show the elements of a message in a specific order. Authentication, first; is in a dashed box with the text "<requirement>REQ1408</requirement>" underneath. Following that is a solid box labeled "organization-requesting" with the text "<requirement>REQ212</requirement>" underneath. That is followed by a solid box labeled "network-information-type" with the text "<requirement>REQ212</requirement>" underneath. Next is a dashed box labeled "network-identifiers" with the text "<requirement>REQ1178</requirement>" underneath. Next is a dashed box labeled "roadway-network-id-list" with the text "<requirement>REQ1177</requirement>" underneath. Finally, a dashed box labeled "any ##other" makes up the last step.)
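As a rough illustration of checking element structure against such a definition, the sketch below builds a hypothetical XML instance containing only the two mandatory elements shown in the diagram (organization-requesting and network-information-type) and confirms they appear in the defined order. Tag spellings follow the slide; the checking logic is illustrative, not part of TMDD:

```python
import xml.etree.ElementTree as ET

# Hypothetical instance of the request message with only the mandatory
# elements present (the optional elements from the diagram are omitted).
doc = ET.fromstring(
    "<trafficNetworkInformationRequest>"
    "<organization-requesting>EC-01</organization-requesting>"
    "<network-information-type>link status</network-information-type>"
    "</trafficNetworkInformationRequest>"
)

# A structural check: the mandatory elements must be present, in the
# order the standard's data structure defines.
mandatory = ["organization-requesting", "network-information-type"]
present = [child.tag for child in doc]
order_ok = present == mandatory
```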

 

Slide 35:

Learning Objective #2

Constraints on Content of Data Values

  • Device Standards: Testing verifies the correct value of object instance and protocol data units (PDUs)
  • ITS standards use XML format for C2C information exchange data and ASN.1 for C2F NTCIP device data
  • Typical data value constraints are:
    • Data type such as text or number
    • Enumerations such as a list of valid values
    • Text length
    • Numerical value ranges such as 0-255 in NTCIP Objects

(Note: some devices use the value 0 to turn a device OFF and 1 to turn it ON; the Ramp Meter Control standard uses a 1-255 range.)
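The kinds of value constraints listed above (data type, enumeration, text length, numeric range) can each be expressed as a small check. This is an illustrative sketch, not drawn from any particular MIB:

```python
def check_range(value, low, high):
    """Data type plus numeric range constraint, e.g., 0-255 for many NTCIP objects."""
    return isinstance(value, int) and low <= value <= high

def check_enum(value, allowed):
    """Enumeration constraint: value must be one of a list of valid values."""
    return value in allowed

def check_text_length(text, max_len):
    """Text length constraint."""
    return isinstance(text, str) and len(text) <= max_len
```

A test case would apply checks like these to each data element in a received message and fail the test if any constraint is violated.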

 

Slide 36:

Learning Objective #2

Example: Center-to-Field Device (ESS)

  • Only integer values are valid: values 1 through 12.

Example: Center-to-Field Device (ESS). Please see the Extended Text Description below.

(Extended Text Description: There are two dashed line boxes on the left, one over another. The top box contains the text "Constraint: Number Value SYNTAX INTEGER" and an arrow leads to the text "SYNTAX INTEGER" on the right. The second box contains the text "Constraint: Value Range 1 to 12" and an arrow leads to text "other (1)" on the right.
This is the text that makes up the right hand side of the image:
5.6.10.10 Wind Sensor Situation
windSensorSituation OBJECT-TYPE
SYNTAX INTEGER {
other (1), unknown (2), calm (3), lightBreeze (4), moderateBreeze
(5), strongBreeze (6), gale (7), moderateGale (8), strongGale (9),
stormWinds (10), hurricaneForceWinds (11), gustyWinds (12)}
ACCESS read-only
STATUS mandatory
DESCRIPTION "<Definition>Describes the weather and travel
situation in terms of wind from staffed stations only. Specific
ranges for these values are defined in the Glossary of
Meteorology.
<DescriptiveName>WindSensor.situation:code
gustyWinds defined by a peak and a lull of greater than 46.3
tenths of meters per second within a 2 minute period.
<Data Concept Type>Data Element"
::= { windSensorEntry 10 }

)

Source: NTCIP 1204 v03
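The enumeration in the excerpt above can be mirrored directly in code to validate received values. The names and numbers below transcribe the windSensorSituation values shown; the helper function is an illustrative sketch:

```python
from enum import IntEnum

class WindSensorSituation(IntEnum):
    """Values transcribed from the NTCIP 1204 v03 excerpt above."""
    other = 1
    unknown = 2
    calm = 3
    lightBreeze = 4
    moderateBreeze = 5
    strongBreeze = 6
    gale = 7
    moderateGale = 8
    strongGale = 9
    stormWinds = 10
    hurricaneForceWinds = 11
    gustyWinds = 12

def is_valid_wind_situation(value: int) -> bool:
    """A test case check: only values 1 through 12 conform to the object's SYNTAX."""
    try:
        WindSensorSituation(value)
        return True
    except ValueError:
        return False
```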

 

Slide 37:

Learning Objective #2

Examples of Constraints on Data Structure

  • C2F NTCIP Devices:
    • Management Information Base (MIB)
    • ASN.1 Object Value specification
  • C2C TMDD (Volume II - Design):
    • XML Value specification
    • ASN.1 Value specification
  • Data Structure:
    • Message
    • Data Frames
    • Data Elements

 

Slide 38:

Learning Objective #2

What is the Purpose of a Test Case?

  • To verify the requirements related to information exchanged between two systems by:
    • Verifying the sequence of information exchanged is correct:
      • Standards use dialogs to define information exchange sequence
    • Verifying the structure of information exchanged is correct
      • Standards define the order of Messages-Data Frames-Data Elements
    • Verifying the content of information exchanged is correct
      • Standards define the valid value rules (e.g., value ranges) for data exchanged
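The three verifications above (sequence, structure, content) might be sketched as follows, using the linkStatusRequest dialog from the earlier slides. The element name and checking logic are illustrative simplifications, not TMDD definitions:

```python
def verify_response(sent, received):
    """Run the three checks a test case performs on a response message."""
    checks = {}
    # 1. Sequence: the dialog defines which response follows which request.
    checks["sequence"] = (sent["msg"] == "linkStatusRequestMsg"
                          and received["msg"] == "linkStatusMsg")
    # 2. Structure: mandatory elements must be present in the message body
    #    ("link-travel-time" is a hypothetical element name).
    body = received.get("body", {})
    checks["structure"] = "link-travel-time" in body
    # 3. Content: element values must satisfy the standard's value rules.
    tt = body.get("link-travel-time", -1)
    checks["content"] = isinstance(tt, int) and tt >= 0
    return checks
```

A test passes only when all three checks succeed; a failure in any one pinpoints whether the sequence, structure, or content of the exchange was wrong.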

 

Slide 39:

Learning Objective #2

Relationship of Test Case to Requirements

  • Purpose: To test monitoring capability stated by a requirement
  • A Requirement to Test Case Traceability Matrix (RTCTM) relates the test case to requirement(s) being tested.

NTCIP 1204 v03.08 Page 154

Requirement                                        Test Case
ID         Title                                   ID          Title
3.5.1.1.2  Retrieve Compressed Station Metadata    C.2.3.1.2   Retrieve Compressed Station Metadata
3.5.1.1.3  Configure ESS Manager                   C.2.3.1.1   ESS Characteristics
3.5.1.2    ESS Status Monitoring Requirements
3.5.1.2.1  Retrieve ESS Door Status                C.2.3.1.3   Retrieve ESS Door Status
3.5.1.2.2  Retrieve Battery Status                 C.2.3.1.4   Retrieve Battery Status

 

Slide 40:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 41:

Learning Objective #2

Which of the following defines the structure and data content of inputs and outputs?

Answer Choices

  1. Data Dictionary Standard (e.g., NTCIP 1204 ESS, TMDD)
  2. Protocol Requirements List (PRL)
  3. Requirements to Test Case Traceability Matrix (RTCTM)
  4. All of the above

 

Slide 42:

Learning Objective #2

Review of Answers

A small graphical green and yellow check mark representing correct. a) Data Dictionary (e.g., NTCIP 1204 ESS, TMDD)
Correct! A data dictionary specifies the structure of data and constraints of valid values for data content.

A small graphical red and yellow X representing incorrect. b) Protocol Requirements List (PRL)
Incorrect. The PRL traces requirements to needs, and allows you to specify optional requirements for a specific project.

A small graphical red and yellow X representing incorrect. c) Requirements to Test Case Traceability Matrix (RTCTM)
Incorrect. The RTCTM traces test cases to the requirements the test case verifies.

A small graphical red and yellow X representing incorrect. d) All of the above
Incorrect. Only a) above is correct.

 

Slide 43:

Learning Objective #2

Summary of Learning Objective #2

Discuss ITS Data Structures used in NTCIP and Center-to-Center Standards (TMDD) and Provide Examples

  • Reviewed data structure of ITS information with examples
  • Discussed how a test case verifies the correct structure of the data as specified in the standards
  • Discussed how a test case verifies the correct value of the data (range-syntax) and data types to conform to the standards

 

Slide 44:

Learning Objective #3: Find Information Needed for a Test Case

  • What information is needed:
    • Relevant User Needs for Project
    • Relevant Requirements
    • Relevant Design (dialogs, data elements, and valid values)
  • Where to find content for a Test Case for C2C standards (TMDD)
  • Where to find content for a Test Case for C2F standards (NTCIP)

 

Slide 45:

Learning Objective #3

Where to Find C2C Standards Content

  • Requirements and dialogs are identified by project level Needs to Requirements Traceability Matrix (NRTM)
  • Dialogs identify the inputs and outputs needed to develop the test case specification

TMDD v03 NRTM

Each project tailors this matrix

UN ID User Need Reqmt Type Req ID Requirement Conformance Support
2.5.2.2 Travel Time Data for Roads Optional Yes/No
    Dialogs for Link Based Information
      3.5.3.3.2.1 Send Link Status Information Upon Request M Yes/No/NA
      3.5.3.3.2.2 Publish Link Status Information Subscription O Yes/No/NA
      3.5.3.3.2.3 Subscribe to Link Status Information Subscription O Yes/No/NA
    Request Message
      3.5.3.3.2.4 Contents of the Link Status Request M Yes
      3.5.3.1.1 Contents of the Traffic Network Information Request M Yes
      3.5.3.1.1.1 Required Traffic Network Information Request Content M Yes
      3.5.3.1.1.2.1 Authentication O Yes/No
      3.5.3.1.1.2.1.1 Operator Identifier O Yes/No
      3.5.3.1.1.2.2 Roadway Network Identifier O Yes/No
      3.5.3.1.1.2.3 Traffic Network Identifier O Yes/No
    Response Message
      3.5.3.3.2.5 Contents of the Link Status Information M Yes
      3.5.3.3.2.5.1 Required Link Status Information Content M Yes
      3.5.3.3.2.5.2.1 Restrictions O Yes/No
      3.5.3.3.2.5.2.2 Link Name O Yes/No
      3.5.3.3.2.5.2.3 Link Direction O Yes/No
      3.5.3.3.2.5.2.4 Link Travel Time M Yes/No
      3.5.3.3.2.5.2.1.1 Status Date and Time Change Information O Yes/No
    Error Report Message
      3.4.4.1 Contents of the Error Report M Yes
      3.4.4.1.1 Required Error Report Contents M Yes
      3.4.4.1.2.1 Restrictions O Yes/No

 

Slide 46:

Learning Objective #3

Example of a Project Level NRTM

UN ID User Need Reqmt Type Req ID Requirement Conformance Support
2.5.2.2 Travel Time Data for Roads Optional Yes/No
    Dialogs for Link Based Information
      3.5.3.3.2.1 Send Link Status Information Upon Request M [Yes]/No/NA
      3.5.3.3.2.2 Publish Link Status Information Subscription O Yes/[No]/NA
      3.5.3.3.2.3 Subscribe to Link Status Information Subscription O Yes/[No]/NA
    Request Message
      3.5.3.3.2.4 Contents of the Link Status Request M [Yes]
      3.5.3.1.1 Contents of the Traffic Network Information Request M [Yes]
      3.5.3.1.1.1 Required Traffic Network Information Request Content M [Yes]
      3.5.3.1.1.2.1 Authentication O [Yes]/No
      3.5.3.1.1.2.1.1 Operator Identifier O [Yes]/No
      3.5.3.1.1.2.2 Roadway Network Identifier O [Yes]/No
      3.5.3.1.1.2.3 Traffic Network Identifier O [Yes]/No
    Response Message
      3.5.3.3.2.5 Contents of the Link Status Information M [Yes]
      3.5.3.3.2.5.1 Required Link Status Information Content M [Yes]
      3.5.3.3.2.5.2.1 Restrictions O Yes/[No]
      3.5.3.3.2.5.2.2 Link Name O [Yes]/No
      3.5.3.3.2.5.2.3 Link Direction O Yes/[No]
      3.5.3.3.2.5.2.4 Link Travel Time M* [Yes]/No
      3.5.3.3.2.5.2.1.1 Status Date and Time Change Information O Yes/[No]
    Error Report Message
      3.4.4.1 Contents of the Error Report M [Yes]
 

Slide 47:

Learning Objective #3

A Section of NRTM Tailored For Project-Specific Needs

UN ID User Need Reqmt Type Req ID Requirement Conformance Support
2.5.2.2 Travel Time Data for Roads Optional Yes/No
    Dialogs for Link Based Information
      3.5.3.3.2.1 Send Link Status Information Upon Request M [Yes]/No/NA
      3.5.3.3.2.2 Publish Link Status Information Subscription:O Yes/[No]/NA
      3.5.3.3.2.3 Subscribe to Link Status Information Subscription:O Yes/[No]/NA

Section of the tailored RTM that corresponds with the requirements identified from the NRTM above. Section covering User Needs 2.5.2.2 is shown below.

  RTSMIP-DXFS Requirement ID Requirement DC Type TMDD Vol II DC Instance Name TMDD Vol II DC ID TMDD Vol II DC Class Name
  3.5.3.3.2.1 Send Link Status Information Upon Request dialog dlLinkStatusRequest 3.1.13.2 dlLinkStatusRequest
  3.5.3.3.2.2 Publish Link Status Information dialog dlLinkStatusUpdate 3.1.34.2 dlLinkStatusUpdate
  3.5.3.3.2.3 Subscribe to Link Status Information dialog dlTrafficNetworkInformationSubscription 3.1.19.1 dlTrafficNetworkInformationSubscription

 

Slide 48:

Learning Objective #3

Where to Find C2F Standards Content

  • Requirements are identified by project level Protocol Requirements List (PRL)
  • NTCIP SEP standards such as DMS provide a PRL
  • Non-SEP standards such as CCTV must develop a project PRL
USER NEED SECTION NUMBER USER NEED FR SECTION NUMBER FUNCTIONAL REQUIREMENT CONFORMANCE SUPPORT/ PROJECT REQUIREMENT ADDITIONAL PROJECT REQUIREMENTS
2.5.2.3 Control the Sign Face M Yes  
2.5.2.3.1 Activate and Display a Message M Yes  
    3.5.2.3.1 Activate a Message M Yes  
    3.5.2.3.3.5 Retrieve Message M Yes  
    3.5.2.3.6 Activate a Message with Status Drum:M Yes/NA  
    3.6.5 Supplemental Requirements for Message Activation Request M Yes
    3.6.7 Supplemental Requirements for Locally Stored Messages M Yes

 

Slide 49:

Learning Objective #3

Where to Find C2F Standards Content (cont.)

  • Requirements Traceability Matrix (RTM) references relevant design content needed to define the inputs and outputs for the test case specification
  • NTCIP SEP standards such as DMS provide an RTM
  • Non-SEP standards such as CCTV must develop a project RTM
Requirements Traceability Matrix (RTM)
FR ID Functional Requirement Dialog ID Object ID Object Name Additional Specifications
3.5.2.3 Control the Sign Face
3.5.2.3.1 Activate a Message 4.2.3.1  
      5.7.3 dmsActivateMessage  
      5.11.2.1.1 shortErrorStatus  
      5.7.17 dmsActivateMsgError
      5.7.24 dmsActivateErrorMsgCode
      5.7.18 dmsMultiSyntaxError  
      5.7.19 dmsMultiSyntaxErrorPosition  
      5.7.20 dmsMultiOtherErrorDescription  

 

Slide 50:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 51:

Learning Objective #3

Which of the following will provide information on project needs for a C2C project?

Sources of Information:

  1. Needs to Requirements Traceability Matrix (NRTM)
  2. Requirements to Test Case Traceability Matrix (RTCTM)
  3. Requirements Traceability Matrix (RTM)
  4. Design (dialogs, data elements, valid values)

 

Slide 52:

Learning Objective #3

Review of Answers

A small graphical green and yellow check mark representing correct. a) Needs to Requirements Traceability Matrix (NRTM)
Correct! NRTM identifies project needs for a C2C project.

A small graphical red and yellow X representing incorrect. b) Requirements to Test Case Traceability Matrix (RTCTM)
Incorrect. The RTCTM traces test cases to the requirements the test case verifies.

A small graphical red and yellow X representing incorrect. c) Requirements Traceability Matrix (RTM)
Incorrect. The RTM traces requirements to design objects for a C2F NTCIP standards-based project.

A small graphical red and yellow X representing incorrect. d) Design (dialogs, data elements, valid values)
Incorrect. Project design does not trace to project needs.

 

Slide 53:

Learning Objective #3

Summary of Learning Objective #3

Find Information Needed for a Test Case

  • Reviewed content sources for test case information for a C2C standard such as TMDD
  • Reviewed content sources for test case information for C2F standards such as NTCIP-ESS

 

Slide 54:

Learning Objective #4: Explain Test Case Development

  • Outline of a test case:
    • Suggested template
    • Required content
  • Where do we find information for the test case template?
    • Center-to-Center Standards (C2C)
    • Center-to-Field Standards (C2F)
  • Discuss Positive/Negative Testing
  • Additional test case Requirements

 

Slide 55:

Learning Objective #4

Outline of a Test Case-Suggested Template (IEEE Std 829)

  • Required Content of a test case:
    • Test case identifier
    • Objective
    • Inputs
    • Outcomes
    • Environmental needs
    • Special procedural requirements
    • Intercase dependencies
Test Case  
ID:  
Objective:  
Inputs:  
Outcome(s):  
Environmental Needs:  
Tester/Reviewer:  
Special Procedure Requirements:  
Intercase Dependencies:  
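The template fields above map naturally onto a small data structure. A minimal Python sketch, with field names paraphrasing the IEEE Std 829 items (the defaults are illustrative, not normative):

```python
from dataclasses import dataclass, field

# Minimal container mirroring the IEEE Std 829 test case fields listed
# above; field names are paraphrased and defaults are illustrative.
@dataclass
class TestCase:
    identifier: str
    objective: str
    inputs: str
    outcomes: str
    environmental_needs: str = "No additional needs outside the test plan"
    special_procedure_requirements: str = "None"
    intercase_dependencies: list = field(default_factory=list)

tc = TestCase(
    identifier="TC001",
    objective="Verify Link Status Request-Response dialog (positive test case)",
    inputs="linkStatusRequest.xml (see TCIS001)",
    outcomes="Correct message sequence, structure, and values (see TCOS001)",
)
print(tc.identifier)  # TC001
```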

 

Slide 56:

Learning Objective #4

Test Case Identifier

  • Each test case requires a unique identifier to distinguish it from all other test cases.

This image contains a table with a graphical element. Please see the Extended Text Description below.

(Extended Text Description: This image contains a table with a graphical element. This text is as follows:

Note that the Text "ID: TC001" is circled in red.

Test Case  
ID: TC001 Title: Link Status Request-Response Dialog Verification (Positive Test Case)
Objective: To verify system interface implements (positive test case) requirements for:
1) Link Status Request-Response Dialog message exchange
2) Contents of the Link Status Request Message
3) Contents of the Link Status Information Message
The test case verifies that the dialog, request message content, and response message content are correct by sending a request message (verified to be correct) across the system interface, and verification that the response message is correct. Input and output specifications are provided to verify the request and response message are correct per the requirements for the request and response message.
Inputs: Use the input file linkStatusRequest.xml. See Test Case Input Specification TCIS001 - LinkStatusRequest (Positive Test Case).
Outcome(s): All data are returned and verified as correct: correct sequence of message exchanges, structure of data, and valid value of data content. See Test Case Output Specification TCOS001 LinkStatusInformation (Positive Test Case)
Environmental Needs: No additional needs outside of those specified in the test plan.
Tester/Reviewer: M.I.
Special Procedure Requirements: None
Intercase Dependencies: None

)

 

Slide 57:

Learning Objective #4

Test Case Objective

  • Purpose: The objective identifies the purpose of the test case
  • Focus: Describe the special focus of a particular test case and relation to other test cases
  • Priority: Test case priority

 

Slide 58:

Learning Objective #4

Test Case Objective: Focus

  • Whether TC is for testing a dialog (i.e., correct sequence of message exchanges)
  • Whether TC is testing correct structure and content of data
  • Intercase dependencies
    • An example of an intercase dependency is when a test case to verify a publication dialog must be preceded by a complete and correct subscription dialog

 

Slide 59:

Learning Objective #4

Test Case Objective: Priority

  • Identifies the relative importance of accomplishing certain test cases in advance of others
  • Priority is project specific
  • Examples:
    • Specify the order of which devices to test (e.g., CCTV first, DMS next, etc.)
    • Specify that inventory and status dialogs shall be tested first, followed by the testing of device control dialogs
    • Specify that request-response dialogs shall be tested first, followed by subscription-publication dialogs
    • Specify that positive test cases shall be tested first, followed by negative test cases

 

Slide 60:

Learning Objective #4

Test Case Inputs

  • Specify each input required to execute each test case:
    • Some inputs will be specified by value (with tolerances where appropriate)
    • Some others such as constant tables or transaction files will be specified by name
    • Specify each input and timing of input(s) required to execute the test case

 

Slide 61:

Learning Objective #4

Example Test Case Input Specification

Test Case Input Specification
ID: TCIS001 Title: LinkStatusRequest (Positive Test Case)
Data Concept Name (Variable) Data Concept Type Value Domain
trafficNetworkInformationRequestMsg Message  
- organization-requesting Data Frame  
- organization-id Data Element IA5String (SIZE(1..32))
- organization-name Data Element IA5String (SIZE(1..128))
- network-information-type Data Element 1 = "node inventory"
2 = "node status"
3 = "link inventory"
4 = "link status"
5 = "route inventory"
6 = "route status"
7 = "network inventory"
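A test harness can check a candidate request against these value domains before it is sent. A minimal sketch in Python, assuming the message has already been decoded into a dictionary (the function name and dictionary keys are illustrative; the IA5String sizes and the enumerated list come from the table above):

```python
# Check a decoded request message against the TCIS001 value domains above.
# The function name and dictionary keys are illustrative; the IA5String
# sizes and the enumerated list are taken from the table.

NETWORK_INFORMATION_TYPES = {
    1: "node inventory", 2: "node status", 3: "link inventory",
    4: "link status", 5: "route inventory", 6: "route status",
    7: "network inventory",
}

def validate_request(msg):
    """Return a list of value-domain violations (empty list = valid)."""
    errors = []
    if not 1 <= len(msg.get("organization-id", "")) <= 32:
        errors.append("organization-id must be an IA5String of size 1..32")
    if not 1 <= len(msg.get("organization-name", "")) <= 128:
        errors.append("organization-name must be an IA5String of size 1..128")
    if msg.get("network-information-type") not in NETWORK_INFORMATION_TYPES.values():
        errors.append("network-information-type is not one of the enumerated values")
    return errors

# An in-range request passes with no violations:
print(validate_request({
    "organization-id": "ORG001",
    "organization-name": "Test Organization",
    "network-information-type": "link status",
}))  # []
```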

 

Slide 62:

Learning Objective #4

Test Case Outcome(s)

  • Outcomes specify all outputs and the expected behavior (e.g., response time) required of the test items
  • Provides representative value(s) (with tolerances where appropriate) for each required output and expected behavior

 

Slide 63:

Learning Objective #4

Example Test Case Output Specification

Test Case Output Specification
ID: TCOS001 Title: LinkStatusInformation (Positive Test Case)
Data Concept Name (Variable) Data Concept Type Value Domain
linkStatusMsg Message  
- link-status-item Data Frame  
- organization-information Data Frame  
- organization-id Data Element IA5String (SIZE(1..32))
- organization-name Data Element IA5String (SIZE(1..128))
- link-status-list Data Frame  
-link Data Frame  
- network-id Data Element IA5String (SIZE(1..32))
- link-id Data Element IA5String (SIZE(1..32))
- link-name Data Element IA5String (SIZE(1..128))
- link-status Data Element 1 = "no determination"
2 = "open"
3 = "restricted"
4 = "closed"
- travel-time Data Element INTEGER (0..65535), units=seconds
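As with the input specification, the output value domains can be checked mechanically. A minimal Python sketch, assuming each link-status item has been decoded into a dictionary (names are illustrative; the enumeration, string sizes, and travel-time range come from the table above):

```python
# Check one decoded link-status item against the TCOS001 value domains
# above. Names are illustrative; the enumeration, string sizes, and the
# travel-time range come from the table.

LINK_STATUS_VALUES = {1: "no determination", 2: "open", 3: "restricted", 4: "closed"}

def validate_link_status(item):
    """Return a list of value-domain violations (empty list = valid)."""
    errors = []
    for name, limit in (("network-id", 32), ("link-id", 32), ("link-name", 128)):
        if not 1 <= len(item.get(name, "")) <= limit:
            errors.append(f"{name} must be an IA5String of size 1..{limit}")
    if item.get("link-status") not in LINK_STATUS_VALUES.values():
        errors.append("link-status is not one of the enumerated values")
    tt = item.get("travel-time")
    if not isinstance(tt, int) or not 0 <= tt <= 65535:
        errors.append("travel-time must be INTEGER (0..65535), units=seconds")
    return errors

print(validate_link_status({
    "network-id": "NET01", "link-id": "L001", "link-name": "I-95 NB",
    "link-status": "open", "travel-time": 420,
}))  # []
```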

 

Slide 64:

Learning Objective #4

Sample Filled-in Test Case Specification

Test Case
ID: TC001 Title: Link Status Request-Response Dialog Verification (Positive Test Case)
Objective: To verify system interface implements (positive test case) requirements for:
1) Link Status Request-Response Dialog message exchange
2) Contents of the Link Status Request Message
3) Contents of the Link Status Information Message
The test case verifies that the dialog, request message content, and response message content are correct by sending a request message (verified to be correct) across the system interface, and verification that the response message is correct. Input and output specifications are provided to verify the request and response message are correct per the requirements for the request and response message.
Inputs: Use the input file linkStatusRequest.xml. See Test Case Input Specification TCIS001 - LinkStatusRequest (Positive Test Case).
Outcome(s): All data are returned and verified as correct: correct sequence of message exchanges, structure of data, and valid value of data content. See Test Case Output Specification TCOS001 - LinkStatusInformation (Positive Test Case)
Environmental Needs: No additional needs outside of those specified in the test plan.
Tester/Reviewer: M.I.
Special Procedure Requirements: None
Intercase Dependencies: None

 

Slide 65:

Learning Objective #4

Positive Test Case

  • Positive Test Case Inputs and Outputs include:
    • Data values within the range of values (or text length or format) specified in the standards
    • Data that are correctly structured as specified in the standard
    • All mandatory data values, including those optional elements in the standard made mandatory for a project

 

Slide 66:

Learning Objective #4

Positive Test Case Data Example

This slide contains an example TMDD Link Status Request Message. The format of the message is in XML. The text is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
    <authentication>
        <user-id>user</user-id>
        <password>pass</password>
    </authentication>
    <organization-requesting>
        <organization-id>ORG001</organization-id>
        <center-contact-list>
            <center-contact-details>
                <center-id>test</center-id>
            </center-contact-details>
        </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>
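A positive test case can verify this message programmatically before it is used as test input. A minimal sketch with Python's standard xml.etree.ElementTree, checking that mandatory elements from the example are present (the XML declaration is omitted so the message fits in a plain string literal):

```python
import xml.etree.ElementTree as ET

# The element names mirror the example request message above; the XML
# declaration is omitted so the message fits in a plain string literal.
request_xml = """<trafficNetworkInformationRequestMsg>
    <authentication>
        <user-id>user</user-id>
        <password>pass</password>
    </authentication>
    <organization-requesting>
        <organization-id>ORG001</organization-id>
        <center-contact-list>
            <center-contact-details>
                <center-id>test</center-id>
            </center-contact-details>
        </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>"""

root = ET.fromstring(request_xml)
# A positive test case expects every mandatory element to be present.
assert root.find("organization-requesting/organization-id").text == "ORG001"
assert root.find("network-information-type").text == "link inventory"
print("positive test case input parsed and verified")
```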

 

Slide 67:

Learning Objective #4

Negative Test Case

  • Negative Test Case inputs and outputs:
    • Include data values that are not within the range of values (or text length or format) specified in the standards.
    • May have data not correctly structured as specified in the standard
    • May have missing mandatory data elements, including those optional elements in the standard made mandatory for a project

 

Slide 68:

Learning Objective #4

Negative Test Case Data Example

Please see the Extended Text Description below.

(Extended Text Description: This slide contains an example TMDD Link Status Request Message. The format of the message is in XML. There are two boxes. The smaller one, on the left, is dashed and contains the text "Errors: 1. Invalid User Name and Password, 2. Missing mandatory element <organization-id>, and 3. Extra element <depreciation-method> not defined in TMDD or the project-specific NRTM."
The solid box on the right contains the following text:
<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
<!-- Error: Invalid User Name and Password -->
    <authentication>
        <user-id>user</user-id>
        <password>incorrectpass</password>
    </authentication>
    <organization-requesting>
            <!-- Error: Missing TMDD Mandatory Element: -->
            <!-- organization-id -->
            <center-contact-list>
                        <center-contact-details>
                            <center-id>test</center-id>
                    </center-contact-details>
            </center-contact-list>
            <!-- Error: Extra element not defined -->
            <depreciation-method>sum of the years digits
            </depreciation-method>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>)
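Errors 2 and 3 above (a missing mandatory element and an extra undefined element) can be detected with a simple structural check. A minimal Python sketch; the allowed-children set is drawn only from the elements used in these slides, not from the full TMDD schema:

```python
import xml.etree.ElementTree as ET

# Allowed and required children of <organization-requesting>, drawn only
# from the elements used in these slides (hypothetical, not the full
# TMDD schema).
ALLOWED_CHILDREN = {"organization-id", "organization-name", "center-contact-list"}
REQUIRED_CHILDREN = ("organization-id",)

fragment = ET.fromstring("""<organization-requesting>
    <center-contact-list>
        <center-contact-details><center-id>test</center-id></center-contact-details>
    </center-contact-list>
    <depreciation-method>sum of the years digits</depreciation-method>
</organization-requesting>""")

# Flag children that are not defined, and required children that are absent.
extras = [child.tag for child in fragment if child.tag not in ALLOWED_CHILDREN]
missing = [tag for tag in REQUIRED_CHILDREN if fragment.find(tag) is None]
print(extras)   # ['depreciation-method']
print(missing)  # ['organization-id']
```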

 

Slide 69:

Learning Objective #4

Missing Elements and Incorrect Data Structure

  • A missing element or incorrect structure of a message may be specified in the TCS inputs, perhaps referencing a file with an example

 

Slide 70:

Learning Objective #4

Example Missing Elements

Please see the Extended Text Description below.

(Extended Text Description: This slide shows the correctly formatted TMDD XML message at left and the incorrectly formatted message at right to highlight the error. In this case the error is a missing mandatory element. The box on the left, labeled "Correct Message" underneath, contains the text:
<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
    <authentication>
        <user-id>user</user-id>
        <password>pass</password>
    </authentication>
    <organization-requesting>
            <organization-id>ORG001</organization-id>
            <center-contact-list>
                          <center-contact-details>
                                <center-id>test</center-id>
                    </center-contact-details>
            </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>

The box on the right, labeled "Incorrect Message: Missing Mandatory Element organization-id" contains the following text:

<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
    <authentication>
        <user-id>user</user-id>
        <password>pass</password>
    </authentication>
    <organization-requesting>
            <!-- Missing Mandatory Element -->
            <center-contact-list>
                          <center-contact-details>
                                <center-id>test</center-id>
                    </center-contact-details>
            </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>)

 

Slide 71:

Learning Objective #4

Example Incorrect Data Structure

Please see the Extended Text Description below.

(Extended Text Description: This slide shows the correctly formatted TMDD XML message at left and the incorrectly formatted message at right to highlight the error. In this case the error is that the password and user-id are not listed in the correct sequence; the user-id must precede the password. The solid box on the left, labeled "Correct Message", contains the following text:

<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
    <authentication>
        <user-id>user</user-id>
        <password>pass</password>
    </authentication>
    <organization-requesting>
            <organization-id>ORG001</organization-id>
            <center-contact-list>
                          <center-contact-details>
                                <center-id>test</center-id>
                    </center-contact-details>
            </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>

The box on the right, labeled "Incorrect Message: Incorrect Sequence of Elements user-id and password" contains the following text:
<?xml version="1.0" encoding="UTF-8"?>
<trafficNetworkInformationRequestMsg>
    <authentication>
        <password>pass</password>
        <user-id>user</user-id>
    </authentication>
    <organization-requesting>
            <organization-id>ORG001</organization-id>
            <center-contact-list>
                          <center-contact-details>
                                <center-id>test</center-id>
                    </center-contact-details>
            </center-contact-list>
    </organization-requesting>
    <network-information-type>link inventory</network-information-type>
</trafficNetworkInformationRequestMsg>)
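The sequence error above can be caught by comparing child order against the expected order. A minimal Python sketch using xml.etree.ElementTree, which preserves document order (the helper function is illustrative):

```python
import xml.etree.ElementTree as ET

def children_in_order(parent, expected):
    """True if the children named in `expected` appear in that relative order."""
    tags = [child.tag for child in parent]
    positions = [tags.index(tag) for tag in expected if tag in tags]
    return positions == sorted(positions)

good = ET.fromstring(
    "<authentication><user-id>user</user-id><password>pass</password></authentication>")
bad = ET.fromstring(
    "<authentication><password>pass</password><user-id>user</user-id></authentication>")

print(children_in_order(good, ["user-id", "password"]))  # True
print(children_in_order(bad, ["user-id", "password"]))   # False
```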

 

Slide 72:

Learning Objective #4

Test Case Environmental Needs

  • Describe the test environment needed for test setup, execution, and results recording
  • Ideally, the test plan identifies environmental needs for conducting testing
  • This section of the test case may simply reference the section of the test plan that identifies environmental needs if there are no special test case-specific needs
  • In some instances, a test case may specify additional environmental needs or exceptions to environmental needs identified in the test plan or referenced test procedure

 

Slide 73:

Learning Objective #4

Test Case Special Procedural Requirements

  • Describes special constraints on test case execution
  • Pre- and post-conditions for test case execution
  • This section may reference the use of automated test tools not described in the test plan or referenced test procedure
  • Exceptions to what is described in the test plan or referenced test procedure would be included in this section

 

Slide 74:

Learning Objective #4

Test Case Intercase Dependencies

  • Lists the identifiers of test cases that must be executed prior to this test case
  • Summarize the nature of the dependencies
  • For example, when testing subscription-publication dialogs, a subscription must take place (or be tested) prior to testing for a publication update
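Intercase dependencies define a partial order on test cases, so a valid execution order can be derived with a topological sort. A minimal Python sketch using the standard graphlib module (the test case IDs and dependency graph are hypothetical):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency graph: the publication test (TC002) requires the
# subscription test (TC001) to run first, mirroring the example above.
dependencies = {
    "TC002": {"TC001"},
    "TC003": {"TC002"},
}
execution_order = list(TopologicalSorter(dependencies).static_order())
print(execution_order)  # ['TC001', 'TC002', 'TC003']
```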

 

Slide 75:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 76:

Learning Objective #4

Which of the following is part of the IEEE Std 829 Test Case Specification?

Answer Choices

  1. Description and valid values of inputs and outputs
  2. Project Sponsor
  3. Steps to Conduct a Test
  4. Test Pass-Fail

 

Slide 77:

Learning Objective #4

Review of Answers

A small graphical green and yellow check mark representing correct. a) Description and valid values of inputs and outputs
Correct! The test case includes specification of inputs and outputs, including their values.

A small graphical red and yellow X representing incorrect. b) Project Sponsor
Incorrect. The project sponsor is not a formal part of a TC.

A small graphical red and yellow X representing incorrect. c) Steps to Conduct a Test
Incorrect. This feature is contained in a test procedure.

A small graphical red and yellow X representing incorrect. d) Test Pass-Fail
Incorrect. This feature is contained in a test procedure.

 

Slide 78:

Learning Objective #4

Summary of Learning Objective #4

Understand Test Case Development

  • Reviewed an outline of a test case and a suggested template with required content
  • Discussed where to find information for the test case template for C2C and C2F standards
  • Discussed positive/negative testing
  • Reviewed additional test case requirements

 

Slide 79:

What We Have Learned

1) The role of test cases in relation to other test documents: test plan, test designs, test procedures, and test reports.

2) The purpose of a test case specification is to document the inputs, expected outcomes, and execution conditions for a test.

 

Slide 80:

What We Have Learned (cont.)

3) The outline for a test case specification is defined in IEEE Std 829.

  1. Test case identifier
  2. Objective
  3. Inputs
  4. Outcomes
  5. Environmental needs
  6. Special procedural requirements
  7. Intercase dependencies

 

Slide 81:

What We Have Learned (cont.)

4) ITS data dictionary standards constrain the structure of data and content of data of information exchanges between systems.

5) Walked through an example test case to learn how to develop one.

 

Slide 82:

Resources

 

Slide 83:

Next Course Module

T203: How to Develop Test Cases for an ITS Standards-Based Test Plan, Part 2 of 2

Part 2 of 2:

5. Handle standards that are with and without test documentation

6. Develop a Requirements to Test Case Traceability Matrix (RTCTM)

7. Identify types of testing

8. Recognize the purpose of test logs and test anomaly report

 

Slide 84:

Questions? A placeholder graphic image with word Questions? at the top, and an image of a lit light bulb on the lower right side.