Module 17 - T311

T311: Applying Your Test Plan to the NTCIP 1203 v03 DMS Standard

HTML of the PowerPoint Presentation

(Note: This document has been converted from a PowerPoint presentation to 508-compliant HTML. The formatting has been adjusted for 508 compliance, but all of the original text content is included, and additional text descriptions for the images, photos, and/or diagrams have been provided below.)

 

Slide 1:

Slide 1: ITS Welcome - see the extended text description below.

(Extended Text Description: Slide 1: Welcome - Graphic image of introductory slide. A large dark blue rectangle with a wide, light grid pattern at the top half and bands of dark and lighter blue bands below. There is a white square ITS logo box with words “Standards ITS Training” in green and blue on the middle left side. The word “Welcome” in white is to the right of the logo. Under the logo box are the words “RITA Intelligent Transportation Systems Joint Program Office.”)

 

Slide 2:

Welcome

Head shot photo of Shelley Row, P.E., PTOE - Director - ITS Joint Program Office

Shelley Row, P.E., PTOE

Director

ITS Joint Program Office

Shelley.Row@dot.gov

Screen capture snapshot of RITA website - for illustration only - see the extended text description below.

(Extended Text Description: Slide 2: Screen capture snapshot of RITA website - for illustration only. Below this image is a link to the current website: http://www.pcb.its.dot.gov - this screen capture snapshot shows an example from the RITA website from June 3, 2011. At the top of the page it shows the RITA logo with the text Research and Innovative Technology Administration - Intelligent Transportation Systems. Below the main site banner, it shows the main navigation menu with the following items: About RITA, Communities of Interest, Contact Us, Press Room, RITA Offices, Site Map, and a Search button. Below the main navigation menu, it shows a sub-navigation menu with the following items: About Us, T3 Webinars, ITS Peer-to-Peer, Resources, Local ITS PCB and Testimonials. Beneath the sub-navigation menu, the page is sub-titled "ITS Professional Capacity Building Program" and is divided into sub-sections such as "Welcome to ITS Professional Building", "News", "ITS Technical Assistance" and "Scheduled T3 Webinars". Again, this image serves for illustration only. The current website link is: http://www.pcb.its.dot.gov)

WWW.PCB.ITS.DOT.GOV

(Note: There is additional text attached to this slide that includes the following introductory information from Shelley Row):

"ITS Standards can make your life easier. Your procurements will go more smoothly and you’ll encourage competition, but only if you know how to write them into your specifications and test them. This module is one in a series that covers practical applications for acquiring and testing standards-based ITS systems.

I am Shelley Row the director of the ITS Joint Program Office for USDOT and I want to welcome you to our newly redesigned ITS standards training program of which this module is a part. We are pleased to be working with our partner, the Institute of Transportation Engineers, to deliver this new approach to training that combines web based modules with instructor interaction to bring the latest in ITS learning to busy professionals like you.

This combined approach allows interested professionals to schedule training at your convenience, without the need to travel. After you complete this training, we hope that you will tell colleagues and customers about the latest ITS standards and encourage them to take advantage of the archived version of the webinars.

ITS Standards training is one of the first offerings of our updated Professional Capacity Training Program. Through the PCB program we prepare professionals to adopt proven and emerging  ITS technologies that will make surface transportation safer, smarter and greener which improves livability for us all. You can find information on additional modules and training programs on our web site www.pcb.its.dot.gov.

Please help us make even more improvements to our training modules through the evaluation process. We look forward to hearing your comments. Thank you for participating and we hope you find this module helpful."

 

Slide 3:

T311

Applying Your Test Plan to the NTCIP 1203 v03 DMS Standard

 

Slide 4:

Target Audience

 

Slide 5:

Instructor

Photo of the instructor Patrick Chan, P.E.

Patrick Chan, P.E.

Senior Technical Staff

Consensus Systems Technologies (ConSysTec)

Flushing, NY, USA

 

Slide 6:

Recommended Prerequisites

 

Slide 7:

Curriculum Path (SEP)

Curriculum Path (SEP). Please see the Extended Text Description below.

(Extended Text Description: Curriculum Path (SEP). A chart showing the curriculum path for implementing a system that uses standards that are based on the systems engineering process.  A linear box chart starting with I101 – Using Standards: An Overview with an arrow leading to A101 – Introduction to Acquiring Standards-based ITS Systems with an arrow leading to A102 – Introduction to User Needs Identification with an arrow leading to A201 – Details on Acquiring Standards-based ITS Systems with an arrow leading to Understanding User Needs (A311a NTCIP 1203, A313a NTCIP 1204 v03, A321a TMDD v3.0) with an arrow leading to A311b – Specifying Requirements for DMS Systems Based on NTCIP 1203 Standard.)

 

Slide 8:

Curriculum Path (Testing)

Curriculum Path (Testing). Please see the Extended Text Description below.

(Extended Text Description: Curriculum Path (Testing). A chart showing the curriculum path for testing standards that are based on the systems engineering process. A linear box chart starting with T101 – Introduction to ITS Standards Testing with an arrow leading to T201 – How to Write a Test Plan with an arrow leading to T202 – Overview of Test Design Specifications, Test Cases, and Test Procedures with three arrows, each leading to a separate box: T311 – Applying Your Test Plan to the NTCIP 1203 v03 DMS Standard, T313 – Applying Your Test Plan to the NTCIP 1204 v03 ESS Standard, and T321 – Applying Your Test Plan to NTCIP/TMDD/ATC Standards. The box with T311 – Applying Your Test Plan to the NTCIP 1203 v03 DMS Standard is highlighted.)

 

Slide 9:

Learning Objectives

  1. Recognize the purpose, structure, and content of a well-written test plan.
  2. Describe, within the context of a testing lifecycle, the role of a test plan and the testing to be undertaken.
  3. Describe the application of a good test plan to a DMS system being procured using a sample DMS test plan.
  4. Identify key elements of the NTCIP 1203 standard relevant to what is covered in the test plan.
  5. Walk through the process of adapting the test plan in the context of the needs and requirements of the DMS that have been selected by the user.

 

Slide 10:

Activity. A placeholder graphic with an image of hand over a computer keyboard to show that an activity is taking place.

 

Slide 11:

Discussion

Why do you perform testing?

Enter responses in the chat pod

 

Slide 12:

Learning Objective #1

Review of Testing

Why Test?

 

Slide 13:

Learning Objective #2

Review of Testing

Review of Testing. Please see the Extended Text Description below.

(Extended Text Description: Review of Testing. A figure with a diagram in the shape of a “VEE” with flanges.  On the left-most flange of the “VEE” diagram is Regional ITS Architecture(s), followed by Feasibility Study / Concept Exploration.  Going down the left-side of the “VEE” diagram is Concept of Operations, followed by a Document/Approval marker, followed by Systems Requirements, followed by a Document/Approval marker, followed by High-Level Design, followed by a Document/Approval marker, followed by Detailed Design, followed by a Document/Approval marker, and Software / Hardware Development Field Installation at the bottom of the “VEE” diagram.  Going up the right-side of the “VEE” diagram is a Document/Approval marker, followed by Unit/Device Testing, followed by a Document/Approval marker, followed by Subsystem Verification, followed by a Document/Approval marker, followed by System Verification & Deployment, followed by a Document/Approval marker, followed by System Validation, and followed by Operations and Maintenance.  Continuing along the right flange of the VEE diagram is Changes and Upgrades, followed by Retirement / Replacement. Between Concept of Operations (on the left side of the VEE) and System Validation (on the right side of the VEE) is a bi-directional dotted line labeled System Validation Plan.  Continuing down the VEE diagram is a bi-directional dotted line labeled System Verification Plan (System Acceptance) between System Requirements and System Verification & Deployment.  Continuing down the VEE diagram is a bi-directional dotted line labeled Subsystem Verification Plan (Subsystem Acceptance) between High-Level Design and Subsystem Verification.  Continuing down the VEE diagram is a bi-directional dotted line labeled Unit / Device Test Plan between Detailed Design and Unit/Device Testing. 
There is an arrow pointing down parallel to the left side of the VEE diagram, with the text Agency Requirements and Specification Development adjacent to System Requirements, and the text Test Document Preparation adjacent to High-Level Design and Detailed Design.  There is an arrow labeled Time Line at the bottom of the VEE diagram pointing from left to right.  Finally there is an arrow pointing up the right side of the VEE diagram, with the text Prototype Test, followed by Design Approval Test, followed by Factory Acceptance Test adjacent to Unit/Device Testing; and the text Incoming Device Test followed by Site Acceptance Test adjacent to Subsystem Verification; and the text Burn-in and Observation Test adjacent to System Verification & Deployment.)

 

Slide 14:

Learning Objective #2

Review of Testing

Verification

Review of Testing. Please see the Extended Text Description below.

(Extended Text Description: Review of Testing Verification. A figure showing the bottom half of the “VEE” diagram that was previously depicted in Slide #13.  Going down the left-side of the “VEE” diagram is High-Level Design, followed by a Document/Approval marker, followed by Detailed Design, followed by a Document/Approval marker, and Software / Hardware Development Field Installation at the bottom of the “VEE” diagram.  Going up the right-side of the “VEE” diagram is a Document/Approval marker, followed by Unit/Device Testing, followed by a Document/Approval marker, followed by Subsystem Verification. Between High-Level Design (on the left side of the VEE) and Subsystem Verification (on the right side of the VEE) is a bi-directional dotted line labeled Subsystem Verification Plan (Subsystem Acceptance).  Continuing down the VEE diagram is a bi-directional dotted line labeled Unit / Device Test Plan between Detailed Design and Unit/Device Testing. There is an arrow pointing up the right side of the VEE diagram, with the text Prototype Test, followed by Design Approval Test, followed by Factory Acceptance Test adjacent to Unit/Device Testing; and the text Incoming Device Test followed by Site Acceptance Test adjacent to Subsystem Verification.)

 

Slide 15:

Learning Objective #2

Review of Testing

Verification and Validation

Review of Testing. Please see the Extended Text Description below.

(Extended Text Description: Review of Testing Verification and Validation. A figure showing the top half of the “VEE” diagram that was previously depicted in Slide #13. On the left-most flange of the “VEE” diagram is Regional ITS Architecture(s), followed by Feasibility Study / Concept Exploration.  Going down the left-side of the “VEE” diagram is Concept of Operations, followed by a Document/Approval marker, followed by Systems Requirements, followed by Document/Approval marker.  Going up the right-side of the “VEE” diagram is a Document/Approval marker, followed by System Verification & Deployment, followed by a Document/Approval marker, followed by System Validation, and followed by Operations and Maintenance.  Continuing along the right flange of the VEE diagram is Changes and Upgrades, followed by Retirement / Replacement. Between Concept of Operations (on the left side of the VEE) and System Validation (on the right side of the VEE) is a bi-directional dotted line labeled System Validation Plan. Continuing down the VEE diagram is a bi-directional dotted line labeled System Verification Plan (System Acceptance) between System Requirements and System Verification & Deployment.
There is an arrow pointing down parallel to the left side of the VEE diagram, with the text Agency Requirements and Specification Development adjacent to System Requirements.  There is an arrow pointing up the right side of the VEE diagram, with the text Burn-in and Observation Test adjacent to System Verification & Deployment.)

 

Slide 16:

Learning Objective #1

Review of Testing

Validation

 

Slide 17:

Learning Objective #1

Review of Test Plans

Definition

Test Plan - High-level document that answers:

 

Slide 18:

Learning Objective #1

Review of Test Plans

What is the Item to be tested?

 

Slide 19:

Learning Objective #1

Review of Test Plans

How is the Item to be tested?

Device Under Test (DUT). Please see the Extended Text Description below.

(Extended Text Description: Device Under Test (DUT). A figure with a graphic depicting a device, labeled Device Under Test (DUT), with a bi-directional line to a cloud, labeled Communications.  From the cloud labeled Communications is a bi-directional ling to a graphic of a laptop computer, labeled Test Software.  Also there is a loop connecting the cloud labeled Communications and a different graphic of a laptop computer labeled Data Analyzer (Optional).)

 

Slide 20:

Learning Objective #1

Review of Test Plans

Who is to test the item?

 

Slide 21:

Learning Objective #1

Review of Test Plans

In what detail is the item to be tested?

 

Slide 22:

Learning Objective #1

Review of Test Plans

What Are the Test Deliverables?

 

Slide 23:

Learning Objective #1

Review of Test Plans

When is the testing to take place?

 

Slide 24:

Case Study. A placeholder graphic showing a Traffic Management Center indicating that a Case Study follows.

 

Slide 25:

Learning Objective #3

Example Test Plan for a DMS System

Introduction and Test Items

[IEEE 829-1998]

 

Slide 26:

Learning Objective #3

Example Test Plan for a DMS System

Features Being Tested and Approach

 

Slide 27:

Learning Objective #3

Example Test Plan for a DMS System

Test Deliverables and Setup

 

Slide 28:

Learning Objective #3

Example Test Plan for a DMS System

Responsibilities, Schedule, and Approvals

 

Slide 29:

Learning Objective #1

Types of Test Plans

OR

 

Slide 30:

Learning Objective #1

Test Plan Components

 

Slide 31:

Learning Objective #1

Test Plan Components

Test Plan Components. Please see the Extended Text Description below.

(Extended Text Description: Test Plan Components. A figure with a chart depicting the components of a test plan.  At the top of the chart is a box with the text, Test Plan (Document for Project), pointing to a box on the second row with the text, Test Design Specification (For DMS Interface), pointing to three boxes on the third row, each with the text, Test Case Specification #m, nn.  Each box on the third row points to a box on the fourth row.  Each box on the fourth row has the text Test Procedure Specification #m, nn.)
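The hierarchy in this figure can be modeled as a simple nesting of records. The sketch below is illustrative only (the class and field names are ours, not drawn from IEEE 829 or NTCIP 1203): a test plan holds test design specifications, each design specification holds test cases, and each test case points to a test procedure.

```python
from dataclasses import dataclass, field

@dataclass
class TestProcedure:
    procedure_id: str            # e.g. the test procedure section number

@dataclass
class TestCase:
    case_id: str
    procedure: TestProcedure     # each test case traces to its procedure

@dataclass
class TestDesignSpec:
    title: str
    cases: list = field(default_factory=list)

@dataclass
class TestPlan:
    title: str
    designs: list = field(default_factory=list)

# Building one branch of the chart on this slide:
proc = TestProcedure("C.3.5.1")
case = TestCase("C.3.5.1", proc)
design = TestDesignSpec("DMS Interface", [case])
plan = TestPlan("Document for Project", [design])
```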

 

Slide 32:

Polling. A placeholder slide showing hands raised signifying polling activities.

 

Slide 33:

Poll Question

A test case does what?

  1. Specifies the inputs, predicted results, and execution conditions for one or more functions in the test item.
  2. Specifies the details of the test approach for a feature or combination of features.
  3. Describes the scope, approach, and resources for the testing activities.
  4. Specifies the sequence of actions for the execution of a test.

 

Slide 34:

Learning Objective #4

Review of NTCIP 1203

NTCIP 1203

 

Slide 35:

Learning Objective #4

Review of NTCIP 1203

NTCIP 1203

 

Slide 36:

Learning Objective #4

NTCIP 1203 Testing

Interface/Communications Testing

 

Slide 37:

Learning Objective #4

NTCIP 1203 Testing

Interface/Communications Testing

 

Slide 38:

Learning Objective #4

Test Design Specification

Definition

 

Slide 39:

Learning Objectives #4,5

Test Design Specification

Features to be Tested

 

Slide 40:

Learning Objectives #4,5

Test Design Specification

Features to be Tested

USER NEED SECTION NUMBER

USER NEED

FR SECTION NUMBER

FUNCTIONAL REQUIREMENT

CONFORMANCE

SUPPORT/ PROJECT REQUIREMENT

ADDITIONAL PROJECT REQUIREMENTS

2.5.3

Monitor the Status of the DMS

M

Yes

2.5.3.1

Perform Diagnostics

M

Yes

2.5.3.1.1

Determine Sign Error Conditions - High-Level Diagnostics

M

Yes

3.5.3.1.1.1 (LampTest)

Execute Lamp Testing

Lamp OR Fiber: M

Yes / NA (NA highlighted in red)

3.5.3.1.1.2 (PixelTest)

Activate Pixel Testing

Matrix: M

Yes / NA (Yes highlighted in red)

3.5.3.1.1.3 (ClimateTest)

Execute Climate-Control Equipment Testing

O

Yes / No (Yes highlighted in red)

3.5.3.1.2

Provide General DMS Error Status Information

M

Yes
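The conformance column above drives what a project may answer in the PRL. As a rough illustration (not part of the standard; real PRLs also carry conditional predicates such as "Lamp OR Fiber: M", which this sketch does not evaluate), a checker for simple rows might look like:

```python
def prl_row_ok(conformance, project_support):
    """Check one simplified PRL row: the project answer must be allowed
    by the conformance code ('M' = mandatory, 'O' = optional)."""
    if conformance == "M":                    # mandatory: must be supported
        return project_support == "Yes"
    if conformance == "O":                    # optional: project may decline
        return project_support in ("Yes", "No")
    # Conditional codes (e.g. "Matrix: M") are not modeled in this sketch.
    return project_support in ("Yes", "No", "NA")
```

For example, a mandatory row answered "No" would be flagged as a specification error, while an optional row answered "No" is acceptable.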

 

Slide 41:

Learning Objectives #4,5

Test Design Specification

RTM

 

Slide 42:

Learning Objectives #4,5

Test Design Specification

RTM

Requirements Traceability Matrix (RTM)

FR ID

Functional Requirement

Dialog ID

Object ID

Object Name

Additional Specifications

3.5.3

Monitor the Status of the DMS

3.5.3.1

Perform Diagnostics

3.5.3.1.1

Test Operational Status of DMS Components

3.5.3.1.1.1

Execute Lamp Testing

4.2.4.1

5.11.2.5.3

lampTestActivation

3.5.3.1.1.2

Activate Pixel Testing

4.2.4.2

5.11.2.4.3

pixelTestActivation

3.5.3.1.1.3

Execute Climate-Control Equipment Testing

4.2.4.3

5.11.2.3.5.6

dmsClimateCtrlTestActivation

5.11.2.3.5.7

dmsClimateCtrlAbortReason

(Additional author's notes for this slide: The figure is a snapshot of a Requirements Traceability Matrix (RTM) table. The table headings are FR ID, Functional Requirement, Dialog ID, Object ID, Object Name and Additional Specifications. The first row is shaded, the FR ID is 3.5.3, and the Functional Requirement is Monitor the Status of the DMS. The second row is shaded, the FR ID is 3.5.3.1, and the Functional Requirement is Perform Diagnostics. The third row is shaded, the FR ID is 3.5.3.1.1, and the Functional Requirement is Test Operational Status of DMS Components. On the fourth row, the FR ID is 3.5.3.1.1.1, the Functional Requirement is Execute Lamp Testing, and the Dialog ID is 4.2.4.1. On the fifth row, the Object ID is 5.11.2.5.3, and the Object Name is lampTestActivation. On the sixth row, the FR ID is 3.5.3.1.1.2, the Functional Requirement is Activate Pixel Testing, and the Dialog ID is 4.2.4.2. On the seventh row, the Object ID is 5.11.2.4.3, and the Object Name is pixelTestActivation. Rows six and seven are circled by a single large red oval. On the eighth row, the FR ID is 3.5.3.1.1.3, the Functional Requirement is Execute Climate Control Equipment Testing, and the Dialog ID is 4.2.4.3. On the ninth row, the Object ID is 5.11.2.3.5.6, and the Object Name is dmsClimateCtrlTestActivation. On the tenth row, the Object ID is 5.11.2.3.5.7, and the Object Name is dmsClimateCtrlAbortReason.)

 

Slide 43:

Learning Objectives #4,5

Test Design Specification

RTM

4.2.4.2 Activating Pixel Testing

The standardized dialog for a management station to command the DMS to activate pixel testing shall be as follows:

a) The management station shall SET pixelTestActivation.0 to 'test'.

b) The management station shall repeatedly GET pixelTestActivation.0 until it either returns the value of 'noTest' or a maximum time-out is reached. If the time-out is reached, the DMS is apparently locked and the management station shall exit the process.

c) (Postcondition) The following objects will have been updated during the pixel test to reflect current conditions. The management station may GET any of these objects as appropriate.

  1. pixelFailureTableNumRows
  2. any object within the pixelFailureTable
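This dialog translates directly into test software. Below is a minimal Python sketch of steps a and b, assuming hypothetical `snmp_set`/`snmp_get` helpers that stand in for a real SNMP library; the time-out and polling interval are illustrative, not from the standard.

```python
import time

NO_TEST = 2   # pixelTestActivation enumeration: noTest(2)
TEST = 3      # pixelTestActivation enumeration: test(3)

def activate_pixel_test(snmp_set, snmp_get, timeout_s=60.0, poll_s=1.0):
    """Run the Section 4.2.4.2 dialog; return True if the pixel test completed."""
    # Step a: command the DMS to start the pixel test.
    snmp_set("pixelTestActivation.0", TEST)
    # Step b: repeatedly GET until the sign reports noTest or the time-out expires.
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if snmp_get("pixelTestActivation.0") == NO_TEST:
            return True           # test finished; postcondition objects updated
        time.sleep(poll_s)
    return False                  # DMS apparently locked; exit the process
```

After a successful run, the management station may GET pixelFailureTableNumRows and the pixelFailureTable objects, per the postcondition in step c.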

 

Slide 44:

Learning Objectives #4,5

Test Design Specification

RTCTM

 

Slide 45:

Learning Objectives #4,5

Test Design Specification

RTCTM

Requirement

Test Case

ID

Title

ID

Title

3.5.3

Monitor the Status of the DMS

3.5.3.1

Perform Diagnostics

3.5.3.1.1

Test Operational Status of DMS Components

3.5.3.1.1.1

Execute Lamp Testing

C.3.5.21

Verify Lamp Test with No Errors

C.3.5.22

Verify Lamp Test with Errors

3.5.3.1.1.2

Activate Pixel Testing

C.3.5.1

Pixel Test - No Errors

C.3.5.2

Pixel Test - Errors

3.5.3.1.1.3

Execute Climate-Control Testing

C.3.5.3

Climate-Control Equipment Test - No Errors

C.3.5.4

Climate-Control Equipment Test - Errors

(Additional author's notes for this slide: The figure is a snapshot of a Requirements – Test Case Traceability Matrix (RTCTM) table. The table headings are Requirement ID, Requirement Title, Test Case ID, and Test Case Title. The first row is shaded dark, the Requirement ID is 3.5.3, and the Requirement Title is Monitor the Status of the DMS. The second row is shaded a lighter color, the Requirement ID is 3.5.3.1, and the Requirement Title is Perform Diagnostics. The third row is shaded a lighter color, the Requirement ID is 3.5.3.1.1, and the Requirement Title is Test Operational Status of DMS Components. The fourth row is shaded the same shade as the third row, the Requirement ID is 3.5.3.1.1.1, and the Requirement Title is Execute Lamp Testing. On the fifth row, the Test Case ID is C.3.5.21 and the Test Case Title is Verify Lamp Test with No Errors. On the sixth row, the Test Case ID is C.3.5.22 and the Test Case Title is Verify Lamp Test with Errors. The seventh row is shaded lightly, the Requirement ID is 3.5.3.1.1.2, and the Requirement Title is Activate Pixel Testing. On the eighth row, the Test Case ID is C.3.5.1 and the Test Case Title is Pixel Test - No Errors. On the ninth row, the Test Case ID is C.3.5.2 and the Test Case Title is Pixel Test - Errors. Rows 7, 8 and 9 are encircled by a red square. The tenth row is shaded lightly, the Requirement ID is 3.5.3.1.1.3, and the Requirement Title is Execute Climate Control Equipment Testing. On the eleventh row, the Test Case ID is C.3.5.3 and the Test Case Title is Climate-Control Equipment Test – No Errors. On the twelfth row, the Test Case ID is C.3.5.4 and the Test Case Title is Climate-Control Equipment Test – Errors.)
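An RTCTM like the one above is also useful in machine-readable form, for example to confirm that every selected requirement traces to at least one test case. The sketch below is illustrative only; the dictionary simply restates the rows shown on this slide.

```python
# RTCTM as a mapping from requirement ID to covering test case IDs
# (rows taken from this slide's table).
rtctm = {
    "3.5.3.1.1.1": ["C.3.5.21", "C.3.5.22"],  # Execute Lamp Testing
    "3.5.3.1.1.2": ["C.3.5.1", "C.3.5.2"],    # Activate Pixel Testing
    "3.5.3.1.1.3": ["C.3.5.3", "C.3.5.4"],    # Execute Climate-Control Testing
}

def uncovered(requirement_ids, matrix):
    """Return the requirement IDs that trace to no test case."""
    return [r for r in requirement_ids if not matrix.get(r)]
```

A project tailoring its test plan could run such a check after deleting test cases for unselected requirements, to be sure nothing selected was left untested.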

 

Slide 46:

Learning Objectives #4,5

Test Design Specification

RTCTM

 

Slide 47:

Learning Objectives #4,5

Test Plan for a DMS System

Tailored RTCTM

 

Slide 48:

Learning Objectives #4,5

Test Plan for a DMS System

Tailored RTCTM

 

Slide 49:

Case Study. A placeholder graphic showing a Traffic Management Center indicating that a Case Study follows.

 

Slide 50:

Learning Objective #3

Test Plan for a DMS System

Example Test Design Specification

 

Slide 51:

Learning Objectives #1,3

Test Case Specifications

Definition

 

Slide 52:

Learning Objective #3

Test Case Specifications

Example

3.5.1.3.4 Retrieve a Font Definition

The DMS shall allow a management station to upload the fonts defined in the sign controller.

Requirement

Test Case

ID

Title

ID

Title

3.5.1.3.4

Retrieve a Font Definition

C.3.2.4

Retrieve a Font Definition

 

Slide 53:

Learning Objectives #1,3

Test Procedures

Definition

 

Slide 54:

Learning Objectives #4,5

Test Procedures

C.3.5.1 Pixel Test - No Errors

Test Case: 5.1

Title:

Pixel Test - No Errors

Description:

This test case verifies that the DMS executes a pixel test and that the test reports no failed pixels.

Variables:

Pixel_Test_Time

From Manufacturer's Documentation

Message_Display_Test_Time

From Manufacturer's Documentation

Pass/Fail Criteria:

The DUT shall pass every verification step included within the Test Case to pass the Test Case.


Step

Test Procedure

Results

Additional References

1

CONFIGURE: Determine the maximum period of time that the pixel test should require (based on manufacturer documentation). RECORD this information as: »Pixel_Test_Time

2

CONFIGURE: Determine the maximum period of time that the message display test should require (based on manufacturer documentation). RECORD this information as: »Message_Display_Test_Time

3

SET-UP: Ensure that all pixels are functioning prior to this test.

4

SET the following object(s) to the value(s) shown: »pixelTestActivation.0 = 'test' (3)

NOTE—Valid enumerated values are defined in Section 5.11.2.4.3 (Pixel Test Activation Parameter).

Pass / Fail (Section 3.5.3.1.1.2)

Section 4.2.4.2 Step a

 

Slide 55:

Learning Objectives #4,5

Test Procedures

Elements of Test Procedures

17

VERIFY that the RESPONSE VALUE for shortErrorStatus.0 has bit 5 (pixel error) cleared.

Pass / Fail (Section 3.5.3.1.2)

18

PERFORM the test case labeled 'Blank the Sign' (C.3.7.15).

Pass / Fail (Section 3.5.2.3.1)

Test Case Results

Tested By:

Date Tested:

Pass /Fail

Test Case Notes:

 

Slide 56:

Learning Objectives #4,5

Test Procedures

Elements of Test Procedures

1

CONFIGURE: Determine the enumerated value corresponding to the beacon type required by the specification (PRL 2.3.2.4). RECORD this information as:

»Required_Beacon_Type

NOTE-Valid enumerated values are defined in Section 5.2.8 (Beacon Type Parameter).

2

SET-UP: Determine the enumerated value indicating the actual type of beacons on the sign (See Section 5.2.8). RECORD this information as: »Actual_Beacon_Type

 

Slide 57:

Learning Objectives #4,5

Test Procedures

Elements of Test Procedures

17

VERIFY that the RESPONSE VALUE for shortErrorStatus.0 has bit 5 (pixel error) cleared.

Pass / Fail (Section 3.5.3.1.2)

18

PERFORM the test case labeled 'Blank the Sign' (C.3.7.15).

Pass / Fail (Section 3.5.2.3.1)

Test Case Results

Tested By:

Date Tested:

Pass /Fail

Test Case Notes:

 

Slide 58:

Learning Objectives #4,5

Test Procedures

Elements of Test Procedures

8

FOR EACH value, N, from 1 to Actual_Graphic_Entries, perform Steps 8.1 through 8.2.

8.1

SET-UP: GET the following object(s): »dmsGraphicStatus.N »dmsGraphicID.N

8.2

IF the RESPONSE VALUE for dmsGraphicStatus.N does not equal 'permanent' (6), then GOTO Step 8.2.1; otherwise, GOTO Step 9.

NOTE—Valid enumerated values are defined in Section 5.12.6.10 (Graphic Status Parameter)

8.2.1

VERIFY that the RESPONSE VALUE for dmsGraphicStatus.N is not 'inUse' (5).

Pass / Fail
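Step 8's FOR EACH loop can be read as the following sketch, assuming a hypothetical `dms_graphic_status(n)` helper that GETs dmsGraphicStatus.N; it follows the literal GOTOs, so a 'permanent' (6) status skips ahead to Step 9.

```python
PERMANENT, IN_USE = 6, 5   # dmsGraphicStatus enumeration (Section 5.12.6.10)

def check_graphic_entries(dms_graphic_status, actual_graphic_entries):
    """Step 8: for N = 1..Actual_Graphic_Entries, check each graphic's status.
    Returns False if any verification step (8.2.1) fails."""
    for n in range(1, actual_graphic_entries + 1):
        status = dms_graphic_status(n)   # Step 8.1: GET dmsGraphicStatus.N
        if status == PERMANENT:          # Step 8.2: 'permanent' (6) -> GOTO Step 9
            break
        if status == IN_USE:             # Step 8.2.1: VERIFY not 'inUse' (5)
            return False
    return True
```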

 

Slide 59:

Polling. A placeholder slide showing hands raised signifying polling activities.

 

Slide 60:

Test Plan for a DMS System

Exercise

Below is a dialog in NTCIP 1203 v03.

b) The management station shall GET the following data:

1) dmsMessageMultiString.x.y

2) dmsMessageOwner.x.y

3) dmsMessageRunTimePriority.x.y

4) dmsMessageStatus.x.y

c) The management station shall GET dmsMessageBeacon.x.y.

d) The management station shall GET dmsMessagePixelService.x.y.

NOTE—The response to this request may be a noSuchName error, indicating that the DMS does not support this optional feature. This error will not affect the sequence of this dialog, but the management station should be aware that the CRC will be calculated with this value defaulted to zero (0).
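The NOTE above means test software must tolerate a noSuchName response on these optional objects. A hedged sketch follows (the `snmp_get` callable and `NoSuchNameError` exception are illustrative stand-ins, not a real library API):

```python
class NoSuchNameError(Exception):
    """Illustrative: raised when the agent answers a GET with noSuchName."""

def get_optional(snmp_get, oid, default=0):
    """GET an optional object; on noSuchName, fall back to the default of
    zero that the NOTE says is used in the CRC calculation."""
    try:
        return snmp_get(oid)
    except NoSuchNameError:   # DMS does not support this optional feature
        return default
```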

Look at Step C of this dialog.

 

Slide 61:

Test Plan for a DMS System

Exercise

Which test step properly reflects Step C of the previous dialog?

GET the following object(s): » __________________

1. dmsMessageMultiString.x.y

2. dmsMessageStatus.x.y

3. dmsMessageBeacon.x.y

4. dmsMessagePixelService.x.y

Enter responses in the chat pod

 

Slide 62:

Review

 

Slide 63:

Review

 

Slide 64:

Review

 

Slide 65:

Resources

www.ntcip.org

 

Slide 66:

Questions? A placeholder graphic image with word Questions? at the top, and an image of a lit light bulb on the lower right side.