T3 Webinar Presentation

A Sign of the Times: Using the ITS Standard (NTCIP 1203) for Dynamic Message Signs (September 26, 2007)

Presenter:   Ken Vaughn
Presenter's Org:   Trevilon
Presenter:   Ashwin Amanna
Presenter's Org:   VTTI
Presenter:   Tom Stout
Presenter's Org:   Federal Highway Administration (FHWA)

HTML version of the presentation
Image descriptions are contained in brackets. [ ]

T3 Webinars are brought to you by the ITS Professional Capacity Building Program (ITS PCB) at the U.S. Department of Transportation's (USDOT) ITS Joint Program Office, Research and Innovative Technology Administration (RITA)


Slide 1: NTCIP 1203 (DMS) Deployment Experiences

T3 Webinar
September 26, 2007

Slide 2: Agenda

  1. Version 1 Lessons Learned
  2. Version 2 Development
  3. Version 3 Early Deployment

Slide 3: Version 1: Deployments

[Map of the US, showing deployment of DMS by state.]

Slide 4: Version 1: Lessons Learned

  • Initial deployments were painful
    • Vendors required extra time to implement
    • Implementation of standard could create bugs in software
    • Questions about interpretations of standard
    • Specifications not always rigorous enough
    • Testing, when done, required multiple rounds
    • Often accepted "good enough" results
  • Need maturity
  • Need to define a process that ensures success

Slide 5: Version 1: Lessons Learned

  • Standard had ambiguities and omissions
    • No explicit statement of device functionality
      • But objects implied device functionality
    • Data exchange dialogs not explicitly defined
      • Can you edit a font that is in use?
    • Definition of objects not always clear
    • Other omissions
  • Need a way to validate and verify the standard

Slide 6: Version 1: Lessons Learned

  • Implementers found the standard difficult to use
    • Version 1 was a design document
    • Lots of features are optional
      • Options needed to support diversity of signs
    • Had to reverse engineer design to understand intended functionality
  • Need a more user-friendly solution

Slide 7: Version 1: Lessons Learned

  • The standard was difficult to specify
    • Had to identify each object required for project
    • Had to identify required range for each object
      • Required detailed understanding of standard
      • Made specifications difficult to understand
    • Still had to specify functional requirements
      • Conflicts between functional and NTCIP specs
    • Had to identify exact communications stack
  • Need to improve quality of specifications

Slide 8: Version 1: Lessons Learned

  • The standard was difficult to test
    • Functional requirements only implied
    • Had to derive intended processes
    • Had to define procedures
      • ENTERPRISE/I-95 procedures became the de facto standard
    • Tools available to test were limited
      • Testing required significant time
      • Testing required extreme expertise
      • Reproducing tests required extreme care
  • Need a complete, efficient, reproducible testing solution

Slide 9: Version 1: Lessons Learned

  • Agencies expected few problems
    • Only minimal testing was performed
    • Deployments revealed problems
    • Some problems discovered in follow-on deployments
  • Agencies need to fully test each delivery

Slide 10: Version 1: Lessons Learned

  • Deployments are not 100% interoperable
    • Deployment process is not consistent
    • Standard is not correct and complete
    • Different interpretations of standard
    • Holes in specifications
    • Inconsistent testing
  • Need to create an end-to-end solution
    • Could be standardized, but not required
    • Industry needs to be aware of solution

Slide 11: Version 1: Deployments

  • Integration is still easier than with proprietary interfaces
    • Standards facilitate organizational change
    • In the big picture, the problems are minor

Slide 12: Version 2: Development

  1. Addressing Lessons Learned
  2. Summary of Changes
  3. Backwards Compatibility
  4. Status

Slide 13: Version 2: Development

  Lessons Learned           V2 Solution
  1. Define Process         1. Follow SEP
  2. V&V Standard           2. Correct Standard
  3. Easy-to-use            3. V1 Compatible
  4. Improve Specs          4. Develop Guides
  5. Define Testing         5. Define Test Proc.
  6. Encourage Testing      6. Testing Tools
  7. Advertise Solution     7. Workshops & Asst.

Slide 14: Version 2: Follow SEP

  • Systems engineering material added to Standard
    • Concept of operations
      • User needs
    • Functional requirements
    • Dialogs
    • Detailed design
    • Traceability tables
      • Protocol Requirements List (PRL)
      • Requirements Traceability Matrix (RTM)
    • Test procedures (may be added in future)

Slide 15: Version 2: Follow SEP

  • Extra material provides
    • Formal functional requirements
      • Removes ambiguity in previous standard
    • A more user-friendly document
      • Users select desired functionality
      • Traceability translates functions into design
      • Users need not worry about design details
  • Value proven during the Early Deployment

Slide 16: Version 2: Follow SEP

  • User needs define the features that may be supported
    • Activate and Display a Message
      • This feature allows an operator to activate a previously defined message to be displayed on the sign face. The message can be a blank message or come from a set of previously defined messages.
      • When activating the message, the operator will need to specify the desired duration for the display and the relative priority for the proposed message to override the currently displayed message.

Slide 17: Version 2: SEP: PRL

  • Protocol Requirements List (PRL)
    • Summarizes features defined in the standard
    • Provides a clause reference for each feature
    • Indicates whether each is optional or mandatory
    • Provides a column to select for a specific project

[Table showing Protocol Requirements List (PRL)]
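
For illustration only (the clause number is the user need cited later in this webinar; "M" means mandatory, and Support is the column the specifier fills in), a PRL row might look like:

  Clause       User Need                        Conformance   Support
  2.4.2.3.1    Activate and Display a Message   M             Yes / No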

Slide 18: Version 2: SEP: PRL

  • Traceability to Requirements
    • Many-to-many relationship
    • Clause of each requirement also shown
    • Conformance and Support also shown

[Table showing Traceability to Requirements]

Slide 19: Version 2: SEP: Requirement

  • Requirements define details of feature
    • Activate a Message
      • The DMS shall allow a management station to display a message on the sign face, including:
        • Any permanent message supported by the sign
        • Any previously defined message
        • A blank message of any run-time priority
        • A message based on the scheduling logic, if a scheduler is supported by the sign
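
In NTCIP 1203, this activation request is carried in the dmsActivateMessage object, whose value is a 12-octet MessageActivationCode. The Python sketch below packs such a value; it is a minimal illustration whose field layout is paraphrased from the standard and worth confirming against the published text, and the message number and CRC shown are placeholders rather than values from a real deployment.

  import struct

  def message_activation_code(duration_min, priority, memory_type,
                              message_number, crc, source_address=0):
      # Pack the MessageActivationCode fields (big-endian) into the
      # 12-octet value written to dmsActivateMessage:
      #   duration (2 octets, in minutes; 65535 = indefinite),
      #   activation priority (1), message memory type (1),
      #   message number (2), message CRC (2), source address (4).
      return struct.pack('>HBBHHI', duration_min, priority,
                         memory_type, message_number, crc, source_address)

  # Placeholder example: changeable-memory (type 3) message #1,
  # displayed for 30 minutes at run-time priority 5.
  code = message_activation_code(30, 5, 3, 1, 0x1234)
  assert len(code) == 12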

Slide 20: Version 2: SEP: Specification

  • Specifying Version 2 is primarily a matter of filling out the PRL
    • Need to ensure that selections are in agreement with remainder of specification

[In specifying Version 2, a PRL is filled out with selections that are in agreement with the remainder of specification]

Slide 21: Version 2: SEP: Specification

  • Some requirements require additional details

[PRL with additional details for some requirements.]

Slide 22: Version 2: SEP: RTM

  • Requirements are traced to design details
    • Defined in the Requirements Traceability Matrix (RTM)
    • Specification writers do not need to worry about the RTM
    • Maps each requirement to
      • A listing of objects (essentially the content of the v1 standard)
      • A dialog (standardized sequence for exchanging data)

[Requirements Traceability Matrix.]
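
For illustration only (the dialog name and mapping are ours, though the object names are real NTCIP 1203 objects), an RTM row for the "Activate a Message" requirement might read:

  Requirement          Dialog                 Objects
  Activate a Message   Activating a Message   dmsActivateMessage, dmsMessageCRC, dmsActivateMsgError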

Slide 23: Version 2: SEP: Dialog

[Dialog diagram.]
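
The diagram is not reproduced in this transcript. As a rough sketch of the kind of standardized sequence such a dialog defines (paraphrased, not the normative text; snmp_get and snmp_set are hypothetical helpers standing in for an SNMP library):

  def activate_stored_message(snmp_get, snmp_set, activation_code):
      # 1. Confirm the target message is valid in the sign's message table.
      if snmp_get('dmsMessageStatus') != 'valid':
          return False
      # 2. Write the 12-octet activation code (which embeds the message
      #    CRC previously read from dmsMessageCRC) to dmsActivateMessage.
      snmp_set('dmsActivateMessage', activation_code)
      # 3. Verify the sign reports no activation error.
      return snmp_get('dmsActivateMsgError') == 'none'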

Slide 24: Version 2: SEP: Object

  • Objects were the only content of v1
    • Users previously had to understand this level of detail and build upwards in specifications
    • Version 1 only implied the sign functionality
    • Version 2 simplifies and tightens specifications

Slide 24: Version 2: SEP Summary

  • Benefits of SEP
    • Clearly defines process used to specify product (and test...)
    • Allows validation and verification of standard
    • Helps resolve ambiguities
      • Ensures all dialogs are defined
      • Ensures all objects are defined
    • Makes the standard more useable
      • Users can read the needs and requirements
      • Implementers can trace backwards to understand reason for objects

Slide 25: Version 2: Correct Standard


Slide 26: Version 2: Summary of Changes

  • New Features, Corrections, and Changes
    • Graphics
    • 24-bit Color
    • Msg Positioning
    • Critical Temp
    • Add'l Diagnostics
    • Add'l Config items
    • Time
    • Font Definition
    • Brightness Ctrl
    • Fan Diagnostics
    • Auxiliary I/O
  • All of these support "backwards compatibility"

Slide 27: Backwards Compatibility

  • Term applies to systems, not standards
  • A standard merely "supports" the concept
    • Changes do not conflict with old mechanisms
      • V1/2 system can decode both V1 and V2 data
    • V1 mechanism is not changed
      • The heart of backwards compatibility
      • Any ambiguities still exist
    • V1a may not work with V1b
    • Key is to specify the desired interpretation
  • The standard cannot adequately address this on its own

Slide 28: Version 2: V1 Compatible

[V1 compatibility matrix.]

Slide 29: Version 2: Deployment

  1. Guides
  2. Test Procedures
  3. Test Tools
  4. Workshops and Assistance

Slide 30: Version 2: Guides

  • Procurement Guide
    • Supplements Procurement Workshop
    • Explains procurement process
    • Includes language to include in specification
    • Includes PRL from standard
  • Workbook for Testing Workshop
    • Explains testing process
    • Includes sample documentation

Slide 31: Version 2: Test Procedures

  • At least one test for every requirement (see the sketch after this list)
    • Tests functionality defined in standard
      • Ensures sign can display a message
      • Does not focus on accuracy of sensors
      • Does not test environmental conditions
    • Not 100% exhaustive
  • Defined per NTCIP 8007 rules
    • Tool generic
    • Project generic
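
To make the one-test-per-requirement idea concrete, here is the kind of coverage check a test plan implies; all requirement and test identifiers below are invented for illustration:

  # Hypothetical requirement clauses mapped to hypothetical test procedures.
  requirements = {'3.5.1': 'Activate a Message', '3.5.2': 'Blank the Sign'}
  tests = {'C.3.1': ['3.5.1'], 'C.3.2': ['3.5.1', '3.5.2']}

  # Every selected requirement must be exercised by at least one test.
  covered = {req for reqs in tests.values() for req in reqs}
  missing = set(requirements) - covered
  assert not missing, f'Requirements without a test: {missing}'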

Slide 32: Version 2: Test Tools

  • Test procedures in formal XML structure
    • Tool-generic format
    • Allows export to automated scripts
      • Requires converter for specific script language
      • 80% automatic
      • 20% requires customization
      • Minimizes errors in implementing test procedures
    • Proof of concept included in early deployment
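
A sketch of that export step, assuming a simple, invented XML structure (the real NTCIP 8007 schema differs; this only illustrates the tool-generic-XML-to-script flow):

  import xml.etree.ElementTree as ET

  def convert_to_script(xml_path):
      # Walk each <step> of a (hypothetical) test-procedure file and
      # emit one line of a tool-specific script per step.
      lines = []
      for step in ET.parse(xml_path).iter('step'):
          action = step.get('action')          # e.g. 'GET' or 'SET'
          obj = step.findtext('object')        # object name / OID
          value = step.findtext('value', '')   # value for SET steps
          lines.append(f'{action} {obj} {value}'.rstrip())
      return lines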

Slide 33: Version 2: Workshops, etc.

  • Procurement Workshop
    • Explains procurement process
    • Provides overview of NTCIP structure
    • Explains how to specify NTCIP
    • Discusses extensions to standard
    • Discusses life-cycle issues
  • Testing Workshop
    • Explains test documents
    • Explains NTCIP details
    • Explains testing process

Slide 34: VTTI Early Deployment

Slide 35: VDOT/VTTI Early Deployment

  • V1 initial deployments
    • Experienced many challenges
    • Were not coordinated
  • FHWA wanted a more coordinated approach
    • Demonstrate end-to-end process to industry
    • Provide feedback to standards effort
    • Provide assistance for initial deployment
    • Properly capture lessons learned

Slide 36: VDOT/VTTI Early Deployment

  • Joint effort
    • Virginia DOT
    • Virginia Tech Transportation Institute (VTTI)
    • FHWA
  • Deployed second User Comment Draft
    • One Central System
    • One Sign Vendor
    • Each firm was required to work in isolation
    • Questions fed through Technical Assistance

Slide 37: VDOT/VTTI Early Deployment

  Procure (2005–2006)
    • RFP for Sign
    • RFP for Central
    • Evaluate Proposals
    • Select Vendors
    • Issue POs
    • Implement
  Test Sign (Nov '06–Feb '07)
    • Pre-test (Controller)
    • Initial Test
    • Final Test
  Test Central (Dec '06–Mar '07)
    • Pre-test
    • Initial Test
    • Final Test

Slide 38: User's Perspective

  • PRL
    • Relatively straightforward to fill out the PRL
    • Easy to make mistakes (see the sketch after this list)
      • Entering information in the wrong format
      • Entering repeated variables inconsistently
      • PRL information drives the variable table and testing tool
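
One way to catch such format mistakes before they propagate into the variable table and the testing tool is a simple validation pass; the field names and the integer-only rule below are invented for illustration:

  import re

  # Hypothetical PRL entries: assume each project-specific value must be an integer.
  prl_values = {'Maximum number of fonts': '8', 'Sign height (pixels)': 'fifty'}

  for name, value in prl_values.items():
      if not re.fullmatch(r'\d+', value):
          print(f'PRL entry {name!r} is not a valid integer: {value!r}')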

Slide 39: VDOT/VTTI Early Deployment

  • Used FHWA Test Procedures for v2
    • Tested every requirement included in the deployment (75% central/85% sign)
  • Traceability tables isolated problems
    • Failures could be
      • Ambiguity in standard
      • Problem in test procedure
      • Problem in test tool
      • User error
      • Problem in device
      • Problem in central

Slide 40: User's Perspective

  • Testing
    • Actual test case steps go above and beyond the purely functional tests an agency might be used to
      • Example: Activate/display message (user need 2.4.2.3.1) has 21 steps.
      • Steps 1 and 2 are the "activate" and "display message" actions

Slide 41: User's Perspective

  • Testing (continued)
    • The RTM really does foster a cooperative environment between contractors
    • Eliminates finger pointing/blame game
    • Applying RTM to testing the software allowed apples-to-apples comparison of the software and sign, rather than relying on strictly functional testing of the sign

Slide 42: VDOT/VTTI Early Deployment

  • Demonstrated value of systems engineering
    • Traceability → quick identification of problems
      • Consensus because everyone can see
        • Requirement
        • Need
        • Design
    • Identification of problem → assigned action item
    • Assigned action item → resolution of problem
    • Resolution of problem → accepted product
    • Accepted product avoids conflict and legal issues

Slide 43: VDOT/VTTI Early Deployment

  • Resulting tools
    • DMS Procurement Guide
    • DMS Procurement Workshop
    • DMS Testing Workbook
    • DMS Testing Workshop
    • DMS Test Procedures (8007 Conformant)
    • XML Version of Test Procedures
    • Lessons Learned Report
    • Comments back to DMS WG

Slide 44: VDOT/VTTI Early Deployment

  • Tools still need to be updated
    • Reflect the Recommended Standard (RS) instead of the User Comment Draft (UCD)
    • Enhance based on lessons learned

Slide 45: VDOT/VTTI Early Deployment

  • Successful Deployment
    • Good Standard
    • Good Specs
    • Formal Component Test
    • Formal Integration Test

Slide 46: VDOT/VTTI Early Deployment

Ken Vaughn, Trevilon
kvaughn@trevilon.com

Ashwin Amanna, VTTI
AAmanna@vtti.vt.edu

Tom Stout, FHWA
Tom.Stout@dot.gov
