2. Development of the Evaluation Strategy
2.1 Review Of ADMS Virginia Documents
The Evaluation Team reviewed and commented on many of the documents produced during the course of ADMS Virginia's development. These included:
- TMC Applications of Archived Data Operational Test, Modified "Build" Methodology & Schedule, January 24, 2003
- Concept of Operations, ADMS Virginia, March 17, 2003
- Build 1 - Functional Requirements Document, ADMS Virginia, April 01, 2003
- Build 2 - Functional Requirements Document, ADMS Virginia, July 16, 2003
- Build 3 - Functional Requirements Document, ADMS Virginia, December 16, 2003
- ADMS Virginia Draft Final Report, December 1, 2004
The ADMS Virginia team developed many more documents on which the Evaluation Team did not provide formal comments.
2.2 Early Review Of ADMS Virginia Development
The interim documents produced by the ADMS Virginia team, together with the Build approach to system development, allowed the Evaluation Team to review the functionality at several different stages. The first major review of ADMS Virginia progress was held on March 26, 2003. The purpose of this meeting was to flesh out the types of evaluations that should be performed as a precursor to the Evaluation Plan. The Concept of Operations document and initial contact with the ADMS Virginia team were the basis of this formulation. At that time, neither the Concept of Operations nor the ADMS Virginia team indicated direct use of archived data in operations strategies. The focus appeared to be on planning functions, both traditional transportation planning and operations planning (the latter primarily through the use of performance measurement). As a result of this meeting, the ADMS Virginia team gave more emphasis to supporting operations, in addition to maintaining support for planning functions.
2.3 ADMS Virginia Final Functionality
The key features of the final system provided the basis for developing the Evaluation Plan and the Test Plans. These are documented as follows.
2.3.1 Data Processing and Management Functions
- Data Structure - a relational database was constructed. Common location referencing was used to link the various types of data for applications.
- Metadata - metadata describing the data stored in the system is provided.
- Quality Control (QC) - post hoc QC procedures were developed and are applied to the traffic data.
- Data Imputation - for traffic data that are either missing (not reported from the field) or fail the QC tests, an imputation procedure is used to fill in or replace the gaps.
- On-line Query Results - many ADMS Virginia applications summarize data visually for users on-line via maps and statistical charts.
- Output File Formats - the results of queries for data files may be viewed on-line as an Adobe Acrobat (PDF) file or downloaded as files in either comma-separated (CSV) or XML format. The XML definitions were developed by the ADMS Virginia team.
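The QC and imputation steps above can be sketched as follows. This is a minimal illustration, not the ADMS Virginia procedure: the threshold values, the volume/occupancy cross-check, and the historical-average fill rule are all assumptions.

```python
# Illustrative sketch of post hoc QC and gap imputation for 5-minute
# detector records. Thresholds and the historical-average fill rule
# are assumed for illustration, not the actual ADMS Virginia rules.

MAX_VOLUME = 250       # vehicles per lane per 5 minutes (assumed bound)
MAX_OCCUPANCY = 100.0  # percent
MAX_SPEED = 100.0      # mph (assumed bound)

def passes_qc(rec):
    """Return True if a record (volume, occupancy, speed) is plausible."""
    vol, occ, spd = rec["volume"], rec["occupancy"], rec["speed"]
    if not (0 <= vol <= MAX_VOLUME):
        return False
    if not (0.0 <= occ <= MAX_OCCUPANCY):
        return False
    if not (0.0 <= spd <= MAX_SPEED):
        return False
    # Cross-check: vehicles counted but zero occupancy is suspect.
    if vol > 0 and occ == 0.0:
        return False
    return True

def impute(records, historical_avg):
    """Replace missing (None) or QC-failing records with the historical
    average for the same station and time-of-day slot."""
    cleaned = []
    for i, rec in enumerate(records):
        if rec is None or not passes_qc(rec):
            cleaned.append(dict(historical_avg[i], imputed=True))
        else:
            cleaned.append(dict(rec, imputed=False))
    return cleaned

records = [
    {"volume": 80, "occupancy": 12.0, "speed": 58.0},
    None,                                             # missing from field
    {"volume": 90, "occupancy": 0.0, "speed": 55.0},  # fails cross-check
]
hist = [{"volume": 75, "occupancy": 11.0, "speed": 57.0}] * 3
result = impute(records, hist)
print([r["imputed"] for r in result])  # [False, True, True]
```

Tracking the `imputed` flag per record is what makes the "% usable data" and "% imputed data" reports described later possible.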
2.3.2 User Functions and Applications
2.3.2.1 Standard Data Query
This service outputs raw data, or aggregates of the raw data, at user-requested temporal and spatial levels of aggregation. Output is available in CSV, XML, or PDF format, or as a plot. Data are available from the Traffic, Incident, Weather, and TMS databases.
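A standard data query is essentially a group-by over station and time bin. The sketch below illustrates the idea with the Python standard library; the station names, counts, and interval choice are made-up examples, and the actual service runs such queries against the relational database.

```python
from collections import defaultdict

# Illustrative 5-minute raw records: (station, minute-of-day, volume).
raw = [
    ("S1", 0, 40), ("S1", 5, 44), ("S1", 10, 38), ("S1", 15, 50),
    ("S2", 0, 20), ("S2", 5, 22), ("S2", 10, 18), ("S2", 15, 24),
]

def aggregate(records, interval_minutes):
    """Sum volumes per station within user-requested time intervals."""
    out = defaultdict(int)
    for station, minute, volume in records:
        time_bin = (minute // interval_minutes) * interval_minutes
        out[(station, time_bin)] += volume
    return dict(out)

# Temporal aggregation: roll 5-minute counts up to 15-minute bins.
print(aggregate(raw, 15))

# Spatial aggregation: treat both stations as one corridor.
corridor = aggregate([("corridor", m, v) for _, m, v in raw], 15)
print(corridor)
```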
Traffic Data Timeline Plot/Map
Plots or maps volume, occupancy, speed, and quality information for a single corridor, corridor section, or station aggregated at selected time intervals.
Station Data Download
Allows user to view or download detailed volume, occupancy, speed, and quality information for selected corridors, corridor sections, or stations aggregated at selected time intervals.
Incident Data Download
Allows user to view or download detailed incident information.
Incident Analysis
Provides user with the capability to analyze incident information for defined periods of time. This page allows the user to obtain counts for the types of incidents requested, plus detailed incident information. The user can view, download, or plot the data. The data can be plotted by incident type, weather conditions, duration of incident, number of cars involved, number of lanes blocked, or number of incidents that occurred by day. Users can map the average, maximum, or minimum duration of an incident.
Weather Data Download
Downloads weather data from various WBAN (Weather Bureau Army Navy) stations in the region.
TMS Data Download
Downloads detailed classification, speed, and quality information from the Traffic Monitoring System (TMS) for selected links, time period, and time aggregation.
2.3.2.2 Mobility Measures of Effectiveness
Derives a number of defined mobility measures from the archived data and presents these measures in different formats. Users are able to retrieve the following traffic-based mobility measures: speed, flow rate, V/C ratio, speed standard deviation, and VMT.
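As an illustration of how such measures derive from archived link data, the sketch below computes flow rate, V/C ratio, speed standard deviation, and VMT. The link attributes and the assumed per-lane capacity are hypothetical values, not ADMS Virginia parameters.

```python
import statistics

# Illustrative computation of traffic-based mobility measures from
# hourly link data. The 2,000 veh/lane/hr capacity is an assumed value.

CAPACITY_PER_LANE = 2000  # veh/lane/hr (assumed)

link = {"length_mi": 2.5, "lanes": 3,
        "hourly_volumes": [4800, 5600, 5200],  # veh/hr
        "hourly_speeds": [52.0, 44.0, 48.0]}   # mph

# Average flow rate over the analysis period.
flow_rate = sum(link["hourly_volumes"]) / len(link["hourly_volumes"])
# Volume-to-capacity ratio at the peak hour.
vc_ratio = max(link["hourly_volumes"]) / (CAPACITY_PER_LANE * link["lanes"])
# Speed standard deviation (a reliability indicator).
speed_sd = statistics.stdev(link["hourly_speeds"])
# Vehicle miles traveled: volume times link length, summed over hours.
vmt = sum(v * link["length_mi"] for v in link["hourly_volumes"])

print(round(flow_rate), round(vc_ratio, 2), speed_sd, round(vmt))
```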
Mobility Measures - Traffic Download
Allows user to view or download traffic-based performance measures.
Mobility Measures - Traffic Spatial Plot/Map
Allows user to plot or map traffic-based performance measures.
Mobility Measures - Traffic Timeline Plot/Map
Plots or maps traffic-based performance measures. Measures can be aggregated at selected time intervals.
Mobility Measures - AADT Analysis
Allows user to view, plot, or download AADT values.
2.3.2.3 Operations/Maintenance Support
Allows users to evaluate current road conditions and data quality for sensor stations, and to compare current incidents with past incidents.
Current Conditions
Allows user to view speed and flow rate for the last 5 minutes, by corridor. The user can view current conditions on a plot or map. The user can also monitor active incidents in the region.
Incident Insight
Provides traffic information regarding similar incidents from the past.
Data Quality Report
Allows users to view data quality for selected stations. User can download, plot, or map the percentage of usable data and the percentage of imputed data.
Traffic Forecasting
Allows user to view short-term forecasted traffic statistics. Forecasts are made 10, 30, and 60 minutes into the future for level of service and volume. Forecasted volumes may be displayed along with either current volume or historical average volume.
Traffic Forecasting Accuracy
Allows user to review the accuracy of forecasted traffic volumes for the last week.
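ADMS Virginia's forecasting algorithm is not described here; as a simple stand-in, the sketch below forecasts a volume by blending the current observation with the historical average for that time slot, and scores a week of forecasts with mean absolute percentage error (MAPE). The blending weight and all counts are assumptions for illustration.

```python
# Illustrative short-term volume forecast and accuracy review. The
# blend of current and historical-average volume, and the 0.7 weight,
# are assumptions, not the ADMS Virginia algorithm.

def forecast_volume(current, historical_avg, weight=0.7):
    """Forecast the next interval's volume as a weighted blend of the
    current observation and the historical average for that slot."""
    return weight * current + (1 - weight) * historical_avg

def mape(forecasts, actuals):
    """Mean absolute percentage error over paired forecast/actual lists."""
    errors = [abs(f - a) / a for f, a in zip(forecasts, actuals) if a > 0]
    return 100.0 * sum(errors) / len(errors)

# Forecast from a current count of 500 veh and a historical average of 460.
print(round(forecast_volume(500, 460), 2))  # 488.0

# Review last week's accuracy, as the Traffic Forecasting Accuracy
# page does for forecasted volumes.
print(round(mape([480, 510, 450], [500, 500, 500]), 1))  # 5.3
```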
2.3.2.4 Evacuation/Special Events Planning
This service aids the development and implementation of evacuation plans for major disasters such as hurricanes or for local events such as the July 4th holiday.
2.3.2.5 HOV Monitoring/Evaluation (currently NoVA only)
This service provides reports for HOV usage monitoring/evaluation on the I-95 and I-395 corridors. These analyses are currently available only for weekdays.
HOV Daily Report
Allows user to view, download, or plot volume, speed, and quality information for two pre-selected stations (one inside the Beltway and one outside the Beltway) during the AM and PM HOV restriction peaks.
HOV Detailed Analysis
Allows user to view, download, or plot volume, speed, and quality information for either a Mainline (HOV or RHOV) or Ramp (On or Off) station analysis.
2.3.2.6 Transportation Planning and Air Quality Support (currently Hampton Roads only)
Supports air quality analysis needs and long-range transportation planning by computing statistics typically used as inputs to travel demand forecasting and emissions models: volume, speed, VMT, percent VMT by hour, V/C ratio, level of service, peak hour factor, and average daily traffic.
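Two of these statistics can be illustrated briefly. The sketch below computes the peak hour factor (using the standard definition: peak-hour volume divided by four times the peak 15-minute flow within that hour) and percent VMT by hour; the counts and link length are made-up example data.

```python
# Illustrative computation of peak hour factor (PHF) and percent VMT
# by hour from made-up counts on a single link.

quarter_hour_counts = [210, 260, 240, 230]  # veh per 15 min in peak hour
hourly_volumes = [400, 700, 940, 820, 500]  # veh/hr over a 5-hour period
link_length_mi = 1.5

# PHF = peak-hour volume / (4 x peak 15-minute flow).
peak_hour_volume = sum(quarter_hour_counts)
phf = peak_hour_volume / (4 * max(quarter_hour_counts))

# Percent VMT by hour: each hour's VMT as a share of the period total.
vmt_by_hour = [v * link_length_mi for v in hourly_volumes]
total_vmt = sum(vmt_by_hour)
pct_vmt_by_hour = [100.0 * v / total_vmt for v in vmt_by_hour]

print(round(phf, 3))  # 0.904
print([round(p, 1) for p in pct_vmt_by_hour])
```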
2.3.2.7 DynaMIT Simulation Support (currently Hampton Roads only)
Allows user to download data structured in the input formats of the DynaMIT simulation model.
2.4 Final Evaluation Hypotheses And Approach
The objectives of the evaluation relate to the use of the data to improve TMC-related and other activities. ADMS Virginia has developed a series of applications around its data archive that support a variety of transportation functions. From these, eight hypotheses and associated goals for the evaluation have been constructed. These are organized into three broad areas, as follows:
TMC Operations Planning
Hypothesis #1: Archived data tools enable STC staff to perform more effective Operations Planning
- Goal - improved TMC operations
Hypothesis #2: Use of the ADMS improves systemwide travel conditions
- Goal - less total delay and increased reliability
Planning Functions
Hypothesis #3: Availability of archived data will improve accuracy of regional planning models
- Goal - improved regional planning
Hypothesis #4: Availability of archived data will reduce cost of regional planning models
- Goal - improved regional planning
General Archive Functions
Hypothesis #5: The ADMS provides a mechanism for improving the quality of traffic data
- Goal - improved data quality
Hypothesis #6: The ADMS is portable to other areas
- Goal - provide transferability with a minimum of customization
Hypothesis #7: The ADMS development process has met the needs of the stakeholders
- Goal - exemplary or "model" ADMS design
Hypothesis #8: The ADMS has satisfactorily fused data from different sources
- Goal - applications and queries can access and use disparate forms of data
A summary of hypotheses, goals, measures of effectiveness (MOEs), and required data appears in Tables 1 and 2, followed by a discussion of the individual evaluations.
Table 1. Measures of Effectiveness (MOEs) and Data Sources: TMC Operations Planning

| Hypothesis | Measure of Effectiveness (MOE) | Data Source |
| --- | --- | --- |
| Archived data tools enable STC staff to perform more effective Operations Planning | Change in the time required to post a DMS message following an incident | System Data |
| | Percent of time that ADMS tools are accessed prior to making a DMS change | System Data |
| | Perceived change in the time required to post a DMS message following an incident | Interviews |
| | Perceived usefulness of the ADMS data available to STC operators when considering a DMS change | Interviews |
| | Reported change in the process used by STC operators when considering a DMS update | Interviews |
| | Percent of time that ADMS tools are accessed prior to making a road closure decision | System Data |
| | Perceived change in the time required to plan and implement road closures | Interviews |
| | Perceived usefulness of the ADMS data available to STC operators when planning a road closure | Interviews |
| | Reported change in the process used by STC operators when considering a road closure | Interviews |
| | Percent of time that ADMS tools are accessed prior to making a decision regarding HOV restrictions | System Data |
| | Perceived change in the time required to plan and implement changes to HOV restrictions | Interviews |
| | Perceived usefulness of the ADMS data available to STC operators when planning changes to HOV restrictions | Interviews |
| | Reported change in the process used by STC operators when considering changes to HOV restrictions | Interviews |
| Use of the ADMS improves systemwide travel conditions | Travel time index (mean and 95th percentile), buffer time index, delay, incident duration by type | Archived Data |
Table 2. Measures of Effectiveness (MOEs) and Data Sources: Planning and General Archive Functions

| Hypothesis | Measure of Effectiveness (MOE) | Data Source |
| --- | --- | --- |
| The ADMS improves accuracy of planning models | Perceived usefulness of ADMS tools | Interviews |
| | Reported change in the day-to-day processes of users resulting from the availability of ADMS tools | Interviews |
| | Perceived benefit of ADMS tools | Interviews |
| | Perceived user friendliness of ADMS tools | Interviews |
| | Identification of aspects of the ADMS tools that users find effective | Interviews |
| | Identification of user needs not being met by the ADMS tools | Interviews |
| | Number of ADMS queries made by planners | System Data |
| | Comparison of ADMS performance measures with similar measures from the travel demand model | Archived Data; Model Comparisons |
| | Comparison of ADMS performance measures with similar measures from the MOBILE6 model | Archived Data; Model Comparisons |
| The ADMS decreases costs of planning models | Estimated reduction in data collection costs for model development and calibration | Interviews; Review of previous data collection efforts |
| The ADMS provides a mechanism for improving the quality of traffic data | Failure rates tracked over time by corridor for each QC test in the ADMS software | System Data; Archived Data |
| The ADMS is portable to other areas | Labor hours needed to customize (actual and/or estimated); extent to which code and concepts can be applied to other installations | Interviews; Labor logs by personnel category |
| The ADMS development process has met the needs of the stakeholders | Subjective (attitudes and opinions of stakeholders): perceived usefulness, benefit, and user friendliness of ADMS tools; reported change in the day-to-day processes of users; aspects of the tools that users find effective; user needs not being met | Interviews |
| | Quantitative (system usage statistics): number of "current conditions," "traffic forecasting," "data quality reports," and "incident insight" queries made, by user; number of failed/aborted queries | System Data |
| The ADMS has satisfactorily fused data from different sources | Perceived ease of integration within the ADMS analytical framework | Interviews; Analyst Observations |
| | Perceived ease of integration outside the ADMS analytical framework | Interviews; Analyst Observations |
2.4.2 Evaluation Structure
2.4.2.1 TMC Operations Planning
Hypothesis #1: Archived data tools enable STC staff to perform more effective Operations Planning; and
Hypothesis #2: Use of the ADMS Improves System-wide Travel Conditions
The main effect of the ADMS on the TMC will be in the area of Operations Planning. For the purpose of this evaluation, Operations Planning is defined as activities related to the modification or adjustment of existing Operations strategies. It has a very short-term planning horizon, in contrast with the longer horizon of the traditional transportation planning process. In a broader context, Operations Planning also includes the identification and deployment of new short-term Operations strategies, but the evaluation schedule does not permit enough time for this to be practical.
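The travel-condition MOEs for Hypothesis #2, the travel time index and buffer time index, can be computed directly from archived corridor travel times. The sketch below uses a made-up travel time sample and an assumed free-flow time; it is an illustration of the standard index definitions, not the evaluation's actual computation.

```python
import statistics

# Illustrative travel time index (TTI) and buffer time index (BTI)
# from a sample of archived corridor travel times, in minutes, for a
# corridor with an assumed 10-minute free-flow time. Data are made up.

FREE_FLOW_MIN = 10.0
travel_times = [11.0, 12.0, 11.0, 14.0, 11.5, 13.0, 12.5, 20.0, 11.0, 12.0]

mean_tt = statistics.mean(travel_times)
# 95th percentile travel time (inclusive interpolation method).
p95_tt = statistics.quantiles(travel_times, n=20, method="inclusive")[18]

# TTI: how much longer travel takes than under free-flow conditions.
tti_mean = mean_tt / FREE_FLOW_MIN
tti_95 = p95_tt / FREE_FLOW_MIN
# BTI: extra buffer a traveler must allow to arrive on time 95% of days.
buffer_time_index = (p95_tt - mean_tt) / mean_tt

print(round(tti_mean, 2), round(tti_95, 2), round(buffer_time_index, 2))
```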
2.4.2.2 Planning Functions
Hypothesis #3: Availability of Archived Data Will Improve Accuracy of Regional Planning Models; and
Hypothesis #4: Availability of Archived Data Will Reduce Cost of Regional Planning Models
In assessing the ability of the system to improve regional planning, it is also important to identify the effectiveness of the tools in meeting users' needs. It is hypothesized that the ADMS tools will perform satisfactorily for planners and operators. These MOEs are largely subjective, measuring the perceived usefulness, benefit, and user friendliness of the tools. These measures will be gathered through interviews of various users and stakeholders. The subjective measures will be supported by quantitative measures of the usage of particular ADMS tools by different types of users (e.g., TMC operators, planners, transit operators, traveler information providers, etc.), gathered from the system usage logs. This facet of the evaluation has been rolled into Hypothesis #7 ("The ADMS development process has met the needs of the stakeholders").
2.4.2.3 General Archive Functions
Hypothesis #5: The ADMS provides a mechanism for improving the quality of traffic data
A highly significant concern in the use of archived ITS-generated data is the quality and accuracy of the data. While professionals agree that quality data are required to implement advanced Operations control strategies and to serve secondary uses, budgets to install and maintain field equipment, and to detect suspect data, are often limited. ITS-generated traffic data can be of poor quality for a number of reasons.
Hypothesis #6: The ADMS is Portable (Transferable) to Other Areas
The ADMS Virginia project has great potential for sparking ADMS development in other areas. However, the more directly the results can be applied, the greater the influence the Operational Test will have. A number of general issues will be explored as part of this evaluation:
- Does the design appear to be expandable? (Does the location referencing system used work for other archives? Does the reporting system expand easily to account for other geographic locations?)
- Can the hardware expand to meet larger data set requirements? Can the software meet the needs of users if the number of users grows substantially?
- Can the database structure be transferred in whole or in part to other installations? This will depend to a large degree on the nature of the data being collected in other areas. Although ITS data standards have been developed (TMDD, P1512), adoption of these standards has been slow.
- Which components of the data structure are best suited to transfer (e.g., metadata versus measurement data)?
- To what extent can the software code be used directly by other installations? Are algorithms, concepts, and output formats better suited to transfer than actual code?
Hypothesis #7: The ADMS Development Process Has Met the Needs of the Stakeholders
The development of ADMS Virginia has followed sound IT practice by adopting a user requirements process in designing the system. The ASTM standard on ADUS recommends this approach.1 It would be useful for future ADMS deployments to understand how well this process worked. To this end, interviews will be conducted with stakeholders by the Evaluation Team. A general "question guide" will be used, but answers will be free-form rather than in the format of a traditional survey. The guide will be developed prior to the interviews and will include such questions as:
- Do they use the system? How easy is it to use? Do they need training? Is the training provided sufficient? (What training do they need?) Does the metadata provided meet their needs? (If not, why not?) Do they feel the system is readily accessible? Do they have confidence in the data stored in the system and/or the results they get out of the system? (If not, why not?) Are there specific concerns they have about using the system? How quickly do they get responses back from queries they make of the archive? Does this meet their expectations?
- What analytical capabilities are part of the system? Do the analytical capabilities scale along with the database itself? (For example, as new detectors are added in Hampton Roads, do they change their definition of the "corridors" used for travel time estimation and/or for computing "average corridor volume?")
- How many people access the ADMS? (Inside VDOT? External to VDOT? - Describe who those people are, and how they use it.)
- Is the archive functionality the same as planned? What capabilities (output reports, uses) are actually built, and how do they compare with the original design?
- What was the cost and effort that went into designing and implementing the ADMS (documentation, not a formal evaluation)? This involves review of the labor records of those involved in the ADMS development. Several dimensions will be used:
- Phase of the project: design, implementation, maintenance
- Labor categories: management, senior software engineer, junior software engineer, senior transportation analyst, junior transportation analyst, stakeholder (for meeting attendance), and clerical.
Hypothesis #8: The ADMS Has Satisfactorily Fused Data from Different Sources
A major challenge for any ITS archive is the fusion of data from multiple sources and their combined use in advanced applications. A number of questions/issues arise from the ability to fuse data from different sources.
- Does the system effectively integrate multiple data sources? Can an analyst match data from two different data sources efficiently? (For example, can incident response data be used to easily select volume and speed data?) This should be examined both within the ADMS analytical framework and in terms of whether data can be exported in a way that allows matching of data from different sources outside of the archive's own analytical framework.
- Is the location referencing system capable of correlating data collected from two different data sources? What issues are associated with using that referencing system, given the other referencing systems used by VDOT and the other participating agencies (and what are those other location referencing systems)? How are they integrated, and what does it take to perform and maintain that integration?
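Matching data across sources, as described above, amounts to a join on a common location reference and a time window. The sketch below is a minimal stdlib illustration with hypothetical link IDs and records; within ADMS Virginia this linkage is performed by the relational database via its common location referencing.

```python
from datetime import datetime, timedelta

# Illustrative fusion of incident and traffic records via a shared
# location reference (link_id) and a time window. Link IDs and records
# are made-up example data.

incidents = [
    {"id": 1, "link_id": "I64-E-12", "start": datetime(2004, 6, 1, 8, 5)},
]
traffic = [
    {"link_id": "I64-E-12", "time": datetime(2004, 6, 1, 8, 0), "speed": 55},
    {"link_id": "I64-E-12", "time": datetime(2004, 6, 1, 8, 10), "speed": 24},
    {"link_id": "I64-E-12", "time": datetime(2004, 6, 1, 9, 30), "speed": 57},
    {"link_id": "I264-W-03", "time": datetime(2004, 6, 1, 8, 10), "speed": 60},
]

def traffic_near_incident(incident, records, window_min=30):
    """Select traffic records on the incident's link within a time
    window around the incident start."""
    window = timedelta(minutes=window_min)
    return [r for r in records
            if r["link_id"] == incident["link_id"]
            and abs(r["time"] - incident["start"]) <= window]

matched = traffic_near_incident(incidents[0], traffic)
print([r["speed"] for r in matched])  # [55, 24]
```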
2.4.3 Data Collection and Management
The Evaluation Team worked with STC and other stakeholder staff to identify the appropriate personnel to be interviewed on each topic and to get approval for the interviews. Before most interviews, an interview guide was prepared listing the topics to be covered and the specific questions to be addressed. These guides were used during the interviews.
The Evaluation Team obtained historical archived traffic data. Metadata are crucial for the analyses envisioned, and these were obtained as well; this is especially true for estimates of the quality of the data.
System Usage Data (e.g., user sessions for Websites)
The Evaluation Team relied on STL to provide tracking of system usage.
1 ASTM E 2259, Standard Guide for Archiving and Retrieving ITS-Generated Data.