U.S. Department of Transportation
Federal Highway Administration
1200 New Jersey Avenue, SE
Washington, DC 20590
202-366-4000



FHWA HSIP Evaluation Peer Exchange




November 28-29, 2017

Utah Department of Transportation
Taylorsville, Utah

Summary Report

FHWA-SA-18-021


Introduction

The Highway Safety Improvement Program (HSIP) comprises three components: planning, implementation, and evaluation. While planning and implementing projects is critical to addressing opportunities for safety improvement, evaluation is critical to understanding project and program effectiveness. Employing more consistent and reliable evaluation methods will support future HSIP decisions, optimize return on investment of safety funding, and increase the effectiveness of projects and programs.

State transportation agencies continue to establish and enhance HSIP evaluation practices. Many States are tracking at least basic project information, evaluating projects, employing more advanced methods, and reporting results to stakeholders. While States are making progress in enhancing HSIP evaluation practices, there is still a great deal of variation among States. Agencies need specific guidance to track and evaluate the effectiveness of projects, countermeasures, and programs. There is also an opportunity for many to learn from the successes and challenges of others.

To help advance evaluation practices, the FHWA Office of Safety hosted an HSIP Evaluation Peer Exchange on November 28-29, 2017 at the Utah Department of Transportation headquarters in Taylorsville, UT. The purpose of the peer exchange was to facilitate the exchange of noteworthy practices and lessons learned among the States. The peer exchange also served as an opportunity to promote the recent release of the FHWA HSIP Evaluation Guide.

The peer exchange was organized around six main topics related to HSIP evaluation:

  • Project Tracking
  • Project Evaluation
  • Countermeasure Evaluation
  • Program Evaluation
  • Using Evaluation Results
  • Preparing for HSIP Evaluation

For each topic, peer States led with presentations or key remarks followed by a roundtable discussion. At the end of the peer exchange, following the six topics, the participants divided into four breakout groups. The breakout groups provided an opportunity to reflect and share thoughts on strengths, weaknesses, opportunities, and threats to implementing and advancing HSIP evaluation efforts. Each State also identified key takeaways from the peer exchange. Attachment A includes the detailed peer exchange agenda.

Attendees

The following States attended the peer exchange: Alabama, Arkansas, California, Connecticut, Idaho, Kentucky, Michigan, Minnesota, Nevada, New Jersey, South Carolina, Virginia, and host State Utah. Attachment B includes a full list of attendees.

Topic Area 1: Project Tracking

This topic discussion focused on how States maintain an inventory of completed HSIP (and non-HSIP) projects. Virginia DOT and Idaho DOT presented their efforts to lead off discussion of the topic, followed by a brief overview by Frank Gross of examples from other States.

Virginia

Virginia uses tracking tools developed in-house to assist project managers with tracking their projects. These tracking tools include dashboards and maps that present information in an intuitive manner. VDOT uses its recently developed Smart Portal as an intake process for project submittal and readiness to prioritize HSIP funding, and feeds those projects to its Integrated Six Year Plan and project tracking tools. VDOT HSIP funds are mainly available for two types of projects: traditional location-specific projects and systemic low-cost projects. Systemic low-cost safety projects include the FHWA proven safety countermeasures: flashing yellow arrows, retroreflective backplates, high-intensity activated crosswalk beacons, pedestrian refuge islands, rumble strips, curve delineation, road diets, and safety edge.

Idaho

Idaho uses the Office of Transportation Investment system to track all its transportation projects. Idaho staff have made improvements in recent years to enhance the amount of information available in its tracking systems, including detailed scopes, project dates, and start and end points on the roadway. Idaho intends to use the enhanced information to set up protocols to pull needed information for HSIP evaluation and for countermeasure evaluation.

Other States

Frank Gross briefly described Alaska’s use of spreadsheet-based tools for project tracking. He also showed an example spreadsheet used by Massachusetts DOT.

Discussion Highlights

After the presentations, meeting participants engaged in an open discussion on the topic.

California attendees indicated that, in some instances in their State, a project's completion date is tied to the date recorded in FHWA's Fiscal Management Information System (FMIS). California and Utah use the substantial completion date because construction completion can linger. New Jersey tracks its project construction dates.

The attendees discussed how many years of data are needed after the completion of a project to have sufficient information to evaluate it. Most of the participating States indicated they use three years of data, which is consistent with the state of the practice. As discussed later, no more than five years is recommended due to changing conditions.

Utah continues to improve its crash data so that this information can help with the evaluation of specific locations, such as curves. New Jersey indicated that it continues to use paper crash reports and that there may be overconfidence in data associated with low-volume roads. Minnesota is applying a linear referencing system to its crash information, which will help automate its HSIP application process, particularly through an interface for point-and-click selection on State maps.

Identified Opportunities

Participants identified the following opportunities related to project tracking:

Topic Area 2: Project Evaluation

This topic discussion focused on the tools States use to evaluate their HSIP projects, with South Carolina featured during the discussion.

South Carolina

South Carolina showed its use of crash diagrams and supporting spreadsheet tables to evaluate its projects. The State also designed its evaluation processes for uniformity in benefit-cost reporting. South Carolina discussed its use of a color-coded spreadsheet for project tracking. While there is a desire to enhance current project tracking capabilities and move toward something like Virginia's Smart Portal, the color-coded spreadsheet is relatively simple and easy to maintain until other tools are available.

Other States

Frank Gross provided examples of project evaluation in other States. He described the evaluation tables used in Colorado DOT's districts, the use of crash diagrams in North Carolina and Wisconsin, and North Carolina DOT's web-based access to project evaluation information.

Discussion Highlights

After the presentations, attendees continued their discussion on project evaluation approaches.

The attendees discussed the use of generic costs to analyze types of projects. Colorado indicated that HSIP is viewed as supplemental funding to State projects and that it is preferable not to have all funds dedicated to one project; for example, Colorado never uses HSIP funds to cover the total cost of a roundabout.

The attendees then discussed project evaluation timeframes. Minnesota's project evaluation approach removes any calendar years that include construction activity. Arkansas works with its public information office to acquire supplemental descriptive information for its projects. California expressed a desire for more than three to five years of evaluation data, adding that data collection and evaluation should be continuous. Meeting attendees indicated that ten years of evaluation data for a project may be too long a period because the built environment around the project may have changed significantly during that time. Minnesota uses an Empirical Bayes approach for its evaluation. Idaho currently uses a simple before-and-after approach, but expressed a desire to move to an Empirical Bayes approach.
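
The Empirical Bayes approach mentioned above can be sketched in a few lines. The safety performance function (SPF) predictions, overdispersion parameter, and crash counts below are illustrative assumptions, not values from any State's evaluation; only the structure follows the standard EB before-after method.

```python
# Sketch of an Empirical Bayes (EB) before-after evaluation for one site,
# assuming an SPF prediction and its overdispersion parameter are available.

def eb_expected_crashes(observed_before, spf_predicted_before, overdispersion):
    """Blend observed and SPF-predicted crashes to correct for
    regression-to-the-mean."""
    weight = 1.0 / (1.0 + overdispersion * spf_predicted_before)
    return weight * spf_predicted_before + (1.0 - weight) * observed_before

def eb_effectiveness(observed_after, expected_after_no_treatment):
    """Crash modification estimate: observed crashes after treatment divided
    by the EB-expected crashes had no treatment occurred."""
    return observed_after / expected_after_no_treatment

# Illustrative inputs: 18 observed crashes before, SPF predicts 12.0.
expected_before = eb_expected_crashes(observed_before=18,
                                      spf_predicted_before=12.0,
                                      overdispersion=0.2)

# Scale the before-period expectation to the after period by the ratio of
# SPF predictions (this captures traffic growth between the two periods).
expected_after = expected_before * (13.5 / 12.0)

cmf = eb_effectiveness(observed_after=10,
                       expected_after_no_treatment=expected_after)
```

A resulting `cmf` below 1.0 indicates an estimated crash reduction; a simple observed-before versus observed-after comparison would overstate the effect whenever high-crash sites were selected for treatment.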

Identified Opportunities

Participants identified the following opportunities related to project evaluation:

Topic Area 3: Countermeasure Evaluation

This topic discussion focused on approaches and methods States are using to evaluate countermeasures and develop Crash Modification Factors, including evaluations of multiple project sites and combinations of countermeasures. Minnesota, Kentucky, and Arkansas described their approaches.

Minnesota

Minnesota described its approaches to evaluating systemic projects, namely signing, pavement markings, and intersection lighting. The State has found these evaluations tricky, especially when accounting for risk factors. Minnesota uses control sections where possible, but noted occasional difficulties in accounting for projects with limited mileage (i.e., small samples). They also noted that countermeasure costs are increasing and the total mileage of “treated” roadway is decreasing annually. The State is investigating assistance from outside MnDOT, such as its university system, to develop better approaches for systemic evaluations.

Minnesota uses traffic volume and roadway characteristics as its project selection criteria. MnDOT is also decentralized; therefore, the central office provides criteria to the districts so that projects are managed on a district basis. Minnesota may be able to develop rumble strip policy based on some of the evaluations. Although Minnesota applies nationally proven countermeasures, the State wants to evaluate their actual effectiveness in the State, which also depends on where in the State a countermeasure is located. Minnesota is also trying to identify the number of miles affected annually by HSIP funds.

Kentucky

There is no safety division within the Kentucky Transportation Cabinet (KYTC); instead, KYTC has four central office HSIP staff and one HSIP coordinator in each of its districts. Kentucky's annual HSIP reports focus on money spent, so they do not capture benefits of non-HSIP-funded projects. KYTC partners with the University of Kentucky to support the HSIP program, including the evaluation of countermeasures and the development of annual HSIP reports. The University of Kentucky uses the Wilcoxon signed-rank test and Empirical Bayes approaches to evaluate projects.

Kentucky has an HSIP Investment Plan. As Kentucky treats more sites with high friction surface treatments, the benefit-cost ratios are decreasing. KYTC has a standard procedure for rumble strips; therefore, HSIP funds are no longer used for this countermeasure. The same applies to safety edge, which the State refers to as “durable pavement edge.” HSIP funding helps change decision-makers' perceptions of safety investment. Kentucky is selective with its high friction surface treatments and restricts the use of specific materials.

Kentucky involves maintenance and administration staff to educate all parties on the benefits of specific countermeasure design and application. Nevada has a similar approach, which also helps educate contractors. South Carolina trains its inspectors on the application of high friction surface treatments.

Arkansas

Arkansas presented an overview of its systemic and project evaluations. Systemic evaluations were for projects that included cable median barriers, shoulder rumble strips, and centerline rumble strips. Project evaluations focused on treatments including ultra-thin bonded wearing course, roundabouts, passing lanes, and raised medians. The State found a significant reduction in crashes through the installation of raised medians. The State installed cable median barriers as part of a freeway reconstruction bond. A successful rumble strip pilot project helped to justify a second round of rumble strip installations. The State also installed urban roundabouts; however, these were not funded by HSIP during the pilot phase.

Discussion Highlights

After the presentations, attendees continued their discussion on countermeasure evaluation approaches.

Minnesota indicated it uses sinusoidal rumble strips only in certain situations, as this treatment costs twice as much as traditional rumble strips. The results from a study of sinusoidal rumble strips are documented in Report 2016-23, Sinusoidal Rumble Strip Design Optimization Study.

Frank Gross provided a quick demonstration of FHWA's Roadway Safety Data and Analysis Toolbox to help identify tools related to evaluation. He suggested that attendees review the following publications:

Identified Opportunities

Participants identified the following opportunities related to countermeasure evaluation:

Topic Area 4: Program Evaluation

This topic discussion focused on how States evaluate their overall HSIP program. Minnesota led off the discussion by presenting how public attitudes are measured. Utah then described its benefit-cost evaluation approaches.

Minnesota

Minnesota’s SHSP includes an emphasis area to improve traffic safety culture. More accurately, safety culture is at the center of the State SHSP. Minnesota applied the Integrated Behavior Model, which has been used in many industries, to predict intentions to engage in certain behaviors (i.e., driving after drinking or wearing a seat belt). The State conducted a baseline survey that can be repeated to measure changing roadway safety attitudes. If other States are interested in conducting their own measure of traffic safety culture, Minnesota suggests partnering with researchers who are familiar with traffic safety culture and its underlying behavior models, and have the capacity to develop and implement a high-quality survey.

Utah

Utah uses three methods to evaluate benefit-cost: a three-year crash history, usRAP, and a Bayesian/predictive method. The State compiles an actual three-year before crash history and a three-year after crash history to calculate the benefit of a project. All individual safety project costs and benefits are added together to calculate the overall program benefit-cost ratio.

During project selection, Utah does not count the combined effect of multiple treatments at a site; only the treatment with the highest benefit is counted, so as not to overestimate the impact of treatments. Utah is interested in calculating the benefits of new technology and the potential to apply HSIP funds to such future projects. Utah wants to advance its efforts to perform program-level evaluations using the Bayesian/predictive method instead of the simple before-after method. It would also like national guidance on methods for evaluating systemic safety projects. Utah is considering the evaluation of individual safety elements on non-HSIP program/STIP projects.
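
The program-level roll-up described above can be sketched as follows: each project's benefit comes from comparing its three-year before and after crash histories in monetized terms, and all project benefits and costs are summed for the overall ratio. The unit crash costs and project data below are hypothetical placeholders, not Utah's actual figures.

```python
# Sketch of a program benefit-cost roll-up from three-year before/after
# crash histories. All unit costs and counts are illustrative assumptions.

CRASH_COST = {"fatal": 1_500_000, "injury": 100_000, "pdo": 10_000}

def project_benefit(before_crashes, after_crashes):
    """Monetized benefit: reduction in total crash cost from the three-year
    before period to the three-year after period."""
    before_cost = sum(CRASH_COST[sev] * n for sev, n in before_crashes.items())
    after_cost = sum(CRASH_COST[sev] * n for sev, n in after_crashes.items())
    return before_cost - after_cost

projects = [
    {"before": {"fatal": 1, "injury": 12, "pdo": 40},
     "after":  {"fatal": 0, "injury": 7,  "pdo": 35},
     "cost": 800_000},
    {"before": {"fatal": 0, "injury": 9,  "pdo": 25},
     "after":  {"fatal": 0, "injury": 5,  "pdo": 20},
     "cost": 250_000},
]

# Sum benefits and costs across all projects for the program-level ratio.
total_benefit = sum(project_benefit(p["before"], p["after"]) for p in projects)
total_cost = sum(p["cost"] for p in projects)
program_bcr = total_benefit / total_cost
```

Note that this simple before-after comparison is exactly what the Bayesian/predictive method is meant to improve upon, by correcting for regression-to-the-mean at treated sites.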

Discussion Highlights

The attendees then engaged in a discussion about local involvement in safety program data and project selection:

Identified Opportunities

Participants identified the following opportunities related to program evaluation:

Topic 5: Using Evaluation Results

This topic discussion focused on how States are using HSIP evaluation results to inform policy and safety investment decisions. Michigan and Minnesota presented their approaches.

Michigan

Michigan presented four parts of its HSIP evaluation process: calculating benefit-cost ratios, assigning crash costs and weights, supporting future decisions, and performing multi-objective decision analysis. The State HSIP project application process includes a Time of Return (TOR) form, with separate forms for State and local agencies. Funding is allocated by region at a target amount.

Multiple countermeasures and crash types may be entered into the TOR form, with each crash type and its corresponding crash reduction factor placed on a separate line. Unit crash costs are based on figures published by the National Safety Council; the actual crash cost values matter less than applying the same costs uniformly across applications. The form also considers the interest rate and AADT. The overall calculation of the time of return, in years, includes fatalities and serious injuries that occurred over a five-year period. Overall project costs include right-of-way, construction, and preliminary engineering, but do not include maintenance. Michigan has a set of approved countermeasures, including the systemic treatments of rumble strips, cable median barrier, and retroreflective post-mounted sheeting.
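
The time-of-return idea behind such a form can be sketched as follows. The exact worksheet layout, unit crash costs, and discounting convention in MDOT's form are not specified here, so the helper functions and figures below are illustrative assumptions: each crash-type line contributes annual savings of frequency × unit cost × crash reduction factor, and the time of return is the number of years for discounted savings to repay the project cost.

```python
# Hedged sketch of a time-of-return (TOR) calculation. Crash frequencies,
# unit costs, reduction factors, and the interest rate are illustrative.

def annual_savings(lines):
    """Each line pairs a crash type's annual frequency, its unit crash cost,
    and a crash reduction factor (CRF), mirroring one form line per crash type."""
    return sum(freq * unit_cost * crf for freq, unit_cost, crf in lines)

def time_of_return(project_cost, savings_per_year, interest_rate):
    """Years until cumulative savings, discounted at the interest rate,
    cover the project cost (capped to avoid non-converging cases)."""
    cumulative, years = 0.0, 0
    while cumulative < project_cost and years < 100:
        years += 1
        cumulative += savings_per_year / (1 + interest_rate) ** years
    return years

lines = [(2.0, 120_000, 0.30),   # angle crashes/yr, unit cost, CRF
         (4.0, 40_000, 0.20)]    # rear-end crashes/yr, unit cost, CRF

tor = time_of_return(project_cost=350_000,
                     savings_per_year=annual_savings(lines),
                     interest_rate=0.03)
```

A shorter time of return indicates a project whose expected crash-cost savings repay its cost more quickly, which supports ranking competing applications.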

Michigan communicates its results across MDOT, to its partner local agencies, and via public outreach through various mechanisms. The State SHSP has eleven emphasis areas, and these groups are informed of the results. An MDOT listserv publishes weekly fatality statistics. County engineers across the State hold an annual workshop. The MDOT design division has helped fund graphics, since HSIP funds cannot be used for such activities. More information on Michigan's Toward Zero Deaths effort can be found at http://www.michigan.gov/ZeroDeaths. The State is working on updated guidelines to focus on systemic projects and is investigating combining HSIP and roadside funding.

Minnesota

Minnesota discussed its vision for using evaluation results to inform policy decisions such as safety strategies that should be incorporated into other (non-HSIP) projects. The vision is a circular process where the agency implements projects, evaluates project and countermeasure effectiveness, and influences investment decisions and agency policy based on the results of project and countermeasure evaluations.

To influence policy, there is a need to perform rigorous and reliable evaluations. There have been instances where policymakers held up implementation because a State-specific CMF was not available. For example, the State was asked to develop a reliable State-specific CMF for restricted crossing U-turns (RCUTs); however, it was difficult to identify suitable reference or comparison sites to perform a rigorous evaluation.

Discussion Highlights

In response to Michigan's presentation, California noted that its systemic projects on State highways are expensive and have a long project development period. The State also has a policy that roundabouts must be considered when intersections are analyzed. California indicated that its project application form allows up to three countermeasures per location, with their effects multiplied together.

New Jersey has attempted to develop systemic projects that do not trigger environmental impacts, but as projects are completed, the State is running out of additional project options and opportunities.

Nevada’s shoulder widening efforts included looking ahead five years in advance and working with design teams. There is also coordination with 3R projects and the use of PE and federal funds. Nevada would also like to have a better understanding of true operating costs.
New Jersey reminded participants about accounting for electricity operating costs for traffic signals.

South Carolina and Connecticut set a target benefit-cost ratio of one for projects, and then determine the breakeven cost at which a project still achieves that ratio. This helps them quickly decide whether project costs are likely to overrun the breakeven cost.
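
This breakeven screening reduces to a one-line calculation: with a target benefit-cost ratio of one, the breakeven cost equals the project's expected monetized benefit, giving a quick cost ceiling. The figures below are illustrative assumptions.

```python
# Sketch of breakeven-cost screening against a target benefit-cost ratio.

def breakeven_cost(expected_benefit, target_bcr=1.0):
    """Maximum project cost that still meets the target benefit-cost ratio."""
    return expected_benefit / target_bcr

expected_benefit = 1_200_000            # assumed monetized crash savings
ceiling = breakeven_cost(expected_benefit)

estimated_cost = 1_350_000              # assumed current cost estimate
likely_overrun = estimated_cost > ceiling   # cost exceeds the breakeven ceiling
```

Comparing an evolving cost estimate against this ceiling flags projects that are unlikely to achieve the target ratio before detailed evaluation is complete.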

When discussing the opportunity to estimate lives saved and injuries prevented as a measure of program effectiveness, Utah expressed concern about the methods used to compute these estimates. UDOT was quick to adopt the Highway Safety Manual because it represents the national state of the practice; when UDOT receives questions about how or why it uses certain methods (e.g., average crash costs), it references the manual. The first edition of the Highway Safety Manual does not discuss lives saved or injuries prevented as a measure of program effectiveness.

New Jersey indicated there may be interest in computing lives saved by a project or program.

Frank Gross highlighted additional States and their efforts to communicate results. He pointed out that Florida DOT meets periodically with executives and districts, New York State DOT develops progress reports at its central office for distribution to regions, and Colorado DOT provides similar progress reports to its regions.

Identified Opportunities

Participants identified the following opportunities related to using evaluation results:

Topic 6: Preparing for HSIP Evaluation

This topic discussion focused on the considerations States are making to prepare for HSIP evaluation efforts. Frank Gross presented information about staff and management support in North Carolina, right-sizing evaluations in Montana, and the application of results in Wisconsin.

Kentucky

Kentucky provided information about the relationship between the Kentucky Transportation Cabinet and the University of Kentucky and how the latter supports the State HSIP program. An executive order by the Governor of Kentucky moved transportation research functions from KYTC to the University. Design immunity is not an issue with the University, as KYTC continues to serve as the final signing authority for projects. A challenge for the researchers is the lack of AADT information for local roads, which makes it difficult to prioritize those locations.

Utah

Utah described its relationships with its universities and its in-house consultants. UDOT has partnerships with Brigham Young University (for Bayesian analytics) and the University of Utah (for the crash database), which gives students exposure to highway safety. Intergovernmental agreements are set up with the universities. BYU is a private institution; therefore, it can accept Federal funding only up to a specific limit. UDOT has consultants serving as support staff through on-site employees; this initiative supports local business and does not grow government. UDOT currently has 1,700 employees, 500 fewer than 20 years ago. Fifteen percent of HSIP funding is dedicated to non-engineering tasks, such as data improvements and analysis. UDOT, not the consultants, has final signing authority for all projects.

Minnesota

Minnesota described its efforts to develop a safety evaluation position. MnDOT wanted to employ an analytics expert to replace a retiring position. The position is evaluated on the number of reports produced, which is reported back to management. The position opens a door to candidates without engineering backgrounds; however, the ability to advance in the organization is limited because there are few non-engineering positions, which presents a motivation challenge. Minnesota also has a program that places junior engineers in different roles for three to six months at a time over two years. Nevada indicated it just hired its first non-engineer on staff.

Identified Opportunities

Participants did not specifically identify opportunities related to preparing for evaluation.

Breakout Discussion and Wrap-Up

Frank Gross provided a summary of the topics and allowed attendees to comment. The following is a summary of discussion points and comments by topic area:

Each State then presented the key takeaways from the peer exchange that they would like to explore in greater detail in the future. The following is a summary of key takeaways by State:

Alabama

Arkansas

California

Colorado

Connecticut

Idaho

Kentucky

Michigan

Minnesota

Nevada

New Jersey

South Carolina

Utah

Virginia

Next Steps

FHWA closed the peer exchange with a discussion of next steps to advance HSIP evaluation practices. Following the peer exchange, FHWA will conduct an HSIP evaluation webinar to share noteworthy practices from the peer exchange and others.

Attachment A: Peer Exchange Agenda

Tuesday, November 28, 2017

8:00 am

Opening Remarks

  • Welcome, Carlos Braceras, UDOT Executive Director, and Ivan Marrero, FHWA Utah Division, Division Administrator
  • Overview and Objectives, Karen Scurry (FHWA)
  • Introductions, All

8:45 am

HSIP Evaluation Guide

  • Benefits of HSIP Evaluation and Overview of Guide, Frank Gross (VHB)

9:00 am

Project Tracking

  • Presentations:
    • Virginia’s SmartPortal and Tableau Tools, Deepak Koirala (Virginia DOT)
    • Idaho’s Project Tracking and Prioritization Tool, Kelly Campbell (Idaho Transportation Department)
  • Roundtable Discussion

10:15 am

Break


10:30 am

Project Evaluation

  • Presentations:
    • Colorado’s Project Evaluation Template and Online Reports, Frank Gross (VHB) on behalf of Colorado DOT
  • Additional State Input:
    • South Carolina’s Project Evaluation Spreadsheet, Joey Riddle (South Carolina DOT)
  • Roundtable Discussion

12:00 pm

Lunch


1:00 pm

Countermeasure Evaluation

  • Presentations:
    • Minnesota’s Systemic Countermeasure Evaluations, Brad Estochen (Minnesota DOT)
    • Kentucky’s Experience Using the Shift of Proportions, David Durman (Kentucky Transportation Cabinet)
  • Additional State Input:
    • Arkansas’ Countermeasure Evaluation Experience, Adnan Qazi (Arkansas DOT)
  • Roundtable Discussion

2:30 pm

Break


2:45 pm

Program Evaluation

  • Presentations:
    • Minnesota’s Evaluation of Public Attitudes, Katie Fleming (Minnesota DOT)
  • Additional State Input:
    • Experiences Estimating the BCR for HSIP Projects, Multiple
    • Experiences Evaluating Specific Programs (Systemic, Local), Multiple
  • Roundtable Discussion

4:30 pm

Wrap-up/Adjourn

Wednesday, November 29, 2017

8:00 am

Day 1 Recap


8:30 am

Using Evaluation Results

  • Presentations:
    • Michigan’s Experience Communicating Evaluation Results, Heidi Spangler (Michigan DOT)
  • Additional State Input:
    • Minnesota’s Vision for Using Evaluation Results to Inform Policy, Brad Estochen (Minnesota DOT)
    • Experiences Using Evaluation Results to Inform Policy, Multiple
    • Experiences Using Lives Saved to Report Progress and Justify Funding, Multiple
  • Roundtable Discussion

9:30 am

Preparing for HSIP Evaluation

  • Presentations:
    • Considerations in Preparing for HSIP Evaluation, Frank Gross (VHB)
    • Kentucky’s University Partnership, David Durman (Kentucky Transportation Cabinet)
  • Additional State Input:
    • Minnesota’s Experience Creating Position for HSIP Evaluation, Brad Estochen (Minnesota DOT)

10:00 am

Breakout Discussion

  • S.W.O.T. Analysis of HSIP Evaluation, All

10:45 am

Wrap-Up

  • Roundtable Discussion on Challenges, Opportunities, and Key Takeaways, All
  • Review of Objectives and Discussion, Frank Gross (VHB)
  • Resources, Frank Gross (VHB)
  • Final Questions/Comments, All

11:55 am

Closing Remarks, Karen Scurry (FHWA)

12:00 pm

Adjourn/Safe Travels!

Attachment B: Participant List

The following is a list of attendees at the HSIP Evaluation Peer Exchange.

State Contact Organization
Alabama Kim Biddick Alabama Department of Transportation
Arkansas Adnan Qazi Arkansas State Highway & Transportation Department
California Richard Ke California Department of Transportation
California Howard Giang California Department of Transportation
Connecticut Joe Ouellette Connecticut Department of Transportation
Idaho Kelly Campbell Idaho Transportation Department
Kentucky David Durman Kentucky Transportation Cabinet
Kentucky Tim Tharpe Kentucky Transportation Cabinet
Michigan Heidi Spangler Michigan Department of Transportation
Michigan Mark Bott Michigan Department of Transportation
Minnesota Brad Estochen Minnesota Department of Transportation
Minnesota Katie Fleming Minnesota Department of Transportation
Minnesota Mao Yang Minnesota Department of Transportation
Nevada Lori Campbell Nevada Department of Transportation
New Jersey Angela Quevedo New Jersey Department of Transportation
South Carolina Brett Harrelson South Carolina Department of Transportation
South Carolina Joey Riddle South Carolina Department of Transportation
Virginia Deepak Koirala Virginia Department of Transportation

The following is a list of attendees from the host agency, Utah Department of Transportation.

Scott Jones
Robert Miles
Anne Ogden
Rudy Alder
Brian Phillips
Tyler Laing
Glenn Blackwelder
Clancy Black
Jesse Sweeten
Dallas Wall
Robert Dowell

FHWA staff in attendance at the HSIP Evaluation Peer Exchange included:
