
Measuring FTS Program Outcomes for Grant Reporting

Fentanyl test strip (FTS) programs sit at the intersection of harm reduction and data-driven public health, and the grants that fund them demand rigorous evidence that the investment is working. Whether your program operates under a CDC Overdose Data to Action (OD2A) cooperative agreement, a SAMHSA State Opioid Response (SOR) grant, or another federal funding stream, you will need to demonstrate measurable progress toward reducing drug overdose morbidity and mortality. This guide is designed for program managers at state, local, and tribal health departments who need practical guidance on what data to collect, how to collect it in the field, and how to translate raw numbers into the outcome narratives that funders require. We cover the specific reporting frameworks for CDC and SAMHSA grants, walk through the distinction between outputs and outcomes, introduce evaluation models you can adapt to your program, and flag the most common reporting mistakes that jeopardize continuation funding. The goal is to help you build a measurement system that satisfies your grant requirements, informs your program improvement, and ultimately saves lives.

01

CDC OD2A-Specific Reporting Requirements

The OD2A cooperative agreement structures reporting around a data-to-action framework that incorporates surveillance data, process evaluation, and outcome evaluation. For FTS-related harm reduction activities, OD2A grantees should be prepared to report across several dimensions. Surveillance Integration: OD2A places heavy emphasis on syndromic surveillance. Recipients submitting data through the Drug Overdose Surveillance and Epidemiology (DOSE) system must share aggregate emergency department data on suspected nonfatal drug overdose visits across eight required indicators: all drugs, all opioids, heroin, fentanyl, benzodiazepines, all stimulants, methamphetamine, and cocaine.
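To make the aggregation concrete, here is a minimal sketch of tallying suspected nonfatal overdose ED visits across the eight required indicator categories. The indicator names and record layout are illustrative assumptions, not the official DOSE submission schema:

```python
from collections import Counter

# The eight required DOSE indicator categories named above.
DOSE_INDICATORS = [
    "all_drugs", "all_opioids", "heroin", "fentanyl",
    "benzodiazepines", "all_stimulants", "methamphetamine", "cocaine",
]

def aggregate_ed_visits(visits):
    """Tally suspected nonfatal overdose ED visits per indicator category.

    Each visit is a dict with an "indicators" list; one visit can count
    toward several categories (a fentanyl visit is also an opioid visit).
    """
    counts = Counter()
    for visit in visits:
        for indicator in visit["indicators"]:
            if indicator in DOSE_INDICATORS:
                counts[indicator] += 1
    # Report every indicator, including zeros, so periods stay comparable.
    return {name: counts[name] for name in DOSE_INDICATORS}
```

Reporting zeros explicitly matters: a missing category in one quarter's submission is ambiguous, while a zero is a data point.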

Metric Type | What to Track                       | Why It Matters
Output      | Strips distributed per site/month   | Proves program activity to funders
Reach       | # unique individuals served         | Demonstrates population coverage
Outcome     | Behavior change after testing       | Shows actual impact on overdose risk
Process     | Stock-out days, training completion | Identifies operational gaps
Cost        | Cost per test distributed           | Justifies continued investment

02

Building a Logic Model for Your FTS Program

A logic model is a visual representation of the relationships between your program's resources, the activities those resources support, the outputs each activity produces, and the short- and long-term outcomes you expect to achieve.

03

Evaluation Frameworks: RE-AIM and Beyond

Beyond the logic model, established evaluation frameworks can help you structure your measurement approach in ways that funders recognize and respect. The RE-AIM framework, which stands for Reach, Effectiveness, Adoption, Implementation, and Maintenance, has been directly applied to FTS program evaluation. A 2025 study published in the Harm Reduction Journal used RE-AIM to evaluate New York State's FTS distribution program.

04

Data Management and Quality Assurance

Raw field data is only as valuable as the systems you use to manage, clean, and analyze it. Poor data quality is one of the most common reasons grant reports are flagged for revision or programs receive unfavorable continuation reviews. Establish a Data Flow: Map the journey of each data point from field collection to final report. Who collects it? Who enters it? Who reviews it before it reaches the funder?
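A data flow map translates naturally into automated checks at each hand-off. The following is a minimal sketch of validating one field encounter record before it enters the reporting pipeline; the field names are illustrative, not drawn from any specific data system:

```python
def validate_record(record):
    """Return a list of data-quality problems for one encounter record."""
    problems = []
    # Required fields: flag anything missing or left blank by field staff.
    for field in ("encounter_date", "site_id", "strips_distributed"):
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    # Range check: a negative strip count is always a data-entry error.
    strips = record.get("strips_distributed")
    if isinstance(strips, int) and strips < 0:
        problems.append("negative strip count")
    return problems
```

Running checks like this at data entry, rather than at report time, means errors are caught while the field staff who collected the record can still correct them.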

05

Common Reporting Mistakes and How to Avoid Them

Years of grant administration have revealed recurring errors that program managers should actively guard against. Confusing Outputs with Outcomes: Reporting that you distributed 25,000 test strips is an output. Reporting that 72 percent of participants who tested positive changed their behavior is an outcome. Many programs over-report outputs and under-report outcomes, which leaves reviewers unable to assess whether the investment produced results.
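The distinction shows up directly in how the two numbers are computed: an output is a simple tally of activity, while an outcome requires follow-up data and a defensible denominator. A minimal sketch, with hypothetical record structures:

```python
def output_total(distribution_log):
    """Output: total strips handed out. A tally of activity, not impact."""
    return sum(entry["strips"] for entry in distribution_log)

def outcome_rate(followups):
    """Outcome: share of positive-test participants who changed behavior."""
    positives = [f for f in followups if f["tested_positive"]]
    if not positives:
        return None  # no denominator, so no outcome claim can be made
    changed = sum(1 for f in positives if f["changed_behavior"])
    return changed / len(positives)
```

Note that the outcome function can legitimately return nothing: if you have no follow-up data on positive tests, you have no outcome to report, which is exactly the gap reviewers flag.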

06

Using Published Evidence to Strengthen Reports

Grounding your program reports in the published evidence base signals scientific rigor and helps reviewers contextualize your findings. A key study to reference is the 2025 JAMA Network Open multisite cohort study of 732 people who use drugs, which found that FTS users engaged in significantly more overdose risk reduction behaviors than non-users. Specific behaviors included using smaller amounts, doing test shots, using more slowly, having naloxone present, and not using alone. The study also found an important nuance: while FTS use was associated with more risk reduction behaviors, it was not independently associated with a lower rate of nonfatal overdose. This reinforces the role of FTS as one component of a broader harm reduction strategy rather than a standalone solution. The ASPE report on Drug Checking Programs in the United States and Internationally provides a fuller picture of drug checking interventions and their outcomes.

Outputs ≠ Outcomes
Distributing 10,000 strips (output) means nothing if nobody changes behavior (outcome). Funders increasingly demand outcome data. Build collection into your workflow from day one.
07

Planning for Sustainability and Long-Term Evaluation

Grant-funded programs must eventually demonstrate a path to sustainability, and your evaluation system should be designed with this in mind from day one. Build Evaluation into Operations: The most sustainable measurement systems are those embedded in routine program operations rather than bolted on as a separate research activity. If your distribution staff collect a brief survey as a natural part of every encounter, data collection survives staff turnover. If data collection requires a dedicated research team that disappears when supplemental funding ends, your measurement capacity is fragile.

08

Sample Metrics Dashboard for FTS Programs

A well-designed metrics dashboard provides an at-a-glance view of program performance and serves as the backbone for quarterly and annual reports. The following template captures the essential indicators most FTS programs should track. Distribution Metrics (reported quarterly): Total FTS distributed this period, cumulative FTS distributed to date, number of unique individuals served, number of active distribution sites, number of new distribution sites activated, geographic coverage as percentage of target area, FTS distributed per site per month. Naloxone Co-Distribution (reported quarterly): Total naloxone kits distributed alongside FTS, percentage of FTS recipients who also received naloxone, number of reported overdose reversals among program participants.
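The quarterly distribution metrics above can be rolled up automatically from encounter-level records. A minimal sketch, assuming a hypothetical record layout with participant, site, strip count, and naloxone co-distribution fields:

```python
from collections import defaultdict

def quarterly_dashboard(encounters):
    """Roll encounter records up into quarterly distribution metrics."""
    if not encounters:
        return {}
    strips_by_site = defaultdict(int)
    for e in encounters:
        strips_by_site[e["site_id"]] += e["strips"]
    with_naloxone = sum(1 for e in encounters if e["naloxone_kit"])
    return {
        "total_fts_distributed": sum(e["strips"] for e in encounters),
        "unique_individuals_served": len({e["participant_id"] for e in encounters}),
        "active_distribution_sites": len(strips_by_site),
        "pct_recipients_with_naloxone": round(100 * with_naloxone / len(encounters), 1),
    }
```

Deriving the dashboard from raw encounters, rather than maintaining running totals by hand, means every quarterly figure can be audited back to its source records.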

Key Outcome Metrics From Published Research
  - Used smaller amount after a positive result: 45%
  - Used with someone present: 39%
  - Kept naloxone nearby: 69%
  - Did a test dose first: 42%
Sources & References
  1. CDC. Overdose Data to Action (OD2A). https://www.cdc.gov/overdose-prevention/php/od2a/index.html
  2. CDC. Overdose Data to Action: Evaluation. https://www.cdc.gov/overdose-prevention/php/od2a/evaluation.html
  3. CDC. Drug Overdose Surveillance and Epidemiology (DOSE) System. https://www.cdc.gov/overdose-resources/media/pdfs/2025/04/DOSE_FactSheet_December-2024.pdf
  4. Vermont Department of Health. OD2A-S Performance Measures Technical Guidance. https://www.healthvermont.gov/sites/default/files/document/dsu-cdc-od2a-performance-measures.pdf
  5. SAMHSA. State Opioid Response (SOR) Grants, NOFO TI-24-008. https://www.samhsa.gov/grants/grant-announcements/ti-24-008
  6. SAMHSA. SAMHSA Unified Performance Reporting Tool (SUPRT). https://spars.samhsa.gov/content/samhsa-unified-performance-reporting-tools-administrative-suprt-and-client-suprt-c-data