
Data & Methodology

How national park visitation data is collected, what it measures, and where it falls short.

Where the data comes from

All visitation data on this site comes from the National Park Service's Integrated Resource Management Applications (IRMA) system, specifically the NPS Stats reports. The NPS has been collecting visitation data since 1904, though consistent methodology dates to the late 1970s. This dataset covers 1979 to 2025.

The data is reported monthly by each park unit and published annually. This site covers all 63 designated national parks (not national monuments, recreation areas, or other NPS-managed sites, which use different reporting categories).

What is a 'recreation visit'?

A recreation visit is the entry of a person onto NPS-managed land for recreational purposes. It is not a unique visitor. The same person entering and leaving a park three times in one day counts as three recreation visits.

This is the most important thing to understand about this data: visit counts overstate the number of actual people visiting a park. Parks with free admission and multiple entry points (like Great Smoky Mountains) see particularly inflated numbers relative to parks that charge entrance fees at a single gate.

Non-recreation visits include NPS employees, concessioner staff, contractors, and people who pass through the park on a through-road without stopping for recreational purposes.

How visits are counted

Parks use a combination of methods to count visitors, and the method varies by park:

  • Traffic counters: Pneumatic tube counters or inductive loop sensors at entrance roads. A persons-per-vehicle (PPV) multiplier converts vehicle counts to person counts based on periodic occupancy surveys. The NPS estimates typical counter accuracy at roughly ±10%, though it varies significantly by park and season.
  • Entrance station tallies: Parks with staffed entrance stations count visitors directly through fee collection systems.
  • Permit data: Backcountry permits, wilderness permits, and timed-entry reservations provide direct counts for specific areas.
  • Trail counters: Infrared or pressure-pad counters on popular trails, used to calibrate broader estimates.

The NPS acknowledges that counting accuracy varies significantly between parks. The formal counting procedures are documented in NPS Reference Manual 82-C. Parks with single-point vehicle access and fee stations (like Arches or Carlsbad Caverns) have relatively precise counts. Parks with open boundaries and multiple access points (like North Cascades or Congaree) rely more heavily on estimation.
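The traffic-counter conversion described above is simple arithmetic: vehicles times a persons-per-vehicle multiplier, optionally discounted for non-recreational through-traffic. Here is a minimal sketch; the PPV value, non-recreation fraction, and function name are hypothetical illustrations, not actual NPS parameters.

```python
def vehicles_to_visits(vehicle_count: int, ppv: float,
                       non_rec_fraction: float = 0.0) -> int:
    """Estimate recreation visits from a traffic-counter vehicle count.

    ppv: persons-per-vehicle multiplier from periodic occupancy
         surveys (hypothetical value).
    non_rec_fraction: share of traffic assumed to be non-recreational
         (e.g. through-traffic, staff vehicles).
    """
    persons = vehicle_count * ppv
    return round(persons * (1.0 - non_rec_fraction))

# Hypothetical example: 12,400 vehicles counted in a month, a PPV of
# 2.6, and 5% of traffic assumed non-recreational.
estimate = vehicles_to_visits(12_400, ppv=2.6, non_rec_fraction=0.05)
# → 30628
```

Given the roughly ±10% counter accuracy noted above, an estimate like this carries an uncertainty band of thousands of visits even before the PPV assumption is considered.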

Overnight stays

Overnight stays are categorized into several types, each counted differently:

  • Tent campers and RV campers: Counted through campground reservation systems and check-in records at developed campgrounds. Reported as person-nights.
  • Backcountry campers: Counted through required backcountry permits. More accurate than day-use counts since permits are mandatory in most parks.
  • Concessioner lodging: Hotels, lodges, and cabins operated by private concessioners (like Xanterra or Aramark). Counted through reservation and occupancy records.
  • Concessioner camping: Commercially operated campgrounds within the park.

Overnight data is generally more reliable than day-use data because it requires some form of registration or permit.
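The person-nights unit used for overnight categories multiplies party size by nights stayed. A small sketch with made-up stay records shows how a monthly total would accumulate:

```python
def person_nights(party_size: int, nights: int) -> int:
    """Person-nights: each person is counted once per night stayed."""
    return party_size * nights

# A family of 4 camping for 3 nights contributes 12 person-nights.
# A park's monthly total sums these across all stays and categories
# (tent, RV, backcountry, concessioner lodging, concessioner camping).
stays = [(4, 3), (2, 1), (1, 5)]  # hypothetical (party_size, nights) records
total = sum(person_nights(p, n) for p, n in stays)  # → 19
```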

Recreation hours

Recreation hours estimate the total time visitors spend in the park. These are derived from periodic visitor surveys, not direct measurement. The NPS multiplies recreation visits by an average visit duration (estimated through surveys) to produce this figure.

The accuracy of recreation hours varies considerably, because a single survey-derived average duration is applied to every visit. Parks dominated by short drive-through visits may have their hours overcounted, while parks where people spend full days hiking or climbing may be undercounted.
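The recreation-hours derivation is a single multiplication. A sketch, with hypothetical visit and duration figures:

```python
def recreation_hours(recreation_visits: int, avg_visit_hours: float) -> float:
    """Recreation hours = visits × survey-estimated average visit duration."""
    return recreation_visits * avg_visit_hours

# Hypothetical: 250,000 monthly recreation visits and a survey-derived
# average stay of 4.5 hours.
hours = recreation_hours(250_000, 4.5)  # → 1125000.0
```

Note that any error in the survey-derived average propagates linearly: if the true average stay is 3.5 hours rather than 4.5, the hours figure is overstated by more than a quarter.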

Monthly data

Monthly breakdowns reflect the reporting period in which visits were counted, which generally corresponds to the calendar month. Some parks report on a slightly different cycle, and monthly figures should be treated as approximate. Annual totals are more reliable than any individual month.

Known limitations

  • Visits ≠ visitors. A family of four entering a park counts as four visits. The same person visiting three days in a row counts as three visits. There is no reliable way to derive unique visitors from this data.
  • Counting methods vary by park. Comparing raw visit numbers between parks is inherently imprecise because different parks use different counting methods with different levels of accuracy.
  • COVID-era disruptions. Partial closures, reduced hours, and varying reopening timelines in 2020 and 2021 make year-over-year comparisons for that period unreliable. Some parks counted zero visits for months they were technically open but had closed facilities.
  • Timed-entry systems. Parks like Arches, Rocky Mountain, and Yosemite have implemented reservation systems since 2020. These improve count accuracy but also cap visitation, making pre- and post-permit comparisons difficult.
  • Park boundary changes. New River Gorge was redesignated from a National River to a National Park in 2020. Indiana Dunes was redesignated in 2019. Historical comparisons should account for these changes.
  • Weather and fire closures. Major fires, flooding, or storms can close parks for weeks or months. The data reflects these closures as drops in visitation, not changes in demand.
  • Fiscal vs. calendar year. The NPS fiscal year runs October to September. Some historical reports use fiscal year while IRMA Stats primarily reports calendar year, which can cause discrepancies when comparing sources.
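The fiscal-vs-calendar mismatch in the last point above can be made concrete: the October–September federal fiscal year means October of one calendar year already belongs to the next fiscal year.

```python
def nps_fiscal_year(year: int, month: int) -> int:
    """Map a calendar (year, month) to the NPS fiscal year.

    The federal fiscal year runs October-September, so October 2023
    falls in fiscal year 2024.
    """
    return year + 1 if month >= 10 else year

# October 2023 → FY 2024; September 2024 → FY 2024.
```

When reconciling a fiscal-year report against IRMA's calendar-year totals, the three months of offset (October through December) account for most discrepancies.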

How the data is used

NPS data is presented as-is, without adjustment or normalization. Rankings, year-over-year comparisons, and trend lines reflect the raw reported numbers. Computed metrics (peak months, growth rates, composition breakdowns) are derived directly from the raw data. No adjustments are made for inflation, population growth, or changes in counting methodology.
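Two of the computed metrics mentioned above, peak month and year-over-year growth, are straightforward to derive from raw monthly counts. A sketch with hypothetical figures (not actual NPS numbers):

```python
# Hypothetical monthly recreation-visit counts for one park-year.
monthly_visits = {
    "Jan": 21_000, "Feb": 24_000, "Mar": 58_000, "Apr": 96_000,
    "May": 140_000, "Jun": 190_000, "Jul": 220_000, "Aug": 205_000,
    "Sep": 150_000, "Oct": 90_000, "Nov": 35_000, "Dec": 22_000,
}

# Peak month: the month with the highest raw count.
peak_month = max(monthly_visits, key=monthly_visits.get)  # → "Jul"

def yoy_growth(current_total: int, prior_total: int) -> float:
    """Year-over-year growth as a fraction of the prior year's total."""
    return (current_total - prior_total) / prior_total

annual_total = sum(monthly_visits.values())        # → 1251000
growth = yoy_growth(annual_total, 1_190_000)       # ≈ 0.051, i.e. +5.1%
```

Because these are computed from raw reported numbers, all the limitations listed above (counting-method differences, closures, boundary changes) flow through into them unadjusted.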

About the insights

The insight callouts on charts are manually written observations based on additional research, reporting, and direct experience visiting these parks. They're meant to give context that the numbers alone can't: why visitation dropped in a particular year, what a policy change means for a park's future, or what a trend actually feels like on the ground. Sources are linked where available.

Sources and further reading