
Has the Decline of U.S. Christianity Finally Stopped?

The apparent stabilization in religious trends suggests God may be granting us a season to strengthen what remains and to sow seeds for future revival.

“When will American Christianity finally hit rock bottom?”

Many church leaders have had that question on their minds as they’ve watched affiliation trends over the past decade. Now, according to Pew Research Center’s recently released Religious Landscape Study, there may be reason for cautious hope. After years of seemingly relentless decline, the downward slide appears to be leveling off. The comprehensive 2023–24 study, which surveyed more than 36,000 American adults, reveals a religious landscape that has begun to stabilize, albeit at levels far below historical norms.