If you look at the latest survey results, fewer and fewer people in the US consider themselves Christian. Europe went that way decades ago. Western culture is now a post-Christian culture. How should Christians adjust?
Living in a Post-Christian Culture
Bill Jacobs
January 15, 2021
For questions about this presentation, send an inquiry via this form.