PREVIEW: Was America Really Founded as a 'Christian Nation'?
Most Americans believe the U.S. was founded as a Christian nation. But is this merely a myth? Larry talks with the author of a new book arguing that 'Corporate America' invented 'Christian America,' and how that idea has defined and divided our politics ever since.