Christianity was introduced to the Americas by European colonizers beginning in the 16th and 17th centuries, and later waves of immigration further increased Christian numbers. Since its founding, the United States has been described by a variety of sources as a Protestant nation.
When was the founding of Christianity?
1st century CE
When did religion begin in America?
In the storybook version most of us learned in school, the Pilgrims came to America aboard the Mayflower in 1620 in search of religious freedom. The Puritans soon followed, for the same reason.