There has been a notable shift in Hollywood from films that mock Christianity to films that portray it as a force of evil.
For more resources to live like a Christian in this cultural moment, visit Colsoncenter.org.