https://substack.com/@bibliotequeando/note/p-163728195?r=6gphrs
When the history of Christianity is invoked, the reaction is almost automatic: crusades, inquisitions, bonfires, indulgences, pedophilia, popes with swords and crowns. And not without reason. The Church's historical record is long and notoriously bloody. But amid this outrage, something fundamental is ignored: before becoming a political colossus with embassies and armies, Christianity was a heresy that struck at the brutal heart of the most ruthless empire in history.
Ask the right questions of a self-proclaimed atheist who believes in nothing, and in less than five minutes they'll hand you a god. Perhaps it won't wear a cassock or carry a crucifix, but it will come with dogmas, liturgies, and commandments. They'll call it a cause, a party, a homeland, progress, or rights. But it will be a god.
Christianity today no longer needs Christians to survive. The idea—that scandalous idea that the downtrodden are the center of the universe—has won. And no one, including me, quite knows what to do about it.
Where are our "gods" leading us today? To the perdition of this aimless world 🌎 or to inevitable salvation?
What do you think? 🤔
Not sure I entirely disagree, but could you elaborate here? (Environmentalism becoming a deity is the obvious candidate in my circles.)