I'll be honest: I don't understand why people who claim to be Christians are so quick to bash woke culture.
Christianity is supposed to be about love, compassion, and understanding, so why are so many professed Christians quick to judge and denounce those who are trying to make the world a better place? I'm genuinely asking.
We claim to be people of faith, but then we use our religion as a weapon instead of a tool for healing.
Don't get me wrong: I'm not saying that everything woke culture stands for is "good." There are definitely some aspects of it that I take issue with. However, I think it's important to have an open and honest dialogue about the things we disagree on, rather than simply writing off anyone who doesn't see eye to eye with us.
Sadly, it seems like these days people are more interested in attacking those they disagree with than they are in finding common ground.
Contrary to what many people might think, woke culture didn't start with the modern-day social justice movement.