The Walking Dead (and Religion)
My husband and I have recently gotten into *The Walking Dead* series. I'm usually not the type to watch a lot of TV, but I'm on the edge of my seat after every episode. It also got me thinking. In a world gone mad, with zombies at every turn and corruption among those who are still alive, what does that say about our society? How could religious people justify their beliefs in a world that seems truly without any sort of godly figure? I don't know about you, but being in that zombie apocalypse would only strengthen my atheism!