I just don’t even know anymore… I don’t like this planet, I don’t want to live here.
What kind of world is it when the news shows us nothing but wars, disasters, bombings, shootings, hunger, sickness, death? It’s like we’re headed to hell in a handbasket. How am I supposed to raise children in this? It scares the heck out of me that my sister has to grow up in this world; how can I dare to bring kids into it when there’s a chance they’ll get shot just going to the movies?
When did the villains stop existing only in comics, books, and movies and suddenly become part of everyday life again? Where did safety, security, and trust in your fellow man go? And when are they coming back?
And most importantly… where did we as human beings go wrong? How did we let society get this way? It can’t have just happened on its own; something must have changed to let us fall this far. I wonder what it was.