We are bombarded every day with news that erodes us, or jades us into thinking there is little good in our fellow humans and that mankind isn't going to make it out of the hell we create for each other, whether it's spineless, disingenuous political distortions, football pedophiles, or once-trusted financial institutions and health care companies shamelessly fucking over their customers.
But once in a blue moon, something happens and you realize that, hey, we humans aren't all that bad, and it looks like humanity just might make it.
What happenings, events, personal experiences, or revelations have restored your faith in humanity?