Yesterday, in a post-election discussion, someone told me that the USA is still the greatest country in the world and that it will continue to provide hope to the world. Being a European, I questioned the person, as I would anyone claiming such importance for their country, but the guy insisted that this was the truth.

“Delusional” came to mind, but it also left me wondering: Where does our hope come from?

Tell me: What gives you hope? What makes you believe something is possible, or will get better? What keeps that little light in your heart burning?