LongTimeWaiting
Rising Star
Hey everyone. I recently finished a fantastic book that changed my perspective on this beautiful planet Earth: "Factfulness: Ten Reasons We're Wrong About the World--and Why Things Are Better Than You Think". I'm sometimes pessimistic about society; it's hard not to be when the media portrays the world as crumbling, but this book showed me the truth behind what's really going on. I won't spoil it in case you decide to pick it up.
But the real question I want to ask is: what books can you recommend that would improve my perspective on the world and on the societies in it? It could be anything from fiction to nonfiction. Feel free to discuss your favorite books as well, although I'm only really interested in reading them if they'll change my perspective on things.