This weekend I watched four movies, two by Louise Hay and two by Dr. Wayne Dyer. Something profound struck me after watching You Can Heal Your Life by Louise Hay: the story of how she began to live in her purpose of teaching about healing, saw people healed, and healed herself from cancer.
I found myself asking: in the world today, where are all the people who teach about healing? What are they doing now? Do people even believe they can be healed, or has that belief gone out of the window?
I love the part in the movie where her illness became a wake-up call to take care of her health. She stopped eating junk food, hit the stop button on her life, and went from busyness to inspiration and total healing. A purposeful and meaningful transformation took place.
I don't think I've ever experienced the world so full of fear. There's division across the planet about what is right and who is right, and confusion about what and whom to believe. What would it take for us to hit the stop button and really think about what is going on around us, casting judgement aside? How can we collectively change in order to heal this world? Is there anyone out there who still believes this, I wonder?