We can see various important new ideas becoming mainstream in society, such as environmentalism, global warming, women’s equality with men, and the dangers of plastic. It often takes decades of a few visionaries repeatedly drawing attention to a particular issue before it bursts into wide public and political consciousness and acceptance, eventually leading to positive actions that create a healthier, fairer and more sustainable world.
A particular favourite of mine, brought to me through the teachings of macrobiotics, is the central role of our daily food in creating and maintaining health. Modern western society has been through a period of complete disconnection between food and health, and has largely abandoned dietary and lifestyle changes as a primary way of healing disease. Going to western medicine for help rarely involves any dietary or lifestyle advice, and our abundant television cooking programmes are all about how the food looks and tastes, with little about the health effects of what is being prepared.
So, what will our world look like once the essential truth that our food is a primary source of health becomes fully accepted? Here are a few of my ideas. What would you add to this list?