How in the world did the U.S. go from having the best scientists on the planet, the ones who discovered vitamins, minerals, vaccines and cures for disease, to modern times, where the only medicine available is toxic and comes with horrific side effects, and where nutrient-void, chemical-laden food is sold at almost every restaurant and grocery store? It all happened in the name of corporate profits that keep the public sick and in need of expensive care.
In the early 1900s, America was chock-full of small farms and families who ate fresh food from those farms. Cancer, diabetes, heart disease and Alzheimer's barely existed because the soil was rich in nutrients and minerals. If you did get sick, a doctor would come to your home, give you some herbal tinctures or natural remedies, and that was that.
Read More:
http://www.naturalnews.com/035450_Western_Medicine_morals_collapse.html