Filed under: Government, Health in America | Tags: climate, environment, health, medicine, science, society
I’m in the midst of reading The Omnivore’s Dilemma by Michael Pollan. While reading, I accidentally highlighted the word “food” and, not thinking anything of it, read the definition. Then it hit me. Do the majority of Americans fully understand the inherent meaning of FOOD?
According to The New Oxford American Dictionary, the definition of food is “any nutritious substance that people or animals eat or drink, or that plants absorb, in order to maintain life and growth.” Food, being a fundamental element in the development of human beings and other life forms, is vital to our continued existence on this planet. So what happens when the so-called foods being mass-manufactured and distributed are intentionally depleted of almost all nutrients and, on top of that, are degrading the fertile soil imperative for our survival? Is it then ethical to keep calling these altered materials food, or are they merely solid “stuff” posing as a nutritious product? And if the latter, can humans truly gauge the consequences these foreign substances may have on our bodies and the environment before it’s too late?
This also raises questions about the insufficiency of labeling policies within our American borders (including for imported goods). Is it lawful to put a universal label on all otherwise edible substances that lack nutritional density? Or should federal laws require that these materials be labeled as GMOs (genetically modified organisms) that are potentially toxic, much as we label cigarettes, alcohol, and other well-known biologically harmful drugs? Regulated policies could provide the education consumers need to make intelligent decisions about their personal health and the health of their families.
Furthermore, how would this affect our economy and our healthcare system? Would individuals veer away from genetically modified foods and opt for local or organic options if they knew the truth behind what they were putting in their mouths? If so, Americans would have the opportunity to become healthier beings, less dependent on a government-driven healthcare system to keep them well. After all, the United States spends more on healthcare (approximately one trillion dollars!) yet seems to help fewer people than other industrialized nations. If Americans took health back into their own hands, the demand for healthcare would likely diminish, freeing up funds for areas such as education, Social Security, the military, job production, and so on.
What do you think? I would love your feedback!
Angry Granola Girl