Monday, November 24, 2014

What Happened to American Medicine?

America once had the best healthcare system in the world, but no longer. We still have the best doctors and nurses, yet they are stymied and frustrated by a system that hampers rather than helps them when they try to care for us. Why?
Read the complete article by Deanne Waldman at AmericanThinker.com.