Why US Healthcare Isn’t Better At Getting Better

US healthcare has gotten worse despite tremendous efforts to improve it. Physicians can take a leading role in driving toward a better healthcare system.

Read the full post on Forbes - Healthcare