Despite widespread claims by politicians, providers, and even many members of the public that American health care is the best in the world, a new report from the Institute of Medicine concludes that growing up in the United States actually puts people's health at greater risk than growing up in other wealthy nations.