American employers are doing more to promote their employees’ wellness than ever before, according to a recent study by the American Management Association, and say they are doing so because employers are duty-bound to ensure the health of their workers.

U.S. employers more dedicated to wellness