health care
n : the maintaining and restoration of health by the treatment and prevention of disease esp. by trained and licensed professionals (as in medicine, dentistry, clinical psychology, and public health)
health-care adj