(HealthDay News) — Having dental insurance doesn’t mean people will actually take care of their teeth, a new study indicates.