Dentistry is the diagnosis, treatment, and prevention of conditions, disorders, and diseases of the teeth, gums, mouth, and jaw. Often considered essential to complete oral health, dentistry can also affect the health of your entire body.