About College Of Dentistry » University Of Florida
Established in 1972, the UF College of Dentistry is the only publicly funded dental school in the State of Florida and is a national leader in dental education, research and community...