This is like asking, "Which is better: an orange or an apple?"
I think they are both important, because both are experts in their fields and both are great benefactors to society. Some may say one is more important than the other, or that both are unimportant since other people (such as parents) can teach you or treat you anyway.
But no one can teach you better than a teacher, and no one can restore your health better than a doctor. That's why they went to school and specialized in their fields: to become experts in what they do. And what they do, teaching and healing, is something we all need.
Besides, they need each other. Who will teach the doctor math and science without a teacher? And who will treat the teacher when they fall sick if there is no doctor?