I have had white doctors in the past, but I've always ended up moving on to another doctor. I think it's because they could never quite figure out my issues. But whenever I saw a black doctor, he or she would figure out my issue just like that. Either I was seeing bad doctors back then, or they just didn't care about black patients.
Anyway, what are your preferences when it comes to your doctor? Do you care whether they're white or black? Male or female?