Medical racism has been a thing for decades. I try to find Black doctors and dentists when I can. It's not just in America. I know a Kenyan woman who said many people there won't take those WHO (World Health Organization) vaccines or other health care because, according to her, a lot of women, especially in the rural areas, end up sick or dying weeks or months after these people show up.
Also, it's pretty well documented that Black people in countries like America, Canada, and the UK, which don't get as much sunlight as the tropics, died from Covid-19 at higher rates than average, especially those with pre-existing illnesses, and one explanation that's been put forward is a lack of the Vitamin D we get naturally back in Africa.
I've seen a few white doctors say as much. So, knowing that, and knowing how cheap Vitamin D is, why isn't the government, via the CDC or other health agencies, recommending that Black people increase their intake, or even providing it? We know the answer.