I heard a statement the other day along the lines of "doctors are to black women as police are to black men," and the comments on it were really eye-opening. So many stories of women who struggled to get their doctors to listen to them, and some had even lost family members because of it. I think it creates a really big problem where black women are so worried about being brushed off by the doctor that they don't go at all. I also know that not much research is being done on how disease presents in black women. I feel like all of this is a silent killer that no one talks about. What do you think about it? Have you girls ever had any bad experiences?