I feel as if schools don't do enough to teach people about Black history. I feel like kids these days aren't taught all the depressing details of the civil rights movement, slavery, hate, and more. Do you think schools should do more to include the truths of our past? I know they teach some aspects, but I feel like they overlook a lot of it.
Makes me wonder what schools in white-dominated areas taught their kids. I could see many going the route of teaching very little.