I think it is important that we hold onto our culture and uplift its good aspects, but I feel we will not get anywhere if we keep uplifting the bad ones and letting Hollywood executives dictate what is and is not our culture for us. They promote the worst things about us, so is it any wonder there are still people who treat us differently because of our skin? I know plenty of people say they don't, but deep down the stereotyping, and this "thug" and "hood rat" image that has been defined for us, shapes how they see us. They want us to fail, and they use this against us.