I know it's because of the political climate today, but everything seems to be toned down a lot to appease as many people as possible and not offend anyone. Movies and TV play it safe for the most part. I don't mind offensive content as long as it's done in a comedic way that works, or it tells a story that deals with it and overcomes it. That can be done, but it feels like there's less and less of it these days. Almost like they think we're all babies or something and can't handle anything offensive. Do you guys agree?
I understand some things will trigger people, but the damn world is a trigger in itself. The news is a trigger, the books you read are a trigger. We all live through horrible shit in our lives, some worse than others, and I think seeing it in media sometimes helps us through it.
What do you think? Have comedies, for the most part, become tamer with the comedy they do?