I think they are. When I was still on Twitter, I saw a whole mess of it. We see it in TV, movies, and music too: people being told to be selfish, to worry only about themselves, and so on. Yes, take care of yourself, but not to the point where you become a scumbag. I feel like every culture in the West is being uplifted for its worst attributes. Worst of all, it's now spreading like wildfire and being celebrated. You see this in black communities, in gay communities, and generally with women from all different backgrounds. I'm all for freedom and people doing what they want in life, but making nothing but bad life choices doesn't just screw up your own life; it screws up everyone in your life and around you. They're establishing a culture where failure and struggle in life are inevitable. What do you think about all this?