In the last few years, we've seen countless videos of white women ("Karens") acting like total idiots. I always joked that there's "something in the water" or "they're off their meds." I don't understand the thought process that makes you believe you can act the way they do. White privilege, or mental issues?