I don't know about you guys, but has anyone who works for a medium-to-large-sized company noticed a company-wide tendency to A) lean to the political left, e.g. appease the minority and do stupid things in the name of "diversity", and B) promote women to the top because... they're women?
I have no problem if you're black, blue, white, green or indifferent - but being promoted because of your genitalia reeks of virtue signalling and, worse still, it seems to be happening all around the world (unless, of course, you live in a Middle Eastern country).
I've witnessed women with no clue how to do their job being promoted to run teams and earn six-figure salaries. In days gone by, they'd have earned a quarter, maybe a sixth, of what they're now being paid.
Is this happening where you work? Or am I speaking an alien language here?