Maybe it's the overall political climate or the shitshow that feminism has mutated into, but over the past few years, the women I encounter in public have been rude or hostile toward men, even when those men are with their wives or children.

For example, I went into a higher-end store at the mall with my wife a few years ago. The female salesperson greeted my wife and complimented her profusely, but didn't even acknowledge my existence and ignored anything I had to say during the sales process.

More recently, middle-aged women at the grocery store will run me off the aisle even when I have my baby in a stroller.

Another lady started yelling at me when she parked over the line and couldn't open her own car door.

The irony, stupidity, entitlement, and sense of superiority are all so misplaced and uncalled for, especially when these same women claim to be so mistreated by systemic-whatever nonsense.

Has anyone seen most law firms these days? HR departments? Hiring managers? Recruiters? More and more C-suite executives and politicians? Nearly any advertisement or new show/movie? Predominantly female.

Stop the hysteria and start coexisting.