For at least the last few years, the culture in the West has shown ever-increasing hostility towards men and traditional gender roles.

In my opinion, the foundational principles of Western civilization (in b4 white nationalist - I'm Asian) include freedom of speech and expression, and individualism.

With the erosion of these principles, the West is more a string of supermarket chains than the vaunted civilization that the rest of the world (largely) used to envy and seek to emulate.

Anyone from the West thinking of leaving?

Edit: I'm in Asia