And finally someone said it
No, not so much. Shut the hell up.
Men are the ones who instilled in society the belief that only women can wear make-up, and they're also the ones with far too much criticism for something they hardly take part in. Remember that whole fiasco about fashion trends men hate but women adore? One was red lipstick. Seriously, if men have no experience with make-up, they should have no say in it, either.
Also, make-up isn’t something you wear just because you’re ugly. It’s like a hat: you don’t only put one on because you’re having a bad hair day. It’s something you can choose to wear regardless of how you or anyone else feels about your appearance.
Lastly, women are not better off just because they get make-up. That make-up gets them nothing in terms of equal pay or even getting hired for a job. Men are the ones responsible for banning themselves from “feminine” things. And it’s not as if society mocks men for being unattractive anywhere near as much as it mocks women it deems not beautiful.