Do Not Smile
Smiling is beautiful, an expression of happiness and affection. But why do women always have to look happy?
Telling a woman to smile is usually done for two reasons:
Caring or control
The former may be well-intentioned, but neither is likely to go over well. Moreover, studies show that many men perceive a smile as a sign of submissiveness, weakness, and vulnerability. So telling a woman to smile can be a way of pushing her back into a traditional stereotype. In our society, we are so used to women smiling at us from every advertisement that it is almost disconcerting when they do not.
There is so much more behind every woman than just the smile we all expect. Let’s make that visible.