I seem to be plagued lately by people telling me to smile. It happened on pretty much a daily basis when I was a kid and continued until I was sixteen or so; after that it only happened occasionally, and usually in the grocery store, of all places. It seems to be mostly, but not exclusively, older males. But these days it's happening quite frequently, and I'm not sure why.
It always enrages me when people tell me to smile. Why should I? I smile when I choose to. Just because I'm not smiling doesn't mean I'm not happy, and even if I'm not happy, what's the problem with that? I feel how I feel, and if you don't like it, that's not my problem. Why should a complete stranger think they have the right to tell me how to act and feel? I'm not going to pretend to be happy just to please some random asshole.
Other than the annoyance factor in all this, I am never sure what to say to these people. I have tried a variety of things, including:
- ignoring the person (doesn't work)
- telling them not to tell me what to do (I usually get the response "but I wasn't!" - um, yeah you were!)
- telling them to fuck off/kiss my ass
- saying "why should I?"
- and if the person is being particularly nasty, I tell them someone in my family has just died
Does anyone else have this problem? And why does it seem to be mostly males who tell me to smile?