While reading other blogs about women on the internet, I came across a few asking whether women should be required to wear make-up in the workplace. Certain companies require their female employees to wear make-up in order to “look professional” or to “uphold the image of the company.” I personally choose not to wear make-up, and I would be offended if someone tried to force me to. I believe that wearing make-up should be a personal choice for women, and if a woman does choose to wear it, it should not be in order to make her company look better. There is nothing wrong with women showing their natural faces.

Sadly, a few of the arguments I found supporting women being forced to wear make-up were: “Work is often about image and your appearance makes people feel good. Why scare them with an unhealthy washed out face?”, “If you cannot put an effort into your appearance, how can I be sure that you put an effort into your work as well?”, “I know some women who do not wear any at all and you can totally tell…their skin looks so unclear like they havent washed their face in weeks”, and my personal favorite, “I work with several women who don’t wear any makeup what-so-ever….they are ugly. Just saying!!”. Harsh, right?

I believe that going all natural does not make people’s faces look washed out and dirty, nor does it make them ugly. As for the effort issue, just because someone does not wear make-up does not mean that she does not put effort into her appearance. If women were coming to work sloppily dressed or with bad hygiene, then I would agree with the argument 100%, but make-up has nothing to do with effort. For example, I choose not to wear make-up because I feel that I should be accepted for who I really am. I feel that if someone cannot accept the way I look naturally, then they cannot accept me at all and are not worth my time. This may, however, just be my personal bias. What do you guys think about the issue? Should women be forced to wear make-up in the workplace? Or is it their own choice?
