I remember a conversation I had with a friend a few years ago as she explained her views on makeup and why she chooses to no longer wear it on a regular basis.
“Really?” I found myself saying. “You look great!”
Looking back, I wonder why I said that. In hindsight, it was meant as a compliment, but the words now seem twisted in my head, foolishly contradictory. Why did I feel the need to validate her choice not to wear makeup? To suggest that she didn’t need to wear it because she already looked great? Is it because society has taught women that without makeup we can’t possibly be beautiful? Who decided that human skin is imperfect and needs to be covered, “fixed,” and “painted” into something “better”?
Don’t get me wrong, there is absolutely nothing wrong with makeup. The question I’m getting at is why it is so important, especially for women. Why does it feel like makeup defines beauty?
While I don’t plaster on a large amount of foundation and eyeliner every morning, I usually put on a little makeup because it makes me feel pretty. But why does it make me feel pretty? It doesn’t change who I am. It doesn’t define me. All it does is add a layer of liquid and powder to my skin and an extra bill on my bank statement.
There are days when I just don’t have time or simply don’t want to take the time, and I leave the house with a completely bare face. If I’m being absolutely honest, when I do this, I have to remind myself that it’s okay. Whether or not I wear makeup does not define who I am as a person.
But if that’s true, then why does every trip out of the house without makeup feel like making a statement?
Recently, I traveled in India, and honestly I only wore makeup a few times during the two weeks I spent there. Why? Because the weather was hot and humid, and when I wore it, it felt like the makeup was melting off my face. I was twice as sweaty and my skin twice as oily. It simply wasn’t worth it to me.
And yet I felt like I had to justify that decision. Not to anyone in particular. I had to justify it to myself. Not because I felt any less beautiful by not wearing makeup, but because I had to consider whether or not to wear it in the first place. It was a choice. A conscious decision. The beauty industry has advertised to women since childhood that wearing makeup is simply the normal thing to do. I had to remind myself that simply isn’t true.
I don’t have to wear makeup. I don’t even have to want to wear makeup.
The history of cosmetics dates back to the ancient Egyptians, when wealthier women would carefully apply face paint in order to make themselves appear more beautiful. In our modern society, the roots run deeper still, with businesses profiting off human insecurity. In 2019, the U.S. cosmetics industry was worth a whopping $93.5 billion.
I’m not saying boycott makeup altogether, but I wish more of us would put our beauty routines in perspective. Women are not required to wear makeup simply because it has been advertised to us since childhood that we should. It’s okay to choose to wear it, so long as it’s a conscious choice, not something you feel you have to do.
Because you don’t.
You don’t have to wear makeup.
There is a timeless saying: beauty is only skin deep. Honestly, though, beauty is far more complex than surface level. And I know we all know this, but sometimes it’s good to have a reminder that we are more than our appearance. Whether or not our appearance fits the mold of beauty defined by society is irrelevant.
True beauty is found within.
Hi! My name is Rachel. I love to write about life and love, and to reflect on how the past builds the future. Mostly, I love to tell stories, because I believe there is something about stories that brings the world closer together. You can check out some of my writing reflections here at Rachel Writes.