Having a positive image of your body is so important. It is probably something that our parents don’t help us with enough. They say, “Love yourself”, “Go to school”, and “Eat your vegetables”. But there should definitely be more sayings flying around the house along the lines of “love the body you were given”, “don’t compare yourself to others”, “having a big butt or big boobs doesn’t make you beautiful” and “don’t give anyone the power to make you feel you aren’t pretty enough just because you don’t fit into their box”.
Granted…these are all things that I know now. But I’ll admit that it took me some time to grow up and understand that.
During my teens and 20s, I had the misconception that I wouldn’t be considered sexy or pretty if I didn’t have tons of skin showing. You would catch me in something super low cut or sheer most days of the week. Especially if I was going somewhere with my friends like the mall, movies, or a nightclub. Shirts would be worn as dresses, bras would be a top, and the skirt had to be so short that bending over was a problem. I recall one time when my friends and I went to a beach party and ran into one of my cousins. My outfit was so revealing that she refused to walk around and be seen with me. Yes, I said refused. But showing off my body somehow made me feel confident. So, I let her bust her moves elsewhere. Plus, it felt like I was showing the world that I loved the way that I looked. But was all of that necessary??
I can’t put my finger on when the switch happened, but at some point my idea of showing a positive self-image shifted. I started to notice that I felt like I was showing off when I’d wear a baby doll dress and sneakers, or jeans with a sweater and low booties. I also noticed that my guy loads up on compliments for outfits that I wouldn’t think would turn his head. He loves seeing me in a jogging suit. It was weird to me, but I realized that I still felt confident and positive about my body when I was completely covered up.
These days, judging by social media, you only love yourself if you are posting up naked. It’s like you have to show how much you love your body by being damn near naked in every photo.
Showing it all seems to have been taken to a whole new level thanks to some people’s need for likes on the ‘Gram and other social media platforms. Okay, I can understand posting a picture of someone in a bra and panty set because they have a lingerie brand. But some of the shit is too much. For instance, I ran across a picture of a girl who was completely naked, with only a small butterfly emoji over her vagina and her nipples blurred out. I looked at her bio and it read, “I have my Bachelor’s and Master’s in Business.” Girl, please. Then post a pic of you studying. What’s the point? Does being butt ass naked for all of the world to see say to the world that she has a healthy body image?? Which do you think?
It is even worse with celebrities. They will drop a naked pic or sex tape and then claim so fast that their phone was hacked or stolen. Yeah, okay. It is all about the need to either make people want more or to show off that body. Who do they think they are fooling? Then if you happen to comment on people showing all of their goods, you are accused of body shaming.
I still love a good mini skirt, bra top, or t-shirt dress and you will definitely see me in one or all of them. I just don’t feel like I HAVE to do it to make me feel sexy. What does having a positive body image mean to you? Do you have to show it all?? Let me know your thoughts in the comments…
Stay Fearless 💋