Love is what makes life worth living.
Or at least, that’s what I was told as someone who grew up in the ’90s; this sentiment about love seemed to be everywhere.
Disney movies involved women being saved by knights in shining armor. My friends and I lived for the day our Seventeen Magazine issues came in the mail, doling out love advice like a drug to pubescent teenagers. Every time I drove somewhere with my mom, I heard Dr. Laura's harsh love advice.
I was told that love was this thing that would rock my world. And though love is indeed life-altering, those definitions of love were wrong.
After loving a few too many people who didn't deserve my love and having my heart broken by one too many careless people, I realized how warped my idea of love was. I ventured into the world of dating with unhealthy beliefs and habits, and I barely kept afloat in a world I thought would be beautiful.
It took me a while to learn the truth about love.
Even though I spent countless nights crying over men who didn’t deserve my tears, I wouldn’t go back and change anything. I had to go through those heartbreaks to be able to redefine what I believed love to be.
We’re told love is the best thing to ever happen to us.