The most current stats on plastic surgery were just released by the American Society of Plastic Surgeons, and the news is NOT pretty. Probably makes lots of "Dr. 90210"s happy, though.
That's because, as The New York Times reports, more American women are getting implants than EVER BEFORE. In fact, breast implant surgeries have gone up 40 percent over the past 10 years, and almost 300,000 women got boob jobs in 2010.
Botox injections have increased 584 percent over the past 10 years. In general, doctors performed 13.1 million cosmetic procedures in 2010, an increase of 5 percent compared with 2009. And sadly -- though not surprisingly -- women account for 91 percent of cosmetic procedures. Wowowow. It's an epidemic! And it's totally disturbing!
Alright. Let's get this out in the open. Seriously, what's going on here, ladies? Who is to blame -- The Real Housewives, Hugh Hefner, magazines, fashion designers, MEN (likely of the Tiger Woods, misogynistic variety)? Who has been convincing you that your body isn't perfect just the way it is?
Now, obviously, you could write a book the size of Lisa Rinna's lips on the rise of Botox in this country -- and that's a problem, too -- but it's not as invasive as breast implant surgery. So, let's chat about that, shall we? What disturbs me more than Joan Rivers' face is that 300,000 women can't possibly all have a medical, or even a mental, issue that can only be solved by getting breast implants.
As for the "mental" part -- I've heard the argument before from smaller-chested women that their self-esteem suffered because they were born short of a C-cup, or whatever. I guess that's valid, but WHY is your self-esteem so inextricably linked to your breasts? It's so sad that you can't think to yourself, "Hey, I may not have Victoria's Secret-sized wahbos, but I've got gorgeous hair/a stellar smile/killer legs/a fantastic tush, etc." I really feel like that's an issue that would be better solved with some kind of counseling, coaching, or therapy than with plastic surgery.
Then, there are the women who get plastic surgery on their boobsicles after a baby. I can't say I've been there, and who knows -- someday, I may actually be so disillusioned with how my body looks postpartum that I end up going under the knife. So I can't purport to understand quite yet how it feels to be in that position. But I still have to wonder if plastic surgery is actually the best answer for so many women.
What these trends really seem to reflect is that most of us could stand to love ourselves a LOT more and to treat our bodies with more respect and kindness. Sure, maybe some women see going under the knife as a gift to themselves. But more often, I think they're doing it because something's missing ... They don't feel loved, they don't feel desired, they don't feel BEAUTIFUL. (One extreme case in point: Train wreck Heidi Montag.)
I won't hesitate to assert that the bulk of this plastic surgery seems to be the result of hating ourselves. Loathing the hand that nature dealt us. Wanting to fit some unrealistic ideal of the female form that was invented and exists only in Photoshop.
Maybe we could all consider trading reality TV for a reality check? Perhaps then we'd realize that a boob job is rarely, if ever, the best way to feel better about our bodies and ourselves.
What do you think? Why do you think breast implants are on the rise?
Image via David Becker/Getty