This has come up with some other members while talking off the site, and in a few (notorious) threads, and I wanted to pass it on to see what other members think.

I've been looking into cosmetic surgery a bit lately-- I've had a child and lost 75 pounds or so, and between both of those things, parts of my body have really changed. I'd really like to wear a bikini and feel sexy in lingerie, and though I'm hoping that exercise and losing the last 20 pounds or so will make that possible, I also know there's a distinct possibility they won't. So I've looked into a few procedures. Since doing that, there's never been a thought that I'd try to conceal or dissemble or otherwise avoid saying that I've had work done. In fact, if I get work done and it looks good, I might just stand by the pool handing out my surgeon's card. :tongue:

In most cases, and with breast implants in particular, it's usually glaringly obvious that someone has them, and yet both celebrities and non-celebs try very hard to conceal the fact that they've had some sort of enhancement. If you get a nose job or a facelift, it's pretty hard to hide as well. I read awfulplasticsurgery.com a lot, and they've had suits brought against them for speculating that this or that celebrity has had work done.

What I wonder is, why lie? Why not just come out and say that you chose to have work done and you're proud of it? If natural is somehow better to you, why have work done at all? Why try to pass it off as natural? I have no problem with people who get cosmetic surgery, but I don't like being lied to about it, because it's somewhat unfair to people who have chosen to remain natural.

I know there's some stigma attached to cosmetic surgery-- I didn't like it until I lost all this weight and had a kid; then I understood why someone might want it. In my case, fixing my c-section scar almost seems like corrective surgery. I imagine for most people it's similar.

What do you think?