What are your views on cosmetic surgery? I don't mean reconstructive surgery, like procedures to help burn victims, but purely cosmetic surgery: plastic surgery done just to make the patient feel better about themselves. Do you think these kinds of plastic surgeries are right or wrong? Do people who get them have legitimate gripes about their bodies, or are they victims of a society that places too much emphasis on looks? Personally, I'm a big fan of cosmetic surgery. I've gotten it before, and I would like to get even more work done in the future.