After reading a post in a previous thread saying women need to leave their balls at work and be the woman at home to be treated like a queen, and how most men prefer to be "the man" in the relationship, I got to thinking. I do personally agree that we needed some change. For example, wives and mothers are now working, so they should expect their men to help around the house with cooking, cleaning, etc. Also, a woman should have the choice of whether she wants to work after having children. But in the end, do women still need to be women (i.e., feminine, caring) and men need to be men (provider, worker)? Also, did men lose too many rights in the process, or was it just evening up the playing field? Are we better off now than we were before?