Often I read articles by women saying that things between the sexes really started getting on an even keel when women decided to have sex the way men do. To generalize, men have sex without feelings, purely for pleasure. Many don't discriminate, and will say or do whatever it takes to get whatever girl they want. I'm sorry, but what? Why is this something women would aspire to?

There still seems to be this mentality that if men are doing it, it must be right, so women should do it too. Why is this so, even in 2011? Men and women are not the same. We don't think the same and we don't act the same, and that's okay. I have to believe there is another way to rise up in the workforce, etc., without acting like men. That's not being true to oneself, right? And what's more important, faking your way up a ladder or finding a way up honestly?

Maybe I'm being unrealistic. I'm not in the corporate world. But it just makes me a little sad when I see a woman stand up and say, "I have sex the way men do." Not a good thing.