Are women becoming the new men?

Phoenix, Arizona
April 25, 2014 5:55pm CST
I have noticed recently, with the influx of women into the workplace and the increase of women in higher education, that we seem to have become more stern. For example, has it always been that we marry, have children, and then divorce? Have we always been this calloused toward love, and is it only now that we have the strength to voice our feelings, or lack thereof? Are love and romance really just a man thing?
1 response
@cupkitties (7421)
• United States
26 Apr 14
If women have become the new men, I think it's because they have much more responsibility and stress to deal with. Yes, they have better opportunities, but they're also still expected to be "traditional" women at the same time. They work much harder than men do, and harder than the women who came before them.