Do you think men and women are designed to have different roles? Do you think women are more nurturing? Do you think men should be the breadwinners? Throughout history, for whatever reason, men have usually gone to work while women took care of the children and the household. For the past thirty years or so in America we have been fighting against these traditional roles, and I think it has caused a lot of problems for our children and our society.

The career-versus-stay-at-home-mom argument, or the "mommy wars," is hotter than ever. This morning on The Fox News Channel one guest actually said "women enjoy mopping more than being with their children so why do we continue to push this idea that women are meant to raise children," or something like that.

At this Olympics, for the first time, there are more women than men competing for the US team. There are more women in college than men and more women in small business than men. Now get this: women even cheat on their spouses more. Hey gals, we have made it. We have more depression, more divorce, and less respect than ever. Women used to be put up on a pedestal, and now a pregnant career gal stands on the subway because the men don't even get up for her. More husbands are home playing Mr. Mom as more women hold jobs in the workforce.

Is all this a good thing? Is it healthy for any of us? Can men really be happy raising the children? Are women really happy being the breadwinner? The traditional roles have reversed. But not for everyone. In many households the women are home and their hubbies are off working. Dinner is on the table at 6 because the wife is there to cook it. It may be an old-fashioned lifestyle, but it is one that many of us believe to be the best way to raise a family.

What do you think?