Society has changed greatly when it comes to women's rights, yet some people still believe that women have certain roles. I, for one, think that in a relationship we are a team. At this point in my life, I do all of the housework, cooking, and so on, but my husband works 60 hours a week to provide us with an amazing house and a relaxed lifestyle. My theory is that I'll handle all of that work so that when my husband is home, we can just spend family time together and not have to worry about any other responsibilities.
The point of the post: I was reading some "gossip trash" and got the skinny on Jon Gosselin. Apparently he is broke, can't afford to pay his attorneys, and can no longer do any public appearances because he broke his contract with TLC. Yada yada. At the bottom of these articles, readers can post their thoughts. Here is a comment that I saw: "LEAVE THE POOR GUY ALONE, IF SHE HAD DONE HER JOB AS A WIFE AND MOTHER HE WOULD NOT HAVE HAD TO LOOK ELSEWHERE FOR LOVE. WOMAN NEED THE REMBER THEIR ROLE." [sic]
Seriously! Who says things like that? Are we back in the Dark Ages?! I take great offense at these types of comments, where people say that women have their role, their place in society, blah blah blah. Absolutely not! Women are able to support families, they can be the "breadwinner," and they can also choose to stay home with the kids and run the house. That's the key word, though: they choose to take on that role; they aren't doing it because they have to!
Sigh. I could ramble on and on about this topic, but I'll save you all the time!