Wednesday, October 19, 2011

Gender Roles - Stephanie Reynolds

Western society has some rigid gender roles, and some of them are reinforced by religion. A majority of Western society thinks men should act like men and women should act like women, each within their specific gender roles. However, not every person fits into one category or the other based on anatomy or sexual orientation. Religious institutions, including many Christian and Muslim communities, enforce strict gender roles, and some followers may even turn violent when individuals do not adhere to those roles.

Biology plays a large part in how our society defines gender and gender roles, since some aspects have been with us since Neolithic times. Gender roles for men and women have been more or less hard-wired into our brains so that each would act in certain ways in order to survive. The idea that men should always be out hunting while women play the domestic role is outdated, but it did allow ancient people to survive. Although a majority of women still handle most of the housework and child-rearing, many more opportunities are available to them now. Many men these days still hold the dominant role at work or in the home, but they can change gender roles without much hassle.

The change in views of gender roles really comes from the human instinct to adapt and survive. For example, a mother with two kids may pursue her career and bring home the money while the father stays at home and looks after the kids, simply because the mother earns more at her job than the father would. In the old days this arrangement was unheard of, but today it happens in many families because swapping gender roles allows the family to keep a roof over their heads. Like everything else on this planet that must survive, our views on gender roles had to change in order for the human race to survive.
