Gender Roles in America Argumentative Essay

Introduction

The issue of gender roles has always been contentious in American society. This stems from the fact that men have dominated women since time immemorial. Since biblical days, the man's role has been depicted as that of protector, breadwinner, and decision maker, while the woman has been portrayed as the homemaker. Her role was to support the man and care for her family by sewing their clothes, cooking, and nursing them when they fell ill. Across many generations, this ideology remained largely the same. In recent years, however, gender roles have been substantially reshaped, and this transformation can be traced to several causes.

Gender Roles

Gender roles are rooted in the sexual and biological differences between men and women. The term refers to a set of behavioral and social norms considered socially appropriate for people of a specific gender. Gender roles vary greatly according to culture and other social factors. Mostly, these roles arise from social-cultural construction, in which the assigned roles reflect the accepted aspirations of a specific gender (Stone & McKee, 1999). However, gender roles may be manipulated to favor a certain gender, and such a scenario may lead to the oppression of the 'weaker' gender.

Changes of Gender Roles in America

Gender roles in American society changed drastically during the First World War. During this war, men had to go and fight for their country while women were left behind to take care of their families; they had to fend for and defend their households. During this period of American history, there was a marked change in gender roles. However, women were still considered weak despite this change, so they could not take the tough jobs deemed fit for men. Women were more often assigned work such as waitressing, secretarial duties, and tailoring, while any significant job that required decision making was reserved for men. In most cases, women were brought up believing that only certain jobs were meant for them, and many of them believed that men were more capable than they were. Nonetheless, this belief has been transformed over the past five decades. Massive efforts by feminist groups, coupled with the introduction of new technology, have enabled women to know their rights in American society.

Factors That Contributed To the Change in Gender Roles in the American Society

The change in gender roles in America was mainly caused by prevailing economic trends. For example, when economic conditions were harsh and men's incomes were not sufficient to cover all of the family's needs, both men and women took up jobs different from what they were used to in order to supplement the family income. This had a long-term effect on the way women viewed certain jobs: they realized that they, too, could succeed in these jobs if given the chance, and that they could contribute to the financial standing of their families. Presently, men face intense competition from women in all avenues, and in rare cases women even earn more than their male counterparts. Most importantly, women are now capable of taking up the role of breadwinner in families where the father is absent. Such women have managed to embrace their roles as mothers and wives while coupling them with their careers. This clearly shows that gender roles in American society have changed; it would not be far-fetched to say that there are no longer definite roles for either men or women.

Women were also pushed to look for employment by the desire to be financially independent of their male counterparts. Unlike in the past, when the man's role was to provide for a woman and take her from her parents' house, present-day men often wish to pursue other important options before even thinking of marriage. Hence, women are left with little choice but to become financially independent before and after they get married. To achieve this, they join colleges to further their studies and then seek employment; many present-day women will only think about marriage once they have secured a stable job. The change in family values may be the primary cause of this shift, and the moral decay that has overwhelmed American society can also be a causal factor. As a result, women see the need for a financial back-up plan in the unfortunate case that their marriage fails.

Education has also contributed considerably to the change in gender roles in American society. Nowadays, women are exposed to the same training and skills as their male counterparts, contrary to earlier years when they were taught only home management skills. Women can now make sound decisions about their future because they have been taught about their rights (Gerson, 2010). Education has helped discourage gender inequality, as both men and women have been trained to respect each other's opinions.

Lastly, the media have influenced the change in gender roles in America. The media often present the picture of an independent woman who has succeeded in her life without being exploited sexually. Clear examples of such women are Condoleezza Rice, Oprah Winfrey, and Hillary Clinton. They are role models to many young girls who admire their character and achievements, and they have proved that women can still make it in life despite competition from a male-dominated society. Young girls are motivated by the success of such reputable women and aspire to be independent in the future. Furthermore, the documentaries and entertainment programs aired by leading TV stations portray women as powerful individuals. For example, the television series 'Desperate Housewives' often portrays women as powerful individuals who have the potential to succeed without receiving favors from men. Such media efforts greatly contribute to the change in gender roles in American society.

Impacts of Changes of Gender Roles in America

The overall impact of the change in gender roles in American society has been both positive and negative. On one hand, families are now more financially stable, children are getting quality education, and the entire family is able to pay for better health services (Cancian, 1987). On the other hand, the change in gender roles has had some unfortunate consequences. For example, some women now engage in criminal activities in order to become financially independent. Furthermore, the United States has one of the highest divorce rates in the world, and some people have even attempted suicide after feeling that they could not cope with the new changes in gender roles.

Conclusion

Over the past five decades, gender roles in America have changed significantly. Several factors have contributed considerably to this notable change, and it has transformed Americans' perspective on gender roles. Furthermore, the change has had both positive and negative effects on American society, although the positive impacts far outweigh the negative ones. In conclusion, American society has improved and become better as a result of the change in gender roles.