Gender Roles and Working Culture in the Nursing Industry



Workplace culture and gender roles have changed radically in the past few decades. Roles traditionally associated with one gender have increasingly been taken up by the other: more women now occupy positions once reserved for men (Mills, 2002), while men are venturing into career lines once assumed best suited to women. Nursing, for example, was long considered the preserve of women, yet more men have recently taken up the profession and can be found in many health facilities across the world (Cross & Bagilhole, 2002). Even so, it is not always easy for men to enter professions culturally considered women's work, such as nursing. This essay explores gender roles and working culture in the nursing industry with regard to the sharing of roles between male and female nurses.

The nursing industry has traditionally been stereotyped in favour of women, but this was not always the case. The domination of women in the nursing profession only began to emerge in the 1800s, when largely unskilled female nurses provided the essential medical services. As early as 250 B.C.E., nursing schools existed but admitted only men for training (Thompson, 2014). The oldest documented nursing school is believed to have been established in India, and it too recruited male students to care for the sick. In continental Europe, nursing services were provided by Christian organizations such as churches; when the Bubonic plague broke out in Europe, for instance, church organizations took over the role of caring for the sick and disposing of the dead. The Benedictine Nursing Order established by St. Benedict, as well as the Knights Hospitallers, are among the earliest nursing institutions.