Gender equality has come a long way since the days when women were expected to stay at home with the kids or work in the corporate world as a secretary with no chance of advancing beyond that role. With the evolution of the American workplace, gender lines are being shattered more than ever. Yet even amid this widespread change, one field seems not to change with the times.

The field of nursing continues to be dominated by women. Not only are male nurses an overwhelming minority, but they also receive no mercy from Hollywood.

While male nurses have become a laughing matter, it is no surprise that the percentage of nurses who are men is shockingly low. Men make up only 5.8 percent of registered nurses in the United States (Dept. of Health and Human Services, 2009). It is striking that, even though gender equality in the workplace has been a hot topic of conversation for the past several decades, this number remains as low as it is.

Many believe the main reason men make up such a small minority of the nursing profession is simply the stereotype that nursing is meant to be a female-dominated field. It is easy to see that nursing is an attractive profession: it has a lot of room for advancement, pays well, and allows for a flexible schedule. Although nursing seems to meet all the criteria of a legitimate career regardless of gender, the rate of male nurses remains low because of the stereotypes that discourage men from entering the field.


About dunit4rays

Criminal Justice Major. ODU '13

4 Responses to Murses

  1. tvanh001 says:

    I think the idea of gender roles plays a huge part in why this field is dominated by women. We talked a lot in class about how women are naturally caring and nurturing. Men, on the other hand, have not been known to show those characteristics as much as women. With that, it makes sense why nursing is dominated by women.

  2. Jeff Acheampong says:

    I guess nursing is so commercialized as a women's profession that a male nurse seems a bit out of the norm, and there are a lot of other stereotyped jobs, like football player and fireman, as well.

  3. alexisib says:

    This is so true. Women dominate the nursing field, and that profession has been extremely genderized. But the same applies to other titles like farmer, firefighter, officer, and still even doctor.

  4. insuru says:

    It is true that women are viewed as caring and nurturing and so would make good caregivers, but men can be the same way, and it shows on the other side of the medical field: doctors, a profession dominated by men.
