Studies have found that as women move into male-dominated fields in larger numbers, pay in those fields drops. So what happens when men start joining female-dominated fields?
This is a particularly important question given that female-dominated jobs have some of the highest projected job and wage growth in the coming decade. (In fact, for just the second time, women outnumbered men in the U.S. paid workforce, in large part due to job growth in health care and education.)
Sociology professors Jill Yavorsky and Janette Dill wanted to find the answer. They asked themselves: "If jobs in female-dominated sectors represent the future, what will it take for men to take them?" And what will happen once they do?
They shared highlights of their recent study and its findings in Business Insider:
- "Men who are unemployed are much more likely to switch to a female-dominated job in fields like education or healthcare.
- Men entering these jobs experienced a wage and prestige increase — which could be because they will only take a position in a female-dominated field that pays more and is more prestigious.
- Men entering these positions could shift how our culture values work that is traditionally done by women — and even lead to higher compensation if these jobs were valued more."
While having fields like teaching and nursing become more valued would be amazing, it's downright painful (albeit unsurprising) to think that it would take more men joining these fields for that shift to happen.
And the potential outcomes aren't necessarily all positive. Many women in male-dominated professions face a glass ceiling; Yavorsky and Dill think the opposite might be true for men who join (and stay in) traditionally female-dominated occupations. They could benefit from a "glass escalator" that accelerates their careers, they said, citing previous studies that found that "straight, white men in nontraditional fields are often fast-tracked to management positions."