The more a woman likes her job, the better her self-image and the more she enjoys her life.
The things women find rewarding about work are, by and large, the same things men find rewarding: both the inherent nature of the work itself and the social relationships it brings.