Feminism
Feminism can be defined as a movement that seeks to improve the quality of women's lives by challenging the norms and ideas of a society based on male dominance and female subordination. It has pursued change in the workplace, in politics, and in the home. Women have come a long way since the 19th century. They have been working to prove to a male-dominated world that they are equal, and that they can perform and complete any task as well as, or in some cases better than, men. Feminism has also changed the definition of men in many ways.
Women's position in the workplace has changed dramatically since the 19th and mid-20th centuries. Even if women had an education in the 19th century, they were not allowed to show it; it simply was not considered proper for women to display any sign of intelligence. Women were expected to prepare themselves to become wives and mothers, roles that defined the extent of their lives. In the early and mid-20th century, some women began to take a stand for themselves. In the early days of feminism, women started to take on a larger role in society. More and more women were becoming educated and seeking positions that carried real power. Men could no longer control everything, and men in the workplace began to feel powerless. Women, meanwhile, drew strength from one another. They were no longer looking only for secretarial jobs; they were taking positions once held by men and excelling at them. Women became professionals with university degrees, sharing jobs with men and performing just as well.