Why 'Feminism' Should Be Erased from the American Lexicon

New attitudes are making genuine equality harder to achieve.

Look at education. Girls now do better than boys in school. More women than men go to college. Women are free to compete for any job, are increasingly leading companies, and are winning public office. It is foolish to compare the status of women here to that of women in countries like Afghanistan, Iran, and Iraq. I have worked with women living in these countries. They know the meaning of oppression.

Attempts to position American women as a victim class marginalize us. Politicians and pundits often speak of "women's issues," but there is no reason for "women's issues" to revolve solely around our breasts and ovaries.

American women are first and foremost citizens. At the Independent Women's Forum, our mantra is that "all issues are women's issues," from nuclear nonproliferation to financial regulation, education reform, healthcare reform and terrorism.

Call it free market feminism if you like, but women do not have to be liberals, Democrats, or socialists to be feminists. True feminists should be committed to taking charge of both their own lives and their nation's destiny. They need to shape the future rather than fight the past.

Given the gains women have made, the country we live in today, and the counterproductive agenda embraced by the most vocal self-proclaimed "feminists," maybe it is time simply to retire the word from the American lexicon.
