Feminism is the belief in and advocacy of the equality of women and men in the political, economic, and social aspects of society.