The term "Feminism" should be renamed. With a root that refers only to women, the name implies female superiority, when the movement should really be about men and women being equal.
I think Feminism should be renamed to something like "Equalism."
-
I don't think this is a bad idea. Maybe feminist leaders should do what the homosexual community did: they adopted the broader title "the LGBTQ community" because it is more inclusive and reflective of modern times. Feminism has moved in the same direction; it has largely shifted toward dismantling rigid gender roles and securing equal treatment (yes, for everyone).