Feminism started out as a good thing. A woman with equal ability should have the same opportunities as a man. She shouldn't be given a leg up or held back because of her gender - it should all be based on ability.
However, what feminism evolved into was women hooking up, disdaining men, and deciding not to raise their own children. Women began taking less interest in their homes, husbands, and kids, and what resulted was very destructive to both men and women.
First off, boys started growing up to be "males" instead of "men." The best thing to ever happen to young males was feminism because it saved them a lot of money paying for whores. With all the hooking up and casual sex that goes on these days, most girls act like whores - they just don't get paid for it. And what guy wants to lay down his life for some skank who has been with 18 guys? What for? He doesn't see her as motherhood and apple pie, he sees her as a skank. This is why young males stand by and watch when girls are molested and raped - it's entertainment to them.
In addition, feminism encouraged mothers to neglect their kids. I think it's wonderful for a woman to go through medical school and save a lot of lives, but she shouldn't have kids. We shouldn't dump kids by the wayside so we can pursue a career. No nanny or day care can take the place of a mother's arms.
What began as a noble cause has emasculated and effeminized our culture to a disgraceful level. As parents, we need to place more value on teaching kids to be ladies and gentlemen again - and fast.