Because American feminists realize their movement is bullshit but still want attention
Implying that all American feminists want attention
Implying that the minority of feminists who invoke equal rights over trivial things speaks for the majority
Implying that a majority should be ashamed of themselves due to the actions of a minority
This is exactly what I warned about in the first post. "Feminism" is a broad topic that spans everything from the treatment of women in everyday society to statistics on the wage gap. While some feminists fight against a supposed 'rape culture,' others fight for the fair treatment of women in business settings and elsewhere. In reality, women are still oppressed on a smaller scale, in the form of discrimination within a male-dominated society. Sexism unfortunately still exists and can be observed in multiple settings: a woman is less likely to be hired for the same job a man applies for, simply because the employer is more likely to be male and to trust his own gender to make better decisions.
Between the pointless, ridiculous arguments that radical feminists cling to and the promotion of equal rights between the genders, people love to pick the former because it stirs up far more controversy and, in turn, attention.