So the fact that Harvey Weinstein was outed as a predator, and all the social media buzz around it, just isn't relevant in this context?
It seems like this whole thing is snowballing and digging up some old stuff out of Hollywood. I don't see how that's in any way a bad thing; Hollywood has always been a cesspool of drugs and rape. Now that it finally might actually change, everyone freaks out over some supposed anti-male/anti-Republican bias. It's boggling.