If it weren't for Black History Month, the only things I'd know about my culture would be slavery and the Civil Rights Era.
Maybe it's just my district or something, but literally the only thing we get taught during Black History Month is the Civil Rights Movement.
And again, maybe it's just Jersey, or hell, just my district, but the history classes I've taken so far are far from focused on white people. They've covered stuff like ancient Egypt, the Ottomans, ancient Japan, and so forth. Granted, classes that focus on US history are going to be mostly about white people, but to be honest, most of the major figures in the first 200 years or so of US history were white.
It just confuses me a little when people say "every other month is white history month."