So is America becoming more politically correct?
I'd say it's more that there's more at stake in the entertainment industry now, so publishers opt for less "risky" writing and development to avoid losing potential consumers and ad revenue.
It's not that they couldn't get away with it; it's that the people responsible for getting shows on air don't want them to try it, fail, and lose money. That's why shows that are hits in the ratings are shoved at us like they're the greatest thing since sliced bread, while shows that are mediocre, or that gain a small cult following at best, get canceled like it's going out of style.