One of the tricky things about pop culture is that you never know for sure whether it'll be ahead of the times or behind them. African-American music, for example, was a trailblazing force — it helped make black culture part of the mainstream decades before anyone dreamed we might ever have a president named Obama.
You can see the flip side of this in our current movies' treatment of women. When Hollywood isn't ignoring them altogether, it's usually putting them down, even in romantic comedies supposedly aimed at the female audience.
Nobody gets ruder treatment than career women, who are routinely portrayed as bossy, uptight and utterly without personal lives. What they need, we're supposed to think, is a man. But before they can get one, they must have a mortifying comeuppance.
What do you guys think of this? Personally, I'm wondering how much of this misogyny is a reflection of the culture overall and how much is a reflection of audience/fandom misogyny.