The movies that Hollywood churns out are chock full of subtle stereotypes and racism. Has anyone else noticed this? And it’s not just race, either – there are sexist roles as well. I read somewhere that the reason they do this is to appeal to white males (who are still the majority in the US), and so minority characters (both racial and sexual) that deviate from what’s acceptable to this demographic won’t sell tickets. How true do you think this is?
White males are not actually the majority in the US (there are slightly more women than men!), but they are the group with the most disposable income, which is what Hollywood really cares about.
Hollywood producers definitely think that “niche movies” won’t sell. This has been proven wrong by films like The Hunger Games, with a strong female lead (although she was still cast as white). The thing is that historically, these “niche demographic” movies have been given less funding and advertising – unsurprisingly, a lot of them aren’t up to snuff – and producers look at the results and go, “well, it looks like movies about this demographic just don’t sell.”
To find more people who’ve noticed how racist all this is, spend some time on tvtropes.org – it’s amazing how much of this stuff isn’t just offensive, it’s recycled offensive stereotypes, because Hollywood doesn’t even know what the box is, let alone how to think outside of it.
Let’s see:
Typical horror formula = the Black guy dies first, the white woman dies in the shower.
Typical Transformers formula = somehow the Autobots talk like African-Americans. SOMEHOW!
Quite true… to name one…
“unmarried couple has sex at a deserted lake cabin” – they get killed by a person with a hook for a hand.
Yeah, they do it a lot.