So I was thinking.
What do you guys think about liberal / anti-racism films?
Are they any better or worse than the actual racist films of the past, or the subtler racism you still see in films today, like racist jokes/narratives/stereotypes?
For example, all of the slave movies: the stated purpose is to educate, but who exactly is being educated by these types of films? Surely not the rednecks with Confederate flags in their homes, or the white elites and politicians. So it's like... at the same time, in a way, it's entertainment built around watching black bodies be killed. Why do we NEED so many of these films every damn year?
Do you side-eye Jill and Ben when they get so excited? 12 Years a Slave!? Ooh, high culture! :butim16robert: