Interesting take. White people do love those “important films” that depict Black people as slaves, servants, and people fighting for basic civil rights, especially if there is lots of “important” violence against Black bodies. It solidifies the view of Black people in that inferior position, as well as justifies the…