Hollywood makes movies for its primary audience, which is (for now) mostly white. Studios do this because many people still won't go see a 'black' movie, and money drives everything. Until society changes or black people become a majority, this won't change.