Here's my prediction for the future of American films: Disney, like Hollywood in general, will (as always) take the wrong lesson from their failures. They're going to start making "male-centric" movies, but they'll approach them from the woke perspective: making pig movies for pigs. The writers, directors, best boys - excuse me, best "persons" - and all the rest will be just as woke and DEI as ever. The idea of hiring great writers and directors irrespective of their skin color, gender, or anything else is absolutely incomprehensible to management. So we're going to see a lot of condescending crap aimed at what they see as stupid, sexist men. They'll almost certainly try to slide in some sly jabs at the stupidity of their audience, under the assumption that we're all too dumb to notice.
And they'll fail. Maybe some movies will do okay; they might even generate a few hits. But in the end, crap is crap. So they'll fail again. And once again, they'll blame the audience.
Maybe down the road Bollywood or China will start producing good films for American audiences. But one thing is for sure: Disney and Hollywood are lost causes.