Immutable characters shouldn't have to be called out no matter what dystopian, nonsense year we live in.
I've noticed that a lot of AI translations have misplaced "he", "his", and "him" terms and garbling like in the first few screenshots. That, coupled with the translator's unwillingness to read the output over and correct mistakes, leads to things being THIS bad.
I don't trust any AI that pushes its own gender politics.
1. This is the single most persistent problem with low-effort machine translations, AI-based or otherwise. It means whatever setup they used doesn't have any context, or more likely that they configured/prompted/processed things wrong.
2. It doesn't have anything to do with gender politics; it's a characteristic of Japanese-to-English translation. Japanese sentence structure does not require a subject, so pronouns simply don't appear that often. When they do appear, their relationship to the subject is far more contextual, and it can get downright ambiguous which pronoun belongs to whom.
English sentence structure DOES require a subject, so if the Japanese sentence doesn't contain one, adding a pronoun is usually the best way to produce a legible sentence while preserving the original meaning with minimal change.
If you are just translating a single line, blind, with no context, and you are a machine that can't derive the context through thoughtful analysis, then it's basically a 50/50 toss-up, which is going to be wrong constantly. A subject-less line like 「買い物に行った」 ("went shopping") is equally "He went shopping" or "She went shopping" on its own.
No politics: it's just the mechanics of the linguistic differences, and it's not a trivial problem.
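To make those mechanics concrete, here is a minimal Python sketch of the difference context makes; the character names, script lines, and glossary are all invented for illustration. A context-aware pass can carry the last-named character's pronoun onto subject-less lines, while a blind per-line translator has nothing to go on:

```python
# Hypothetical sketch: subject-less Japanese lines give a per-line
# translator no pronoun to work with. Tracking the last-mentioned
# character lets later lines inherit the right one.

GLOSSARY = {"Haruka": "she", "Kenji": "he"}  # invented example names

def translate_with_context(lines):
    """Fill subject-less lines with the last-mentioned character's pronoun."""
    current = None
    out = []
    for speaker, english in lines:
        if speaker is not None:
            current = speaker  # a named subject updates the running context
        pronoun = GLOSSARY.get(current, "they")  # fall back when unknown
        out.append(english.format(subj=pronoun.capitalize()))
    return out

script = [
    ("Haruka", "{subj} said goodbye at the station."),  # subject explicit
    (None, "{subj} went shopping."),                    # subject omitted in Japanese
    (None, "{subj} bought a gift."),                    # still Haruka, by context
]
print(translate_with_context(script))
```

A blind line-by-line setup sees only "went shopping" and has to flip a coin on the pronoun, which is exactly the failure mode in the screenshots.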
The most popular solution at the moment is to provide a glossary of characters and the pronouns you want the AI to use for each of them. If you are using a top-end model, you can sometimes get by without it if you give it the entire script; some models can reliably deduce a character's gender from the text if there is enough of it.
But whoever translated this has no idea wtf they are doing; they did neither of those things.
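For the glossary approach, a minimal sketch (prompt wording, names, and pronouns all invented; no particular model's API is assumed) just prepends a pronoun table to the translation request so the model never has to guess:

```python
# Hypothetical sketch of the glossary fix: pin each character's pronouns
# in the prompt before the text to be translated.

def build_prompt(glossary, source_text):
    """Build a translation prompt with a fixed character/pronoun table."""
    rules = "\n".join(
        f"- {name}: use {pronouns} pronouns" for name, pronouns in glossary.items()
    )
    return (
        "Translate the following Japanese text into English.\n"
        "Character glossary (always use these pronouns):\n"
        f"{rules}\n\n"
        f"Text:\n{source_text}"
    )

glossary = {"Haruka": "she/her", "Kenji": "he/him"}  # invented example entries
prompt = build_prompt(glossary, "駅で別れた。買い物に行った。")
print(prompt)
```

The same table works whether you're prompting a chat model or feeding a dedicated translation pipeline; the point is that the pronoun decision is made once, up front, by a human.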
3. You shouldn't trust AI at all, regardless of what you think its politics are. It doesn't have politics anyway; it's just a mathematical prediction engine.
AI is a powerful tool, but this is a toddler with a chainsaw situation.
It isn't letting this person make better translations; it's probably just letting them make bad translations faster.