For crying out loud, broke fellas, and others, forget MTool already. Seriously.
Install Luna Translator. The one scenario where it won't beat MTool is a game with lots of menus full of stats, choices, options and shit (like the latest NTR Case game: it'll hook/extract the main text fine, but likely won't catch those menus, or you'll have to manually select them to be hooked too). And it'll be somewhat slower, since it translates as you go instead of translating everything in advance like MTool does, but still.
Offline LLMs available, online via API too. Shitloads of other translators and settings too. Translations can be shared freely (exported to json files or something), which the greedy fucking dev of MTool locked. The translation can even be embedded in the case of RPGM games, at least some of them (I only tried one or a few); i.e. you'll see the translation right in game instead of the original text, much like with MTool.
I prefer two models for translation from Japanese, suitable for a reasonable 8-12 GB of VRAM at Q4_K_M quants:
- Floppa-12B-Gemma3-Uncensored.i1 (mradermacher): recent and bigger, probably better to have at least 10-12 GB for this one.
- Gemma-2-Llama-Swallow-9b-it-v0.1.i1 (Tokyotech): older but not worse, mind you, and can be squeezed into 8 GB cards.
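If you want to sanity-check whether a quant fits your card, here's a back-of-the-envelope sketch. I'm assuming Q4_K_M averages roughly 4.85 bits per weight (a commonly cited llama.cpp figure, not an exact spec) plus some slack for KV cache and context buffers; real usage varies with context length.

```python
# Rough VRAM estimate for a GGUF quant. Assumption: Q4_K_M averages
# about 4.85 bits per weight; overhead_gb is a guess covering KV cache,
# context buffers, etc.

def est_vram_gb(params_billion, bits_per_weight=4.85, overhead_gb=1.5):
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1024**3
    return round(weights_gb + overhead_gb, 1)

print(est_vram_gb(12))  # Floppa-12B: ~8.3 GB, so a 10-12 GB card is comfy
print(est_vram_gb(9))   # Swallow-9B: ~6.6 GB, squeezes into 8 GB
```

Numbers line up with the recommendations above: the 12B wants 10-12 GB of headroom, the 9B fits on 8 GB cards.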
Imagine everyone using it and then just sharing translations here, at a quality level between Sugoi and Deepseek/ChatGPT (depending on whether you use local LLMs or those online models). E.g. ktez mentioned translating a recent game here using Deepseek with Luna Translator (alongside Google Translate, which is funny, the reason being it's faster), and I bet that can be shared rather easily. Not sure how the second translation will affect that, though. We'd have quite a library of translations...
Moreover, if you share a file with the translation, the whole "slower than MTool" point goes away, because it will load the translation instantly from the json file instead of generating it offline or online.
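The idea is dead simple, something like this sketch. I'm assuming the shared json is a flat {original line: translated line} mapping; the actual export format of Luna Translator may differ, so treat this as an illustration, not its real file layout.

```python
import json

# Assumed format: flat {original: translated} json mapping.
def load_shared(path):
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def translate(line, cache, fallback=None):
    # Instant lookup from the shared file; only lines nobody has seen
    # yet would need the (slow) offline/online model via `fallback`.
    if line in cache:
        return cache[line]
    return fallback(line) if fallback else line

cache = {"こんにちは": "Hello"}
print(translate("こんにちは", cache))  # served straight from the file, no model needed
```

Lines already in the shared file cost nothing; the model only runs on whatever the file doesn't cover.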
In some time I can probably share a translation of
You must be registered to see the links
with the first local model (Floppa) as an experiment.