Tool Sugoi - a translation tool with an offline AI-powered model to translate from Japanese; a DeepL competitor

Entai2965

Member
Jan 12, 2020
161
459
If I run Linux on a bootable USB, how can I configure Sugoi Translate to use my AMD GPU?
1. Install the ROCm version of PyTorch in the relevant Python instance (a quick check for this is sketched below)
2. Use some server software to host the model
3. Point whatever software you are using to the address of the server

Note that Sugoi Toolkit is Windows-only, but it is possible to run the backend model that Sugoi uses to translate jpn->eng on Linux. Beyond the toolkit, there are various pieces of software, like Luna Translator, that use the Sugoi API to translate.
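
For step 1, a minimal sketch of verifying that the ROCm build of PyTorch actually sees the AMD GPU; the install command and ROCm version in the comment are examples, not the only valid ones:

# Assumes a ROCm wheel of PyTorch was installed into the Python environment the server uses, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/rocm6.0
import torch

print(torch.__version__)            # a ROCm build usually carries a "+rocm" suffix
print(torch.cuda.is_available())    # ROCm devices are exposed through the torch.cuda API
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # should print the AMD GPU's name

If this prints False, the server software will silently fall back to the CPU.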
 
  • Like
Reactions: RottenBoner

RottenBoner

Newbie
Nov 28, 2017
60
44
Please help me. If you are on Discord, contact me.
Try opening the game via Locale Emulator with the Japanese locale, or open Textractor (I don't know if that game is x86 or x64), go to Adjust, tick the "always run on local Japanese" box, and then launch the game through Textractor.
 

skan743

Newbie
Nov 17, 2021
61
77
Copy the c:\.....\Code\userInterface\Menu-Interface\resources\app\node_modules\electron-window-state folder to \Code\userInterface\Sugoi-File-Translator\resources\app\node_modules\
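
In case doing that by hand is awkward, here is a minimal sketch of the same copy step in Python; SUGOI_ROOT is a hypothetical placeholder for the truncated part of the path, which depends on where your Sugoi install lives:

import shutil

# SUGOI_ROOT is a placeholder: replace it with your actual install folder.
SUGOI_ROOT = r"C:\path\to\Sugoi-Toolkit"
src = SUGOI_ROOT + r"\Code\userInterface\Menu-Interface\resources\app\node_modules\electron-window-state"
dst = SUGOI_ROOT + r"\Code\userInterface\Sugoi-File-Translator\resources\app\node_modules\electron-window-state"
shutil.copytree(src, dst, dirs_exist_ok=True)   # copy the module folder into the other app's node_modules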
Do you know if anything can be modified in Sugoi File Translate? I don't have an Nvidia GPU, so it uses the CPU, but I don't feel like it's using much of it. I have the program on an SSD, so I don't think it's due to the storage speed.
 

nekim

Member
Sep 16, 2017
205
387
Do you know if anything can be modified in Sugoi File Translate? I don't have an Nvidia GPU, so it uses the CPU, but I don't feel like it's using much of it. I have the program on an SSD, so I don't think it's due to the storage speed.
If you don't use the offline translation server option, CPU usage seems to be normal. If you do use offline translation, CPU use becomes 100%.

If you want, you can ask on his Discord.
 

Zippix

Well-Known Member
Sep 7, 2017
1,847
1,243
Do you know if anything can be modified in Sugoi File Translate? I don't have an Nvidia GPU, so it uses the CPU, but I don't feel like it's using much of it. I have the program on an SSD, so I don't think it's due to the storage speed.
It was a while back that I tried file translate (with maybe v6? dunno), but the long and short of it probably still stands (though DO correct me if I'm wrong, everyone): basically, without CUDA (Nvidia) it's a crawl. I mean, you are already using two thirds of the CPU's processing capacity as per the task manager image you shared, so you could theoretically get a 50% boost at most, maybe? (I know it's an oversimplification, but still.)
If you don't use the offline translation server option, CPU usage seems to be normal. If you do use offline translation, CPU use becomes 100%.

If you want, you can ask on his Discord.
Judging by the first attached picture, he is using the offline model.
 

nekim

Member
Sep 16, 2017
205
387
It was a while back that I tried file translate (with maybe v6? dunno), but the long and short of it probably still stands (though DO correct me if I'm wrong, everyone): basically, without CUDA (Nvidia) it's a crawl. I mean, you are already using two thirds of the CPU's processing capacity as per the task manager image you shared, so you could theoretically get a 50% boost at most, maybe? (I know it's an oversimplification, but still.) Judging by the first attached picture, he is using the offline model.
When I test with the CPU I get full CPU load on all cores. But when I don't use the offline model I get 60-65 percent load on all cores, just like the attached picture (with a 4-core, 8-thread CPU like in the attachment).
 
  • Thinking Face
Reactions: Zippix

skan743

Newbie
Nov 17, 2021
61
77
When I test with the CPU I get full CPU load on all cores. But when I don't use the offline model I get 60-65 percent load on all cores, just like the attached picture (with a 4-core, 8-thread CPU like in the attachment).
I've read and tried a few things, and even though I've managed to get the program to use 100% of the 8 threads, I measured the time it takes for Sugoi Translate offline and Sugoi File Translate with the offline server, and I don't see any performance improvement from squeezing my CPU. Perhaps it's due to the age of my CPU.

Edit the parameters in "user-settings.json":
"intra_threads": 4,
"inter_threads": 1,
According to the documentation for CTranslate2, which Sugoi uses:

intra_threads is the number of OpenMP threads used per translation: increase this value to decrease latency.
inter_threads is the maximum number of CPU translations executed in parallel: increase this value to increase throughput. Even though the model data is shared, this execution mode will increase memory usage as some internal buffers are duplicated for thread safety.

But I didn't notice any difference. Maybe someone with a CPU with more than 4 real cores would notice.
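
For reference, those two settings correspond directly to the CTranslate2 Python API. A minimal sketch, assuming you have a CTranslate2 model folder and input that is already subword-tokenized the way Sugoi's offline model expects; the model path and the example tokens are placeholders:

import ctranslate2

# Placeholder path to a CTranslate2 model directory (e.g. Sugoi's offline ct2 model).
translator = ctranslate2.Translator(
    "path/to/ct2_model",
    device="cpu",
    intra_threads=4,   # OpenMP threads per translation: lowers latency
    inter_threads=1,   # parallel translations: raises throughput, uses more memory
)

tokens = ["▁こんにちは", "。"]   # placeholder SentencePiece-style tokens
result = translator.translate_batch([tokens])
print(result[0].hypotheses[0])

On a 4-core/8-thread CPU, pushing inter_threads × intra_threads past the number of physical cores usually just adds contention, which would match what you're seeing.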