That is unlikely to happen, ever. The developer already stated that there is not really a way to improve the current model further. It has run its course in terms of technological development, and at best only incremental updates are possible now, which makes retraining the model not worth the GPU time. Hence, there is no reason for the developer to bother improving it anymore.
If you want better translations than Sugoi can produce now, you have two options: add dictionaries that hardcode specific translations, as SLR Translator does, which fixes a lot of Sugoi's quirks but of course still leaves the subjects messed up like all NMTs, or move to the superior technology of LLMs instead. The successor to the NMT technology used in the Sugoi Toolkit is Large Language Models (LLMs), so if you want any practical improvement over Sugoi, look into using AI translations instead.
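To make the dictionary approach concrete, here is a minimal sketch of what "hardcoding specific translations" amounts to. The function name, dictionary entries, and strategy (plain substring replacement on the text before or after the NMT runs) are my own illustration, not SLR Translator's actual implementation or data:

```python
# Hypothetical example of a translation-override dictionary.
# Terms the NMT renders inconsistently get a fixed English form.
HARDCODED = {
    "魔王": "Demon Lord",  # e.g. otherwise rendered "Devil King", "Maou", etc.
    "勇者": "Hero",
}

def apply_dictionary(text: str, table: dict[str, str]) -> str:
    """Replace every dictionary term with its hardcoded translation."""
    for src, dst in table.items():
        text = text.replace(src, dst)
    return text

print(apply_dictionary("魔王と勇者", HARDCODED))  # → Demon LordとHero
```

This pins down terminology so it stays consistent across lines, but it cannot repair the dropped or misidentified subjects, because those come from the NMT translating each sentence in isolation.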
I did a comparison between Sugoi, DeepL, and Mixtral 8x7b. The results were that Sugoi is better than LLMs without context, but with context, LLMs are better, at the cost of significantly increased computation time and reduced automation. For the minimal computation time it needs, the Sugoi NMT model in Sugoi Offline Translator v4, bundled with the Sugoi Toolkit since v6, is realistically the best quality possible for any JPN->ENG NMT model.
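To illustrate what "with context" means in that comparison: instead of handing the model one line in isolation (the way an NMT works), you include the preceding lines in the prompt. The function and prompt wording below are an invented sketch, not any actual tool's prompt:

```python
def build_prompt(line: str, previous: list[str], window: int = 3) -> str:
    """Build an LLM translation prompt that includes recent lines as context."""
    ctx = "\n".join(previous[-window:])  # only the last `window` lines
    return (
        "Translate the final Japanese line to English, "
        "using the earlier lines only as context:\n"
        f"{ctx}\n---\n{line}"
    )
```

The context is what lets an LLM recover the implied subjects that Japanese omits, which is exactly where sentence-at-a-time NMTs fail; it is also why the approach costs more compute (longer prompts, one model call per line) and is harder to fully automate.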