Alright, so nothing about this is "sure" yet, but essentially: big change of plans.
The administrator of anime-sharing has convinced me to broaden my plans significantly.
Instead of just supporting DeepSeekV3, I will add universal support for all common LLM APIs, most importantly the OpenRouter service. That means you will be able to use SLR Translator with whatever online LLM you want, and you will also be able to use the free daily requests.
I also have plans further down the line to add support for LM Studio, which will allow you to use whatever locally hosted LLM you want.
I will add an options menu where you can specify the URL, API key, model, temperature, system prompt, user prompt, and request size. These requests will all go through the SLR translation wrapper pipeline, which means all tagging and error checking will work exactly the same as it does for Sugoi. The one exception is my dictionary system, which will not be used, since that is tailored towards SugoiV4.
But I only plan to actually test things with the DeepSeek-chat-v3-0324 and DeepSeek-chat-v3-0324:free models; if you choose to use a different model, you're on your own.
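To give a rough idea of what I mean, here is a minimal sketch of how those settings could map onto a request to an OpenAI-compatible endpoint like OpenRouter. To be clear, this is not actual SLR Translator code: the option names, the prompts, the request_size handling, and the model slug are just placeholder assumptions for illustration.

```python
# Minimal sketch, not real SLR Translator code. Assumes an OpenAI-compatible
# chat completions endpoint (OpenRouter shown here); option names mirror the
# planned settings menu and are hypothetical.
import requests

settings = {
    "url": "https://openrouter.ai/api/v1/chat/completions",  # OpenRouter's OpenAI-style endpoint
    "api_key": "YOUR_API_KEY",
    "model": "deepseek/deepseek-chat-v3-0324:free",  # or the paid variant without ":free"
    "temperature": 0.1,  # kept low for more consistent output
    "system_prompt": "You are a translator. Translate Japanese to English and keep all tags intact.",
    "request_size": 20,  # hypothetical: how many tagged lines the wrapper batches per request
}

def translate_batch(lines):
    """Send one batch of tagged lines and return the model's reply text.

    The wrapper pipeline would slice the full script into chunks of
    settings["request_size"] lines and call this once per chunk.
    """
    user_prompt = "\n".join(lines)
    response = requests.post(
        settings["url"],
        headers={"Authorization": f"Bearer {settings['api_key']}"},
        json={
            "model": settings["model"],
            "temperature": settings["temperature"],
            "messages": [
                {"role": "system", "content": settings["system_prompt"]},
                {"role": "user", "content": user_prompt},
            ],
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```

The nice part about sticking to the OpenAI-style chat format is that, at least in theory, the same request shape should also work against LM Studio's local server later on, just by pointing the URL at your own machine instead of OpenRouter.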
Edit: I should mention that this will not happen overnight; adding stable support will probably take months. I don't just need to write the new code, I also need to run a lot of tests. These "AI" models aren't particularly consistent in their responses, even if you turn the temperature way down.