Jan 8, 2022
I wish JanitorAI would work with DeepSeek.

DeepSeek does work on Janitor if you set it up as a proxy through OpenRouter.
How, please?
1. Go to openrouter.ai and make an account.
2. After making the account, click on Settings, then scroll down to Default Model and select DeepSeek: R1 (Free).
3. Go to Keys and create an API key. Be careful not to lose it: save it right away, as you can never see it again.
4. Go to Janitor and find proxy-compatible bots.
5. Once in the chat, click on the API settings, select Proxy, then select Custom.
6. For the model, type in (all in lowercase): deepseek/deepseek-r1:free
7. For the URL, type in exactly this link: [You must be registered to see the links]
8. Click on the API key field and paste the key you saved earlier.
9. Click Save Settings, and when a pop-up asks you to reset the temperature back to normal, click Yes.
10. Click on Generation Settings, set the temperature between 0.8 and 0.9, and set the token size to 1000 to avoid cut-off messages.
That should be all it takes to make the free version work. I have never used a paid API on OpenRouter, so I don't know how to set that one up.
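The proxy settings above boil down to a single chat-completions request against OpenRouter's OpenAI-compatible endpoint (the endpoint URL below comes from OpenRouter's public docs, not from the gated link in step 7, and exactly how Janitor's proxy mode forwards the call is an assumption). A sketch like this is handy for checking that the key and model name work before touching the bot settings:

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat-completions endpoint (per its public docs).
URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    """Mirror the Janitor settings from the steps above."""
    return {
        "model": "deepseek/deepseek-r1:free",  # step 6, all lowercase
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.85,  # step 10: between 0.8 and 0.9
        "max_tokens": 1000,   # step 10: avoid cut-off messages
    }

def send(prompt: str, api_key: str) -> dict:
    """POST the payload with the saved key (step 8) and return the JSON reply."""
    req = urllib.request.Request(
        URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    key = os.environ.get("OPENROUTER_API_KEY")
    if key:
        reply = send("Say hello.", key)
        print(reply["choices"][0]["message"]["content"])
    else:
        # No key set: just show the payload that would be sent.
        print(json.dumps(build_payload("Say hello."), indent=2))
```

If this call works on its own but the Janitor chat still fails, the problem is in the bot's proxy settings rather than the key.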
I tried this. Getting an error message, unfortunately:

"A network error occurred, you may be rate limited or having connection issues: Failed to fetch (unk)"

Followed each step. It detects the 128k limit, but it's not working, unfortunately.

Not sure what could be causing this. Maybe change the context size? I have mine at 16.3k and it works fairly well for me.
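A "Failed to fetch" error from Janitor doesn't say whether the key, the model, or the connection is at fault. One way to narrow it down is to hit OpenRouter directly and look at the HTTP status code (the endpoint URL is from OpenRouter's public docs and is an assumption about what Janitor calls under the hood):

```shell
#!/bin/sh
# Same payload the Janitor proxy settings describe: model from step 6.
PAYLOAD='{"model":"deepseek/deepseek-r1:free","messages":[{"role":"user","content":"ping"}]}'

if [ -n "$OPENROUTER_API_KEY" ]; then
  # Prints only the status code: 200 = key and model are fine (problem is on
  # the Janitor side), 401 = bad key, 429 = rate limited.
  curl -s -o /dev/null -w "%{http_code}\n" \
    https://openrouter.ai/api/v1/chat/completions \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
else
  echo "OPENROUTER_API_KEY is not set"
fi
```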
I don't use Janitor, but I did try to confirm it before. I followed a
Update