It is already customised, though, and it has a lower barrier to entry versus SillyTavern. I do think people should learn not to be intimidated by SillyTavern, as it's the best front end we have by miles.

"This just looks like an inferior version of SillyTavern."
Detailed instructions on how to set it up are here:
You must be registered to see the links
TL;DR:
Open config.txt and put your OpenAI API key in the first value:
OPENAI_API_KEY=sk-......your_key_here
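For anyone wondering what that line is for: the app presumably reads config.txt as plain KEY=VALUE pairs and passes the key to the OpenAI API. I don't know the tool's actual internals, so treat this as a minimal sketch of the idea only; the parser, the model name and the test prompt below are all illustrative assumptions, not the tool's code.

# Minimal sketch (assumes config.txt is plain KEY=VALUE lines and the
# official "openai" Python package, version 1.x, is installed).
from openai import OpenAI

def load_config(path="config.txt"):
    # Parse KEY=VALUE lines, skipping blanks, comments and malformed lines.
    values = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

config = load_config()
client = OpenAI(api_key=config["OPENAI_API_KEY"])

# Quick sanity check that the key works (model name is just an example).
reply = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hi in one short sentence."}],
)
print(reply.choices[0].message.content)

If that prints a response, the key in config.txt is valid; if it raises an authentication error, the key was pasted wrong.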
C'mon man, don't do the OG dirty like that.
It looks like the OFFLINE AI folder from the older versions works; I just tried the folder from V0.3b.

"There is no OFFLINE AI folder. Can someone help me?"
Might not be the right place to ask, but after hearing about SillyTavern here I just tried it out.
Anyone have a download link for a good working model?
Edit: found these models that work fine:
You must be registered to see the links
Being able to run models depends on your graphics card; the more VRAM the better. I'm running a 2080 Ti, so I have 11 GB of GDDR, which is just barely enough for mid to low end models: a 7B model was pushing about 9 GB of my VRAM even though its download size is 7 GB, and the 2 GB model was taking about 4 GB of VRAM. Bigger language models, 13B and up, need a minimum of 12 GB of VRAM, which is why so many people want a 4090, since that's 24 GB. (Rough math on this below.)

"Am I dumb, or are the offline modes just slow as all hell? Like, am I doing something wrong, or is my computer just doodoofard and AI is just beyond it?"
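If it helps anyone size up their own card, here is a back-of-the-envelope sketch of the VRAM math behind the numbers above. The "weights × bits per weight, plus a flat allowance for KV cache and CUDA context" rule and the 1.5 GB overhead figure are my own ballpark assumptions, not exact requirements; quantisation (8-bit or 4-bit) is what lets a 7B model fit in roughly 9 GB.

# Rough VRAM estimate: weight memory plus a flat allowance for the KV cache,
# activations and CUDA context. The 1.5 GB overhead is a guess, not a spec.
def estimate_vram_gb(params_billions, bits_per_weight, overhead_gb=1.5):
    weights_gb = params_billions * bits_per_weight / 8.0  # 1B params at 8-bit ~ 1 GB
    return weights_gb + overhead_gb

if __name__ == "__main__":
    for size_b, bits in [(7, 16), (7, 8), (7, 4), (13, 8), (13, 4)]:
        print(f"{size_b}B model at {bits}-bit: ~{estimate_vram_gb(size_b, bits):.1f} GB VRAM")

By this estimate an 11 GB 2080 Ti fits a 7B model at 8-bit with a little headroom, while a 13B model only fits if it is quantised down to around 4-bit, which lines up with the numbers in the post above.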