
myxlmynx

New Member
Oct 26, 2017
9
9
Is this game set up for Vulkan/AMD on Linux? I tried running Kobold locally instead of web tunneling to get faster responses, but the log is all messed up and it just loads forever. Maybe I did the setup wrong: I launched Kobold with my model, confirmed everything was working in the web UI, launched Multiic, went to settings, picked local, selected my model from the menu on the left, set a context value, then started the game. When I get to the bedroom description the log looks like this: it tries to run CPU-only with 4k context instead of using my GPU.
As I understand it, "Koboldcpp local" will try to use the built-in KoboldCpp server, which is a CUDA binary (no Vulkan acceleration). To use your own local server, launch it as usual first, then choose "Koboldcpp remote" with your local URL, typically "http://localhost:5001" (I'm also on Vulkan + AMD + Linux, and that's exactly how I set up Multiic here). You'll see the game's requests in your KoboldCpp server logs.
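For reference, launching your own Vulkan-accelerated server looks something like this (the model path and `--gpulayers` count are placeholders for your own setup; adjust them to your model and VRAM):

```shell
# Start KoboldCpp yourself with Vulkan acceleration (works on AMD),
# then point Multiic's "Koboldcpp remote" setting at the URL it serves.
# Model path and --gpulayers value below are placeholders.
python koboldcpp.py \
  --model ~/models/your-model.gguf \
  --usevulkan \
  --gpulayers 33 \
  --contextsize 4096 \
  --port 5001
# Server listens on http://localhost:5001, which is what you
# enter in Multiic under "Koboldcpp remote".
```

The key point is that the game never starts the server itself in this mode; it just sends API requests to whatever URL you give it, so any backend you launch manually will work.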
 

kurwa99

Newbie
Jan 20, 2018
77
14
So for image generation I need to get Grok or OpenAI API keys, and looking into it, they cost money per prompt. Is there a way to get image generation for free?
 

IsaacOtto

Newbie
Mar 21, 2024
26
11
So for image generation I need to get Grok or OpenAI API keys, and looking into it, they cost money per prompt. Is there a way to get image generation for free?
Great question. Also, is there a way to generate porn images based on the context? The only results I've had so far with the OpenAI image generation are horrendous; it doesn't even look like a person lmao
 

Jinkxy

Newbie
Dec 18, 2018
16
5
A few questions for the sandbox mode.
Is there a good way to initiate scenes with multiple people?
I had some success getting one additional person into it, with the prompt at the top saying "X wants to join", but I could never make it work with more than one.
Also, even with one I sometimes get funny results. For example, I ask for Fiona and the prompt comes up as "Jo wants to join"; if you click it, the model of Jo joins but the chat refers to her as Fiona. In those cases most AI models also seem to get confused and try to fit both Fiona AND Jo into the conversation.

Is there any prompt I can use to make someone join that doesn't fail 70-80% of the time?
 

Jinkxy

Newbie
Dec 18, 2018
16
5
Yes. I've been trying to find the best one.
Let me know if you find something good, as I'm doing the same.

My current winner is L3-8B-Stheno-v3.2-Q8_0-imat.gguf so far.


Edit1:
Tested "cognitivecomputations_Dolphin3.0-R1-Mistral-24B-Q6_K.gguf" and "dolphin-2.5-mixtral-8x7b.Q4_K_M.gguf" today; those also gave pretty nice results.
 

PGU

Newbie
Oct 9, 2021
93
113
Nice girls... :giggle:


*le hard pass*

THX for sharing...
This is just like VR-only games: it's pandering to such a small audience. Just how many people does the dev think have internet access nowadays? It's a shame, I really wanted to try it. Maybe one day I'll get the dial-up hooked up and can play all these fancy new games.
 

Cryptist

Member
Aug 20, 2020
432
633
10 Gs? Freakin' forget it. Compress to under 3 Gs, or die. I literally have two dozen games in my pending folder. I don't have time to waste downloading Moby Dick or War and Peace.

10 Gs indicates crappy game design to me. Some of the games I have followed through years of updates are still only a couple of Gs in size.
 

chrono1337

Member
Aug 30, 2017
112
376
10 Gs? Freakin' forget it. Compress to under 3 Gs, or die. I literally have two dozen games in my pending folder. I don't have time to waste downloading Moby Dick or War and Peace.

10 Gs indicates crappy game design to me. Some of the games I have followed through years of updates are still only a couple of Gs in size.
Then use the internet AI instead of the local model; it's 2 GB. The dev seems to have upgraded the LLM, which explains the increase in size.
 