4.30 star(s) 12 Votes

fierylion

Member
Game Developer
Jan 18, 2019
269
471
It sounds like the in-game memory limit was set too high; make sure the memory limit is set to what your PC can handle.
 

tacosnap123

Active Member
Mar 3, 2018
802
221
It sounds like the in-game memory limit was set too high; make sure the memory limit is set to what your PC can handle.
How do I do that exactly? My PC can handle a lot, but if I need to trim some tokens to get a faster response time, I will, because I really have no clue what I'm doing in LM Studio other than "more tokens = more story". So what exactly do "GPU offload (out of 30)" and "CPU thread pool size" mean?
 

fierylion

Member
Game Developer
Jan 18, 2019
269
471
How do I do that exactly? My PC can handle a lot, but if I need to trim some tokens to get a faster response time, I will, because I really have no clue what I'm doing in LM Studio other than "more tokens = more story". So what exactly do "GPU offload (out of 30)" and "CPU thread pool size" mean?
I'm not experienced with LM Studio; I've only used it a few times. Hopefully you can get help with the LM Studio parameters on their Discord or forum.
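For what it's worth, in llama.cpp-based tools like LM Studio, "GPU offload (out of 30)" is usually the number of the model's 30 transformer layers placed in GPU VRAM, and "CPU thread pool size" is how many CPU threads process whatever isn't offloaded. A rough sketch of how the split might be estimated, where the per-layer size is a made-up illustration and not a real model's figure:

```python
# Rough sketch: estimate how many transformer layers fit in VRAM.
# All numbers here are hypothetical illustrations, not LM Studio internals.

def layers_to_offload(vram_gb: float, total_layers: int = 30,
                      gb_per_layer: float = 0.25) -> int:
    """Return how many of the model's layers could be placed on the GPU."""
    fits = int(vram_gb / gb_per_layer)
    return min(fits, total_layers)

# A 4 GB card at ~0.25 GB per layer holds 16 of 30 layers;
# the remaining 14 layers run on the CPU thread pool.
print(layers_to_offload(4.0))   # 16
print(layers_to_offload(12.0))  # 30 (whole model fits on the GPU)
```

The practical upshot: more offloaded layers means faster generation, but only up to what your VRAM actually holds.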
 

ptkato

New Member
Jul 8, 2022
11
4
A "continue" button would go a long way; sometimes it just stops where it'd be better to keep going, even with multiple paragraphs enabled.
 

fierylion

Member
Game Developer
Jan 18, 2019
269
471
A "continue" button would go a long way; sometimes it just stops where it'd be better to keep going, even with multiple paragraphs enabled.
You can type 'continue'.

It's not really possible to have the AI continue writing from the previous game text; unfortunately, most AI endpoints don't support this.
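A hedged sketch of why typing 'continue' works while true continuation doesn't: chat-style endpoints accept whole messages, so a game can only append a new user turn asking to go on; it can't resume the assistant's previous text mid-sentence. The payload shape below follows the common OpenAI-style chat format; the helper function and history are illustrative, not the game's code:

```python
# Illustration: chat endpoints take discrete turns, so "continuing"
# means sending a new user message, not resuming the old completion.

def build_continue_request(history: list) -> dict:
    """Append a 'continue' user turn to the running chat history."""
    return {
        "messages": history + [{"role": "user", "content": "continue"}],
    }

history = [
    {"role": "user", "content": "Describe the tavern."},
    {"role": "assistant", "content": "The tavern is dim and loud."},
]
request = build_continue_request(history)
print(request["messages"][-1])  # {'role': 'user', 'content': 'continue'}
```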
 
  • Wow
Reactions: Serpream

draxton

Newbie
Dec 10, 2019
36
55
One thing fierylion should consider is adding a few small validations on game load, to make sure the save being loaded is from the active world.
Adding versioning to the world definition and validating the save's 'world version' would also be worthwhile.
A warning will suffice when a mismatched world or version is detected (at least for now).
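A minimal sketch of the kind of check being suggested, assuming hypothetical save/world dicts with 'world_id' and 'world_version' fields (these names are illustrative, not the game's actual save format):

```python
# Hypothetical save-load validation: warn on world or version mismatch.

def check_save(save: dict, world: dict) -> list:
    """Return warning strings; empty list means the save matches."""
    warnings = []
    if save.get("world_id") != world.get("world_id"):
        warnings.append("Save is from a different world.")
    elif save.get("world_version") != world.get("world_version"):
        warnings.append("Save was made with a different world version.")
    return warnings

world = {"world_id": "tavern", "world_version": 2}
old_save = {"world_id": "tavern", "world_version": 1}
print(check_save(old_save, world))  # ['Save was made with a different world version.']
```

Because it only returns warnings rather than refusing the load, this matches the "just a warning will suffice" suggestion.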
 

fierylion

Member
Game Developer
Jan 18, 2019
269
471
One thing fierylion should consider is to add a few small validations on game load, to make sure the save being loaded is from the active world.
Adding versioning to the world definition and validating the save's 'world version' should also be considered
Just a warning will suffice when detecting a mismatched world or version (at least for now)
That's strange, you don't see this warning?
[attached screenshot: 1746651694119.png]
 

PALADINO3

Newbie
Jun 25, 2018
15
3
And the server went down again... Does anyone know of or have a solution for an online server? My PC is about as capable as a microwave for running a local server.
 

tacosnap123

Active Member
Mar 3, 2018
802
221
It sounds like the in-game memory limit was set too high; make sure the memory limit is set to what your PC can handle.
That's output tokens, correct? Because yeah, that other story takes a good chunk of time to come up with its responses, and I have no idea if I should mess with the world settings and reduce how many prompts it tries to give, because it gives like more than 5 auto responses.
 

ptkato

New Member
Jul 8, 2022
11
4
You can type 'continue'.

It's not really possible to have the AI continue writing from the previous game text; unfortunately, most AI endpoints don't support this.
What determines how long of a response the AI will write? Is that memory/token configs alone or is there something in the world/game settings too?
 

fierylion

Member
Game Developer
Jan 18, 2019
269
471
What determines how long of a response the AI will write? Is that memory/token configs alone or is there something in the world/game settings too?
Max output tokens in the settings.
However, by default the AI will only produce one paragraph of game text. You can allow longer responses by unchecking One Paragraph Responses in the settings.
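One plausible way a "One Paragraph Responses" option could work (purely a guess at the mechanism, not the game's actual code) is to trim the model's output to its first paragraph after generation, independent of the token limit:

```python
# Hypothetical post-processing for a "One Paragraph Responses" option.

def trim_to_one_paragraph(text: str, one_paragraph: bool = True) -> str:
    """Keep only the first paragraph when the option is enabled."""
    if not one_paragraph:
        return text
    return text.strip().split("\n\n", 1)[0]

raw = "The door creaks open.\n\nBeyond it, stairs descend into darkness."
print(trim_to_one_paragraph(raw))                          # The door creaks open.
print(trim_to_one_paragraph(raw, one_paragraph=False) == raw)  # True
```

Under this reading, max output tokens caps how much the model may generate, while the paragraph option trims what is shown afterwards.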
 

ptkato

New Member
Jul 8, 2022
11
4
Max output tokens in the settings.
However, by default the AI will only produce one paragraph of game text. You can allow longer responses by unchecking One Paragraph Responses in the settings.
I'm a little confused by that setting. What's the difference between "max memory" and "max output tokens"? It seems that max output tokens has to be smaller than max memory, so why have two separate settings?
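If "max memory" behaves like an LLM context window (an assumption about this game, but the standard arrangement for LLM backends), then prompt tokens plus output tokens must fit inside it together, which would explain why the two limits are separate: one bounds the whole conversation, the other reserves room for the reply.

```python
# Standard LLM token budgeting, assuming "max memory" is the context window:
#   context_window >= prompt_tokens + max_output_tokens

def max_prompt_budget(context_window: int, max_output_tokens: int) -> int:
    """Tokens left for story history after reserving room for the reply."""
    return context_window - max_output_tokens

print(max_prompt_budget(4096, 512))  # 3584 tokens of history fit
```

That would be why max output tokens has to stay smaller than max memory: raising it shrinks how much story history can be sent with each request.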
 
  • Like
Reactions: Serpream