
Breedking

Newbie
Players should have the ability to access and edit character stats in the game menu or on a sidebar; the AI sometimes over- or under-inflates stat values. Also, it looks like editing the AI text output in stories doesn't really work either; it would be cool if that worked (plus, I think a delete option for AI outputs would be nice, since I like going through the different descriptions one input can generate).
Click the pencil on the lower right to manually edit stats, and the pencil on the top right of the text box to edit what the AI writes.

You can also manually tell the AI to adjust stats, but that gets buggy.
 

rathernot3

New Member
The LLM service used gives an error 95% of the time. I was able to make it work with LM Studio (a local LLM), but the model I have is unfortunately bad at it (Lumimaid 0.2, a 70B Llama 3.1 finetune). It also fails to track history (well, I see the requests going to the LLM; there's just no history).
 

fierylion

Member
Game Developer
The LLM service used gives an error 95% of the time. I was able to make it work with LM Studio (a local LLM), but the model I have is unfortunately bad at it (Lumimaid 0.2, a 70B Llama 3.1 finetune). It also fails to track history (well, I see the requests going to the LLM; there's just no history).
You need to increase the memory limit in the game settings; the default of 2000 is tiny.
 

fierylion

Member
Game Developer
How much? I can have a context of about 20,000 tokens, but that will be pushing the system.
Oh, if you have enough memory then the error is probably something different. I can't tell what's wrong unless you give the specific error message.

Also, which model would you recommend? I can run 70B Llamas, but only heavily quantized (Q3_K_M, ~3 bits per parameter).
I like this one
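As a rough back-of-the-envelope check (my own math, not anything from the game or LM Studio; Q3_K_M actually averages a bit above 3 bits per weight), here's what those bits-per-parameter figures translate to in raw model size:

```python
# Back-of-the-envelope size estimate for a quantized 70B model.
# The bits-per-weight values are approximations, not exact specs.
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9  # bytes -> GB

print(f"70B @ Q3_K_M (~3.5 bpw): {model_size_gb(70, 3.5):.0f} GB of weights")
print(f"70B @ Q4_K_M (~4.8 bpw): {model_size_gb(70, 4.8):.0f} GB of weights")
# The KV cache comes on top of this and grows with the context length,
# so a 20K-token context adds several more GB.
```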
 

rathernot3

New Member
I mean, the error was happening with the default LLM provider.
With the local one, I just wasn't seeing the LLM use the history at all, even though I can see it present in the requests (`lms log stream` does show the requests made by the game, though not the responses).
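
If you want to rule the model out, here's a quick sketch of sending a multi-turn history straight to LM Studio's OpenAI-compatible local server (assuming the default endpoint on localhost:1234; the model name and messages below are just placeholders):

```python
# Rough check: send a two-turn history directly to LM Studio's local server
# (OpenAI-compatible API, default port 1234) and see if the model uses it.
import requests

payload = {
    "model": "lumimaid-0.2-70b",  # placeholder; use whatever model is loaded
    "messages": [
        {"role": "user", "content": "My character's strength is 14."},
        {"role": "assistant", "content": "Noted: strength is 14."},
        {"role": "user", "content": "What is my character's strength?"},
    ],
    "max_tokens": 64,
}

resp = requests.post("http://localhost:1234/v1/chat/completions", json=payload)
print(resp.json()["choices"][0]["message"]["content"])
```

If the reply reflects the earlier turns, the model handles history fine and the problem is more likely in how the game assembles its requests (or the memory limit cutting them off).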
 

Thronico

Member
I'm also running into a rollback issue, along with the text edit no longer visually changing the text on the current page. Paging away and back, reloading the game, fully resetting, etc. do not fix it, and it persists with no error from LM Studio.

Edit: It looks like my responses are loading at the bottom of the previous page (the yellow text), rather than at the top of the new page. Could be a clue about the issue. This was a save with over 130 pages of generations.

Edit 2: I also disabled the "Single Passage Event" option in favor of controlling generations with my own established "Max of # Paragraphs" rule in the Game Text Prompt box. I've been running with about 32,000 output tokens, and I move the memory around to test how it affects generation, but mostly keep it between 10K and 48K.

Edit 3: Creating a new game fixes the issue, but old saves remain broken. However, I believe the bug is only visual: editing the text box still loads the new scene with my edits applied, and I believe the same goes for rollbacks.
 