AI PSA: Edit Cards to Get More Chat Context (How-To, With Examples)

Fuggiless

May 9, 2022
If you're an AI TextGen user or just getting into it, I want to stress the importance of character card editing.
Once you find a card you enjoy playing with, five simple minutes of editing can make the experience a lot better.
(And if you make cards and they come out to 1,200 tokens, this is for you too.)

The reason you want to do this: you're likely running with 2048 tokens of context (roughly 20 tokens = 15 words). That context has to be shared between the chat you're having and the character card (permanent tokens). So if you're running 2048 and a card eats 1200 tokens, the AI won't remember anything that happened more than a handful of posts ago. Chances are (and I mean 95%) you can drastically cut the card's token count just by fixing the wording. I won't even call it syntax, because you can do it knowing nothing about coding.
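To make the budget math concrete, here's a tiny sketch using the rough conversion from above (20 tokens ≈ 15 words). These are illustrative estimates, not exact tokenizer counts - your frontend's token counter is the real authority.

```python
# Rough context-budget math. Assumption: ~20 tokens per 15 words,
# as stated above. Real counts vary by tokenizer.

CONTEXT_WINDOW = 2048  # a typical context size

def chat_memory(card_tokens, context=CONTEXT_WINDOW):
    """Tokens left for actual chat history after the card's permanent tokens."""
    return context - card_tokens

def approx_words(tokens):
    """Convert a token count to an approximate word count (15 words / 20 tokens)."""
    return int(tokens * 15 / 20)

bloated = chat_memory(1200)  # a 1200-token card
trimmed = chat_memory(450)   # the same card after editing

print(bloated, approx_words(bloated))  # 848 tokens -> ~636 words of memory
print(trimmed, approx_words(trimmed))  # 1598 tokens -> ~1198 words of memory
```

Same context window, same card concept, nearly double the remembered chat.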

The fact is, most cards are poorly written, wording-wise. That's not a dig at the creators at all - they made it, I played it, it made my meat the big meat, it made me say I WANT MORE OF THIS IN MY FACE YUM YUM YUM. But can I make it better? The answer, 19 times out of 20, is "yes, easily," and the other 1 in 20 is "yes, easily, but with a little attention to detail."

I played a VERY FUN card called Miss Pepper by gigasad: "Your old teacher who remembers you fondly, and is a cat."
But 898 tokens? That's playable, but well outside my comfort range for 2048 context (600 tops, 500 ehhh, 400 okay).
(Note: more than half of those tokens were sample conversations - which is far too much sample conversation - but the description has good examples too, so we'll use it.)

Looking at the card in SillyTavern, its description cried out for some fat-trimming.

1. HALF AS LONG
There's a great scene in "A River Runs Through It" where the father teaches his sons composition by having them bring him an essay; he reads it and hands it back saying, "Half as long." Maintain that mentality.
Example 1:
Current: She's also started to become ....
Better: She's becoming ...
Example 2:
Current: She's gotten fairly chubby as well, focused mostly on her belly and thighs.
Better: Chubby belly and thighs.
Example 3:
Current: ...fluffy tail on her lower back.
Better: ...fluffy tail. (If you want it somewhere other than the default spot, specify that.)

2. JUST THE FACTS, MA'AM
The card does not tell a story. The chat does.
Trim the fat. Delete: as well, also, in addition, etc.
YOU ARE WRITING A GROCERY LIST NOT A RECIPE BLOG

3. IS THIS LOSS?
This is the easiest way (meaning all you do is use the Del key) to reduce card size.
Identify things you do not give a shit about and nix them.
With the example card, Miss Pepper, the fact that she's a cat is totally irrelevant to the story. (I understand it might be totally relevant to your coom instincts.) But if the scenario you're after is running into an old teacher and having a fling, there goes 1/5 of the card's tokens as Miss Pepper becomes fully human.
Or... does it matter if you specify, on the card, that all her pants have been altered to let her tail stick out?
NO. That will never, ever be relevant in the chat. That's 15 free tokens.
And if the AI brings up her excited tail-flicking, it WILL happen either way.
LLMs are not logic machines. The AI will not say "she would have flicked her tail excitedly, but it's stuck in her pants, so no tail for you." You will still get the tail flick.

4. REORGANIZE (most time-consuming)
Reduce card words = reduce card tokens = more "recent memory" for the chat.
Example:
Current (24 tokens): She's also started to become much more conscious of her age and appearance lately, her weight in particular.
Better (10 tokens): concerned recently with her age, appearance, weight.
My preference (11 tokens): Recent concerns (age + appearance + weight)
NOTE: You can turn a 5-15 minute card improvement into a 1-2 hour project and reformat the whole thing if you want, which would probably get this card from 900 down to 350. But even 15 minutes gets it from 900 to 450, and that's a significant increase in contextual memory (from roughly 1100 tokens to 1550 - and those extra 3-6 remembered interactions make a world of difference.)
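If you want a quick sanity check on a rewrite before pasting it back into the card editor, a crude word-based estimate works. This sketch reuses the 20-tokens-per-15-words rule from earlier as a stand-in; real counts come from the model's tokenizer, which SillyTavern shows in the card editor.

```python
# Crude before/after token estimate for a card line. Assumption:
# ~20 tokens per 15 whitespace-separated words (rough rule, not a tokenizer).

def estimate_tokens(text):
    """Approximate token count from word count (20 tokens / 15 words)."""
    return round(len(text.split()) * 20 / 15)

before = ("She's also started to become much more conscious of her age "
          "and appearance lately, her weight in particular.")
after = "Recent concerns (age + appearance + weight)"

print(estimate_tokens(before))  # 24 - close to the editor's count above
print(estimate_tokens(after))   # 9
```

The estimate lands near the real counts from the example (24 vs. 11), which is plenty accurate for deciding whether a rewrite is worth keeping.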

----
Hope that helps somebody. Not trying to come off like a guru, and I am very much NOT claiming to be an expert. There are just a lot of pseudo-frontend sites that don't even offer card editing, and I think that trips a lot of people up. Token bloat can turn a card that should be a 9/10 into a 5/10, because it gives the AI the memory of a goldfish.