Found a way to "hack" it a little. It can help in removing all restrictions so far. It still struggles with causality, though.
Manual:
After removing the RAISE part, you also get some sort of unlock. Sometimes this also works with other models. (Not fully tested.)
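(For anyone wondering, the "RAISE part" is the block in the chat template that calls raise_exception when the conversation roles don't strictly alternate. In the template quoted further down it looks like the three lines below; deleting or commenting them out is what's meant, though the exact wording may differ in your template.)
Code:
{%- if is_user_role != is_even_index -%}
{{- raise_exception('Conversation roles must strictly alternate between user and assistant') -}}
{%- endif -%}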
I didn't quite catch it, but I am interested: what is the purpose of this fix?
So I ran this code by Daddy DeepSeek and it suggested this as an improvement:
Code:
{#- pull the system prompt out of the first message, if there is one -#}
{%- set system_message = messages[0]['content'] if messages[0]['role'] == 'system' else none -%}
{%- set loop_messages = messages[1:] if system_message is not none else messages -%}
{{ bos_token -}}
{% for message in loop_messages %}
{#- user messages must land on even positions and assistant on odd, or the template errors out -#}
{%- set is_even_index = loop.index0 % 2 == 0 -%}
{%- set is_user_role = message['role'] == 'user' -%}
{%- if is_user_role != is_even_index -%}
{{- raise_exception('Conversation roles must strictly alternate between user and assistant') -}}
{%- endif -%}
{%- if message['role'] == 'user' -%}
{#- the system prompt is folded into the very first user turn -#}
{%- if loop.first and system_message is not none -%}
{{- '[INST] ' ~ system_message ~ '\n\n' ~ message['content'] ~ '[/INST]' -}}
{%- else -%}
{{- '[INST] ' ~ message['content'] ~ '[/INST]' -}}
{%- endif -%}
{%- elif message['role'] == 'assistant' -%}
{#- assistant turns are closed with the end-of-sequence token -#}
{{- ' ' ~ message['content'] ~ eos_token -}}
{%- endif -%}
{%- endfor %}
However, since I have had mixed results with DeepSeek and I do not understand the first thing about this code, I'd be interested to hear your take. Is it improved, or is it trash? All I can say is that it works. For context: this is for using a locally hosted large language model to run this game through LM Studio, and this code is necessary to get it to work.
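For reference, here is roughly what that template renders for a short made-up exchange (system prompt, one user turn, one assistant turn, then a new user turn), assuming the usual Mistral-style special tokens where bos_token is <s> and eos_token is </s>: the system prompt gets folded into the first [INST] block, and each assistant reply is closed with the end-of-sequence token.
Code:
<s>[INST] You are the game master.

Start the adventure.[/INST] You wake up in a dark forest.</s>[INST] Look around.[/INST]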
It also suggested these additions I haven't been brave enough to try yet:
For handling metadata:
Code:
{%- if message.get('metadata') -%}
{{- ' [METADATA]' ~ message['metadata'] | tojson -}}
{%- endif -%}
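For what it's worth, if that snippet were dropped into the loop, any message that happened to carry a metadata field (not something this game sends as far as I can tell, so this is purely illustrative) would get it appended as JSON. A metadata value of {"mood": "angry"} would add this to the prompt:
Code:
 [METADATA]{"mood": "angry"}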
Content validation:
Code:
{%- if message['content'] | length > 2048 -%}
{{- raise_exception('Message content exceeds 2048 character limit') -}}
{%- endif -%}
Multi-model support:
Code:
{%- if model_type == 'llama3' -%}
{{- '<|start_header_id|>' ~ message['role'] ~ '<|end_header_id|>' -}}
{%- elif model_type == 'phi3' -%}
{{- '<|' ~ message['role'] ~ '|>' ~ '\n' ~ message['content'] -}}
{%- endif -%}
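A heads-up before anyone tries that last one: as far as I can tell, model_type is not a variable that LM Studio or the usual chat-template machinery passes in on its own, so neither branch would ever fire unless you define it yourself at the top of the template, something like the line below (the value is just an example). Also note that, as written, the llama3 branch only emits the role header tokens and never the message content.
Code:
{%- set model_type = 'llama3' -%}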
AI is so freaking wild, man. You can train it to train itself. When I couldn't get this game working, I literally asked the local LLM (large language model) I was using to fix itself, and it did.
But also, a heads up: unless you have a supercomputer, a locally hosted LLM is never gonna compete with daddy DeepSeek or GPT-4, so go to mommy and daddy for help if the kids act up. But you won't get them to work with this game, so you need that local LLM.
Also, if you're looking for an LLM, try browsing Hugging Face and searching for "Instruct" and "NSFW" to see what you can find. Note that in my experience, not all models tagged NSFW will actually work with this game, but "Instruct" is essential, since this game requires an AI that can follow instructions.