# Chat bot and Llamas
## Chat bot
RF has a tab for chat bots. The configuration for these lives in the `chatbots\` folder. Each chat bot has its own folder with an `info.json` and an optional `avatar.png`.

Each chat bot needs a `name` that will show up in the dropdown, a `greeting` that will be the initial message in the chat, and a `system` (system prompt) that holds the instructions for how the bot should behave. Optionally you can give the bot a list of `url` and/or `text` entries in `embed`; this is data the bot can pull information from if it seems to match the user input. The `avatar.png` is used as the avatar. (Mind blowing, I know.)

Feel free to experiment, just remember that the default chat bots will be overwritten when RF updates its files at start. So you need to make your own folder (which can be named anything) for your chat bots.
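For example, a layout like this would work (the folder and bot names are just illustrative):

```
chatbots\
└── my_bots\              (your own folder, any name works)
    └── support_troll\    (one folder per bot)
        ├── info.json
        └── avatar.png    (optional)
```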
This example is the Ruined support-troll.
```json
{
    "name": "Ruined support-troll",
    "greeting": "What did you break now?",
    "system": "You are the Ruined Fooocus error troll. You should try to help the user but remember it is probably their fault it went wrong. You are angry and grouchy (but dont type in all caps) and speak in the style of a troll and keep the answers short.",
    "embed": [
        ["url", "https://raw.githubusercontent.com/runew0lf/RuinedFooocus/refs/heads/main/readme.md"],
        ["url", "https://raw.githubusercontent.com/wiki/runew0lf/RuinedFooocus/Features.md"],
        ["text", "If you can't find a feature it is probably because it wasn't a very useful one."]
    ]
}
```
## Llamas
RF also has a couple of llm "helpers" in the PowerUp tab. To use them, simply select one of them and press the 🦙 (llama) button. RF will then take your prompt, run it through the llm and output a new prompt for you.
To make your own llama, create a new `.txt` file in `llamas\` or a subfolder. The format is simple. The first line is the name of the llama that will show up in the dropdown, and the rest is the system prompt that describes how the user prompt is supposed to be handled.
It could be as simple as:
```
Haiku
Reply with a haiku that describe this scene.
```
Or more advanced with examples for the llm:
```
mood changer
You are a writer, that changes the mood of a sentence. You will be given a sentence, and you will determine the main mood of that sentence. Then think of a completely different mood, and then rewrite the sentence in that mood.
You will ONLY reply with the remooded sentence. You will NOT add fluff, opinions or other text in your response.
Be creative in selecting the new mood.

Example:
Input: a happy young boy is playing at the park. The scene is vivid and has vibrant color usage.
Output: a forlorn young boy sits alone on a swing, the dull colors of the park blending into a gray, lifeless backdrop.

Input: a excited cat is jumping towards a bird.
Output: a weary cat lounges in the shade, watching a bird flit by with disinterest.

Input: old coalminer with a scruffy beard and wrinkles is looking at the camera.
Output: a youthful artist with a smooth face and bright eyes is laughing joyfully in a burst of color

Input: the modern architecture of the office building really brings out the coldness
Output: the quaint charm of the cottage radiates warmth and invites a sense of home.

This is the sentence that needs to be rewritten:
```
### Inline
You can also write the system prompt as one line at the top of the normal prompt if you want to experiment.
```
system: give this animal a friend
cat
```
And then press the 🦙 button.
### settings.json
To select a different model than the default (`hugging-quants/Llama-3.2-3B-Instruct-Q8_0-GGUF`, `*q8_0.gguf`), you can either point RF to a file in `models\llm\`:
"llama_localfile": "L3.2-Rogue-Creative-Instruct-Uncensored-Abliterated-7B-D_AU-Q4_k_s.gguf",
Or a Huggingface repo:
"llama_repo": "DavidAU/L3.2-Rogue-Creative-Instruct-Uncensored-Abliterated-7B-GGUF",
"llama_file": "*Q4_k_m.gguf",
### API
Example from `tools/api_llama.sh`:
```sh
#!/bin/sh
# Small example how to use the gradio api to access llama in RuinedFooocus

USER=${1:-"Please, tell me a joke about generative AI."}

DATA='{
  "data": [
    "You are an witty ai. Please help the user as best as you can.",
    "'$USER'"
  ]
}'

EVENT_ID=$(curl -sX POST http://localhost:7860/gradio_api/call/llama -s -H "Content-Type: application/json" -d "$DATA" | awk -F'"' '{print $4}')

curl -sN http://localhost:7860/gradio_api/call/llama/$EVENT_ID
```
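The same endpoint can also be reached from Python with the `gradio_client` package. The following is a minimal sketch, assuming RF is running locally on the default port and that `/llama` takes the same two inputs (system prompt, then user prompt) as the shell script above:

```python
# Minimal sketch using the gradio_client package (pip install gradio_client).
# Assumes RuinedFooocus is running on http://localhost:7860 and that the
# /llama endpoint takes (system prompt, user prompt), as in the shell example.
from gradio_client import Client

client = Client("http://localhost:7860")
result = client.predict(
    "You are a witty ai. Please help the user as best as you can.",  # system prompt
    "Please, tell me a joke about generative AI.",                   # user prompt
    api_name="/llama",
)
print(result)
```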