Pygmalion 13B: a conversational LLaMA fine-tune
The current Pygmalion-13B was trained as a LoRA, then merged down into the base model. The 4-bit quantization (pygmalion-13b-4bit-128g) was quantized from the decoded pygmalion-13b XOR format. Warning: this model is NOT suitable for use by minors; it will output X-rated content.

This is why I use Pygmalion/Metharme 13B and Vicuna Uncensored (also 13B). Pygmalion 6B is also somewhat outdated by now, though I'd like to see what it could do. After a bit of testing, Manticore-Pygmalion 13B is performing very well in TavernAI; the same goes for WizardLM (13B and 7B), gpt4-alpaca, or even Pyg 7B. I miss having a good GUI and making characters, and the command prompt sucks, but for now it will have to do, because 13B Wizard Vicuna is like night and day versus 7B Pygmalion. The new 7B and 13B models are much smarter and more coherent, and they stay in character better now, where they used to feel generic and uninspiring. Vicuna and WizardLM are by far the best in my experience: not only do the characters often emote like when using Pygmalion 13B, but they are far more coherent and creative in their speech, like in Manticore. One thing I noticed is that you need to use TheBloke/Mythalion-13B-GGUF, specifically mythalion-13b.Q5_K_M.gguf. Some feel Pygmalion is really only good for ERP.

Pygmalion 7B is the model that was trained on Character.AI data. Original model card: PygmalionAI's Mythalion 13B, a merge of Pygmalion-2 13B and MythoMax 13B. The long-awaited release of the new models based on Llama-2 is finally here; as noted on Sep 2, 2023, the most important part of fine-tuning any model is the dataset and its composition.
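The "trained as a LoRA, then merged down to the base" step can be sketched numerically: merging folds the scaled low-rank update into the frozen weight, so the merged model needs no adapter at inference time. A minimal NumPy sketch (the shapes, rank, and alpha below are illustrative assumptions, not the model's actual hyperparameters):

```python
import numpy as np

# Sketch of "trained as a LoRA, then merged down to the base":
# the low-rank update B @ A (scaled by alpha / r) is folded into the
# frozen base weight, so the merged model needs no adapter at runtime.
rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 16              # hidden size, LoRA rank, LoRA alpha (illustrative)

W = rng.normal(size=(d, d))         # frozen base weight
A = rng.normal(size=(r, d))         # LoRA down-projection
B = rng.normal(size=(d, r))         # LoRA up-projection

W_merged = W + (alpha / r) * (B @ A)

# The merged weight reproduces base-plus-adapter output:
x = rng.normal(size=d)
y_separate = W @ x + (alpha / r) * (B @ (A @ x))
print(np.allclose(W_merged @ x, y_separate))  # True
```

In practice this merge is done once, offline, which is why the distributed checkpoint behaves like an ordinary full-weight model.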
The Metharme models were an experiment to get a model usable for conversation, roleplaying, and storywriting, but which can be guided using natural language like other instruct models. They were fine-tuned on a subset of the data from Pygmalion-6B-v8-pt4, for those familiar with the project.

Keep in mind that if a character comes from an established property, Pygmalion probably isn't going to know that character or the property at all. Your Pygmalion character is going to be a lot simpler than the CAI version and will know next to nothing out of the box, so I suggest making OCs (original characters). You can make your own characters with a character creator, add lorebooks/world-info, extra images, and default chat backgrounds, then share them with the community.

The new release is a lot more random and dominant, bringing up ideas that previous iterations never would, and really feeling like you're talking to an actual human. I've tested 7B on oobabooga with an RTX 3090 and it's really good; I'm going to try 13B with int8 later, and I've got 65B downloading for when FlexGen support is implemented. That said, results are mixed: I tried Pyg 13B v2 (Q5_K_M, running via koboldcpp with the recommended settings from Pygmalion's website), and on literally the first generation the model misgendered my character twice. There was also some weirdness with coherency: I don't know how best to explain it, but some of the text contextually made sense yet felt off in an "unhuman" way. I have no idea why, but even the same model in different variants produces answers of noticeably different quality.
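"Guided using natural language like other instruct models" refers to a role-token prompt format. A sketch of assembling a Metharme/Pygmalion-2 style prompt is below; the `<|system|>`/`<|user|>`/`<|model|>` tokens follow the convention commonly described for these models, but verify the exact format against the model card before relying on it:

```python
# Sketch of a Metharme/Pygmalion-2 style instruct prompt. The role
# tokens are an assumption based on the commonly documented convention;
# check the model card for the format your checkpoint expects.
def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    parts = [f"<|system|>{system}"]
    for role, text in turns:
        parts.append(("<|user|>" if role == "user" else "<|model|>") + text)
    parts.append("<|model|>")        # trailing token cues the model to respond
    return "".join(parts)

print(build_prompt(
    "Enter roleplay mode. You are a knowledgeable guide.",
    [("user", "Who are you?")],
))
```

Front ends like SillyTavern or koboldcpp typically build this string for you from the character card and chat history.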
One massive problem with our previous Metharme-7B and 13B releases was that not only were many of their responses considered "soulless", but they also had an undesired alignment carried over from improperly cleaned training data, leading them to often be reluctant, or even refuse, to engage with certain prompts.

Model details: Pygmalion 13B is a dialogue model based on Meta's LLaMA-13B. Mythalion was created in collaboration with Gryphe, as a mixture of Pygmalion-2 13B and Gryphe's MythoMax L2 13B.

On the hardware side, I'm running 13B on my 1060 6GB via llama.cpp now that it has GPU acceleration. You can also run Pygmalion 13B locally on Faraday (https://faraday.dev), in addition to Pyg 7B and Metharme 7B, browse a large collection of characters made by passionate creators, and import or export your characters as you wish without lock-in. Oobabooga supports 4-bit models out of the box and has a useful interface for technical settings; if you go that route and want to chat, it's better to use Tavern as a front end.

Pygmalion is trained on Character.AI-style data and is the best for the RP format, but I also read on the forums that 13B models are much better. I ran GGML variants of regular LLaMA, Vicuna, and a few others, and they did answer more logically and matched the prescribed character much better, but all answers were plain chat or story generation in the command line. Pygmalion-CoT 7B feels almost identical to CAI, and if you want to ERP with your bot, Pygmalion is going to beat CAI hands down. I'd still love a 13B Pygmalion, though. Asked to choose between ChatGPT, Google PaLM 2, Google Gemini, Llama 2, Pygmalion 13B, Mythalion 13B, or Chronos-Hermes (GPT-4 is presumably better, but it costs money), generally Mythalion is the answer.
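The quantization levels that keep coming up (int8, Q5_K_M, 4-bit) matter because weight size alone decides whether a 13B model fits in limited VRAM. A rough arithmetic sketch, assuming the weights dominate (real GGUF files add metadata, and inference also needs room for the KV cache):

```python
# Back-of-the-envelope weight sizes for a 13B-parameter model at
# different quantization levels (weights only; actual memory use is
# higher once the KV cache and runtime overhead are included).
PARAMS = 13e9

def weight_gib(bits_per_weight: float) -> float:
    return PARAMS * bits_per_weight / 8 / 2**30

for name, bits in [("fp16", 16), ("int8", 8), ("Q5_K_M (~5.5 bpw)", 5.5), ("4-bit", 4)]:
    print(f"{name:>18}: {weight_gib(bits):5.1f} GiB")
```

This is why a Q5_K_M 13B model does not fully fit a 6 GB card and needs partial CPU offload, while 4-bit gets close; the exact split depends on the runtime.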
I still get better responses on old classic Pygmalion (though inference is very slow since I'm on ROCm and can't load in 8-bit, so I have to load a good chunk of the model in system RAM). Running locally allows for more customization and it's more fun. In my experience, I have had extremely immersive roleplay with Mythalion 13B (8tgi-fp16, 8k context) from Kobold Horde, with an average response time of 13 to 20 seconds and never more than 50, and I must admit it recognizes the anatomy of the characters decently without needing formats such as Ali:Chat + PList.

People in the Discord have also suggested fine-tuning Pygmalion on LLaMA-7B instead of GPT-J-6B; I hope they do, because it would be incredible.

The platform bills itself as an open platform for characters, and it works a lot like Character.AI once you're in: you set greetings, descriptions, personality, example chats, and so on. Keep in mind that to use other people's characters, you'll have to download their character card from the Pyg Discord server. A character card is literally just a PNG, but it holds all of the character's information; once downloaded, you import it. The developers also shipped major updates to the character creation and role-play flows, thanking the 150+ people in the community who provided feedback in Discord during the first week. So far it seems like the best of both worlds.
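The "a character card is literally just a PNG" point works because the character JSON is embedded in the image's metadata. A common convention (assumed here; front ends differ on the exact key and schema) is a PNG tEXt chunk keyed "chara" holding base64-encoded JSON. A stdlib-only sketch of writing and reading such a chunk:

```python
import base64, json, struct, zlib

def _chunk(ctype: bytes, body: bytes) -> bytes:
    # PNG chunk: 4-byte length, 4-byte type, body, 4-byte CRC of type+body.
    crc = zlib.crc32(ctype + body) & 0xFFFFFFFF
    return struct.pack(">I", len(body)) + ctype + body + struct.pack(">I", crc)

def embed_card(png: bytes, card: dict) -> bytes:
    """Insert a tEXt 'chara' chunk just before the IEND chunk."""
    payload = b"chara\x00" + base64.b64encode(json.dumps(card).encode())
    iend = png.rfind(b"IEND") - 4          # back up over the length field
    return png[:iend] + _chunk(b"tEXt", payload) + png[iend:]

def read_card(png: bytes):
    """Walk the chunk stream and decode the 'chara' tEXt chunk, if any."""
    pos = 8                                 # skip the 8-byte PNG signature
    while pos < len(png):
        length, ctype = struct.unpack(">I4s", png[pos:pos + 8])
        body = png[pos + 8:pos + 8 + length]
        if ctype == b"tEXt" and body.startswith(b"chara\x00"):
            return json.loads(base64.b64decode(body[6:]))
        pos += 12 + length                  # length + type + body + CRC
    return None

# Round-trip against a minimal 1x1 grayscale PNG built from scratch:
sig = b"\x89PNG\r\n\x1a\n"
ihdr = _chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
idat = _chunk(b"IDAT", zlib.compress(b"\x00\x00"))  # filter byte + one pixel
png = sig + ihdr + idat + _chunk(b"IEND", b"")
card = {"name": "Aqua", "description": "An original character."}
print(read_card(embed_card(png, card)))
```

Because the payload rides in a metadata chunk, the image still opens as an ordinary PNG anywhere, which is why cards can be shared as plain image files on Discord.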
Original model card: PygmalionAI's Pygmalion-2 13B. Pygmalion-2 13B (formerly known as Metharme) is an instruction-tuned Llama-2 biased towards fiction writing and conversation.