Yahoo Web Search

Search results

  1. pygmalion.chat › characters · PygmalionAI

    Pygmalion AI. An open-source project dedicated to creating large language models for chat and role-play purposes.

  2. PygmalionAI · Open Source Conversational AI Research · 485 followers · https://pygmalion.chat · https://huggingface.co/PygmalionAI

    README.md: Pygmalion. We are a group dedicated to creating open dialogue models that anyone can freely use. You can find our website here. In addition, you can find us on HuggingFace. Pinned: training-code (Public).

    • Model Description
    • Training Data
    • Training Procedure
    • Intended Use
    • Known Issues

    Pygmalion 6B is a proof-of-concept dialogue model based on EleutherAI's GPT-J-6B. Warning: This model is NOT suitable for use by minors. It will output X-rated content under certain circumstances.

    The fine-tuning dataset consisted of 56MB of dialogue data gathered from multiple sources, which includes both real and partially machine-generated conversations.

    Model weights were initialized from the uft-6b ConvoGPT model made available in this commit. The model was then further fine-tuned on ~48.5 million tokens for ~5k steps on 4 NVIDIA A40s using DeepSpeed.
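    For a rough sense of scale, a back-of-envelope check of those figures (assuming the token and step counts are run-wide totals rather than per-GPU; the card does not say):

```python
# Back-of-envelope arithmetic from the reported training figures
# (~48.5M tokens over ~5k steps on 4 GPUs). Assumes the totals are
# run-wide, not per-GPU.
total_tokens = 48.5e6
steps = 5_000
gpus = 4

tokens_per_step = total_tokens / steps        # ~9,700 tokens per optimizer step
tokens_per_gpu_step = tokens_per_step / gpus  # ~2,425 tokens per GPU per step

print(f"{tokens_per_step:,.0f} tokens/step, {tokens_per_gpu_step:,.0f} tokens/GPU/step")
# With a 2,048-token context, that is roughly 1-2 sequences per GPU per step
# (before any gradient accumulation).
```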

    The easy way

    We provide a notebook with a Gradio UI for playing around with the model without having to manually format inputs. This notebook can be found here.
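    For readers without the notebook at hand, a minimal sketch of what such a Gradio wrapper could look like; this is not the notebook's code, and the model ID "PygmalionAI/pygmalion-6b" and the generation settings are assumptions:

```python
# Minimal sketch of a Gradio text box around the model, assuming the
# HuggingFace model ID "PygmalionAI/pygmalion-6b". Not the official notebook.
import gradio as gr
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="PygmalionAI/pygmalion-6b",  # assumption: public HF repo name
    device_map="auto",
)

def chat(prompt: str) -> str:
    # Return only the newly generated continuation, not the echoed prompt.
    out = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.8)
    return out[0]["generated_text"][len(prompt):]

gr.Interface(fn=chat, inputs="text", outputs="text").launch()
```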

    The manual way

    The model can be used as a regular text generation model, but it'll perform best if the input prompt adheres to the following format: Where [CHARACTER] is, as you can probably guess, the name of the character you want the model to portray, <START> should be used verbatim as a delimiter token to separate persona and scenario data from the dialogue, and [DIALOGUE HISTORY] is chat history so the model can have some conversational context to draw from. Ideally it'll be pairs of messages like: Apar...
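    A minimal sketch of building a prompt along those lines and generating a reply. The exact template wording is reconstructed from the description above, the persona and messages are made-up examples, and the model ID "PygmalionAI/pygmalion-6b" is an assumption:

```python
# Sketch of the persona / <START> / dialogue-history prompt layout described
# above, followed by a single generation call. Template wording, example
# character, and model ID are assumptions, not verbatim model-card text.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PygmalionAI/pygmalion-6b"  # assumption
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

character = "Aqua"
persona = "Aqua is a cheerful, slightly clumsy goddess who loves attention."
history = [
    ("You", "Hi Aqua, how are you today?"),
    (character, "I'm wonderful, as always! Why do you ask?"),
]

# Persona and scenario data, the <START> delimiter, then chat history,
# ending with the character name so the model continues as that character.
prompt = f"{character}'s Persona: {persona}\n<START>\n"
prompt += "".join(f"{speaker}: {text}\n" for speaker, text in history)
prompt += "You: Want to go on an adventure?\n"
prompt += f"{character}:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=120, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```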

    We haven't played around with the model enough to enumerate its known issues. Feel free to give us some feedback!

  3. Pygmalion-2 13B (formerly known as Metharme) is based on Llama-2 13B released by Meta AI. The Metharme models were an experiment to try and get a model that is usable for conversation, roleplaying and storywriting, but which can be guided using natural language like other instruct models.
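    As a rough illustration of what "guided using natural language" means in practice, here is a sketch of an instruct-style prompt. The role tokens <|system|>, <|user|> and <|model|> are assumed from the Metharme lineage; check the model card for the exact format:

```python
# Sketch of an instruct-style prompt for Pygmalion-2 13B. The role tokens
# <|system|>, <|user|> and <|model|> are assumptions drawn from the Metharme
# lineage; verify against the actual model card before relying on them.
system = (
    "Enter roleplay mode. You are a knight named Ser Brante. "
    "Stay in character and keep replies to a few sentences."
)
user = "Ser Brante, what brings you to the capital?"

prompt = f"<|system|>{system}<|user|>{user}<|model|>"
print(prompt)
```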

  4. Pygmalion 7B · A conversational LLaMA fine-tune · Model Details: Pygmalion 7B is a dialogue model based on Meta's LLaMA-7B. This is version 1. It has been fine-tuned using a subset of the data from Pygmalion-6B-v8-pt4, for those of you familiar with the project. Applying the XORs.
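    The "Applying the XORs" step refers to recovering the fine-tuned weights by XOR-ing the released diff files against the original LLaMA-7B weights. The repository ships its own script for this; purely as a sketch of the idea, with hypothetical file paths and the assumption that matching files are the same length:

```python
# Sketch of the XOR idea only: recover a fine-tuned shard by XOR-ing a
# released diff file against the corresponding original LLaMA-7B file.
# Use the repository's own script for the real procedure; the paths and
# chunk size here are hypothetical, and both files are assumed equal length.
def xor_files(diff_path: str, base_path: str, out_path: str, chunk: int = 1 << 20) -> None:
    with open(diff_path, "rb") as diff, open(base_path, "rb") as base, open(out_path, "wb") as out:
        while True:
            a = diff.read(chunk)
            b = base.read(chunk)
            if not a and not b:
                break
            out.write(bytes(x ^ y for x, y in zip(a, b)))

xor_files("xor_encoded/shard-00.bin", "llama-7b/shard-00.bin", "pygmalion-7b/shard-00.bin")
```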

  5. PygmalionAI is a platform for large language model (LLM) applications. Learn about LLMs, system requirements, installation, backends, frontends, tools, bot creation and more.
