| Publisher | AiKodex |
|---|---|
| File size | 1.97 GB |
| Number of files | 178 |
| Latest version | 1 |
| Latest release date | 2025-01-28 04:12:20 |
| First release date | 2025-01-28 04:12:20 |
| Supported Unity versions | 2018.4.2 or higher |
ChatLab is an LLM-powered Unity editor extension for creating in-game dialogues with branching logic.
LINKS
The documentation page is updated continuously as we add more functionality (faster LLMs on various types of hardware, more templates, multiparty/group settings, and self-branching conversations).
Website and Support | Documentation
FEATURES
💥 Jumpstart Your Creativity
Get started in no time with 7 professionally designed templates. Whether you're a beginner or a pro, these templates are the perfect launchpad for your project.
💬 Dynamic Branching Conversations
Bring your stories to life! Create interactive dialogues with multiple outcomes and endless possibilities. No more one-size-fits-all endings.
🌟 Incredibly Easy to Use
With a lightweight and approachable design, our user interface is as friendly as it gets—no overwhelming menus, just pure productivity.
✨ Run Local Models
Power your conversations offline with support for local LLMs for added speed and privacy.
✨ Seamless OpenAI Integration
Plug in OpenAI models effortlessly and take your storytelling to the next level.
✨ Language Conversion Made Simple
Automatically adapt dialogues for different languages - perfect for global projects.
💾 Save & Load with Ease
Never lose progress! Your dialogue trees and chat logs are auto-restored, and linear chats stay compatible with dialogue trees for ultimate flexibility.
🗄️ Sleek, Creative UI
A UI designed to make you want to create. The perfect mix of node-based flow and modern design principles keeps your workspace visually engaging and streamlined.
EDITOR
🔧 Character-Centric Design
Manage characters with a profile system. Add, switch, and customize character roles, names, and descriptions.
🌐 Dialogue Trees, Simplified
Build branching conversation paths visually using a dynamic, node-based interface. Create multiple dialogue outcomes, alternate responses, and flexible decision chains with a few simple clicks.
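To illustrate the idea, a minimal branching-node structure might look like the sketch below (hypothetical names for illustration only; this is not ChatLab's internal representation):

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One line of dialogue; children hold the branching responses."""
    speaker: str
    text: str
    children: list["DialogueNode"] = field(default_factory=list)

    def add_choice(self, speaker: str, text: str) -> "DialogueNode":
        """Attach an alternate response and return it for chaining."""
        child = DialogueNode(speaker, text)
        self.children.append(child)
        return child

# Build a tiny two-outcome branch.
root = DialogueNode("Guard", "Halt! Who goes there?")
friendly = root.add_choice("Player", "Just a traveler, friend.")
hostile = root.add_choice("Player", "None of your business.")
friendly.add_choice("Guard", "Move along, then.")
hostile.add_choice("Guard", "Watch your tongue!")
```

Each child is a full subtree, so alternate responses and decision chains fall out of the same structure.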
⚡ Generate and Expand with AI
As the number of responses grows, you may need help filling out the entire tree. A linear conversation with 10 turns requires only 10 dialogues, but introduce choice points where the player picks from 4 options at each turn and you'd have to write 2,729 dialogues*. That's why we introduced the LLM-powered auto-reply system: to save you the hassle.
* Developers use merging and dialogue reuse to counter this issue.
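As a sanity check on that count, here is a small sketch. The assumed structure (my reading, not stated explicitly above): one opening line, then five rounds in which every live branch offers 4 player choices, each followed by one reply:

```python
def dialogue_count(rounds: int = 5, choices: int = 4) -> int:
    """Count dialogue lines in a full branching tree: 1 opening line,
    then each round multiplies the live branches by `choices`, and every
    choice needs both a player line and a reply line."""
    total = 1      # the opening line
    branches = 1   # live branches at the current depth
    for _ in range(rounds):
        branches *= choices
        total += 2 * branches  # player choice + reply on every branch
    return total

print(dialogue_count())  # 1 + 2*(4 + 16 + 64 + 256 + 1024) = 2729
```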
With ChatLab, you can automatically populate dialogue choices with AI-generated replies!
Select a node and hit "Generate Reply" or "Generate N Options" for fast branching conversations.
🛠 Interactive Contextual Tools
Each node is packed with easy-to-access options:
- Switch roles or characters within conversations
- Color coded nodes are easy on the eyes
- Edit any part of the node; every field is editable
💾 Integrated Chat Logs
Switch effortlessly between Dialogue Tree and Chat Log views to follow conversation flow in linear or branching formats. Reverse compatibility means your linear chat logs convert smoothly into trees*.
*Dialogue trees cannot be converted back into linear chat logs.
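The one-way direction makes sense: a linear log is just a degenerate tree, so converting it is a simple chain, while collapsing a branched tree back into one line of conversation is ambiguous. A rough sketch with hypothetical types (not ChatLab's actual converter):

```python
def chat_log_to_tree(log: list[tuple[str, str]]) -> dict:
    """Turn a linear (speaker, text) log into a single-branch tree.
    Each node gets exactly one child, so new branches can be grown
    from any point later."""
    root = None
    tail = None
    for speaker, text in log:
        node = {"speaker": speaker, "text": text, "children": []}
        if root is None:
            root = node            # first message becomes the root
        else:
            tail["children"].append(node)  # chain onto the last node
        tail = node
    return root

tree = chat_log_to_tree([("NPC", "Hello."), ("Player", "Hi!")])
```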
✨ Powerful LLM Settings Panel
Access LLM configurations directly in the editor:
- Enable local or OpenAI-powered LLMs
- Adjust settings like temperature and max tokens to fine-tune response length
- Save and load models instantly for flexible performance
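For context, temperature and max tokens are standard sampling parameters shared by most chat-completion APIs. An OpenAI-style request body built from such settings might look like this sketch (illustrative field values only; this is not ChatLab's configuration format):

```python
def build_request(messages, model="gpt-4o-mini",
                  temperature=0.7, max_tokens=256):
    """Assemble an OpenAI-style chat-completions payload.
    `temperature` controls randomness; `max_tokens` caps reply length."""
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,  # 0.0 = deterministic, ~1.0 = creative
        "max_tokens": max_tokens,    # hard limit on generated tokens
    }

body = build_request([{"role": "user", "content": "Greet the player."}],
                     temperature=0.2, max_tokens=64)
```

Lower temperatures keep NPC replies consistent across playthroughs; a small `max_tokens` keeps dialogue lines terse.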
🖼 Drag, Zoom, and Create with Ease
Move seamlessly through your dialogue structures with mouse-based navigation:
- Drag with Alt or mouse buttons
- Zoom in and out to view the big picture or focus on fine details
DEPENDENCIES
This tool requires no external dependencies.
PIPELINES SUPPORTED
- Built-In: Out of the box
- URP / HDRP / SRP: one material needs to be converted to the default Sprite Diffuse shader
LIMITATIONS
Since this tool is still under development, there are a few limitations:
- Currently, using ChatGPT during development is faster than using the local LLM.
- The local LLM sometimes forgets details because its context window is limited to roughly 16k tokens.
- We are working on Unity 6 integration. It will likely require one extra dependency: Newtonsoft JSON.
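A common workaround for a limited context window is to trim the oldest turns before each request. A rough sketch of the idea, using a naive word count as the token estimate (real tokenizers count differently):

```python
def trim_history(messages: list[str], budget: int = 16000) -> list[str]:
    """Keep the most recent messages whose rough token total fits the
    budget. Word count is a crude stand-in for real token counts."""
    kept = []
    used = 0
    for msg in reversed(messages):       # walk newest to oldest
        cost = len(msg.split())
        if used + cost > budget:
            break                        # oldest turns fall off first
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = ["old line " * 10, "recent line"]
trimmed = trim_history(history, budget=5)
```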