Releases: neph1/LlamaTale

v0.16.2

12 Nov 18:38
7187dee

What's Changed

This was supposed to be a small follow-up release to 0.16.1 with just a few minor fixes, but while doing some extensive testing I found a slew of bugs: some stemming from a recent refactor, some probably older. In any case, this should improve quality of life.

An excerpt:

  • Save whether a location has been 'built' or not. Previously, loaded dynamic stories would not generate new locations.
  • The llm_cache was broken and would not use the stored values.
  • Food and drinks didn't work; interacting with anything in a location would throw an exception.
  • Exit descriptions were sometimes broken, just saying 'description'.
  • There was an issue with the combat prompt that led to an exception.

And one new feature:

  • Saving and loading of NPC 'memories', meaning NPCs will retain anything they have learned when a story is loaded.

Full Changelog: v0.16.1...v0.16.2

v0.16.1 - Save/load Anything stories

11 Nov 07:42
58bf230

What's Changed

Anything stories can now be saved and loaded. It's still not user friendly:
story_config.json, world.json and llm_cache.json are saved in the 'stories/anything' folder. To keep them from being overwritten, you will need to move them out of there.
You can then create a new folder with a suitable name and put them there (along with the default story.py).
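The manual file shuffling above can be scripted. A minimal sketch, assuming the paths from the description; the `preserve_story` helper name is mine, not part of LlamaTale:

```python
import shutil
from pathlib import Path

# Files the save writes into 'stories/anything' (per the release notes)
SAVE_FILES = ["story_config.json", "world.json", "llm_cache.json"]

def preserve_story(source: Path, dest: Path) -> list:
    """Move saved story files out of the 'anything' folder so a new
    session doesn't overwrite them. Returns the names actually moved."""
    dest.mkdir(parents=True, exist_ok=True)
    moved = []
    for name in SAVE_FILES:
        src = source / name
        if src.exists():  # only move files the save actually wrote
            shutil.move(str(src), str(dest / name))
            moved.append(name)
    return moved

# Example (remember to also add a default story.py to the new folder):
# preserve_story(Path("stories/anything"), Path("stories/my_story"))
```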

While testing, I realized I had unintentionally been using a feature from the next release: storing and loading NPCs' memories and the llm_cache.
Rather than going through the test phase again, I decided to include it. The feature is not fully tested, but at least it doesn't seem to break anything. If it works, NPCs will remember everything from the last session.

A reminder: save/load is decoupled from Tale's own save/load. I should hide that feature, but for now, ignore any prompts about loading or saving your progress. To save, type 'save_story' at the prompt. Loading is automatic if the files are present in the story's folder.

Wiki entry

Full Changelog: v0.16.0...v0.16.1

v0.16.0 Save stories

29 Oct 16:54
60a5533

A first instance of saving stories has been implemented. It's completely different from the existing Tale serialization and instead uses json. This makes the files easy to edit, if you want to.
Currently it saves:

  • Locations
  • Zones
  • NPCs
  • Items
  • Story background info

There are still several pieces of state that aren't saved, like memories, ongoing story info, dialogues, etc.
So, for now, use this if you find a particular world you would like to visit again.

To save, type 'save_story' in the command prompt.

There's no way to load from inside the game currently (sorry). My suggestion is that when you have saved it:

  • Copy the 'world_story' folder from tests/files into 'stories'
  • Copy the 'world.json' and 'story_config.json' that were saved in the current (anything?) story folder
  • Update story.py to use the new config.
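The copy steps above can be sketched as a small script. Paths are taken from the notes; the helper name is hypothetical, and updating story.py remains a manual step:

```python
import shutil
from pathlib import Path

def setup_saved_story(template: Path, save_dir: Path, target: Path) -> None:
    """Copy the 'world_story' template into 'stories' and drop the saved
    json files next to it. Updating story.py to use the new config is
    still done by hand."""
    shutil.copytree(template, target)  # e.g. tests/files/world_story -> stories/world_story
    for name in ("world.json", "story_config.json"):
        shutil.copy(save_dir / name, target / name)

# Example:
# setup_saved_story(Path("tests/files/world_story"),
#                   Path("stories/anything"),
#                   Path("stories/world_story"))
```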

A bit cumbersome. I'll work out a better system in a future release.

Edit: This currently doesn't work with 'anything' stories. I'll fix that in v0.16.1 later this week.

v0.15.2

15 Oct 08:48
9bed64a

What's Changed

  • I had messed up idle action responses; they are now fixed.
  • The combat prompt now includes 'health status' as input, to give the LLM more to go on.
  • Some preparations for the next new feature.

Full Changelog: v0.15.1...v0.15.2

v0.15.1 Knowledge and memory cache and qol fixes

10 Oct 19:33
339e179

What's Changed

  • Added new caches, similar to how the look caches work. Each action is stored as text, and characters keep a list of hash values used to recall the actions and conversations they have heard (without storing the explicit string for each). This should help them stay on track better.
  • Tweaked some prompts
  • Fixed some parsing of unwanted chars.
  • A bit of refactoring.
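The cache described in the first bullet can be sketched roughly like this; all names are mine, and LlamaTale's actual implementation may differ:

```python
import hashlib

class EventCache:
    """Store each event string once; characters keep only the hashes,
    so many NPCs can recall the same event without duplicating the text."""

    def __init__(self):
        self._events = {}  # hash -> event text

    def remember(self, text: str) -> str:
        """Store an event and return its hash key."""
        key = hashlib.sha1(text.encode("utf-8")).hexdigest()
        self._events[key] = text
        return key

    def recall(self, keys) -> list:
        """Rebuild the text for a character's list of hash keys."""
        return [self._events[k] for k in keys if k in self._events]

# A character's memory is just a list of hashes:
cache = EventCache()
npc_memory = [cache.remember("Elysia greeted the traveler.")]
# cache.recall(npc_memory) reconstructs the text for the prompt.
```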

Known bugs: I've noticed that, due to the long response times from gpt3.5-turbo, idle actions and reactions can end up being cut off, with the odd character showing up in the output. I'll investigate this further.

Full Changelog: v0.15.0...v0.15.1

v0.15.0 Load JSON stories

08 Oct 07:14
6efb341

What's Changed

  • Load stories in json format. "No code"!
  • Actually copy request body to avoid it being overwritten
  • Move prompt setting to llm_io
  • Fix bug where exit descriptions were sometimes missing.
  • Define start and end sequences for prompts.
  • Adding "Tea Party" scenario

Loading JSON stories was one of the first things I set out to do when I forked Tale, but I stumbled at the finish line. I decided to complete it now that I know more about the framework. I intend to use it to create some specific test scenarios and 'sandboxes'. For you, this means you can create your own scenarios without any Python knowledge. I'll do a writeup in the wiki, but the easiest way is to copy "test_story" and modify it according to your needs. You don't need any additional files.

I added "The Tea Party" scenario, a freeform scenario I tried out earlier in pure "LLM story mode" when testing how well it could adhere to scenes. There's nothing there but the guests, so you can listen in on and partake in their conversations.
Start it with: python -m stories.teaparty.story

Reshuffled llm_config and moved start and end tags out of the prompts. This will make it easier to experiment with different ones, or to use those specific to your LLM type, e.g. "[INST]" and "[/INST]" for llama2.
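As a rough illustration of the idea (function name hypothetical), the tags just need to be wrapped around the prompt from config rather than baked into the prompt text:

```python
def wrap_prompt(prompt: str, start: str = "", end: str = "") -> str:
    """Wrap a prompt in model-specific start/end sequences taken from
    config, instead of hardcoding them inside each prompt template."""
    return f"{start}{prompt}{end}"

# llama2-style instruction tags, as mentioned in the notes:
wrap_prompt("Describe the location.", "[INST] ", " [/INST]")
```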

Full Changelog: v0.14.2...v0.15.0

v0.14.2 More items and random encounters

05 Oct 16:01
ccc06e7

What's Changed

  • Better integration with Tale items when generating items
  • Started implementing 'generic' sets of items that can be used per setting, regardless of what the LLM generates for the story
  • Random encounters. When you return to a previously visited location, there's a small chance the LLM will generate new items and npcs for it.
  • Some refactoring.

Known bugs:
Sometimes a starting location doesn't seem to have its exits set up properly. If you return to it, you may not be able to get out.

Full Changelog: v0.14.1...v0.14.2

v0.14.1 Better json handling for local llms and improvements

02 Oct 09:51
9c36e9a

What's Changed

  • Use json grammar when generating creatures and items
  • Remove 'json' prompt when generating dialogue
  • Allow generated NPCs to perform idle actions

Known issues: You can't interact with an NPC in the starting location until you leave and come back.

Full Changelog: v0.14.0...v0.14.1

v0.14.0 Items and npcs generated

01 Oct 14:20
98240dc

For something so small, it took a lot of work: hooking up the generated NPCs (and mobs) and items to the actual game required a lot of testing.
When you build a story, the LLM is tasked with creating a list of item types that are common in the world, as well as common creatures.
These are stored at the 'world' level and presented whenever the LLM generates an area or location. It may choose from them (or make up something completely on its own, which the game will try to handle).
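A rough sketch of how such a catalog lookup might work; all names and logic are illustrative, not LlamaTale's actual code:

```python
# World-level catalog generated when the story is built (illustrative values)
WORLD_ITEMS = {"rusty sword", "healing herb", "torch"}

def resolve_item(name: str) -> tuple:
    """Match an LLM-generated item name against the world catalog,
    falling back to a generic item when the LLM invents its own."""
    key = name.lower()
    if key in WORLD_ITEMS:
        return ("catalog", key)
    return ("generic", name)  # the game tries to handle unknown items
```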

A word of warning: I haven't tested The Prancing Llama for this release, and I've been using ChatGPT-turbo to get some variation in my testing. There may be regression issues, and in that case, please fall back to af7dddf until I've fixed the issues.

I'd like to end with a few words from the wise fairy Elysia from my latest test session:

Elysia, with her delicate petal wings, considers for a moment, and then kindly offers her advice to Test character. She says, 'In your quest for adventure, remember to always stay true to your heart and be open to the unexpected. Delphinia is a realm filled with magical possibilities, where the most extraordinary experiences often come from the simplest of encounters. Embrace the friendships you make along the way, for they will be your greatest strength. And above all, have faith in your own abilities and the power of hope. With these guiding you, you will surely find the courage to face any challenge that comes your way. May your journey be filled with wonder and discovery, brave traveler.'

Full Changelog: v0.13.0...v0.14.0

Edit: A bunch of fixes are sitting in the #41 PR. It improves mostly local use (better json), but also a couple of general improvements. I want to sort out the failing tests before merging, but it's safe to use. Will become 0.14.1

v0.13.0 Generate story from prompt

23 Sep 16:51
a9e39b8

Finally, story generation from scratch has arrived:

  • To generate a story:
  1. Launch with "stories.anything.story"
  2. Press 'n' to not load a saved game
  3. You will be presented with a number of prompts to guide the story generator
  4. In the end, a background story, a zone, and a location will be generated
  5. Currently they're vapor, and I don't think saving them is possible. So enjoy the moment :)
  • JSON grammar for KoboldCpp (and possibly others?)
    I've tried out the 'new' llama.cpp feature that forces the LLM to generate proper JSON, and it's working quite well. All formatting errors are gone. It can still make mistakes in the content of the JSON, though, causing regeneration (sadly, it usually fails consistently).
    But this should help smaller models.

So, is it done now? No, far from it. While it can generate both friendly and hostile NPCs, there are no classic 'mobs' yet. I also need to connect the 'content' it generates to game objects; currently it's mostly a setting. That will be an upcoming task.
Quests etc. still need to be figured out.

v0.13.1: Hotfix for starting zone generation when using OpenAI.

Come to https://discord.gg/dqaNRpzT if you need any help, or post in the Discussions.

Full Changelog: v0.12.0...v0.13.1