

This is a known “bug” that has been around for a long time, even back in the Llama days, and it is one of the reasons Llama was axed: it did exactly this, but more often than the current model. The current one still does it. At the time of release it was rare, but I suspect it has become more frequent after several updates. I still find it rare myself; the most characters I have had active at the same time in the current model is five, and even that felt like an unusual case, but that is just my own experience.
There is a way to dampen this though, which partially worked in Llama. Keep in mind this is only a dampener, not a cure: I suspect that, because of the way LLMs work, every model in existence will eventually fall into this.
When writing the descriptions of the characters, use three dashes (---) to separate each character. For example:
---
# Character A
<Your description here>
---
# Character B
<Your description here>
---
# Character C
<Your description here>
This is not a fix, but it should make the problem rarer. Hope this helps.
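In case it helps, here is a minimal Python sketch (my own, not anything built into the site) of how you could assemble that block before pasting it into your instructions; the character names and descriptions are obviously placeholders.

```python
# Assemble character descriptions separated by "---", as recommended above.
# Names and descriptions are placeholders; paste the result into your instructions.
characters = {
    "Character A": "<Your description here>",
    "Character B": "<Your description here>",
    "Character C": "<Your description here>",
}

# Put a "---" line before each character block, matching the example above.
prompt_block = "\n".join(
    f"---\n# {name}\n{description}" for name, description in characters.items()
)
print(prompt_block)
```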


What you are describing may be suited to the AI Story Generator, assuming you start the run with the instructions left completely blank and use the “What should happen next?” box as the way to perform actions, declaring them for whichever characters appear in the run.
An even more austere version would be the Prompt Tester: give it a minimal instruction such as “Make me a random adventure”, and then, to continue, paste the contents of the output box into the input box with the extra instruction “Continue this adventure when <what you are meant to perform next>”.
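To make that loop concrete, here is a rough sketch of the copy-paste pattern in Python; build_continuation_prompt is just a name I made up for illustration, not part of the Prompt Tester.

```python
# Rough sketch of the manual "output box back into input box" loop.
def build_continuation_prompt(previous_output: str, next_action: str) -> str:
    # Paste the last output back in, followed by the extra instruction.
    return f"{previous_output}\n\nContinue this adventure when {next_action}"

opening = "Make me a random adventure"  # the minimal starting instruction
# After each run you would do something like:
# next_prompt = build_continuation_prompt(model_output, "I open the door at the end of the hall")
```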
I should give you a fair warning, given how detailed the story you are posting as a bounty is, since I imagine you expect that degree of depth in your runs. In the current state of the model, such projects are nearly impossible and would lead to extreme frustration. The best use of the current model is short, 1-2 hour projects where you aim for a laugh rather than quality or consistency.
If you do have the resources, I’d suggest checking the code of the aforementioned generators, as well as the ai-text-plugin page, to see how they are programmed, and then running your project locally using the resources TRBoom listed earlier. I’ve heard that SillyTavern is also a good alternative. Then again, this all depends on how much hardware you are willing to use.
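If you go the local route, a very rough sketch of what the glue code might look like is below. It assumes your local backend (llama.cpp’s server, for instance) exposes an OpenAI-compatible /v1/completions endpoint on port 8080; the URL, model name, and sampling settings are all placeholders you would swap for your own setup.

```python
import requests

def generate_locally(prompt: str, max_tokens: int = 400) -> str:
    # Assumed OpenAI-compatible endpoint; adjust for your own backend.
    response = requests.post(
        "http://localhost:8080/v1/completions",
        json={
            "model": "local-model",  # placeholder model name
            "prompt": prompt,
            "max_tokens": max_tokens,
            "temperature": 0.8,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["text"]

# Example: print(generate_locally("Make me a random adventure"))
```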