Impressed with Depth of Knowledge
This model has been fairly impressive given the depth of knowledge I've uncovered packed into it. Specifically, I've been using the Q4_K_M quant in both Backyard and Kobold.
Just a few examples: It not only knew both Ken Russell's movie 'Gothic' and Mike Leigh's movie 'Naked'; its depth of knowledge was startling. It knew characters, settings and storylines. It not only accurately identified the lead from 'Naked', Johnny, but could discuss the film's grit and raw character interactions. It then went on, across multiple sentences, to talk about how London itself was portrayed in the film. For 'Gothic', it accurately discussed the friendship/power dynamic between the characters Lord Byron and Percy Shelley, the film's dark, brooding setting and the plush, baroque opulence of the set, as well as the characters' use of opium wine.
It brought up the band Swans unprompted, beyond a request to suggest appropriate background music. It did hallucinate an album title, but the title itself was in keeping with Swans' album titles.
It demonstrated good familiarity with the Japanese dance movement Butoh, accurately describing a Butoh performance by a character.
It appears to have a rudimentary knowledge of the occult, though not greater than what one might expect from an average, disinterested intellectual.
My only critique is that it is - as are many models - prone to repetition, even with non-default sampler settings intended to curb repetition and enhance creativity.
Overall, it's been great, even in head-to-head, non-party chats. Thanks!
ASIDE: I've been feeding the chatlogs to NotebookLM to generate summaries for lorebooks, as a way to artificially extend context. Plus, the 20-minute audio summaries can be pretty entertaining themselves.