Local LLM resulting in no output #71

@Seantourage

Description

I'm attempting to link this to a local LLM. It connects and sends the prompt, and the local LLM properly parses it and responds, but the character output remains blank.

I've tried kobold.cpp and oobabooga's text-generation-webui, both with the custom model config and with the optional proxy script.

I get the same issue with every combination. Any ideas?
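To help isolate where the output is getting lost, here's a minimal sanity check against kobold.cpp's KoboldAI-compatible HTTP API. This is just a sketch assuming the default port 5001; adjust the URL for your setup. If this prints generated text, the backend is fine and the blank output is happening in the client or proxy layer:

```python
# Minimal sanity check for a local kobold.cpp backend.
# Assumes kobold.cpp is serving its KoboldAI-compatible API on the
# default port 5001; change the URL if your instance differs.
import json
import urllib.request

url = "http://localhost:5001/api/v1/generate"
payload = {"prompt": "Hello, how are you?", "max_length": 80}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request and decode the JSON response.
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# kobold.cpp responds with {"results": [{"text": "..."}]}.
print(body["results"][0]["text"])
```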
