Misc. bug: Symbol '\' is not escaped in JSON schema literals #17306

@i-v-s

Description

Name and Version

$ ./build/bin/llama-cli --version
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
version: 7061 (e1fcf8b)
built with cc (Ubuntu 13.3.0-6ubuntu2~24.04) 13.3.0 for x86_64-linux-gnu

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-cli, llama-server

Command line

llama-cli -m ... --json-schema '{"properties":{"code":{"const":"auto s = \"000\";","description":"Generated code","title":"Code","type":"string"}},"required":["code"],"title":"DecoderResponse","type":"object"}'
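
The const value in this schema decodes to auto s = "000"; so its JSON-encoded form, which the generated grammar has to match character for character, contains backslashes. A quick check with nlohmann/json, used here only to make the example self-contained; the schema is trimmed to the relevant field:

#include <iostream>
#include <string>
#include <nlohmann/json.hpp>

int main() {
    // Schema trimmed to the field that triggers the problem.
    const auto schema = nlohmann::json::parse(
        R"({"properties":{"code":{"const":"auto s = \"000\";"}}})");

    const auto value = schema["properties"]["code"]["const"].get<std::string>();
    std::cout << value << "\n";                        // auto s = "000";
    std::cout << nlohmann::json(value).dump() << "\n"; // "auto s = \"000\";"
}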

Problem description & steps to reproduce

When I run the command, I receive the messages below and the program terminates: the grammar generated from the schema fails to parse because the string literal in the code rule contains an incomplete escape sequence.
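
A minimal sketch of the escaping a GBNF string literal needs; the escape_gbnf_literal helper below is made up for illustration and is not the llama.cpp implementation. Both backslash and double quote inside the value have to be escaped, so every inner quote should become \\\" in the rule, while the generated grammar emits \\" and the quote closes the literal early:

#include <iostream>
#include <string>

// Hypothetical helper: escape a string for use inside a GBNF
// double-quoted literal.
static std::string escape_gbnf_literal(const std::string & s) {
    std::string out;
    for (const char c : s) {
        switch (c) {
            case '\\': out += "\\\\"; break;
            case '"':  out += "\\\""; break;
            case '\n': out += "\\n";  break;
            default:   out += c;      break;
        }
    }
    return out;
}

int main() {
    // JSON-encoded form of the const value, i.e. the exact characters
    // the model has to emit: "auto s = \"000\";"
    const std::string json_encoded = "\"auto s = \\\"000\\\";\"";

    // Prints the expected rule:
    //   code ::= "\"auto s = \\\"000\\\";\"" space
    // whereas the current output is:
    //   code ::= "\"auto s = \\"000\\";\"" space
    std::cout << "code ::= \"" << escape_gbnf_literal(json_encoded) << "\" space\n";
}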

First Bad Commit

No response

Relevant log output

...
main: interactive mode on.
parse: error parsing grammar: expecting newline or end at \\";\"" space
code-kv ::= "\"code\"" space ":" space code
root ::= "{" space code-kv "}" space
space ::= | " " | "\n"{1,2} [ \t]{0,20}


code ::= "\"auto s = \\"000\\";\"" space
code-kv ::= "\"code\"" space ":" space code
root ::= "{" space code-kv "}" space
space ::= | " " | "\n"{1,2} [ \t]{0,20}

llama_grammar_init_impl: failed to parse grammar
main: failed to initialize sampling subsystem
