
chore(deps): bump node-llama-cpp from 3.17.1 to 3.18.1 #13

Open

dependabot[bot] wants to merge 1 commit into main from dependabot/npm_and_yarn/node-llama-cpp-3.18.1

Conversation


dependabot[bot] commented on behalf of GitHub on Mar 29, 2026

Bumps node-llama-cpp from 3.17.1 to 3.18.1.
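For a project pinning an exact version, a bump like this is a one-line change in package.json; a hypothetical fragment (only the package name and versions come from this PR, the surrounding fields are illustrative):

```json
{
  "dependencies": {
    "node-llama-cpp": "3.18.1"
  }
}
```

Note that if the dependency were declared with a caret range such as "^3.17.1", this minor update would already satisfy the range and only the lockfile would change.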

Release notes

Sourced from node-llama-cpp's releases.

v3.18.1

3.18.1 (2026-03-17)

Features


Shipped with llama.cpp release b8390

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest.

v3.18.0

3.18.0 (2026-03-15)

Features

  • automatic checkpoints for models that need it (#573) (c641959)
  • QwenChatWrapper: Qwen 3.5 support (#573) (c641959)
  • inspect gpu command: detect and report missing prebuilt binary modules and custom npm registry (#573) (c641959)

Bug Fixes

  • resolveModelFile: deduplicate concurrent downloads (#570) (cc105b9)
  • correct Vulkan URL casing in documentation links (#568) (5a44506)
  • Qwen 3.5 memory estimation (#573) (c641959)
  • grammar use with HarmonyChatWrapper (#573) (c641959)
  • add mistral think segment detection (#573) (c641959)
  • compress excessively long segments from the current response on context shift instead of throwing an error (#573) (c641959)
  • default thinking budget to 75% of the context size to prevent low-quality responses (#573) (c641959)
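The resolveModelFile deduplication fix (#570) describes a common pattern: cache the in-flight promise per URI so concurrent callers share one download instead of starting duplicates. A minimal self-contained sketch of that pattern, not the library's actual implementation (a fake download with a timer stands in for the real network call):

```typescript
// Promise-cache deduplication: concurrent requests for the same key
// share a single in-flight promise instead of starting duplicate work.
const inFlight = new Map<string, Promise<string>>();
let downloadCount = 0; // instrumentation for this demo only

async function fakeDownload(uri: string): Promise<string> {
  downloadCount++;
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulated latency
  return `/models/${uri}`;
}

function resolveOnce(uri: string): Promise<string> {
  let pending = inFlight.get(uri);
  if (!pending) {
    // Drop the cache entry once settled so a later call can re-download.
    pending = fakeDownload(uri).finally(() => inFlight.delete(uri));
    inFlight.set(uri, pending);
  }
  return pending;
}
```

Two overlapping calls with the same URI resolve to the same path while the underlying download runs once.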

Shipped with llama.cpp release b8352

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest.

Commits
  • 57bea3d feat(minor): customize postinstall behavior (#582)
  • c641959 feat: automatic checkpoints for models that need it (#573)
  • cc105b9 fix(resolveModelFile): deduplicate concurrent downloads (#570)
  • 5a44506 fix: correct Vulkan URL casing in documentation links (#568)
  • See full diff in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [node-llama-cpp](https://github.com/withcatai/node-llama-cpp) from 3.17.1 to 3.18.1.
- [Release notes](https://github.com/withcatai/node-llama-cpp/releases)
- [Commits](withcatai/node-llama-cpp@v3.17.1...v3.18.1)

---
updated-dependencies:
- dependency-name: node-llama-cpp
  dependency-version: 3.18.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies and javascript labels on Mar 29, 2026

Labels

dependencies: Pull requests that update a dependency file
javascript: Pull requests that update JavaScript code

Projects

None yet

Development

Successfully merging this pull request may close these issues.

0 participants