diff --git a/0_app/0_root/index.md b/0_app/0_root/index.md index 7770d46..8162d21 100644 --- a/0_app/0_root/index.md +++ b/0_app/0_root/index.md @@ -52,6 +52,12 @@ You can attach documents to your chat messages and interact with them entirely o Read more about how to use this feature in the [Chat with Documents](app/basics/rag) guide. +## Run LM Studio without the GUI (llmster) + +llmster is the headless version of LM Studio; no desktop app is required. It's ideal for servers, CI environments, or any machine where you don't need a GUI. + +Learn more: [Headless Mode](/docs/developer/core/headless). + ## Use LM Studio's API from your own apps and scripts LM Studio provides a REST API that you can use to interact with your local models from your own apps and scripts. diff --git a/1_developer/0_core/headless.md b/1_developer/0_core/headless.md index 971ef49..5b7ee94 100644 --- a/1_developer/0_core/headless.md +++ b/1_developer/0_core/headless.md @@ -1,31 +1,72 @@ --- title: "Run LM Studio as a service (headless)" -sidebar_title: "Headless Mode" +sidebar_title: "`llmster` - Headless Mode" description: "GUI-less operation of LM Studio: run in the background, start on machine login, and load models on demand" index: 2 --- -LM Studio can be run as a service without the GUI. This is useful for running LM Studio on a server or in the background on your local machine. This works on Mac, Windows, and Linux machines with a graphical user interface. +LM Studio can be run as a background service without the GUI. There are two ways to do this: -## Run LM Studio as a service +1. **llmster** (recommended) — a standalone daemon, no GUI required +2. **Desktop app in headless mode** — hide the UI and run the desktop app as a service -Running LM Studio as a service consists of several new features intended to make it more efficient to use LM Studio as a developer tool. +## Option 1: llmster (recommended) -1. The ability to run LM Studio without the GUI -2. 
The ability to start the LM Studio LLM server on machine login, headlessly -3. On-demand model loading +llmster is the core of the LM Studio desktop app, packaged to be server-native with no reliance on the GUI. It can run on Linux boxes, cloud servers, GPU rigs, or your local machine. See the [LM Studio 0.4.0 release post](/blog/0.4.0) for more details. -## Run the LLM service on machine login -To enable this, head to app settings (`Cmd` / `Ctrl` + `,`) and check the box to run the LLM server on login. + +### Install llmster + +**Linux / Mac** + +```bash +curl -fsSL https://lmstudio.ai/install.sh | bash +``` + +**Windows** + +```powershell +irm https://lmstudio.ai/install.ps1 | iex +``` + +### Start llmster + +```bash +lms daemon up +``` + + +See the [daemon CLI docs](/docs/cli/daemon/daemon-up) for the full reference. + +For setting up llmster as a startup task on Linux, see [Linux Startup Task](/docs/developer/core/headless_llmster). + +## Option 2: Desktop app in headless mode + +This works on Mac, Windows, and Linux machines with a graphical user interface. It's useful if you already have the desktop app installed and want it to run as a background service. + +### Run the LLM service on machine login + +Head to app settings (`Cmd` / `Ctrl` + `,`) and check the box to run the LLM server on login. When this setting is enabled, exiting the app will minimize it to the system tray, and the LLM server will continue to run in the background. +### Auto Server Start + +Your last server state will be saved and restored on app or service launch. + +To achieve this programmatically: + +```bash +lms server start +``` + ## Just-In-Time (JIT) model loading for REST endpoints -Useful when utilizing LM Studio as an LLM service with other frontends or applications. +Applies to both options. Useful when using LM Studio as an LLM service with other frontends or applications. 
@@ -43,16 +84,6 @@ Useful when utilizing LM Studio as an LLM service with other frontends or applic JIT loaded models will be auto-unloaded from memory by default after a set period of inactivity ([learn more](/docs/developer/core/ttl-and-auto-evict)). -## Auto Server Start - -Your last server state will be saved and restored on app or service launch. - -To achieve this programmatically, you can use the following command: - -```bash -lms server start -``` - ### Community Chat with other LM Studio developers, discuss LLMs, hardware, and more on the [LM Studio Discord server](https://discord.gg/aPQfnNkxGC). diff --git a/1_developer/0_core/lmlink.md b/1_developer/0_core/lmlink.md new file mode 100644 index 0000000..2accab0 --- /dev/null +++ b/1_developer/0_core/lmlink.md @@ -0,0 +1,22 @@ +--- +title: Using LM Link +sidebar_title: Using with LM Link +description: Use a remote device's model via the REST API with LM Link +index: 3 +--- + +## Overview + +With [LM Link](/docs/lmlink), you can use a model loaded on a remote device as if it were loaded locally — from any machine on the same link. This naturally extends to the REST API and SDK: your laptop can make requests to `localhost` and have them served by a powerful remote machine on your network. + +Requests to `localhost` still work as normal. LM Studio internally uses the model on the remote device as if it were loaded locally. For models present on multiple devices, the REST API will use the model on the preferred device. + + + +The preferred device setting is per-machine. Each device on the link independently controls which remote machine it prefers. See [how to set a preferred device](/docs/lmlink/basics/preferred-device) for more details. + +## Use the REST API as normal + +Use the REST API exactly as you would locally. See the [REST API docs](/docs/developer/rest) for usage details. 
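For instance, a standard chat request to `localhost` works unchanged; the model serving it may live on a remote device. A minimal sketch using Python's standard library and the OpenAI-compatible `/v1/chat/completions` endpoint (the model name `qwen3-8b` is just an example):

```python
import json
import urllib.request

# A normal local request. With LM Link enabled, LM Studio can serve it with a
# model loaded on a remote device; nothing about the request itself changes.
payload = {
    "model": "qwen3-8b",  # example model name
    "messages": [{"role": "user", "content": "Hello!"}],
}
request = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Requires a running server:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["message"]["content"])
```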
+ +If you're running into trouble, hop onto our [Discord](https://discord.gg/lmstudio). diff --git a/1_developer/2_rest/load.md b/1_developer/2_rest/load.md index 04fa89b..7f1e556 100644 --- a/1_developer/2_rest/load.md +++ b/1_developer/2_rest/load.md @@ -54,7 +54,7 @@ variants: "model": "openai/gpt-oss-20b", "context_length": 16384, "flash_attention": true, - "echo_load_config": true, + "echo_load_config": true }' ``` ```` diff --git a/1_developer/index.md b/1_developer/index.md index 660ce6b..febc5cc 100644 --- a/1_developer/index.md +++ b/1_developer/index.md @@ -111,6 +111,6 @@ Full docs: [LM Studio REST API](/docs/developer/rest) ## Helpful links - [API Changelog](/docs/developer/api-changelog) -- [Local server basics](/docs/developer/core) +- [Local server basics](/docs/developer/core/server) - [CLI reference](/docs/cli) - [Discord Community](https://discord.gg/lmstudio) diff --git a/3_cli/2_daemon/daemon-down.md b/3_cli/2_daemon/daemon-down.md new file mode 100644 index 0000000..6725958 --- /dev/null +++ b/3_cli/2_daemon/daemon-down.md @@ -0,0 +1,20 @@ +--- +title: "`lms daemon down`" +sidebar_title: "`lms daemon down`" +description: Stop llmster from the CLI. +index: 2 +--- + +The `lms daemon down` command stops the llmster daemon. + +```shell +lms daemon down +``` + +```lms_info +`lms daemon down` only works if llmster is running. It will not stop LM Studio if it is running as a GUI app. +``` + +### Learn more + +To find out more about llmster, see [Headless Mode](/docs/developer/core/headless). diff --git a/3_cli/2_daemon/daemon-status.md b/3_cli/2_daemon/daemon-status.md new file mode 100644 index 0000000..9d45df9 --- /dev/null +++ b/3_cli/2_daemon/daemon-status.md @@ -0,0 +1,48 @@ +--- +title: "`lms daemon status`" +sidebar_title: "`lms daemon status`" +description: Check whether llmster is running. +index: 3 +--- + +The `lms daemon status` command reports whether llmster is currently running. 
+ +### Flags + +```lms_params +- name: "--json" + type: "flag" + optional: true + description: "Output the status in JSON format" +``` + +## Check daemon status + +```shell +lms daemon status +``` + +### JSON output + +For scripting or automation: + +```shell +lms daemon status --json +``` + +Example output when running: +```json +{"status":"running","pid":12345,"isDaemon":true} +``` + +Example output when not running: +```json +{"status":"not-running"} +``` + +### Start or stop the daemon + +- [`lms daemon up`](/docs/cli/daemon/daemon-up) — start the daemon. +- [`lms daemon down`](/docs/cli/daemon/daemon-down) — stop the daemon. + +To find out more about llmster, see [Headless Mode](/docs/developer/core/headless). \ No newline at end of file diff --git a/3_cli/2_daemon/daemon-up.md b/3_cli/2_daemon/daemon-up.md new file mode 100644 index 0000000..0bbfd28 --- /dev/null +++ b/3_cli/2_daemon/daemon-up.md @@ -0,0 +1,46 @@ +--- +title: "`lms daemon up`" +sidebar_title: "`lms daemon up`" +description: Start llmster from the CLI. +index: 1 +--- + +The `lms daemon up` command starts llmster. + +### Flags + +```lms_params +- name: "--json" + type: "flag" + optional: true + description: "Output the result in JSON format" +``` + +## Start the daemon + +```shell +lms daemon up +``` + +If the daemon is not already running, this starts it and prints the PID. If it is already running, it reports the current status. + +### JSON output + +For scripting or automation: + +```shell +lms daemon up --json +``` + +Example output: +```json +{"status":"running","pid":26754,"isDaemon":true,"version":"0.4.4+1"} +``` + +### Check the daemon status + +See [`lms daemon status`](/docs/cli/daemon/daemon-status) to check whether the daemon is running. + +### Learn more + +To find out more about llmster, see [Headless Mode](/docs/developer/core/headless). 
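The `--json` flag on these commands is meant for scripting. As a minimal sketch (assuming `lms` is on your `PATH`; the JSON shapes match the examples above), a helper that checks the daemon's state might look like:

```python
import json
import subprocess

def parse_daemon_status(raw: str) -> bool:
    """Return True if the status JSON (shapes shown above) reports a running daemon."""
    return json.loads(raw).get("status") == "running"

def daemon_is_running() -> bool:
    # Assumes `lms` is available on PATH.
    out = subprocess.run(
        ["lms", "daemon", "status", "--json"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_daemon_status(out)

# Example payloads from the docs above:
print(parse_daemon_status('{"status":"running","pid":12345,"isDaemon":true}'))  # True
print(parse_daemon_status('{"status":"not-running"}'))  # False
```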
diff --git a/3_cli/2_daemon/daemon-update.md b/3_cli/2_daemon/daemon-update.md new file mode 100644 index 0000000..5fbc3ef --- /dev/null +++ b/3_cli/2_daemon/daemon-update.md @@ -0,0 +1,49 @@ +--- +title: "`lms daemon update`" +sidebar_title: "`lms daemon update`" +description: Update llmster to the latest version. +index: 4 +--- + +The `lms daemon update` command fetches and installs the latest version of llmster. + +### Flags + +```lms_params +- name: "--beta" + type: "flag" + optional: true + description: "Update to the latest beta release" +``` + +## Update the daemon + +Stop the daemon first: + +```shell +lms daemon down +``` + +Then run the update: + +```shell +lms daemon update +``` + +Fetches the latest stable release and installs it. + +### Update to the beta channel + +```shell +lms daemon update --beta +``` + +### After updating + +Start the daemon again to use the new version: + +```shell +lms daemon up +``` + +To find out more about llmster, see [Headless Mode](/docs/developer/core/headless). diff --git a/3_cli/3_link/link-disable.md b/3_cli/3_link/link-disable.md new file mode 100644 index 0000000..f997de5 --- /dev/null +++ b/3_cli/3_link/link-disable.md @@ -0,0 +1,20 @@ +--- +title: "`lms link disable`" +sidebar_title: "`lms link disable`" +description: Disable LM Link on this device from the CLI. +index: 2 +--- + +The `lms link disable` command disables LM Link on this device. The device will no longer connect to or be visible to other devices on the link. + +## Disable LM Link + +```shell +lms link disable +``` + +You can re-enable LM Link at any time with [`lms link enable`](/docs/cli/link/link-enable). + +### Learn more + +See the [LM Link documentation](/docs/lmlink) for a full overview of LM Link. 
\ No newline at end of file diff --git a/3_cli/3_link/link-enable.md b/3_cli/3_link/link-enable.md new file mode 100644 index 0000000..413bcae --- /dev/null +++ b/3_cli/3_link/link-enable.md @@ -0,0 +1,32 @@ +--- +title: "`lms link enable`" +sidebar_title: "`lms link enable`" +description: Enable LM Link on this device from the CLI. +index: 1 +--- + +The `lms link enable` command enables LM Link on this device, allowing it to connect with other devices on the same link. + +```lms_info +LM Link requires an LM Studio account. Run `lms login` first if you haven't already. +``` + +## Enable LM Link + +```shell +lms link enable +``` + +After enabling, the CLI waits for a connection to be established. If there are issues, the relevant next step is printed. + +### Check the connection status + +See [`lms link status`](/docs/cli/link/link-status) to verify the connection and see connected peers. + +### Disable LM Link + +See [`lms link disable`](/docs/cli/link/link-disable) to turn LM Link off. + +### Learn more + +See the [LM Link documentation](/docs/lmlink) for a full overview of LM Link. diff --git a/3_cli/3_link/link-set-device-name.md b/3_cli/3_link/link-set-device-name.md new file mode 100644 index 0000000..210c388 --- /dev/null +++ b/3_cli/3_link/link-set-device-name.md @@ -0,0 +1,20 @@ +--- +title: "`lms link set-device-name`" +sidebar_title: "`lms link set-device-name`" +description: Rename this device on LM Link from the CLI. +index: 4 +--- + +The `lms link set-device-name` command sets a display name for this device, visible to other devices on the link. + +## Rename this device + +```shell +lms link set-device-name "My Mac Studio" +``` + +The new name takes effect immediately and is visible to connected peers via [`lms link status`](/docs/cli/link/link-status). + +### Learn more + +See the [LM Link documentation](/docs/lmlink) for a full overview of LM Link. 
\ No newline at end of file diff --git a/3_cli/3_link/link-set-preferred-device.md b/3_cli/3_link/link-set-preferred-device.md new file mode 100644 index 0000000..c494a80 --- /dev/null +++ b/3_cli/3_link/link-set-preferred-device.md @@ -0,0 +1,30 @@ +--- +title: "`lms link set-preferred-device`" +sidebar_title: "`lms link set-preferred-device`" +description: Set the preferred device for model resolution on LM Link. +index: 5 +--- + +The `lms link set-preferred-device` command sets which device on the link is used when a model is available on multiple connected devices. + +## Set a preferred device + +Run the command without arguments to pick from an interactive list of connected devices: + +```shell +lms link set-preferred-device +``` + +Or pass the device identifier directly to skip the prompt: + +```shell +lms link set-preferred-device <device> +``` + +Device identifiers are listed in the output of [`lms link status`](/docs/cli/link/link-status). + +See [Using LM Link with the REST API](/docs/developer/core/lmlink) for more on how preferred devices affect model routing. + +### Learn more + +See the [LM Link documentation](/docs/lmlink) for a full overview of LM Link. diff --git a/3_cli/3_link/link-status.md b/3_cli/3_link/link-status.md new file mode 100644 index 0000000..e570e12 --- /dev/null +++ b/3_cli/3_link/link-status.md @@ -0,0 +1,42 @@ +--- +title: "`lms link status`" +sidebar_title: "`lms link status`" +description: Check LM Link connection status and see connected peers. +index: 3 +--- + +The `lms link status` command shows whether LM Link is enabled on this device, and lists connected peers and their loaded models. + +### Flags + +```lms_params +- name: "--json" + type: "flag" + optional: true + description: "Output the status in JSON format" +``` + +## Check status + +```shell +lms link status +``` + +Displays this device's name, connection state, and a list of connected peers with their currently loaded models. 
+ +### JSON output + +For scripting or automation: + +```shell +lms link status --json +``` + +### Enable or disable LM Link + +- [`lms link enable`](/docs/cli/link/link-enable) — enable LM Link on this device. +- [`lms link disable`](/docs/cli/link/link-disable) — disable LM Link on this device. + +### Learn more + +See the [LM Link documentation](/docs/lmlink) for a full overview of LM Link. diff --git a/3_cli/2_runtime/runtime.md b/3_cli/4_runtime/runtime.md similarity index 100% rename from 3_cli/2_runtime/runtime.md rename to 3_cli/4_runtime/runtime.md diff --git a/3_cli/3_develop-and-publish/clone.md b/3_cli/5_develop-and-publish/clone.md similarity index 100% rename from 3_cli/3_develop-and-publish/clone.md rename to 3_cli/5_develop-and-publish/clone.md diff --git a/3_cli/3_develop-and-publish/dev.md b/3_cli/5_develop-and-publish/dev.md similarity index 100% rename from 3_cli/3_develop-and-publish/dev.md rename to 3_cli/5_develop-and-publish/dev.md diff --git a/3_cli/3_develop-and-publish/login.md b/3_cli/5_develop-and-publish/login.md similarity index 100% rename from 3_cli/3_develop-and-publish/login.md rename to 3_cli/5_develop-and-publish/login.md diff --git a/3_cli/3_develop-and-publish/push.md b/3_cli/5_develop-and-publish/push.md similarity index 100% rename from 3_cli/3_develop-and-publish/push.md rename to 3_cli/5_develop-and-publish/push.md diff --git a/3_cli/index.md b/3_cli/index.md index f284973..ea4423c 100644 --- a/3_cli/index.md +++ b/3_cli/index.md @@ -29,6 +29,8 @@ lms --help | See models loaded into memory | `lms ps` | [Guide](/docs/cli/local-models/ps) | | Control the server | `lms server start` | [Guide](/docs/cli/serve/server-start) | | Manage the inference runtime | `lms runtime` | [Guide](/docs/cli/runtime) | +| Manage the headless daemon | `lms daemon` | [Guide](/docs/cli/daemon/daemon-up) | +| Manage LM Link | `lms link` | [Guide](/docs/cli/link/link-enable) | ### Verify the installation diff --git a/4_integrations/claude-code.md 
b/4_integrations/claude-code.md index d56c77d..4c63252 100644 --- a/4_integrations/claude-code.md +++ b/4_integrations/claude-code.md @@ -9,6 +9,10 @@ See: [Anthropic-compatible Messages endpoint](/docs/developer/anthropic-compat/m +```lms_protip +Have a powerful LLM rig? Use [LM Link](/docs/integrations/lmlink) to run Claude Code from your laptop while the model runs on your rig. +``` + ### 1) Start LM Studio's local server Make sure LM Studio is running as a server (default port `1234`). diff --git a/4_integrations/codex.md b/4_integrations/codex.md index 8b2a722..59f1258 100644 --- a/4_integrations/codex.md +++ b/4_integrations/codex.md @@ -9,6 +9,10 @@ See: [OpenAI-compatible Responses endpoint](/docs/developer/openai-compat/respon +```lms_protip +Have a powerful LLM rig? Use [LM Link](/docs/integrations/lmlink) to run Codex from your laptop while the model runs on your rig. +``` + ### 1) Start LM Studio's local server Make sure LM Studio is running as a server (default port `1234`). diff --git a/4_integrations/index.md b/4_integrations/index.md index b4dd900..8502bd8 100644 --- a/4_integrations/index.md +++ b/4_integrations/index.md @@ -7,7 +7,7 @@ index: 1 Use LM Studio as a seamless, drop-in local backend for your favorite tools. -Whether you are using an IDE extension or a custom automation script, simply point your base URL to http://localhost:1234 to power your workflows with LM Studio and maintain complete control over your data privacy. +Whether you are using an IDE extension or a custom automation script, simply point your base URL to `http://localhost:1234` to power your workflows with LM Studio and maintain complete control over your data privacy. We provide guides below for popular tools and are constantly expanding this list to include new integrations. 
diff --git a/4_integrations/lmlink.md b/4_integrations/lmlink.md new file mode 100644 index 0000000..b0dc474 --- /dev/null +++ b/4_integrations/lmlink.md @@ -0,0 +1,49 @@ +--- +title: Using LM Link with Integrations +sidebar_title: Working with LM Link +description: Use a remote device's model with coding tools like Claude Code and Codex via LM Link +index: 1 +--- + +With [LM Link](/docs/lmlink), your coding tools can run models on a remote device (like a dedicated LLM rig on your network) while you work from your laptop. + + + +## Use your integration as normal + +Start LM Studio's server on your local machine and configure your tool to point to it. Model loads are routed to the device where the model is loaded, or to the preferred device if one is set. + +Your local machine handles the API surface at `localhost:1234`, while the model runs on whichever device has it loaded. + +```bash +lms server start --port 1234 +``` + +### Claude Code + +```bash +export ANTHROPIC_BASE_URL=http://localhost:1234 +export ANTHROPIC_AUTH_TOKEN=lmstudio +claude --model qwen3-8b +``` + +See the full [Claude Code](/docs/integrations/claude-code) guide. + +### Codex + +```bash +codex --oss -m qwen3-8b +``` + + +See the full [Codex](/docs/integrations/codex) guide. + +## Set a preferred device + +To use a model on a specific remote device, set that device as the preferred device. + + +See [set a preferred device](/docs/lmlink/basics/preferred-device) for more details. + + +If you're running into trouble, hop onto our [Discord](https://discord.gg/lmstudio). diff --git a/5_lmlink/1_basics/preferred-device.md b/5_lmlink/1_basics/preferred-device.md index 1608de4..2b9e369 100644 --- a/5_lmlink/1_basics/preferred-device.md +++ b/5_lmlink/1_basics/preferred-device.md @@ -7,9 +7,9 @@ index: 2 ## Choosing a preferred device -When the same model is available on multiple devices in the link, LM Link uses the preferred device to load and use the model . 
Each device on the network can configure its own preferred device independently. +When the same model is available on multiple devices in the link, LM Link uses the preferred device to load and use the model. This setting is per-machine: each device on the link independently controls which remote machine it prefers. -This is especially relevant when accessing remote models via the SDK or REST API. +This is especially relevant when accessing remote models via the SDK or [REST API](/docs/developer/core/lmlink). ### Machines with GUI @@ -25,3 +25,5 @@ To set a preferred device from the terminal, use the following command: ```bash lms link set-preferred-device ``` + + diff --git a/5_lmlink/index.md b/5_lmlink/index.md index 47edc93..77458e2 100644 --- a/5_lmlink/index.md +++ b/5_lmlink/index.md @@ -16,4 +16,10 @@ LM Link unlocks the full potential of your hardware by sharing compute across co LM Link use cases span individuals as well as teams. -For individuals, manage a private link to keep your prized gaming GPU busy even when you’re on the go. For teams, LM Link allows you to set up local LLM serving for multiple users with just a few clicks. +You can manage a private link to keep your prized gaming GPU busy even when you're on the go. Moreover, LM Link allows you to set up LLM serving on a server and start using it with just a few clicks. + +## Ways to use LM Link + +- **CLI** — manage LM Link from the terminal with [`lms link`](/docs/cli/link/link-enable) +- **REST API** — use remote models via the REST API with [LM Link](/docs/developer/core/lmlink) +- **Integrations** — use remote models with coding tools like Claude Code and Codex via [LM Link](/docs/integrations/lmlink)