Open
Labels: area:configuration (relates to configuration options), ide:vscode (relates specifically to the VS Code extension), kind:bug (indicates an unexpected problem or unintended behavior), os:linux (happening specifically on Linux)
Description
Before submitting your bug report
- I've tried using the "Ask AI" feature on the Continue docs site to see if the docs have an answer
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Ubuntu 24.04
- Continue version: 1.0.19
- IDE version: VSCode 1.102.3
- Model: All
- config:
name: my-configuration
version: 0.0.1
schema: v1
models:
  - name: Devstral2507-lmstudio
    provider: lmstudio
    model: mistralai/devstral-small-2507
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use
  - name: openrouter-deepseek-chat-v3-0324
    provider: openai
    model: openrouter-deepseek-chat-v3-0324
    apiBase: http://localhost:4000
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use
OR link to assistant in Continue hub:
Description
- Rules added manually, via the "+ Add Rules" button in the Continue Rules UI, or generated by the LLM as `.md` files in the `.continue/rules` folder are completely ignored by Continue when it sends API requests to LLM providers.
- The on/off toggle shown for each rule in the Continue Rules UI appears to have no effect.

To reproduce
- Go to Rules inside the Continue extension chat mode.
- Click the "+ Add Rules" button, edit your rule, and save.
- Ensure the toggle is on for the added rule and that the `.md` files are inside the rules folder:
$ tree -a
....
.
├── appsettings.json
├── .continue
│ ├── mcpServers
│ │ └── AiCodeGenerator.yaml
│ └── rules
│ ├── llm-code-generation-refactoring-standard.md
│ └── new-rule.md
.....
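For reference, a minimal rule file of the kind being ignored might look like the following; the frontmatter field and the rule text here are hypothetical examples based on the rule-file format described in the Continue docs, not the actual contents of the files above:

```markdown
---
name: New rule
---

Always add XML documentation comments to public methods when generating C# code.
```

With the toggle enabled, the body of such a file is expected to be injected into the system message on every request.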
- Start chat in any mode
- Observe in the Continue Console that only the default system message is passed.

Expectation:
The rule content should be added to the system message before the API request is sent to the LLM provider.
Log output
Continue Console output showing that the system prompt passed to the LLM provider does not contain the rules as expected:
system
<important_rules>
You are in agent mode.
Always include the language and file name in the info string when you write code blocks.
If you are editing "src/main.py" for example, your code block should start with ' src/main.py'
</important_rules>
user
Hello
assistant
Hello! How can I assist you today?
user
hello
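For contrast, if the rules were applied as expected, the system message logged in the Continue Console would include the rule content alongside the default prompt, roughly like this (the rule text and its exact placement in the prompt are assumptions for illustration, not confirmed behavior):

```
system
<important_rules>
You are in agent mode.
Always include the language and file name in the info string when you write code blocks.
If you are editing "src/main.py" for example, your code block should start with ' src/main.py'
</important_rules>

<rules>
Always add XML documentation comments to public methods when generating C# code.
</rules>
user
Hello
```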