Currently, users have to either install third-party LLM applications or use the robot-mcp-client, which aims to offer connections to multiple LLMs: models stored locally (through Ollama), open-source models (through Groq), or any other model.
Currently, the ros-mcp-client implementation is terminal-only, with limited functionality and a user experience that is not the most friendly.
We can build a frontend application to act as the MCP Host for the ros-mcp-server. Some features to include:
- Intuitive Chat UI
- Ability to give voice commands
- Chat history and access to older conversations
- Ability to change models
- Uploading files along with the chat (such as images or PDFs, e.g. a plan document)
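To make the "change models" feature concrete, here is a minimal sketch of how the host could route chat requests to different LLM backends. The registry structure, backend names, and model names are illustrative assumptions, not an existing ros-mcp-client API; the base URLs are the standard Ollama and Groq OpenAI-compatible endpoints.

```python
# Illustrative sketch: pick an LLM backend based on the model the user selects.
# Registry layout and model names are assumptions for this proposal, not an
# existing ros-mcp-client interface.

BACKENDS = {
    # Local models served by Ollama's OpenAI-compatible endpoint
    "ollama": {"base_url": "http://localhost:11434/v1"},
    # Hosted open-source models via Groq's OpenAI-compatible endpoint
    "groq": {"base_url": "https://api.groq.com/openai/v1"},
}

# Map each selectable model to the backend that serves it (example entries).
MODEL_REGISTRY = {
    "llama3.1:8b": "ollama",
    "llama-3.1-70b-versatile": "groq",
}

def resolve_backend(model: str) -> dict:
    """Return the backend config for a model, defaulting to local Ollama."""
    backend = MODEL_REGISTRY.get(model, "ollama")
    return {"backend": backend, **BACKENDS[backend]}
```

With this shape, adding "any other model" is just a new registry entry, and the chat UI's model dropdown can be populated directly from `MODEL_REGISTRY`.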
We may also incorporate other MCPs into the client, for example search and filesystem MCPs, to perform more complicated tasks and to enable logging on the system.
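If the host connects to several MCP servers at once (ros-mcp-server plus search and filesystem MCPs), it needs a way to expose all of their tools to the LLM without name collisions. A simple approach is namespacing tool names by server; the server and tool names below are assumptions for illustration:

```python
# Illustrative sketch: merge tools from multiple MCP servers under
# namespaced names so the host can dispatch a tool call back to the
# right server. Server and tool names are hypothetical examples.

def merge_tools(servers: dict[str, list[str]]) -> dict[str, tuple[str, str]]:
    """Map a namespaced tool name like 'ros.publish_topic' to (server, tool)."""
    merged: dict[str, tuple[str, str]] = {}
    for server, tools in servers.items():
        for tool in tools:
            merged[f"{server}.{tool}"] = (server, tool)
    return merged
```

When the LLM requests `"filesystem.read_file"`, the host looks it up in the merged map and forwards the call to the filesystem MCP server.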
I'd love any feedback on this, and I can get started on it.