Onboarding docs mention local Ollama but don't hint at how to handle or detect snap-installed Ollama models #54

@arcaven

Description

For a typical Ubuntu user, a local Ollama install is as likely to come from snap as from apt or the "curl url | bash" remote shell injection install method.

The docs at https://docs.msty.app/getting-started/onboarding provide a tab-based view for new Msty users who likely already have Ollama installed. The options are: get started using Ollama (presumably set it up again), or use Ollama remotely, or ... if Ollama was installed in user space, then they'll have a ~/.ollama/models directory, which Msty will detect. However, models served by a snap-installed "ollama serve" live in /var/snap/ollama/common/models or something like that.

Recommend updating the docs for the Ubuntu-snap-Ollama crowd, as it's a major part of your demographic. I haven't even used your software yet, so I'm not sure what's best here. The options I can see:

- tell them to use the snap-installed Ollama as a remote via http://127.0.0.1:11434
- create a ~/.ollama/models and copy or ln the snap models into it (and fight with permissions, since snap-installed Ollama files are owned root:root)
- copy the models aside from snap, uninstall the snap Ollama, and install it a different way
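For reference, here's a quick sketch of how a user (or the docs) could check which kind of install they have and whether the running daemon is reachable as a remote. The snap model path is an assumption based on my install ("or something like that" above); verify it locally before relying on it.

```shell
# Assumed snap data dir; may differ across snap revisions/configs.
SNAP_MODELS=/var/snap/ollama/common/models
USER_MODELS="$HOME/.ollama/models"

if [ -d "$SNAP_MODELS" ]; then
    echo "snap-installed Ollama models found at $SNAP_MODELS"
elif [ -d "$USER_MODELS" ]; then
    echo "user-space Ollama models found at $USER_MODELS"
else
    echo "no local Ollama model directory detected"
fi

# Regardless of install method, a running "ollama serve" answers on the
# default port, so Msty could treat it as a remote provider:
curl -s --max-time 2 http://127.0.0.1:11434/api/tags \
    || echo "Ollama API not reachable on 127.0.0.1:11434"
```

If the API answers on 127.0.0.1:11434, the "use Ollama remotely" tab should work as-is for snap users, which may be the lowest-friction fix to document.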

If I had an opinion, I'd write it up, but let me get some experience with Msty before I start having opinions.
