Use Local LLM for a summary #88
Labels: feature (New feature or request), future consideration (I will consider this issue at some later point in time), low priority (Issues with low priority)
I want to explore the possibility of running inference on a local model, made possible by the new ML API that is currently in trial.
Mozilla Blog Post
Docs
Chrome has something similar:
Chrome Extension AI
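As a rough sketch of what using the trial ML API for summarization might look like: the snippet below wraps the call in a small helper that takes the ML API object as a parameter. The method names (`createEngine`, `runEngine`), the `"summarization"` task name, and the `summary_text` output field are assumptions based on my reading of the trial API, not confirmed details; the docs linked above are the source of truth.

```javascript
// Hypothetical sketch of summarizing page text with a trial ML API.
// `mlApi` stands in for something like `browser.trial.ml`; all method
// and field names here are assumptions, not a confirmed API surface.
async function summarizePage(mlApi, text) {
  // Set up an engine for the (assumed) summarization task.
  await mlApi.createEngine({ taskName: "summarization" });
  // Run inference on the page text; the result shape is assumed to be
  // an array of objects with a `summary_text` field.
  const results = await mlApi.runEngine({ args: [text] });
  return results?.[0]?.summary_text ?? "";
}
```

Passing the API object in (rather than hard-coding `browser.trial.ml`) would also make the feature easy to stub out in tests or to swap for Chrome's equivalent later.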