[LLM Dump logo]

# LLM Dump

LLM Dump is an AI chat app that gives you access to the latest AI models (except Anthropic's) with generous rate limits, completely free: no ads, no data collected. Try LLM Dump now!

## Rate Limits

Note: Although the rate limits below are tied to your GitHub Copilot subscription tier, LLM Dump does not use the Copilot API, so requests never consume any Copilot premium requests. Rate limits are current as of 16 August 2025.
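Every tier in the tables below specifies the same four kinds of limits for each Copilot plan. As a reading aid, here is a minimal sketch of that schema in TypeScript; the interface, field names, and the example constant are illustrative assumptions for this README, not part of LLM Dump's actual codebase:

```typescript
// Illustrative shape of one tier's limits for a single Copilot plan.
// The names here are assumptions for readability, not LLM Dump's real code.
interface RateLimitTier {
  requestsPerMinute: number;
  requestsPerDay: number;
  maxInputTokens: number;   // "in" tokens per request
  maxOutputTokens: number;  // "out" tokens per request
  concurrentRequests: number;
}

// Example: Low Rate Limit Tier on Copilot Free, taken from the first table below.
const lowTierCopilotFree: RateLimitTier = {
  requestsPerMinute: 15,
  requestsPerDay: 150,
  maxInputTokens: 8000,
  maxOutputTokens: 4000,
  concurrentRequests: 5,
};
```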

### Low Rate Limit Tier (including 4.1 mini, 4o mini, Cohere Command A, and many more)

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 15 | 15 | 15 | 20 |
| Requests per day | 150 | 150 | 300 | 450 |
| Tokens per request | 8000 in, 4000 out | 8000 in, 4000 out | 8000 in, 4000 out | 8000 in, 8000 out |
| Concurrent requests | 5 | 5 | 5 | 8 |

### High Rate Limit Tier (including 4.1, 4o, and many more)

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 10 | 10 | 10 | 15 |
| Requests per day | 50 | 50 | 100 | 150 |
| Tokens per request | 8000 in, 4000 out | 8000 in, 4000 out | 8000 in, 4000 out | 16000 in, 8000 out |
| Concurrent requests | 2 | 2 | 2 | 4 |

### Embedding

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 15 | 15 | 15 | 20 |
| Requests per day | 150 | 150 | 300 | 450 |
| Tokens per request | 64000 | 64000 | 64000 | 64000 |
| Concurrent requests | 5 | 5 | 5 | 8 |

### OpenAI o1-preview

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | Not applicable | 1 | 2 | 2 |
| Requests per day | Not applicable | 8 | 10 | 12 |
| Tokens per request | Not applicable | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 8000 out |
| Concurrent requests | Not applicable | 1 | 1 | 1 |

### OpenAI o1, o3, and gpt-5

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | Not applicable | 1 | 2 | 2 |
| Requests per day | Not applicable | 8 | 10 | 12 |
| Tokens per request | Not applicable | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 8000 out |
| Concurrent requests | Not applicable | 1 | 1 | 1 |

### OpenAI o1-mini, o3-mini, o4-mini, gpt-5-mini, gpt-5-nano, and gpt-5-chat

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | Not applicable | 2 | 3 | 3 |
| Requests per day | Not applicable | 12 | 15 | 20 |
| Tokens per request | Not applicable | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 4000 out |
| Concurrent requests | Not applicable | 1 | 1 | 1 |

### DeepSeek-R1, DeepSeek-R1-0528, and MAI-DS-R1

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 1 | 1 | 2 | 2 |
| Requests per day | 8 | 8 | 10 | 12 |
| Tokens per request | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 4000 out |
| Concurrent requests | 1 | 1 | 1 | 1 |

### xAI Grok-3

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 1 | 1 | 2 | 2 |
| Requests per day | 15 | 15 | 20 | 30 |
| Tokens per request | 4000 in, 4000 out | 4000 in, 4000 out | 4000 in, 8000 out | 4000 in, 16000 out |
| Concurrent requests | 1 | 1 | 1 | 1 |

### xAI Grok-3-Mini

| Rate limit | Copilot Free | Copilot Pro | Copilot Business | Copilot Enterprise |
| --- | --- | --- | --- | --- |
| Requests per minute | 2 | 2 | 3 | 3 |
| Requests per day | 30 | 30 | 40 | 50 |
| Tokens per request | 4000 in, 8000 out | 4000 in, 8000 out | 4000 in, 12000 out | 4000 in, 12000 out |
| Concurrent requests | 1 | 1 | 1 | 1 |
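If you script against the app rather than using it interactively, a small client-side throttle helps you stay under the per-minute and concurrency caps above. The sketch below is a minimal example only, using the Copilot Free values from the Low Rate Limit Tier table; the class name, polling interval, and structure are assumptions, not part of LLM Dump itself:

```typescript
// Minimal client-side throttle: a sliding-window requests-per-minute cap
// plus a concurrent-request cap. Defaults match the Copilot Free column
// of the Low Rate Limit Tier table (15 requests/minute, 5 concurrent).
class Throttle {
  private timestamps: number[] = [];
  private inFlight = 0;

  constructor(
    private readonly requestsPerMinute = 15,
    private readonly maxConcurrent = 5,
  ) {}

  // Wait until both caps allow another request, then run `fn`.
  async run<T>(fn: () => Promise<T>): Promise<T> {
    while (true) {
      const now = Date.now();
      // Keep only timestamps from the last 60 seconds.
      this.timestamps = this.timestamps.filter((t) => now - t < 60_000);
      if (
        this.timestamps.length < this.requestsPerMinute &&
        this.inFlight < this.maxConcurrent
      ) {
        break;
      }
      // Back off briefly before re-checking the window.
      await new Promise((resolve) => setTimeout(resolve, 250));
    }
    this.timestamps.push(Date.now());
    this.inFlight++;
    try {
      return await fn();
    } finally {
      this.inFlight--;
    }
  }
}
```

Usage would look like `throttle.run(() => sendChatRequest(prompt))`, where `sendChatRequest` stands in for whatever call your script makes (a hypothetical name). Note that this only covers the per-minute and concurrency limits; the daily request cap and token limits still apply.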
