
Conversation


@davfsa davfsa commented Jun 22, 2025

Hey!

This pull request adds support for nirn to use a sliding-window technique for ratelimits, which allows for higher throughput and, optionally (enabled through ALLOW_CONCURRENT_REQUESTS=true), the ability to perform concurrent requests to the same bucket!

I also took the liberty of updating all the dependencies to their latest versions (as of writing) and replacing the usage of some deprecated APIs.
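
For anyone unfamiliar with the approach, here is a minimal sketch of the general sliding-window idea. It is purely illustrative and not the code in this PR (it is Python for readability, while nirn itself is Go), and the names SlidingWindow, limit and period are made up for the example: a request is admitted whenever fewer than limit requests started within the last period seconds, and taking a slot does not serialize the requests themselves, which is what makes concurrent requests to the same bucket possible.

import asyncio
import collections
import time


class SlidingWindow:
    """Illustrative only: allow at most `limit` acquisitions in any `period`-second window."""

    def __init__(self, limit: int, period: float) -> None:
        self._limit = limit
        self._period = period
        self._timestamps: collections.deque = collections.deque()
        self._lock = asyncio.Lock()

    async def acquire(self) -> None:
        while True:
            async with self._lock:
                now = time.monotonic()
                # Drop entries that have slid out of the window.
                while self._timestamps and now - self._timestamps[0] >= self._period:
                    self._timestamps.popleft()
                if len(self._timestamps) < self._limit:
                    self._timestamps.append(now)
                    return
                # Wait until the oldest entry leaves the window, then retry.
                wait = self._period - (now - self._timestamps[0])
            await asyncio.sleep(wait)


async def main() -> None:
    window = SlidingWindow(limit=5, period=1.0)

    async def worker(i: int) -> None:
        await window.acquire()
        # The lock is only held while taking a slot, so once admitted the
        # actual requests can run concurrently (the ALLOW_CONCURRENT_REQUESTS idea).
        print(f"request {i} admitted at {time.monotonic():.2f}")

    await asyncio.gather(*(worker(i) for i in range(12)))


asyncio.run(main())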

@davfsa davfsa marked this pull request as draft June 22, 2025 22:22

davfsa commented Jun 22, 2025

Some examples:

2025-06-22.23-49-58.mp4
Script
from __future__ import annotations

import asyncio
import os

import dotenv

import hikari

dotenv.load_dotenv()

CHANNEL_ID = 0  # placeholder: set this to your test channel's ID

bot = hikari.GatewayBot(
    logs={
        "version": 1,
        "incremental": True,
        "loggers": {"": {"level": "INFO"}, "hikari.ratelimits": {"level": "DEBUG"}},
    },
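    # Route REST calls through the local proxy (nirn) instead of going to
    # Discord directly, and disable hikari's automatic retries.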
    rest_url="http://localhost:8080/api/v10",
    max_retries=0,
    cache_settings=hikari.impl.CacheSettings(
        components=hikari.impl.CacheComponents.NONE
    ),
    token=os.environ["TOKEN"],
)


async def spam() -> None:
    while True:
        await bot.rest.create_message(CHANNEL_ID, "Hello!")


@bot.listen()
async def started(_: hikari.StartedEvent) -> None:
    await asyncio.gather(*(spam() for _ in range(5)))


bot.run()

2025-06-23.01-13-09.mp4
Script
from __future__ import annotations

import asyncio
import os

import dotenv

import hikari

dotenv.load_dotenv()

bot = hikari.GatewayBot(
    logs={
        "version": 1,
        "incremental": True,
        "loggers": {"": {"level": "INFO"}, "hikari.ratelimits": {"level": "DEBUG"}},
    },
    rest_url="http://localhost:8080/api/v10",
    max_retries=0,
    cache_settings=hikari.impl.CacheSettings(
        components=hikari.impl.CacheComponents.NONE
    ),
    token=os.environ["HIKARI_TOKEN"],
)

CHANNEL_ID = 0  # placeholder: set this to your test channel's ID
BOT_ID = 0  # placeholder: set this to your bot's user ID
message_id = 0


async def spam() -> None:
    while True:
        await bot.rest.add_reaction(CHANNEL_ID, message_id, "\u2764\ufe0f")
        await bot.rest.delete_my_reaction(CHANNEL_ID, message_id, "\u2764\ufe0f")


@bot.listen()
async def test(event: hikari.ReactionAddEvent) -> None:
    if event.message_id != message_id or event.user_id == BOT_ID:
        return

    if event.emoji_id:
        await bot.rest.delete_reaction(
            event.channel_id,
            event.message_id,
            event.user_id,
            emoji=event.emoji_name,
            emoji_id=event.emoji_id,
        )
    else:
        await bot.rest.delete_reaction(
            event.channel_id, event.message_id, event.user_id, emoji=event.emoji_name
        )


@bot.listen()
async def started(_: hikari.StartedEvent) -> None:
    message = await bot.rest.create_message(
        CHANNEL_ID, "Spam reactions on me, I will delete them :D"
    )
    await message.add_reaction("\u2764")

    global message_id
    message_id = message.id

    asyncio.create_task(spam())


bot.run()

Note: hikari was patched to comment out these two lines https://github.com/hikari-py/hikari/blob/f07f4187b9e6d284e334cb913625b2988217a13f/hikari/impl/rest.py#L802-L803 so that nirn can fully take over ratelimit handling.

@davfsa davfsa marked this pull request as ready for review June 22, 2025 23:33
…r starting in the middle of a window

This can help in cases where nirn is restarted and loses all of its ratelimiting information.
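
(For context, and not necessarily what this commit does: one common conservative way to start in the middle of a window after losing state is to assume the bucket is nearly exhausted and then resync from the first response's X-RateLimit-* headers. The sketch below is illustrative Python; ConservativeBucket is a made-up name.)

import time


class ConservativeBucket:
    """Illustrative only: start pessimistically, then trust the API's headers."""

    def __init__(self) -> None:
        # Until a real response has been seen, assume only one request is safe
        # and that the window resets soon.
        self.remaining = 1
        self.reset_at = time.monotonic() + 1.0

    def sync_from_headers(self, headers: dict) -> None:
        # Discord reports the live bucket state on every response.
        self.remaining = int(headers["X-RateLimit-Remaining"])
        self.reset_at = time.monotonic() + float(headers["X-RateLimit-Reset-After"])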
@davfsa davfsa force-pushed the sliding-window-ratelimiter branch from 6ad1c9a to 0699f16 on July 9, 2025 08:14
@davfsa davfsa changed the title Implement sliding window ratelimiting New ratelimiting implementation Jan 11, 2026
@davfsa davfsa force-pushed the sliding-window-ratelimiter branch from dfc6340 to b31cd61 on January 11, 2026 18:30
@davfsa davfsa marked this pull request as ready for review January 30, 2026 08:22