2 changes: 1 addition & 1 deletion docs/modules/indicator.md
@@ -710,7 +710,7 @@ finally:
(synchronous) and D-Bus calls (asynchronous) to run cooperatively in a
single thread
- The `_schedule()` pattern in `IndicatorApp` bridges sync GTK callbacks to
-  async coroutines by wrapping them in `asyncio.ensure_future()` and storing
+  async coroutines by wrapping them in `asyncio.create_task()` and storing
task references to prevent garbage collection
- Auto-reconnection: when the daemon exits or crashes, the client detects the
`NameOwnerChanged` signal and retries the connection at a configurable
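The task-reference pattern described in the hunk above can be sketched as follows. This is a minimal illustration, not the project's actual `IndicatorApp` code; the `Scheduler` name is invented for the example. `asyncio.create_task()` requires a running event loop, so (unlike `ensure_future()`) a call from outside the loop fails fast, and the strong reference kept in the set prevents the task from being garbage-collected mid-flight, since the loop itself holds only weak references to tasks.

```python
import asyncio
from collections.abc import Coroutine


class Scheduler:
    """Bridge sync callbacks to async coroutines, keeping task refs alive."""

    def __init__(self) -> None:
        self._tasks: set[asyncio.Task] = set()

    def schedule(self, coro: Coroutine) -> None:
        # create_task() raises if no loop is running, unlike ensure_future(),
        # so misuse from outside the event loop is caught immediately.
        task = asyncio.create_task(coro)
        # The event loop keeps only a weak reference to tasks; storing a
        # strong reference here prevents garbage collection before completion.
        self._tasks.add(task)
        task.add_done_callback(self._tasks.discard)


async def main() -> None:
    sched = Scheduler()
    results: list[int] = []

    async def work(n: int) -> None:
        results.append(n)

    sched.schedule(work(1))
    sched.schedule(work(2))
    await asyncio.sleep(0)  # yield once so the scheduled tasks run
    print(sorted(results))  # → [1, 2]


asyncio.run(main())
```

The `add_done_callback(discard)` line doubles as cleanup, so the set never grows without bound.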
4 changes: 2 additions & 2 deletions docs/modules/notifier.md
@@ -135,7 +135,7 @@ Uses a two-level cache:
1. **In-memory** (`_avatar_cache`): instant lookup for avatars already downloaded
this session
2. **On-disk** (`_AVATAR_CACHE_DIR`): persists across restarts via deterministic
-   filenames (MD5 hash of the URL)
+   filenames (SHA-256 hash of the URL)

The caller provides a shared `aiohttp.ClientSession` so that avatar downloads
within a notification batch reuse the same HTTP connection.
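The two-level cache described above can be sketched like this. The names (`avatar_path`, `_mem_cache`, the `download` callable) are illustrative, not the module's actual API, and the real code downloads via `aiohttp` rather than a sync callable; the point is the lookup order and the deterministic filename.

```python
import hashlib
from pathlib import Path
from typing import Callable

_mem_cache: dict[str, Path] = {}  # level 1: per-session in-memory lookup


def avatar_path(url: str, cache_dir: Path, download: Callable[[str], bytes]) -> Path:
    """Return a cached file path for ``url``, downloading at most once."""
    if url in _mem_cache:  # level 1 hit: already resolved this session
        return _mem_cache[url]

    cache_dir.mkdir(parents=True, exist_ok=True)
    # Deterministic filename: the same URL always maps to the same file,
    # so a fresh process finds avatars cached by earlier runs (level 2).
    name = hashlib.sha256(url.encode()).hexdigest() + ".png"
    dest = cache_dir / name

    if not dest.exists():  # level 2 miss: actually fetch the bytes
        dest.write_bytes(download(url))

    _mem_cache[url] = dest
    return dest
```

A side benefit of the SHA-256 switch, visible in the code hunks below: the `# noqa: S324` suppressions for MD5 can be dropped.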
@@ -273,7 +273,7 @@ await notify_new_prs(diff.new_prs, grouping="repo", repo_overrides=overrides)
when the portal is unavailable. The portal approach is necessary because
`xdg-open` fails silently inside the systemd sandbox when the browser is a
Snap package (Snap's `snap-confine` rejects the restricted permissions)
- - Author avatars are downloaded from GitHub, cached on disk (MD5-hashed
+ - Author avatars are downloaded from GitHub, cached on disk (SHA-256-hashed
filenames), and passed to `notify-send` via `--icon={path}`. A shared
`aiohttp.ClientSession` is used for all avatar downloads within a
notification batch to avoid creating a new session per avatar
2 changes: 1 addition & 1 deletion forgewatch/indicator/app.py
@@ -226,7 +226,7 @@ def _schedule(self, coro: Coroutine[object, object, None]) -> None:
The task reference is stored in ``_tasks`` to prevent garbage
collection before the coroutine completes.
"""
- task = asyncio.ensure_future(coro)
+ task = asyncio.create_task(coro)
self._tasks.add(task)
task.add_done_callback(self._task_done)

2 changes: 1 addition & 1 deletion forgewatch/indicator/client.py
@@ -258,7 +258,7 @@ def _fire() -> None:
# a failed connect() -> _set_disconnected() -> _schedule_reconnect()
# chain can schedule a fresh timer instead of bailing out.
self._reconnect_handle = None
- asyncio.ensure_future(self.connect())  # noqa: RUF006
+ asyncio.create_task(self.connect())  # noqa: RUF006

self._reconnect_handle = loop.call_later(self._reconnect_interval, _fire)
logger.debug("Reconnect scheduled in %ds", self._reconnect_interval)
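The reconnect timer in the hunk above depends on clearing `_reconnect_handle` before re-entering `connect()`, exactly as the inline comment explains. A minimal self-contained sketch of that pattern (class name, failure simulation, and intervals are invented for the example; only the handle-clearing choreography mirrors the source):

```python
import asyncio


class ReconnectingClient:
    """Sketch of the call_later reconnect pattern; names are illustrative."""

    def __init__(self, interval: float = 0.01) -> None:
        self._interval = interval
        self._handle: asyncio.TimerHandle | None = None
        self.attempts = 0
        self.connected = False

    async def connect(self) -> None:
        self.attempts += 1
        if self.attempts < 3:  # simulate two failed connection attempts
            self._schedule_reconnect()
        else:
            self.connected = True

    def _schedule_reconnect(self) -> None:
        if self._handle is not None:  # a timer is already pending; bail out
            return

        def _fire() -> None:
            # Clear the handle *before* reconnecting, so a failed
            # connect() -> _schedule_reconnect() chain can install a
            # fresh timer instead of seeing a stale handle and bailing.
            self._handle = None
            asyncio.create_task(self.connect())  # noqa: RUF006

        loop = asyncio.get_running_loop()
        self._handle = loop.call_later(self._interval, _fire)


async def main() -> None:
    client = ReconnectingClient()
    await client.connect()
    while not client.connected:  # poll until the third attempt succeeds
        await asyncio.sleep(0.01)
    print(client.attempts)  # → 3


asyncio.run(main())
```

Note that `create_task()` inside `_fire` is safe here: `call_later` callbacks always run on the live event loop.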
2 changes: 1 addition & 1 deletion forgewatch/notifier.py
@@ -78,7 +78,7 @@ async def _download_avatar(avatar_url: str, session: aiohttp.ClientSession) -> s
_AVATAR_CACHE_DIR.mkdir(parents=True, exist_ok=True)

# Deterministic filename from URL
- url_hash = hashlib.md5(avatar_url.encode()).hexdigest()  # noqa: S324
+ url_hash = hashlib.sha256(avatar_url.encode()).hexdigest()
dest = _AVATAR_CACHE_DIR / f"{url_hash}.png"

# If already on disk (from a previous run), reuse
2 changes: 1 addition & 1 deletion tests/test_notifier.py
@@ -780,7 +780,7 @@ async def test_disk_cache_hit(self, tmp_path: Path) -> None:
from forgewatch import notifier

avatar_url = "https://avatars.githubusercontent.com/u/disk-cache-1"
- url_hash = hashlib.md5(avatar_url.encode()).hexdigest()  # noqa: S324
+ url_hash = hashlib.sha256(avatar_url.encode()).hexdigest()
cached_file = tmp_path / f"{url_hash}.png"
cached_file.write_bytes(b"\x89PNG old data")
