
defer heavy imports #91

Merged
zh-plus merged 2 commits into zh-plus:master from iautolab:chore/defer-heavy-imoprts
Mar 18, 2026

Conversation

@MaleicAcid
Collaborator

Implements #89.

Changes

  1. openlrc/config.py (new): extract TranscriptionConfig and TranslationConfig from openlrc.py. Only depends on stdlib + ModelConfig.
  2. openlrc/__init__.py: re-export from config.py instead of openlrc.py. Public API unchanged.
  3. openlrc/openlrc.py: remove dataclass definitions; defer Preprocessor, Transcriber, LLMTranslator, TranslateInfo to method-level imports; move Segment into TYPE_CHECKING block.
  4. openlrc/preprocess.py: defer torch and df.enhance into noise_suppression().
  5. tests/test_preprocess.py: install a fake module via sys.modules and patch it with patch.object() instead of using @patch("df.enhance.xxx") string paths, fixing Python 3.10 compatibility (3.10's mock.patch has a caching bug when a module name and a function name collide).
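The deferral in items 3 and 4 follows the standard lazy-import pattern: heavy modules move from module scope into the function that needs them, and names used only in annotations move under a `TYPE_CHECKING` guard. A minimal sketch of the pattern, using `json` as a stand-in for a heavy dependency such as `torch` (the function and imported names here are illustrative, not the actual openlrc code):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Seen by type checkers only; never imported at runtime,
    # so it adds no import-time cost (stands in for e.g. Segment).
    from json import JSONEncoder


def noise_suppression(payload: dict) -> str:
    # Deferred import: the dependency loads on the first call,
    # not when the package itself is imported.
    import json
    return json.dumps(payload)


print(noise_suppression({"ok": True}))  # -> {"ok": true}
```

With this shape, `import openlrc` stays cheap and the cost of the heavy dependency is paid only by code paths that actually use it.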

@MaleicAcid MaleicAcid requested a review from zh-plus March 16, 2026 03:34
@zh-plus
Owner

zh-plus commented Mar 18, 2026

Thanks for working on this. Moving the config dataclasses out of openlrc.py and deferring several heavy imports is a meaningful improvement, and it gets us closer to the goal in #89.

I don’t think it fully resolves #89 yet, because there are still some eager imports in the package path, but this is a solid incremental step and the direction looks right.

I’m going to merge this PR now, and I’ll follow up with another PR to continue the import-cleanup work and fully address the remaining parts of #89.

@zh-plus zh-plus merged commit ff77ad9 into zh-plus:master Mar 18, 2026
15 of 16 checks passed