Fix for Issue #205 #207

Merged

nikopueringer merged 2 commits into nikopueringer:main from ExperimentationT:main on Mar 30, 2026

Conversation

@Raiden129
Contributor

Prevents `_prompt_inference_settings()` from raising `UnboundLocalError` when the resolved backend is not `torch` (for example, MLX on Apple Silicon).

The function always returned `InferenceSettings`, but `generate_comp` and `gpu_post_processing` were only assigned inside the torch-specific branch. On MLX this left both locals undefined and caused inference to crash before startup.

The fix initializes those fields from the `InferenceSettings` defaults before the backend check, then continues overriding them only on the torch path. The prompt helper now returns valid settings instead of crashing.

What does this change?

  • Initializes `generate_comp` and `gpu_post_processing` before backend-specific prompting.

  • Preserves existing torch behavior: torch users are still prompted for composition preview and GPU post-processing settings.

  • Makes non-torch backends such as MLX fall back to the `InferenceSettings` defaults instead of crashing.

Checklist

  • `uv run pytest` passes
  • `uv run ruff check` passes
  • `uv run ruff format --check` passes

Raiden129 and others added 2 commits March 26, 2026 22:02
Prevent `_prompt_inference_settings()` from raising `UnboundLocalError`
when the resolved backend is not `torch` (for example MLX on Apple Silicon).

The function always returned `InferenceSettings`, but `generate_comp` and `gpu_post_processing` were only assigned inside the torch-specific branch. On MLX this left both locals undefined and caused inference to crash before startup.

Initialize those fields from `InferenceSettings` defaults before the
backend check, then continue overriding them only for the torch path.

The prompt helper now returns valid settings instead of crashing.
@raybrownco

Thanks for the PR, @Raiden129 - my team's running into this problem at the moment and I was just about to dig into a fix myself. Looks like I don't have to. Kudos!

@nikopueringer nikopueringer merged commit a733773 into nikopueringer:main Mar 30, 2026
3 checks passed