
fix: tensor shape err debug log#865

Open
varshith15 wants to merge 1 commit into main from vb/debug/tensor_shape_err

Conversation

@varshith15
Member

  • adds a tensor shape error debug log and prints the params, to help reproduce the issue
  • not a fix just yet; we still need to understand what's causing it

@varshith15 varshith15 requested a review from victorges December 11, 2025 05:40
out_tensor = out_tensor.permute(0, 2, 3, 1)

if out_tensor.dim() != 4 or out_tensor.shape[0] != 1 or out_tensor.shape[-1] != 3:
    logging.error(f"[StreamDiffusion] Invalid output tensor shape for encoder: {out_tensor.shape}. Expected (1, H, W, 3). Params: {self.params}")
Contributor
Btw you can also check the H, W dimensions against self.params.get_output_resolution(). Although if we do that, we should probably cache the output resolution along with the params so we're not re-computing the output dims every frame 🤔
WDYT?
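The caching idea above could be sketched roughly as follows. Note this is a minimal illustration, not the actual StreamDiffusion code: `Params`, `get_output_resolution`, and `ParamState` are hypothetical stand-ins for whatever the real params object exposes.

```python
# Sketch: cache the output resolution alongside the params, so the
# expected (H, W) is computed once per params update rather than once
# per frame. All names here are illustrative.
class Params:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def get_output_resolution(self):
        # placeholder for whatever resize/crop math the real code does
        return (self.height, self.width)


class ParamState:
    def __init__(self, params):
        self._set(params)

    def _set(self, params):
        self.params = params
        # cached once here, reused on every per-frame shape check
        self.output_resolution = params.get_output_resolution()

    def update_params(self, params):
        # re-derive the cache only when the params actually change
        self._set(params)
```

The per-frame check then compares `out_tensor.shape[1:3]` against the cached `output_resolution` instead of calling `get_output_resolution()` each time.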

Contributor
Also, when the output is not in the right format, we should probably return None and skip the output generation on put_video_frame. Otherwise we're still gonna break the encoder in the end.
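The suggested guard might look something like this. It is a sketch only; the function names and the way the expected resolution is threaded in are assumptions, not the project's actual API.

```python
import logging

def check_output_shape(shape, expected_hw=None):
    """Return True iff `shape` looks like (1, H, W, 3), optionally
    also matching an expected (H, W) pair."""
    if len(shape) != 4 or shape[0] != 1 or shape[-1] != 3:
        return False
    if expected_hw is not None and tuple(shape[1:3]) != tuple(expected_hw):
        return False
    return True

def postprocess(out_tensor_shape, expected_hw=None):
    """Hypothetical post-processing step: return None on a malformed
    shape so the caller can skip output generation for this frame
    instead of handing a bad tensor to the encoder."""
    if not check_output_shape(out_tensor_shape, expected_hw):
        logging.error("Invalid output tensor shape: %s", out_tensor_shape)
        return None
    return out_tensor_shape
```

The caller (e.g. the code around put_video_frame) would then treat a None result as "drop this frame" rather than encoding it.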

Member Author
I didn't return None or skip because it might cause an issue with the timestamps on the frames, since put and get are async.

I just wanted to understand which params cause it, so we can replicate it offline and fix it in StreamDiffusion instead?

Contributor

Skipping shouldn't cause any timestamp issues! It would only be a problem if we stopped outputting monotonic timestamps.
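The point above can be illustrated with a small sketch: as long as dropped frames are simply absent and surviving frames keep their original pts values, the output timestamps remain monotonic. Names here are illustrative, not the project's real frame API.

```python
def emit_frames(frames):
    """frames: iterable of (pts, tensor_or_None) pairs.
    Yields only valid frames; invalid ones are skipped, never
    re-stamped or reordered."""
    for pts, tensor in frames:
        if tensor is None:
            continue  # drop the frame, keep everyone else's pts intact
        yield pts, tensor

def is_monotonic(pts_list):
    """True iff timestamps are strictly increasing."""
    return all(a < b for a, b in zip(pts_list, pts_list[1:]))
```

Since skipping only removes elements from an already increasing sequence, the result stays increasing.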

Contributor

@victorges victorges left a comment

LGTM anyway, not a regression
