
Fix bfloat16 not recognized as floating point dtype #1408

Merged
dfalbel merged 2 commits into main from fix-bfloat16-is-floating-point on Feb 2, 2026

Conversation


@dfalbel dfalbel commented Feb 2, 2026

Summary

  • Add "BFloat16" to the is_floating_point check in the torch_dtype class
  • This fixes nn_module$to() failing when called with torch_bfloat16() (see the reproduction sketch below)
  • Add test coverage for is_floating_point across all dtype variants
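
A minimal reproduction of the problem, using only the public torch R API (a sketch; the exact error seen before the fix may differ):

```r
library(torch)

# bfloat16 is a floating point dtype and should report as such;
# before this change is_floating_point returned FALSE for it.
torch_bfloat16()$is_floating_point

# nn_module$to() consults is_floating_point internally, so the missing
# entry made this call fail when given torch_bfloat16().
m <- nn_linear(4, 2)
m$to(dtype = torch_bfloat16())
m$weight$dtype  # bfloat16 after the fix
```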

🤖 Generated with Claude Code

dfalbel and others added 2 commits February 2, 2026 16:04
Add "BFloat16" to the is_floating_point check in torch_dtype class.
This fixes nn_module$to() failing when called with torch_bfloat16().

Also adds test coverage for is_floating_point across all dtype variants.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
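
A sketch of the kind of coverage the second commit describes, assuming the package's usual testthat setup (the actual test file and dtype list may differ):

```r
library(torch)
library(testthat)

test_that("is_floating_point is correct for every dtype", {
  # Floating point dtypes, including the newly covered bfloat16.
  floating <- list(torch_float16(), torch_bfloat16(),
                   torch_float32(), torch_float64())
  # Integral and boolean dtypes must not report as floating point.
  non_floating <- list(torch_bool(), torch_uint8(), torch_int8(),
                       torch_int16(), torch_int32(), torch_int64())

  for (dt in floating) expect_true(dt$is_floating_point)
  for (dt in non_floating) expect_false(dt$is_floating_point)
})
```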
@dfalbel dfalbel merged commit 9f7ca96 into main Feb 2, 2026
8 of 12 checks passed
@dfalbel dfalbel deleted the fix-bfloat16-is-floating-point branch February 2, 2026 22:54