Conversation
Pull request overview
This PR introduces minor but important robustness and device-handling improvements to the model wrapper code. The changes make the latent-space realignment logic more defensive against missing attributes and ensure attention masks are consistently placed on the appropriate device (HF_device when available, otherwise the main device).
- Added defensive attribute access for `latent_space_realign` using `getattr` with a default fallback
- Improved device handling for attention masks by introducing a device selection pattern that prioritizes `HF_device` when available
- Enhanced code robustness with minimal changes (5 additions, 3 deletions)
@jiaruzouu Thank you for sharing the excellent code—it was very helpful in understanding the LatentMAS algorithm.
Hi @missflash, thank you so much for your efforts in improving our code! We are currently planning to update our codebase and will review your code with detailed feedback shortly. Thanks again for your efforts!
🔧 Changes

- The edits are confined to `models.py`, with 5 lines added and 3 removed.

🎯 Why this may matter

- The `getattr` fallback keeps the wrapper from raising an `AttributeError` when `latent_space_realign` isn't defined.