Hi bro,
Recently I came across an article by Avi Chawla where he points out that the PyTorch DataLoader applies the same transformations to the same images again in every epoch, which leads to redundant work. I mean, if an image has already been transformed in a batch during one epoch, the same deterministic transformation shouldn't need to be recomputed again, right? That's why he suggests applying the transformations to all the images once up front, and then passing that pre-transformed dataset to the DataLoader to avoid this issue.
Link: https://blog.dailydoseofds.com/p/a-counterintuitive-behaviour-of-pytorch?utm_campaign=posts-open-in-app&triedRedirect=true
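To make sure I understood the idea, here is a minimal plain-Python sketch of what I think the article means (the class and function names are my own, not from the article, and I'm leaving out torch itself): a dataset that runs its transform inside `__getitem__` repeats the same work every epoch, while pre-transforming once avoids that.

```python
# Counter to observe how many times the transform actually runs.
call_count = 0

def transform(x):
    """Stand-in for a deterministic transform (e.g. resize/normalize)."""
    global call_count
    call_count += 1
    return x * 2

class LazyDataset:
    """Applies the transform on every access, like a typical
    torchvision-style Dataset with `transform` in __getitem__."""
    def __init__(self, data):
        self.data = data
    def __getitem__(self, i):
        return transform(self.data[i])  # re-applied every epoch
    def __len__(self):
        return len(self.data)

class PreTransformedDataset:
    """Applies the transform once, up front, as the article suggests."""
    def __init__(self, data):
        self.data = [transform(x) for x in data]  # applied exactly once
    def __getitem__(self, i):
        return self.data[i]
    def __len__(self):
        return len(self.data)

data = list(range(4))

lazy = LazyDataset(data)
for epoch in range(3):
    _ = [lazy[i] for i in range(len(lazy))]
lazy_calls = call_count  # 3 epochs x 4 items = 12 transform calls

call_count = 0
pre = PreTransformedDataset(data)
for epoch in range(3):
    _ = [pre[i] for i in range(len(pre))]
pre_calls = call_count   # 4 calls total, no matter how many epochs
```

If that's the right reading, the saving only applies to deterministic transforms; random augmentations presumably still have to run per epoch, since they're meant to differ each time.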
Thanks & Regards
Guna Sekhar