Description
Within the Diffusion class, t_intervals are created like this:
```python
if self.args.scheduler_type == 'uniform':
    skip = self.num_timesteps // self.args.timesteps
    t_intervals = torch.arange(-1, self.num_timesteps, skip)
    t_intervals[0] = 0
```
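The construction can be reproduced with plain Python, since `range()` shares the half-open `[start, stop)` semantics of `torch.arange`. A minimal sketch, assuming the defaults `num_timesteps=1000` and `timesteps=10` (so `skip=100`):

```python
# Reproduce the t_intervals construction without torch.
num_timesteps = 1000   # config default
timesteps = 10         # STEPS=10 per the README

skip = num_timesteps // timesteps                 # 100
t_intervals = list(range(-1, num_timesteps, skip))  # starts at -1
t_intervals[0] = 0                                # replace the leading -1 with 0

print(len(t_intervals))  # 11
print(t_intervals)       # [0, 99, 199, ..., 899, 999]
```

Starting the range at -1 adds one extra element before the first full step, which is where the eleventh value comes from.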
To my understanding:
- `self.num_timesteps` comes from the config, with a default of 1000.
- `self.args.timesteps` comes from the CLI args. The README's default is `STEPS=10`, but `fast_ddpm_main.py` defaults to 100:

```python
"--timesteps", type=int, default=100, help="number of steps involved"
```
Is this intended, or did I get the code wrong?
Now the actual Question:
Under the condition `self.num_timesteps=1000` and `self.args.timesteps=10`, this code returns 11 values in `t_intervals`, because `torch.arange(-1, 1000, 100)` yields 11 elements.
Is this intended, or should the code be changed to return exactly 10 `t_intervals`?
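For reference, one way to get exactly `timesteps` values while still covering the full range would be an evenly spaced schedule from 0 to `num_timesteps - 1` (analogous to `torch.linspace(0, num_timesteps - 1, timesteps).long()`). This is just an illustration of a possible alternative, not a claim about what the repo intends:

```python
num_timesteps = 1000
timesteps = 10

# Evenly space exactly `timesteps` integers from 0 to num_timesteps - 1.
t_intervals = [round(i * (num_timesteps - 1) / (timesteps - 1))
               for i in range(timesteps)]

print(len(t_intervals))  # 10
print(t_intervals)       # [0, 111, 222, ..., 888, 999]
```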
Thanks for the code!