
Does the DARPA SubT dataset support mini-batch training in DeLORA? #25

@JaySlamer

Hi, thanks for your great work.
I trained with the default hyperparameters and could not get any sensible result.
I noticed the default batch_size is 1 in the config file, which is likely to cause unreliable and unstable training. The comment says:

batch_size > 1 currently only supported if single image dims are used (vertical and horizontal cells)
and
In general: larger batches currently implemented rather primitively

I don't understand what these comments mean, or whether the DARPA SubT dataset can be trained with a larger batch size.
After changing the batch size to 32, I got an error at 99.8% of the first training epoch.
Below is the error message:
Traceback (most recent call last):
  File "/home/slam/Documents/delora/bin/run_training.py", line 94, in <module>
    trainer.train()
  File "/home/slam/Documents/delora/src/deploy/trainer.py", line 122, in train
    epoch_losses = self.train_epoch(epoch=epoch, dataloader=dataloader)
  File "/home/slam/Documents/delora/src/deploy/trainer.py", line 71, in train_epoch
    self.step(
  File "/home/slam/Documents/delora/src/deploy/deployer.py", line 292, in step
    preprocessed_dict = preprocessed_dicts[batch_index]
IndexError: list index out of range
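The crash at 99.8% of the epoch suggests the final, smaller-than-batch_size batch is the trigger. Below is a minimal sketch of that suspected failure mode, not the actual DeLORA code: the names, dataset size, and the buggy indexing pattern are all assumptions for illustration.

```python
# Hypothetical sketch: a step() that indexes by the configured batch_size
# instead of the actual batch length fails on the last (partial) batch.
batch_size = 32
dataset_size = 1000  # example value, not the real SubT dataset size

# Simulate the per-batch lists a dataloader would yield.
batches = [
    [{"scan": i} for i in range(start, min(start + batch_size, dataset_size))]
    for start in range(0, dataset_size, batch_size)
]

def step(preprocessed_dicts):
    # Buggy pattern: loop over the configured batch_size, which overruns
    # the final batch when dataset_size % batch_size != 0.
    for batch_index in range(batch_size):
        _ = preprocessed_dicts[batch_index]  # IndexError on the last batch

last_batch = batches[-1]
print(len(last_batch))  # 1000 % 32 = 8 elements, not 32

try:
    step(last_batch)
except IndexError as e:
    print(f"IndexError: {e}")  # matches the reported crash
```

If this is indeed the cause, one workaround (without touching the step code) would be constructing the PyTorch DataLoader with `drop_last=True`, so incomplete final batches are discarded; the robust fix is iterating over `range(len(preprocessed_dicts))` instead of `range(batch_size)`.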
