batch_size potentially misleading #14

@gsarti

Description

Hi all,

With the 0.4.0 release, ferret now supports batching for methods that require multiple approximation steps, such as LIME and Integrated Gradients.

However, the argument's name, batch_size, can erroneously lead users to believe that attributing multiple examples at once is possible, which does not seem to be the case. In Captum, the same argument is called internal_batch_size to disambiguate inter-example and intra-example batching.

Do you think it would be sounder to adopt the internal_batch_size nomenclature in this context as well? This would be especially useful if multi-example batching is supported in the future.
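For clarity, here is a minimal sketch of the two batching notions being conflated. This is illustrative only (not ferret's or Captum's actual implementation; the function names are hypothetical): the current `batch_size` / Captum's `internal_batch_size` chunks the approximation steps of a single example, whereas the reading the name suggests would chunk multiple input examples.

```python
def intra_example_batches(n_steps, internal_batch_size):
    """Split the n_steps approximation steps for ONE example into chunks.

    This is what Captum's `internal_batch_size` (ferret's current
    `batch_size`) controls: how many interpolation/perturbation steps
    are evaluated per forward pass for a single input.
    """
    return [
        list(range(start, min(start + internal_batch_size, n_steps)))
        for start in range(0, n_steps, internal_batch_size)
    ]


def inter_example_batches(examples, batch_size):
    """Group MULTIPLE examples to attribute together: the reading that
    the name `batch_size` misleadingly suggests, and which is not
    currently supported."""
    return [
        examples[i:i + batch_size]
        for i in range(0, len(examples), batch_size)
    ]


# 50 Integrated Gradients steps for one example, 20 steps per forward pass:
step_chunks = intra_example_batches(50, 20)
print([len(c) for c in step_chunks])  # -> [20, 20, 10]
```

Under this reading, renaming the argument to internal_batch_size would leave the plain batch_size name free for a future multi-example attribution API.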

Labels: discussion (Discussion and request to change APIs)