Hi all,
With the 0.4.0 release, ferret now supports batching for methods that require multiple approximation steps, such as LIME and Integrated Gradients.
However, the name of the argument, batch_size, may erroneously lead users to believe that attributing multiple examples at once is possible, which does not seem to be the case. In Captum, the same argument is called internal_batch_size to disambiguate inter-example from intra-example batching.
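To make the distinction concrete, here is a minimal sketch of intra-example batching: the approximation steps for a single input are evaluated in chunks, rather than multiple input examples being batched together. This is illustrative only, not ferret's actual implementation, and the names are made up:

```python
def chunked_steps(n_steps, internal_batch_size):
    """Yield the approximation steps for ONE example, split into chunks.

    With intra-example batching, these chunks are what gets batched
    through the model -- not multiple input examples.
    """
    for start in range(0, n_steps, internal_batch_size):
        yield list(range(start, min(start + internal_batch_size, n_steps)))

# e.g. 10 approximation steps evaluated in batches of at most 4:
batches = list(chunked_steps(10, 4))
# → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

A batch_size argument named like this is easy to misread as the number of examples attributed per call, which is exactly the ambiguity internal_batch_size avoids.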
Do you think it would be sounder to adopt the internal_batch_size nomenclature in this context? This would be especially useful if multi-example batching is supported in the future.