pytorch_forecasting/tests/_config.py (4 changes: 4 additions & 0 deletions)

@@ -5,6 +5,10 @@
 EXCLUDE_ESTIMATORS = [
     "DummySkipped",
     "ClassName",  # exclude classes from extension templates
+    "NBeatsKAN_pkg",

Review thread on the "NBeatsKAN_pkg" line:

Member:
You could also add the "tests:skip_by_name" tag in the corresponding _pkg classes. If you add test_integration there, it should skip the integration tests for that model class. This would not skip the other tests for that model, unlike the current approach, where the framework skips the whole _pkg class.

Collaborator:
Yes, although AFAIK we would have to add the machinery for that tag to pytorch-forecasting first. I do not think it works out of the box right now, since I have not added that feature to scikit-base yet. There is code in sktime that could be copied for this.

Member:
Yes, it exists! We already brought it over (and modified it for _pkg usage) from sktime, see here.
We are already using it to skip specific params for some buggy fixtures (like LogNormalDistributionLoss for DecoderMLP, see here). So I think we could use it here as well...
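
For context, a hedged sketch of what the tag-based alternative discussed in this thread could look like. The tag name and the test name come from the comments above; whether the ported machinery reads the tag this way in pytorch-forecasting, and the base class being subclassed, are assumptions rather than something this PR adds:

# Illustrative only, not part of this PR: assuming the "tests:skip_by_name"
# machinery ported from sktime is wired into pytorch-forecasting's test
# framework, a package class could skip just the integration test instead of
# being excluded wholesale via EXCLUDE_ESTIMATORS.
class NBeatsKAN_pkg:  # in the real code this subclasses the _pkg base class
    """Package container for the NBeatsKAN model (sketch)."""

    _tags = {
        # ... existing package tags kept unchanged ...
        # skip only the named test for this package class:
        "tests:skip_by_name": ["test_integration"],
    }

With such a tag in place, only test_integration would be skipped for NBeatsKAN_pkg while the remaining estimator tests keep running, which is the behaviour the first comment asks for.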

"NBeats_pkg",
"TimeXer_pkg",
"xLSTMTime_pkg",
]

# dictionary of lists of str, names of tests to exclude from testing
pytorch_forecasting/tests/test_all_estimators.py (31 changes: 13 additions & 18 deletions)

@@ -334,27 +334,22 @@ def _integration(
         output = raw_predictions.output.prediction
         n_dims = len(output.shape)

-        assert n_dims in [2, 3], (
-            f"Prediction output must be 2D or 3D, but got {n_dims}D tensor "
+        assert n_dims == 3, (
+            f"Prediction output must be 3D, but got {n_dims}D tensor "
             f"with shape {output.shape}"
         )

-        if n_dims == 2:
-            batch_size, prediction_length = output.shape
-            assert batch_size > 0, f"Batch size must be positive, got {batch_size}"
-            assert (
-                prediction_length > 0
-            ), f"Prediction length must be positive, got {prediction_length}"
-
-        elif n_dims == 3:
-            batch_size, prediction_length, n_features = output.shape
-            assert batch_size > 0, f"Batch size must be positive, got {batch_size}"
-            assert (
-                prediction_length > 0
-            ), f"Prediction length must be positive, got {prediction_length}"
-            assert (
-                n_features > 0
-            ), f"Number of features must be positive, got {n_features}"
+        batch_size, prediction_length, n_features = output.shape
+        assert batch_size > 0, f"Batch size must be positive, got {batch_size}"
+        assert (
+            prediction_length > 0
+        ), f"Prediction length must be positive, got {prediction_length}"
+        assert (
+            # todo: compare n_features with expected 3rd dimension of the corresponding
+            # loss function on which model is trained and
+            # predictions generated in this test.
+            n_features > 0  # this should be n_features == net.loss.expected_dim
+        ), f"Number of features must be positive, got {n_features}"
     finally:
         shutil.rmtree(tmp_path, ignore_errors=True)
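
The todo above asks for a stricter check than n_features > 0. A minimal sketch of what that comparison could look like; the helper _expected_n_features is an assumption rather than an existing pytorch-forecasting API, and only the simplest loss types are covered:

# Hypothetical helper, not part of this PR: best-effort guess of the expected
# third dimension of raw predictions from the loss the network was trained with.
from pytorch_forecasting.metrics import QuantileLoss


def _expected_n_features(loss) -> int:
    """Return the expected size of the last prediction dimension."""
    if isinstance(loss, QuantileLoss):
        # quantile losses emit one output channel per quantile
        return len(loss.quantiles)
    # point losses (MAE, RMSE, SMAPE, ...) predict a single value per step;
    # distribution losses and MultiLoss would need their own mapping here.
    return 1


# Once wired into _integration, the weak n_features > 0 check could become:
# assert n_features == _expected_n_features(net.loss)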
