Can this code deal with variable-length sequence inputs? #9

@voidism

Description

Hi,
Thanks for this great work!
I am using Skip-LSTM in my experiments now, and it seems to work well.
However, I am wondering how this code can handle variable-length sequence inputs.
When using RNN/LSTM in PyTorch, we can use torch.nn.utils.rnn.pack_padded_sequence and torch.nn.utils.rnn.pad_packed_sequence so that the model does not treat the padding vectors as real input.
Is there an alternative way to achieve the same thing here? Thanks a lot!
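For reference, when a custom cell is unrolled manually (as Skip-RNN-style implementations typically are), one common workaround is to mask the hidden-state update per timestep instead of using pack_padded_sequence: wherever the input is padding, the previous hidden state is carried forward unchanged. A minimal sketch of this idea, using a plain `torch.nn.RNNCell` as a stand-in for the actual Skip-LSTM cell (the function and variable names here are illustrative, not from this repo):

```python
import torch

def masked_rnn_step(cell, x_t, h_prev, mask_t):
    """One recurrent step that ignores padded positions.

    mask_t: (batch, 1) float tensor, 1.0 for real tokens, 0.0 for padding.
    """
    h_new = cell(x_t, h_prev)
    # Keep the previous hidden state wherever the input is padding.
    return mask_t * h_new + (1.0 - mask_t) * h_prev

# Toy usage: batch of 2 sequences, padded to length 5.
torch.manual_seed(0)
cell = torch.nn.RNNCell(input_size=4, hidden_size=3)
x = torch.randn(2, 5, 4)            # (batch, seq_len, features)
lengths = torch.tensor([5, 3])      # true lengths of each sequence
# Build a (batch, seq_len) mask from the lengths.
mask = (torch.arange(5)[None, :] < lengths[:, None]).float()

h = torch.zeros(2, 3)
for t in range(5):
    h = masked_rnn_step(cell, x[:, t], h, mask[:, t:t + 1])
```

After the loop, the hidden state of the second sequence is exactly what it was at its true final step (t = 2), so padding has no effect on the result. For an LSTM-style cell the same masking would need to be applied to both the hidden and the cell state.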
