Depth tokenizer #21

@Shar-01

Hi everyone, thanks for the nice work. I am considering using your pretrained depth tokenizer to extract precomputed feature tokens for further training. I have some questions.

  1. I cloned the ml-4m repo and installed the diffusers library. However, I get the error `AttributeError: module diffusers.models has no attribute unet_2d_blocks`. Could you please specify the prerequisites for using your repo and which diffusers version you used?

  2. Also, how many tokens does your pretrained checkpoint produce?

  3. Is your uploaded pretrained depth tokenizer an encoder-decoder model, or an encoder-only model that would directly give me the required tokens?

  4. What normalization did you use for the depth data?

Thanks a lot!
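For context on question 1, here is the compatibility shim I experimented with. It assumes the `unet_2d_blocks` module was simply relocated under `diffusers.models.unets` in newer diffusers releases (that relocation is my assumption, not something confirmed by the maintainers) — a sketch, not a recommended fix:

```python
import importlib
import sys


def alias_unet_2d_blocks():
    """Best-effort shim: if diffusers.models no longer exposes
    unet_2d_blocks (as in the AttributeError above), try to alias the
    module from its assumed new location, diffusers.models.unets.
    Returns True if the attribute is usable afterwards, False otherwise."""
    try:
        models = importlib.import_module("diffusers.models")
    except ImportError:
        return False  # diffusers is not installed in this environment
    if hasattr(models, "unet_2d_blocks"):
        return True  # older layout, nothing to patch
    try:
        # Assumed new location in recent diffusers releases.
        moved = importlib.import_module("diffusers.models.unets.unet_2d_blocks")
    except ImportError:
        return False  # layout differs from the assumption; give up
    # Restore the old attribute and module path for legacy imports.
    setattr(models, "unet_2d_blocks", moved)
    sys.modules["diffusers.models.unet_2d_blocks"] = moved
    return True


result = alias_unet_2d_blocks()
```

Pinning the diffusers version the repo was developed against would of course be the cleaner solution, which is why I'm asking which version you used.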
