
mm10 pre-trained model cannot be restored properly  #13

@hanyangii

Description


When running methylbert with the -p hanyangii/methylbert_mm10_4l option, the following output is produced:

Restore the pretrained model hanyangii/methylbert_mm10_4l
Cross entropy loss assigned
Some weights of MethylBertEmbeddedDMR were not initialized from the model checkpoint at hanyangii/methylbert_mm10_4l and are newly initialized: ['dmr_encoder.0.weight', 'read_classifier.0.bias', 'read_classifier.0.weight', 'read_classifier.3.bias', 'read_classifier.3.weight', 'read_classifier.4.bias', 'read_classifier.4.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Total Parameters: 46941186
MethylBertEmbeddedDMR(
  (classification_loss_fct): CrossEntropyLoss()
  (bert): BertModel(
    (embeddings): BertEmbeddings(
      (word_embeddings): Embedding(69, 768, padding_idx=0)
      (position_embeddings): Embedding(512, 768)
      (token_type_embeddings): Embedding(2, 768)
      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
      (dropout): Dropout(p=0.1, inplace=False)
    )
    (enco

The token type embeddings should have shape (3, 768), but the restored model shows Embedding(2, 768) in the output above, and several weights (dmr_encoder, read_classifier) are reported as newly initialized instead of being restored from the checkpoint.
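To narrow down whether the checkpoint itself lacks these weights (or stores them under different key names) and what token type embedding shape it actually contains, the Hub checkpoint can be inspected directly. The following is a minimal sketch, assuming the repository ships a plain PyTorch state dict named pytorch_model.bin; the filename and the raw state-dict layout are assumptions, not confirmed details of the hanyangii/methylbert_mm10_4l repository.

import torch
from huggingface_hub import hf_hub_download

# Download the checkpoint weights from the Hub.
# "pytorch_model.bin" is an assumed filename; the repo may ship a
# different weight file (e.g. safetensors).
ckpt_path = hf_hub_download(
    repo_id="hanyangii/methylbert_mm10_4l",
    filename="pytorch_model.bin",
)

# Load the raw state dict on CPU and inspect it.
state_dict = torch.load(ckpt_path, map_location="cpu")

# 1) What token type embedding shape does the checkpoint actually store?
for key, tensor in state_dict.items():
    if "token_type_embeddings" in key:
        print(key, tuple(tensor.shape))

# 2) Are the weights reported as "newly initialized" present in the
#    checkpoint at all (possibly under different key prefixes)?
reported = [
    "dmr_encoder.0.weight",
    "read_classifier.0.weight",
    "read_classifier.0.bias",
]
missing = [k for k in reported if not any(k in ck for ck in state_dict)]
print("not found in checkpoint:", missing)

If the keys are missing from the checkpoint, the warning is expected and those layers would need to be trained; if they are present under different names, the mismatch points to a key-naming or model-class discrepancy when restoring.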
