"Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.seq_relationship.bias', 'cls.seq_relationship.weight', 'cls.predictions.decoder.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias']\n",
"- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n",
"- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n"
     18 #show first entry
File b:\Programs\Miniconda\envs\tdde19\lib\site-packages\torch\utils\data\dataset.py:311, in random_split(dataset, lengths, generator)
309 # Cannot verify that dataset is Sized
310 if sum(lengths) != len(dataset): # type: ignore[arg-type]
--> 311 raise ValueError("Sum of input lengths does not equal the length of the input dataset!")
313 indices = randperm(sum(lengths), generator=generator).tolist()
314 return [Subset(dataset, indices[offset - length : offset]) for offset, length in zip(_accumulate(lengths), lengths)]
ValueError: Sum of input lengths does not equal the length of the input dataset!
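The error means the split lengths passed to random_split do not add up to len(dataset). A minimal sketch of one way to avoid it, assuming an 80/20 train/validation split (the dataset below is a placeholder, not the notebook's actual data): derive both lengths from len(dataset) so the remainder absorbs any rounding.

```python
import torch
from torch.utils.data import TensorDataset, random_split

# Placeholder dataset; in the notebook this would be the real dataset object.
dataset = TensorDataset(torch.arange(100))

# Derive the split sizes from len(dataset) so they are guaranteed to sum to it,
# which is exactly the check that raised the ValueError above.
train_len = int(0.8 * len(dataset))
val_len = len(dataset) - train_len  # remainder absorbs any rounding

train_set, val_set = random_split(
    dataset, [train_len, val_len],
    generator=torch.Generator().manual_seed(42),
)
print(len(train_set), len(val_set))  # 80 20
```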