Missing model.embed_tokens.weight but got lm_head.weight
#1
by xinhe - opened
In transformers, model.tie_weights() sets lm_head.weight = model.embed_tokens.weight. But in this checkpoint, model.embed_tokens.weight is missing (only lm_head.weight is present), so loading it produces the warning in the title.
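For context, a minimal sketch of the tying behaviour; the repo id below is a placeholder, and the check uses the generic get_input_embeddings()/get_output_embeddings() accessors rather than model-specific attribute paths:

```python
from transformers import AutoModelForCausalLM

# Placeholder repo id; substitute the actual model discussed here.
model = AutoModelForCausalLM.from_pretrained("org/this-model")

# from_pretrained calls tie_weights() when the config requests tied
# embeddings; afterwards both modules share the same tensor object.
model.tie_weights()
tied = model.get_output_embeddings().weight is model.get_input_embeddings().weight
print(tied)  # True when the weights are tied

# Because of the tie, the checkpoint is expected to carry
# model.embed_tokens.weight; if it only carries lm_head.weight,
# loading reports the "Missing model.embed_tokens.weight" warning.
```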
In current transformers, save_pretrained won't save lm_head.weight; it saves model.embed_tokens.weight instead.
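As a workaround until the checkpoint is re-exported, one could remap the key before loading. This is only a sketch, assuming a single pytorch_model.bin shard and the key names quoted in the warning; sharded or safetensors checkpoints would need the same remapping per file:

```python
import torch

# Hypothetical file name; the actual shard layout may differ.
state_dict = torch.load("pytorch_model.bin", map_location="cpu")

# Re-create the missing embedding weight from the tied lm_head weight,
# then drop lm_head.weight so the layout matches what current
# transformers' save_pretrained produces for tied weights.
if "model.embed_tokens.weight" not in state_dict and "lm_head.weight" in state_dict:
    state_dict["model.embed_tokens.weight"] = state_dict["lm_head.weight"]
    del state_dict["lm_head.weight"]

torch.save(state_dict, "pytorch_model.fixed.bin")
```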