Update config.json
#4
by Datatypes · opened
Add max_position_embeddings and set it to 128k tokens.
(128 * 1024 = 131_072)
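For illustration, a minimal sketch of the added key in config.json; the value is 128 * 1024 = 131_072, and all of the model's other config keys are omitted here:

```json
{
  "max_position_embeddings": 131072
}
```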
Is this going to get merged?
Hello @Datatypes!
The reason this is not in the config is that it's the default value. Is there any reason you need it to be explicitly added?
When you pull the model with Msty or LM Studio, that default value is not carried over, and the model maxes out at 4k tokens.
I added it to my config, and the model now runs as expected.
pcuenq changed pull request status to merged