---
license: apache-2.0
datasets:
- BAAI/COIG-PC
- ehartford/dolphin
- emozilla/booksum-summary-analysis_llama-8192
- OpenLeecher/GPT4-10k
- 0x70DA/stackoverflow-chat-data
- togethercomputer/Long-Data-Collections
---
# RWKV 7B World: reading comprehension focus
This is an experimental model based on RWKV 7B World, focused on reading comprehension.
Changes from the base model: the EOD token is removed, a special token is added, and the vocabulary is changed.
To test this model with RWKV Runner, some manual setup is needed:

1. Copy the `backend-python` folder to a new folder that sits next to the RWKV Runner executable.
2. Paste `rwkv_vocab_v20230424.txt` into the `rwkv_pip` folder, replacing the existing vocabulary file.
3. Run `../py310/python main.py` from the new folder.
4. In RWKV Runner's settings, set the API address to `127.0.0.1:8000`, then open `127.0.0.1:8000/docs` and switch the loaded model to this one (a scripted version of this step is sketched below).
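If you prefer to switch the model from a script instead of the `/docs` page, a minimal sketch is below. The `/switch-model` route and its `model`/`strategy` fields are assumptions about the backend's API, and the weight path is a placeholder; confirm the exact schema at `127.0.0.1:8000/docs`.

```python
# Sketch: ask the local backend to load this model.
# Endpoint name and field names are assumptions; verify them at /docs.
import requests

API = "http://127.0.0.1:8000"

resp = requests.post(
    f"{API}/switch-model",
    json={
        "model": "path/to/this-model.pth",  # hypothetical path to the downloaded weights
        "strategy": "cuda fp16",            # adjust to your hardware
    },
    timeout=600,  # loading a 7B model can take a while
)
resp.raise_for_status()
print("model switched")
```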
Try different temperature and top-p values; temperature 1.2 with top-p 0.5 may work well.
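As one way to try those settings, here is a sketch of a completion request against the local backend, assuming it exposes an OpenAI-style `/v1/completions` endpoint; the route and field names may differ, so check `127.0.0.1:8000/docs` before relying on them.

```python
# Sketch: a completion request using the suggested sampling settings
# (temperature 1.2, top_p 0.5). Route and fields assume an OpenAI-style API.
import requests

API = "http://127.0.0.1:8000"

resp = requests.post(
    f"{API}/v1/completions",
    json={
        "prompt": "Read the passage and answer the question: ...",
        "temperature": 1.2,
        "top_p": 0.5,
        "max_tokens": 200,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```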