Commit deaa0c4 (verified) by lhoestq (HF Staff) · 1 Parent(s): 3fe5c9a

Add 'high_school_chemistry' config data files
README.md CHANGED
@@ -170,6 +170,14 @@ configs:
     path: high_school_biology/val-*
   - split: dev
     path: high_school_biology/dev-*
+- config_name: high_school_chemistry
+  data_files:
+  - split: test
+    path: high_school_chemistry/test-*
+  - split: val
+    path: high_school_chemistry/val-*
+  - split: dev
+    path: high_school_chemistry/dev-*
 dataset_info:
 - config_name: accountant
   features:
@@ -771,6 +779,36 @@ dataset_info:
     num_examples: 5
   download_size: 60521
   dataset_size: 63511
+- config_name: high_school_chemistry
+  features:
+  - name: id
+    dtype: int32
+  - name: question
+    dtype: string
+  - name: A
+    dtype: string
+  - name: B
+    dtype: string
+  - name: C
+    dtype: string
+  - name: D
+    dtype: string
+  - name: answer
+    dtype: string
+  - name: explanation
+    dtype: string
+  splits:
+  - name: test
+    num_bytes: 46918
+    num_examples: 172
+  - name: val
+    num_bytes: 5625
+    num_examples: 19
+  - name: dev
+    num_bytes: 2576
+    num_examples: 5
+  download_size: 55668
+  dataset_size: 55119
 ---
 
 C-Eval is a comprehensive Chinese evaluation suite for foundation models. It consists of 13948 multi-choice questions spanning 52 diverse disciplines and four difficulty levels. Please visit our [website](https://cevalbenchmark.com/) and [GitHub](https://github.com/SJTU-LIT/ceval/tree/main) or check our [paper](https://arxiv.org/abs/2305.08322) for more details.
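Once merged, the new config is loadable like any other C-Eval subject through the 🤗 `datasets` library. A minimal sketch, assuming the dataset lives at the repo id `ceval/ceval-exam` (an assumption; substitute the actual repository id):

```python
# Minimal sketch: load the high_school_chemistry config added in this commit.
# "ceval/ceval-exam" is an assumed repo id; adjust it to the actual dataset repository.
from datasets import load_dataset

ds = load_dataset("ceval/ceval-exam", "high_school_chemistry")

# Splits declared in the README metadata: test (172 rows), val (19 rows), dev (5 rows).
print(ds)
print(ds["dev"][0])  # fields: id, question, A, B, C, D, answer, explanation
```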
high_school_chemistry/dev-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:791d406f845a4050ee892c3ef33d06319a77319a5cb657739c1ff19e4d17b3d9
+size 7292
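The Parquet shards are tracked with Git LFS, so the repository stores only a pointer like the one above (spec version, SHA-256 oid, byte size). A small sketch of checking a locally downloaded copy of the dev shard against that pointer; the local path is a placeholder:

```python
# Sketch: verify a downloaded file against the Git LFS pointer shown above.
# The local path is a placeholder; the expected oid/size come from the pointer file.
import hashlib

EXPECTED_OID = "791d406f845a4050ee892c3ef33d06319a77319a5cb657739c1ff19e4d17b3d9"
EXPECTED_SIZE = 7292

with open("high_school_chemistry/dev-00000-of-00001.parquet", "rb") as f:
    data = f.read()

assert len(data) == EXPECTED_SIZE, "size mismatch with LFS pointer"
assert hashlib.sha256(data).hexdigest() == EXPECTED_OID, "sha256 mismatch with LFS pointer"
print("file matches its LFS pointer")
```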
high_school_chemistry/test-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:92b3f30abcb34d13b6b3892aaa9a4acc0a2cc820f27e341d70567d6f584d87da
+size 38460
high_school_chemistry/val-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:60d20961edd3318e1cb09df44e10e0101b6ec6c97332299f030b11fc48bb1953
+size 9916
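Each shard can also be fetched and inspected directly as Parquet, without going through the config machinery. A minimal sketch using `huggingface_hub` and `pandas`, again assuming the `ceval/ceval-exam` repo id:

```python
# Sketch: download one Parquet shard added in this commit and read it with pandas.
# "ceval/ceval-exam" is an assumed repo id; adjust it to the actual dataset repository.
import pandas as pd
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="ceval/ceval-exam",
    repo_type="dataset",
    filename="high_school_chemistry/val-00000-of-00001.parquet",
)
df = pd.read_parquet(path)
print(df.shape)              # expected (19, 8) per the README split metadata
print(df.columns.tolist())   # id, question, A, B, C, D, answer, explanation
```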