togakyo committed on
Commit 007abb2 · verified · 1 Parent(s): 31588b9

Update README.md

Files changed (1): README.md +35 -0
README.md CHANGED
@@ -9,6 +9,7 @@ tags:
 license: apache-2.0
 language:
 - en
+ - ja
 ---

 # Uploaded model
@@ -20,3 +21,37 @@ language:
 This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

 [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
+
+ # Sample Use
+ The following is the code used to generate answers for elyza-tasks-100-TV_0.jsonl.
+ ```python
+ import json
+ from tqdm import tqdm
+ from unsloth import FastLanguageModel
+
+ # Read the evaluation tasks: lines are accumulated until a complete
+ # JSON object (ending with "}") can be parsed, so objects may span lines.
+ datasets = []
+ with open("./elyza-tasks-100-TV_0.jsonl", "r") as f:
+     item = ""
+     for line in f:
+         line = line.strip()
+         item += line
+         if item.endswith("}"):
+             datasets.append(json.loads(item))
+             item = ""
+
+ # Switch the model to Unsloth's faster inference mode.
+ # `model` and `tokenizer` are assumed to be loaded already
+ # (e.g. with FastLanguageModel.from_pretrained).
+ FastLanguageModel.for_inference(model)
+
+ # Generate an answer for each task and collect the predictions.
+ results = []
+ for dt in tqdm(datasets):
+     input = dt["input"]
+
+     prompt = f"""### 指示\n{input}\n### 回答\n"""
+
+     inputs = tokenizer([prompt], return_tensors="pt").to(model.device)
+
+     outputs = model.generate(**inputs, max_new_tokens=512, use_cache=True, do_sample=False, repetition_penalty=1.2)
+     prediction = tokenizer.decode(outputs[0], skip_special_tokens=True).split('\n### 回答')[-1]
+
+     results.append({"task_id": dt["task_id"], "input": input, "output": prediction})
+ ```
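
The snippet above assumes that `model` and `tokenizer` have already been loaded. Below is a minimal sketch of one way to do that with Unsloth's `FastLanguageModel.from_pretrained`, together with writing the collected `results` out as JSONL; the repository id, `max_seq_length`, and the output filename `results.jsonl` are illustrative assumptions, not part of this commit.

```python
import json
from unsloth import FastLanguageModel

# Hypothetical repository id -- replace with the actual fine-tuned model or adapter.
MODEL_ID = "your-account/your-finetuned-llama"

# Load the model and tokenizer with Unsloth (4-bit to keep memory usage low).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name=MODEL_ID,
    max_seq_length=2048,  # assumed context length
    dtype=None,           # let Unsloth pick a suitable dtype
    load_in_4bit=True,
)

# `results` is the list produced by the generation loop in the Sample Use section.
results = []

# Write one JSON object per line; ensure_ascii=False keeps Japanese text readable.
with open("results.jsonl", "w", encoding="utf-8") as f:
    for r in results:
        json.dump(r, f, ensure_ascii=False)
        f.write("\n")
```

With the model and tokenizer loaded this way, the Sample Use loop can be run end to end and its predictions saved for submission.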