Tags: Text Classification · Transformers · Safetensors · English · llama · text-generation-inference
panos-lema committed (verified) · Commit 7fe1cda · 1 Parent(s): 45377fb

Update README.md

Files changed (1): README.md (+4, -10)
README.md CHANGED
@@ -40,17 +40,11 @@ Introducing **HallOumi-8B-classifier**, a _fast_ **SOTA hallucination detection
 
 Demo GIF: TODO
 
-**HallOumi**, the hallucination detection model built with Oumi, is a system built specifically to enable per-sentence verification of any content (either AI or human-generated) with **sentence-level citations** and **human-readable explanations.**
-For example, when given one or more context documents, as well as an AI-generated summary, HallOumi goes through every claim being made in the summary and identifies:
-* The **relevant context sentences** associated with that claim.
-* A determination whether that particular statement is **supported or unsupported** by the provided context.
-* An **explanation** describing why a particular claim is supported or unsupported.
-
-**HallOumi-8B-classifier**, the hallucination classification model built with Oumi, is an end-to-end classification system that enables *fast and accurate* assessment of the hallucination probability of any written content (AI or human-generated).
-* ✔️ Fast
+
+**HallOumi-8B-classifier**, the hallucination classification model built with Oumi, is an end-to-end binary classification system that enables *fast and accurate* assessment of the hallucination probability of any written content (AI or human-generated).
+* ✔️ Fast with high accuracy
 * ✔️ Per-claim support (must call once per claim)
-* ❌ No Explanations
-* ❌ No Citations
+
 
 ## Hallucinations
 Hallucinations are often cited as the most important issue with being able to deploy generative models in numerous commercial and personal applications, and for good reason:
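
The updated README describes the classifier as a per-claim system: it must be called once for each claim, returning only a support probability (no explanations or citations). A minimal sketch of that call-once-per-claim pattern is below. Note the `<context>`/`<claims>` input tagging, the model id, and the label names are illustrative assumptions, not the model's documented prompt format.

```python
from typing import Callable, Dict, List


def build_inputs(context: str, claims: List[str]) -> List[str]:
    """Pair the shared context document with each individual claim.

    The classifier is called once per claim, so each claim gets its own
    input string. This tagging scheme is an assumption for illustration.
    """
    return [
        f"<context>\n{context}\n</context>\n<claims>\n{claim}\n</claims>"
        for claim in claims
    ]


def classify_claims(
    inputs: List[str],
    clf: Callable[[str], List[Dict]],
) -> List[Dict]:
    """Run the classifier once per prepared input.

    `clf` is any callable returning [{"label": ..., "score": ...}] for a
    single input, e.g. a transformers text-classification pipeline.
    """
    return [clf(text)[0] for text in inputs]


# Example wiring (downloads the model; the model id is assumed here):
# from transformers import pipeline
# clf = pipeline("text-classification", model="oumi-ai/HallOumi-8B-classifier")
# results = classify_claims(build_inputs(context, claims), clf)
```

Because the model emits only a per-claim probability, batching many claims against one context is just a loop over `classify_claims` inputs; any citation or explanation workflow would instead use the generative HallOumi model mentioned in the earlier README text.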