---
library_name: transformers
license: apache-2.0
language:
- en
tags:
- fill-mask
- masked-lm
- long-context
- modernbert
- mlx
pipeline_tag: fill-mask
inference: false
---
# mlx-community/answerdotai-ModernBERT-base-6bit
The model [mlx-community/answerdotai-ModernBERT-base-6bit](https://huggingface.co/mlx-community/answerdotai-ModernBERT-base-6bit) was converted to MLX format from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) using mlx-embeddings version **0.0.3**.
## Use with mlx
```bash
pip install mlx-embeddings
```
```python
from mlx_embeddings import load, generate
import mlx.core as mx

# Load the quantized model and its tokenizer
model, tokenizer = load("mlx-community/answerdotai-ModernBERT-base-6bit")

# Generate text embeddings
output = generate(model, tokenizer, texts=["I like grapes", "I like fruits"])
embeddings = output.text_embeds  # Normalized embeddings

# Dot products between normalized embeddings are cosine similarities
similarity_matrix = mx.matmul(embeddings, embeddings.T)
print("Similarity matrix between texts:")
print(similarity_matrix)
```
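Because the embeddings are normalized, each entry of the similarity matrix is a cosine similarity. A minimal follow-up sketch, continuing from the block above with its two example texts, for reading out the score of a single pair:

```python
# Cosine similarity between "I like grapes" and "I like fruits"
# (row/column indices follow the order of the `texts` list above)
score = similarity_matrix[0, 1].item()
print(f"Similarity between the two texts: {score:.3f}")
```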