
This is a set of sparse autoencoders (SAEs) trained on the residual stream of Llama 3 8B using the RedPajama corpus. The SAEs are organized by layer, and can be loaded using the EleutherAI sae library.
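To make the "32x" in the repo name concrete: each SAE maps a residual-stream vector into a latent space 32 times wider, keeps only a few active latents, and reconstructs the input from them. Below is a minimal numpy sketch of that encode/decode step, assuming the top-k architecture the EleutherAI sae library trains by default; the dimensions and k value are illustrative (the real residual stream of Llama 3 8B is 4096-wide), and the weights here are random, not the trained ones.

```python
import numpy as np

# Illustrative sizes only; the real model has d_model = 4096 (Llama 3 8B)
# and latent width 32 * d_model (the "32x" in the repo name).
d_model, expansion, k = 64, 32, 8
d_sae = expansion * d_model

rng = np.random.default_rng(0)
W_enc = rng.standard_normal((d_model, d_sae)) / np.sqrt(d_model)
W_dec = rng.standard_normal((d_sae, d_model)) / np.sqrt(d_sae)
b_enc = np.zeros(d_sae)
b_dec = np.zeros(d_model)

def encode(x):
    # Pre-activations, then keep only the k largest (top-k sparsity).
    pre = (x - b_dec) @ W_enc + b_enc
    z = np.zeros_like(pre)
    top = np.argpartition(pre, -k)[-k:]
    z[top] = np.maximum(pre[top], 0.0)
    return z

def decode(z):
    # Reconstruct the residual-stream vector from the sparse latents.
    return z @ W_dec + b_dec

x = rng.standard_normal(d_model)
z = encode(x)
x_hat = decode(z)
print(np.count_nonzero(z))  # at most k latents are active
print(x_hat.shape)
```

In practice you would not build the SAE by hand: the per-layer trained weights in this repo are meant to be loaded through the sae library's hub-loading helpers, with one SAE per hookpoint (layer).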


Dataset used to train EleutherAI/sae-llama-3-8b-32x: RedPajama.