-1. Cache activations for the first 10 million tokens of the dataset.
-2. Generate explanations for the first 100 features of layer 5 using the specified explainer model.
+1. Cache activations for the first 10 million tokens of the default dataset, `EleutherAI/SmolLM2-135M-10B`.
+2. Generate explanations for the first 100 features of layer 5 using the default explainer model, `hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4`.
 3. Score the explanations using the detection scorer.
 4. Log summary metrics including per-scorer F1 scores and confusion matrices, and produce histograms of the scorer classification accuracies.
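
The detection scoring in steps 3 and 4 treats each held-out example as a binary prediction of whether the latent is active, so the logged F1 scores follow directly from each scorer's confusion matrix. The snippet below is a minimal, library-independent illustration of that computation; the counts are made up and nothing here is taken from Delphi's logging code.

```python
# Illustrative only: made-up counts, not output produced by the pipeline above.
def f1_from_confusion(tp: int, fp: int, fn: int) -> float:
    """F1 is the harmonic mean of precision and recall."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Hypothetical confusion-matrix counts for a single scorer:
tp, fp, fn, tn = 40, 10, 15, 35
print(f"F1 = {f1_from_confusion(tp, fp, fn):.2f}")  # F1 = 0.76
```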
@@ -36,7 +36,7 @@ The first step to generate explanations is to cache sparse model activations. To
 from sparsify.data import chunk_and_tokenize
 from delphi.latents import LatentCache

-data = load_dataset("EleutherAI/rpj-v2-sample", split="train[:1%]")
+data = load_dataset("EleutherAI/SmolLM2-135M-10B", split="train[:1%]")