Replit-v2-CodeInstruct vs Replit-v1-CodeInstruct #5
matthiasgeihs asked this question in Q&A (Unanswered)
matthiasgeihs:
Why does Replit-v2-CodeInstruct perform significantly worse than Replit-v1-CodeInstruct? (21.5% vs 25.8% pass@1)
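For context, pass@1 here is the HumanEval-style metric: the fraction of problems a model solves with a single generated sample. Below is a minimal sketch of the standard unbiased pass@k estimator from Chen et al. (2021); the sample counts are made-up numbers chosen so that pass@1 comes out to 21.5%:

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator (Chen et al., 2021).

    n: samples generated per problem
    c: samples that pass the unit tests
    k: evaluation budget (k=1 for pass@1)
    """
    if n - c < k:
        return 1.0  # every size-k subset contains a passing sample
    # 1 - C(n-c, k) / C(n, k), computed as a numerically stable product
    return 1.0 - math.prod(1.0 - k / i for i in range(n - c + 1, n + 1))

# Hypothetical counts: 43 of 200 samples passing gives pass@1 = 21.5%
print(pass_at_k(n=200, c=43, k=1))  # 0.215
```

For k=1 the product telescopes to (n-c)/n, so pass@1 is just the plain fraction of passing samples averaged over problems.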
Reply:
Hi, the v1 model was trained with a 512-token sequence length, while v2 was trained with a 2000-token sequence length. From what I know, the longer-sequence training wasn't as effective because of the training data used.
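On what "trained with a 512-token sequence length" means in practice: a minimal sketch assuming a HuggingFace-style pipeline, where every fine-tuning example is truncated or padded to a fixed token budget. The checkpoint is the public replit-code-v1-3b base model; the prompt template is a placeholder, not the actual instruct-tuning format:

```python
from transformers import AutoTokenizer

# Public base checkpoint; the instruct fine-tunes discussed above build on it.
tokenizer = AutoTokenizer.from_pretrained(
    "replit/replit-code-v1-3b", trust_remote_code=True
)

MAX_LEN = 512  # v1 instruct tuning; 2000 for v2, per the reply above

# Placeholder prompt template -- the real instruct data format may differ.
example = "### Instruction:\nWrite a hello-world function.\n### Response:\n"

batch = tokenizer(
    example,
    max_length=MAX_LEN,
    truncation=True,       # drop tokens beyond MAX_LEN
    padding="max_length",  # pad shorter examples up to MAX_LEN
    return_tensors="pt",
)
print(batch["input_ids"].shape)  # torch.Size([1, 512])
```

Truncation at 512 means v1 simply never saw anything past that point, while v2's 2000-token budget only helps if the training data actually contains useful long sequences, which is the reply's point about the data used.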