Hi, thank you for the great work on Parametric RAG!
I’m curious about the disk space required to store the parametric encodings of the Wikipedia corpus. Could you share approximately how much space the fully encoded corpus used in your experiments takes up?
This would help us better understand the resource demands for deploying your approach at scale.
Thanks in advance!