The prerequisites section in setup.md says "32Gb vRAM", but it should read "32GB VRAM". Since 32 Gb (gigabits) equals only 4 GB (gigabytes), the lowercase "b" could mislead researchers into deploying the model on a GPU with insufficient VRAM, which would trigger a "CUDA out of memory" error.
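The eightfold gap between the two units can be verified with simple arithmetic (a minimal sketch; the variable names are illustrative):

```python
BITS_PER_BYTE = 8

# 32 gigabits (Gb) converted to gigabytes (GB)
gigabits = 32
gigabytes = gigabits / BITS_PER_BYTE

print(gigabytes)  # 4.0 -- far less memory than the intended 32 GB
```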