RTX3080 - MultiGPU? #31

@Moonlight63

Description

First, I love this project. I have seen what others are making with it, and it seems really powerful for fine-tuning exactly the prompt you want.

The only way I have been able to get this to run locally on my 3080 10G is by lowering the resolution in the prompt to 352 with a batch size of 1. It looks like others have gotten it to work, but is everyone just using Colab? I'd like to run it locally. I have two 3080s, so theoretically I should have enough VRAM between them, but it looks like this isn't set up to train on both. I tried to add this myself using torch's DataParallel, but unfortunately I have no idea what I am doing when it comes to coding ML stuff yet. Any chance of getting multi-GPU to work, or any advice on lowering VRAM usage?
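For reference, this is roughly the kind of wrapping I attempted. It's a minimal sketch, not the project's actual code: `TinyModel` is a placeholder for whatever model the trainer builds, since I don't know the real class names.

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    # Placeholder standing in for the project's model; the real class is unknown.
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(16, 4)

    def forward(self, x):
        return self.net(x)

model = TinyModel()
if torch.cuda.device_count() > 1:
    # DataParallel replicates the model on each GPU and splits
    # the input batch along dim 0, one chunk per device.
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

batch = torch.randn(8, 16, device=device)
out = model(batch)
print(out.shape)  # torch.Size([8, 4])
```

My understanding is that DataParallel only splits the batch, so with batch size 1 it wouldn't actually spread the memory load across both cards, which may be why it didn't help me. Splitting the model itself across GPUs (model parallelism) seems to need changes inside the training code.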
