
Model not using GPU (RTX 3060 Ti), running on CPU #120

@Geeky-Yogesh

Description

I am using Kitten TTS on a system with an NVIDIA GeForce RTX 3060 Ti, but the model runs on CPU instead of GPU.

System Info

OS: Ubuntu
GPU: RTX 3060 Ti
Python: 3.12.2

Checks

import torch

# Verify that PyTorch can see the CUDA device
print(torch.cuda.is_available())

The GPU is detected (nvidia-smi works), but the model still runs on the CPU.
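
For what it's worth, Kitten TTS appears to run inference through onnxruntime rather than PyTorch, so the torch check above may not be the relevant one. Assuming onnxruntime is the backend, this is the check I'd expect to matter:

import onnxruntime as ort

# The stock onnxruntime wheel is CPU-only and reports just CPUExecutionProvider;
# CUDAExecutionProvider only shows up with the onnxruntime-gpu package installed.
print(ort.get_available_providers())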

Expected
The model should run on the GPU via CUDA.

Request
Is GPU inference supported? If so, how can I enable it?
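
If it is supported, I would guess it follows the standard onnxruntime pattern sketched below; the model filename and the providers argument are my assumptions, not a documented Kitten TTS API:

import onnxruntime as ort

# Hypothetical sketch: prefer the CUDA provider and fall back to CPU.
# "kitten_tts.onnx" is a placeholder, not the actual model file name.
session = ort.InferenceSession(
    "kitten_tts.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Reports which provider the session actually selected.
print(session.get_providers())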
