
feat: cuda support in transformers backend #1344

Open

Description

@kno10

The transformers backend does not appear to use the GPU.
I could not get any of the settings such as cuda: true or device: cuda to work, and I could not find either option being used anywhere in the backend code yet.
Maybe copy the relevant lines from the diffusers backend?

if request.CUDA:
    self.pipe.to('cuda')

# Resolve request.LoraAdapter relative to the directory of
# request.ModelFile when it is not already an absolute path.
if request.LoraAdapter and request.ModelFile != "" and not os.path.isabs(request.LoraAdapter):
    # Get the base path of the model file
    modelFileBase = os.path.dirname(request.ModelFile)
    # Make LoraAdapter relative to modelFileBase
    request.LoraAdapter = os.path.join(modelFileBase, request.LoraAdapter)

self.device = "cpu" if not request.CUDA else "cuda"
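
For the transformers backend, the equivalent change might look roughly like the sketch below. This is a minimal sketch only, assuming the backend receives the same request fields (request.CUDA, request.Model) as the diffusers snippet above; the method name load_model and the attributes self.model, self.tokenizer, and self.device are illustrative assumptions, not the backend's actual API.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

class TransformersBackend:
    def load_model(self, request):
        # Pick the device from the request flag, falling back to CPU
        # when no CUDA device is actually available.
        self.device = "cuda" if request.CUDA and torch.cuda.is_available() else "cpu"
        self.tokenizer = AutoTokenizer.from_pretrained(request.Model)
        self.model = AutoModelForCausalLM.from_pretrained(request.Model)
        # Move the model weights onto the selected device.
        self.model.to(self.device)

Note that the tokenized inputs would also need to be moved to the same device at inference time, e.g. self.tokenizer(prompt, return_tensors="pt").to(self.device), or generation would fail with a device mismatch.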

Metadata

Assignees: no one assigned
Labels: enhancement (New feature or request), roadmap, up for grabs (Tickets that no-one is currently working on)
