Is there any support for AMD GPU (ROCM) #2540

Closed
@mdrokz

Description

Hi, I was wondering if there is any support for using llama.cpp with an AMD GPU. Is there a ROCm implementation?
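For context, llama.cpp does ship a ROCm path via HIP/hipBLAS in its CMake build. A minimal build sketch follows, assuming a recent source tree with ROCm installed; the flag names (`GGML_HIP` in newer trees, `LLAMA_HIPBLAS` in older ones) and the GPU target (`gfx1030` here, for an RDNA2 card) vary by version and hardware, so treat these values as placeholders to check against the repo's build docs:

```shell
# Configure llama.cpp with the HIP backend (newer trees; older ones use -DLLAMA_HIPBLAS=ON).
# AMDGPU_TARGETS must match your GPU architecture (gfx1030 is an assumption).
cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1030 -DCMAKE_BUILD_TYPE=Release

# Build; offload layers to the GPU at run time with the usual -ngl flag.
cmake --build build --config Release -j
```

Running `rocminfo` first is a quick way to find the correct `gfx` target string for your card.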
