
Is it possible to run Llama 3.1 without HF libraries? #625

Closed
@whatdhack

Description


System Info

latest, Linux

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

Looking through llama-recipes, I saw references to Hugging Face libraries in many places. In Llama 3.0, it was possible to run the 8B and 70B models without using any HF libraries such as transformers.

For example: https://github.com/meta-llama/llama3/blob/main/example_text_completion.py
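For reference, a minimal sketch of the PyTorch-only pattern from that linked example_text_completion.py (no transformers dependency). The checkpoint/tokenizer paths and prompts below are placeholders, and it assumes the `llama` package from the meta-llama/llama3 repo plus the original Meta checkpoints:

```python
# Sketch of the Llama 3.0 reference flow (PyTorch-only, no HF libraries).
# Paths are placeholders; launch with:
#   torchrun --nproc_per_node 1 example_text_completion.py
from llama import Llama

generator = Llama.build(
    ckpt_dir="Meta-Llama-3-8B/",                      # placeholder checkpoint dir
    tokenizer_path="Meta-Llama-3-8B/tokenizer.model", # placeholder tokenizer path
    max_seq_len=128,
    max_batch_size=4,
)

prompts = ["I believe the meaning of life is"]
results = generator.text_completion(
    prompts,
    max_gen_len=64,
    temperature=0.6,
    top_p=0.9,
)
for prompt, result in zip(prompts, results):
    print(prompt, "->", result["generation"])
```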

Error logs

N/A

Expected behavior

Retain the ability to run with a minimal, PyTorch-only stack.
