### Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
### Expected Behavior

After cloning the repo, I followed these steps from the README:

On Windows:

1. Download the latest Fortran version of [w64devkit](https://github.com/skeeto/w64devkit/releases).
2. Extract w64devkit on your PC.
3. Run `w64devkit.exe`.
4. Use the `cd` command to reach the llama.cpp folder.
5. From here you can run:
   ```
   make
   ```

I expected the build to complete successfully.
### Current Behavior

Instead of a successful build, `make` failed with an error. The full log is in the "Failure Logs" section below.
### Environment and Context

Environment info:

```
E:/TextAIModels/LLama/llama.cpp $ git log | head -1
commit dc271c52ed65e7c8dfcbaaf84dabb1f788e4f3d0
```

CPU: AMD Ryzen 5 5600G with Radeon Graphics (12 CPUs), ~3.9 GHz

```
E:/TextAIModels/LLama/llama.cpp $ python -V
Python 3.8.0
E:/TextAIModels/LLama/llama.cpp $ pip list | egrep "torch|numpy|sentencepiece"
numpy         1.24.2
sentencepiece 0.1.97
torch         1.13.1
torchaudio    0.13.1
torchvision   0.14.1
E:/TextAIModels/LLama/llama.cpp $ make --version | head -1
GNU Make 4.4
```
- Physical (or virtual) hardware you are using: Windows 11
### Failure Information (for bugs)

llama.cpp fails to build when running `make`.
### Steps to Reproduce

1. In Windows PowerShell, in the directory where I wanted llama.cpp to be installed, I ran:
   ```
   git clone https://github.com/ggerganov/llama.cpp
   ```
2. I installed the latest Fortran version of w64devkit and launched it.
3. I changed to the directory where llama.cpp is located.
4. I typed `make` and hit Enter.
### Failure Logs

```
E:/TextAIModels/LLama/llama.cpp $ make
I llama.cpp build info:
I UNAME_S: Windows_NT
I UNAME_P: unknown
I UNAME_M: i686
I CFLAGS: -I. -O3 -std=c11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -march=native -mtune=native
I CXXFLAGS: -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -march=native -mtune=native
I LDFLAGS:
I CC: cc (GCC) 13.1.0
I CXX: g++ (GCC) 13.1.0
cc -I. -O3 -std=c11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -march=native -mtune=native -c ggml.c -o ggml.o
g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -march=native -mtune=native -c llama.cpp -o llama.o
In file included from llama.cpp:8:
llama-util.h: In constructor 'llama_mmap::llama_mmap(llama_file*, bool)':
llama-util.h:234:94: note: '#pragma message: warning: You are building for pre-Windows 8; prefetch not supported'
  234 | #pragma message("warning: You are building for pre-Windows 8; prefetch not supported")
      |                                                                                      ^
llama-util.h:202:47: warning: unused parameter 'prefetch' [-Wunused-parameter]
  202 |     llama_mmap(struct llama_file * file, bool prefetch = true) {
      |                                          ~~~~~^~~~~~~~~~~~~~~
llama.cpp: In function 'size_t llama_set_state_data(llama_context*, const uint8_t*)':
llama.cpp:2685:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
 2685 |             kin3d->data = (void *) inp;
      |                           ^~~~~~~~~~~~
llama.cpp:2689:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
 2689 |             vin3d->data = (void *) inp;
      |                           ^~~~~~~~~~~~
g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -march=native -mtune=native -c examples/common.cpp -o common.o
examples/common.cpp:20: warning: "NOMINMAX" redefined
   20 | #define NOMINMAX
      |
In file included from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/i686-w64-mingw32/bits/c++config.h:679,
                 from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/bits/requires_hosted.h:31,
                 from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/string:38,
                 from examples/common.h:7,
                 from examples/common.cpp:1:
E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/i686-w64-mingw32/bits/os_defines.h:45: note: this is the location of the previous definition
   45 | #define NOMINMAX 1
      |
examples/common.cpp: In function 'int estimateWidth(char32_t)':
examples/common.cpp:622:28: warning: unused parameter 'codepoint' [-Wunused-parameter]
  622 | int estimateWidth(char32_t codepoint) {
      |                   ~~~~~~~~~^~~~~~~~~
g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -march=native -mtune=native examples/main/main.cpp ggml.o llama.o common.o -o main
examples/main/main.cpp:26: warning: "NOMINMAX" redefined
   26 | #define NOMINMAX
      |
In file included from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/i686-w64-mingw32/bits/c++config.h:679,
                 from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/bits/requires_hosted.h:31,
                 from E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/string:38,
                 from ./examples/common.h:7,
                 from examples/main/main.cpp:6:
E:/Progs/Fortran/w64devkit/lib/gcc/i686-w64-mingw32/13.1.0/include/c++/i686-w64-mingw32/bits/os_defines.h:45: note: this is the location of the previous definition
   45 | #define NOMINMAX 1
      |
examples/main/main.cpp: In function 'int main(int, char**)':
examples/main/main.cpp:247:31: error: invalid 'static_cast' from type 'main(int, char**)::<lambda(DWORD)>' to type 'PHANDLER_ROUTINE' {aka 'int (__attribute__((stdcall)) *)(long unsigned int)'}
  247 |         SetConsoleCtrlHandler(static_cast<PHANDLER_ROUTINE>(console_ctrl_handler), true);
      |                               ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make: *** [Makefile:205: main] Error 1