This error occurs on an Apple M3 Max, inside a Linux (arm64) Docker container. Dockerfile:
```dockerfile
FROM python:3.10.12-slim-buster

USER root

RUN apt-get update && apt-get install cmake libopenblas-dev build-essential pkg-config git -y

WORKDIR /opt

COPY ./requirements/cpu.requirements.txt ./requirements.txt
RUN CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip3 install --upgrade -r requirements.txt

COPY infra/llm_server_cpu/server_config.json server_config.json

EXPOSE 8085

CMD ["python3", "-m", "llama_cpp.server", "--config_file", "server_config.json"]
```
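The failure below comes from the base image's toolchain rather than from llama-cpp-python itself: `slim-buster` is Debian 10, whose GCC 8.3.0 predates the `vld1q_s8_x4`/`vld1q_u8_x4` NEON load intrinsics that the vendored llama.cpp uses on aarch64. An untested sketch of the same Dockerfile on a Debian 12 ("bookworm") base, whose newer GCC ships those intrinsics — the `python:3.10-slim-bookworm` tag and the unchanged build steps are assumptions, not a verified fix:

```dockerfile
# Sketch: identical build, but on Debian 12 ("bookworm") whose GCC is new
# enough to declare the vld1q_*_x4 NEON intrinsics missing from GCC 8.3.0.
FROM python:3.10-slim-bookworm

USER root

RUN apt-get update && apt-get install cmake libopenblas-dev build-essential pkg-config git -y

WORKDIR /opt

COPY ./requirements/cpu.requirements.txt ./requirements.txt
RUN CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip3 install --upgrade -r requirements.txt

COPY infra/llm_server_cpu/server_config.json server_config.json

EXPOSE 8085

CMD ["python3", "-m", "llama_cpp.server", "--config_file", "server_config.json"]
```

Any base image whose GCC is newer than 8.x should behave the same way; the key change is only the `FROM` line.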
The error trace:
```
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [158 lines of output]
    *** scikit-build-core 0.9.3 using CMake 3.29.2 (wheel)
    *** Configuring CMake...
    loading initial cache file /tmp/tmph68_ek8q/build/CMakeInit.txt
    -- The C compiler identification is GNU 8.3.0
    -- The CXX compiler identification is GNU 8.3.0
    -- Check for working C compiler: /usr/bin/cc - skipped
    -- Check for working CXX compiler: /usr/bin/c++ - skipped
    -- Found Git: /usr/bin/git (found version "2.20.1")
    -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
    -- Check if compiler accepts -pthread - yes
    -- Found Threads: TRUE
    -- Looking for sgemm_ - found
    -- Found BLAS: /usr/lib/aarch64-linux-gnu/libopenblas.so
    -- BLAS found, Libraries: /usr/lib/aarch64-linux-gnu/libopenblas.so
    -- Found PkgConfig: /usr/bin/pkg-config (found version "0.29")
    -- Checking for module 'openblas64'
    --   No package 'openblas64' found
    -- Checking for module 'openblas'
    --   Found openblas, version 0.3.5
    -- BLAS found, Includes: /usr/include/aarch64-linux-gnu
    -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
    -- CMAKE_SYSTEM_PROCESSOR: aarch64
    -- ARM detected
    -- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
    CMake Warning (dev) at CMakeLists.txt:26 (install):
      Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
    This warning is for project developers.  Use -Wno-dev to suppress it.
    CMake Warning (dev) at CMakeLists.txt:35 (install):
      Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
    This warning is for project developers.  Use -Wno-dev to suppress it.
    -- Configuring done (0.3s)
    -- Generating done (0.0s)
    -- Build files have been written to: /tmp/tmph68_ek8q/build
    *** Building project with Ninja...
    Change Dir: '/tmp/tmph68_ek8q/build'
    Run Build Command(s): /tmp/pip-build-env-tb00ftrb/normal/lib/python3.10/site-packages/ninja/data/bin/ninja -v
    [paths abbreviated below; all sources live under
     /tmp/pip-install-tuyxmx5n/llama-cpp-python_2d9d4194a33a41ab980c0abe2b61c087/vendor/llama.cpp/]
    [steps [1/27]-[2/27]: build-info generation and build-info.cpp compile succeed]
    [3/27] /usr/bin/cc ... -Werror=implicit-int -Werror=implicit-function-declaration ... -c ggml-quants.c
    FAILED: vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o
    In file included from ggml-quants.c:5:
    ggml-quants.c: In function ‘ggml_vec_dot_q3_K_q8_K’:
    ggml-impl.h:293:27: error: implicit declaration of function ‘vld1q_s8_x4’; did you mean ‘vld1q_s8_x2’? [-Werror=implicit-function-declaration]
     #define ggml_vld1q_s8_x4 vld1q_s8_x4
                              ^~~~~~~~~~~
    ggml-quants.c:5443:48: note: in expansion of macro ‘ggml_vld1q_s8_x4’
         const ggml_int8x16x4_t q8bytes_1 = ggml_vld1q_s8_x4(q8); q8 += 64;
                                            ^~~~~~~~~~~~~~~~
    ggml-impl.h:293:27: error: invalid initializer  [expansions at ggml-quants.c:5443 (q8bytes_1) and :5444 (q8bytes_2)]
    ggml-quants.c: In function ‘ggml_vec_dot_q5_K_q8_K’:
    ggml-impl.h:293:27: error: invalid initializer  [expansion at ggml-quants.c:6928 (q8bytes)]
    ggml-quants.c: In function ‘ggml_vec_dot_q6_K_q8_K’:
    ggml-impl.h:291:27: error: implicit declaration of function ‘vld1q_u8_x4’; did you mean ‘vld1q_u8_x2’? [-Werror=implicit-function-declaration]
     #define ggml_vld1q_u8_x4 vld1q_u8_x4
                              ^~~~~~~~~~~
    ggml-impl.h:291:27, 293:27: error: invalid initializer  [expansions at ggml-quants.c:7610 (q6bits) and :7611 (q8bytes)]
    ggml-quants.c:7636:21: error: incompatible types when assigning to type ‘int8x16x4_t’ {aka ‘struct int8x16x4_t’} from type ‘int’
         q8bytes = ggml_vld1q_s8_x4(q8); q8 += 64;
                 ^
    [the same ‘incompatible types … int8x16x4_t … from type int’ error on q8b = ggml_vld1q_s8_x4(q8) repeats in
     ggml_vec_dot_iq2_xxs_q8_K (ggml-quants.c:8365), iq2_xs (:8503), iq2_s (:8787), iq3_xxs (:8979),
     iq3_s (:9144), iq1_s (:9370), iq1_m (:9529), and iq4_xs (:9837)]
    cc1: some warnings being treated as errors
    [steps [4/27]-[18/27]: ggml-alloc.c, console.cpp, ggml-backend.c, grammar-parser.cpp, sampling.cpp,
     llava.cpp, sgemm.cpp, unicode-data.cpp, train.cpp, ggml.c, unicode.cpp, json-schema-to-grammar.cpp,
     clip.cpp, common.cpp, llama.cpp compile successfully]
    ninja: build stopped: subcommand failed.
    *** CMake build failed
    [end of output]
```
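The root cause is visible in the very first error: on aarch64, GCC 8.3.0's `arm_neon.h` declares the two-register loads (`vld1q_s8_x2`) but not the four-register ones (`vld1q_s8_x4`, `vld1q_u8_x4`), so the `ggml_vld1q_*_x4` macros expand to undeclared functions and `-Werror=implicit-function-declaration` turns that fatal. llama.cpp carries plain-load fallbacks of this shape for some compilers; the sketch below illustrates that fallback pattern in portable C. The `fake_*` types and the `_fallback` name are hypothetical stand-ins for NEON's `int8x16_t`/`int8x16x4_t` so the sketch compiles on any host — this is not the project's actual code.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical stand-ins for NEON's int8x16_t / int8x16x4_t, so this
 * sketch compiles without arm_neon.h. */
typedef struct { int8_t val[16]; } fake_int8x16_t;
typedef struct { fake_int8x16_t val[4]; } fake_int8x16x4_t;

/* Fallback for the missing vld1q_s8_x4 intrinsic: a four-register load
 * is just four consecutive 16-byte loads from the same pointer. */
static fake_int8x16x4_t ggml_vld1q_s8_x4_fallback(const int8_t *ptr) {
    fake_int8x16x4_t res;
    for (int i = 0; i < 4; ++i) {
        memcpy(res.val[i].val, ptr + 16 * i, 16); /* lane i <- bytes [16i, 16i+16) */
    }
    return res;
}
```

With a wrapper like this selected when the compiler lacks the intrinsic (instead of `#define ggml_vld1q_s8_x4 vld1q_s8_x4`), the `q8bytes = ggml_vld1q_s8_x4(q8)` call sites in `ggml-quants.c` would type-check even under GCC 8 — though simply building with a newer compiler avoids the problem entirely.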