MSBUILD error: MSB1009, project file does not exist, switch: ollama_llama_server.vcxproj; llm\generate\generate_windows.go:3: running "powershell": exit status 1

tnkciper posted 2 months ago in Windows

What is the issue?

I'm trying to compile on Windows, but I'm hitting this error:

PS C:\my_cpp_projects\ollama> Install-Module -Name ThreadJob -Scope CurrentUser

Untrusted repository
You are installing the modules from an untrusted repository. If you trust this repository, change its
InstallationPolicy value by running the Set-PSRepository cmdlet. Are you sure you want to install the modules from
'PSGallery'?
[Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "N"): a
PS C:\my_cpp_projects\ollama> go generate ./...
Already on 'minicpm-v2.5'
Your branch is up to date with 'origin/minicpm-v2.5'.
Submodule path '../llama.cpp': checked out '65f7455cea443bd9b6fd8546ef53440d6f6d963f'
Checking for MinGW...

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Application     gcc.exe                                            0.0.0.0    C:\mingw64\bin\gcc.exe
Application     mingw32-make.exe                                   0.0.0.0    C:\mingw64\bin\mingw32-make.exe
Building static library
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64_static -G MinGW Makefiles -DCMAKE_C_COMPILER=gcc.exe -DCMAKE_CXX_COMPILER=g++.exe -DBUILD_SHARED_LIBS=off -DLLAMA_NATIVE=off -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_F16C=off -DLLAMA_FMA=off
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- OpenMP found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- x86 detected
-- Configuring done (0.6s)
-- Generating done (4.9s)
-- Build files have been written to: C:/my_cpp_projects/ollama/llm/build/windows/amd64_static
building with: cmake --build ../build/windows/amd64_static --config Release --target llama --target ggml
[  0%] Building C object CMakeFiles/ggml.dir/ggml.c.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\ggml.c:84:8: warning: type qualifiers ignored on function return type [-Wignored-qualifiers]
   84 | static atomic_bool atomic_flag_test_and_set(atomic_flag * ptr) {
      |        ^~~~~~~~~~~
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-alloc.c.obj
[ 16%] Building C object CMakeFiles/ggml.dir/ggml-backend.c.obj
[ 33%] Building C object CMakeFiles/ggml.dir/ggml-quants.c.obj
[ 50%] Building CXX object CMakeFiles/ggml.dir/sgemm.cpp.obj
[ 50%] Built target ggml
[ 66%] Building CXX object CMakeFiles/llama.dir/llama.cpp.obj
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In member function 'std::string llama_file::GetErrorMessageWin32(DWORD) const':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1319:46: warning: format '%s' expects argument of type 'char*', but argument 2 has type 'DWORD' {aka 'long unsigned int'} [-Wformat=]
 1319 |             ret = format("Win32 error code: %s", error_code);
      |                                             ~^   ~~~~~~~~~~
      |                                              |   |
      |                                              |   DWORD {aka long unsigned int}
      |                                              char*
      |                                             %ld
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In constructor 'llama_mmap::llama_mmap(llama_file*, size_t, bool)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:1657:38: warning: cast between incompatible function types from 'FARPROC' {aka 'long long int (*)()'} to 'BOOL (*)(HANDLE, ULONG_PTR, PWIN32_MEMORY_RANGE_ENTRY, ULONG)' {aka 'int (*)(void*, long long unsigned int, _WIN32_MEMORY_RANGE_ENTRY*, long unsigned int)'} [-Wcast-function-type]
 1657 |             pPrefetchVirtualMemory = reinterpret_cast<decltype(pPrefetchVirtualMemory)> (GetProcAddress(hKernel32, "PrefetchVirtualMemory"));
      |                                      ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_logits_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18512:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18512 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp: In function 'float* llama_get_embeddings_ith(llama_context*, int32_t)':
C:\my_cpp_projects\ollama\llm\llama.cpp\llama.cpp:18557:65: warning: format '%lu' expects argument of type 'long unsigned int', but argument 2 has type 'std::vector<int>::size_type' {aka 'long long unsigned int'} [-Wformat=]
18557 |             throw std::runtime_error(format("out of range [0, %lu)", ctx->output_ids.size()));
      |                                                               ~~^    ~~~~~~~~~~~~~~~~~~~~~~
      |                                                                 |                        |
      |                                                                 long unsigned int        std::vector<int>::size_type {aka long long unsigned int}
      |                                                               %llu
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode.cpp.obj
[ 83%] Building CXX object CMakeFiles/llama.dir/unicode-data.cpp.obj
[100%] Linking CXX static library libllama.a
[100%] Built target llama
[100%] Built target ggml
Building LCD CPU
generating config with: cmake -S ../llama.cpp -B ../build/windows/amd64/cpu -DCMAKE_POSITION_INDEPENDENT_CODE=on -A x64 -DLLAMA_AVX=off -DLLAMA_AVX2=off -DLLAMA_AVX512=off -DLLAMA_FMA=off -DLLAMA_F16C=off -DBUILD_SHARED_LIBS=on -DLLAMA_NATIVE=off -DLLAMA_SERVER_VERBOSE=off -DCMAKE_BUILD_TYPE=Release
cmake version 3.30.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
-- Building for: Visual Studio 17 2022
-- Selecting Windows SDK version 10.0.22621.0 to target Windows 10.0.19045.
-- The C compiler identification is MSVC 19.40.33813.0
-- The CXX compiler identification is MSVC 19.40.33813.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.40.33807/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.46.0.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- OpenMP found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- x86 detected
-- Configuring done (28.0s)
-- Generating done (1.1s)
CMake Warning:
  Manually-specified variables were not used by the project:

    LLAMA_F16C

-- Build files have been written to: C:/my_cpp_projects/ollama/llm/build/windows/amd64/cpu
building with: cmake --build ../build/windows/amd64/cpu --config Release --target ollama_llama_server
Versão do MSBuild 17.10.4+10fbfbf2e para .NET Framework
MSBUILD : error MSB1009: Arquivo de projeto não existe.
Opção: ollama_llama_server.vcxproj
[Translation: MSBuild version 17.10.4+10fbfbf2e for .NET Framework / MSBUILD : error MSB1009: Project file does not exist. Switch: ollama_llama_server.vcxproj]
llm\generate\generate_windows.go:3: running "powershell": exit status 1
PS C:\my_cpp_projects\ollama>
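For context on the last line of the log: llm\generate\generate_windows.go is a small stub whose third line is a go:generate directive that shells out to PowerShell, so any non-zero exit from the build script surfaces as `generate_windows.go:3: running "powershell": exit status 1`. A rough sketch of what such a stub looks like (paraphrased, not necessarily the exact file contents):

```go
package generate

//go:generate powershell -ExecutionPolicy Bypass -File ./gen_windows.ps1
```

When `go generate ./...` runs, each directive is executed in its file's directory, and the "exit status 1" is simply the PowerShell script's exit code propagating up — the real failure is the MSBuild error above it.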

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

No response
yi0zb3m4 #1

Which version of Visual Studio do you have installed?

aiazj4mn #2

The latest version; I downloaded and installed it yesterday on a clean machine.

go version: go1.22.6 (windows/amd64)
gcc.exe (GCC) 13.2.0
g++.exe (GCC) 13.2.0
MinGW distribution: https://nuwen.net/mingw.html
PowerShell 7.4.3

brqmpdu1 #3

This is probably a cmake generator problem. I'm using the MSVC cmake:

> (get-command cmake).source
C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\CMake\bin\cmake.ex

Try building from the MSVC Developer Shell and see if that fixes it. (Just make sure go and gcc show up in your PATH.)
