Conversation

@JohannesGaessler (Collaborator):
Fixes #8380 .

I think the easiest way to resolve the discrepancy between LLAMA_CCACHE and GGML_CCACHE is to simply rename the option.

@JohannesGaessler added the "Review Complexity : Low" label (trivial changes that most beginner devs, or those who want a break, can tackle, e.g. a UI fix) on Jul 9, 2024
@github-actions bot added the "build" label (compilation issues) on Jul 9, 2024
@ggerganov (Member) left a comment:
Add deprecation warnings:

# cmake
llama_option_depr(WARNING LLAMA_NO_CCACHE GGML_NO_CCACHE)
# make
ifdef LLAMA_NO_CCACHE
GGML_NO_CCACHE := 1
DEPRECATE_WARNING := 1
endif
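
For context, a deprecation helper like the `llama_option_depr` referenced above can be sketched in CMake as follows. This is an assumed shape, not necessarily the exact upstream definition; note the `if (${OLD})` check, which is why such a helper only fires when the old option is set to ON, a point raised in the reply below:

```cmake
# Sketch of a deprecation helper in the style of llama_option_depr
# (assumed implementation, for illustration only).
function(llama_option_depr TYPE OLD NEW)
    # ${OLD} is only truthy when the user set the old option to ON,
    # so e.g. -DLLAMA_NO_CCACHE=OFF would never trigger the warning.
    if (${OLD})
        message(${TYPE} "${OLD} is deprecated. Use ${NEW} instead.")
        # Forward the setting to the new option name.
        set(${NEW} ON PARENT_SCOPE)
    endif()
endfunction()

llama_option_depr(WARNING LLAMA_NO_CCACHE GGML_NO_CCACHE)
```

With this shape, `TYPE` can be `WARNING` (warn and continue) or `FATAL_ERROR` (abort configuration), matching CMake's `message()` modes.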

@JohannesGaessler (Collaborator, Author):

The deprecation warning for CMake would be misleading: this PR removes the LLAMA_CCACHE option entirely, since otherwise its value would always override GGML_CCACHE. Also, the deprecation warning code only prints a warning when an option is set to ON, not when it is set to OFF. However, even without any further changes, CMake will print the following:

CMake Warning:
  Manually-specified variables were not used by the project:

    LLAMA_CCACHE

Do you think that would be enough?

@ggerganov (Member):

Yes, that should be enough 👍

@JohannesGaessler force-pushed the cmake-fix-ccache-print branch from 4386b18 to 274b3fc on July 9, 2024 at 13:56
@JohannesGaessler merged commit a03e8dd into ggml-org:master on Jul 9, 2024
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Jul 13, 2024


Closes issue: Bug: GGML_CCACHE=OFF Doesn't Work (#8380)