author | grahameth <96447521+grahameth@users.noreply.github.com> | 2023-08-09 22:46:40 +0200 |
---|---|---|
committer | GitHub <noreply@github.com> | 2023-08-09 22:46:40 +0200 |
commit | ea04a4ca1940d92becc0ee26523aa2c4a18cf938 (patch) | |
tree | 602ab25bb31813889c1ea7dd0408b0984715df71 /llama.h | |
parent | 25d43e0eb578b6e73046d9d6644a3a14d460600d (diff) |
add log_callback to llama_context_params for custom logging. (#2234)
* add log_callback to llama_context_params for custom logging.
* Fix macro expansion on gcc
* Add struct llama_state for global variables and move log_callback there
* Turn log level into enum and some minor changes.
* Remove model_for_logging parameter (not needed anymore)
* Convert remaining fprintf(stderr, ...) calls to use new macros.
* Fix enum and initialize g_state
* Fix log calls after merge
* Fix missing static
* Add back all the new lines in the logging strings
* Add comment for llama_log_callback and replace remaining printf calls
---------
Co-authored-by: grahameth <->
Co-authored-by: Helmut <helmut.buhler@inf.h-brs.de>
Diffstat (limited to 'llama.h')
-rw-r--r-- | llama.h | 19 |
1 file changed, 18 insertions(+), 1 deletion(-)
```diff
@@ -86,7 +86,20 @@ extern "C" {

     typedef void (*llama_progress_callback)(float progress, void *ctx);

-    struct llama_context_params {
+    enum llama_log_level {
+        LLAMA_LOG_LEVEL_ERROR = 2,
+        LLAMA_LOG_LEVEL_WARN  = 3,
+        LLAMA_LOG_LEVEL_INFO  = 4
+    };
+
+    // Signature for logging events
+    // Note that text includes the new line character at the end for most events.
+    // If your logging mechanism cannot handle that, check if the last character is '\n' and strip it
+    // if it exists.
+    // It might not exist for progress report where '.' is output repeatedly.
+    typedef void (*llama_log_callback)(llama_log_level level, const char * text, void * user_data);
+
+    struct llama_context_params {
         uint32_t seed;    // RNG seed, -1 for random
         int32_t  n_ctx;   // text context
         int32_t  n_batch; // prompt processing batch size

@@ -195,6 +208,10 @@ extern "C" {

         int32_t n_eval;
     };

+    // Set callback for all future logging events.
+    // If this is not called, or NULL is supplied, everything is output on stderr.
+    LLAMA_API void llama_log_set(llama_log_callback log_callback, void * user_data);
+
     LLAMA_API int llama_max_devices();

     LLAMA_API struct llama_context_params llama_context_default_params();
```