llama.cpp.git (master): commit log for examples/
Age         Commit message                                                                       Author
2023-03-28  main.cpp fixes, refactoring (#571)                                                   anzz1
2023-03-27  Fix missing ggml link in cmake for examples/* on w64-mingw32 (#542)                  Marco Matthies
2023-03-26  Update README and comments for standalone perplexity tool (#525)                     Stephan Walter
2023-03-26  [main] fix infinite generation (-n == -1) (#523)                                     anzz1
2023-03-26  Exit from interactive mode if input stream is bad (#491)                             Harald Fernengel
2023-03-25  (Windows) Set console to UTF-8 on init (#420)                                        anzz1
2023-03-25  Fix colors enabling on WIN32                                                         Georgi Gerganov
2023-03-25  If n_predict == -1, generate forever                                                 Georgi Gerganov
2023-03-25  Inifinite generation via context swapping (#71)                                      Georgi Gerganov
2023-03-25  Cleanup STL headers + fix embedding examples + minor stuff                           Georgi Gerganov
2023-03-25  Move chat scripts into "./examples"                                                  Georgi Gerganov
2023-03-25  Overhaul the examples structure                                                      Georgi Gerganov
2023-03-24  Immediately start processing the prompt before user input has been provided (...     Georgi Gerganov
2023-03-21  fix typo in chatLLaMa (#368)                                                         Mathieu Nayrolles
2023-03-21  Add chatLLaMa script (#198)                                                          Jean-Christophe Hoelt