llama.cpp.git, branch master
path: root/examples/main

Age | Commit message | Author
2023-06-04 | llama : Metal inference (#1642) | Georgi Gerganov
2023-06-03 | Fix prompt cache saving and chat-persistent rollover (#1678) | Evan Jones
2023-05-29 | Work around for recalculating logits in cached prompts (Fixes #1585) (#1609) | DannyDaemonic
2023-05-28 | Only show -ngl option when relevant + other doc/arg handling updates (#1625) | Kerfuffle
2023-05-25 | Some improvements to loading the session with --prompt-cache (#1550) | Kerfuffle
2023-05-20 | llama : add llama_init_backend() API (close #1527) | Georgi Gerganov
2023-05-19 | main : make reverse prompt option act as a stop token in non-interactive mode... | Jason McCartney
2023-05-18 | Fixes #1511 lambda issue for w64devkit (mingw) (#1513) | DannyDaemonic
2023-05-16 | define default model path once, sync path with readme (#1366) | András Salamon
2023-05-12 | llama : fix --mtest option (close #1414) | Georgi Gerganov
2023-05-10 | main : add option to save full output to session (#1338) | Evan Jones
2023-05-08 | Interface improvements and `--multiline-input` (previously `--author-mode`) (... | DannyDaemonic
2023-05-08 | llama : require first token to be BOS (#1303) | Georgi Gerganov
2023-05-06 | Remove default arguments from sampling functions (#1343) | Jed Fox
2023-05-04 | main : add --in-suffix option (#1318) | 44670
2023-05-04 | Only escape prompts when used with `-e` (#1311) | DannyDaemonic
2023-05-04 | Update main's README.md with new features (#1296) | DannyDaemonic
2023-05-04 | fix #1224 reverse prompt and multi line (#1297) | Tomas
2023-05-02 | Handle signals properly on Windows (#1123) | DannyDaemonic
2023-05-02 | examples : add llama_init_from_gpt_params() common function (#1290) | Ron Evans
2023-05-02 | examples : improve vertical alignment of a few variables (#1286) | Ron Evans
2023-05-02 | llama : allow 0 as a seed number. (#1275) | Robert Brisita
2023-05-02 | main : switch input_noecho to input_echo to remove negation (#979) | Ron Evans
2023-05-01 | Add git-based build information for better issue tracking (#1232) | DannyDaemonic
2023-05-01 | llama : fix session load / save (#1263) | Georgi Gerganov
2023-04-29 | common : change default parameters to pre-#1126 (#1223) | Georgi Gerganov
2023-04-29 | llama : new sampling algorithms (#1126) | Ivan Stepanov
2023-04-28 | llama : add session file format and saved sessions in main (#1169) | Evan Jones
2023-04-24 | examples/main README improvements and some light refactoring (#1131) | mgroeber9110
2023-04-23 | Fix LoRA acronym (#1145) | slaren
2023-04-23 | Added README.md for main with examples and explanations (#1139) | DannyDaemonic
2023-04-22 | Fix CI: ARM NEON, quantization unit tests, editorconfig (#1122) | Stephan Walter
2023-04-22 | llama : print timings on ctrl+c exit (#1021) | wbpxre150
2023-04-21 | main : evaluate tokens in batches after swapping context (#1014) | Alex Klinkhamer
2023-04-17 | Add LoRA support (#820) | slaren
2023-04-16 | examples: add missing <ctime> include for time() (#1011) | Pavol Rusnak
2023-04-14 | Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982) | Pavol Rusnak
2023-04-14 | main : alternative instruct mode (Vicuna support, etc.) (#863) | Tomáš Pazdiora
2023-04-11 | Fix whitespace, add .editorconfig, add GitHub workflow (#883) | Pavol Rusnak
2023-04-11 | Windows fixes (#890) | comex
2023-04-10 | Rewrite loading code to try to satisfy everyone: | comex
2023-04-08 | fix for windows utf-8 input (#840) | Tomáš Pazdiora
2023-04-06 | Do not crash when it has nothing to say. (#796) | Sergey Alirzaev
2023-04-03 | Windows: reactive sigint handler after each Ctrl-C (#736) | mgroeber9110
2023-03-28 | llama : fix linkage with mingw (#551) | anzz1
2023-03-28 | all : be more strict about converting float to double (#458) | Stephan Walter
2023-03-28 | main.cpp fixes, refactoring (#571) | anzz1
2023-03-27 | Fix missing ggml link in cmake for examples/* on w64-mingw32 (#542) | Marco Matthies
2023-03-26 | [main] fix infinite generation (-n == -1) (#523) | anzz1
2023-03-26 | Exit from interactive mode if input stream is bad (#491) | Harald Fernengel