path: root/examples/main/main.cpp
Age        | Commit message                                                                      | Author
2023-06-24 | llama : make model stateless and context stateful (llama_state) (#1797)            | Didzis Gosko
2023-06-17 | minor : warning fixes                                                               | Georgi Gerganov
2023-06-16 | Fixed possible macro redefinition (#1892)                                           | FrankHB
2023-06-16 | build : fix and ignore MSVC warnings (#1889)                                        | Borislav Stanimirov
2023-06-13 | llama : do a warm-up eval at start for better timings (#1824)                       | Georgi Gerganov
2023-06-11 | Fix issue where interactive mode crashes when input exceeds ctx size (#1789)        | Kerfuffle
2023-06-06 | main : add the possibility to open the prompt cache read-only (#1640)               | Willy Tarreau
2023-06-04 | llama : Metal inference (#1642)                                                     | Georgi Gerganov
2023-06-03 | Fix prompt cache saving and chat-persistent rollover (#1678)                        | Evan Jones
2023-05-29 | Work around for recalculating logits in cached prompts (Fixes #1585) (#1609)        | DannyDaemonic
2023-05-25 | Some improvements to loading the session with --prompt-cache (#1550)                | Kerfuffle
2023-05-20 | llama : add llama_init_backend() API (close #1527)                                  | Georgi Gerganov
2023-05-19 | main : make reverse prompt option act as a stop token in non-interactive mode...    | Jason McCartney
2023-05-18 | Fixes #1511 lambda issue for w64devkit (mingw) (#1513)                              | DannyDaemonic
2023-05-16 | define default model path once, sync path with readme (#1366)                       | András Salamon
2023-05-12 | llama : fix --mtest option (close #1414)                                            | Georgi Gerganov
2023-05-10 | main : add option to save full output to session (#1338)                            | Evan Jones
2023-05-08 | Interface improvements and `--multiline-input` (previously `--author-mode`) (...    | DannyDaemonic
2023-05-08 | llama : require first token to be BOS (#1303)                                       | Georgi Gerganov
2023-05-06 | Remove default arguments from sampling functions (#1343)                            | Jed Fox
2023-05-04 | main : add --in-suffix option (#1318)                                               | 44670
2023-05-04 | fix #1224 reverse prompt and multi line (#1297)                                     | Tomas
2023-05-02 | Handle signals properly on Windows (#1123)                                          | DannyDaemonic
2023-05-02 | examples : add llama_init_from_gpt_params() common function (#1290)                 | Ron Evans
2023-05-02 | examples : improve vertical alignment of a few variables (#1286)                    | Ron Evans
2023-05-02 | llama : allow 0 as a seed number. (#1275)                                           | Robert Brisita
2023-05-02 | main : switch input_noecho to input_echo to remove negation (#979)                  | Ron Evans
2023-05-01 | Add git-based build information for better issue tracking (#1232)                   | DannyDaemonic
2023-05-01 | llama : fix session load / save (#1263)                                             | Georgi Gerganov
2023-04-29 | common : change default parameters to pre-#1126 (#1223)                             | Georgi Gerganov
2023-04-29 | llama : new sampling algorithms (#1126)                                             | Ivan Stepanov
2023-04-28 | llama : add session file format and saved sessions in main (#1169)                  | Evan Jones
2023-04-24 | examples/main README improvements and some light refactoring (#1131)                | mgroeber9110
2023-04-22 | Fix CI: ARM NEON, quantization unit tests, editorconfig (#1122)                     | Stephan Walter
2023-04-22 | llama : print timings on ctrl+c exit (#1021)                                        | wbpxre150
2023-04-21 | main : evaluate tokens in batches after swapping context (#1014)                    | Alex Klinkhamer
2023-04-17 | Add LoRA support (#820)                                                             | slaren
2023-04-16 | examples : add missing <ctime> include for time() (#1011)                           | Pavol Rusnak
2023-04-14 | Revert "main : alternative instruct mode (Vicuna support, etc.) (#863)" (#982)      | Pavol Rusnak
2023-04-14 | main : alternative instruct mode (Vicuna support, etc.) (#863)                      | Tomáš Pazdiora
2023-04-11 | Fix whitespace, add .editorconfig, add GitHub workflow (#883)                       | Pavol Rusnak
2023-04-11 | Windows fixes (#890)                                                                | comex
2023-04-10 | Rewrite loading code to try to satisfy everyone:                                    | comex
2023-04-08 | fix for windows utf-8 input (#840)                                                  | Tomáš Pazdiora
2023-04-06 | Do not crash when it has nothing to say. (#796)                                     | Sergey Alirzaev
2023-04-03 | Windows: reactive sigint handler after each Ctrl-C (#736)                           | mgroeber9110
2023-03-28 | all : be more strict about converting float to double (#458)                        | Stephan Walter
2023-03-28 | main.cpp fixes, refactoring (#571)                                                  | anzz1
2023-03-26 | [main] fix infinite generation (-n == -1) (#523)                                    | anzz1
2023-03-26 | Exit from interactive mode if input stream is bad (#491)                            | Harald Fernengel