Age        | Commit message                                                                  | Author
2023-03-22 | fix perplexity after c-api refactor (#390)                                      | Erik Scholz
2023-03-22 | Add details on perplexity to README.md (#395)                                   | Gary Linscott
2023-03-22 | Add missing header for memcpy (#386)                                            | Yusuf Kağan Hanoğlu
2023-03-22 | When seed <= 0 - use the clock to generate one                                  | Georgi Gerganov
2023-03-22 | Init llama_context_params properly from CLI (#370)                              | Georgi Gerganov
2023-03-22 | Remove temporary notice and update hot topics                                   | Georgi Gerganov
2023-03-22 | Introduce C-style API (#370)                                                    | Georgi Gerganov
2023-03-21 | Add SHA256SUMS file and instructions to README how to obtain and verify the d...| Gary Mulder
2023-03-22 | Fix bin dir for win ci                                                          | anzz1
2023-03-21 | specify build type for ctest on windows (#371)                                  | Erik Scholz
2023-03-21 | Add notice about pending change                                                 | Georgi Gerganov
2023-03-21 | fix typo in chatLLaMa (#368)                                                    | Mathieu Nayrolles
2023-03-21 | Update issue templates                                                          | Georgi Gerganov
2023-03-21 | We could use std::unordered_map over std::map (#305)                            | Fabio R. Sluzala
2023-03-21 | Fix color codes emitting mid-UTF8 code. (#312)                                  | Matvey Soloviev
2023-03-21 | Importer for GPTQ quantized LLaMA models (#301)                                 | comex
2023-03-21 | Compute perplexity over prompt (#270)                                           | Gary Linscott
2023-03-21 | Add chatLLaMa script (#198)                                                     | Jean-Christophe Hoelt
2023-03-21 | makefile: Fix CPU feature detection on Haiku (#218)                             | Alex von Gluck IV
2023-03-21 | Enable ANSI colors on Windows 10+ (#311)                                        | anzz1
2023-03-21 | Minor style changes                                                             | Georgi Gerganov
2023-03-21 | Add chat.sh script                                                              | Georgi Gerganov
2023-03-21 | Check for reverse prompt by characters instead of tokens (#292) (#330)          | tjohnman
2023-03-21 | Fix convert script, warnings alpaca instructions, default params                | Georgi Gerganov
2023-03-21 | Add OpenBSD support (#314)                                                      | Kevin Lo
2023-03-21 | fix typo in comment (#318)                                                      | Mack Straight
2023-03-21 | Makefile: slightly cleanup for Mac Intel; echo instead of run ./main -h (#335)  | Qingyou Meng
2023-03-21 | cmdline option for custom amount of model parts (--n_parts N) (#348)            | anzz1
2023-03-21 | Update IPFS links to quantized alpaca with new tokenizer format (#352)          | Kevin Kwok
2023-03-21 | Change default repeat_penalty to 1.0                                            | Georgi Gerganov
2023-03-21 | Add tokenizer test + revert to C++11 (#355)                                     | Georgi Gerganov
2023-03-21 | Add initial AVX512 support for dot product on Linux (#320)                      | Casey Primozic
2023-03-21 | Adding missing features of CMakeLists.txt & Refactoring (#131)                  | nusu-github
2023-03-20 | Nix flake: set meta.mainProgram to llama                                        | Ben Siraphob
2023-03-20 | Fixed tokenizer.model not found error when model dir is symlink (#325)          | Qingyou Meng
2023-03-20 | move file magic/version to header, print expected version (#319)                | Mack Straight
2023-03-20 | Docker - Fix publish docker image in GitHub Registry (#235)                     | Bernat Vadell
2023-03-20 | sentencepiece bpe compatible tokenizer (#252)                                   | Mack Straight
2023-03-20 | Add tqdm to Python requirements (#293)                                          | Stephan Walter
2023-03-19 | bugfix: default should not be interactive (#304)                                | cocktailpeanut
2023-03-19 | Rename script                                                                   | Georgi Gerganov
2023-03-19 | Add temporary helper script for Alpaca chat                                     | Georgi Gerganov
2023-03-19 | fix coloring of last `n_batch` of prompt, and refactor line input (#221)        | Rickey Bowers Jr
2023-03-19 | Support for multiple reverse prompts. (#299)                                    | tjohnman
2023-03-19 | Improved quantize script (#222)                                                 | Suaj Carrot
2023-03-19 | Make prompt randomization optional. (#300)                                      | tjohnman
2023-03-19 | Respect the maximum number of tokens in interactive. (#298)                     | tjohnman
2023-03-19 | Add --ignore-eos parameter (#181)                                               | slaren
2023-03-19 | interactive mode: print '\n' in sigint_handler, this flush stdout thus ensure...| Qingyou Meng