Commit log for llama.cpp.git (branch: master)
Age | Commit message | Author
2023-03-22 | fix: add POSIX functionality for Linux compilation (#51) | Valentyn Bezshapkin
2023-03-22 | Don't force immediate interactive without `-i` (#354) | tjohnman
2023-03-22 | cmake: make llama an actual library (#392) | Erik Scholz
2023-03-22 | fix perplexity after c-api refactor (#390) | Erik Scholz
2023-03-22 | Add details on perplexity to README.md (#395) | Gary Linscott
2023-03-22 | Add missing header for memcpy (#386) | Yusuf Kağan Hanoğlu
2023-03-22 | When seed <= 0 - use the clock to generate one | Georgi Gerganov
2023-03-22 | Init llama_context_params properly from CLI (#370) | Georgi Gerganov
2023-03-22 | Remove temporary notice and update hot topics | Georgi Gerganov
2023-03-22 | Introduce C-style API (#370) | Georgi Gerganov
2023-03-21 | Add SHA256SUMS file and instructions to README how to obtain and verify the d... | Gary Mulder
2023-03-22 | Fix bin dir for win ci | anzz1
2023-03-21 | specify build type for ctest on windows (#371) | Erik Scholz
2023-03-21 | Add notice about pending change | Georgi Gerganov
2023-03-21 | fix typo in chatLLaMa (#368) | Mathieu Nayrolles
2023-03-21 | Update issue templates | Georgi Gerganov
2023-03-21 | We could use std::unordered_map over std::map (#305) | Fabio R. Sluzala
2023-03-21 | Fix color codes emitting mid-UTF8 code. (#312) | Matvey Soloviev
2023-03-21 | Importer for GPTQ quantized LLaMA models (#301) | comex
2023-03-21 | Compute perplexity over prompt (#270) | Gary Linscott
2023-03-21 | Add chatLLaMa script (#198) | Jean-Christophe Hoelt
2023-03-21 | makefile: Fix CPU feature detection on Haiku (#218) | Alex von Gluck IV
2023-03-21 | Enable ANSI colors on Windows 10+ (#311) | anzz1
2023-03-21 | Minor style changes | Georgi Gerganov
2023-03-21 | Add chat.sh script | Georgi Gerganov
2023-03-21 | Check for reverse prompt by characters instead of tokens (#292) (#330) | tjohnman
2023-03-21 | Check for reverse prompt by characters instead of tokens (#292) (#330) | tjohnman
2023-03-21 | Fix convert script, warnings alpaca instructions, default params | Georgi Gerganov
2023-03-21 | Add OpenBSD support (#314) | Kevin Lo
2023-03-21 | fix typo in comment (#318) | Mack Straight
2023-03-21 | Makefile: slightly cleanup for Mac Intel; echo instead of run ./main -h (#335) | Qingyou Meng
2023-03-21 | cmdline option for custom amount of model parts (--n_parts N) (#348) | anzz1
2023-03-21 | Update IPFS links to quantized alpaca with new tokenizer format (#352) | Kevin Kwok
2023-03-21 | Change default repeat_penalty to 1.0 | Georgi Gerganov
2023-03-21 | Add tokenizer test + revert to C++11 (#355) | Georgi Gerganov
2023-03-21 | Add initial AVX512 support for dot product on Linux (#320) | Casey Primozic
2023-03-21 | Adding missing features of CMakeLists.txt & Refactoring (#131) | nusu-github
2023-03-20 | Nix flake: set meta.mainProgram to llama | Ben Siraphob
2023-03-20 | Fixed tokenizer.model not found error when model dir is symlink (#325) | Qingyou Meng
2023-03-20 | move file magic/version to header, print expected version (#319) | Mack Straight
2023-03-20 | Docker - Fix publish docker image in GitHub Registry (#235) | Bernat Vadell
2023-03-20 | sentencepiece bpe compatible tokenizer (#252) | Mack Straight
2023-03-20 | Add tqdm to Python requirements (#293) | Stephan Walter
2023-03-19 | bugfix: default should not be interactive (#304) | cocktailpeanut
2023-03-19 | Rename script | Georgi Gerganov
2023-03-19 | Add temporary helper script for Alpaca chat | Georgi Gerganov
2023-03-19 | fix coloring of last `n_batch` of prompt, and refactor line input (#221) | Rickey Bowers Jr
2023-03-19 | Support for multiple reverse prompts. (#299) | tjohnman
2023-03-19 | Improved quantize script (#222) | Suaj Carrot
2023-03-19 | Make prompt randomization optional. (#300) | tjohnman