Age        | Commit message                                                                  | Author
2023-05-05 | readme: add missing info (#1324)                                                | Pavol Rusnak
2023-05-05 | Fix for OpenCL / clbast builds on macOS. (#1329)                                | Ionoclast Laboratories
2023-05-05 | Convert.py @staticmethod (#1327)                                                | Benjamin Lecaillon
2023-05-05 | quantize: make output filename optional, default to ggml-model-<ftype>.bin (#... | slaren
2023-05-04 | Wrap exceptions in std::exception to verbose output on exception. (#1316)       | Ivan Stepanov
2023-05-04 | convert: support DT_BF16 tensors (#1309)                                        | Ivan Stepanov
2023-05-04 | readme : add OpenBuddy link (#1321)                                             | 44670
2023-05-04 | main : add --in-suffix option (#1318)                                           | 44670
2023-05-04 | ggml : change immintrin.h to intrin.h for compatibility (#1307)                 | Ron Jailall
2023-05-04 | Only escape prompts when used with `-e` (#1311)                                 | DannyDaemonic
2023-05-04 | Update main's README.md with new features (#1296)                               | DannyDaemonic
2023-05-04 | fix #1224 reverse prompt and multi line (#1297)                                 | Tomas
2023-05-03 | ggml : vectorize Q8_0 quantization                                              | Georgi Gerganov
2023-05-03 | examples : read chat prompts from a template file (#1196)                       | khimaros
2023-05-03 | minor : fix whitespaces (#1302)                                                 | Georgi Gerganov
2023-05-03 | minor : fix trailing whitespaces                                                | Georgi Gerganov
2023-05-03 | scripts : platform independent script to verify sha256 checksums (#1203)        | KASR
2023-05-03 | examples : various prompt and example fixes (#1298)                             | CRD716
2023-05-02 | llama : only copy used KV cache in get / set state (#1272)                      | Evan Jones
2023-05-02 | Process escape sequences given in prompts (#1173)                               | DannyDaemonic
2023-05-02 | Handle signals properly on Windows (#1123)                                      | DannyDaemonic
2023-05-02 | Call sh on build-info.sh (#1294)                                                | DannyDaemonic
2023-05-03 | fix build-info.h for git submodules (#1289)                                     | kuvaus
2023-05-03 | fix missing parameters in `llama_init_from_gpt_params` (#1293)                  | slaren
2023-05-02 | examples : add llama_init_from_gpt_params() common function (#1290)             | Ron Evans
2023-05-02 | llama : fix compile warnings                                                    | Georgi Gerganov
2023-05-02 | ggml : fix 32-bit ARM                                                           | Georgi Gerganov
2023-05-02 | examples : improve vertical alignment of a few variables (#1286)                | Ron Evans
2023-05-02 | ggml : fix ppc64le build error and make cmake detect Power processors (#1284)   | Marvin Gießing
2023-05-02 | llama : allow 0 as a seed number. (#1275)                                       | Robert Brisita
2023-05-02 | main : switch input_noecho to input_echo to remove negation (#979)              | Ron Evans
2023-05-02 | ggml: add names to tensors (#1268)                                              | slaren
2023-05-01 | Add git-based build information for better issue tracking (#1232)               | DannyDaemonic
2023-05-01 | cuBLAS: refactor and optimize f16 mat mul performance (#1259)                   | slaren
2023-05-01 | llama : update stubs for systems without mmap and mlock (#1266)                 | xloem
2023-05-01 | ggml : fix ggml_used_mem() (#1264)                                              | Kerfuffle
2023-05-01 | llama : fix session load / save (#1263)                                         | Georgi Gerganov
2023-05-01 | cuBLAS: fall back to pageable memory if pinned alloc fails (#1233)              | slaren
2023-05-01 | llama : let context be const when accessing const data (#1261)                  | Alex Klinkhamer
2023-04-30 | ggml : fix UB (int << 31)                                                       | Georgi Gerganov
2023-04-30 | build: add armv{6,7,8} support to cmake (#1251)                                 | Pavol Rusnak
2023-04-30 | common : better default number of threads (#934)                                | jon-chuang
2023-04-30 | ggml : add CLBlast q5_0, q5_1, q8_0 dequant kernels (#1225)                     | 0cc4m
2023-04-30 | ggml : add Q5 WASM SIMD + GGML_FTYPE                                            | Georgi Gerganov
2023-04-30 | Various fixes to mat_mul benchmark (#1253)                                      | Stephan Walter
2023-04-30 | ggml : fix labels for GGML_OP_ALIBI                                             | Georgi Gerganov
2023-04-29 | ggml : fix 32-bit ARM NEON                                                      | Georgi Gerganov
2023-04-29 | ggml : use vzip instead of vuzp for consistency                                 | Georgi Gerganov
2023-04-29 | ggml : fix visibility and unused warnings                                       | Georgi Gerganov
2023-04-29 | ggml : fix #if for f32_f32 mul_mat (CLBlast) (#1229)                            | Georgi Gerganov