authorjwj7140 <32943891+jwj7140@users.noreply.github.com>2023-07-05 03:06:12 +0900
committerGitHub <noreply@github.com>2023-07-04 21:06:12 +0300
commitf257fd255044decffad93dee2502875ce66ad80c (patch)
tree721030366e4a3ca93088493581ad19750b6bff95 /examples/server/README.md
parent7ee76e45afae7f9a7a53e93393accfb5b36684e1 (diff)
Add an API example using server.cpp similar to OAI. (#2009)
* add api_like_OAI.py
* add evaluated token count to server
* add /v1/ endpoints binding
Diffstat (limited to 'examples/server/README.md')
-rw-r--r-- examples/server/README.md | 16
1 file changed, 16 insertions, 0 deletions
diff --git a/examples/server/README.md b/examples/server/README.md
index ba4b2fe..4ed226e 100644
--- a/examples/server/README.md
+++ b/examples/server/README.md
@@ -190,3 +190,19 @@ Run with bash:
```sh
bash chat.sh
```
+
+### API like OAI
+
+API example using Python Flask: [api_like_OAI.py](api_like_OAI.py)
+This example must be used together with `server.cpp`.
+
+```sh
+python api_like_OAI.py
+```
+
+After running the API server, you can use it from the `openai` Python package by setting the API base URL:
+```python
+openai.api_base = "http://<Your api-server IP>:port"
+```
+
+Then you can use llama.cpp as an OpenAI-compatible **chat.completion** or **text_completion** API.
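
For illustration only (not part of this patch), here is a minimal sketch of querying the wrapper through the legacy `openai` Python package (pre-1.0 interface); the host, port, and model name below are assumptions:

```python
# Minimal sketch, assuming api_like_OAI.py is listening on 127.0.0.1:8081
# and the legacy `openai` package (< 1.0) is installed.
import openai

openai.api_key = "sk-no-key-required"      # placeholder value (assumed not validated by the local wrapper)
openai.api_base = "http://127.0.0.1:8081"  # assumed address of api_like_OAI.py

response = openai.ChatCompletion.create(
    model="llama",                         # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response["choices"][0]["message"]["content"])
```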