Prerequisites and notes
This article assumes the following:
- Ubuntu 22.04
- An Ollama runtime environment has already been set up.
- The Ollama version is 0.1.41 (a quick way to check the installed version is shown right after this list).
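If you want to confirm which version is installed on your machine, the ollama CLI has a version flag. A minimal sketch (the exact output format may differ between versions):

$ ollama --version
ollama version is 0.1.41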
This article draws on the documentation in the official Ollama repository.
Goal of this article
The goal is to send a question to a model such as Phi3 or Llama3 with a curl command like the one below and check the result.
$ curl http://localhost:11434/api/chat -d '{
  "model": "phi3:mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'

When you run the above, a response like the following comes back (setting aside how accurate or natural the answer is):
{
  "model":"phi3:mini",
  "created_at":"2024-06-07T06:06:36.259965418Z",
  "message":{
    "role":"assistant",
    "content":" æ¥æ¬èªã§ã®åçãšããŠãã空ã¯éãã®ããã倧æ°äžã«é²ãéšæ»Žãå«ãŸããŠããããã§ããããããã®åŸ®ç²åã¯å
ãåå°ãã芳枬è
ã«ãšã£ãŠèŠããªãéè²ãäŒããŸããã\n\næ¥æ¬èªã§ã®ã¹ããããã€ã¹ãããã§èª¬æããå Žå:\n\n1. 空ã¯äœããå«ãŸããŠããããšããéãè²ã«èŠããã®ã§ãã\n2. 空äžã«é²ãæ°Žãå«ãã æ¶²äœãããããã®ååãå
ã®æ³¢é·ãå€åãããŸãã\n3. 倧æ°äžã®ç©è³ªã¯å
ã®æ³¢é·ã埮å°ãªç¯å²ã§é®æããéè²ãèŠããããã«æ£ä¹±ããŸãã\n4. ãã®çŸè±¡ã¯äººéãä»ã®çç©ããã¯çŽæ¥èгå¯ããããšãã§ããªããããéè²ãç¥ãæ¹æ³ãšããŠãå
åŠçãªè§£æãçè«ã¢ãã«ãçšããŸãã"
  },
  "done_reason":"stop",
  "done":true,
  "total_duration":55640212015,
  "load_duration":1168323,
  "prompt_eval_duration":444553000,
  "eval_count":319,
  "eval_duration":55059956000
}

The steps for setting up an Ollama environment on Ubuntu are summarized in the article below, so take a look if you need them.
🐧 Steps to install Ollama on Ubuntu and run Phi3 (ritaiz.com) — a walkthrough of installing Ollama on Ubuntu and running Microsoft's SLM Phi3.
Check the status of Ollama
If you installed Ollama on Ubuntu by running the script described in the official documentation, Ollama is configured to start automatically as an Ubuntu service. Try running the following command to check its status.
$ sudo systemctl status ollama

The output looks like the following.
$ sudo systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
     Active: active (running) since Fri 2024-06-07 10:16:06 JST; 4min 19s ago
   Main PID: 588934 (ollama)
      Tasks: 31 (limit: 19087)
     Memory: 3.8G
        CPU: 1min 155ms
     CGroup: /system.slice/ollama.service
             └─588934 /usr/local/bin/ollama serve
 
# (output omitted)

Looking at the above, the line Active: active (running) confirms that Ollama is up and running.
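As another quick check, you can also query the API's root endpoint directly; when Ollama is up it should respond with a short plain-text message. A minimal sketch (the exact wording of the response may vary by version):

$ curl http://localhost:11434
Ollama is running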
If you want to stop Ollama, you can stop it with the systemctl stop command, just like any other service.
$ sudo systemctl stop ollama

To start it again, use start.
$ sudo systemctl start ollama

From here on, we will proceed with Ollama running.
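Incidentally, because Ollama is registered as an ordinary systemd service, the standard enable/disable commands should also work if you want to control whether it starts automatically at boot. A sketch using generic systemd commands (nothing Ollama-specific):

$ sudo systemctl disable ollama   # do not start automatically at boot
$ sudo systemctl enable ollama    # start automatically at boot again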
Check the models available to Ollama
With Ollama running, check which models are available with the following command.
$ ollama list
NAME            ID              SIZE    MODIFIED
phi3:mini       64c1188f2485    2.4 GB  8 days ago
phi3:medium     1e67dff39209    7.9 GB  8 days ago

From the above, you can see that the phi3:mini and phi3:medium models are available.
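If you want to inspect one of these models, the ollama show subcommand can display its Modelfile. A sketch (the available flags may differ between Ollama versions):

$ ollama show phi3:mini --modelfile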
If you want to use another model, for example Llama3, run the following to download it. It is about 4.7 GB, so the download takes some time.
$ ollama pull llama3
pulling manifest
pulling 6a0746a1ec1a... 100% ▕████████████████████████████████████████▏ 4.7 GB
pulling 4fa551d4f938... 100% ▕████████████████████████████████████████▏  12 KB
pulling 8ab4849b038c... 100% ▕████████████████████████████████████████▏  254 B
pulling 577073ffcc6c... 100% ▕████████████████████████████████████████▏  110 B
pulling 3f8eb4da87fa... 100% ▕████████████████████████████████████████▏  485 B
verifying sha256 digest
writing manifest
removing any unused layers
success

If success is displayed as above, the Llama3 download is complete.
Checking again with the following command shows that llama3 has been added.
$ ollama list
NAME            ID              SIZE    MODIFIED
llama3:latest   365c0bd3c000    4.7 GB  5 minutes ago
phi3:mini       64c1188f2485    2.4 GB  8 days ago
phi3:medium     1e67dff39209    7.9 GB  8 days ago

With this, Phi3 and Llama3 are ready to use.
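Incidentally, if you later want to delete a downloaded model to free up disk space, the ollama rm subcommand removes it (a small sketch):

$ ollama rm llama3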
Send a request via the API
As confirmed with systemctl status, as long as Ollama is running you can send it a question and check the result by running the following command on the same Ubuntu machine. As shown below, you specify the model to use with the model field.
$ curl http://localhost:11434/api/chat -d '{
  "model": "phi3:mini",
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'

When you run the above, the answer comes back one character at a time, as shown below.
$ curl http://localhost:11434/api/chat -d '{
  "model": "phi3:mini",
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'
{"model":"phi3:mini","created_at":"2024-06-07T06:04:50.710600764Z","message":{"role":"assistant","content":" "},"done":false}
{"model":"phi3:mini","created_at":"2024-06-07T06:04:50.835076269Z","message":{"role":"assistant","content":"ã"},"done":false}
{"model":"phi3:mini","created_at":"2024-06-07T06:04:51.152210923Z","message":{"role":"assistant","content":"空"},"done":false}
{"model":"phi3:mini","created_at":"2024-06-07T06:04:51.272242582Z","message":{"role":"assistant","content":"ã"},"done":false}
{"model":"phi3:mini","created_at":"2024-06-07T06:04:51.405790738Z","message":{"role":"assistant","content":"ãš"},"done":false}
# (output omitted)

If you are building a chat-style app this is fine, but if you do not want the answer returned one character at a time, you can set the stream option to false as shown below to get the whole answer back in a single response.
$ curl http://localhost:11434/api/chat -d '{
  "model": "phi3:mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'

Running the above displays the following.
$ curl http://localhost:11434/api/chat -d '{
  "model": "phi3:mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'
{
  "model":"phi3:mini",
  "created_at":"2024-06-07T06:06:36.259965418Z",
  "message":{
    "role":"assistant",
    "content":" æ¥æ¬èªã§ã®åçãšããŠãã空ã¯éãã®ããã倧æ°äžã«é²ãéšæ»Žãå«ãŸããŠããããã§ããããããã®åŸ®ç²åã¯å
ãåå°ãã芳枬è
ã«ãšã£ãŠèŠããªãéè²ãäŒããŸããã\n\næ¥æ¬èªã§ã®ã¹ããããã€ã¹ãããã§èª¬æããå Žå:\n\n1. 空ã¯äœããå«ãŸããŠããããšããéãè²ã«èŠããã®ã§ãã\n2. 空äžã«é²ãæ°Žãå«ãã æ¶²äœãããããã®ååãå
ã®æ³¢é·ãå€åãããŸãã\n3. 倧æ°äžã®ç©è³ªã¯å
ã®æ³¢é·ã埮å°ãªç¯å²ã§é®æããéè²ãèŠããããã«æ£ä¹±ããŸãã\n4. ãã®çŸè±¡ã¯äººéãä»ã®çç©ããã¯çŽæ¥èгå¯ããããšãã§ããªããããéè²ãç¥ãæ¹æ³ãšããŠãå
åŠçãªè§£æãçè«ã¢ãã«ãçšããŸãã"
  },
  "done_reason":"stop",
  "done":true,
  "total_duration":55640212015,
  "load_duration":1168323,
  "prompt_eval_duration":444553000,
  "eval_count":319,
  "eval_duration":55059956000
}

Note that, as things stand, only the Ubuntu machine running Ollama can send requests to this API. To accept requests from other machines, you need to edit the /etc/systemd/system/ollama.service file and apply the change, which is explained next.
Make Ollama accessible from machines other than localhost
Add Environment="OLLAMA_HOST=0.0.0.0" to /etc/systemd/system/ollama.service, as shown below.
[Unit]
Description=Ollama Service
After=network-online.target
 
[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/home/hisui/.nvm/versions/node/v20.9.0/bin:/home/hisui/.local/share/pnpm:/home/hisui/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/go/bin:/usr/local/go/bin"
Environment="OLLAMA_HOST=0.0.0.0"
 
[Install]
WantedBy=default.target
After editing the file as above, run the following commands to apply the change and restart Ollama.
$ sudo systemctl daemon-reload
$ sudo systemctl restart ollama

With this, other machines can now send requests to Ollama as well.
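One way to confirm that the setting took effect is the lsof command that also appears later in this article; after the restart, Ollama should be listening on all interfaces rather than only on localhost. A sketch (the PID and other details will differ on your machine):

$ sudo lsof -i:11434
COMMAND   PID   USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
ollama    ...   ollama  3u  IPv4    ...      0t0  TCP *:11434 (LISTEN)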
For example, if the IP address of the Ubuntu machine running Ollama is 192.168.1.100, running the following curl command from another device on the same network returns an answer.
$ curl http://192.168.1.100:11434/api/chat -d '{
  "model": "phi3:mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ]
}'

If running the above shows an error like the one below instead, Ollama may not be running, or the IP address or port number may be wrong. The connection may also be blocked by firewall settings, so check that as well.

curl: (7) Failed to connect to 192.168.1.100 port 11434 after 26 ms: Couldn't connect to server
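If a firewall turns out to be the cause and you are using Ubuntu's ufw front end, opening the Ollama port looks roughly like this (a sketch assuming ufw is in use; adjust the port number if you changed it):

$ sudo ufw allow 11434/tcp
$ sudo ufw status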
For environment variables other than OLLAMA_HOST, the documentation in the official repository should be a useful reference.
Use a port number other than the default 11434
You can specify a port number other than the default 11434 as follows. The example below specifies 11435.
# (omitted)
Environment="OLLAMA_HOST=0.0.0.0:11435"
# (omitted)

Save the file, apply the change, and restart Ollama as described in the previous section, and it will start on the specified port.
One caveat: if you set the port to something other than the default 11434, ollama list and the other CLI commands stop working as-is. If you try one, an error is shown saying it cannot connect to Ollama.
$ ollama list
Error: could not connect to ollama app, is it running?

If you want to run ollama list or other commands against an Ollama instance running on a different port, specify OLLAMA_HOST as follows.
$ OLLAMA_HOST="127.0.0.1:11435" ollama list
NAME            ID              SIZE    MODIFIED
llama3:latest   365c0bd3c000    4.7 GB  4 hours ago
phi3:mini       64c1188f2485    2.4 GB  8 days ago
phi3:medium     1e67dff39209    7.9 GB  8 days ago

This applies not only to ollama list but to the other commands as well.
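If you do not want to prefix every command, exporting the variable for the current shell session has the same effect (a small sketch):

$ export OLLAMA_HOST="127.0.0.1:11435"
$ ollama list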
About the ollama serve command
As described in the official documentation, you can also start Ollama as a server with the ollama serve command. On Ubuntu, however, Ollama is registered as a service at the point you install it with the official script (which internally runs ollama serve), so you basically do not need to run ollama serve yourself: Ollama is started at install time and keeps running as a service from then on.
If you run ollama serve while the service-managed Ollama is already running, the following error is displayed.
$ ollama serve
Error: listen tcp 0.0.0.0:11434: bind: address already in use

If the port is already in use, you can use the lsof command as follows to check which process is using it.
$ sudo lsof -i:11434
COMMAND     PID   USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
ollama  2094564 ollama    3u  IPv4 890723407      0t0  TCP localhost:11434 (LISTEN)

So, since Ollama basically runs as a service on Ubuntu and other Linux systems, you start and stop it with systemctl as described at the beginning of this article.
If you stop the service-managed Ollama with systemctl stop ollama.service and then run ollama serve, it starts normally and waits for requests, as shown below.
$ ollama serve
2024/06/07 11:41:29 routes.go:1028: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
time=2024-06-07T11:41:29.174+09:00 level=INFO source=images.go:729 msg="total blobs: 5"
time=2024-06-07T11:41:29.176+09:00 level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-07T11:41:29.176+09:00 level=INFO source=routes.go:1074 msg="Listening on 127.0.0.1:11434 (version 0.1.39)"
time=2024-06-07T11:41:29.177+09:00 level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama1580340421/runners
time=2024-06-07T11:41:32.332+09:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 rocm_v60002]"
time=2024-06-07T11:41:32.344+09:00 level=INFO source=types.go:71 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="15.6 GiB" available="479.7 MiB"

If you then try asking a question with the curl command, output like the following should be appended.
[GIN] 2024/06/07 - 11:45:18 | 200 | 17.488916719s |       127.0.0.1 | POST     "/api/generate"
As shown above, you can watch the logs in real time, which is handy for debugging and similar tasks.
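When Ollama is running as a systemd service rather than via ollama serve in a terminal, the same kind of logs should be viewable through the journal. A sketch using standard systemd tooling:

$ sudo journalctl -u ollama -f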
If you want to run it on a different port number, specify OLLAMA_HOST in the same way.

$ OLLAMA_HOST=0.0.0.0:11435 ollama serve

Options that can be specified in the API
The documentation in the official repository describes the stream option and various others, and also lists the endpoints provided by the API.
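For example, besides /api/chat used in this article, the API also provides /api/tags (the API counterpart of ollama list) and /api/generate for single-prompt completions, which is the endpoint that appears in the server log above. Minimal sketches against the same local setup (response fields may differ slightly between versions):

$ curl http://localhost:11434/api/tags

$ curl http://localhost:11434/api/generate -d '{
  "model": "phi3:mini",
  "prompt": "なぜ空はあおいの?",
  "stream": false
}'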
As one example, if you want the same answer every time for the same question, you can specify a seed and set temperature to 0, as shown below, and the same answer will be returned. temperature is the option that controls how creative versus strict the answer is.
$ curl http://192.168.1.100:11434/api/chat -d '{
  "model": "phi3:mini",
  "stream": false,
  "messages": [
    { "role": "user", "content": "ãªã空ã¯ãããã®?" }
  ],
  "options": {
    "seed": 100,
    "temperature": 0
  }
}'
Running the above returns an answer different from the one earlier in this article, but running it again returns the same answer.
{
  "model":"phi3:mini",
  "created_at":"2024-06-07T06:22:43.585726991Z",
  "message":{
    "role":"assistant",
    "content":" 空ãããããããšåŒã°ããçç±ã¯ããã®ç©è³ªãç¹æ§ã«åºã¥ããŠããŸãã空ã¯å€§æ°ãå«ãç¡éåã®ç¶æ
ã§ããã倧æ°äžã«ã¯æ°ŽçŽ é
žåç玠ïŒCO2ïŒãå«ãŸããŠããããã®ã¬ã¹ãå
åæãè¡ãæ€ç©ãæµ·è»ãªã©ããæŸåºãããããã§ãããã®çµæãšããŠãç©ºã¯æ°ŽçŽ é
žåç玠ãå«ãã§ãããããæ°ŽçŽ é
žåç玠ã倧æ°äžã«ååšããããšããããããããšèŠãªãããŸãã\n\nãããã空èªäœããæ°Žãã®äžéšã§ã¯ãªãã空æ°ãå«ãæ¶²äœã§ã¯ãããŸããã空æ°ã¯çªçŽ é
žåç©ïŒNOxïŒãä»ã®ææ©ååãå«ãã§ãããããããæ°Žã«ãã£ãŠå€åãããããç©ºãšæ°Žã¯ç°ãªãæ§è³ªãæã£ãŠããŸãã\n\nãããã£ãŠã空ãããããããšèšãããã®ã¯ããã®å€§æ°äžã«å«ãŸããæ°ŽçŽ é
žåç玠ãšãã®ç©ççãªç¹æ§ããã§ãã"
    },
    "done_reason":"stop",
    "done":true,"
    total_duration":65730704614,
    "load_duration":1461885502,
    "prompt_eval_count":16,
    "prompt_eval_duration":855754000,
    "eval_count":359,
    "eval_duration":63319975000
}

Update Ollama
You can update Ollama by re-running the command you used to install it.
$ curl -fsSL https://ollama.com/install.sh | sh

Running the above displays the following.
$ curl -fsSL https://ollama.com/install.sh | sh
>>> Downloading ollama...
######################################################################## 100.0%##O#-#                   ######################################################################## 100.0%
>>> Installing ollama to /usr/local/bin...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.

With this, the Ollama update is complete and the latest version is installed.
Summary
We walked through how to run Ollama as a server and interact with it through its API. In this article we sent API requests with the curl command, but in practice you can build applications that take advantage of generative AI models such as Llama3 and Phi3 by sending the same API requests from mobile or web apps.
Our company develops business systems and applications that make use of generative AI, including ChatGPT. If you are interested, please feel free to contact us.
Contact us