
Commit 1e749bf

committed: demos
1 parent 6605a32 commit 1e749bf

22 files changed, +1666 -4907 lines changed

demos/_is_valid_model.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ A [demo](../README.md#demos) of [Ollama Bash Lib](https://github.com/attogram/ol
 | test | output | return | result |
 |------|--------|--------|--------|
 | `mistral:7b` | mistral:7b | 0 | ✅ PASS |
-| `` | hf.co/bartowski/Ministral-8B-Instruct-2410-GGUF:IQ4_XS | 0 | ✅ PASS |
+| `` | smollm2:1.7b | 0 | ✅ PASS |
 | `hf.co/user/Model-name:QUANT` | hf.co/user/Model-name:QUANT | 0 | ✅ PASS |
 | `abcdefghijklmnopqrstuvwxyz` | abcdefghijklmnopqrstuvwxyz | 0 | ✅ PASS |
 | `1234567890` | 1234567890 | 0 | ✅ PASS |
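The table exercises `_is_valid_model` with a mix of plain tags and hf.co-style names. For orientation only, a regex-based check along the same lines could look like the sketch below; this is a hypothetical stand-in, not the library's actual `_is_valid_model`, and the allowed-character set is an assumption.

```bash
#!/usr/bin/env bash
# Hypothetical sketch -- NOT the library's _is_valid_model implementation.
# Assumption: a valid model name uses only letters, digits, and . _ / : -
# (covers "mistral:7b" and "hf.co/user/Model-name:QUANT" style names).
is_valid_model_sketch() {
  local model="$1"
  [[ -n "$model" && "$model" =~ ^[A-Za-z0-9._/:-]+$ ]]
}

for m in "mistral:7b" "hf.co/user/Model-name:QUANT" "bad name; rm -rf /"; do
  is_valid_model_sketch "$m" && echo "PASS: $m" || echo "FAIL: $m"
done
```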

demos/ollama_api_get.md

Lines changed: 11 additions & 11 deletions
@@ -4,7 +4,7 @@ A [demo](../README.md#demos) of [Ollama Bash Lib](https://github.com/attogram/ol
 
 ## Setup
 
-OLLAMA_HOST:
+OLLAMA_HOST: http://localhost:11434
 
 OBL_API: http://localhost:11434
 
@@ -34,23 +34,23 @@ result: output: {"version":"0.11.6"}
 ```
 ollama_api_get
 
-[DEBUG] 15:32:00:608428500: ollama_api_get: []
-[DEBUG] 15:32:00:631006800: _call_curl: [GET] []
-[DEBUG] 15:32:00:652850600: _call_curl: OBL_API: http://localhost:11434
-[DEBUG] 15:32:00:674555600: _call_curl: args: -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X GET http://localhost:11434
-[DEBUG] 15:32:00:958313500: ollama_api_get: success
+[DEBUG] 19:56:03:708919800: ollama_api_get: []
+[DEBUG] 19:56:03:745496300: _call_curl: [GET] []
+[DEBUG] 19:56:03:778170400: _call_curl: OBL_API: http://localhost:11434
+[DEBUG] 19:56:03:797803100: _call_curl: args: -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X GET http://localhost:11434
+[DEBUG] 19:56:04:092320200: ollama_api_get: success
 result: lines: 1
 result: output: Ollama is running
 ```
 
 ```
 ollama_api_get -P "/api/version"
 
-[DEBUG] 15:32:01:044775400: ollama_api_get: [/api/version]
-[DEBUG] 15:32:01:066980800: _call_curl: [GET] [/api/version]
-[DEBUG] 15:32:01:101754900: _call_curl: OBL_API: http://localhost:11434
-[DEBUG] 15:32:01:123782300: _call_curl: args: -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X GET http://localhost:11434/api/version
-[DEBUG] 15:32:01:424351200: ollama_api_get: success
+[DEBUG] 19:56:04:158451900: ollama_api_get: [/api/version]
+[DEBUG] 19:56:04:179528200: _call_curl: [GET] [/api/version]
+[DEBUG] 19:56:04:201419000: _call_curl: OBL_API: http://localhost:11434
+[DEBUG] 19:56:04:221040400: _call_curl: args: -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X GET http://localhost:11434/api/version
+[DEBUG] 19:56:04:492848700: ollama_api_get: success
 result: lines: 1
 result: output: {"version":"0.11.6"}
 ```
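The debug lines above show the exact curl arguments `_call_curl` assembles, so the same GET requests can be replayed by hand. A minimal sketch, assuming the server from this demo at http://localhost:11434 and leaving out the library's `-w HTTP_CODE_DELIMITER%{http_code}` status capture:

```bash
# Replay the two GETs from the demo with plain curl (sketch based on the logged args).
OBL_API="http://localhost:11434"

# Root endpoint -> "Ollama is running"
curl -s -N --max-time 300 -H "Content-Type: application/json" -X GET "$OBL_API"

# Version endpoint -> {"version":"0.11.6"}
curl -s -N --max-time 300 -H "Content-Type: application/json" -X GET "$OBL_API/api/version"
```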

demos/ollama_app_vars.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ OLLAMA_CONTEXT_LENGTH : # Context length to use unless otherwise specified (d
 OLLAMA_DEBUG : # Show additional debug information (e.g. OLLAMA_DEBUG=1)
 OLLAMA_FLASH_ATTENTION : # Enabled flash attention
 OLLAMA_GPU_OVERHEAD : # Reserve a portion of VRAM per GPU (bytes)
-OLLAMA_HOST : # IP Address for the ollama server (default 127.0.0.1:11434)
+OLLAMA_HOST : http://localhost:11434 # IP Address for the ollama server (default 127.0.0.1:11434)
 OLLAMA_INTEL_GPU : # Enable experimental Intel GPU detection
 OLLAMA_KEEP_ALIVE : # The duration that models stay loaded in memory (default "5m")
 OLLAMA_KV_CACHE_TYPE : # Quantization type for the K/V cache (default: f16)
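OLLAMA_HOST tells Ollama clients where the server is listening; exporting it before running the demos reproduces the Setup values shown in the other demo files. A small sketch (the values are illustrative, not part of this commit):

```bash
# Point clients (and this library) at a specific Ollama server.
# Default is 127.0.0.1:11434, as noted in the listing above.
export OLLAMA_HOST="http://localhost:11434"

# Quick check that the configured host answers.
curl -s "$OLLAMA_HOST/api/version"
```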

demos/ollama_chat.md

Lines changed: 8 additions & 64 deletions
Large diffs are not rendered by default.

demos/ollama_generate.md

Lines changed: 21 additions & 21 deletions
@@ -9,7 +9,7 @@ A [demo](../README.md#demos) of [Ollama Bash Lib](https://github.com/attogram/ol
 ollama_generate -m "$model" -p "$prompt"
 ```
 ```
-Cute, fluffy, and fast-moving.
+Cute, fluffy, adventurous.
 ```
 
 ## Demo Debug
@@ -19,24 +19,24 @@ Cute, fluffy, and fast-moving.
 OBL_DEBUG=1 ollama_generate -m "$model" -p "$prompt"
 ```
 ```
-[DEBUG] 15:32:35:429171900: ollama_generate: [-m] [dolphin3:8b] [-p] [Describe a rabbit in 3 words]
-[DEBUG] 15:32:35:468262000: ollama_generate: init: model: [] prompt: []
-[DEBUG] 15:32:35:489570300: ollama_generate: while getopts: OPTARG: [dolphin3:8b] opt: [m]
-[DEBUG] 15:32:35:511840500: ollama_generate: while getopts: OPTARG: [Describe a rabbit in 3 words] opt: [p]
-[DEBUG] 15:32:35:537158600: ollama_generate: final: model: [dolphin3:8b] prompt: [Describe a rabbit in 3 words]
-[DEBUG] 15:32:35:559616500: ollama_generate: checking: model: [dolphin3:8b]
-[DEBUG] 15:32:35:583955100: ollama_generate: checked: model: [dolphin3:8b]
-[DEBUG] 15:32:35:644889900: ollama_generate_json: [dolphin3:8b] [Describe a rabbit in 3 words]
-[DEBUG] 15:32:35:684773200: ollama_generate_json: json_payload: {"model":"dolphin3:8b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:35:722860400: ollama_api_post: [/api/generate] {"model":"dolphin3:8b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:35:757659800: _call_curl: [POST] [/api/generate] {"model":"dolphin3:8b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:35:818978600: _call_curl: OBL_API: http://localhost:11434
-[DEBUG] 15:32:35:844650000: _call_curl: json_body: {"model":"dolphin3:8b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:35:866758600: _call_curl: piping json_body | curl -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X POST http://localhost:11434/api/generate -d @-
-[DEBUG] 15:32:36:424820300: ollama_api_post: success
-[DEBUG] 15:32:36:448715200: ollama_generate_json: success
-[DEBUG] 15:32:36:516192000: ollama_generate: result: 480 bytes: {"model":"dolphin3:8b","created_at":"2025-08-23T13:32:36.3944121Z","response":"Fluffy, Ears, Hopper.","done":true,"done_
-[DEBUG] 15:32:36:597235300: ollama_generate: thinking: off
-Fluffy, Ears, Hopper.
-[DEBUG] 15:32:36:676621700: ollama_generate: success
+[DEBUG] 19:56:15:657322000: ollama_generate: [-m] [smollm2:1.7b] [-p] [Describe a rabbit in 3 words]
+[DEBUG] 19:56:15:727768300: ollama_generate: init: model: [] prompt: []
+[DEBUG] 19:56:15:748958400: ollama_generate: while getopts: OPTARG: [smollm2:1.7b] opt: [m]
+[DEBUG] 19:56:15:771258600: ollama_generate: while getopts: OPTARG: [Describe a rabbit in 3 words] opt: [p]
+[DEBUG] 19:56:15:791877300: ollama_generate: final: model: [smollm2:1.7b] prompt: [Describe a rabbit in 3 words]
+[DEBUG] 19:56:15:822072000: ollama_generate: checking: model: [smollm2:1.7b]
+[DEBUG] 19:56:15:843737300: ollama_generate: checked: model: [smollm2:1.7b]
+[DEBUG] 19:56:15:889406100: ollama_generate_json: [smollm2:1.7b] [Describe a rabbit in 3 words]
+[DEBUG] 19:56:15:927197800: ollama_generate_json: json_payload: {"model":"smollm2:1.7b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:15:977858600: ollama_api_post: [/api/generate] {"model":"smollm2:1.7b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:16:000354200: _call_curl: [POST] [/api/generate] {"model":"smollm2:1.7b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:16:040814100: _call_curl: OBL_API: http://localhost:11434
+[DEBUG] 19:56:16:064366400: _call_curl: json_body: {"model":"smollm2:1.7b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:16:085589700: _call_curl: piping json_body | curl -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X POST http://localhost:11434/api/generate -d @-
+[DEBUG] 19:56:16:469116000: ollama_api_post: success
+[DEBUG] 19:56:16:498336000: ollama_generate_json: success
+[DEBUG] 19:56:16:561138500: ollama_generate: result: 497 bytes: {"model":"smollm2:1.7b","created_at":"2025-08-23T17:56:16.4462811Z","response":"Plush: fluffy, bunny, cute.","done":true
+[DEBUG] 19:56:16:637826500: ollama_generate: thinking: off
+Plush: fluffy, bunny, cute.
+[DEBUG] 19:56:16:732610200: ollama_generate: success
 ```
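The trace shows `ollama_generate` building a JSON payload and piping it into curl against `/api/generate`. The same request can be issued directly; the sketch below assumes jq is available for safe JSON quoting (the library's own payload construction may differ):

```bash
# Replay the logged /api/generate POST by hand (sketch based on the debug trace above).
model="smollm2:1.7b"
prompt="Describe a rabbit in 3 words"

# jq handles JSON escaping of the prompt; the body matches the logged json_payload shape.
json_body=$(jq -cn --arg m "$model" --arg p "$prompt" \
  '{model: $m, prompt: $p, stream: false, thinking: false}')

printf '%s' "$json_body" | curl -s -N --max-time 300 \
  -H "Content-Type: application/json" \
  -X POST "http://localhost:11434/api/generate" -d @-
```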

demos/ollama_generate_json.md

Lines changed: 147 additions & 77 deletions
@@ -4,113 +4,183 @@ A [demo](../README.md#demos) of [Ollama Bash Lib](https://github.com/attogram/ol
 
 ## Setup
 
-OLLAMA_HOST:
+OLLAMA_HOST: http://localhost:11434
 OBL_API: http://localhost:11434
 
 
 ## Demo
 
 ```bash
-ollama_generate_json -m "gemma3n:e2b" -p "Describe a rabbit in 3 words"
+ollama_generate_json -m "granite3.3:2b" -p "Describe a rabbit in 3 words"
 ```
 ```json
 {
-"model": "gemma3n:e2b",
-"created_at": "2025-08-23T13:32:42.2608687Z",
-"response": "Cute, fluffy, quick. \n",
+"model": "granite3.3:2b",
+"created_at": "2025-08-23T17:56:18.8180891Z",
+"response": "Fast, Fluffy, Burrowing.",
 "done": true,
 "done_reason": "stop",
 "context": [
-105,
-2364,
-107,
-82858,
+49152,
+2946,
+49153,
+39558,
+390,
+17071,
+2821,
+44,
+30468,
+225,
+36,
+34,
+36,
+38,
+32,
+203,
+4282,
+884,
+8080,
+278,
+659,
+30,
+18909,
+810,
+25697,
+32,
+2448,
+884,
+312,
+17247,
+19551,
+47330,
+32,
+0,
+203,
+49152,
 496,
-27973,
-528,
-236743,
-236800,
-4171,
-106,
-107,
-105,
-4368,
+49153,
+8591,
+312,
+40810,
+328,
+225,
+37,
+8153,
+0,
+203,
+49152,
+17594,
+49153,
+12200,
+30,
+5449,
+2966,
 107,
-55973,
-236764,
-7745,
-236760,
-35411,
-236764,
-3823,
-236761,
-236743,
-107
+30,
+34630,
+643,
+299,
+32
 ],
-"total_duration": 4646174500,
-"load_duration": 4275268700,
-"prompt_eval_count": 16,
-"prompt_eval_duration": 221857900,
-"eval_count": 9,
-"eval_duration": 146994200
+"total_duration": 1203194400,
+"load_duration": 985779500,
+"prompt_eval_count": 50,
+"prompt_eval_duration": 117790400,
+"eval_count": 11,
+"eval_duration": 97787200
 }
 
 ```
 
 ## Demo Debug
 
 ```bash
-OBL_DEBUG=1 ollama_generate_json -m "gemma3n:e2b" -p "Describe a rabbit in 3 words"
+OBL_DEBUG=1 ollama_generate_json -m "granite3.3:2b" -p "Describe a rabbit in 3 words"
 ```
 ```json
-[DEBUG] 15:32:42:326972100: ollama_generate_json: [gemma3n:e2b] [Describe a rabbit in 3 words]
-[DEBUG] 15:32:42:375982400: ollama_generate_json: json_payload: {"model":"gemma3n:e2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:42:422972900: ollama_api_post: [/api/generate] {"model":"gemma3n:e2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:42:446332400: _call_curl: [POST] [/api/generate] {"model":"gemma3n:e2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:42:487397500: _call_curl: OBL_API: http://localhost:11434
-[DEBUG] 15:32:42:515520300: _call_curl: json_body: {"model":"gemma3n:e2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
-[DEBUG] 15:32:42:542426000: _call_curl: piping json_body | curl -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X POST http://localhost:11434/api/generate -d @-
-[DEBUG] 15:32:43:073363100: ollama_api_post: success
-[DEBUG] 15:32:43:095487700: ollama_generate_json: success
+[DEBUG] 19:56:18:890341100: ollama_generate_json: [granite3.3:2b] [Describe a rabbit in 3 words]
+[DEBUG] 19:56:18:941769400: ollama_generate_json: json_payload: {"model":"granite3.3:2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:18:980143500: ollama_api_post: [/api/generate] {"model":"granite3.3:2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:19:000915400: _call_curl: [POST] [/api/generate] {"model":"granite3.3:2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:19:045773400: _call_curl: OBL_API: http://localhost:11434
+[DEBUG] 19:56:19:068263600: _call_curl: json_body: {"model":"granite3.3:2b","prompt":"Describe a rabbit in 3 words","stream":false,"thinking":false}
+[DEBUG] 19:56:19:089336900: _call_curl: piping json_body | curl -s -N --max-time 300 -H Content-Type: application/json -w HTTP_CODE_DELIMITER%{http_code} -X POST http://localhost:11434/api/generate -d @-
+[DEBUG] 19:56:19:479428700: ollama_api_post: success
+[DEBUG] 19:56:19:506919200: ollama_generate_json: success
 {
-"model": "gemma3n:e2b",
-"created_at": "2025-08-23T13:32:43.0422504Z",
-"response": "Cute, fluffy, hopping. \n",
+"model": "granite3.3:2b",
+"created_at": "2025-08-23T17:56:19.4470972Z",
+"response": "Fast, Furious, Breeding.",
 "done": true,
 "done_reason": "stop",
 "context": [
-105,
-2364,
-107,
-82858,
+49152,
+2946,
+49153,
+39558,
+390,
+17071,
+2821,
+44,
+30468,
+225,
+36,
+34,
+36,
+38,
+32,
+203,
+4282,
+884,
+8080,
+278,
+659,
+30,
+18909,
+810,
+25697,
+32,
+2448,
+884,
+312,
+17247,
+19551,
+47330,
+32,
+0,
+203,
+49152,
 496,
-27973,
-528,
-236743,
-236800,
-4171,
-106,
-107,
-105,
-4368,
-107,
-55973,
-236764,
-7745,
-236760,
-35411,
-236764,
-85834,
-236761,
-236743,
-107
+49153,
+8591,
+312,
+40810,
+328,
+225,
+37,
+8153,
+0,
+203,
+49152,
+17594,
+49153,
+12200,
+30,
+506,
+305,
+2406,
+30,
+551,
+18749,
+299,
+32
 ],
-"total_duration": 233288600,
-"load_duration": 104823100,
-"prompt_eval_count": 16,
-"prompt_eval_duration": 16844400,
-"eval_count": 9,
-"eval_duration": 111100900
+"total_duration": 123293000,
+"load_duration": 14684500,
+"prompt_eval_count": 50,
+"prompt_eval_duration": 1085800,
+"eval_count": 11,
+"eval_duration": 107522700
 }
 
 ```