@@ -18,8 +18,8 @@ running `npm install`.
 
 ## Usage
 
-To use the module and run gptscripts, you need to first set the OPENAI_API_KEY environment variable to your OpenAI API
-key.
+To use the module and run gptscripts, you need to first set the `OPENAI_API_KEY` environment variable to your OpenAI API
+key. You can also set the `GPTSCRIPT_BIN` environment variable to specify the `gptscript` binary used to run them.
 
 To ensure it is working properly, you can run the following command:
 
@@ -31,23 +31,24 @@ You will see "Hello, World!" in the output of the command.
 
 ## Client
 
-There are currently a couple of "global" options, and the client helps to manage those. A client without any options is
-likely what you want. However, here are the current global options:
-
-- `gptscriptURL`: The URL (including `http(s)://`) of an "SDK server" to use instead of the fork/exec model.
-- `gptscriptBin`: The path to a `gptscript` binary to use instead of the bundled one.
+The client allows the caller to run gptscript files, tools, and other operations (see below). There are currently no
+options for this singleton client, so `new gptscript.Client()` is all you need. Although the intention is that a
+single client is all you need for the life of your application, you should call `close()` on the client when you are
+done.
 
 ## Options
 
 These options can be passed to the various `exec` functions.
 None of the options is required, and the defaults will reduce the number of calls made to the Model API.
 
-- `disableCache`: Enable or disable caching, default (true)
-- `cacheDir`: Specify the cache directory
+- `cache`: Enable or disable caching. Default (true).
+- `cacheDir`: Specify the cache directory.
 - `quiet`: No output logging
-- `chdir`: Change current working directory
 - `subTool`: Use tool of this name, not the first tool
+- `input`: Input arguments for the tool run
 - `workspace`: Directory to use for the workspace; if specified, it will not be deleted on exit
+- `chatState`: The chat state to continue, or null to start a new chat and return the state
+- `confirm`: Prompt before running potentially dangerous commands
 
 ## Functions
 
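Several of these options combined in one object might look like the following sketch (the values are illustrative, not defaults, and the tool name is hypothetical):

```javascript
// Illustrative options object using fields from the list above.
// Values here are examples, not defaults.
const opts = {
  cache: false,                          // bypass the cache for this run
  subTool: "summarize",                  // hypothetical tool name
  input: "--file README.md",             // input arguments for the tool run
  workspace: "/tmp/gptscript-workspace", // kept on exit because it is specified
};

// The object would then be passed to one of the exec functions, e.g.:
//   const run = await client.run('./hello.gpt', opts);
console.log(Object.keys(opts)); // [ 'cache', 'subTool', 'input', 'workspace' ]
```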
@@ -64,6 +65,7 @@ async function listTools() {
   const client = new gptscript.Client();
   const tools = await client.listTools();
   console.log(tools);
+  client.close();
 }
 ```
 
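Because `close()` should run even when a call throws, a `try`/`finally` variant of the pattern above may be safer. This is a sketch using a stand-in object (`makeFakeClient` is hypothetical, not part of the SDK); only the `try`/`finally` shape is the point:

```javascript
// Hypothetical stand-in for new gptscript.Client(), used only to
// demonstrate that close() runs even when a call throws.
function makeFakeClient() {
  let closed = false;
  return {
    listTools: () => { throw new Error("boom"); },
    close: () => { closed = true; },
    isClosed: () => closed,
  };
}

const client = makeFakeClient();
try {
  client.listTools();
} catch (e) {
  // handle or log the error
} finally {
  client.close(); // runs whether or not listTools threw
}
console.log(client.isClosed()); // true
```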
@@ -78,12 +80,13 @@ const gptscript = require('@gptscript-ai/gptscript');
 
 async function listModels() {
   let models = [];
+  const client = new gptscript.Client();
   try {
-    const client = new gptscript.Client();
     models = await client.listModels();
   } catch (error) {
     console.error(error);
   }
+  client.close();
 }
 ```
 
@@ -97,12 +100,13 @@ Get the version of the current `gptscript` binary being used for the calls.
 const gptscript = require('@gptscript-ai/gptscript');
 
 async function version() {
+  const client = new gptscript.Client();
   try {
-    const client = new gptscript.Client();
     console.log(await client.version());
   } catch (error) {
     console.error(error);
   }
+  client.close();
 }
 ```
 
@@ -118,13 +122,14 @@ const t = {
   instructions: "Who was the president of the United States in 1928?"
 };
 
+const client = new gptscript.Client();
 try {
-  const client = new gptscript.Client();
-  const run = client.evaluate(t);
+  const run = await client.evaluate(t);
   console.log(await run.text());
 } catch (error) {
   console.error(error);
 }
+client.close();
 ```
 
 ### run
@@ -140,13 +145,14 @@ const opts = {
 };
 
 async function execFile() {
+  const client = new gptscript.Client();
   try {
-    const client = new gptscript.Client();
-    const run = client.run('./hello.gpt', opts);
+    const run = await client.run('./hello.gpt', opts);
     console.log(await run.text());
   } catch (e) {
     console.error(e);
   }
+  client.close();
 }
 ```
 
@@ -156,17 +162,6 @@ The `Run` object exposes event handlers so callers can access the progress event
 
 The `Run` object exposes these events with their corresponding event type:
 
-| Event type                | Event object      |
-|---------------------------|-------------------|
-| RunEventType.RunStart     | RunStartFrame     |
-| RunEventType.RunFinish    | RunFinishFrame    |
-| RunEventType.CallStart    | CallStartFrame    |
-| RunEventType.CallChat     | CallChatFrame     |
-| RunEventType.CallContinue | CallContinueFrame |
-| RunEventType.CallProgress | CallProgressFrame |
-| RunEventType.CallFinish   | CallFinishFrame   |
-| RunEventType.Event        | Frame             |
-
 Subscribing to `RunEventType.Event` gets you all events.
 
 ```javascript
@@ -178,9 +173,9 @@ const opts = {
 };
 
 async function streamExecFileWithEvents() {
+  const client = new gptscript.Client();
   try {
-    const client = new gptscript.Client();
-    const run = client.run('./test.gpt', opts);
+    const run = await client.run('./test.gpt', opts);
 
     run.on(gptscript.RunEventType.Event, data => {
       console.log(`event: ${JSON.stringify(data)}`);
@@ -190,6 +185,45 @@ async function streamExecFileWithEvents() {
   } catch (e) {
     console.error(e);
   }
+  client.close();
+}
+```
+
+### Confirm
+
+If a gptscript can run commands, you may want to inspect and confirm or deny each command before it is run. This can be
+done with the `confirm` method. A user should listen for the `RunEventType.CallConfirm` event.
+
+```javascript
+const gptscript = require('@gptscript-ai/gptscript');
+
+const opts = {
+  disableCache: true,
+  input: "--testin how high is that there mouse?",
+  confirm: true
+};
+
+async function streamExecFileWithEvents() {
+  const client = new gptscript.Client();
+  try {
+    const run = await client.run('./test.gpt', opts);
+
+    run.on(gptscript.RunEventType.CallConfirm, async (data) => {
+      // data.Tool has the information for the command being run.
+      // data.Input has the input for this command.
+
+      await client.confirm({
+        id: data.id,
+        accept: true, // false if the command should not be run
+        message: "", // Explain the denial (ignored if accept is true)
+      });
+    });
+
+    await run.text();
+  } catch (e) {
+    console.error(e);
+  }
+  client.close();
 }
 ```
 
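Denying a command mirrors the accept case in the example above; only `accept` and `message` change. Below is a sketch of the payload shape only (the `id` value is a placeholder for `data.id` from the event):

```javascript
// Sketch of a deny payload for client.confirm(), mirroring the
// example above. In a real handler, id would come from data.id.
const denial = {
  id: "placeholder-id", // placeholder; use data.id from the CallConfirm event
  accept: false,        // the command will not be run
  message: "this tool is not allowed to run commands", // reason for the denial
};
console.log(denial.accept); // false
```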
@@ -219,7 +253,7 @@ const t = {
 
 async function streamExecFileWithEvents() {
   const client = new gptscript.Client();
-  let run = client.evaluate(t, opts);
+  let run = await client.evaluate(t, opts);
   try {
     // Wait for the initial run to complete.
     await run.text();
@@ -238,6 +272,7 @@ async function streamExecFileWithEvents() {
     console.error(e);
   }
 
+  client.close();
 
   // The state here should either be RunState.Finished (on success) or RunState.Error (on error).
   console.log(run.state)