
Commit 9fc6064

Indentation fix for Android App README.md
1 parent 02764d0 commit 9fc6064


examples/android_app/README.md

Lines changed: 50 additions & 50 deletions

The demo app defaults to using the agent in the chat.

In the Agent workflow, the chat history, including images, is stored per agent session on the server side. The app does not need to look up chat history unless you are running image reasoning.

* The Llama Stack agent can run multi-turn inference using both custom and built-in tools (excluding the 1B/3B Llama models). Here is an example of creating the agent configuration:
```
val agentConfig =
    AgentConfig.builder()
        .enableSessionPersistence(false)
        .instructions("You are a helpful assistant")
        .maxInferIters(100)
        .model("meta-llama/Llama-3.1-8B-Instruct")
        .samplingParams(
            SamplingParams.builder()
                .strategy(
                    SamplingParams.Strategy.ofGreedySampling()
                )
                .build()
        )
        .toolChoice(AgentConfig.ToolChoice.AUTO)
        .toolPromptFormat(AgentConfig.ToolPromptFormat.JSON)
        .clientTools(
            listOf(
                CustomTools.getCreateCalendarEventTool()
            )
        )
        .build()
```

In this sample snippet:

* We set the maximum number of inference iterations to 100

Once the `agentConfig` is built, create an agent along with its session and turn services, where `client` is the `LlamaStackClientOkHttpClient` you created for remote inference:

```
val agentService = client!!.agents()
val agentCreateResponse = agentService.create(
    AgentCreateParams.builder()
        .agentConfig(agentConfig)
        .build(),
)

val agentId = agentCreateResponse.agentId()
val sessionService = agentService.session()
val agentSessionCreateResponse = sessionService.create(
    AgentSessionCreateParams.builder()
        .agentId(agentId)
        .sessionName("test-session")
        .build()
)

val sessionId = agentSessionCreateResponse.sessionId()
val turnService = agentService.turn()
```

Then you can create a streaming turn event with this turn service for simple inference:
```
turnService.createStreaming(
    AgentTurnCreateParams.builder()
        .agentId(agentId)
        .messages(
            listOf(
                AgentTurnCreateParams.Message.ofUser(
                    UserMessage.builder()
                        .content(InterleavedContent.ofString("What is the capital of France?"))
                        .build()
                )
            )
        )
        .sessionId(sessionId)
        .build()
)
```
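
Because the chat history is kept per agent session on the server, a follow-up turn in the same session can build on earlier context without the app resending previous messages. Here is a minimal sketch that reuses the `turnService`, `agentId`, and `sessionId` from above; the follow-up question is only an illustrative placeholder, not part of the original example.

```
// Second turn in the same session: the server already holds the earlier
// question and answer in this session's history, so only the new message
// needs to be sent. (Illustrative sketch; the question text is a placeholder.)
turnService.createStreaming(
    AgentTurnCreateParams.builder()
        .agentId(agentId)
        .messages(
            listOf(
                AgentTurnCreateParams.Message.ofUser(
                    UserMessage.builder()
                        .content(InterleavedContent.ofString("And what is its population?"))
                        .build()
                )
            )
        )
        .sessionId(sessionId)
        .build()
)
```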

You can find more examples in `ExampleLlamaStackRemoteInference.kt`. Note that the remote agent workflow currently supports only streaming responses.
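
Since the workflow is streaming-only, the turn is consumed as a sequence of events rather than a single response. The snippet below is only a rough sketch of that loop under stated assumptions: `turnParams` stands in for the `AgentTurnCreateParams` built as shown above, and `use`/`asSequence()` presume the returned stream is closeable and iterable. The actual chunk and event types come from the Kotlin client and may differ; check `ExampleLlamaStackRemoteInference.kt` for the exact usage.

```
// Rough sketch only: `turnParams` is the AgentTurnCreateParams built earlier,
// and `use`/`asSequence()` are assumptions about the streaming wrapper,
// not a verified API.
val turnStream = turnService.createStreaming(turnParams)
turnStream.use { stream ->
    stream.asSequence().forEach { chunk ->
        // Each chunk is assumed to carry one agent-turn event (step progress,
        // tool call, or turn completion); render it in the UI as it arrives.
        println(chunk)
    }
}
```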