blazor/ai-assistview/speech/speech-to-text.md (20 additions, 1 deletion)
@@ -23,7 +23,18 @@ Before integrating `Speech-to-Text`, ensure the following:
## Configure Speech-to-Text
-To enable Speech-to-Text functionality, modify the `Home.razor` file to incorporate the Web Speech API. The [SpeechToText](https://blazor.syncfusion.com/documentation/speech-to-text/getting-started-web-app) component listens for microphone input, transcribes spoken words, and updates the AI AssistView's editable footer with the transcribed text. The transcribed text is then sent as a prompt to the Azure OpenAI service via the AI AssistView component.
+To enable Speech-to-Text functionality in the Blazor AI AssistView component, update the `Home.razor` file to incorporate the Web Speech API.
+
+The [SpeechToText](https://blazor.syncfusion.com/documentation/speech-to-text/getting-started-web-app) component listens to audio input from the device’s microphone, transcribes spoken words into text, and updates the AI AssistView’s editable footer with the recognized text. Once the transcription appears in the footer, users can send it as a message to others.
+
+### Configuration Options
+
+* **[`Language`](https://help.syncfusion.com/cr/blazor/Syncfusion.Blazor.Inputs.SfSpeechToText.html#Syncfusion_Blazor_Inputs_SfSpeechToText_Language)**: Specifies the language for speech recognition. For example:
+
+  * `en-US` for American English
+  * `fr-FR` for French
+
+* **[`AllowInterimResults`](https://help.syncfusion.com/cr/blazor/Syncfusion.Blazor.Inputs.SfSpeechToText.html#Syncfusion_Blazor_Inputs_SfSpeechToText_AllowInterimResults)**: Set to `true` to receive real-time (interim) recognition results, or `false` to receive only final results.

The `speechtotext.js` file handles operations related to the content of the editable footer, such as checking for meaningful input, clearing existing text, and updating the content with the transcribed value. Meanwhile, the `speechtotext.css` file styles the AI AssistView layout and ensures the component remains responsive across different screen sizes and devices.
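For reference (this sketch is not part of the diff above), the two options described in the new Configuration Options list might be set on the component in `Home.razor` roughly as shown below. The markup is an assumption based on the parameter names linked above, the values are illustrative, and anything not shown is left at its defaults:

```razor
@* Minimal sketch: SfSpeechToText from Syncfusion.Blazor.Inputs with the two
   options documented above. Values are examples, not requirements. *@
@using Syncfusion.Blazor.Inputs

<SfSpeechToText Language="en-US" AllowInterimResults="true" />
```

With `AllowInterimResults` enabled, partial phrases can be shown while the user is still speaking; set it to `false` if only the final transcript should reach the editable footer.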
@@ -268,6 +279,14 @@ function updateContentEditableDiv(element, value) {
+
+## Error Handling
+
+The `SpeechToText` component provides events to handle errors that may occur during speech recognition. For more information, refer to the [Error Handling](https://blazor.syncfusion.com/documentation/speech-to-text/speech-recognition#error-handling) section in the documentation.
+
+## Browser Compatibility
+
+The `SpeechToText` component relies on the [Speech Recognition API](https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition), which has limited browser support. Refer to the [Browser Compatibility](https://blazor.syncfusion.com/documentation/speech-to-text/speech-recognition#browser-support) section for detailed information.
blazor/chat-ui/speech-to-text.md (6 additions, 14 deletions)
@@ -11,21 +11,9 @@ documentation: ug
The Syncfusion Blazor Chat UI component integrates `Speech-to-Text` functionality through the browser's [Web Speech API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Speech_API). This enables the conversion of spoken words into text using the device's microphone, allowing users to interact with the Chat UI through voice input.
-## Prerequisites
-
-Before integrating `Speech-to-Text`, ensure the following:
-* **Syncfusion Speech to Text**: Package [Syncfusion.Blazor.Inputs](https://www.nuget.org/packages/Syncfusion.Blazor.Inputs) installed.
-
-## Set Up the Chat UI Component
-
-Follow the Syncfusion Chat UI [Getting Started](./getting-started) guide to configure and render the Chat UI component in your Blazor application.
-
## Configure Speech-to-Text

-To enable Speech-to-Text functionality in the Angular Chat UI component, update the `Home.razor` file to incorporate the Web Speech API.
+To enable Speech-to-Text functionality in the Blazor Chat UI component, update the `Home.razor` file to incorporate the Web Speech API.

The [SpeechToText](https://blazor.syncfusion.com/documentation/speech-to-text/getting-started-web-app) component listens to audio input from the device’s microphone, transcribes spoken words into text, and updates the Chat UI’s editable footer with the recognized text. Once the transcription appears in the footer, users can send it as a message to others.
@@ -220,4 +208,8 @@ The `SpeechToText` component provides events to handle errors that may occur dur
## Browser Compatibility
-The `SpeechToText` component relies on the [Speech Recognition API](https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition), which has limited browser support. Refer to the [Browser Compatibility](https://blazor.syncfusion.com/documentation/speech-to-text/speech-recognition#browser-support) section for detailed information.
+The `SpeechToText` component relies on the [Speech Recognition API](https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition), which has limited browser support. Refer to the [Browser Compatibility](https://blazor.syncfusion.com/documentation/speech-to-text/speech-recognition#browser-support) section for detailed information.