|
specific event depends solely on the state attained at the previous event. For a more complete
description of these, feel free to consult the <a
  class="text-blue-500"
  target="_blank"
  href="https://en.wikipedia.org/wiki/Markov_chain">Wikipedia article</a
>
on Markov chains. Concretely, Markov chains are used for autocompletion when typing. You can see
|
in the input file, the probability of a certain word appearing right after it. The
<code>big.json</code>
model in the GitHub repository is created from a few books, taken from
<a target="_blank" class="text-blue-500" href="https://www.gutenberg.org/">Project Gutenberg</a
>. In this model, the word "zinc" has a 50% chance of being followed by "and", a 25% chance of
being followed by "Immersion" and a 25% chance of being followed by "A". This is essentially
what a Markov chain is. After parsing the JSON file and storing the information in a data
structure optimized for our needs, the program can autocomplete whole paragraphs from a single
starting word, even if they tend to be nonsensical at times, which is expected, since the model
only suggests the words that most commonly follow the current one.
</p>
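<p>
  To make this concrete, here is a minimal TypeScript sketch of how such a model could be loaded
  and sampled. It assumes <code>big.json</code> maps each word to an object of follower-word
  probabilities; the file layout, function names and parameters below are illustrative
  assumptions, not the repository's actual code.
</p>
<code>
  // Minimal sketch, assuming big.json looks like:
  // { "zinc": { "and": 0.5, "Immersion": 0.25, "A": 0.25 }, ... }
  import { readFileSync } from "node:fs";

  const model = JSON.parse(readFileSync("big.json", "utf8"));

  // Sample the next word from the follower distribution of the current word.
  function nextWord(word: string): string | null {
    const followers = model[word];
    if (!followers) return null;
    let r = Math.random();
    for (const [candidate, probability] of Object.entries(followers)) {
      r -= probability;
      if (r > 0) continue;
      return candidate;
    }
    return null;
  }

  // "Autocomplete" a short passage from a single starting word.
  function complete(start: string, maxWords = 20): string {
    const words = [start];
    let current = start;
    while (words.length !== maxWords) {
      const next = nextWord(current);
      if (next === null) break;
      words.push(next);
      current = next;
    }
    return words.join(" ");
  }

  console.log(complete("zinc"));
</code>
<p>
  Because each step looks only at the current word, the sketch mirrors the Markov property
  described above: the continuation never depends on anything earlier in the sentence.
</p>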
<p>Example prompt using a model based on the Frankenstein book:</p>
<code>
|