diff --git a/book/thy.tex b/book/thy.tex
index 265d639..31f6607 100644
--- a/book/thy.tex
+++ b/book/thy.tex
@@ -161,7 +161,7 @@ \section{Probably Approximately Correct Learning}
-at most $5\%$ error.  If this situtation is guaranteed to happen, then
+at most $5\%$ error.  If this situation is guaranteed to happen, then
 this hypothetical learning algorithm is a PAC learning algorithm.  It
 satisfies ``probably'' because it only failed in one out of ten cases,
 and it's ``approximate'' because it achieved low, but non-zero, error
 on the remainder of the cases.
 
 This leads to the formal definition of an $(\ep,\de)$ PAC-learning
@@ -510,7 +510,7 @@ \section{Complexity of Infinite Hypothesis Spaces}
 dimension is the \emph{maximum} number of points for which you can
 always find such a classifier.
 
-\thinkaboutit{What is that labeling?  What is it's name?}
+\thinkaboutit{What is that labeling?  What is its name?}
 
 You can think of VC dimension as a game between you and an adversary.
 To play this game, \emph{you} choose $K$ unlabeled points however you