
Commit 535e91e

Maintenance: fix typo

1 parent 8f1ba68 commit 535e91e

File tree

3 files changed: +12 -12 lines changed


Problems/x_batch_normalization/learn.md

Lines changed: 3 additions & 3 deletions

@@ -15,11 +15,11 @@ The process of Batch Normalization consists of the following steps:

### Structure of Batch Normalization for BCHW Input

-For an input tensor with the shape **BCHW** (where:
+For an input tensor with the shape **BCHW**, where:
- **B**: batch size,
- **C**: number of channels,
- **H**: height,
-- **W**: width),
+- **W**: width,
the Batch Normalization process operates on specific dimensions based on the task's requirement.

#### 1. Mean and Variance Calculation

@@ -46,7 +46,7 @@ The mean and variance are computed **over all spatial positions (H, W)** and **a

#### 2. Normalization

-Once the mean $\mu_c$ and variance $\sigma_c^2$ have been computed for each channel, the next step is to **normalize** the input. The normalization is done by subtracting the mean and dividing by the standard deviation (square root of the variance, plus a small constant $\epsilon$ for numerical stability):
+Once the mean $\mu_c$ and variance $\sigma_c^2$ have been computed for each channel, the next step is to **normalize** the input. The normalization is done by subtracting the mean and dividing by the standard deviation (plus a small constant $\epsilon$ for numerical stability):

$$
\hat{x}_{i,c,h,w} = \frac{x_{i,c,h,w} - \mu_c}{\sqrt{\sigma_c^2 + \epsilon}}
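
For reference, the per-channel statistics and the normalization formula in the hunk above can be written as a minimal NumPy sketch; the function name, `eps` value, and tensor shapes below are illustrative and not part of the repository:

```python
import numpy as np

def batch_norm_bchw(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """Per-channel batch normalization of a BCHW tensor, as in the formula above."""
    # Mean and variance over the batch and spatial dimensions (B, H, W), one value per channel.
    mu = x.mean(axis=(0, 2, 3), keepdims=True)    # shape (1, C, 1, 1)
    var = x.var(axis=(0, 2, 3), keepdims=True)    # shape (1, C, 1, 1)
    # x_hat = (x - mu_c) / sqrt(sigma_c^2 + eps)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.randn(4, 3, 8, 8)                   # B=4, C=3, H=8, W=8
x_hat = batch_norm_bchw(x)
print(x_hat.mean(axis=(0, 2, 3)))                 # per-channel means, all close to 0
print(x_hat.std(axis=(0, 2, 3)))                  # per-channel stds, all close to 1
```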

Problems/x_group_normalization/learn.md

Lines changed: 7 additions & 7 deletions

@@ -15,25 +15,25 @@ The process of Group Normalization consists of the following steps:

### Structure of Group Normalization for BCHW Input

-For an input tensor with the shape **BCHW** (where:
+For an input tensor with the shape **BCHW** , where:
- **B**: batch size,
- **C**: number of channels,
- **H**: height,
-- **W**: width),
+- **W**: width,
the Group Normalization process operates on specific dimensions based on the task's requirement.

#### 1. Group Division

- The input feature dimension **C** (channels) is divided into several groups. The number of groups is determined by the **n_groups** parameter, and the size of each group is calculated as:

$$
-\text{group\_size} = \frac{C}{n_{\text{groups}}}
+\text{groupSize} = \frac{C}{n_{\text{groups}}}
$$

Where:
- **C** is the number of channels.
- **n_groups** is the number of groups into which the channels are divided.
-- **group_size** is the number of channels in each group.
+- **groupSize** is the number of channels in each group.

The input tensor is then reshaped to group the channels into the specified groups.
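
As a small aside on the group-division step touched in the hunk above, a NumPy sketch of the channel split and reshape; the values of B, C, H, W, and n_groups are made up for illustration:

```python
import numpy as np

B, C, H, W = 2, 8, 4, 4
n_groups = 4
group_size = C // n_groups            # groupSize = C / n_groups = 8 / 4 = 2

x = np.random.randn(B, C, H, W)
# Reshape so the channels are grouped: (B, C, H, W) -> (B, n_groups, group_size, H, W)
x_grouped = x.reshape(B, n_groups, group_size, H, W)
print(x_grouped.shape)                # (2, 4, 2, 4, 4)
```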

@@ -42,18 +42,18 @@ the Group Normalization process operates on specific dimensions based on the tas

- For each group, the **mean** $\mu_g$ and **variance** $\sigma_g^2$ are computed over the spatial dimensions and across the batch. This normalization helps to stabilize the activations within each group.

$$
-\mu_g = \frac{1}{B \cdot H \cdot W \cdot \text{group\_size}} \sum_{i=1}^{B} \sum_{h=1}^{H} \sum_{w=1}^{W} \sum_{g=1}^{\text{group\_size}} x_{i,g,h,w}
+\mu_g = \frac{1}{B \cdot H \cdot W \cdot \text{groupSize}} \sum_{i=1}^{B} \sum_{h=1}^{H} \sum_{w=1}^{W} \sum_{g=1}^{\text{groupSize}} x_{i,g,h,w}
$$

$$
-\sigma_g^2 = \frac{1}{B \cdot H \cdot W \cdot \text{group\_size}} \sum_{i=1}^{B} \sum_{h=1}^{H} \sum_{w=1}^{W} \sum_{g=1}^{\text{group\_size}} (x_{i,g,h,w} - \mu_g)^2
+\sigma_g^2 = \frac{1}{B \cdot H \cdot W \cdot \text{groupSize}} \sum_{i=1}^{B} \sum_{h=1}^{H} \sum_{w=1}^{W} \sum_{g=1}^{\text{groupSize}} (x_{i,g,h,w} - \mu_g)^2
$$

Where:
- $x_{i,g,h,w}$ is the activation at batch index $i$, group index $g$, height $h$, and width $w$.
- $B$ is the batch size.
- $H$ and $W$ are the spatial dimensions (height and width).
-- $\text{group_size}$ is the number of channels in each group.
+- $\text{groupSize}$ is the number of channels in each group.

#### 3. Normalization
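
A minimal NumPy sketch that follows the formulas as written in the hunk above (they average across the batch as well as the spatial dimensions and the channels within each group); the function name and the `eps` term, which anticipates the normalization step, are illustrative assumptions:

```python
import numpy as np

def group_norm_bchw(x: np.ndarray, n_groups: int, eps: float = 1e-5) -> np.ndarray:
    """Group statistics per the formulas above: one mean/variance per group,
    taken over B, H, W and the channels inside the group, then normalization."""
    B, C, H, W = x.shape
    group_size = C // n_groups
    xg = x.reshape(B, n_groups, group_size, H, W)
    # One mean and variance per group, averaged over batch, group channels, and spatial dims.
    mu = xg.mean(axis=(0, 2, 3, 4), keepdims=True)   # shape (1, n_groups, 1, 1, 1)
    var = xg.var(axis=(0, 2, 3, 4), keepdims=True)
    x_hat = (xg - mu) / np.sqrt(var + eps)
    return x_hat.reshape(B, C, H, W)

x = np.random.randn(2, 8, 4, 4)                      # B=2, C=8, H=4, W=4
print(group_norm_bchw(x, n_groups=4).shape)          # (2, 8, 4, 4)
```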

Problems/x_instance_normalization/learn.md

Lines changed: 2 additions & 2 deletions

@@ -14,11 +14,11 @@ The process of Instance Normalization consists of the following steps:

### Structure of Instance Normalization for BCHW Input

-For an input tensor with the shape **BCHW** (where:
+For an input tensor with the shape **BCHW** , where:
- **B**: batch size,
- **C**: number of channels,
- **H**: height,
-- **W**: width),
+- **W**: width,
Instance Normalization operates on the spatial dimensions (height and width) of each instance (image) separately.

#### 1. Mean and Variance Calculation for Each Instance
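
The diff above only reaches the structure section, so here is a minimal NumPy sketch of the per-instance statistics it leads into, assuming (as is standard for Instance Normalization) that the mean and variance are taken over H and W separately for every sample and channel; the function name and `eps` value are illustrative:

```python
import numpy as np

def instance_norm_bchw(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """Normalize each (sample, channel) slice of a BCHW tensor over its spatial dims."""
    # Statistics over H and W only, computed separately for every sample and channel.
    mu = x.mean(axis=(2, 3), keepdims=True)           # shape (B, C, 1, 1)
    var = x.var(axis=(2, 3), keepdims=True)           # shape (B, C, 1, 1)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.randn(4, 3, 8, 8)                       # B=4, C=3, H=8, W=8
y = instance_norm_bchw(x)
print(np.allclose(y.mean(axis=(2, 3)), 0.0, atol=1e-6))  # True: every slice has mean ~ 0
```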
