2 changes: 2 additions & 0 deletions docs/tutorials/poly.rst
@@ -43,6 +43,7 @@ Example: Wiring a Square Function
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Let's say we want to compute :math:`y = x \otimes x` (the tensor product of :math:`x` with itself).

* **The Math (STP)**: Requires two input operands (Left, Right) and produces one output.
* **The Circuit (SP)**: Has only *one* global input (:math:`x`). We need to wire this single input to *both* the Left and Right operands of the STP.
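The wiring described above can be sketched in plain NumPy (an illustration of the math only, not the library's API): the operation itself takes two operands, while the circuit feeds its single global input into both slots.

```python
import numpy as np

# The math (STP): a function of two operands, Left and Right.
def tensor_product(left, right):
    return np.einsum("i,j->ij", left, right)

# The circuit (SP): only one global input x exists...
x = np.array([1.0, 2.0, 3.0])

# ...so we wire that single input to BOTH operand slots.
y = tensor_product(x, x)   # y[i, j] = x[i] * x[j]
```

This is the "fan-out" step: one wire, two operand positions.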

@@ -151,6 +152,7 @@ This happens when every operand is made of segments that are:

**Why does this matter?**
If your data is "Uniform 1D", it fits into regular tensors. This means we don't need slow sparse lookups; we can use highly optimized code:

* **Vectorization**: Using ``vmap`` in JAX or PyTorch.
* **CUDA Kernels**: We provide specialized GPU kernels for this case that are very fast.
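To make the vectorization point concrete, here is a minimal JAX sketch (shapes and the `square_row` kernel are assumptions for illustration): because uniform 1D segments pack into a regular `(batch, segment)` array, a per-row kernel can be lifted over the batch axis with ``vmap`` instead of doing per-segment sparse lookups.

```python
import jax
import jax.numpy as jnp

# Hypothetical per-segment kernel: square one contiguous 1D segment.
def square_row(x):
    return jnp.outer(x, x).reshape(-1)

# Uniform 1D data fits a dense (batch, segment_size) tensor,
# so vmap vectorizes the kernel over the batch axis for free.
batched_square = jax.vmap(square_row)

xs = jnp.arange(6.0).reshape(2, 3)   # two uniform segments of size 3
ys = batched_square(xs)              # dense output, shape (2, 9)
```

The same dense layout is what makes the specialized CUDA kernels possible: every thread block sees segments of identical size and stride.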

1 change: 1 addition & 0 deletions docs/tutorials/stp.rst
@@ -80,6 +80,7 @@ Now for the connectivity. A **path** defines a single term in our sparse calculation.
It specifies which segments interact.

If we want to connect:

* Segment 0 from Operand 0
* Segment 1 from Operand 1
* Segment 0 from Operand 2
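A path like the one above can be pictured as one segment index per operand plus a coefficient. The following sketch is an illustration of that idea, not the library's data structure (the names `indices` and `coefficient` are assumptions):

```python
import numpy as np

# Hypothetical path record: one segment index per operand, plus a scale.
path = {"indices": (0, 1, 0), "coefficient": 1.0}

# Operands 0 and 1 are inputs, each a list of 1D segments;
# operand 2 is the output accumulator.
operand0 = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]  # segments 0, 1
operand1 = [np.array([5.0, 6.0]), np.array([7.0, 8.0])]  # segments 0, 1
output = [np.zeros(4)]                                   # segment 0

# Evaluate the single term this path defines: multiply the selected
# input segments and accumulate into the selected output segment.
i0, i1, i_out = path["indices"]
output[i_out] += path["coefficient"] * np.outer(operand0[i0], operand1[i1]).reshape(-1)
```

Here the path picks segment 0 of operand 0 (`[1, 2]`) and segment 1 of operand 1 (`[7, 8]`), writing their product into segment 0 of the output.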