From 1a8cf7f8c1222516dbf2c1ed8667dec8c6c6c8fb Mon Sep 17 00:00:00 2001
From: Mason Garrison
Date: Mon, 25 Nov 2024 19:37:12 -0500
Subject: [PATCH] Update README.md

Fixed the duplicate parenthesis in line 47 [emergent/emer](https://github.com/emer/emergent)
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ac0799d..3a77db8 100644
--- a/README.md
+++ b/README.md
@@ -44,7 +44,7 @@ See the [ra25 example](https://github.com/emer/leabra/blob/main/examples/ra25/RE
 * Each structural element directly has all the parameters controlling its behavior -- e.g., the `Layer` contains an `ActParams` field (named `Act`), etc, instead of using a separate `Spec` structure as in C++ emergent. The Spec-like ability to share parameter settings across multiple layers etc is instead achieved through a **styling**-based paradigm -- you apply parameter "styles" to relevant layers instead of assigning different specs to them. This paradigm should be less confusing and less likely to result in accidental or poorly understood parameter applications. We adopt the CSS (cascading-style-sheets) standard where parameters can be specifed in terms of the Name of an object (e.g., `#Hidden`), the *Class* of an object (e.g., `.TopDown` -- where the class name TopDown is manually assigned to relevant elements), and the *Type* of an object (e.g., `Layer` applies to all layers). Multiple space-separated classes can be assigned to any given element, enabling a powerful combinatorial styling strategy to be used.
-* The [emergent/emer]((https://github.com/emer/emergent) interfaces are designed to support generic access to network state, e.g., for the 3D network viewer, but specifically avoid anything algorithmic. Thus, they should allow viewing of any kind of network, including PyTorch backprop nets.
+* The [emergent/emer](https://github.com/emer/emergent) interfaces are designed to support generic access to network state, e.g., for the 3D network viewer, but specifically avoid anything algorithmic. Thus, they should allow viewing of any kind of network, including PyTorch backprop nets.
 * Layers have a `Shape` property, using the `tensor.Shape` type, which specifies their n-dimensional (tensor) shape. Standard layers are expected to use a 2D Y*X shape (note: dimension order is now outer-to-inner or *RowMajor* now), and a 4D shape then enables `Pools` ("unit groups") as hypercolumn-like structures within a layer that can have their own local level of inihbition, and are also used extensively for organizing patterns of connectivity.