cantordust/pyrception

🌄 Overview

Pyrception is a simulation framework for biosensors. Currently, it provides the basic ingredients for simulating key structural and functional elements of visual processing observed in the mammalian retina. The long-term goal of Pyrception is to support multiple sensory modalities (such as auditory, olfactory and tactile) and to provide methods for integrating those inputs into a unified multisensory signal. To that end, Pyrception can also serve as an input conversion layer, encoding raw multimodal sensory input into a uniform spike train suitable for processing with spiking neural networks.
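The spike-train encoding mentioned above can be illustrated independently of Pyrception's API. The sketch below shows a generic Poisson-style rate code, in which brighter pixels fire more often; the function name `rate_code`, its parameters and the toy pixel values are all hypothetical names invented for this example and are not part of the library.

```python
import numpy as np

# Hypothetical illustration (not Pyrception's API): rate-code normalised
# pixel intensities into Poisson-style spike trains, the kind of uniform
# spike representation a spiking neural network front end consumes.
def rate_code(intensities: np.ndarray, n_steps: int = 100,
              max_rate: float = 0.5, seed: int = 0) -> np.ndarray:
    """Return a (n_steps, *intensities.shape) boolean spike array."""
    rng = np.random.default_rng(seed)
    # Per-step spike probability is proportional to pixel intensity.
    p = np.clip(intensities, 0.0, 1.0) * max_rate
    return rng.random((n_steps, *intensities.shape)) < p

pixels = np.array([0.0, 0.2, 1.0])   # a tiny three-pixel "image"
spikes = rate_code(pixels, n_steps=1000)
print(spikes.mean(axis=0))           # empirical rates track intensity * max_rate
```

Averaging the spike array over time recovers (approximately) the original intensities scaled by the maximum rate, which is the defining property of a rate code.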

[Video: zebra.mp4] An RGB video processed with different layers (see the full example).

🪛 Installation

You can install Pyrception from PyPI:

pip install pyrception

or directly from GitHub (optionally in development mode):

git clone git@github.com:cantordust/pyrception.git
cd pyrception
pip install -e .

♻️ Optional dependencies

Pyrception supports several dependency groups:

  • cli: Command-line interface.
  • events: Support for processing events (including from event cameras).
  • dev: Development tools (for testing, profiling, etc.).
  • torch: PyTorch support.
  • ipy: ipykernel & ipywidgets (for running inside notebooks).
  • docs: Tools for building the documentation.
  • NOTE: The documentation is built using MkDocs, which has been discontinued. The documentation will likely move to Zensical soon.
  • all: All of the above.

Use the --group option with pip to enable a dependency group (repeat the option for each group). For instance:

pip install -e . --group events --group docs

will pull in all dependencies necessary for event-based input and building the documentation.

⏯️ Usage

Please refer to the documentation, which contains step-by-step notebooks demonstrating how to use Pyrception with a static image and an RGB video. More notebooks are currently being developed, including one demonstrating sparse event input from an event camera. Stay tuned.

📈 Development

Please open an issue if you discover a bug or would like to request a feature. Better yet, contributions are welcome!

To build the documentation locally, clone the repository and run the MkDocs build pipeline (note that Pyrception must be installed with the docs dependency group):

git clone git@github.com:cantordust/pyrception.git
cd pyrception
pip install -e . --group docs
cd docs
mkdocs build

Then, to view the documentation locally, start the MkDocs development server:

mkdocs serve

📋 ToDo

Short-term

👁️ Visual package

  • All major types of retinal cells.
    • Receptors (raw input, Weber's law).
    • Horizontal cells (mean local brightness, normalising feedback).
    • Bipolar cells (positive and negative contrast, temporal filter, excitatory input to ganglion cells).
    • Amacrine cells (inhibitory input to ganglion cells, modulatory signal to bipolar cells).
    • Ganglion cells (spiking).
  • Logpolar kernel arrangement.
  • Uniform or Gaussian kernels.
  • Arbitrary kernel size, shape and orientation.
  • 🚧 Colour vision (with colour opponency).
  • 🚧 Temporal dynamics.
  • 🚧 Events as input.
  • Saccadic movements.
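The Weber's-law behaviour listed for the receptors above can be sketched in a few lines, independently of Pyrception's internals: the perceived strength of a stimulus depends on its intensity relative to the local background, not on the absolute increment. The helper `weber_contrast` is a hypothetical name invented for this illustration.

```python
import numpy as np

# A minimal sketch (not Pyrception's API) of Weber-law contrast:
# response depends on relative, not absolute, intensity change.
def weber_contrast(stimulus: np.ndarray, background: float) -> np.ndarray:
    """Weber contrast (I - I_b) / I_b for each pixel."""
    return (stimulus - background) / background

# The same absolute increment of 10 is far more salient on a dim background.
print(weber_contrast(np.array([20.0]), background=10.0))    # contrast 1.0
print(weber_contrast(np.array([110.0]), background=100.0))  # contrast 0.1
```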

👂 Auditory package

WIP.

👃 Olfactory package

WIP.

🔧 Others

  • 🚧 Support alternative backends for sparse matrix operations (CuPy, PyTorch, Sparse).
  • 🚧 Interfacing with (neuromorphic) hardware, such as event cameras.
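As a rough illustration of why a swappable sparse backend is useful, the sketch below uses SciPy's csr_matrix (an assumption for this example; the roadmap above lists CuPy, PyTorch and Sparse as candidate backends) to apply two overlapping receptive-field kernels to a frame with a single sparse matrix-vector product. Swapping the matrix type for a CuPy or PyTorch sparse equivalent would change the backend without changing the logic.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Each row of the sparse matrix is one receptive-field kernel flattened
# over the (tiny, 4-pixel) frame; most entries are zero, so a sparse
# representation scales to many kernels over large frames.
kernel_rows = csr_matrix(np.array([
    [0.25, 0.5, 0.25, 0.0],   # one receptive field
    [0.0, 0.25, 0.5, 0.25],   # a neighbouring, overlapping field
]))

frame = np.array([1.0, 0.0, 1.0, 0.0])
responses = kernel_rows @ frame   # one response per receptive field
print(responses)
```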
