32 changes: 13 additions & 19 deletions .github/workflows/python-package-conda.yml
@@ -10,25 +10,19 @@ jobs:
 
     steps:
    - uses: actions/checkout@v4
-    - name: Set up Python 3.10
-      uses: actions/setup-python@v3
+
+    - name: Install Miniconda
+      uses: conda-incubator/setup-miniconda@v2
       with:
-        python-version: '3.10'
-    - name: Add conda to system path
+        auto-update-conda: true
+        python-version: 3.11
+        environment-name: test
+
+    - name: Install dependencies
       run: |
-        # $CONDA is an environment variable pointing to the root of the miniconda directory
-        echo $CONDA/bin >> $GITHUB_PATH
-    - name: Install current library and dependencies
-      run: |
-        pip install -e .
-    # - name: Lint with flake8
-    #   run: |
-    #     conda install flake8
-    #     # stop the build if there are Python syntax errors or undefined names
-    #     flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
-    #     # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
-    #     flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
-    - name: Test with pytest
+        conda install -n test numpy=1.26 pytest pip
+        conda run -n test pip install -e .
+
+    - name: Run tests
       run: |
-        conda install pytest
-        pytest
+        conda run -n test pytest
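
For reference, the CI steps above can be approximated locally with conda; the following is a rough sketch (the environment name `test` and the Python/NumPy pins mirror the workflow, everything else is an assumption about the local setup):

```console
# create and populate a conda environment matching the CI job
$ conda create -n test python=3.11
$ conda install -n test numpy=1.26 pytest pip
# editable install of ngc-learn into that environment, then run the test suite
$ conda run -n test pip install -e .
$ conda run -n test pytest
```
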
4 changes: 3 additions & 1 deletion CITATION.cff
@@ -6,10 +6,12 @@ authors:
orcid: https://orcid.org/0000-0002-2590-1310
- family-names: Gebhardt
given-names: William
orcid: https://orcid.org/0009-0008-7456-6556
- family-names: Mali
given-names: Ankur
orcid: https://orcid.org/0000-0001-5813-3584
title: "ngc-learn"
version: 1.0.0
version: 3.0.0
identifiers:
- type: doi
value: 10.5281/zenodo.6605728
8 changes: 4 additions & 4 deletions README.md
@@ -32,7 +32,7 @@ ngc-learn requires:
1) Python (>=3.10)
2) NumPy (>=1.22.0)
3) SciPy (>=1.7.0)
4) ngcsimlib (>=1.0.0), (visit official page <a href="https://github.com/NACLab/ngc-sim-lib">here</a>)
4) ngcsimlib (>=3.0.0), (visit official page <a href="https://github.com/NACLab/ngc-sim-lib">here</a>)
5) JAX (>=0.4.28) (to enable GPU use, make sure to install one of the CUDA variants)
<!--
5) scikit-learn (>=1.3.1) if using `ngclearn.utils.density`
@@ -42,7 +42,7 @@ ngc-learn requires:
-->

---
ngc-learn 2.0.0 and later require Python 3.10 or newer as well as ngcsimlib >=1.0.0.
ngc-learn 3.0.0 and later require Python 3.10 or newer as well as ngcsimlib >=3.0.0.
ngc-learn's plotting capabilities (routines within `ngclearn.utils.viz`) require
Matplotlib (>=3.8.0) and imageio (>=2.31.5) and both plotting and density estimation
tools (routines within ``ngclearn.utils.density``) will require Scikit-learn (>=0.24.2).
@@ -75,7 +75,7 @@ Python 3.11.4 (main, MONTH DAY YEAR, TIME) [GCC XX.X.X] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import ngclearn
>>> ngclearn.__version__
'2.0.0'
'3.0.0'
```

<i>Note:</i> For access to the previous Tensorflow-2 version of ngc-learn (of
@@ -122,7 +122,7 @@ $ python install -e .
</pre>

**Version:**<br>
2.0.2 <!--1.2.3-Beta--> <!-- -Alpha -->
3.0.0 <!--1.2.3-Beta--> <!-- -Alpha -->

Author:
Alexander G. Ororbia II<br>
Binary file added docs/images/museum/harmonium/rbm_arch.jpg
Binary file added docs/images/museum/harmonium/rbm_recon.jpg
Binary file added docs/images/museum/harmonium/samples_0.jpg
Binary file added docs/images/museum/harmonium/samples_1.jpg
Binary file added docs/images/museum/harmonium/samples_2.jpg
Binary file added docs/images/tutorials/neurocog/gmm_fit.jpg
Binary file added docs/images/tutorials/neurocog/gmm_samples.jpg
9 changes: 4 additions & 5 deletions docs/index.rst
@@ -5,10 +5,7 @@
Welcome to ngc-learn's documentation!
=====================================

**ngc-learn** is a Python library for building, simulating, and analyzing
biomimetic computational models, arbitrary predictive processing/coding models,
and spiking neural networks. This toolkit is built on top of
`JAX <https://github.com/google/jax>`_ and is distributed under the 3-Clause BSD license.
**ngc-learn** is a Python library for building, simulating, and analyzing biomimetic and NeuroAI computational models, arbitrary predictive processing/coding models, spiking neural networks, and general dynamical systems. This toolkit is built on top of `JAX <https://github.com/google/jax>`_ and is distributed under the 3-Clause BSD license.

.. toctree::
:maxdepth: 1
Expand All @@ -23,6 +20,7 @@ and spiking neural networks. This toolkit is built on top of

tutorials/intro
tutorials/theory
tutorials/configuration/index
tutorials/index
tutorials/neurocog/index

@@ -52,9 +50,10 @@ and spiking neural networks. This toolkit is built on top of

.. toctree::
:maxdepth: 1
:caption: Papers that use NGC-Learn
:caption: NGC-Learn Papers & Media

ngclearn_papers
ngclearn_talks

Indices and tables
==================
57 changes: 16 additions & 41 deletions docs/installation.md
@@ -1,65 +1,41 @@
# Installation

**ngc-learn** officially supports Linux on Python 3. It can be run with or
without a GPU.
**ngc-learn** officially supports Linux on Python 3. It can be run with or without a GPU.

<i>Setup:</i> <a href="https://github.com/NACLab/ngc-learn">ngc-learn</a>,
in its entirety (including its supporting utilities),
requires that you ensure that you have installed the following base dependencies in
your system. Note that this library was developed and tested on Ubuntu 22.04 (and earlier versions on 18.04/20.04).
Specifically, ngc-learn requires:
<i>Setup:</i> <a href="https://github.com/NACLab/ngc-learn">NGC-Learn</a>, in its entirety (including its supporting utility sub-packages), requires that the following base dependencies be installed on your system. Note that this library was developed and tested on Ubuntu 22.04 (with much earlier versions on Ubuntu 18.04/20.04).
Specifically, NGC-Learn requires:
* Python (>=3.10)
* ngcsimlib (>=1.0.0), (<a href="https://github.com/NACLab/ngc-sim-lib">official page</a>)
* ngcsimlib (>=3.0.0), (<a href="https://github.com/NACLab/ngc-sim-lib">official page</a>)
* NumPy (>=1.22.0)
* SciPy (>=1.7.0)
* JAX (>= 0.4.28; and jaxlib>=0.4.28) <!--(tested for cuda 11.8)-->
* Matplotlib (>=3.8.0), (for `ngclearn.utils.viz`)
* Scikit-learn (>=1.6.1), (for `ngclearn.utils.patch_utils` and `ngclearn.utils.density`)

Note that the above requirements are taken care of if one installs ngc-learn
through either `pip`. One can either install the CPU version of ngc-learn (if no JAX is
pre-installed or only the CPU version of JAX is installed currently) via
Note that the above requirements are taken care of if one installs NGC-Learn through `pip`. One can either install the CPU version of NGC-Learn (if no JAX is pre-installed or only the CPU version of JAX is currently installed) via:
```console
$ pip install ngclearn
```

or install the GPU version of ngc-learn by first installing the
<a href="https://jax.readthedocs.io/en/latest/installation.html">CUDA 12
version of JAX</a> before running the above pip command.
or install the GPU version of NGC-Learn by first installing the <a href="https://jax.readthedocs.io/en/latest/installation.html">CUDA 12 version of JAX</a> before running the above pip command.
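
As a concrete sketch of that ordering (hedged: the `jax[cuda12]` extra is the upstream-documented route to the CUDA 12 wheels; confirm against the JAX installation page for your particular driver setup):

```console
# install the CUDA 12 build of JAX first, then ngc-learn on top of it
$ pip install -U "jax[cuda12]"
$ pip install ngclearn
```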

Alternatively, one may locally, step-by-step (see below), install and setup
ngc-learn from source after pulling from the repo.
Alternatively, one may locally, step-by-step (see below), install and setup NGC-Learn from source after pulling from the repo.

Note that installing the official pip package without any form of JAX installed
on your system will default to downloading the CPU version of ngc-learn (see
below for installing the GPU version).
Note that installing the official pip package without any form of JAX installed on your system will default to downloading the CPU version of NGC-Learn (see below for installing the GPU version).

## Install from Source

0. Install ngc-sim-lib first (as an editable install); visit the repo
https://github.com/NACLab/ngc-sim-lib for details.
1. Install NGC-Sim-Lib first (as an editable install); visit the repo https://github.com/NACLab/ngc-sim-lib for details.

1. Clone the ngc-learn repository:
2. Clone the NGC-Learn repository:
```console
$ git clone https://github.com/NACLab/ngc-learn.git
$ cd ngc-learn
```

2. (<i>Optional</i>; only for GPU version) Install JAX for either CUDA 12 , depending
on your system setup. Follow the
<a href="https://jax.readthedocs.io/en/latest/installation.html">installation instructions</a>
on the official JAX page to properly install the CUDA 11 or 12 version.
3. (<i>Optional</i>; only for GPU version) Install the CUDA 12 variant of JAX, depending on your system setup. Follow the <a href="https://jax.readthedocs.io/en/latest/installation.html">installation instructions</a> on the official JAX page to properly install it.

<!--
3. (<i>Optional</i>) Install, a pre-package installation step (for only the
non-ngclearn dependencies), the base requirements (and a few extras for building
the docs) with:
```console
$ pip install -r requirements.txt
```
-->

3. Install the ngc-learn package via:
4. Install the NGC-Learn package via:
```console
$ pip install .
```
@@ -68,22 +44,21 @@ or, to install as an editable install for development, run:
$ pip install -e .
```

If the installation was successful, you should see the following if you test
it against your Python interpreter, i.e., run the <code>$ python</code> command
and complete the following sequence of steps as depicted in the screenshot below:<br>
<!--<img src="images/test_ngclearn_install.png" width="512">-->
If the installation was successful, you should see the following when you test it against your Python interpreter, i.e., run the <code>$ python</code> command and complete the following sequence of steps:<br>

```console
Python 3.11.4 (main, MONTH DAY YEAR, TIME) [GCC XX.X.X] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import ngclearn
>>> ngclearn.__version__
'2.0.2'
'3.0.0'
```

<!--
<i>Note</i>: If you do not have a JSON configuration file in place (see tutorials
for details) locally where you call the import to ngc-learn, a warning will pop
up containing within it "<i>UserWarning: Missing file to preload modules from.</i>";
this still means that ngc-learn installed successfully but you will need to
point to a JSON configuration when building projects with ngc-learn.
-->

50 changes: 49 additions & 1 deletion docs/modeling/neurons.md
@@ -86,6 +86,22 @@ and `dmu` is the first derivative with respect to the mean parameter.
:noindex:
```

#### Bernoulli Error Cell

This cell is (currently) fixed to be a (factorized) multivariate Bernoulli cell.
Concretely, this cell implements the compartments/mechanics needed to facilitate Bernoulli
log-likelihood error calculations.

```{eval-rst}
.. autoclass:: ngclearn.components.BernoulliErrorCell
:noindex:

.. automethod:: advance_state
:noindex:
.. automethod:: reset
:noindex:
```
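
As a rough, standalone sketch of the quantity this cell is concerned with (illustrative only, not the component's internal compartment wiring), the factorized Bernoulli log-likelihood and its derivative with respect to the mean parameter can be written as:

```python
import jax.numpy as jnp

def bernoulli_log_likelihood(x, mu, eps=1e-6):
    # sum of per-dimension Bernoulli log-probabilities; mu is the predicted mean in (0, 1)
    mu = jnp.clip(mu, eps, 1.0 - eps)
    return jnp.sum(x * jnp.log(mu) + (1.0 - x) * jnp.log(1.0 - mu), axis=-1)

def dmu(x, mu, eps=1e-6):
    # first derivative of the log-likelihood w.r.t. the mean (the error/teaching signal)
    mu = jnp.clip(mu, eps, 1.0 - eps)
    return x / mu - (1.0 - x) / (1.0 - mu)
```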

## Spiking Neurons

These neuronal cells exhibit dynamics that involve emission of discrete action
@@ -117,10 +133,42 @@ negative pressure on the membrane potential values at `t`).
:noindex:
```

### The IF (Integrate-and-Fire) Cell

This cell (the simple "integrator") models dynamics over the voltage `v`. Note that `thr` is used as the membrane potential threshold and no adaptive threshold mechanics are implemented for this cell model.
(This cell is primarily a faster convenience formulation that omits the leak term of the LIF.)

```{eval-rst}
.. autoclass:: ngclearn.components.IFCell
:noindex:

.. automethod:: advance_state
:noindex:
.. automethod:: reset
:noindex:
```
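
A minimal sketch of the integrate-and-fire update this cell embodies (illustrative; parameter names such as `tau_m`, `thr`, and `v_rest` are assumptions and the component's actual integration scheme may differ):

```python
import jax.numpy as jnp

def if_step(v, j, dt, tau_m=10.0, thr=1.0, v_rest=0.0):
    # integrate the injected electrical current j into the membrane potential (no leak term)
    v = v + (dt / tau_m) * j
    # emit a spike wherever the potential crosses the threshold
    s = (v > thr).astype(jnp.float32)
    # reset spiking units back to the resting potential
    v = v * (1.0 - s) + v_rest * s
    return v, s
```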

### The Winner-Take-All (WTAS) Cell

This cell models dynamics over the voltage `v` as a simple instantaneous
softmax function of the electrical current input, where only the single
unit that wins the competition across the group of neuronal units
within this component emits a pulse/spike.

```{eval-rst}
.. autoclass:: ngclearn.components.WTASCell
:noindex:

.. automethod:: advance_state
:noindex:
.. automethod:: reset
:noindex:
```
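
A rough sketch of the winner-take-all rule described above (illustrative only; the actual component's compartments, batching, and tie-breaking may differ):

```python
import jax
import jax.numpy as jnp

def wta_step(j):
    # instantaneous softmax "voltage" over the group's input currents
    v = jax.nn.softmax(j, axis=-1)
    # only the unit that wins the competition emits a pulse/spike
    s = jax.nn.one_hot(jnp.argmax(v, axis=-1), j.shape[-1])
    return v, s
```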

### The LIF (Leaky Integrate-and-Fire) Cell

This cell (the "leaky integrator") models dynamics over the voltage `v`
and threshold shift `thrTheta` (a homeostatic variable). Note that `thr`
and threshold shift `thr_theta` (a homeostatic variable). Note that `thr`
is used as a baseline level for the membrane potential threshold while
`thrTheta` is treated as a form of short-term plasticity (full
threshold is: `thr + thrTheta(t)`).
28 changes: 16 additions & 12 deletions docs/modeling/synapses.md
@@ -1,17 +1,7 @@
# Synapses

The synapse is a key building block for connecting/wiring together the various
component cells that one would use for characterizing a biomimetic neural system.
These particular objects are meant to perform, per simulated time step, a
specific type of transformation -- such as a linear transform or a
convolution -- utilizing their underlying synaptic parameters.
Most times, a synaptic cable will be represented by a set of matrices (or filters)
that are used to conduct a projection of an input signal (a value presented to a
pre-synaptic/input compartment) resulting in an output signal (a value that
appears within one of its post-synaptic compartments). Notably, a synapse component is
typically associated with a local plasticity rule, e.g., a Hebbian-type
update, that either is triggered online, i.e., at some or all simulation time
steps, or by integrating a differential equation, e.g., via eligibility traces.
The synapse is a key building block for connecting/wiring together the various component cells that one would use for characterizing a biomimetic neural system. These particular objects are meant to perform, per simulated time step, a specific type of transformation -- such as a linear transform or a convolution -- utilizing their underlying synaptic parameters. Most times, a synaptic cable will be represented by a set of matrices (or filters) that are used to conduct a projection of an input signal (a value presented to a pre-synaptic/input compartment) resulting in an output signal (a value that appears within one of its post-synaptic compartments). There are three general groupings of synaptic components in ngc-learn: 1) non-plastic static synapses (which only perform fixed transformations of input signals); 2) non-plastic dynamic synapses (which perform time-varying, input-dependent transformations on input signals); and 3) plastic synapses that carry out long-term evolution of their efficacies.
Notably, plastic synapse components are typically associated with a local plasticity rule, e.g., a Hebbian-type update, that either is triggered online, i.e., at some or all simulation time steps, or by integrating a differential equation, e.g., via eligibility traces.
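
As a toy sketch of the kind of local rule referred to above (a plain two-factor Hebbian update; illustrative only, not the update rule of any specific ngc-learn component):

```python
import jax.numpy as jnp

def hebbian_update(W, pre, post, eta=1e-3):
    # two-factor Hebbian adjustment: correlate pre-synaptic and post-synaptic activity
    dW = jnp.matmul(pre.T, post)  # (n_pre, n_post); outer product accumulated over the batch
    return W + eta * dW
```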

## Non-Plastic Synapse Types

@@ -74,6 +64,20 @@ This (chemical) synapse performs a linear transform of its input signals. Note t
:noindex:
```

### Double-Exponential Synapse

This (chemical) synapse performs a linear transform of its input signals. Note that this synapse is "dynamic" in the sense that its efficacies are a function of their pre-synaptic inputs; there is no inherent form of long-term plasticity in this base implementation. Synaptic strength values can be viewed as being filtered/smoothed through a double-exponential (difference of two exponentials) kernel.

```{eval-rst}
.. autoclass:: ngclearn.components.DoubleExpSynapse
:noindex:

.. automethod:: advance_state
:noindex:
.. automethod:: reset
:noindex:
```
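
As a sketch of the kernel idea only (not the component's state-space implementation; the `tau_rise`/`tau_decay` names are assumptions), the difference-of-exponentials conductance waveform following a spike at t = 0 looks like:

```python
import jax.numpy as jnp

def double_exp_kernel(t, tau_rise=1.0, tau_decay=5.0):
    # difference of two exponentials: fast rise governed by tau_rise, slower decay by tau_decay
    norm = 1.0 / (tau_decay - tau_rise)
    return norm * (jnp.exp(-t / tau_decay) - jnp.exp(-t / tau_rise))
```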

### Alpha Synapse

This (chemical) synapse performs a linear transform of its input signals. Note that this synapse is "dynamic" in the sense that its efficacies are a function of their pre-synaptic inputs; there is no inherent form of long-term plasticity in this base implementation. Synaptic strength values can be viewed as being filtered/smoothed through a kernel that models more realistic rise and fall times of synaptic conductance.
13 changes: 13 additions & 0 deletions docs/museum/event_stdp_patches.md
@@ -0,0 +1,13 @@
# Event-based Spike-Timing-Dependent Plasticity (Tavanaei et al.; 2018)

In this exhibit, we create, simulate, and visualize the internally acquired receptive fields of a spiking neural
network (SNN) trained via event-based spike-timing-dependent plasticity (EV-STDP) over image patches. This
reproduces the SNN model originally proposed in (Tavanaei et al., 2018) [1].

The model code for this exhibit can be found
[here](https://github.com/NACLab/ngc-museum/tree/main/exhibits/evstdp_patches).

<!-- references -->
## References
<b>[1]</b> Tavanaei, Amirhossein, Timothée Masquelier, and Anthony Maida. "Representation learning using event-based
STDP." Neural Networks 105 (2018): 294-303.