
The following sections summarize a number of performance aspects that are characteristic of holographic neural processing.  This material is intended for a technical or engineering audience.

Learning Capacity

Summarizes performance aspects relating to the speed of stimulus-response learning and the associated memory storage capacity, in comparison to conventional neural networks.
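
As a rough illustration of single-pass stimulus-response encoding, the sketch below folds several stimulus-response pairs into one complex-valued weight vector using a phase (phasor) representation. This is a minimal, generic holographic-memory sketch under stated assumptions, not the HNet implementation; the names HolographicCell, to_phasor, learn and recall are hypothetical.

```python
import numpy as np

def to_phasor(x):
    """Map real values in [0, 1) onto the unit circle as complex phasors."""
    return np.exp(1j * 2.0 * np.pi * np.asarray(x, dtype=float))

class HolographicCell:
    """Minimal phase-domain associative memory: one complex weight per stimulus element."""

    def __init__(self, n_inputs):
        self.w = np.zeros(n_inputs, dtype=complex)

    def learn(self, stimulus, response):
        """Single-pass encoding: fold one stimulus-response pair into the weights."""
        s = to_phasor(stimulus)
        self.w += np.conj(s) * to_phasor(response)

    def recall(self, stimulus):
        """Decode via a normalized complex inner product (one CMAC per synapse)."""
        z = to_phasor(stimulus) @ self.w / len(self.w)
        return float(np.angle(z) / (2.0 * np.pi) % 1.0), float(np.abs(z))

# Fold ten stimulus-response pairs onto the same weight vector, then recall a few.
rng = np.random.default_rng(0)
cell = HolographicCell(n_inputs=256)
pairs = [(rng.random(256), rng.random()) for _ in range(10)]
for s, r in pairs:
    cell.learn(s, r)
for s, r in pairs[:3]:
    est, mag = cell.recall(s)
    print(f"target {r:.3f}  recalled {est:.3f}  magnitude {mag:.2f}")
```

In this toy setting the storage capacity is governed by the crosstalk between superimposed pairs, which grows with the number of encodings relative to the stimulus length.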

Convergence

Illustrates the convergence characteristics of holographic learning when applying multiple training exposures or “epochs”.
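
The sketch below illustrates one way such multi-epoch refinement can be realized: on each exposure, only the residual recall error is folded back into the complex weights, so the error over the training set shrinks with successive epochs. The residual-encoding update rule shown is an assumption made for illustration, not the HNet algorithm.

```python
import numpy as np

def phasor(x):
    """Map values in [0, 1) onto the unit circle as complex phasors."""
    return np.exp(1j * 2.0 * np.pi * np.asarray(x, dtype=float))

rng = np.random.default_rng(1)
n_inputs, n_pairs, n_epochs = 128, 16, 10

S = phasor(rng.random((n_pairs, n_inputs)))   # stimulus phasors, one row per pattern
R = phasor(rng.random(n_pairs))               # target response phasors
w = np.zeros(n_inputs, dtype=complex)

for epoch in range(n_epochs):
    Z = S @ w / n_inputs                      # recalled responses (one CMAC per synapse)
    err = R - Z                               # residual error in the complex plane
    w += np.conj(S).T @ err                   # fold only the residual back into the weights
    print(f"epoch {epoch + 1}: mean |error| = {np.mean(np.abs(err)):.4f}")
```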

Generalization

Describes generalization aspects when the learning environment is highly complex or "non-linear".
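
As an illustration of how a non-linearly separable mapping can be handled, the sketch below expands a two-element stimulus with a higher-order (pairwise product) phasor term, after which a single-pass holographic encoding reproduces an XOR-like mapping. The expansion scheme and the phasor/expand helpers are assumptions chosen for the example, not taken from the source.

```python
import numpy as np

def phasor(x):
    """Encode binary values as phasors: 0 -> +1, 1 -> -1."""
    return np.exp(1j * np.pi * np.asarray(x, dtype=float))

def expand(s):
    """Append pairwise phasor products (higher-order terms) to the raw stimulus."""
    pairs = [s[i] * s[j] for i in range(len(s)) for j in range(i + 1, len(s))]
    return np.concatenate([s, pairs])

# XOR-like mapping: not representable with first-order terms alone.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

w = np.zeros(3, dtype=complex)                # 2 first-order terms + 1 product term
for x, y in zip(X, Y):
    s = expand(phasor(x))
    w += np.conj(s) * phasor(y)               # single-pass holographic encoding

for x, y in zip(X, Y):
    s = expand(phasor(x))
    z = s @ w / len(s)
    print(x, "->", 0 if z.real > 0 else 1, f"(target {y}, |z| = {abs(z):.2f})")
```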

Neural Plasticity

Describes the process of neural plasticity (synaptic pruning and regrowth) and the performance gains obtained by optimizing over the combinatorially large space of candidate synaptic connections.
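
The toy loop below sketches one possible pruning-and-regrowth cycle: the weakest active synapses are pruned each cycle and replaced by new candidate connections drawn from the unused pool. Weight magnitude is used here only as a stand-in for a synapse's contribution; the selection criterion and growth rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_candidates, n_active, prune_frac = 512, 64, 0.25

# Start with a random subset of candidate synapses active on one cell.
active = rng.choice(n_candidates, size=n_active, replace=False)
w = rng.normal(size=n_candidates) + 1j * rng.normal(size=n_candidates)

for cycle in range(5):
    # Prune: drop the weakest fraction of active synapses by weight magnitude.
    order = np.argsort(np.abs(w[active]))
    n_prune = int(prune_frac * len(active))
    survivors = active[order[n_prune:]]

    # Regrow: replace pruned synapses with new candidates from the unused pool.
    pool = np.setdiff1d(np.arange(n_candidates), survivors)
    regrown = rng.choice(pool, size=n_prune, replace=False)
    active = np.concatenate([survivors, regrown])

    print(f"cycle {cycle + 1}: mean |w| over active set = "
          f"{np.mean(np.abs(w[active])):.3f}")
```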

Computational Complexity

Defines the number of numerical operations or equivalent hardware registers, expressed as complex multiply-and-accumulate (CMAC) operations, required to execute a neuro-holographic assembly.
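
Assuming each synapse contributes one CMAC per recall pass (4 real multiplies and 2 real adds for the complex product, plus 2 real adds to accumulate), a rough operation count for an assembly can be tallied as in the sketch below. The cmac_counts helper and its parameters are hypothetical, intended only to show how the count scales with stimulus size and cell count.

```python
def cmac_counts(n_inputs, n_cells, higher_order_terms=0):
    """Rough operation count for one recall pass of a holographic cell assembly.

    Each synaptic term contributes one complex multiply-and-accumulate (CMAC):
    4 real multiplies + 2 real adds for the complex product, plus 2 real adds
    to accumulate, i.e. 8 real operations per CMAC.
    """
    n_terms = n_inputs + higher_order_terms
    cmacs = n_terms * n_cells
    return {
        "synaptic terms per cell": n_terms,
        "CMAC operations": cmacs,
        "real multiplies": 4 * cmacs,
        "real additions": 4 * cmacs,
    }

# Example: 4096 stimulus elements feeding 64 cells.
for key, value in cmac_counts(n_inputs=4096, n_cells=64).items():
    print(f"{key:>24}: {value:,}")
```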