BOOK-CHAPTER

Neural Decoding Using Generative BMI Models

Abstract

In brain-machine interfaces, the Kalman filter provides a rigorous and well-understood framework for modeling the encoding of hand movement in motor cortex, and for inferring or decoding this movement from the firing rates of a cell population. When the restrictive Gaussian assumptions and the linear system model hold, the Kalman filter algorithm provides an elegant, analytically optimal solution to the tracking problem. If one assumes that the observation time series (neural activity) is generated by a linear system, then the tuning can be optimally estimated by a linear filter. The second assumption, Gaussianity of the posterior density of the kinematic stimulus given the neural spiking activity, reduces all the richness of the interactions to second-order information (mean and covariance). These two assumptions may be too restrictive for BMI applications and may be overcome with methods such as particle filtering. Unfortunately, in the BMI application this particular formulation also faces problems of parameter estimation. The generative model must find the mapping from the low-dimensional kinematic state space to the high-dimensional output space of neuronal firing patterns (100+ dimensions). Estimating model parameters for this mapping from a collapsed space to the high-dimensional neural space can be difficult and can yield multiple solutions. For this modeling approach, our use of physiological knowledge in the framework of the model actually complicates the mapping process. As an alternative, one could disregard any knowledge about the system and use a strictly data-driven methodology to build the model. However, if the distributions are not constrained to be Gaussian but can be described by unbiased, consistent means and covariances, the filter can still be derived optimally using a least-squares argument.
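The linear-Gaussian decoding framework described above can be sketched as a textbook Kalman filter; this is a minimal illustration, not the chapter's implementation. The state `x` holds hand kinematics, the observation `z` is a vector of binned firing rates, and the matrices `A`, `H`, `Q`, `R` are assumed given (in practice they are fit from training data).

```python
import numpy as np

def kalman_decode(z_seq, A, H, Q, R, x0, P0):
    """Decode a kinematic state sequence from firing-rate observations.

    A: kinematic state-transition model;  H: linear tuning model mapping
    kinematics to firing rates;  Q, R: process and observation noise
    covariances (the Gaussian assumptions discussed above).
    """
    x, P = x0, P0
    estimates = []
    for z in z_seq:
        # Predict: propagate state and covariance through the kinematic model.
        x = A @ x
        P = A @ P @ A.T + Q
        # Update: correct with the neural observation via the Kalman gain,
        # which is optimal under the linear-Gaussian assumptions.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

Note that the dimensionality mismatch discussed above appears here directly: `H` maps a low-dimensional kinematic state to 100+ neural channels, so fitting `H` and `R` is the hard part, not running the filter.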
For the HMMs, the results presented here show that the final prediction performance of the bimodel system using the ICHMM is much better than that using the VQ-HMM, and superior to that of a single linear predictor. Overall, the ICHMM produces good results with few parameters. The one caveat to the ICHMM is its reliance on the ? threshold. Fortunately, this threshold is retrievable from the training set. Interestingly, the ? threshold can be viewed as a global weighting for the two classes in this system. If one frames the ICHMM as a mixture of experts (ME), perhaps boosting or bagging could be used to locally weight these simple classifiers in future work. The ME generates complex and powerful models by combining simpler models that often map different regions of the input space [47]. With boosting, models are weighted to create a strong ensemble decision so that a weighted majority "votes" for the appropriate class labeling [47]. This is analogous to what the ICHMM currently does with a global ? weighting or biasing. In the next chapter, we will shift the focus from signal processing methods that use estimates of the instantaneous spike rate through binning and concentrate on the spike trains directly. The trade-off between computing in BMIs directly with spike trains or with rate codes can affect the ability to resolve modulation, correlation, integration, and coincidence in neuronal representations necessary for accurate reconstruction of sensory or motor function. The implications and significance of the choice of spike trains in BMI approaches will be reviewed next.

Keywords:
Generative grammar; Decoding methods; Neural decoding; Generative model; Computer science; Artificial intelligence; Psychology; Algorithm

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.26

Topics

Neural Networks and Applications

Related Documents

BOOK-CHAPTER

Neural Decoding Using Generative BMI Models

Justin C. Sanchez, José C. Príncipe

Synthesis Lectures on Biomedical Engineering · Year: 2007 · Pages: 141-172
JOURNAL ARTICLE

Neural Decoding with Hierarchical Generative Models

Marcel van Gerven, Floris P. de Lange, Tom Heskes

Journal: Neural Computation · Year: 2010 · Vol: 22 (12) · Pages: 3127-3142
JOURNAL ARTICLE

Interpreting neural decoding models using grouped model reliance

Simon Valentin, Maximilian Harkotte, Tzvetan Popov

Journal: PLoS Computational Biology · Year: 2020 · Vol: 16 (1) · Pages: e1007148