Chinese Physics Letters, 2017, Vol. 34, No. 4, Article code 048701

Information Capacity and Transmission in a Courbage–Nekorkin–Vdovin Map-Based Neuron Model*

Yuan Yue(岳园)1,2, Yu-Jiang Liu(刘峪江)1, Ya-Lei Song(宋亚磊)1, Yong Chen(陈勇)3, Lian-Chun Yu(俞连春)1**

1Institute of Theoretical Physics, Lanzhou University, Lanzhou 730000
2School of Mathematics and Computer Science, Northwest University for Nationalities, Lanzhou 730000
3School of Physics and Nuclear Energy Engineering, Beihang University, Beijing 100191

Received 18 September 2016

*Supported by the National Natural Science Foundation of China under Grant Nos 11564034, 11105062 and 61562075, and the Fundamental Research Funds for the Central Universities under Grant Nos lzujbky-2015-119 and 31920130008.
**Corresponding author. Email: yulch@lzu.edu.cn
Citation Text: Yue Y, Liu Y J, Song Y L, Chen Y and Yu L C 2017 Chin. Phys. Lett. 34 048701

Abstract: Map-based neuron models have received attention as valid phenomenological neuron models due to their computational efficiency and their flexibility in generating rich firing patterns. Here we evaluate the information capacity and transmission of the Courbage–Nekorkin–Vdovin (CNV) map-based neuron model, in its bursting and tonic firing modes, in response to external pulse inputs, in both temporal and rate coding schemes. We find that for both firing modes, the CNV model captures the essential behavior of the stochastic Hodgkin–Huxley model in information transmission for the temporal coding scheme, with regard to the dependence of the total entropy, noise entropy, information rate, and coding efficiency on the strength of the input signal. However, in the tonic firing mode, it fails to replicate the input-strength-dependent information rate in the rate coding scheme. Our results suggest that the CNV map-based neuron model can capture the essential behavior of information processing of typical conductance-based neuron models.

DOI:10.1088/0256-307X/34/4/048701 PACS:87.19.lo, 87.19.ls, 87.19.lc, 87.19.ll © 2017 Chinese Physics Society

Neurons are the basic building blocks of neural systems; they generate electrical signals called action potentials (APs). Neural information is believed to be embedded in the frequency or timing of the AP series generated by neurons in response to stimuli.[1] The Hodgkin–Huxley (HH) model successfully describes the AP generation process and plays an important role in the modeling of neural systems.[2] However, the HH model is written as complex ordinary differential equations (ODEs) that involve the gating dynamics of sodium and potassium ion channels.
To account for noise, especially ion channel noise, a stochastic method must be used to simulate the detailed stochastic gating of ion channels.[3] The HH model is therefore at a disadvantage in the modeling of large-scale neuronal networks because of its large computational cost. In such circumstances, detailed descriptions of low-level ion channel gating dynamics in a single neuron are no longer vital. This need has led to models that describe the energy landscape of neurons during information transmission,[4-6] or models that mimic the dynamic behavior of the transmembrane voltage with much simpler mathematical systems, i.e., map-based neuron models.[7] These map-based models are often derived from two-dimensional ODEs, such as the FitzHugh–Nagumo model, and retain the dynamical essence of AP generation in the HH model.[8] Studies have shown that, instead of numerically solving ODEs with small time steps, simple iteration of a map-based model can reproduce the dynamic behavior of neurons without regard to the underlying biophysical mechanisms. This makes map-based neuron models computationally very efficient. Additionally, map-based neuron models can generate abundant firing patterns, including tonic firing, bursting, and chaotic behavior.[9,10] The chaotic dynamics enable a map-based model to mimic the stochastic firing caused by the noisy environment of the neuron, without introducing random number generators as in ODE neuron models. Map-based neuron models may therefore be the choice for future large-scale neuronal network modeling where computational cost must be reduced. However, current studies mainly focus on the dynamical analysis of map-based models.[11,12] Some studies have applied map-based models to realistic neural systems, e.g., the respiratory rhythm.[13] Little effort has been made to understand the information capacity and transmission of these map-based neuron models.
In this study, using the Courbage–Nekorkin–Vdovin (CNV) model as an example,[14] we calculated the information entropy and mutual information in the context of Shannon's information theory with the method proposed by Strong et al.,[15] in both rate and temporal coding schemes. We found that the CNV model replicates the essential information processing characteristics of the typical HH model in the temporal coding scheme, but its information rate for tonic firing in the rate coding scheme is not consistent with the HH model. The CNV model is defined as $$\begin{align} x(n+1)=\,&x(n)+F(x(n))-y(n)-\beta H(x(n)-d),\\ y(n+1)=\,&y(n)+\epsilon (x(n)-J(n)),~~ \tag {1} \end{align} $$ where $n$ is the iteration step, $x$ is the fast dynamic variable representing the evolution of the neuronal membrane potential, $y$ is the slow dynamic variable representing the dynamics of outward ionic currents, $\epsilon$ defines the timescale of the slow dynamics, $J(n)$ is the external stimulus, which depends on the iteration step $n$, and $\beta$ $(\beta>0)$ and $d$ $(d>0)$ control the threshold property of the spiking oscillations. The functions $F(x)$ and $H(x-d)$ take the form $$\begin{align} F(x)=\,&\begin{cases} \!\! -m_{0}x, & x \leq J_{\min},\\\!\! m_{1}(x-a), & J_{\min} < x < J_{\max},\\\!\! -m_{0}(x-1), & x \geq J_{\max}, \end{cases}~~ \tag {2} \end{align} $$ $$\begin{align} H(x)=\,&\begin{cases} \!\! 1, & x \geq 0,\\\!\! 0, & x < 0, \end{cases}~~ \tag {3} \end{align} $$ where $J_{\min}=\frac {am_{1}}{m_{0}+m_{1}}$, $J_{\max}=\frac {m_{0}+am_{1}}{m_{0}+m_{1}}$, $m_{0}>0$, and $m_{1}>0$. The CNV model can generate several firing patterns depending on the chosen parameters. Figure 1 shows both the bursting and tonic firing patterns generated by the map. For suitable parameters, a chaotic attractor exists in the $x$–$y$ phase plane (Fig. 1(a)).
Figure 1(b) shows the evolution of the fast variable $x$ (i.e., the membrane potential) with the iteration step. The model clearly exhibits bursting, in which periods of rapid action potential spiking are followed by quiescent periods. The chaotic nature of the bursting pattern randomizes both the number of spikes in each burst and their timing. With another set of parameters, as $J$ increases, the stable fixed point (the intersection of the two nullclines) becomes unstable and a limit cycle forms in the phase plane through a Hopf bifurcation (Fig. 1(c)). In this case, the fast variable $x$ generates single spikes separated by quiescent periods, known as tonic firing. Note that, unlike the bursting pattern, the system is non-chaotic when it oscillates in the tonic pattern. Therefore, in the following we added a Gaussian random number ($\mu=0$, $\sigma^2=1$) to $J(n)$ to represent the noise input.
Fig. 1. (a) The phase plane of the bursting mode in the CNV model. (b) The firing behavior of the bursting mode in the CNV model. The parameters are $x_0=0$, $y_0=0$, $d=0.4$, $\beta=0.4$, $\epsilon=0.002$, $J=0.13$, $m_0=0.864$, $m_1=0.65$ and $a=0.2$. (c) The phase plane of the tonic mode in the CNV model. (d) The firing behavior of the tonic mode in the CNV model. The parameters are $x_0=0$, $y_0=0$, $d=0.3$, $\beta=0.05$, $\epsilon=0.004$, $J=0.13$, $m_0=0.4$, $m_1=0.3$ and $a=0.2$. In the phase-plane plots (a) and (c), 360000 iterations were performed and the corresponding $(x,y)$ values are plotted as red scatter points to trace the system's trajectory. The cyan and pink lines are the two nullclines of the two-dimensional map, i.e., $F(x)-y-\beta H(x-d)=0$ and $\epsilon(x-J)=0$.
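Equations (1)–(3) can be iterated directly; the following minimal Python sketch (our own illustration, not the authors' code; function names are ours, and the bursting parameters of Fig. 1 are used as defaults with a constant stimulus $J$) reproduces the map:

```python
import numpy as np

def F(x, m0=0.864, m1=0.65, a=0.2):
    """Piecewise-linear nonlinearity of Eq. (2)."""
    J_min = a * m1 / (m0 + m1)
    J_max = (m0 + a * m1) / (m0 + m1)
    if x <= J_min:
        return -m0 * x
    if x < J_max:
        return m1 * (x - a)
    return -m0 * (x - 1.0)

def cnv_trajectory(n_steps, J=0.13, d=0.4, beta=0.4, eps=0.002,
                   m0=0.864, m1=0.65, a=0.2, x0=0.0, y0=0.0):
    """Iterate the CNV map of Eq. (1); H(x-d) is the Heaviside step, Eq. (3)."""
    x, y = x0, y0
    xs = np.empty(n_steps)
    for n in range(n_steps):
        heav = 1.0 if x >= d else 0.0
        # both updates use the values at step n (simultaneous update)
        x, y = (x + F(x, m0, m1, a) - y - beta * heav,
                y + eps * (x - J))
        xs[n] = x
    return xs
```

With the default (bursting) parameters, plotting `cnv_trajectory(360000)` against the step index should reproduce the qualitative behavior of Fig. 1(b); a stimulus-dependent input is obtained by passing an array-valued $J(n)$ per step instead of the constant default.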
We then set the input $J(n)$ to be a pulse train: $J(n)$ remains at its baseline value $J_{0}=0.01$, jumps to $J_{I}$, stays there for a time window of $T_I=250$ steps, and then returns to $J_{0}$. The inter-pulse interval is $T_0=2000$ steps. For convenience, we assumed that each iteration step corresponds to 2 ms. In the temporal coding scheme, we used the information entropy rate to measure the information processing capacity of the CNV model with the method proposed by Strong et al.[15] First, the spike trains of the neuron were packaged into binary sequences with a time bin of $\Delta t$. The whole discretized sequence was then scanned with a sliding time window $T$, turning it into a sequence of 'words'. A single word is composed of 0s (no AP) and 1s (AP), with word length $k=T/{\Delta t}$. The time bin $\Delta t=2$ ms was small enough that the discretized firing sequence was binary (at most one spike per bin), but also large enough that the sequence of words was well sampled. After obtaining the probability $P(W)$ of each word, the total entropy rate of the words is calculated as $$ H_{\rm total}=-\frac{1}{T}\sum_WP(W)\log_{2}P(W).~~ \tag {4} $$ By applying the same pulse train input to the neuron repeatedly, we could estimate the conditional probability $P(W|t)$ over all trials. The noise entropy rate, i.e., the part of the entropy rate attributable to noise, is then $$ H_{\rm noise}=\Big\langle-\frac{1}{T}\sum_WP(W|t)\log_{2}P(W|t)\Big\rangle_{t},~~ \tag {5} $$ where $\langle\cdots\rangle_{t}$ denotes an average over all time points. Figure 2 plots an example of the total entropy rate and the noise entropy rate as functions of the reciprocal of the sliding window length, $1/T$. When the window is small, both the total and the noise entropy rate increase linearly with $1/T$.
However, when the window is larger, the dependence of the entropy rate on $1/T$ is no longer linear due to the undersampling problem. Strong et al. therefore proposed obtaining the window-length-independent entropy rates ($T\rightarrow \infty$) by linearly extrapolating to $1/T=0$.[15] Thus in Fig. 2, the total entropy rate and the noise entropy rate are obtained as the intercepts of the corresponding fitting lines with the $y$-axis at $1/T=0$.
Fig. 2. The dependence of the total and noise entropy rates on the reciprocal of the sliding window length, $1/T$. The calculation is based on the bursting mode with $J_{I}=0.1$. The two dashed straight lines are the linear extrapolations to $1/T=0$.
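The extrapolation in Fig. 2 amounts to a degree-1 fit of the entropy rate against $1/T$ over the linear (small-$1/T$) regime, keeping the intercept. A sketch (our helper; the window lengths and rates below are synthetic, not the paper's data):

```python
import numpy as np

def extrapolated_rate(window_lengths, rates):
    """Fit H(1/T) linearly and return the intercept at 1/T = 0,
    i.e., the window-length-independent entropy rate of Strong et al."""
    inv_T = 1.0 / np.asarray(window_lengths, dtype=float)
    slope, intercept = np.polyfit(inv_T, np.asarray(rates, dtype=float), 1)
    return intercept
```

In practice only the window lengths for which the $H$ versus $1/T$ curve is still linear (the left part of Fig. 2) should enter the fit, since undersampled long windows bend the curve away from the extrapolation line.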
The information rate, i.e., the average information encoded into the spike trains in response to the stimulus, is the difference between $H_{\rm total}$ and $H_{\rm noise}$, $$ I_{\rm temp}=H_{\rm total}-H_{\rm noise}.~~ \tag {6} $$ The coding efficiency $\eta$ is then $$ \eta=\frac{I_{\rm temp}}{H_{\rm total}}.~~ \tag {7} $$ We first investigated the information capacity of the bursting mode in the CNV neuron. Figure 3(a) shows that as $J_I$ increases, the total entropy rate $H_{\rm total}^{\rm B}$ of the bursting firing increases, because strong inputs tend to induce more spikes. However, the noise entropy rate $H_{\rm noise}^{\rm B}$ increases as well, indicating more variability in the response spike trains as the input strength increases. As a result, the information rate $I_{\rm temp}^{\rm B}$ increases with the input strength, which implies that strong inputs enable the neuron to transmit more information. In Fig. 3(b), the coding efficiency $\eta^{\rm B}$ of the CNV neuron in the bursting mode decreases as the input strength increases. In the tonic firing mode, the total entropy $H_{\rm total}^{\rm T}$ increases with the input strength, as does the noise entropy $H_{\rm noise}^{\rm T}$ (Fig. 4(a)). Meanwhile, the mutual information $I_{\rm temp}^{\rm T}$ first increases monotonically and then saturates as the input strength increases further. The coding efficiency $\eta^{\rm T}$ for tonic firing decreases as the input strength increases (Fig. 4(b)).
Fig. 3. The information capacity of the CNV model in the bursting firing mode. Here $H_{\rm total}^{\rm B}$, $H_{\rm noise}^{\rm B}$, $I_{\rm temp}^{\rm B}$ and $\eta^{\rm B}$ are the total entropy, noise entropy, mutual information rate, and coding efficiency of the neuron in the temporal coding scheme, respectively, and $I_{\rm rate}^{\rm B}$ is the information rate in the rate coding scheme.
Fig. 4. The information capacity of the CNV model in the tonic firing mode. Here $H_{\rm total}^{\rm T}$, $H_{\rm noise}^{\rm T}$, $I_{\rm temp}^{\rm T}$ and $\eta^{\rm T}$ are the total entropy, noise entropy, mutual information rate, and coding efficiency of the neuron in the temporal coding scheme, respectively, and $I_{\rm rate}^{\rm T}$ is the information rate in the rate coding scheme.
Note that a comparison of the information entropy rate between the different firing patterns is not possible in the CNV model, because the different firing patterns require different parameters. In particular, the choice of $\epsilon$ has a great impact on the oscillation timescale of the model.[14] However, one may conjecture that if both firing patterns had the same timescale, the total entropy rate and the mutual information in the bursting mode would be larger than those in the tonic mode for the same input strength. This is because in the bursting mode the neuron can carry information not only through the inter-burst intervals but also through the specific timing of the many spikes within each burst, whereas in the tonic firing mode information can only be encoded in the timing of individual spikes, i.e., the inter-spike intervals. Next, we evaluated the information capacity of the CNV neuron in the rate coding scheme with the information rate $$ I_{\rm rate}=\frac{\overline{r}}{T}\int_{0}^{T} dt \frac{r(t)}{\overline{r}} \log_{2}\Big[\frac{r(t)}{\overline{r}}\Big]\,({\rm bit/s}),~~ \tag {8} $$ where $\overline{r}$ is the average spike rate, calculated as the total spike number divided by the total time, $T$ is the duration of the repeated stimulus, and $r(t)$ is the instantaneous firing rate of the neuron, calculated with a sliding window of 50 steps (100 ms). In the rate coding scheme, for the bursting mode, we found that as the input strength increases, the mutual information rate $I_{\rm rate}^{\rm B}$ first increases and then decreases, with a maximum at $J_I=0.15$ (Fig. 3(a)). This maximum indicates that there is an optimal input strength that maximizes the information transmission of a neuron that carries information with its spiking rate.
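The rate-coding information rate can be evaluated directly from a firing-rate estimate, using the standard form $\frac{1}{T}\int_0^T r(t)\log_2[r(t)/\overline{r}]\,dt$ (cf. Eq. (8)). A sketch, assuming $r(t)$ has already been estimated in Hz from the 100 ms sliding window (the helper name is ours):

```python
import numpy as np

def rate_info(r, dt=0.002):
    """Information rate (bit/s) carried by the firing rate r(t) (Hz):
    I = (1/T) * integral of r(t) * log2(r(t)/rbar) dt."""
    r = np.asarray(r, dtype=float)
    T = r.size * dt                  # total duration in seconds
    rbar = r.mean()                  # average rate
    pos = r > 0                      # 0 * log 0 -> 0 by convention
    return float(np.sum(r[pos] * np.log2(r[pos] / rbar)) * dt / T)
```

By Jensen's inequality the result is non-negative, vanishing exactly when $r(t)$ is constant; rate modulation locked to the stimulus is what carries information in this scheme.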
However, in the tonic mode the information rate $I_{\rm rate}^{\rm T}$ increases monotonically as $J_I$ increases (Fig. 4(a)). Interestingly, in the bursting mode the mutual information in the temporal coding scheme is smaller than in the rate coding scheme. Because the system in this mode is deterministic, the only source of variability in the spike train is the chaotic behavior of the system. This variability is large enough to degrade signal transmission in the temporal coding scheme (indicated by a large noise entropy), resulting in a small information rate even though the total information capacity is rather large. In the rate coding scheme, however, the chaos-induced variability has only a slight effect on the number of spikes within a time interval (the firing rate), which makes rate coding more robust to this noise than temporal coding. Because the chaotic behavior is inherent and cannot be controlled, our results suggest that the CNV model may fail to model information transmission with temporal coding when the noise perturbation is small. In the tonic mode, the mutual information in the rate coding scheme increases with the input strength (Fig. 4(a)). We note that in this case, because the noise input can be controlled, the mutual information rate for temporal coding can be smaller (Fig. 4(a)) or larger (result not shown) than that for rate coding. Comparing with the similar work by Schneidman on the stochastic HH neuron model,[16] we found that, in general, the CNV map-based neuron model replicates the information transmission behavior of the HH model. In particular, as the input strength increases, the total entropy, noise entropy and information rate increase monotonically in both models, and the coding efficiency decreases in both models.
The only difference is that in the case of small noise and small input strength, the HH model may have a higher coding efficiency, although this regime is of little biological relevance. In the rate coding scheme, the information rate curve of the CNV model in the bursting mode is similar to that of the HH model, as both have a maximum at moderate input strength (Figs. 5 and 7 in Ref. [16]). However, for the tonic mode, the information rate of the CNV model increases monotonically. In conclusion, taking the CNV model as an example, we have evaluated the information capacity and transmission of map-based neurons with bursting and tonic firing modes in response to external pulse inputs, in both temporal and rate coding schemes. We found that in both firing modes, the CNV model captures the information transmission behavior of realistic neurons in the temporal coding scheme, using the HH neuron model as a reference. However, the CNV model fails to replicate the dependence of the information rate on the input strength of the HH model in the rate coding scheme. Our results suggest that map-based neurons can capture the essential characteristics of HH-type neurons in the context of information entropy, with both bursting and tonic firing modes, in the temporal coding scheme.
References
[1] Dayan P and Abbott L F 2001 Theoretical Neuroscience (Cambridge: MIT Press)
[2] Hodgkin A L and Huxley A F 1952 J. Physiol. 117 500
[3] Yu L C and Liu L W 2014 Phys. Rev. E 89 032725
[4] Wang R B, Tsuda I and Zhang Z K 2015 Int. J. Neural Syst. 25 1450037
[5] Wang Z Y and Wang R B 2014 Front. Comput. Neurosc. 8 14
[6] Wang R B, Zhang Z K and Chen G R 2008 IEEE Trans. Neural Netw. 19 535
[7] Ibarz B, Casado J M and Sanjuan M A F 2011 Phys. Rep. 501 1
[8] Girardi-Schappo M, Tragtenberg M H R and Kinouchi O 2013 J. Neurosci. Meth. 220 116
[9] Courbage M, Nekorkin V I and Vdovin L V 2007 Chaos 17 043109
[10] Shilnikov A L and Rulkov N F 2004 Phys. Lett. A 328 177
[11] Ibarz B, Cao H and Sanjuan M A F 2008 Phys. Rev. E 77 051918
[12] Shi X and Lu Q 2009 Physica A 388 2410
[13] Hess A, Yu L, Klein I et al 2013 PLoS ONE 8 e75740
[14] Courbage M and Nekorkin V I 2010 Int. J. Bifurcation Chaos Appl. Sci. Eng. 20 1631
[15] Strong S P, Koberle R, de Ruyter van Steveninck R R and Bialek W 1998 Phys. Rev. Lett. 80 197
[16] Schneidman E 2001 PhD Dissertation (Jerusalem: The Hebrew University)