Abstract

We propose a modified supervised learning algorithm for optical spiking neural networks that introduces synaptic time-delay plasticity on top of traditional weight training. Delay learning is combined with the remote supervised method, which is incorporated with photonic spike-timing-dependent plasticity. A spike sequence learning task implemented via the proposed algorithm achieves better performance than the traditional weight-based method. Moreover, the proposed algorithm is also applied to two benchmark data sets for classification. In a simple network structure with only a few optical neurons, the classification accuracy based on the delay-weight learning algorithm is significantly improved compared with weight-based learning. The introduction of delay adjustment improves the learning efficiency and performance of the algorithm, which is helpful for photonic neuromorphic computing and also important for understanding information processing in the biological brain.

© 2021 Chinese Laser Press

1. INTRODUCTION

As the improvement of traditional neural networks gradually approaches an upper limit, research has shifted toward neural networks with more biological realism. Spiking neural networks (SNNs), commonly known as the third generation of artificial neural networks (ANNs), are more biologically plausible than previous ANNs [1] and have attracted increasing attention in recent decades [2–6]. Spikes transmitted in biological neural networks enable the network to capture the rich dynamics of neurons and to integrate different information dimensions [3]. However, the manner of information representation and processing remains a controversial and challenging issue.

Rate coding is widely used in traditional SNNs; however, there is biological evidence that the precise timing of spikes also conveys information in nervous systems [7–9]. The precise timing of spikes enables higher information encoding capacity and lower power consumption [10–12], which are extremely important in the information processing of the human brain. However, the exact learning mechanism still remains an open problem [13]. It has been shown that in the cerebellum and the cerebellar cortex, there exist signals that act like an instructor helping the processing of information [14,15]. Several supervised learning algorithms have been proposed, upon which specific problems tightly related to neural processing, such as spike sequence learning and pattern recognition, have been solved successfully [16–20]. The remote supervised method (ReSuMe) is one such supervised learning algorithm, originally derived from the well-known Widrow-Hoff rule [17]. Based on photonic spike-timing-dependent plasticity (STDP) and anti-STDP rules, the synaptic weights can be adjusted to train the output neuron to fire spikes at the desired time.

Time-delayed transmission is an intrinsic feature of neural networks. Biological evidence shows that the transmission velocities in the nervous system can be modulated [21,22], for example, by changing the length and thickness of dendrites and axons [23]. The adjustability of both the delay and the weight of a synapse is referred to as synaptic plasticity. Delay plasticity has also been found to help a neuron change its firing behavior and synchronization, and it helps to understand the process of learning [24,25]. Delay selection and delay shift are two basic approaches incorporated in delay learning works [26,27]. To be specific, delay selection strengthens the weight of an optimal synapse among multiple subconnections, whereas delay shift is a more biologically plausible method that trains neurons to fire with coincident input spikes while keeping the weight constant. Recently, researchers found that the combined adjustment of delay and weight enhances the performance of an SNN [28–32]. In 2015, a delay learning remote supervised method (DL-ReSuMe) for spiking neurons was proposed to merge the delay shift approach and weight adjustment based on ReSuMe, by which the learning accuracy and learning speed are enhanced [30]. In 2018, Taherkhani et al. proposed to train both the weights and delays of excitatory and inhibitory neurons in a multilayer SNN to fire multiple desired spikes. Experimental evaluation on benchmark data sets shows that this method achieves higher classification accuracy than single-layer and similar multilayer SNNs [31]. In 2020, Zhang et al. investigated synaptic delay plasticity and proposed a novel learning method in which it was incorporated into two representative supervised learning methods, ReSuMe and the perceptron-based spiking neuron learning rule (PBSNLR); the combined methods were found to outperform the traditional synaptic weight learning methods [32].

For the sake of emulating realistic biological behaviors, SNN hardware realizations are designed to seek ultralow power consumption [5]. Devices for the implementation of the basic elements of an SNN, namely, artificial spiking neurons and synapses, have been achieved via complementary metal-oxide-semiconductors (CMOS), transistors, and the emerging nonvolatile memory technologies [33–38], etc. Photonic neuromorphic systems have attracted attention as a potential candidate for ultrafast processing applications. Besides its dynamical similarity to biological neurons [39], the semiconductor laser also surpasses its electronic counterpart in ultrafast response and low power consumption. Numerous studies on photonic synapses and photonic neurons [40–53] have laid a solid foundation for significant progress in photonic neuromorphic computing based on both software and hardware implementations [54–59]. In 2019, an all-optical SNN with self-learning capacity was physically implemented on a nanophotonic chip, which is capable of supervised and unsupervised learning [55]. In 2020, we proposed to solve XOR in an all-optical neuromorphic system with the inhibitory dynamics of a single photonic spiking neuron based on vertical-cavity surface-emitting lasers (VCSELs) with an embedded saturable absorber (VCSEL-SA) [58], and an all-optical spiking neural network based on VCSELs was also proposed for supervised learning and pattern classification [59]. However, as far as we know, delay learning has not yet been applied in photonic SNNs.

In this work, we propose to incorporate delay learning into the traditional algorithm based on an optical SNN. First, we propose a modified algorithm that combines delay learning with ReSuMe in a photonic SNN, adjusting synaptic weight and delay simultaneously. By implementing a spike sequence learning task, the proposed algorithm is verified to achieve better performance and learning efficiency than its weight-based counterpart. Then, the proposed algorithm is also implemented for classification, where two benchmarks, the Iris data set and the breast cancer data set, are adopted. By applying the delay-weight (DW)-based algorithm, the testing accuracy for both benchmarks is significantly improved (reaching 92%).

2. SYSTEM MODEL

A. Photonic Neurons and Synapses Based on VCSEL-SA and VCSOA

The schematic diagram of the DW-based supervised learning architecture is illustrated in Fig. 1, where the pre-synaptic neurons (PREs) of the input layer are fully connected to the single post-synaptic neuron (POST) in the output layer via photonic synapses with adjustable weight ω and delay d. The actual output of the POST and the target output are sent to the DW learning algorithm module, based on which ω and d can be adjusted independently to train the POST to fire spikes at a desired time. To implement spatiotemporal encoding, each PREi is stimulated by a pre-coded square-wave pulse whose central timing carries the time information. In a possible experiment, the modulated spikes from the PREs are sent into the DW module, which contains a programmable attenuator array for the modulation of weight and a programmable true time delay line (TTDL) array for the adjustment of delay. The time resolution could reach 20 ps for the TTDL devices [60], which might be sufficient for the realization of DW-ReSuMe. A possible control unit can be introduced to detect the precise spiking times of the PRE and POST photonic neurons and to calculate the variation of weight and delay for each synapse during a training cycle. The training process stops when the accuracy meets the requirement. Note that, given the technologies available, we think it may be more realistic to adopt an ex situ learning approach for the training process, as offline training is much easier. Once trained, the SNN can be directly used for inference implemented on optical hardware [61].

Fig. 1. Schematic diagram of DW-based learning in a single-layer photonic SNN.

The photonic neurons and synapses are the basic elements of a photonic SNN. A VCSEL-SA can mimic the spiking dynamics of a biological neuron [51], and a VCSEL operating below its threshold can serve as a vertical-cavity semiconductor optical amplifier (VCSOA) able to perform the STDP function [47]; together these provide the possibility of large-scale integration and low power consumption in a photonic SNN. In this work, the spiking dynamics are implemented via the excitable VCSEL-SA neurons. The rate equations of a VCSEL-SA are written as follows [57]:

$$\dot{S}_{i,o}=\Gamma_a g_a (n_a-n_{0a})S_{i,o}+\Gamma_s g_s (n_s-n_{0s})S_{i,o}-S_{i,o}/\tau_{\rm ph}+\beta B_r n_a^2,\tag{1}$$
$$\dot{n}_a=-\Gamma_a g_a (n_a-n_{0a})\left(S_{i,o}+\Phi_{{\rm pre},i}+\Phi_{{\rm post},o}\right)-n_a/\tau_a+I_a/(eV_a),$$
$$\Phi_{{\rm pre},i}=k_{ei}\tau_{\rm ph}\lambda_i P_{ei}(\tau_i,\Delta\tau)/(hcV_a),\qquad
\Phi_{{\rm post},o}=\sum_{i=1}^{n}\omega_i\lambda_i\tau_{\rm ph}P_i(t-d_i)/(hcV_a),\tag{2}$$
$$\dot{n}_s=-\Gamma_s g_s (n_s-n_{0s})S_{i,o}-n_s/\tau_s+I_s/(eV_s),\tag{3}$$
where i = 1, 2, ..., n indexes the PREs and o denotes the POST. The subscripts a and s represent the gain and absorber regions, respectively. Si,o(t) stands for the photon density in the cavity of the PREs and the POST, Γa (Γs) is the confinement factor of the gain (absorber) region, ga (gs) is the corresponding differential gain/loss, and na (ns) is the carrier density in the gain (absorber) region.

The term Φpre,i in Eq. (2) describes the pre-coded square-wave pulses injected as an external stimulus into the PREs, and Φpost,o represents the weighted sum of all pre-synaptic spikes fed into the POST. di is the adjustable transmission delay from PREi to the POST. kei, τi, and Δτ denote the strength, the central timing, and the temporal duration of the pulse, respectively, and Δτi = τi − τi−1 is the time interval between two adjacent input pulses. ωi is the coupling weight between PREi and the POST, which can be tuned according to the supervised training method. The output power of the PREs and the POST can be calculated by Pi,o(t) = ηcΓaSi,o(t)Vahc/(τphλi,o). In practice, ωi is calculated as an initial weight ω0 multiplied by a constant coefficient ωf = ηcΓahc/(τphλi,o) to match the optical system. The remaining parameters are the same for all neurons, as in Ref. [56]. The rate equations are numerically solved using the fourth-order Runge-Kutta method.
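As a rough illustration, the rate equations above can be integrated with the fourth-order Runge-Kutta scheme named in the text. The following Python sketch uses hypothetical parameter values chosen only for illustration (the actual values follow Ref. [56]); the function and variable names are our own.

```python
import numpy as np

# Hypothetical parameter values for illustration only; the actual values
# used in the paper follow Ref. [56] and are not reproduced here.
Gamma_a, Gamma_s = 0.06, 0.05        # confinement factors (gain, absorber)
g_a, g_s = 2.9e-12, 14.5e-12         # differential gain/loss (m^3 s^-1)
n0a, n0s = 1.1e24, 0.89e24           # transparency carrier densities (m^-3)
tau_ph, tau_a, tau_s = 4.8e-12, 1e-9, 100e-12  # photon/carrier lifetimes (s)
beta, Br = 1e-4, 1e-15               # spontaneous-emission factor, recombination coeff.
Ia, Is = 2e-3, 0.0                   # bias currents (A)
e = 1.6e-19                          # elementary charge (C)
Va, Vs = 2.4e-18, 2.4e-18            # active-region volumes (m^3)

def derivs(y, phi):
    """Right-hand side of the Yamada-type VCSEL-SA rate equations; phi is the
    total injected photon-density term (Phi_pre for a PRE, Phi_post for the POST)."""
    S, na, ns = y
    dS = (Gamma_a * g_a * (na - n0a) * S + Gamma_s * g_s * (ns - n0s) * S
          - S / tau_ph + beta * Br * na**2)
    dna = -Gamma_a * g_a * (na - n0a) * (S + phi) - na / tau_a + Ia / (e * Va)
    dns = -Gamma_s * g_s * (ns - n0s) * S - ns / tau_s + Is / (e * Vs)
    return np.array([dS, dna, dns])

def rk4_step(y, phi, dt):
    """One fourth-order Runge-Kutta step for the state y = [S, n_a, n_s]."""
    k1 = derivs(y, phi)
    k2 = derivs(y + 0.5 * dt * k1, phi)
    k3 = derivs(y + 0.5 * dt * k2, phi)
    k4 = derivs(y + dt * k3, phi)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
```

With a time step well below the photon lifetime (e.g., 0.1 ps), repeatedly calling `rk4_step` evolves the photon and carrier densities of one neuron under a given injection term.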

B. DW-Based Learning Algorithm

In most algorithms for tasks tightly related to neural processing, such as spike sequence learning and pattern recognition, only weight adjustment is considered, whereas the time delay from the PREs to the POST, which may play a vital role in brain computing [31], is rarely considered. Under this consideration, time-delay plasticity is combined with the ReSuMe algorithm. The weight and delay changes of the i-th synapse after each training epoch are

$$\Delta\omega_i=(n_d-n_o)+\sum_{t_d}\sum_{t_i\le t_d}\Delta\omega_{\rm STDP}(t_d-t_i)+\sum_{t_o}\sum_{t_i\le t_o}\Delta\omega_{\rm aSTDP}(t_o-t_i),\tag{4}$$
$$\Delta d_i=(D_{id}-D_{io});\qquad D_{id}=t_d-t_i,\quad D_{io}=t_o-t_i,\tag{5}$$
where nd and no are the spike numbers of the desired and actual output spike sequences, and ti, td, and to denote the input, target, and output spiking times, respectively. A schematic illustration of ReSuMe is shown in Fig. 2, where Δω depends on three parts: a non-Hebbian term, the difference between the target spiking time td and the input time ti, and the difference between the actual output spiking time to and ti. The first term of Eq. (4) is the non-Hebbian term, which adjusts the average strength of the input synapses to accelerate training. ΔωSTDP(td−ti) and ΔωaSTDP(to−ti) in Eq. (4) are the photonic STDP and anti-STDP learning rules denoting synaptic potentiation (depression), which can be calculated by [43,57]
$$\Delta\omega_{\rm STDP}(t_d-t_i)=\begin{cases}\Delta\omega_o(\Delta t), & \text{if } t_d-t_i>0\\ 0, & \text{if } t_d-t_i\le 0,\end{cases}\tag{6}$$
$$\Delta\omega_{\rm aSTDP}(t_o-t_i)=\begin{cases}-\Delta\omega_o(\Delta t), & \text{if } t_o-t_i>0\\ 0, & \text{if } t_o-t_i\le 0.\end{cases}\tag{7}$$

Fig. 2. Schematic illustration of the ReSuMe incorporated with optical STDP rule. i, d, and o denote the input, the target, and the output, respectively.

The conditions ti ≤ td and ti ≤ to in the Hebbian terms indicate that the rule only modifies inputs that contribute to the neuron state before the desired or actual output firing time, while neglecting inputs that fire afterward, which leads to better performance of the proposed DW-based algorithm.

The term Did (Dio) is the distance between ti and td (to). The delay adjustment is thus based on the distance between the input spikes and the target (output) spikes, in a way similar to that of the synaptic weight. Note that both Δω and Δd approach 0 if the POST fires at the desired times, which ensures the convergence of the proposed algorithm. In addition, the synaptic weights (delays) are adjusted with a learning rate ηω (ηd) and within a learning window Tω (Td): the weight and delay of the i-th synapse are adjusted only if the input spike distance Did is less than the corresponding learning window. Finally, the weight and delay of the i-th synapse are updated by

$$\omega_i(x+1)=\omega_i(x)+\eta_\omega\Delta\omega_i,\tag{8}$$
$$d_i(x+1)=d_i(x)+\eta_d\Delta d_i,\tag{9}$$
where the term x denotes the training cycle. In general, ηd=0.5 contributes to better performance, while other parameters should be carefully selected according to different tasks.
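The weight and delay updates above can be sketched in a few lines. The following Python function is a simplified single-synapse illustration of the rules, assuming paired target/output spikes and a constant STDP amplitude `dw0` standing in for the optical STDP curve of Refs. [43,57]; the function name and default values are our own.

```python
import numpy as np

def dw_resume_update(t_in, t_target, t_out, w, d,
                     eta_w=0.1, eta_d=0.5, T_w=3.0, T_d=np.inf, dw0=0.01):
    """One DW-ReSuMe training update for a single synapse (a sketch).
    t_in, t_target, t_out: lists of input, desired, and actual spike times (ns);
    dw0 is an assumed constant STDP amplitude, not the optical STDP curve itself."""
    # Non-Hebbian term of Eq. (4): difference of desired and actual spike counts
    dw = float(len(t_target) - len(t_out))
    # STDP term: potentiate inputs preceding a target spike (within window T_w)
    for td in t_target:
        for ti in t_in:
            if 0 < td - ti < T_w:
                dw += dw0
    # Anti-STDP term: depress inputs preceding an actual output spike
    for to in t_out:
        for ti in t_in:
            if 0 < to - ti < T_w:
                dw -= dw0
    # Delay term: shift the delayed input toward the target firing time,
    # applied only when the input-target distance lies inside the window T_d
    dd = 0.0
    for td, to in zip(t_target, t_out):
        for ti in t_in:
            if abs(td - ti) < T_d:
                dd += (td - ti) - (to - ti)   # D_id - D_io, i.e., td - to
    # Apply the learning rates
    return w + eta_w * dw, d + eta_d * dd
```

For example, with one input spike at 4 ns, a target at 8 ns, and an actual output at 7 ns, the delay is nudged forward by ηd × 1 ns so the shifted input moves toward the target time.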

A simple case of delay learning is illustrated in Fig. 3. Consider two PREs that fire a spike at t1 and t2, respectively. After delayed transmission through the synapses, the actual input spikes are located at t1+d1 and t2+d2 [see Fig. 3(a1)]. The principle of delay learning is to shift the actual input spikes toward the desired time, also called coherence learning, in which several coherent input spikes trigger the POST to fire a spike very shortly after the input time [Figs. 3(a2) and 3(b2)], whereas a single input spike cannot trigger a spike at the desired time [Fig. 3(b1)].

Fig. 3. (a1) and (b1) Input pattern and output pattern before delay adjustment. (a2) and (b2) After 7 training epochs.

3. RESULTS AND DISCUSSION

The learning capability of a single neuron is closely related to the properties of a neural network. In Fig. 4, we compare the performance of the weight-based ReSuMe and the DW-ReSuMe in learning a single spike, where Ni = No = 1, d = 2 ns, and td = 8 ns. To quantify the convergence rate, the spike sequence distance (SSD) (defined in Ref. [57]) is adopted. The learning window Tω is 3 ns, while Td is not constrained. The input spiking time ti varies from 4 ns to 7 ns. The time range within which an input pattern can be successfully learned, denoted as the valid input window, indicates the learning ability of a neuron. The SSD at different training epochs is presented in Fig. 4 for the two algorithms. At the 50th training epoch, the valid input window for ReSuMe and DW-ReSuMe is 0.3 ns (4.4–4.7 ns) and 2.4 ns (4.4–6.8 ns), respectively, as shown in Figs. 4(a) and 4(b). With DW-ReSuMe, the simple network is thus able to learn a single input spike over a wider time range and with higher precision. Not surprisingly, the valid input window also widens slightly as the number of learning epochs increases. Note that weight learning is crucial when the single input is too weak to trigger an output spike in the POST. Here, we emphasize the influence of the initial weight ω0 and the weight learning rate ηω on the performance of DW-based learning. When ω0 is too small, a larger weight learning rate is required to generate an output spike before delay learning has shifted the input spiking time right to that of the target. As can be seen in Fig. 4(c), the valid input window is widened by increasing the learning rate from 0.1×ω0 to 0.2×ω0. In addition, for a large ω0, a smaller ηω is required to avoid missing the optimal solution, as indicated in Fig. 4(d), where ω0 ranges from a unit value of 0.1 to 2 and a smaller ηω contributes to a larger valid input window. We therefore suggest a relatively larger weight learning rate ηω and a smaller initial weight ω0 for spike sequence learning. For a classification problem, ω0 should be large enough to trigger a spike of the PREs, while a smaller ηω is necessary to obtain higher accuracy.

Fig. 4. Comparison of the learning capability of a single neuron based on (a) weight-based ReSuMe and (b) DW-ReSuMe. The value of SSD after the 50th, 100th, and 300th training epoch is presented for different ti. (c) The valid input window as a function of ηω for different ω0 based on DW-ReSuMe. (d) The valid input window as a function of ω0 for different ηω based on DW-ReSuMe. n=1, td=8ns.

However, note that if the desired output contains more than one spike, the delay learning window Td has to be limited to within the minimum inter-spike interval (ISI) to maintain stable performance. Since some of the input spikes are shifted toward a certain target spike, an additional consideration in DW-ReSuMe is that the injection power into the output neuron should be limited, both to protect the photonic neuron devices and to preserve the spiking dynamics.

A. Spike Sequence Learning

Next, a spike sequence learning task is implemented via a single-layer photonic SNN. Both DW-ReSuMe and ReSuMe are considered for comparison. The input layer includes 60 PREs stimulated via pre-coded rectangular pulses with a time interval Δτi of 0.1 ns, each connected to the POST via a photonic synapse. In Fig. 5 we show the learning process of the two algorithms. The black line in Fig. 5(a1) describes the carrier density na of the POST after training for 300 epochs based on DW-ReSuMe. When na exceeds a threshold value (marked by the red dotted line), a spike is emitted. We can see that after training, the POST is able to fire accurately at the desired time (denoted by the blue dotted line). The training process is further illustrated in Fig. 5(b1), where the SSD converges quickly from 2 to 0 within about 20 training epochs.
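For reference, a pre-coded rectangular stimulus pulse of the kind described above (cf. Φpre in Eq. (2)) might be generated as follows; the pulse width and strength here are assumed illustration values, not the simulation parameters of the paper.

```python
def square_pulse(t, t_center, width=0.2, strength=1.0):
    """Rectangular stimulus pulse for a PRE, centered at the encoding time
    t_center (ns). Width and strength are assumed values for illustration."""
    return strength if (t_center - width / 2) <= t < (t_center + width / 2) else 0.0
```

Sampling this function on the simulation time grid yields the external drive whose central timing carries the encoded information.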

Fig. 5. (a1) Carrier density of the POST after training and (b1) the evolution of output spikes based on the DW-ReSuMe; (a2) and (b2) those based on ReSuMe. The black solid line is na and the red solid line represents P.

The evolution of the synaptic weights and delays is shown in Fig. 6. The initial weights and delays of all neurons are identical, as indicated in Figs. 6(a) and 6(b), respectively. After training, some of the weights and delays are potentiated, while others are depressed or hardly changed. Interestingly, for some synapses the weights and (or) delays change markedly during the first six training epochs. Such fluctuations during training are mainly caused by the combined effect of delay learning and weight learning: when the output spiking time to initially lies far before the target td, the update amounts of both weight and delay are relatively large because the distance between to and td is large. In this case, the combined adjustment of weight and delay may push to behind td, yet closer to td than in the previous epoch. Note that the weight-based ReSuMe is not able to solve this problem in an SNN with 60 PREs, as shown in Figs. 5(a2) and 5(b2). The results show that, combined with delay learning, DW-ReSuMe is more powerful than the weight-based algorithm.

Fig. 6. Evolution of (a) synaptic weights ωi and (b) delays di during the first 20 training epochs.

Moreover, we found that for target spike trains with arbitrary, nonuniform ISIs, DW-ReSuMe also performs better than the traditional weight-based algorithm. Here, two cases are considered (refer to Ref. [57]). Figure 7 illustrates the learning process and the evolution of the SSD during 300 learning epochs. The desired spike sequence is [10 ns, 12 ns, 14 ns, 18 ns, 20 ns, 22 ns, 24 ns, 26 ns, 29 ns] in Figs. 7(a1) and 7(a2) and [10 ns, 11 ns, 13 ns, 14.5 ns, 17 ns, 21 ns, 23 ns, 25.5 ns, 27 ns] in Figs. 7(b1) and 7(b2). In both cases, as the learning epochs increase, the POST gradually learns to produce spikes at the desired times, and the SSD decreases to approximately 0 after about 20 learning epochs and stays steady. Compared with our previous work [57], the network based on the DW-ReSuMe algorithm has better learning ability than that based on the weight-based ReSuMe.

Fig. 7. Learning spike sequences with nonuniform ISIs. (a1) and (b1) The evolution of output spikes for the spike sequences [10 ns, 12 ns, 14 ns, 18 ns, 20 ns, 22 ns, 24 ns, 26 ns, 29 ns] and [10 ns, 11 ns, 13 ns, 14.5 ns, 17 ns, 21 ns, 23 ns, 25.5 ns, 27 ns], respectively. (a2) and (b2) The evolution of the corresponding distance.

B. Fisher’s Iris Data Set

Fisher’s Iris flower data set (Fisher, 1936) is a classic benchmark of pattern recognition that contains three classes of Iris flowers with a total of 150 case entries [31]. One class is linearly separable from the other two, while the other two classes are linearly inseparable. Four measurements are used to describe and differentiate the three classes; each measurement is directly encoded into a single spike firing at a specific time, linearly rescaled into the interval [5 ns, 10 ns]. The network comprises 4 PREs and a single POST. The encoding spikes are fed into the 4 PREs of the SNN via synapses with adjustable delay and weight. The initial delays are 2 ns for all four input neurons, and the initial weight ω0 of each synapse is randomly selected as a constant coefficient (0.1 here) multiplied by a random number from [0, 1]. In this case, nearly all of the PRE neurons generate just a single spike in response to each input pattern. The weight learning rate ηω is 0.01 and decays by half every 20 training epochs to enhance the convergence of learning. The output of the network is represented by the precise spiking time of the POST, which is trained to fire a desired spike at a different time for each class. Here, the target spikes for the three classes are 8 ns, 9 ns, and 10 ns, respectively. If the output spike falls within 40% of the target-spike interval around a certain td, the input entry is classified into the corresponding class.
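The temporal encoding and the classification criterion described in this paragraph can be sketched as follows; the helper names and the per-feature minimum/maximum arguments are our own, and the 1 ns target spacing follows the 8/9/10 ns targets given in the text.

```python
def encode_feature(x, x_min, x_max, t_min=5.0, t_max=10.0):
    """Linearly rescale one Iris measurement into a spike time within
    [5 ns, 10 ns]; x_min/x_max are taken as the per-feature extremes over
    the data set (an assumption of this sketch)."""
    return t_min + (x - x_min) * (t_max - t_min) / (x_max - x_min)

def classify(t_out, targets=(8.0, 9.0, 10.0), tol=0.4):
    """Assign an entry to class k when the POST output spike falls within
    40% of the 1 ns target interval around t_d of class k; None means the
    entry is left unclassified (counted as an error)."""
    for k, t_d in enumerate(targets):
        if abs(t_out - t_d) < tol * 1.0:  # 1 ns spacing between target times
            return k
    return None
```

For instance, a feature at its minimum maps to a 5 ns spike and at its maximum to a 10 ns spike; an output spike at 9.1 ns is then assigned to the second class.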

In our scheme, following Ref. [31], 50% of the Iris data are used for training and the rest for testing. The training and testing processes are implemented through simulation in MATLAB. In each training epoch, all entries of the training data set are injected into the input neurons of the SNN successively. Based on the learning algorithm, the Δωi (Δdi) of the i-th input synapse is calculated for each entry and summed as ΔWi (ΔDi). After each training epoch, the mean value of ΔWi (ΔDi) is used for the weight (delay) update; namely, the actual update amount for the weight (delay) is Δωi = ΔWi/Ntrain (Δdi = ΔDi/Ntrain), where Ntrain = 75 is the size of the training data set. After each training epoch, the updated weights and delays are also used for testing. The accuracy is defined as the number of correctly classified entries divided by the total number of entries. Both training and testing accuracy with the different learning algorithms are presented in Fig. 8. Based on DW-ReSuMe, the accuracy rises rapidly at first and gradually approaches 96% for the training data set and 92% for the testing data set, as indicated by the red solid line.
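The batch-averaged update described in this paragraph can be written compactly; the following is a sketch with hypothetical array shapes (one row of per-entry increments per training example, one column per synapse).

```python
import numpy as np

def epoch_update(w, d, delta_w_entries, delta_d_entries, n_train=75):
    """Batch update for the Iris task: per-entry increments are summed over
    the training set and divided by N_train before being applied, as in the
    text. delta_*_entries have shape (n_train, n_synapses)."""
    w_new = w + np.sum(delta_w_entries, axis=0) / n_train
    d_new = d + np.sum(delta_d_entries, axis=0) / n_train
    return w_new, d_new
```

Averaging over the full training set smooths out per-entry noise in the updates, at the cost of one weight/delay change per epoch rather than per entry.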

Fig. 8. (a) Training accuracy and (b) testing accuracy varying with training epochs for weight-based ReSuMe (blue solid line) and DW-ReSuMe (red solid line). Td=1ns, Tω=4ns. The blue dotted line indicates an accuracy of 90%.

For a detailed illustration of the classification results, scatter plots of the target spiking time td and the actual output spiking time to are shown in Figs. 9(a) and 9(b) for the training and testing data sets, respectively. In contrast, based on the same SNN architecture and the same initial weights and delays, the accuracy of the weight-based ReSuMe (blue solid line) only reaches 65%, even though a higher learning rate (ηω = 0.1) is required to make the algorithm work efficiently. The results indicate that the performance can be greatly enhanced by introducing time-delay learning.

Fig. 9. Illustration of the classification results for the (a) training data set and (b) testing data set. The orange circles denote the target spiking times, the blue squares represent the actual spiking times, and misclassified samples are highlighted in bright blue.

Moreover, we also investigate the effect of the learning windows on the classification accuracy. The testing accuracy as a function of the weight learning window Tω and the delay learning window Td is shown in Figs. 10(a) and 10(b), respectively. A relatively larger Tω is required for achieving higher accuracy, whereas Td should be set to the minimum ISI of the target spikes, which is 1 ns here. Note that the accuracy can reach more than 85% even without any weight adjustment. These results indicate that delay learning is an extremely efficient learning mechanism in a photonic SNN with temporal encoding, and they suggest that delay learning may be an essential mechanism in biological spiking neuron systems.

Fig. 10. Testing accuracy as a function of (a) weight learning window Tω and (b) delay learning window Td.

C. Breast Cancer Data Set

The breast cancer data set contains 608 case entries divided into benign and malignant cases, each described by nine measurements [62]. Five significant measurements are selected and pre-encoded into rectangular pulses, which trigger the input neurons to fire spikes at specific times within the range of 6.5–11 ns. The network contains 5 PREs and 1 POST. The initial delays are 0 ns for all 5 PREs, and the initial weights are randomly selected using the same method as for the Iris data set. The weight learning rate ηω is 0.0001. The SNN is trained to fire a spike at 9 ns for the first class and at 13 ns for the second class. Four hundred eight entries are used to train the network, and the rest are used for testing. The accuracy on the training and testing data sets reaches 93% and 92%, respectively, as shown in Figs. 11(a) and 11(b). The network is trained for over 100 epochs, and the accuracy converges within about 20 training epochs. With weight-based ReSuMe, however, the classification accuracy is less than 30% with the same operating parameters (not shown here). It is worth noting that in this case the ReSuMe algorithm does not work efficiently, limited by the simple network structure and encoding scheme; its accuracy mainly depends on the initial synaptic delay d0. In Fig. 11, we also present the training results based on ReSuMe (blue solid line) with an initial delay of 0.5 ns; the weight learning rate there is 0.01.

Fig. 11. (a) Training accuracy and (b) testing accuracy varying with training epochs based on DW-ReSuMe (red solid line) and ReSuMe (blue solid line), respectively. Td=4ns, Tω=5ns.

Moreover, considering the effect of the initial delay d0 on the training performance of ReSuMe and DW-ReSuMe, Fig. 12 shows the accuracy on the two data sets after 60 training epochs as a function of the initial delay for the two algorithms. Both training and testing results are considered. Figures 12(a1) and 12(a2) show the training and testing accuracy for the Iris data set, respectively, and the results for the breast cancer data set are shown in Figs. 12(b1) and 12(b2). Clearly, the proposed DW-based algorithm is less sensitive to the initial delay. Note that for the Iris data set, the learning accuracy of DW-ReSuMe is low when d0 is less than 1.5 ns, which is mainly constrained by the relatively small delay learning window (1 ns).

Fig. 12. (a1) Training accuracy and (a2) testing accuracy for the Iris data set after 60 training epochs with different initial delay d0. (b1) and (b2) The results for the breast cancer data set.

Finally, note that in Fig. 11 there is a downward fluctuation as the accuracy approaches its stable value. The fluctuation may be related to the learning rate: a larger learning rate usually generates more pronounced fluctuations, as illustrated in Figs. 13(a1) and 13(a2), where the training accuracy fluctuates more strongly for a larger weight learning rate ηω. However, the pronounced downward fluctuation at about the 10th training cycle is not affected by the weight learning rate. We can reasonably assume that this fluctuation is caused by the relatively large constant delay learning rate (ηd = 0.5). Although this value contributes to better training performance in different tasks, we also consider a decaying ηd that is reduced by half about every five training epochs. The results are presented in Figs. 13(b1) and 13(b2), from which we can see that the fluctuation disappears; however, not surprisingly, the accuracy grows much more slowly.
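The decaying delay-learning-rate schedule discussed here can be expressed, under the assumption of a strict halving every five epochs, as a one-line function (the name is ours):

```python
def eta_d_schedule(epoch, eta0=0.5, halve_every=5):
    """Decaying delay learning rate: eta_d starts at 0.5 and is halved
    about every five training epochs, the schedule considered in the text."""
    return eta0 * 0.5 ** (epoch // halve_every)
```

This trades the late-stage accuracy fluctuations caused by a large constant ηd for a slower initial rise in accuracy.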

Fig. 13. Learning accuracy of the breast cancer data set based on DW-ReSuMe with different cases of ηd. The left column corresponds to the training accuracy with (a1) constant ηd and with (b1) decaying ηd. (a2) and (b2) The right column shows the corresponding results of testing accuracy. Td=4ns, Tω=5ns.

DW-based learning has shown excellent performance in single-layer networks. However, real neural networks are usually hierarchical, and synaptic weights and delays can be modulated through different biological mechanisms, which form the foundation of complex brain functions. How to effectively introduce delay adjustment into deep learning networks remains an interesting and challenging question.

4. CONCLUSION

This paper proposed a supervised DW learning method in an optical SNN. Based on the precise timing of spikes, delay learning shifts the synaptic delays according to the desired and actual output timing so that coherent inputs arrive at the POST together. The proposed DW-ReSuMe is successfully applied to spike sequence learning and to two classification benchmarks, the Iris data set and the breast cancer data set. The performance of the SNN is significantly improved compared with its weight-based counterpart. By introducing time-delay learning in a photonic SNN, fewer optical neurons are required to solve different tasks, which is significant for photonic neuromorphic computing. The results also suggest that synaptic delay and weight may act as a combined learning mechanism in real biological neural networks, which deserves deeper investigation.

Funding

National Outstanding Youth Science Fund Project of National Natural Science Foundation of China (62022062); National Natural Science Foundation of China (61674119, 61974177).

Disclosures

The authors declare no conflicts of interest.

REFERENCES

1. S. Ghosh-Dastidar and H. Adeli, “Spiking neural networks,” Int. J. Neural Syst. 19, 295–308 (2009). [CrossRef]  

2. R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris Jr., M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007). [CrossRef]  

3. N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013). [CrossRef]  

4. K. Roy, J. Akhilesh, and P. Priyadarshini, “Towards spike-based machine intelligence with neuromorphic computing,” Nature 575, 607–617 (2019). [CrossRef]  

5. W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020). [CrossRef]  

6. T. Clarence, M. Arlija, and N. Kasabov, “Spiking neural networks: background, recent development and the NeuCube architecture,” Neural Process. Lett. 3, 1675–1701 (2020). [CrossRef]  

7. A. Cariani, “Temporal codes and computations for sensory representation and scene analysis,” IEEE Trans. Neural Netw. 15, 1100–1111 (2004). [CrossRef]  

8. S. M. Bohte, “The evidence for neural information processing with precise spike-times: a survey,” Natural Comput. 3, 195–206 (2004). [CrossRef]  

9. A. Mohemmed and S. Schliebs, “Training spiking neural networks to associate spatio-temporal input-output spike patterns,” Neurocomputing 107, 3–10 (2013). [CrossRef]  

10. S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998). [CrossRef]  

11. S. B. Laughlin, “Energy as a constraint on the coding and processing of sensory information,” Curr. Opinion Neurobiol. 11, 475–480 (2001). [CrossRef]  

12. H. Paugam-Moisy and S. Bohte, “Computing with spiking neuron networks,” in Handbook of Natural Computing (Springer, 2012), pp. 335–376.

13. J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013). [CrossRef]  

14. F. Ponulak and A. Kasinski, “Introduction to spiking neural networks: information processing, learning and applications,” Acta Neurobiol. Experim. 71, 409–433 (2011).

15. H. Jörntell and C. Hansel, “Synaptic memories upside down: bidirectional plasticity at cerebellar parallel fiber-Purkinje cell synapses,” Neuron 52, 227–238 (2006). [CrossRef]  

16. R. Gütig and H. Sompolinsky, “The tempotron: a neuron that learns spike timing-based decisions,” Nat. Neurosci. 9, 420–428 (2006). [CrossRef]  

17. F. Ponulak and A. Kasinski, “Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting,” Neural Comput. 22, 467–510 (2010). [CrossRef]  

18. I. Sporea and A. Grüning, “Supervised learning in multilayer spiking neural networks,” Neural Comput. 25, 473–509 (2013). [CrossRef]  

19. S. R. Kulkarni and B. Rajendran, “Spiking neural networks for handwritten digit recognition—supervised learning and network optimization,” Neural Netw. 103, 118–127 (2018). [CrossRef]  

20. C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020). [CrossRef]  

21. S. Boudkkazi, L. Fronzaroli-Molinieres, and D. Debanne, “Presynaptic action potential waveform determines cortical synaptic latency,” J. Physiol. 589, 1117–1131 (2011). [CrossRef]  

22. J. W. Lin and D. S. Faber, “Modulation of synaptic delay during synaptic plasticity,” Trends Neurosci. 25, 449–455 (2002). [CrossRef]  

23. C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999). [CrossRef]  

24. X. B. Gong, Y. B. Wang, and B. Ying, “Delay-induced firing behavior and transitions in adaptive neuronal networks with two types of synapses,” Sci. China Chem. 56, 222–229 (2012). [CrossRef]  

25. M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004). [CrossRef]  

26. S. Ghosh-Dastidar and H. Adeli, “Improved spiking neural networks for EEG classification and epilepsy and seizure detection,” Integr. Comput.-Aided Eng. 14, 187–212 (2007). [CrossRef]  

27. P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005). [CrossRef]  

28. S. Ghosh-Dastidar and H. Adeli, “A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection,” Neural Netw. 22, 1419–1431 (2009). [CrossRef]  

29. A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

30. A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neur. Netw. Learn. Syst. 26, 3137–3149 (2015). [CrossRef]  

31. A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neur. Netw. Learn. Syst. 29, 5394–5407 (2018). [CrossRef]  

32. M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020). [CrossRef]  

33. W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016). [CrossRef]  

34. I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017). [CrossRef]  

35. J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018). [CrossRef]  

36. B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018). [CrossRef]  

37. Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

38. S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019). [CrossRef]  

39. J. Ohtsubo, R. Ozawa, and M. Nanbu, “Synchrony of small nonlinear networks in chaotic semiconductor lasers,” Jpn. J. Appl. Phys. 54, 072702 (2015). [CrossRef]  

40. A. Hurtado, I. D. Henning, and M. J. Adams, “Optical neuron using polarization switching in a 1550 nm-VCSEL,” Opt. Express 18, 25170–25176 (2010). [CrossRef]  

41. A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012). [CrossRef]  

42. M. P. Fok, Y. Tian, D. Rosenbluth, and P. R. Prucnal, “Pulse lead/lag timing detection for adaptive feedback and control based on optical spike-timing-dependent plasticity,” Opt. Lett. 38, 419–421 (2013). [CrossRef]  

43. Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016). [CrossRef]  

44. S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017). [CrossRef]  

45. T. Deng, J. Robertson, and A. Hurtado, “Controlled propagation of spiking dynamics in vertical-cavity surface-emitting lasers: towards neuromorphic photonic networks,” IEEE J. Sel. Top. Quantum Electron. 23, 1800408 (2017). [CrossRef]  

46. Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wrigh, and H. Bhaskara, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017). [CrossRef]  

47. S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018). [CrossRef]  

48. Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018). [CrossRef]  

49. I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018). [CrossRef]  

50. I. Chakraborty, G. Saha, and K. Roy, “Photonic in-memory computing primitive for spiking neural networks using phase-change materials,” Phys. Rev. Appl. 11, 014063 (2019). [CrossRef]  

51. Y. H. Zhang, S. Y. Xiang, X. Guo, A. Wen, and Y. Hao, “All-optical inhibitory dynamics in photonic neuron based on polarization mode competition in a VCSEL with an embedded saturable absorber,” Opt. Lett. 44, 1548–1551 (2019). [CrossRef]  

52. J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020). [CrossRef]  

53. B. W. Ma and W. W. Zou, “Demonstration of a distributed feedback laser diode working as a graded-potential-signaling photonic neuron and its application to neuromorphic information processing,” Sci. China Inf. Sci. 63, 160408 (2020). [CrossRef]  

54. S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019). [CrossRef]  

55. J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019). [CrossRef]  

56. C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019). [CrossRef]  

57. Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020). [CrossRef]  

58. S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020). [CrossRef]  

59. S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

60. S. Moallemi, R. Welker, and J. Kitchen, “Wide band programmable true time delay block for phased array antenna applications,” in IEEE Dallas Circuits and Systems Conference (DCAS) (2016), pp. 1–4.

61. G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020). [CrossRef]  

62. A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995). [CrossRef]  

[Crossref]

2006 (2)

H. Jörntell and C. Hansel, “Synaptic memories upside down: bidirectional plasticity at cerebellar parallel fiber-Purkinje cell synapses,” Neuron 52, 227–238 (2006).
[Crossref]

R. Gütig and H. Sompolinsky, “The tempotron: a neuron that learns spike timing-based decisions,” Nat. Neurosci. 9, 420–428 (2006).
[Crossref]

2005 (1)

P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005).
[Crossref]

2004 (3)

M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004).
[Crossref]

A. Cariani, “Temporal codes and computations for sensory representation and scene analysis,” IEEE Trans. Neural Netw. 15, 1100–1111 (2004).
[Crossref]

S. M. Bohte, “The evidence for neural information processing with precise spike-times: a survey,” Natural Comput. 3, 195–206 (2004).
[Crossref]

2002 (1)

J. W. Lin and D. S. Faber, “Modulation of synaptic delay during synaptic plasticity,” Trends Neurosci. 25, 449–455 (2002).
[Crossref]

2001 (1)

S. B. Laughlin, “Energy as a constraint on the coding and processing of sensory information,” Curr. Opinion Neurobiol. 11, 475–480 (2001).
[Crossref]

1999 (1)

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

1998 (1)

S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998).
[Crossref]

1995 (1)

A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995).
[Crossref]

Abu, S.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Adams, M.

A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012).
[Crossref]

Adams, M. J.

Adeli, H.

S. Ghosh-Dastidar and H. Adeli, “Spiking neural networks,” Int. J. Neural Syst. 19, 295–308 (2009).
[Crossref]

S. Ghosh-Dastidar and H. Adeli, “A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection,” Neural Netw. 22, 1419–1431 (2009).
[Crossref]

S. Ghosh-Dastidar and H. Adeli, “Improved spiking neural networks for EEG classification and epilepsy and seizure detection,” Integr. Comput.-Aided Eng. 14, 187–212 (2007).
[Crossref]

Adibi, P.

P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005).
[Crossref]

Akhilesh, J.

K. Roy, J. Akhilesh, and P. Priyadarshini, “Towards spike-based machine intelligence with neuromorphic computing,” Nature 575, 607–617 (2019).
[Crossref]

Anderson, J. C.

S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998).
[Crossref]

Arlija, M.

T. Clarence, M. Arlija, and N. Kasabov, “Spiking neural networks: background, recent development and the NeuCube architecture,” Neural Process. Lett. 3, 1675–1701 (2020).
[Crossref]

Bao, L.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Beeman, D.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Belatreche, A.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neur. Netw. Learn. Syst. 29, 5394–5407 (2018).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neur. Netw. Learn. Syst. 26, 3137–3149 (2015).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

Bhaskara, H.

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wrigh, and H. Bhaskara, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

Bhaskaran, H.

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Bipin, R.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Bohte, S.

H. Paugam-Moisy and S. Bohte, “Computing with spiking neuron networks,” in Handbook of Natural Computing (Springer, 2012), pp. 335–376.

Bohte, S. M.

S. M. Bohte, “The evidence for neural information processing with precise spike-times: a survey,” Natural Comput. 3, 195–206 (2004).
[Crossref]

Boudkkazi, S.

S. Boudkkazi, L. Fronzaroli-Molinieres, and D. Debanne, “Presynaptic action potential waveform determines cortical synaptic latency,” J. Physiol. 589, 1117–1131 (2011).
[Crossref]

Bower, J. M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Brette, R.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Bueno, J.

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

Cai, Y. M.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Cappy, A.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Cariani, A.

A. Cariani, “Temporal codes and computations for sensory representation and scene analysis,” IEEE Trans. Neural Netw. 15, 1100–1111 (2004).
[Crossref]

Carlos, R.

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wrigh, and H. Bhaskara, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

Carnevale, T.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Chakraborty, I.

I. Chakraborty, G. Saha, and K. Roy, “Photonic in-memory computing primitive for spiking neural networks using phase-change materials,” Phys. Rev. Appl. 11, 014063 (2019).
[Crossref]

I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018).
[Crossref]

Che, Y.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Chen, Y. Y.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Chen, Z. G.

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wrigh, and H. Bhaskara, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

Cheng, Z.

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Chua, Y. S.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

Clarence, T.

T. Clarence, M. Arlija, and N. Kasabov, “Spiking neural networks: background, recent development and the NeuCube architecture,” Neural Process. Lett. 3, 1675–1701 (2020).
[Crossref]

Cowan, J. D.

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Danneville, F.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Davison, A. P.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

de Ruyter van Steveninck, R. R.

S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998).
[Crossref]

Debanne, D.

S. Boudkkazi, L. Fronzaroli-Molinieres, and D. Debanne, “Presynaptic action potential waveform determines cortical synaptic latency,” J. Physiol. 589, 1117–1131 (2011).
[Crossref]

Deng, B.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Deng, T.

T. Deng, J. Robertson, and A. Hurtado, “Controlled propagation of spiking dynamics in vertical-cavity surface-emitting lasers: towards neuromorphic photonic networks,” IEEE J. Sel. Top. Quantum Electron. 23, 1800408 (2017).
[Crossref]

Denz, C.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Destexhe, A.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Dhamala, M.

M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004).
[Crossref]

Dhoble, K.

N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013).
[Crossref]

Diesmann, M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Ding, M.

M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004).
[Crossref]

Djurfeldt, M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Dong, B. Y.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

El Boustani, S.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Englund, D.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Ermentrout, B.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Ernst, U.

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Eurich, C. W.

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Evangelos, E.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Faber, D. S.

J. W. Lin and D. S. Faber, “Modulation of synaptic delay during synaptic plasticity,” Trends Neurosci. 25, 449–455 (2002).
[Crossref]

Fan, S.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Feldmann, J.

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

Fok, M. P.

Fronzaroli-Molinieres, L.

S. Boudkkazi, L. Fronzaroli-Molinieres, and D. Debanne, “Presynaptic action potential waveform determines cortical synaptic latency,” J. Physiol. 589, 1117–1131 (2011).
[Crossref]

Gallo, M. L.

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Gao, B.

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Ghosh-Dastidar, S.

S. Ghosh-Dastidar and H. Adeli, “Spiking neural networks,” Int. J. Neural Syst. 19, 295–308 (2009).
[Crossref]

S. Ghosh-Dastidar and H. Adeli, “A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection,” Neural Netw. 22, 1419–1431 (2009).
[Crossref]

S. Ghosh-Dastidar and H. Adeli, “Improved spiking neural networks for EEG classification and epilepsy and seizure detection,” Integr. Comput.-Aided Eng. 14, 187–212 (2007).
[Crossref]

Gigan, S.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Gong, J. K.

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

Gong, X. B.

X. B. Gong, Y. B. Wang, and B. Ying, “Delay-induced firing behavior and transitions in adaptive neuronal networks with two types of synapses,” Sci. China Chem. 56, 222–229 (2012).
[Crossref]

Goodman, P. H.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Govil, S.

A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995).
[Crossref]

Grüning, A.

I. Sporea and A. Grüning, “Supervised learning in multilayer spiking neural networks,” Neural Comput. 25, 473–509 (2013).
[Crossref]

Guo, X.

Guo, X. X.

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Gütig, R.

R. Gütig and H. Sompolinsky, “The tempotron: a neuron that learns spike timing-based decisions,” Nat. Neurosci. 9, 420–428 (2006).
[Crossref]

Han, G. Q.

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Hansel, C.

H. Jörntell and C. Hansel, “Synaptic memories upside down: bidirectional plasticity at cerebellar parallel fiber-Purkinje cell synapses,” Neuron 52, 227–238 (2006).
[Crossref]

Hao, Y.

S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020).
[Crossref]

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, X. Guo, A. Wen, and Y. Hao, “All-optical inhibitory dynamics in photonic neuron based on polarization mode competition in a VCSEL with an embedded saturable absorber,” Opt. Lett. 44, 1548–1551 (2019).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Harris, F. C.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

He, Y. H.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Hedayat, S.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Henning, I.

A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012).
[Crossref]

Henning, I. D.

Hines, M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Hoel, V.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Hong, C.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Hu, J.

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Hu, S. G.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapse synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Huang, R.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Hurtado, A.

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

T. Deng, J. Robertson, and A. Hurtado, “Controlled propagation of spiking dynamics in vertical-cavity surface-emitting lasers: towards neuromorphic photonic networks,” IEEE J. Sel. Top. Quantum Electron. 23, 1800408 (2017).
[Crossref]

A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012).
[Crossref]

A. Hurtado, I. D. Henning, and M. J. Adams, “Optical neuron using polarization switching in a 1550 nm-VCSEL,” Opt. Express 18, 25170–25176 (2010).
[Crossref]

Hwang, H.

W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016).
[Crossref]

Indiveri, G.

N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013).
[Crossref]

Irem, B.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Jia, R. D.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Jirsa, V. K.

M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004).
[Crossref]

Jörntell, H.

H. Jörntell and C. Hansel, “Synaptic memories upside down: bidirectional plasticity at cerebellar parallel fiber-Purkinje cell synapses,” Neuron 52, 227–238 (2006).
[Crossref]

Kasabov, N.

T. Clarence, M. Arlija, and N. Kasabov, “Spiking neural networks: background, recent development and the NeuCube architecture,” Neural Process. Lett. 3, 1675–1701 (2020).
[Crossref]

N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013).
[Crossref]

Kasinski, A.

F. Ponulak and A. Kasinski, “Introduction to spiking neural networks: information processing, learning and applications,” Acta Neurobiol. Experim. 71, 409–433 (2011).

F. Ponulak and A. Kasinski, “Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting,” Neural Comput. 22, 467–510 (2010).
[Crossref]

Kitchen, J.

S. Moallemi, R. Welker, and J. Kitchen, “Wide band programmable true time delay block for phased array antenna applications,” in IEEE Dallas Circuits and Systems Conference (DCAS) (2016), pp. 1–4.

Kopp, Y.

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

Kulkarni, S. R.

S. R. Kulkarni and B. Rajendran, “Spiking neural networks for handwritten digit recognition—supervised learning and network optimization,” Neural Netw. 103, 118–127 (2018).
[Crossref]

Lansner, A.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Laughlin, S. B.

S. B. Laughlin, “Energy as a constraint on the coding and processing of sensory information,” Curr. Opin. Neurobiol. 11, 475–480 (2001).
[Crossref]

S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998).
[Crossref]

Le, Y.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Lee, T. W.

W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016).
[Crossref]

Li, H.

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Li, J. F.

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

Li, Q.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Li, Y.

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neural Netw. Learn. Syst. 29, 5394–5407 (2018).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neural Netw. Learn. Syst. 26, 3137–3149 (2015).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Liang, Z. X.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Lin, J. W.

J. W. Lin and D. S. Faber, “Modulation of synaptic delay during synaptic plasticity,” Trends Neurosci. 25, 449–455 (2002).
[Crossref]

Lin, L.

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Liu, Y.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Liu, Y. A.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Loyez, C.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Ma, B. W.

B. W. Ma and W. W. Zou, “Demonstration of a distributed feedback laser diode working as a graded-potential-signaling photonic neuron and its application to neuromorphic information processing,” Sci. China Inf. Sci. 63, 160408 (2020).
[Crossref]

Maguire, L. P.

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neural Netw. Learn. Syst. 29, 5394–5407 (2018).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neural Netw. Learn. Syst. 26, 3137–3149 (2015).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

Manuel, L. G.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Mercier, E.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Meybodi, M. R.

P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005).
[Crossref]

Miao, X. S.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Miller, D. A. B.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Milton, J. G.

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Min, S. Y.

W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016).
[Crossref]

Miranda, R.

A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995).
[Crossref]

Moallemi, S.

S. Moallemi, R. Welker, and J. Kitchen, “Wide band programmable true time delay block for phased array antenna applications,” in IEEE Dallas Circuits and Systems Conference (DCAS) (2016), pp. 1–4.

Mohemmed, A.

A. Mohemmed and S. Schliebs, “Training spiking neural networks to associate spatio-temporal input-output spike patterns,” Neurocomputing 107, 3–10 (2013).
[Crossref]

Morrison, A.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Muller, E.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Nanbu, M.

J. Ohtsubo, R. Ozawa, and M. Nanbu, “Synchrony of small nonlinear networks in chaotic semiconductor lasers,” Jpn. J. Appl. Phys. 54, 072702 (2015).
[Crossref]

Nandakumar, S. R.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Natschläger, T.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Nuntalid, N.

N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013).
[Crossref]

Ohtsubo, J.

J. Ohtsubo, R. Ozawa, and M. Nanbu, “Synchrony of small nonlinear networks in chaotic semiconductor lasers,” Jpn. J. Appl. Phys. 54, 072702 (2015).
[Crossref]

Ozawa, R.

J. Ohtsubo, R. Ozawa, and M. Nanbu, “Synchrony of small nonlinear networks in chaotic semiconductor lasers,” Jpn. J. Appl. Phys. 54, 072702 (2015).
[Crossref]

Ozcan, A.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Pan, W.

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

Pan, Z. H.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

Paugam-Moisy, H.

H. Paugam-Moisy and S. Bohte, “Computing with spiking neuron networks,” in Handbook of Natural Computing (Springer, 2012), pp. 335–376.

Pawelzik, K.

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Pecevski, D.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Pernice, W. H. P.

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wright, and H. Bhaskaran, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

Ponulak, F.

F. Ponulak and A. Kasinski, “Introduction to spiking neural networks: information processing, learning and applications,” Acta Neurobiol. Exp. 71, 409–433 (2011).

F. Ponulak and A. Kasinski, “Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting,” Neural Comput. 22, 467–510 (2010).
[Crossref]

Priyadarshini, P.

K. Roy, J. Akhilesh, and P. Priyadarshini, “Towards spike-based machine intelligence with neuromorphic computing,” Nature 575, 607–617 (2019).
[Crossref]

Psaltis, D.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Qiao, G. C.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Rajendran, B.

S. R. Kulkarni and B. Rajendran, “Spiking neural networks for handwritten digit recognition—supervised learning and network optimization,” Neural Netw. 103, 118–127 (2018).
[Crossref]

Rehman, Z. A.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Ren, Z. X.

S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020).
[Crossref]

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Ríos, C.

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Robertson, J.

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

T. Deng, J. Robertson, and A. Hurtado, “Controlled propagation of spiking dynamics in vertical-cavity surface-emitting lasers: towards neuromorphic photonic networks,” IEEE J. Sel. Top. Quantum Electron. 23, 1800408 (2017).
[Crossref]

Rochel, O.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Rong, L. M.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Roy, A.

A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995).
[Crossref]

Roy, K.

I. Chakraborty, G. Saha, and K. Roy, “Photonic in-memory computing primitive for spiking neural networks using phase-change materials,” Phys. Rev. Appl. 11, 014063 (2019).
[Crossref]

K. Roy, J. Akhilesh, and P. Priyadarshini, “Towards spike-based machine intelligence with neuromorphic computing,” Nature 575, 607–617 (2019).
[Crossref]

I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018).
[Crossref]

Rudolph, M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Safabakhsh, R.

P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005).
[Crossref]

Saha, G.

I. Chakraborty, G. Saha, and K. Roy, “Photonic in-memory computing primitive for spiking neural networks using phase-change materials,” Phys. Rev. Appl. 11, 014063 (2019).
[Crossref]

I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018).
[Crossref]

Schires, K.

A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012).
[Crossref]

Schliebs, S.

A. Mohemmed and S. Schliebs, “Training spiking neural networks to associate spatio-temporal input-output spike patterns,” Neurocomputing 107, 3–10 (2013).
[Crossref]

Sebastian, A.

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Sengupta, A.

I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018).
[Crossref]

Shi, L.

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Soljačić, M.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Sompolinsky, H.

R. Gütig and H. Sompolinsky, “The tempotron: a neuron that learns spike timing-based decisions,” Nat. Neurosci. 9, 420–428 (2006).
[Crossref]

Song, L.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Song, X.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Song, Z. W.

S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020).
[Crossref]

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Sourikopoulos, I.

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

Sporea, I.

I. Sporea and A. Grüning, “Supervised learning in multilayer spiking neural networks,” Neural Comput. 25, 473–509 (2013).
[Crossref]

Sun, C.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Taherkhani, A.

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neural Netw. Learn. Syst. 29, 5394–5407 (2018).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neural Netw. Learn. Syst. 26, 3137–3149 (2015).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

Tan, K. C.

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Tang, H.

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Tang, J. S.

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Thomas, P.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Timoleon, M.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Vieville, T.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Wade, E.

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

Wang, J.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Wang, Y. B.

X. B. Gong, Y. B. Wang, and B. Ying, “Delay-induced firing behavior and transitions in adaptive neuronal networks with two types of synapses,” Sci. China Chem. 56, 222–229 (2012).
[Crossref]

Wang, Z.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Wei, X.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Welker, R.

S. Moallemi, R. Welker, and J. Kitchen, “Wide band programmable true time delay block for phased array antenna applications,” in IEEE Dallas Circuits and Systems Conference (DCAS) (2016), pp. 1–4.

Wen, A.

Y. H. Zhang, S. Y. Xiang, X. Guo, A. Wen, and Y. Hao, “All-optical inhibitory dynamics in photonic neuron based on polarization mode competition in a VCSEL with an embedded saturable absorber,” Opt. Lett. 44, 1548–1551 (2019).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

Wen, A. J.

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

Wetzstein, G.

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Wright, C. D.

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wright, and H. Bhaskaran, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

Wu, C.

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Wu, H. Q.

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Wu, J. B.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

Xiang, S. Y.

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020).
[Crossref]

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, X. Guo, A. Wen, and Y. Hao, “All-optical inhibitory dynamics in photonic neuron based on polarization mode competition in a VCSEL with an embedded saturable absorber,” Opt. Lett. 44, 1548–1551 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Xie, X. R.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

Xu, N.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Xu, W.

W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016).
[Crossref]

Yang, Y. C.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Yao, P.

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Ying, B.

X. B. Gong, Y. B. Wang, and B. Ying, “Delay-induced firing behavior and transitions in adaptive neuronal networks with two types of synapses,” Sci. China Chem. 56, 222–229 (2012).
[Crossref]

Youngblood, N.

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

Yu, H.

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Yu, Q.

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

Yusuf, L.

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Zhang, M. L.

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

Zhang, W. Q.

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Zhang, X. X.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Zhang, Y. H.

S. Y. Xiang, Z. X. Ren, Y. H. Zhang, Z. W. Song, and Y. Hao, “All-optical neuromorphic XOR operation with inhibitory dynamics of a single photonic spiking neuron based on VCSEL-SA,” Opt. Lett. 45, 1104–1107 (2020).
[Crossref]

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, X. Guo, A. Wen, and Y. Hao, “All-optical inhibitory dynamics in photonic neuron based on polarization mode competition in a VCSEL with an embedded saturable absorber,” Opt. Lett. 44, 1548–1551 (2019).
[Crossref]

Y. H. Zhang, S. Y. Xiang, J. K. Gong, X. X. Guo, A. J. Wen, and Y. Hao, “Spike encoding and storage properties in mutually coupled vertical-cavity surface-emitting lasers subject to optical pulse injection,” Appl. Opt. 57, 1731–1737 (2018).
[Crossref]

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

Zhou, Y.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

Zhu, J. D.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Zhu, W.

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Zirpe, M.

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

Zou, W. W.

B. W. Ma and W. W. Zou, “Demonstration of a distributed feedback laser diode working as a graded-potential-signaling photonic neuron and its application to neuromorphic information processing,” Sci. China Inf. Sci. 63, 160408 (2020).
[Crossref]

Acta Neurobiol. Experim. (1)

F. Ponulak and A. Kasinski, “Introduction to spiking neural networks: information processing, learning and applications,” Acta Neurobiol. Experim. 71, 409–433 (2011).

Adv. Mater. (1)

J. D. Zhu, Y. C. Yang, R. D. Jia, Z. X. Liang, W. Zhu, Z. A. Rehman, L. Bao, X. X. Zhang, Y. M. Cai, L. Song, and R. Huang, “Ion gated synaptic transistors based on 2D van der Waals crystals with tunable diffusive dynamics,” Adv. Mater. 30, 1800195 (2018).
[Crossref]

Appl. Opt. (1)

Appl. Phys. Lett. (1)

A. Hurtado, K. Schires, I. Henning, and M. Adams, “Investigation of vertical cavity surface emitting laser dynamics for neuromorphic photonic systems,” Appl. Phys. Lett. 100, 103703 (2012).
[Crossref]

Curr. Opinion Neurobiol. (1)

S. B. Laughlin, “Energy as a constraint on the coding and processing of sensory information,” Curr. Opinion Neurobiol. 11, 475–480 (2001).
[Crossref]

Front. Neurosci. (1)

I. Sourikopoulos, S. Hedayat, C. Loyez, F. Danneville, V. Hoel, E. Mercier, and A. Cappy, “A 4-fJ/spike artificial neuron in 65 nm CMOS technology,” Front. Neurosci. 11, 123 (2017).
[Crossref]

IEEE J. Quantum Electron. (1)

S. Y. Xiang, J. K. Gong, Y. H. Zhang, X. X. Guo, A. Wen, and Y. Hao, “Numerical implementation of wavelength-dependent photonic spike timing dependent plasticity based on VCSOA,” IEEE J. Quantum Electron. 54, 8100107 (2018).
[Crossref]

IEEE J. Sel. Top. Quantum Electron. (5)

S. Y. Xiang, Y. H. Zhang, X. X. Guo, J. F. Li, A. J. Wen, W. Pan, and Y. Hao, “Cascadable neuron-like spiking dynamics in coupled VCSELs subject to orthogonally polarized optical pulse injection,” IEEE J. Sel. Top. Quantum Electron. 23, 1700207 (2017).
[Crossref]

T. Deng, J. Robertson, and A. Hurtado, “Controlled propagation of spiking dynamics in vertical-cavity surface-emitting lasers: towards neuromorphic photonic networks,” IEEE J. Sel. Top. Quantum Electron. 23, 1800408 (2017).
[Crossref]

J. Robertson, E. Wade, Y. Kopp, J. Bueno, and A. Hurtado, “Toward neuromorphic photonic networks of ultrafast spiking laser neurons,” IEEE J. Sel. Top. Quantum Electron. 26, 7700715 (2020).
[Crossref]

S. Y. Xiang, Y. H. Zhang, J. K. Gong, X. X. Guo, L. Lin, and Y. Hao, “STDP-based unsupervised spike pattern learning in a photonic spiking neural network with VCSELs and VCSOAs,” IEEE J. Sel. Top. Quantum Electron. 25, 1700109 (2019).
[Crossref]

Z. W. Song, S. Y. Xiang, Z. X. Ren, G. Q. Han, and Y. Hao, “Spike sequence learning in a photonic spiking neural network consisting of VCSELs-SA with supervised training,” IEEE J. Sel. Top. Quantum Electron. 26, 1700209 (2020).
[Crossref]

IEEE Trans. Neur. Netw. Learn. Syst. (2)

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “DL-ReSuMe: a delay learning-based remote supervised method for spiking neurons,” IEEE Trans. Neur. Netw. Learn. Syst. 26, 3137–3149 (2015).
[Crossref]

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A supervised learning algorithm for learning precise timing of multiple spikes in multilayer spiking neural networks,” IEEE Trans. Neur. Netw. Learn. Syst. 29, 5394–5407 (2018).
[Crossref]

IEEE Trans. Neural Netw. (1)

A. Cariani, “Temporal codes and computations for sensory representation and scene analysis,” IEEE Trans. Neural Netw. 15, 1100–1111 (2004).
[Crossref]

IEEE Trans. Neural Netw. Learn. Syst. (1)

C. Hong, X. Wei, J. Wang, B. Deng, H. Yu, and Y. Che, “Training spiking neural networks for cognitive tasks: a versatile framework compatible with various temporal codes,” IEEE Trans. Neural Netw. Learn. Syst. 31, 1285–1296 (2020).
[Crossref]

Int. J. Neural Syst. (1)

S. Ghosh-Dastidar and H. Adeli, “Spiking neural networks,” Int. J. Neural Syst. 19, 295–308 (2009).
[Crossref]

Integr. Comput.-Aided Eng. (1)

S. Ghosh-Dastidar and H. Adeli, “Improved spiking neural networks for EEG classification and epilepsy and seizure detection,” Integr. Comput.-Aided Eng. 14, 187–212 (2007).
[Crossref]

J. Comput. Neurosci. (1)

R. Brette, M. Rudolph, T. Carnevale, M. Hines, D. Beeman, J. M. Bower, M. Diesmann, A. Morrison, P. H. Goodman, F. C. Harris, M. Zirpe, T. Natschläger, D. Pecevski, B. Ermentrout, M. Djurfeldt, A. Lansner, O. Rochel, T. Vieville, E. Muller, A. P. Davison, S. El Boustani, and A. Destexhe, “Simulation of networks of spiking neurons: a review of tools and strategies,” J. Comput. Neurosci. 23, 349–398 (2007).
[Crossref]

J. Phys. D (1)

S. G. Hu, G. C. Qiao, Y. A. Liu, L. M. Rong, Q. Yu, and Y. Liu, “An improved memristor model connecting plastic synapse synapses and nonlinear spiking neuron,” J. Phys. D 52, 275402 (2019).
[Crossref]

J. Physiol. (1)

S. Boudkkazi, L. Fronzaroli-Molinieres, and D. Debanne, “Presynaptic action potential waveform determines cortical synaptic latency,” J. Physiol. 589, 1117–1131 (2011).
[Crossref]

Jpn. J. Appl. Phys. (1)

J. Ohtsubo, R. Ozawa, and M. Nanbu, “Synchrony of small nonlinear networks in chaotic semiconductor lasers,” Jpn. J. Appl. Phys. 54, 072702 (2015).
[Crossref]

Nat. Commun. (1)

B. Irem, L. G. Manuel, S. R. Nandakumar, M. Timoleon, P. Thomas, R. Bipin, L. Yusuf, S. Abu, and E. Evangelos, “Neuromorphic computing with multi-memristive synapses,” Nat. Commun. 9, 2514 (2018).
[Crossref]

Nat. Electron. (1)

W. Q. Zhang, B. Gao, J. S. Tang, P. Yao, and H. Q. Wu, “Neuro-inspired computing chips,” Nat. Electron. 3, 371–382 (2020).
[Crossref]

Nat. Neurosci. (2)

S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, “The metabolic cost of neural information,” Nat. Neurosci. 1, 36–41 (1998).
[Crossref]

R. Gütig and H. Sompolinsky, “The tempotron: a neuron that learns spike timing-based decisions,” Nat. Neurosci. 9, 420–428 (2006).
[Crossref]

Natural Comput. (1)

S. M. Bohte, “The evidence for neural information processing with precise spike-times: a survey,” Natural Comput. 3, 195–206 (2004).
[Crossref]

Nature (3)

K. Roy, J. Akhilesh, and P. Priyadarshini, “Towards spike-based machine intelligence with neuromorphic computing,” Nature 575, 607–617 (2019).
[Crossref]

J. Feldmann, N. Youngblood, C. D. Wright, H. Bhaskaran, and W. H. P. Pernice, “All-optical spiking neurosynaptic networks with self-learning capabilities,” Nature 569, 208–215 (2019).
[Crossref]

G. Wetzstein, A. Ozcan, S. Gigan, S. Fan, D. Englund, M. Soljačić, C. Denz, D. A. B. Miller, and D. Psaltis, “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39–47 (2020).
[Crossref]

Neural Comput. (3)

F. Ponulak and A. Kasinski, “Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting,” Neural Comput. 22, 467–510 (2010).
[Crossref]

I. Sporea and A. Grüning, “Supervised learning in multilayer spiking neural networks,” Neural Comput. 25, 473–509 (2013).
[Crossref]

J. Hu, H. Tang, K. C. Tan, H. Li, and L. Shi, “A spike-timing-based integrated model for pattern recognition,” Neural Comput. 25, 450–472 (2013).
[Crossref]

Neural Netw. (4)

S. Ghosh-Dastidar and H. Adeli, “A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection,” Neural Netw. 22, 1419–1431 (2009).
[Crossref]

S. R. Kulkarni and B. Rajendran, “Spiking neural networks for handwritten digit recognition—supervised learning and network optimization,” Neural Netw. 103, 118–127 (2018).
[Crossref]

N. Kasabov, K. Dhoble, N. Nuntalid, and G. Indiveri, “Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition,” Neural Netw. 41, 188–201 (2013).
[Crossref]

A. Roy, S. Govil, and R. Miranda, “An algorithm to generate radial basis function (RBF)-like nets for classification problems,” Neural Netw. 8, 179–201 (1995).
[Crossref]

Neural Process. Lett. (1)

T. Clarence, M. Arlija, and N. Kasabov, “Spiking neural networks: background, recent development and the NeuCube architecture,” Neural Process. Lett. 3, 1675–1701 (2020).
[Crossref]

Neurocomputing (3)

A. Mohemmed and S. Schliebs, “Training spiking neural networks to associate spatio-temporal input-output spike patterns,” Neurocomputing 107, 3–10 (2013).
[Crossref]

M. L. Zhang, J. B. Wu, A. Belatreche, Z. H. Pan, X. R. Xie, and Y. S. Chua, “Supervised learning in spiking neural networks with synaptic delay-weight plasticity,” Neurocomputing 409, 103–118 (2020).
[Crossref]

P. Adibi, M. R. Meybodi, and R. Safabakhsh, “Unsupervised learning of synaptic delays based on learning automata in an RBF-like network of spiking neurons for data clustering,” Neurocomputing 64, 335–357 (2005).
[Crossref]

Neuron (1)

H. Jörntell and C. Hansel, “Synaptic memories upside down: bidirectional plasticity at cerebellar parallel fiber-Purkinje cell synapses,” Neuron 52, 227–238 (2006).
[Crossref]

Opt. Express (1)

Opt. Lett. (3)

Phys. Rev. Appl. (1)

I. Chakraborty, G. Saha, and K. Roy, “Photonic in-memory computing primitive for spiking neural networks using phase-change materials,” Phys. Rev. Appl. 11, 014063 (2019).
[Crossref]

Phys. Rev. Lett. (2)

M. Dhamala, V. K. Jirsa, and M. Ding, “Enhancement of neural synchrony by time delay,” Phys. Rev. Lett. 92, 074104 (2004).
[Crossref]

C. W. Eurich, K. Pawelzik, U. Ernst, J. D. Cowan, and J. G. Milton, “Dynamics of self-organized delay adaptation,” Phys. Rev. Lett. 82, 1594–1597 (1999).
[Crossref]

Proc. SPIE (1)

Q. Li, Z. Wang, Y. Le, C. Sun, X. Song, and C. Wu, “Optical implementation of neural learning algorithms based on cross-gain modulation in a semiconductor optical amplifier,” Proc. SPIE 10019, 100190E (2016).
[Crossref]

Sci. Adv. (3)

Z. G. Chen, R. Carlos, W. H. P. Pernice, C. D. Wrigh, and H. Bhaskara, “On-chip photonic synapse,” Sci. Adv. 3, e1700160 (2017).
[Crossref]

C. Ríos, N. Youngblood, Z. Cheng, M. L. Gallo, W. H. P. Pernice, C. D. Wright, A. Sebastian, and H. Bhaskaran, “In-memory computing on a photonic platform,” Sci. Adv. 5, eaau5759 (2019).
[Crossref]

W. Xu, S. Y. Min, H. Hwang, and T. W. Lee, “Organic core-sheath nanowire artificial synapses with femtojoule energy consumption,” Sci. Adv. 2, e1501326 (2016).
[Crossref]

Sci. China Chem. (1)

X. B. Gong, Y. B. Wang, and B. Ying, “Delay-induced firing behavior and transitions in adaptive neuronal networks with two types of synapses,” Sci. China Chem. 56, 222–229 (2012).
[Crossref]

Sci. China Inf. Sci. (1)

B. W. Ma and W. W. Zou, “Demonstration of a distributed feedback laser diode working as a graded-potential-signaling photonic neuron and its application to neuromorphic information processing,” Sci. China Inf. Sci. 63, 160408 (2020).
[Crossref]

Sci. Rep. (1)

I. Chakraborty, G. Saha, A. Sengupta, and K. Roy, “Toward fast neural computing using all-photonic phase change spiking neurons,” Sci. Rep. 8, 12980 (2018).
[Crossref]

Trends Neurosci. (1)

J. W. Lin and D. S. Faber, “Modulation of synaptic delay during synaptic plasticity,” Trends Neurosci. 25, 449–455 (2002).
[Crossref]

Other (5)

A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “EDL: an extended delay learning based remote supervised method for spiking neurons,” in Neural Information Processing (Springer, 2015), pp. 190–197.

Y. Zhou, N. Xu, B. Gao, Y. Y. Chen, B. Y. Dong, Y. Li, Y. H. He, and X. S. Miao, “Complementary graphene-ferroelectric transistors (C-GFTs) as synapses with modulatable plasticity for supervised learning,” in IEEE International Electron Devices Meeting (IEDM) (2019), pp. 1–4.

H. Paugam-Moisy and S. Bohte, “Computing with spiking neuron networks,” in Handbook of Natural Computing (Springer, 2012), pp. 335–376.

S. Y. Xiang, Z. X. Ren, Z. W. Song, Y. H. Zhang, X. X. Guo, G. Q. Han, and Y. Hao, “Computing primitive of fully-VCSELs-based all-optical spiking neural network for supervised learning and pattern classification,” IEEE Trans. Neural Netw. (2020).

S. Moallemi, R. Welker, and J. Kitchen, “Wide band programmable true time delay block for phased array antenna applications,” in IEEE Dallas Circuits and Systems Conference (DCAS) (2016),pp. 1–4.


Figures (13)

Fig. 1. Schematic diagram of DW-based learning in a single-layer photonic SNN.

Fig. 2. Schematic illustration of the ReSuMe incorporated with the optical STDP rule. i, d, and o denote the input, the target, and the output, respectively.

Fig. 3. (a1) and (b1) Input pattern and output pattern before delay adjustment; (a2) and (b2) after 7 training epochs.

Fig. 4. Comparison of the learning capability of a single neuron based on (a) weight-based ReSuMe and (b) DW-ReSuMe. The value of SSD after the 50th, 100th, and 300th training epochs is presented for different ti. (c) The valid input window as a function of ηω for different ω0 based on DW-ReSuMe. (d) The valid input window as a function of ω0 for different ηω based on DW-ReSuMe. n=1, td=8 ns.

Fig. 5. (a1) Carrier density of the POST after training and (b1) the evolution of output spikes based on DW-ReSuMe; (a2) and (b2) those based on ReSuMe. The black solid line is na and the red solid line represents P.

Fig. 6. Evolution of (a1) synaptic weights ωi and (a2) delays di during the first 20 training epochs.

Fig. 7. Learning spike sequences with nonuniform ISIs. (a1) and (b1) The evolution of output spikes for the spike sequences [10 ns, 12 ns, 14 ns, 18 ns, 20 ns, 22 ns, 24 ns, 26 ns, 29 ns] and [10 ns, 11 ns, 13 ns, 14.5 ns, 17 ns, 21 ns, 23 ns, 25.5 ns, 27 ns], respectively. (a2) and (b2) The evolution of the corresponding distance.

Fig. 8. (a) Training accuracy and (b) testing accuracy varying with training epochs for weight-based ReSuMe (blue solid line) and DW-ReSuMe (red solid line). Td=1 ns, Tω=4 ns. The blue dotted line indicates an accuracy of 90%.

Fig. 9. Illustration of classification results for (a) the training data set and (b) the testing data set. The orange circles denote the target spiking times, the blue squares represent the actual spiking times, and misclassified samples are highlighted in bright blue.

Fig. 10. Testing accuracy as a function of (a) the weight learning window Tω and (b) the delay learning window Td.

Fig. 11. (a) Training accuracy and (b) testing accuracy varying with training epochs based on DW-ReSuMe (red solid line) and ReSuMe (blue solid line), respectively. Td=4 ns, Tω=5 ns.

Fig. 12. (a1) Training accuracy and (a2) testing accuracy for the Iris data set after 60 training epochs with different initial delays d0. (b1) and (b2) The corresponding results for the breast cancer data set.

Fig. 13. Learning accuracy on the breast cancer data set based on DW-ReSuMe for different cases of ηd. The left column shows the training accuracy with (a1) constant ηd and (b1) decaying ηd; (a2) and (b2) the right column shows the corresponding testing accuracy. Td=4 ns, Tω=5 ns.

Equations (9)

$$\dot{S}_{i,o} = \Gamma_a g_a (n_a - n_{0a}) S_{i,o} + \Gamma_s g_s (n_s - n_{0s}) S_{i,o} - S_{i,o}/\tau_{ph} + \beta B_r n_a^2,$$

$$\dot{n}_a = -\Gamma_a g_a (n_a - n_{0a}) \left(S + \Phi_{pre,i} + \Phi_{post,o}\right) - n_a/\tau_a + I_a/(e V_a),$$

$$\Phi_{pre,i} = k_{ei} \tau_{ph} \lambda_i P_{ei}(\tau_i, \Delta\tau)/(h c V_a), \qquad \Phi_{post,o} = \sum_{i=1}^{n} \omega_i \lambda_i \tau_{ph} P_i(t - d_i)/(h c V_a),$$

$$\dot{n}_s = -\Gamma_s g_s (n_s - n_{0s}) S_{i,o} - n_s/\tau_s + I_s/(e V_s),$$

$$\Delta\omega_i = (n_d - n_o) + \sum_{t_d} \sum_{t_i \le t_d} \Delta\omega_{STDP}(t_d - t_i) + \sum_{t_o} \sum_{t_i \le t_o} \Delta\omega_{aSTDP}(t_o - t_i),$$

$$\Delta d_i = D_i^d - D_i^o; \qquad D_i^d = t_d - t_i, \quad D_i^o = t_o - t_i,$$

$$\Delta\omega_{STDP}(t_d - t_i) = \begin{cases} \Delta\omega_o(\Delta t), & \text{if } t_d - t_i > 0 \\ 0, & \text{if } t_d - t_i \le 0, \end{cases}$$

$$\Delta\omega_{aSTDP}(t_o - t_i) = \begin{cases} -\Delta\omega_o(\Delta t), & \text{if } t_o - t_i > 0 \\ 0, & \text{if } t_o - t_i \le 0. \end{cases}$$

$$\omega_i(x+1) = \omega_i(x) + \eta_\omega \Delta\omega_i,$$

$$d_i(x+1) = d_i(x) + \eta_d \Delta d_i.$$
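The learning rules above (the weight update built from the non-Hebbian term plus STDP/anti-STDP sums, and the delay update from the difference between desired and actual latencies) can be sketched as a single training step. This is a minimal illustrative sketch, not the paper's implementation: the STDP amplitude `A_PLUS`, time constant `TAU_PLUS`, the learning rates, and the pairing of each input with the first desired/actual output spike are all assumptions introduced here.

```python
import numpy as np

A_PLUS = 0.1    # assumed STDP amplitude for the kernel Delta_omega_o
TAU_PLUS = 2.0  # assumed STDP time constant (ns)

def stdp_window(dt):
    """Causal STDP kernel: nonzero only when dt = t_post - t_pre > 0."""
    return A_PLUS * np.exp(-dt / TAU_PLUS) if dt > 0 else 0.0

def dw_resume_step(t_in, t_d, t_out, w, d, eta_w=0.01, eta_d=0.1):
    """One DW-ReSuMe epoch for a single output neuron.

    t_in  : array of presynaptic spike times (one per synapse)
    t_d   : list of desired output spike times
    t_out : list of actual output spike times
    """
    n_d, n_o = len(t_d), len(t_out)
    # non-Hebbian term (n_d - n_o), same for every synapse
    dw = np.full_like(w, float(n_d - n_o))
    for i, ti in enumerate(t_in):
        dw[i] += sum(stdp_window(td - ti) for td in t_d)    # STDP term
        dw[i] -= sum(stdp_window(to - ti) for to in t_out)  # anti-STDP term
    # delay update: (t_d - t_i) - (t_o - t_i) = t_d - t_o, here paired
    # with the first desired and first actual spike for simplicity
    dd = np.zeros_like(d)
    if t_d and t_out:
        dd[:] = t_d[0] - t_out[0]
    # gradient-style updates with learning rates eta_w and eta_d
    return w + eta_w * dw, d + eta_d * dd
```

With a late actual spike (t_out after t_d), the delay term is negative, so the synaptic delays shrink and the output spike is pulled earlier, while the weight term potentiates inputs that precede the desired spike more strongly than those that precede the actual one.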
