Usage of Backpropagation Through Time in SNNs


In terms of SNN accuracy, as far as can be seen in the literature, the spiking data representation is the main obstacle to using backpropagation-style gradient descent from standard ANNs: the spike is not a differentiable function, its derivative being zero almost everywhere and undefined (exploding to infinity) at the spike time, for example in the case of LIF neurons.
A proposed solution is surrogate neurons, i.e. neurons with a Gaussian-like response that is differentiable, so that backpropagation can be applied; there is a paper on this by Eliasmith et al.
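To make the question concrete, here is a minimal NumPy sketch of the surrogate-gradient idea as I understand it. This is not Nengo API; the function names, threshold, and Gaussian width are my own illustrative choices:

```python
import numpy as np

def spike(v, threshold=1.0):
    # Forward pass: the non-differentiable Heaviside step (the "spike").
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0, sigma=0.5):
    # Backward pass: replace the ill-defined derivative of the step with a
    # smooth Gaussian bump centred on the firing threshold, so gradients
    # are finite and non-zero near the threshold.
    return np.exp(-0.5 * ((v - threshold) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

v = np.linspace(0.0, 2.0, 5)
print(spike(v))           # hard 0/1 spike outputs
print(surrogate_grad(v))  # finite, usable gradients everywhere
```

The forward pass stays spiking, while the backward pass uses the Gaussian in place of the true derivative; my questions below are essentially about whether something like this is supported in Nengo.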
Here are several connected questions:
a) Are such surrogate neurons available in the current Nengo implementations?
b) With surrogate neurons, can we apply the standard ANN backpropagation algorithm in the current implementation of the Nengo framework?
c) Apart from the supervised and unsupervised learning rules described in the Bekolay et al. paper, is there an available implementation of the Backpropagation Through Time algorithm, as in standard RNN networks?

Thank you!