[NengoDL] Signals.scatter() shape issue

I made a new branch of my code (https://github.com/Tioz90/MemristorLearning/blob/a8d222857f26457a70a504faebc8019e2e209231/memristor_nengo/learning_rules.py#L270) because I had a feeling that building a custom learning rule operator for NengoDL might help with my issue.
Why? Because then I can safely remove the `reshape()` on my memristor TensorSignals, which was needed to add an extra dimension to keep track of the number of `ops`.
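
For reference, this is roughly the pattern I mean. It's a minimal sketch, not my actual code: the class name, the `op.weights` signal, and the shapes are all hypothetical, and I'm assuming NengoDL 3.x's `OpBuilder`/`SignalDict` interface.

```python
import tensorflow as tf
from nengo_dl.builder import OpBuilder

class MergedMemristorBuilder(OpBuilder):  # hypothetical name
    def build_pre(self, signals, config):
        super().build_pre(signals, config)
        # When NengoDL merges several ops into this one builder, their
        # signals get concatenated into a single TensorSignal.
        self.memristors = signals.combine([op.weights for op in self.ops])

    def build_step(self, signals):
        mem = signals.gather(self.memristors)
        # The extra leading dimension tracks which op each slice belongs to.
        mem = tf.reshape(mem, (len(self.ops), -1, mem.shape[-1]))
        # ... per-op memristor update would go here ...
        # Collapse back before writing, since scatter() expects the value's
        # shape to match the destination TensorSignal.
        signals.scatter(self.memristors, tf.reshape(mem, (-1, mem.shape[-1])))
```

With the builder guaranteed a single op, that whole round-trip through `tf.reshape()` goes away.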

And, lo and behold, at first glance it seems as if the issue with `scatter()` has been resolved without changing anything in `build_step()`!

I’m not sure whether the way I enforce the single op in `__init__()` (sketched below) is the “proper” way, and I would obviously like to keep operator merging as a possibility, in order to get all the performance I can.
Do you think the behaviour of `scatter()` could be improved to better deal with a situation like mine? Or is it a case of me needing to change something else in my code?
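
For concreteness, this is roughly how I enforce it now, together with the alternative I was wondering about. Again, it's just a sketch with a hypothetical class name, and I'm assuming NengoDL 3.x, where `OpBuilder.__init__()` receives the list of ops and the `mergeable()` static method tells the graph optimizer whether two ops may be combined.

```python
from nengo_dl.builder import OpBuilder

class MemristorBuilder(OpBuilder):  # hypothetical name
    def __init__(self, ops):
        super().__init__(ops)
        # What I do now: refuse merged ops outright, so every signal keeps
        # its natural 2-D shape and scatter() receives a matching value.
        assert len(ops) == 1, "this builder assumes a single op"

    @staticmethod
    def mergeable(x, y):
        # Possible alternative: returning False here should tell the graph
        # optimizer never to merge two of these ops in the first place
        # (I believe this is also the OpBuilder default).
        return False
```

The `mergeable()` route gives up merging entirely, though, which is exactly what I'd like to avoid; hence the question about `scatter()`.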