I worked on a similar project, the Stanford Braindrop chip. It's a really interesting technology; the problem is that most people don't really know how to train those systems. There's a group called Vivum that seems to have a solution.
I work with QDI (quasi-delay-insensitive) systems, and I've long suspected that it would be possible to use those same design principles to make analog circuits robust to timing variation. QDI design is about sequencing discrete events using digital operators - AND and OR. I wonder if it is possible to do the same with continuous "events" using the equivalent analog operators, mix and sum.
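For intuition, here's a toy sketch in Python (my own illustration, not something from the thread; in particular the mapping "mix = product, sum = saturating add" is just an assumption). In QDI design the Muller C-element is the standard rendezvous, an AND over events, and a plain OR merges events; the continuous half swaps in the analog operators.

    # Toy model of QDI event sequencing. The Muller C-element is the
    # canonical "AND of events": its output changes only after BOTH
    # inputs have changed, so ordering survives arbitrary gate delays.
    # A plain OR acts as the merge of events.
    def c_element(a: bool, b: bool, prev: bool) -> bool:
        """Follow the inputs when they agree; otherwise hold state."""
        return a if a == b else prev

    state = False
    for a, b in [(1, 0), (1, 1), (0, 1), (0, 0)]:
        state = c_element(bool(a), bool(b), state)
        print(f"a={a} b={b} -> c={int(state)}")  # flips only at (1,1) and (0,0)

    # Speculative continuous analogue: treat an "event" as a signal
    # ramping from 0 to 1. A product ("mix") only passes a value when
    # both inputs are present, like the C-element's rendezvous; a
    # saturating sum fires when either input is present, like the OR.
    def analog_mix(a: float, b: float) -> float:
        return a * b

    def analog_sum(a: float, b: float) -> float:
        return min(1.0, a + b)

    print(analog_mix(0.9, 0.1), analog_sum(0.9, 0.1))

Whether product and saturating sum are the right continuous operators is exactly the open question; they're just the most obvious candidates.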
We got some nice results with spiking/pulsed networks, but the small number of units limits the applications, so we usually end up in a simulator or using more abstract models. There seems to be a commercial product, but also with only 0.5K neurons; that might be enough for 1D data processing, though, and it fills a good niche there (1 mW!) [2]
Just read about it, and there are familiar names on the author list. I really wish this type of technology would gain more traction, but I'm afraid it won't receive the focus it deserves given the direction of current AI research.