
Neuromorphic hardware is an area where I encountered analogue computing [1]. Biological neurons were modeled by a leaky-integration (resistor/capacitor) unit. The system ran 10^5 times faster than real time, too fast to use for robotics, and consumed little power, but it was sensitive to temperature (much like our brains). If I recall correctly, the technology has been used at CERN, as digital HW would have required too-high clock speeds. I have no idea what happened to the technology, but there were other attempts at neuromorphic, analogue hardware. It was very exciting to observe and use this research!

[1]https://brainscales.kip.uni-heidelberg.de/

edit: newer link: https://open-neuromorphic.org/neuromorphic-computing/hardwar...
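The resistor/capacitor unit mentioned above is the classic leaky integrate-and-fire (LIF) model. Here's a minimal simulation sketch of that model; the parameter values (tau, threshold, etc.) are made up for illustration, not taken from the BrainScaleS hardware:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, Euler-integrated.
# The membrane acts like an RC circuit: dV/dt = (-(V - v_rest) + R*I) / tau.
# All parameter values below are illustrative placeholders.
def simulate_lif(input_current, dt=1e-3, tau=20e-3, r=1.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return (membrane trace, spike times) for a list of input currents."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: decay toward rest, driven by the input current.
        v += dt * (-(v - v_rest) + r * i_in) / tau
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset the membrane after the spike
        trace.append(v)
    return trace, spikes

# A constant supra-threshold input produces regular spiking:
trace, spikes = simulate_lif([1.5] * 100)
```

On analogue hardware the integration happens physically in the RC unit rather than step by step, which is where the ~10^5 speedup over real time comes from.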




I worked on a similar project, the Stanford Braindrop chip. It's a really interesting technology; what happened is that most people don't really know how to train those systems. There's a group called Vivum that seems to have a solution.

I work with QDI systems, and I've long suspected that it would be possible to use those same design principles to make analog circuits robust to timing variation. QDI design is about sequencing discrete events using digital operators - AND and OR. I wonder if it is possible to do the same with continuous "events" using the equivalent analog operators, mix and sum.
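The analogy above can be sketched in a few lines. This is only a toy illustration of the operator correspondence being speculated about, with "mix" taken to mean a weighted blend and "sum" a superposition (my reading of the terms, not an established QDI construction):

```python
# Toy illustration of the digital-vs-analog operator analogy.
# QDI sequencing composes discrete events with digital operators:
def digital_and(a: bool, b: bool) -> bool:
    return a and b      # proceed only once both events have occurred

def digital_or(a: bool, b: bool) -> bool:
    return a or b       # proceed once either event has occurred

# Hypothetical continuous counterparts for continuous "events":
def analog_sum(x: float, y: float) -> float:
    return x + y        # superpose two signals

def analog_mix(x: float, y: float, alpha: float = 0.5) -> float:
    return alpha * x + (1.0 - alpha) * y   # weighted blend of two signals
```

Whether such compositions can be made robust to timing variation the way QDI digital circuits are is exactly the open question raised above.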


We got some nice results with spiking/pulsed networks, but the small number of units limits the applications, so we usually end up in a simulator or using more abstract models. There seems to be a commercial product, but it also has only 0.5K neurons; that might be enough for 1D data processing, though, filling a good niche there (1 mW!) [2]

[2] https://open-neuromorphic.org/neuromorphic-computing/hardwar...


> the Stanford braindrop chip

Interesting.

Just read about it and there are familiar names on the author list. I really wish this type of technology gained more traction but I am afraid it will not receive the deserved focus considering the direction of current research in AI.

> I wonder if it is possible to do the same with continuous "events" using the equivalent analog operators, mix and sum.

Don't know much about QDI but sounds promising.



