
People have been poking at it for over 30 years now.

I'm chalking it up to "it's the future, and always will be ..."



People have been poking at every architecture improvement for 30 years. Under Moore's Law, the only winning move was to follow the most popular path.

Now that Moore's Law is on its way out, people can actually try new things, and discover what pays off and what does not.


This could also be a chance to figuratively reboot computing tech, and start anew from different fundamentals: quaternary, biological, photonic...

What was hard to implement 30-40 years ago may be easier now with current technology. Some of these could definitely supplement existing binary/boolean silicon in certain domains, if not replace it, like using actual brains in AI-as-a-Service, for image recognition and so on.


Achronix Semiconductor used it for cutting-edge, 1+GHz FPGAs. Check them out.


The industry might hit that point one day, but for now it seems to fall in the bucket of "modest payoff, very very very high cost".

The upsides are real but other avenues of development may still have higher payoff vs cost (effort).



