People have been poking at alternative architectures for 30 years, but while Moore's Law held, the only winning move was to stick to the most popular path: conventional scaling would outrun any exotic design before it had a chance to mature.
Now that Moore's Law is on its way out, people can actually try new things, and discover what pays off and what does not.
This could also be a chance to figuratively reboot computing tech and start anew from different fundamentals: quaternary logic, biological substrates, photonics...
What may have been hard to implement 30-40 years ago could be easier with current technology. Some of these could well supplement existing binary/boolean silicon in certain domains, if not replace it: think actual brains behind an AI-as-a-Service for image recognition and the like.
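On the quaternary point, here's a toy Python sketch (my own illustration, nothing more) of the usual density argument: each base-4 digit encodes two bits, so a cell that can hold four states stores twice as much per symbol as a binary one.

```python
# Toy sketch of the quaternary density argument (my illustration,
# not a real multi-level-cell implementation): each base-4 digit
# carries two bits, so four-state cells halve the symbol count.

def to_quaternary(n: int) -> str:
    """Return the base-4 representation of a non-negative integer."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n % 4))
        n //= 4
    return "".join(reversed(digits))

# One byte (8 binary digits) collapses into 4 quaternary digits:
assert to_quaternary(0b11111111) == "3333"
print(to_quaternary(255))  # -> 3333
```

For what it's worth, multi-level flash cells already play something like this trick with charge levels, so the idea isn't pure science fiction.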
I'm chalking it up to "it's the future, and always will be..."