Definitely not the status quo. One big example is when Twitter moved away from that back to server-side rendering for performance reasons [1].
There are basically three aspects of website performance - server performance, network performance, and client performance. You can control server performance and network latency (CDN), but you can't control what's going on on your users' computers.
For example, if they're tab-abusers (guilty as charged) with 50+ non-trivial tabs open, plus other apps open and stuff running in the background, then your beautiful high-performance UI could start chugging (as many do on my abused laptop).
So you really have to think about what your main userbase will be and whether you need to put the rendering chunk of your app's CPU usage on your own servers or on your client's desktops/laptops/tablets/phones.
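To make the tradeoff concrete, here's a minimal sketch (hypothetical data and function names) of the two placements of the same rendering work: the server builds the finished HTML string, or it ships raw JSON and the client's browser does the identical string-building on whatever CPU headroom it has left.

```javascript
// Hypothetical timeline data - stands in for whatever your API returns.
const tweets = [
  { user: "alice", text: "hello" },
  { user: "bob", text: "world" },
];

// Server-side placement: do the string-building on your own hardware
// and ship finished markup; the client only has to parse HTML.
function renderTimeline(items) {
  return (
    "<ul>" +
    items.map((t) => `<li><b>${t.user}</b>: ${t.text}</li>`).join("") +
    "</ul>"
  );
}

// Client-side placement would instead send JSON.stringify(tweets) over
// the wire and run this same function in the browser - same work, but
// now competing with those 50 other tabs.
console.log(renderTimeline(tweets));
// → <ul><li><b>alice</b>: hello</li><li><b>bob</b>: world</li></ul>
```

The function is identical either way; the whole decision is which machine pays for running it, which is exactly why knowing your userbase's hardware matters.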
[1]: https://blog.twitter.com/2012/improving-performance-twitterc...
- http://openmymind.net/2012/5/30/Client-Side-vs-Server-Side-R...