It's not difficult to set up middleware that renders the page for any clients that require it. (For instance, we can assume any client that identifies as a "bot" and isn't Google probably wants a pre-rendered page, which we can serve quite effortlessly.) Here's one implementation for Node.js: https://prerender.io, or you can always roll your own with something like PhantomJS.
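A minimal sketch of that idea, assuming an Express-style `(req, res, next)` middleware signature; the function names, the user-agent substring list, and the `renderSnapshot` helper are all hypothetical, not part of prerender.io or any real library:

```javascript
// Substrings commonly found in crawler user-agent headers (illustrative, not exhaustive).
const BOT_PATTERNS = ["bot", "crawler", "spider", "facebookexternalhit"];

// Decide whether a request should get the pre-rendered snapshot.
// Googlebot is deliberately excluded, per the "bots that aren't Google" idea above.
function wantsPrerender(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  if (ua.includes("googlebot")) return false;
  return BOT_PATTERNS.some((p) => ua.includes(p));
}

// Express-style middleware: hand matching bots a rendered snapshot,
// let everyone else fall through to the normal SPA response.
function prerenderMiddleware(renderSnapshot) {
  return (req, res, next) => {
    if (wantsPrerender(req.headers["user-agent"])) {
      renderSnapshot(req.url).then((html) => res.send(html));
    } else {
      next();
    }
  };
}
```

You'd mount this before your static-file handler, e.g. `app.use(prerenderMiddleware(mySnapshotFn))`, so bot traffic never reaches the client-side app at all.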
Note that sending a different response to googlebot than what you send to normal users is a violation of Google's guidelines and can get your site penalized. Use at your own peril.
Nowhere in that article does it say it's OK to serve the snapshot only to Googlebot. Serving Googlebot different content than you serve users is called cloaking, and it's against their guidelines: https://support.google.com/webmasters/answer/66355
I've invested a significant amount of time in this topic and would love it if you were right, but I've never seen the money quote saying it's OK to do this. In fact, everything I've read says you have to treat search bots the same as you treat normal users.
wow this is amazing. would love to see an offshoot of this where it could render a sitemap, or even keep a live sitemap up to date via cron.d or something (just hoping out loud)
Dynamic construction of sitemaps is surprisingly difficult. We would need to poll every single page you have just to check whether you've potentially added a link to a new page on your site. And every time you add a new page, that's another whole page to scrape and analyse.
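To make the cost concrete, here's a toy sketch of why it's expensive: discovering every page means crawling from the root and parsing every page you find, since any page might link to a new one. The site is just an in-memory map from URL to HTML here so the example stays self-contained; a real crawler would fetch over HTTP and use a proper HTML parser instead of a regex.

```javascript
// Toy "site": URL -> HTML body. Stand-in for real HTTP fetches.
const site = {
  "/": '<a href="/about">about</a> <a href="/blog">blog</a>',
  "/about": '<a href="/">home</a>',
  "/blog": '<a href="/blog/post-1">post</a>',
  "/blog/post-1": '<a href="/blog">back</a>',
};

// Naive href extraction; fine for this sketch, not for real HTML.
function extractLinks(html) {
  return [...html.matchAll(/href="([^"]+)"/g)].map((m) => m[1]);
}

// Breadth-first crawl from the root. Every reachable page gets fetched
// and parsed, which is exactly why keeping a sitemap live is costly:
// one new link anywhere forces another scrape-and-parse cycle.
function buildSitemap(pages, root = "/") {
  const seen = new Set([root]);
  const queue = [root];
  while (queue.length > 0) {
    const url = queue.shift();
    for (const link of extractLinks(pages[url] || "")) {
      if (!seen.has(link)) {
        seen.add(link);
        queue.push(link);
      }
    }
  }
  return [...seen].sort();
}
```

Running this once gives you the full URL list for a sitemap.xml, but keeping it current via cron means re-running the whole crawl on every tick.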