It looks like it's just applying a bunch of filters/convolutions to the image. You could write it in C++ to be pretty quick, in CUDA you could get it down to microseconds I think.
Though that does not take into account the latency of the webcam itself.
Yeah, but then you haven’t turned a MacBook into a touchscreen with $1 of hardware. You’ve turned a MacBook into a touchscreen with $price_of_GPU + $price_of_eGPU_adapter + $1.
And additionally you’ve made your MacBook a lot less portable.
I'm always amused when Mac people ask me why I don't use Macs. I tell them "I program CUDA for a living," and they respond with something like "But you can jerry-rig a PCIx slot in a box over a wonky cable and plug it into your Thunderfire port." As if that's a real solution.
Commercially available jank is still jank. And the matter of portability (why else would you be using a laptop in the first place?) still remains. Forget walking around town with it in a bag; if I'm trying to work on my deck chair outside, where do I put that mother of all dongles? Balance it precariously on the chair's armrest? No thanks!
I wonder if Mac users are pulling our legs, or if they are True Believers. Like it or not, a significant number of people doing signal processing, simulation, AI, and even playing games, need or strongly prefer Nvidia and CUDA. And for them, Macs are not an option.