First 5K 960fps VR180 sample

960fps is 32X slower than normal playback speed! Here is test footage captured with the K2 Pro at 120fps, which we interpolated 8X to 960fps with a depth-aware AI algorithm. No visible artifacts are observed! For easy comparison, frame interpolation with Optical Flow in Premiere Pro is also included. Check it yourself 🙂
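For reference, here is how the numbers line up. The capture rate and interpolation factor are from the text above; the 30fps playback rate is my assumption:

```python
# Slow-motion math for the clip described above.
# Assumption: the rendered clip is played back at a normal 30fps.
source_fps = 120            # captured with the K2 Pro
interp_factor = 8           # 8X AI interpolation
playback_fps = 30           # assumed normal playback rate

effective_fps = source_fps * interp_factor   # 120 * 8 = 960 fps
slowdown = effective_fps / playback_fps      # 960 / 30 = 32x slower than real time
print(effective_fps, slowdown)               # 960 32.0
```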

The CUDA app is very hungry for memory. For this 5K sample we had to divide each source frame into 9 sub-frames to avoid memory overflow. The sub-frames were processed individually and then stitched back together once all of them were done (a rough sketch of this tiling approach follows the download link below). Each of these 1800×900 sub-frames consumed 10.3 of the 11 GB of GPU memory to process. The processing was very slow: our GTX 1080 Ti has 3584 CUDA cores, yet it took over 300 seconds to generate just 8 interpolated frames. At that rate, a two-second 960fps clip takes almost a day to render. However, the wait is definitely worth it. The outcome is very pleasant, far better than what we could get with optical flow. You may download the footage and sideload it into your headset for a closer look:
VR180 at 5K 10bit h.265
https://www.dropbox.com/s/24a9x2e16if…​
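
Here is a minimal sketch of the 3×3 tiling workaround described above, written in Python/NumPy. The interpolate_tile callable is a hypothetical stand-in for the actual DAIN model call, and the even, non-overlapping 3×3 split is a simplification of our pipeline:

```python
import numpy as np

TILE_ROWS, TILE_COLS = 3, 3  # 9 sub-frames per source frame

def split_into_tiles(frame: np.ndarray):
    """Split an H x W x 3 frame into a 3x3 grid of sub-frames
    (dimensions are assumed to be divisible by 3)."""
    h, w, _ = frame.shape
    th, tw = h // TILE_ROWS, w // TILE_COLS
    return [frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(TILE_ROWS) for c in range(TILE_COLS)]

def stitch_tiles(tiles, h, w):
    """Reassemble the processed sub-frames into one full frame."""
    th, tw = h // TILE_ROWS, w // TILE_COLS
    out = np.zeros((h, w, tiles[0].shape[2]), dtype=tiles[0].dtype)
    for i, tile in enumerate(tiles):
        r, c = divmod(i, TILE_COLS)
        out[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
    return out

def interpolate_pair(frame0, frame1, n_mid, interpolate_tile):
    """Generate n_mid in-between frames for one source frame pair,
    processing each sub-frame separately to stay inside GPU memory.
    interpolate_tile(a, b, t) is a placeholder for the DAIN call."""
    h, w, _ = frame0.shape
    tiles0, tiles1 = split_into_tiles(frame0), split_into_tiles(frame1)
    mids = []
    for k in range(1, n_mid + 1):
        t = k / (n_mid + 1)
        mid_tiles = [interpolate_tile(a, b, t) for a, b in zip(tiles0, tiles1)]
        mids.append(stitch_tiles(mid_tiles, h, w))
    return mids
```

For 8X interpolation of 120fps footage, n_mid is 7 in-between frames per source pair; at roughly 300 seconds per source frame, the ~240 source frames in a two-second clip work out to about 20 hours of rendering, which is where the "almost a day" figure comes from.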

Here is the AI engine I used:
Earlier this year, researchers from two universities and Google published a new AI-powered technique they developed called “Depth-Aware Video Frame Interpolation” or DAIN, and it’s simply mind-blowing. The tech can interpolate a 30fps video all the way to 120fps or even 480fps with almost no visible artifacts.
https://sites.google.com/view/wenboba…
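
To give a sense of what the "depth-aware" part adds over plain optical flow: when the flow field is projected into the intermediate frame and several source pixels land in the same place, DAIN lets the closer pixels (smaller depth) dominate. The NumPy sketch below is only a toy illustration of that depth-weighted flow projection idea, not DAIN's actual implementation, and all names in it are mine:

```python
import numpy as np

def project_flow_depth_aware(flow_0to1, depth0, t=0.5):
    """Toy depth-weighted flow projection (illustrative only).

    flow_0to1 : (H, W, 2) optical flow from frame 0 to frame 1
    depth0    : (H, W) depth map of frame 0 (smaller = closer)
    t         : position of the intermediate frame in (0, 1)

    Each pixel of frame 0 votes for the intermediate-frame location it
    reaches at time t; votes are weighted by inverse depth, so closer
    objects win where votes collide (i.e. at occlusions).
    """
    h, w, _ = flow_0to1.shape
    flow_t0 = np.zeros((h, w, 2))   # flow from time t back to frame 0
    weight = np.zeros((h, w))
    inv_depth = 1.0 / (depth0 + 1e-6)

    ys, xs = np.mgrid[0:h, 0:w]
    # Where each pixel of frame 0 lands at the intermediate time t.
    tx = np.clip(np.round(xs + t * flow_0to1[..., 0]), 0, w - 1).astype(int)
    ty = np.clip(np.round(ys + t * flow_0to1[..., 1]), 0, h - 1).astype(int)

    # Scatter-add the depth-weighted votes (np.add.at handles collisions).
    np.add.at(flow_t0, (ty, tx, 0), -t * flow_0to1[..., 0] * inv_depth)
    np.add.at(flow_t0, (ty, tx, 1), -t * flow_0to1[..., 1] * inv_depth)
    np.add.at(weight, (ty, tx), inv_depth)

    covered = weight > 0
    flow_t0[covered] /= weight[covered][:, None]
    return flow_t0  # can then be used to warp frame 0 toward time t
```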
