
TOPIC: GPU Decoding

GPU Decoding 4 months, 1 week ago #193426

  • SchwobaSeggl
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 79
  • 4 months, 1 week ago
As I found out only recently, LWKS apparently decodes compressed video on the CPU.

Linux has the VDPAU interface, which provides ridiculously fast H.264 decoding; it might also be possible via OpenGL. DaVinci Resolve does it for H.264; I think they use CUDA, which might be the more complicated route and would limit it to Nvidia (?).

Anyway, since H.264 is nowadays common in pro cameras too, such as the Panasonic GH5, it would be good to decode it on the GPU.

On my box, which is not exactly slow, I get into trouble as soon as I work with 4K material, most likely due to CPU limitations, while my humongous GeForce card is asleep...
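A quick way to see what VDPAU buys on a given box (a rough sketch, assuming an ffmpeg build with VDPAU support; clip.mp4 stands in for your own footage):

```python
import subprocess
import time

CLIP = "clip.mp4"  # stand-in for your own H.264 footage

def time_decode(extra_args):
    """Decode CLIP to a null sink and return wall-clock seconds."""
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", CLIP, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

cpu = time_decode([])                     # plain software decode
gpu = time_decode(["-hwaccel", "vdpau"])  # VDPAU decode; ffmpeg falls back to CPU if unsupported
print(f"CPU: {cpu:.1f}s  VDPAU: {gpu:.1f}s")
```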
kxStudio/Ubuntu Linux user, audio guy doing live mixdown and recording, long-time Ardour user. Now getting into video with LWKS.

Re: GPU Decoding 4 months ago #193429

  • haraldthi
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 57
  • 4 months ago
Oh yes.

However, Lightworks seems to default to the MainConcept libraries, which seem to prioritize bug-free, widely compatible decoding over GPU speed.
Most decoding goes directly to screen, where the GPU ends up with the data at any rate; exports and other tasks aren't nearly so time-dependent, and there syncing the frames back to the CPU could become a problem.

Wouldn't it at least be possible to do a bit of testing, and see which codecs can be handled by the simpler and less compatible GPU decoders (and encoders) before choosing to do it on the CPU?

Re: GPU Decoding 4 months ago #193431

  • SchwobaSeggl
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 79
  • 4 months ago
You can test on Linux using MPlayer, and yes, it doesn't always work that well on the GPU. But then again, Resolve can do it... Perhaps one could have a whitelist with stable defaults, something like "8-bit H.264 from the XY camera works", and you could add your own entries.
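Just to illustrate the whitelist idea (a hypothetical sketch; the camera entries and the decide_decoder helper are made up, nothing like this exists in LWKS):

```python
# Hypothetical whitelist: (codec, bit depth, camera) -> known-good decode path.
# Shipped as stable defaults; users could append their own entries.
WHITELIST = {
    ("h264", 8, "Panasonic GH5"): "gpu",
    ("h264", 10, "Panasonic GH5"): "cpu",  # 10-bit H.264 often lacks hardware decoder support
}

def decide_decoder(codec: str, bit_depth: int, camera: str) -> str:
    """Use the GPU only for combinations known to work; default to the safe CPU path."""
    return WHITELIST.get((codec, bit_depth, camera), "cpu")

print(decide_decoder("h264", 8, "Panasonic GH5"))  # -> gpu
print(decide_decoder("hevc", 10, "SomeOtherCam"))  # -> cpu (safe default)
```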
kxStudio/Ubuntu Linux user, audio guy doing live mixdown and recording, long-time Ardour user. Now getting into video with LWKS.

Re: GPU Decoding 4 months ago #193433

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 19233
  • 4 months ago
haraldthi wrote:
Lightworks seems to default to the MainConcept libraries, which seem to prioritize bug-free, widely compatible decoding over GPU speed.

That's the point.

Introducing GPU decoding, which is still under construction and proprietary on all platforms, would bring a whole lot of problems with it.

On the other hand, since nowadays almost all NLEs are trying to support GPU decoding (and encoding) in one way or another, Lightworks will be forced to introduce it sooner or later, just to have it on the feature list. Whether it will be usable in a meaningful and stable way is another discussion.

haraldthi wrote:
Wouldn't it at least be possible to do a bit of testing, and see which codecs can be handled by the simpler and less compatible GPU decoders (and encoders) before choosing to do it on the CPU?

Proxies would be a good playground for this, under real-world conditions.
It's better to travel well than to arrive...
Last Edit: 4 months ago by hugly.

Re: GPU Decoding 4 months ago #193438

  • SchwobaSeggl
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 79
  • 4 months ago
I think it's not only the competition that will force LWKS to introduce GPU decoding, but also the 4K and 8K craze that is going on.

Most videographers might say that 4K is nice but still far from being useful, but their customers, bombarded with 4K TV ads everywhere, will soon demand it outright.

For me, 4K is of value since I currently shoot concerts mainly; I usually have a 4K cam sitting in the back, and in post-production I can zoom into that picture and fake having more cameras and positions.
kxStudio/Ubuntu Linux user, audio guy doing live mixdown and recording, long-time Ardour user. Now getting into video with LWKS.

Re: GPU Decoding 4 months ago #193454

  • David Rasberry
  • 4 months ago
Resolve decodes compressed files with the CPU, just like Lightworks. As far as I know this is universal to all NLEs: the GPU is dedicated to timeline rendering of decompressed video streams, while the CPU decompresses and buffers uncompressed frames in memory for GPU processing.

The main problem with using GPU decoding in an NLE is that the high-efficiency dedicated hardware decoding routines used by players can only handle one video stream. That's not much use when you have two or more clips on a timeline being rendered with FX.
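In other words, the typical hand-off looks roughly like this (a toy sketch of the producer/consumer split, not any NLE's actual code):

```python
import queue
import threading

frames = queue.Queue(maxsize=8)  # small RAM buffer of decoded frames

def cpu_decoder():
    """CPU side: decompress frames and push them into the buffer (decode stubbed out)."""
    for n in range(100):
        frames.put(f"uncompressed frame {n}")  # stand-in for a real decode
    frames.put(None)  # end-of-stream marker

def gpu_renderer():
    """GPU side: pull uncompressed frames and render them (rendering stubbed out)."""
    while (frame := frames.get()) is not None:
        pass  # stand-in for uploading the frame and rendering timeline FX

threading.Thread(target=cpu_decoder).start()
gpu_renderer()
```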

For high-end systems with two or more GPUs it would be possible to dedicate one to decompression and the other to rendering, but this wouldn't be much help to most of us with more modest hardware, and it's not worth the effort for the NLE programmers at this point.

Maxing out system RAM and using workstation platforms with lots of PCIe lanes and high-core-count processors is the road to performance.

On more modest systems, make sure as few apps as possible are running in the background to free up resources.

www.cinema5d.com/boost-davinci-resolve-performance/

Another option to explore is transcoding your originals to an efficient DI/mezzanine codec like Cineform or ProRes. Lightworks has integrated support for Cineform.
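For example, a mezzanine transcode with ffmpeg could look like this (a sketch using the prores_ks encoder; file names are placeholders):

```python
import subprocess

# Transcode an H.264 original to ProRes 422 HQ (profile 3) for smoother editing;
# audio goes to uncompressed PCM, as is usual for mezzanine files.
subprocess.run([
    "ffmpeg", "-i", "original.mp4",
    "-c:v", "prores_ks", "-profile:v", "3",
    "-c:a", "pcm_s16le",
    "mezzanine.mov",
], check=True)
```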

Digital Bolex 2k Cinema DNG raw camera
Canon GL2 DV camcorder
iPAD Mini 3 Iographer rig

Workstation: Intel i7-4770k, Asrock Z87 Thunderbolt 2 MB, 16GB 1866 DDR3 ram,
2TB Seagate Hybrid system drive, 2TB Seagate NAS media drive, E-sata III hot swap drive bay, Nvidia GTX760 2GB GPU
Lightworks kybrd. Shuttlepro v2
Win10 Pro 64bit, Lightworks 14.0 64bit

Mobile Workstation: MSI GTX72 Dominator
Intel i7-6700HQ 2.7GHz Win10 64bit
16GB DDR4 ram, 500GB M.2 SSD
Nvidia GTX970 3GB GPU
USB3, USB3.1-C, Thunderbolt 3 ports
Shuttlepro2 Win10 64bit LW 14.0 64 bit
Last Edit: 4 months ago by David Rasberry.

Re: GPU Decoding 4 months ago #193474

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 19233
  • 4 months ago
David Rasberry wrote:
Maxing out system RAM and using workstation platforms with lots of PCIe lanes and high-core-count processors is the road to performance.

Yes, that's what I think and hope too, but unfortunately that's science fiction at the moment. Lightworks encoding/decoding doesn't perform well on my new AMD Threadripper 1920X (12 physical cores, plenty of PCIe lanes) mounted on a 400-euro mainboard, equipped with 32GB of main memory and a GTX 1060 with 6GB.

I hope that the next release will bring some improvement here.
It's better to travel well than to arrive...
Last Edit: 4 months ago by hugly.

Re: GPU Decoding 3 months, 2 weeks ago #194602

  • SchwobaSeggl
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 79
  • 3 months, 2 weeks ago
Resolve apparently introduced H.264 decoding on the GPU in their R14 beta 4 (only in the Studio version). However, I'm not sure they use dedicated GPU decoding hardware for it at all; since Resolve demands CUDA, it might just as well be that they "simply" use H.264 decoding implemented in CUDA.

Meanwhile I opted for AVC-Intra proxy clips, which really makes my old Xeon and fat Nvidia jump for joy. But it wastes space; the usual time/space trade-off again. Plus, it really takes its time to generate those proxies. That's OK for me, since I don't do urgent stuff like news, but for others it might be a pain.
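Something similar can be approximated with ffmpeg: intra-only H.264 (every frame a keyframe, via -g 1) is not true AVC-Intra, but it is similarly cheap to decode. A sketch, with placeholder file names:

```python
import subprocess

# Half-resolution, intra-only H.264 proxy: -g 1 makes every frame a keyframe,
# which trades disk space for much cheaper seeking and decoding.
subprocess.run([
    "ffmpeg", "-i", "original.mp4",
    "-vf", "scale=iw/2:ih/2",
    "-c:v", "libx264", "-g", "1", "-crf", "18",
    "-c:a", "copy",
    "proxy.mp4",
], check=True)
```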
kxStudio/Ubuntu Linux user, audio guy doing live mixdown and recording, long-time Ardour user. Now getting into video with LWKS.

Re: GPU Decoding 3 months, 2 weeks ago #194603

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 19233
  • 3 months, 2 weeks ago
I found that GPU encoding with ffmpeg's NVENC support is very fast, but as soon as scaling on the GPU comes into play there's a significant performance drop, and not much benefit is left compared with CPU encoding at fast encoder settings. Weird. A library problem?
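For reference, the two pipelines I'm comparing (a sketch; h264_cuvid and scale_npp require an ffmpeg built with CUDA/npp support, file names are placeholders):

```python
import subprocess

# Fast case: NVENC does the encoding, everything else stays on the CPU.
subprocess.run([
    "ffmpeg", "-i", "in.mp4",
    "-c:v", "h264_nvenc",
    "out_nvenc.mp4",
], check=True)

# Much slower in my tests: decode and scale on the GPU as well.
subprocess.run([
    "ffmpeg", "-hwaccel", "cuvid", "-c:v", "h264_cuvid", "-i", "in.mp4",
    "-vf", "scale_npp=1280:720",
    "-c:v", "h264_nvenc",
    "out_scaled.mp4",
], check=True)
```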
It's better to travel well than to arrive...
Last Edit: 3 months, 2 weeks ago by hugly.