


VVC has a lot of the same issues that plagued HEVC adoption. There really isn't much reason not to use AV1 if you want the best codec available today.


VVC is a generational improvement in compression ratio over AV1, which might make it the best for a lot of applications. Too bad it will probably languish in patent licensing hell.


> VVC is a generational improvement in compression ratio over AV1

That's probably true, but do you know where I could find a comparison that shows a variety of speed settings for both codecs? Ideally one that also discusses how fast VVC is expected to encode once the encoders are more finalized.

It's easy enough to run everything at the default speed settings, but then the results are almost meaningless.
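
Failing a published comparison, a rough way to build one yourself is to sweep both encoders across their speed presets and time each run. A minimal sketch, assuming ffmpeg built with libsvtav1 and Fraunhofer's vvencapp are on PATH; the clip name, preset picks, and quality values are illustrative, and SVT-AV1 preset numbering differs between versions:

    // Sweep encoder speed presets and report wall-clock time per run.
    // Speed alone says nothing; you'd pair this with a quality pass
    // (e.g. ffmpeg's libvmaf) at matched rates to get meaningful points.
    #include <chrono>
    #include <cstdlib>
    #include <iostream>
    #include <string>

    // Run a shell command and return elapsed wall-clock seconds.
    static double timed_run(const std::string& cmd) {
        auto t0 = std::chrono::steady_clock::now();
        if (std::system(cmd.c_str()) != 0)
            std::cerr << "command failed: " << cmd << "\n";
        return std::chrono::duration<double>(
            std::chrono::steady_clock::now() - t0).count();
    }

    int main() {
        const std::string src = "input.y4m";  // hypothetical test clip

        // SVT-AV1: numeric presets, lower = slower / better compression.
        for (int p : {2, 6, 10}) {
            double s = timed_run("ffmpeg -y -i " + src +
                " -c:v libsvtav1 -preset " + std::to_string(p) +
                " -crf 35 av1_p" + std::to_string(p) + ".ivf");
            std::cout << "svt-av1 preset " << p << ": " << s << " s\n";
        }

        // vvenc: named presets covering a similar slow-to-fast range.
        // Assumes a vvencapp build that accepts .y4m input directly.
        for (const std::string p : {"slower", "medium", "faster"}) {
            double s = timed_run("vvencapp -i " + src +
                " --preset " + p + " -q 35 -o vvc_" + p + ".266");
            std::cout << "vvenc preset " << p << ": " << s << " s\n";
        }
    }

Even that only gives you the speed axis; the interesting thing is the whole speed/quality frontier, which is exactly what default-speed comparisons hide.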


A new codec is always really slow at the start, but once hardware acceleration arrives it becomes much less of a problem.

Or if you are just an end user consuming streaming video content, then you only care that your player can decode it, which I don't think will be any sort of problem if things play out like they did for HEVC vs. VP9.

If all the streaming services choose VVC over AV1, then all the player companies will put hardware decoders in their SoCs.


Well, people definitely had complaints about HEVC compared to H.264 and VP9, but it seems that HEVC did perfectly fine.

I deal with a lot of video, and HEVC is by far the most used codec that I see.

I am not sure why AV1 vs. VVC would go any differently.

Just like HEVC was about a generation better than VP9, VVC is about a generation better than AV1.

I guess it depends on what the streaming companies choose. Since they all chose HEVC, all the cheap player chips in TVs, boxes, and sticks have HEVC hardware decode. Even though YouTube chose VP9, it didn't really seem to make a difference in the grand scheme of things.

YouTube chose AV1, of course, but if all the streaming companies go with VVC, then I think it will proliferate much as HEVC did.


Unless you need hardware acceleration. I've managed to implement live H.265 encode and decode with DirectX 12 on Windows on Intel and Nvidia, but the driver crashes on AMD and they seem uninterested in fixing it, so there I have to fall back to H.264.
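
For what it's worth, the one mitigation I've found is probing encoder support up front instead of discovering failures mid-stream. A hedged sketch against the public D3D12 video encode headers (field names as in d3d12video.h); note that a capability check alone won't catch a driver that claims support and then crashes, so the runtime fallback still has to stay:

    // Pick HEVC when the driver reports encode support, else fall back
    // to H.264. Error handling trimmed; a sketch, not production code.
    #include <d3d12.h>
    #include <d3d12video.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    D3D12_VIDEO_ENCODER_CODEC PickEncoderCodec(ID3D12Device* device) {
        ComPtr<ID3D12VideoDevice3> video;
        if (FAILED(device->QueryInterface(IID_PPV_ARGS(&video))))
            return D3D12_VIDEO_ENCODER_CODEC_H264;  // no encode API at all

        D3D12_FEATURE_DATA_VIDEO_ENCODER_CODEC q = {};
        q.NodeIndex = 0;
        q.Codec = D3D12_VIDEO_ENCODER_CODEC_HEVC;
        video->CheckFeatureSupport(D3D12_FEATURE_VIDEO_ENCODER_CODEC,
                                   &q, sizeof(q));

        // A driver can pass this check and still fall over later (as AMD
        // does here), so keep a runtime fallback around the encode calls.
        return q.IsSupported ? D3D12_VIDEO_ENCODER_CODEC_HEVC
                             : D3D12_VIDEO_ENCODER_CODEC_H264;
    }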


Well, it’s two louder, obviously.


Best in terms of compression efficiency, although x266 has been delayed by six months and counting.

But VVC is already being used in India and China, so its usage is much bigger than a lot of people would imagine.


Have you experimented with it much? This is the first I'm hearing of it.




