crazygringo 18 hours ago

Wow. To me, the big news here is that ~30% of devices now support AV1 hardware decoding. The article lists a bunch of examples of devices that have gained it in the past few years. I had no idea it was getting that popular -- fantastic news!

So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

  • 0manrho 17 hours ago

    > To me, the big news here is that ~30% of devices now support AV1 hardware decoding

    Where did it say that?

    > AV1 powers approximately 30% of all Netflix viewing

    That's admittedly a bit non-specific; it could be interpreted as 30% of users or 30% of hours-of-video-streamed, which are very different metrics. If 5% of your users are using AV1, but that 5% watches far above the average, you can have a minority userbase with an outsized representation in hours viewed (for instance, if that 5% watched roughly eight times as much as everyone else, they'd account for about 30% of hours).

    I'm not saying that's the case, just giving an example of how it doesn't necessarily translate to 30% of devices using Netflix supporting AV1.

    Also, the blog post notes that there is an effective/efficient software decoder, which allows people without hardware acceleration to still view AV1 media in some cases (the case they described was Android-based phones). So that kinda complicates what "X% of devices support AV1 playback" means, as it doesn't necessarily mean they have hardware decoding.

    • cogman10 5 hours ago

      That was one of the best decisions of AOMedia.

      AV1 was specifically designed to be friendly for a hardware decoder and that decision makes it friendly to software decoding. This happened because AOMedia got hardware manufacturers on the board pretty early on and took their feedback seriously.

      VP8/9 took a long time to get decent hardware decoding, partly because their streams were more complex than the AV1 stream.

      • Neywiny 4 hours ago

        Hmmm, I disagree with your chain there. Plenty of easy hardware algorithms are hard for software. For example, in hardware (including FPGAs), bit movement/shuffling is borderline trivial if it's constant, while in software you have to shift, mask, and OR over and over. In hardware you literally just switch which wire is connected to what on the next stage. Same for weird bit widths. Hardware doesn't care (too much) if you're operating on 9-bit quantities or 33 or 65. Software isn't that granular, and often you'll double your storage and waste a bunch.
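
        To make the software side concrete, here's a toy C sketch (the 9-bit field at bit 7 is just an arbitrary example): extracting a field really is a shift plus a mask, and the result still lives in a full-width register, whereas in hardware the same selection is just which wires feed the next stage.

          #include <stdint.h>

          /* "Shift and mask" in software: pulling a 9-bit field that starts at
           * bit 7 out of a 32-bit word takes explicit instructions, and the
           * result still occupies a full 16- or 32-bit register. In hardware
           * the same selection is just wiring between pipeline stages. */
          static inline uint32_t extract_bits(uint32_t word, unsigned lsb, unsigned width)
          {
              return (word >> lsb) & ((1u << width) - 1u);  /* shift, then mask */
          }

          uint32_t nine_bit_field(uint32_t word) { return extract_bits(word, 7, 9); }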

        I think they certainly go hand in hand, in that algorithms that got relatively easier for software also got relatively easier for hardware and vice versa, but the two are good at different things.

        • cogman10 4 hours ago

          I'm not claiming that software will be more efficient. I'm claiming that things that make it easy to go fast in hardware make it easy to go fast in software.

          Bit masking/shifting is certainly more expensive in software, but it's also about the cheapest software operation. In most cases it's a single cycle transform. In the best cases, it's something that can be done with some type of SIMD instruction. And in even better cases, it's a repeated operation which can be distributed across the array of GPU vector processors.

          What kills both hardware and software performance is data dependency and conditional logic. That's the sort of thing that was limited in the AV1 stream.

      • galad87 2 hours ago

        All I've read is that it's less hardware-friendly than H.264 and HEVC, and hardware vendors were all complaining about it. AV2 should be better in this regard.

        Where did you read that it was designed to make creating a hardware decoder easier?

        • cogman10 an hour ago

          It was a presentation on AV1 before it was released. I'll see if I can find it but I'm not holding my breath. It's mostly coming from my own recollection.

          Ok, I don't think I'll find it. I think I'm mostly just regurgitating what I remember watching at one of the research symposiums. IDK which one it was unfortunately [1]

          [1] https://www.youtube.com/@allianceforopenmedia2446/videos

    • sophiebits 13 hours ago

      “30% of viewing” I think clearly means either time played or items played. I’ve never worked with a data team that would possibly write that and mean users.

      If it was a stat about users they’d say “of users”, “of members”, “of active watchers”, or similar. If they wanted to be ambiguous they’d say “has reached 30% adoption” or something.

      • csdreamer7 3 hours ago

        I am not in data science so I cannot validate your comment, but "30% of viewing" I would assume means users or unique/discrete viewing sessions, not watched minutes. I would appreciate it if Netflix would clarify.

      • 0manrho 13 hours ago

        Agreed, but this is the internet, the ultimate domain of pedantry, and they didn't say it explicitly. So I'm not going to put words in their mouth just to have a circular discussion about why I'm claiming they said something they didn't technically say, which is why I asked "Where did it say that" at the very top.

        Also, either way, my point was and still stands: it doesn't say 30% of devices have hardware decoding.

    • endorphine 12 hours ago

      In either case, it is still big news.

  • JoshTriplett 18 hours ago

    > So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

    Hopefully AV2.

    • jsheard 18 hours ago

      H266/VVC has a five-year head start over AV2, so probably that first, unless hardware vendors decide to skip it entirely. The final AV2 spec is due this year, so any day now, but it'll take a while to make its way into hardware.

      • adgjlsfhk1 18 hours ago

        H266 is getting fully skipped (except possibly by Apple). The licensing is even worse than H265, the gains are smaller, and Google+Netflix have basically guaranteed that they won't use it (in favor of AV1 and AV2 when ready).

        • johncolanduoni 9 hours ago

          Did anybody, including the rightsholders, come out ahead on H265? From the outside it looked like the mutually assured destruction situation with the infamous mobile patents, where they all end up paying lawyers to demand money from each other for mostly paper gains.

          • tux3 8 hours ago

            Why, the patent office did. There are many ideas that cannot be reinvented for the next few decades, and thanks to submarine patents it is simply not safe to innovate without your own small regiment of lawyers.

            This is a big victory for the patent system.

            • Dylan16807 4 hours ago

              The patent office getting $100k or whatever doesn't sound like a win for them either.

              I'm not sure what you mean by "patent system" having a victory here, but it's not that the goal of promoting innovation is happening.

          • gary_0 4 hours ago

            MBAs got to make deals and lawyers got to file lawsuits. Everyone else got to give them money. God bless the bureaucracy.

        • TitaRusell 3 hours ago

          For smart TVs Netflix is obviously a very important partner.

      • adzm 18 hours ago

        VVC is pretty much a dead end at this point. Hardly anyone is using it; its benefits over AV1 are extremely minimal and no one wants the royalty headache. Basically everyone learned their lesson with HEVC.

        • ksec 7 hours ago

          It is being used in China and India for streaming. Brazil chose it with LCEVC for their TV 3.0. The broadcasting industry is also preparing for VVC. So it is not as popular in Web and Internet usage, but it is certainly not dead.

          I am eagerly awaiting AV2 test results.

      • kevincox 18 hours ago

        If it has a five-year head start and we've seen almost zero hardware shipping, that is a pretty bad sign.

        IIRC AV1 decoding hardware started shipping within a year of the bitstream being finalized. (Encoding took quite a bit longer but that is pretty reasonable)

      • shmerl 17 hours ago

        When even H.265 is being dropped by the likes of Dell, adoption of H.266 will be even worse making it basically DOA for anything promising. It's plagued by the same problems H.265 is.

        • SG- 11 hours ago

          Dell is significant in the streaming and media world?

          • close04 10 hours ago

            Dell and HP are significant in the "devices" world, and they just dropped support for HEVC hardware encoding/decoding [1] to save a few cents per device. You can still pay for the Microsoft add-in that provides it. It's not just streaming: your Teams background blur was handled like that.

            Eventually people and companies will associate HEVC with "that thing that costs extra to work", and software developers will start targeting AV1/2 so their software's performance doesn't depend on whether the laptop manufacturer or user paid for the HEVC license.

            [1] https://arstechnica.com/gadgets/2025/11/hp-and-dell-disable-...

            • nolok 10 hours ago

              Along the same lines, Synology dropped it on their NAS too (for their video, media, etc. apps... even for thumbnails, they now ask the sending device to generate one locally and upload it; the NAS won't do it anymore for HEVC).

  • mort96 9 hours ago

    > So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?

    Hopefully, we can just stay on AV1 for a long while. I don't feel any need to obsolete all the hardware that's now finally getting hardware decoding support for AV1.

  • alex_duf 11 hours ago

    That's not at all how I read it.

    They mentioned they delivered a software decoder on Android first, then they also targeted web browsers (presumably through WASM). So out of these 30%, a good chunk is software, not hardware.

    That being said, it's a pretty compelling argument for phone and TV manufacturers to get their act together, as Apple has already done.

  • thrdbndndn 18 hours ago

    I'm not too surprised. It's similar to the metric that "XX% of Internet is on IPv6" -- it's almost entirely driven by mobile devices, specifically phones. As soon as both mainstream Android and iPhones support it, the adoption of AV1 should be very 'easy'.

    (And yes, even for something like Netflix lots of people consume it with phones.)

  • dehrmann 18 hours ago

    Not trolling, but I'd bet something that's augmented with generative AI. Not to the level of describing scenes with words, but context-aware interpolation.

    • mort96 9 hours ago

      I don't want my video decoder inventing details which aren't there. I'd much rather have obvious compression artifacts than a codec where the "compression artifacts" look like perfectly realistic, high-quality hallucinated details.

      • cubefox 7 hours ago

        In the case of many textures (grass, sand, hair, skin, etc.) it makes little difference whether the high-frequency details are reproduced exactly or hallucinated. E.g. it doesn't matter whether the 1262nd blade of grass from the left side is bending to the left or to the right.

        • mort96 6 hours ago

          And in the case of many others, it makes a very significant difference. And a codec doesn't have enough information to know.

          Imagine a criminal investigation. A witness happened to take a video as the perpetrator did the crime. In the video, you can clearly see a recognizable detail on the perpetrator's body in high quality; a birthmark perhaps. This rules out the main suspect -- but can we trust that the birthmark actually exists and isn't hallucinated? Would a non-AI codec have just shown a clearly compression-artifact-looking blob of pixels which can't be determined one way or the other? Or would a non-AI codec have contained actual image data of the birthmark in sufficient detail?

          Using AI to introduce realistic-looking details where there was none before (which is what your proposed AI codec inherently does) should never happen automatically.

          • amiga386 4 hours ago

            > And in the case of many others, it makes a very significant difference.

            This is very true, but we're talking about an entertainment provider's choice of codec for streaming to millions of subscribers.

            A security recording device's choice of codec ought to be very different, perhaps even regulated to exclude codecs which could "hallucinate" high-definition detail not present in the raw camera data, and the limitations of the recording media need to be understood by law enforcement. We've had similar problems since the introduction of tape recorders, VHS and so on; they always need to be worked out. Even the Phantom of Heilbronn (https://en.wikipedia.org/wiki/Phantom_of_Heilbronn) turned out to be DNA contamination of swabs by someone who worked for the swab manufacturer.

            • mort96 2 hours ago

              I don't understand why it needs to be a part of the codec. Can't Netflix use relatively low bitrate/resolution AV1 and then use AI to upscale or add back detail in the player? Why is this something we want to do in the codec and therefore set in stone with standard bodies and hardware implementations?

              • amiga386 2 hours ago

                We're currently indulging a hypothetical, the idea of AI being used to either improve the quality of streamed video or provide the same quality with a lower bitrate, so the focus is on what both ends of the codec could agree on.

                The coding side of "codec" needs to know what the decoding side would add back in (the hypothetical AI upscaling), so it knows where it can skimp and get a good "AI" result anyway, versus where it has to be generous in allocating bits because the "AI" hallucinates too badly to meet the quality requirements. You'd also want it specified, so that any encoding displays the same on any decoder, and you'd want it in hardware, as most devices that display video rely on dedicated decoders to play it at full frame rate and/or not drain their battery. If it's not in hardware, it's not going to be adopted. It is possible to have different encodings, so a "baseline" encoding could leave out the AI upscaler, at the cost of needing a higher bitrate to maintain quality, or switching to a lower quality if the bitrate isn't there.

                Separating out codec from upscaler, and having a deliberately low-resolution / low-bitrate stream be naively "AI upscaled", would, IMHO, look like shit. It's already a trend in computer games to render at lower resolution and have dedicated graphics card hardware "AI upscale" (DLSS, FSR, XeSS, PSSR), because 4k resolutions are just too much work to render modern graphics at consistently at 60fps. But the result, IMHO, shows noticeable and distracting glitches and errors all the time.

          • beala 5 hours ago

            There’s an infamous case of Xerox photocopiers substituting in incorrect characters due to a poorly tuned compression algorithm. No AI necessary.

            https://en.wikipedia.org/wiki/JBIG2#:~:text=Character%20subs...

            • mort96 5 hours ago

              Yeah, I had that case in mind actually. It's a perfect illustration of why compression artifacts should be obvious and not just realistic-looking hallucinations.

          • mapt 4 hours ago

            > a codec doesn't have enough information to know.

            The material belief is that modern trained neural network methods, which improve on ten generations of variations of the discrete cosine transform and wavelets, can bring a codec from "1% of knowing" to "5% of knowing". This is broadly useful. The level of abstraction does not need to be "The AI told the decoder to put a finger here"; it may be "The AI told the decoder how to terminate the wrinkle on a finger here". An AI detail overlay. As we go from 1080p to 4K to 8K and beyond we care less and less about individual small-scale details being 100% correct, and there are representative elements that existing techniques are just really bad at squeezing into higher compression ratios.

            I don't claim that it's ideal, and the initial results left a lot to be desired in gaming (where latency and prediction is a Hard Problem), but AI upscaling is already routinely used for scene rips of older videos (from the VHS Age or the DVD Age), and it's clearly going to happen inside of a codec sooner or later.

            • mort96 4 hours ago

              I'm not saying it's not going to happen. I'm saying it's a terrible idea.

              AI upscaling built in to video players isn't a problem, as long as you can view the source data by disabling AI upscaling. The human is in control.

              AI upscaling and detail hallucination built in to video codecs is a problem.

              • mapt 4 hours ago

                The entire job of a codec is lossy compression that remains subjectively authentic. AI is our best and in some ways easiest method of lossy compression. All lossy compression produces artifacts; JPEG macroblocks are effectively a hallucination, albeit one that is immediately identifiable because it fails to simulate anything else we're familiar with.

                AI compression doesn't have to be the level of compression that exists in image generation prompts, though. A SORA prompt might be 500 bits (~1 bit per character natural English), while a decompressed 4K frame that you're trying to bring to 16K level of simulated detail starts out at 199 million bits. It can be a much finer level of compression.

          • cubefox 5 hours ago

            Maybe there could be a "hallucination rate" parameter in the encoder: More hallucination would enable higher subjective image quality without increased accuracy. It could be used for Netflix streaming, where birthmarks and other forensic details don't matter because it's all just entertainment. Of course the hallucination parameter needs to be hard coded somehow in the output in order to determine its reliability.

    • afiori 11 hours ago

      AI embeddings can be seen as a very advanced form of lossy compression

    • cubefox 5 hours ago

      Neural codecs are indeed the future of audio and video compression. A lot of people / organizations are working on them and they are close to being practical. E.g. https://arxiv.org/abs/2502.20762

    • randall 18 hours ago

      for sure. macroblock hinting seems like a good place for research.

  • dylan604 18 hours ago

    how does that mean "~30% of devices now support AV1 hardware encoding"? I'm guessing you meant hardware decoding???

  • snvzz 18 hours ago

    >So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support

    That'd be h264 (associated patents expired in most of the world), vp9 and av1.

    h265 aka HEVC is less common due to dodgy, abusive licensing. Some vendors even disable it with drivers despite hardware support because it is nothing but legal trouble.

    • ladyanita22 11 hours ago

      I have the feeling that H265 is more prevalent than VP9

  • vitorgrs 12 hours ago

    I mean... I bought a Samsung TV in 2020, and it already supported AV1 HW decoding.

    2020 feels close, but that's 5 years.

    • usrusr 7 hours ago

      Is that supposed to be long-lived for a TV?

      I'm running an LG initially released in 2013 and the only thing I'm not happy with is that about a year ago Netflix ended their app for that hardware generation (likely for phasing out whatever codec it used). Now I'm running that unit behind an Amazon fire stick and the user experience is so much worse.

      (that LG was a "smart" TV from before they started enshittifying, such a delight - had to use and set up a recent LG once on a family visit and it was even worse than the fire stick, omg, so much worse!)

      • Dylan16807 4 hours ago

        > Is that supposed to be long-lived for a TV?

        I don't see anything in that comment implying such a thing. It's just about the uptake of decoders.

      • StilesCrisis 5 hours ago

        Fire Stick is the most enshittified device (which is why it was so cheap). AppleTV is fantastic if you're willing to spend $100. You don't need the latest gen; previous gen are just as good.

    • cubefox 7 hours ago

      Two years ago I bought a Snapdragon 8+ Gen 1 phone (TSMC 4nm, with 12 GB LPDDR RAM, 256 GB NAND flash, and a 200 megapixel camera). It still feels pretty modern but it has no AV1 support.

IgorPartola 18 hours ago

Amazing. Proprietary video codecs need to not be the default and this is huge validation for AV1 as a production-ready codec.

  • raw_anon_1111 15 hours ago

    Why does it matter if Netflix is using an open standard if every video they stream is wrapped in proprietary closed DRM?

    • chii 15 hours ago

      Because device makers will not care about the DRM, but they will care about the hardware decoder they need to put into their devices to decode Netflix videos. By ensuring this video codec is open, it benefits everybody else too, as this same device will now be able to hardware-decode _more_ videos from different video providers, as well as make more video providers choose AV1.

      Basically, a network effect for an open codec.

      • csmpltn 12 hours ago

        But they still need to decode the proprietary DRM before it can be fed to the hardware decoder... lol

        • cm2187 11 hours ago

          I think the point is that if you are not Netflix, you can use AV1 as most of your clients devices support hardware acceleration thanks to the big guys using AV1 themselves.

        • nevi-me 10 hours ago

          I struggle to follow your point. They still need to do that for any codec, and I would think that the DRM decryption would be using algorithms that might also be hardware accelerated.

    • cheema33 14 hours ago

      > Why does it matter if Netflix is using an open standard if every video they stream is wrapped in proprietary closed DRM?

      I am not sure if this is a serious question, but I'll bite in case it is.

      Without DRM Netflix's business would not exist. Nobody would license them any content if it was going to be streamed without a DRM.

      • reddalo 11 hours ago

        >Without DRM Netflix's business would not exist. Nobody would license them any content if it was going to be streamed without a DRM.

        I don't agree. If people refused to watch DRM-protected content, they would get rid of it.

        For example, Pluto TV is a free streaming service that has much content without DRM. GOG lets you buy DRM-free games. Even Netflix itself lets you stream DRM-free content, albeit in low resolution.

        • bobdvb 4 hours ago

          From previous experience, some platforms are considered a "leakage source" for content, and major rights owners won't put their content there because it's too easy to steal from. The security measures that are put on streaming platforms aren't totally ineffective; they're restrictive, but it's considered worth the trouble because platforms can actually measure the effect of the restrictions.

          The low-resolution option is something many rightsholders accept, but from a product proposition perspective it's difficult to explain to many customers. They're just grumpy that they paid for content and can only watch it in SD, which reduces your customer satisfaction. Better to do nothing than to do a poor job sometimes.

      • dontlaugh 11 hours ago

        Why? Everything gets pirated anyway, even with all the DRM. There’s no difference.

        • bobdvb 4 hours ago

          I've spent >20 years doing content security in various forms at various companies. Until recently I was directing the technology at a major streaming platform.

          I can confirm that while there are serious issues with Widevine (and to a lesser extent PlayReady), the protection measures aren't totally ineffective. My work in improving security had measurable results saving significant amounts of money and reducing content leakage. One memorable time my colleague and I had a call with a big rights owner who tracks the piracy of their assets and they said "Can you tell us what you've been doing recently? Because the amount of piracy from your platform has dropped significantly."

          Anti-piracy and content security is also a differentiator between platforms when bidding for content deals. Rights owners will absolutely give the best deals to the provider who provides more assurance and avoid platforms which are leaky buckets.

          I know that doesn't fit the narrative, but until recently this was literally my job.

        • jamesnorden 8 hours ago

          Security theater mostly, makes the executives feel good.

      • realusername 13 hours ago

        I don't think anybody could suggest going back to Blu-ray at this point. If selling online without DRM were the only choice, they would have to comply.

      • IshKebab 11 hours ago

        I'm not convinced by that. If DRM didn't exist, do you really think studios would be like "nah, we'll just miss out on all that money"?

        They just want DRM because it makes them even more money. Or at least they think it does. I have yet to find a single TV show or film that isn't available on Bittorrent so I don't think the DRM is actually preventing piracy in the slightest. I guess they want it in order to prevent legal tools from easily working with videos, e.g. for backup, retransmission etc.

shanemhansen 14 hours ago

> AV1 streaming sessions achieve VMAF scores¹ that are 4.3 points higher than AVC and 0.9 points higher than HEVC sessions. At the same time, AV1 sessions use one-third less bandwidth than both AVC and HEVC, resulting in 45% fewer buffering interruptions.

Just thought I'd extract the part I found interesting as a performance engineer.

  • slhck 7 hours ago

    This VMAF comparison is to be taken with a grain of salt. Netflix's primary goal was to reduce bitrate consumption, as can be seen, while roughly keeping the same nominal quality of the stream. This means that, ignoring all other factors and the limitations of H.264 at higher resolutions, VMAF scores for all their streaming sessions should be roughly the same, or in a comparable range, because that's what they're optimizing for. (See the Dynamic Optimizer Framework they publicly posted a few years ago.)

    Still impressive numbers, of course.

VerifiedReports 16 hours ago

I had forgotten about the film-grain extraction, which is a clever approach to a huge problem for compression.

But... did I miss it, or was there no mention of any tool to specify grain parameters up front? If you're shooting "clean" digital footage and you decide in post that you want to add grain, how do you convey the grain parameters to the encoder?

It would degrade your work and defeat some of the purpose of this clever scheme if you had to add fake grain to your original footage, feed the grainy footage to the encoder to have it analyzed for its characteristics and stripped out (inevitably degrading real image details at least a bit), and then have the grain re-added on delivery.

So you need a way to specify grain characteristics to the encoder directly, so clean footage can be delivered without degradation and grain applied to it upon rendering at the client.

  • crazygringo 15 hours ago

    You just add it to your original footage, and accept whatever quality degradation that grain inherently provides.

    Any movie or TV show is ultimately going to be streamed in lots of different formats. And when grain is added, it's often on a per-shot basis, not uniformly. E.g. flashback scenes will have more grain. Or darker scenes will have more grain added to emulate film.

    Trying to tie it to the particular codec would be a crazy headache. For a solo project it could be doable but I can't ever imagine a streamer building a source material pipeline that would handle that.

    • VerifiedReports 12 hours ago

      Mmmm, no, because if the delivery conduit uses AV1, you can optimize for it and realize better quality by avoiding the whole degrading round of grain analysis and stripping.

      "I can't ever imagine a streamer building a source material pipeline that would handle that."

      That's exactly what the article describes, though. It's already built, and Netflix is championing this delivery mechanism. Netflix is also famous for dictating technical requirements for source material. Why would they not want the director to be able to provide a delivery-ready master that skips the whole grain-analysis/grain-removal step and provides the best possible image quality?

      Presumably the grain extraction/re-adding mechanism described here handles variable grain throughout the program. I don't know why you'd assume that it doesn't. If it didn't, you'd wind up with a single grain level for the entire movie; an entirely unacceptable result for the very reason you mention.

      This scheme loses a major opportunity for new productions unless the director can provide a clean master and an accompanying "grain track." Call it a GDL: grain decision list.

      This would also be future-proof; if a new codec is devised that also supports this grain layer, the parameters could be translated from the previous master into the new codec. I wish Netflix could go back and remove the hideous soft-focus filtration from The West Wing, but nope; that's baked into the footage forever.

      • irae 4 hours ago

        I believe you are speculating on digital mastering and not codec conversion.

        From the creator's PoV, their intention and quality is defined in post-production and mastering, color grading and other stuff I am not an expert on. But I know a bit more about music mastering, and you might be thinking of a workflow similar to Apple's, which allows creators to master for their codec with the "Mastered for iTunes" flow, where creators opt in to an extra step to increase the quality of the encoding and can hear in their studio the final quality after Apple encodes and DRMs the content on their servers.

        In video I would assume that is much more complicated, since there are many qualities the video is encoded at to allow for slower connections and buffering without interruptions. So I assume the best strategy is the one you mentioned yourself, where the AV1 encoder detects, per scene or keyframe interval, the grain level/type/characteristics and encodes so as to be accurate to the source material in that scene.

        In other words: The artist/director preference for grain is already per scene and expressed in the high bitrate/low-compression format they provide to Netflix and competitors. I find it unlikely that any encoder flags would specifically benefit the encoding workflow in the way you suggested it might.

      • crazygringo 3 hours ago

        You're misunderstanding.

        > if the delivery conduit uses AV1, you can optimize for it

        You could, in theory, as I confirmed.

        > It's already built, and Netflix is championing this delivery mechanism.

        No it's not. AV1 encoding is already built. Not a pipeline where source files come without noise but with noise metadata.

        > and provides the best possible image quality?

        The difference in quality is not particularly meaningful. Advanced noise-reduction algorithms already average out pixel values across many frames to recover a noise-free version that is quite accurate (including accounting for motion), and when the motion/change is so overwhelming that this doesn't work, it's too fast for the eye to be perceiving that level of detail anyways.
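
        (If it helps, the simplest form of that averaging is easy to sketch; the toy C below assumes a static scene and co-located pixels, while real denoisers add motion compensation and outlier rejection on top.)

          #include <stddef.h>
          #include <stdint.h>

          /* Toy temporal denoise: average co-located luma samples across a
           * window of frames, so zero-mean grain cancels out while the
           * underlying image stays. Real tools are motion-compensated;
           * this sketch assumes a static scene. */
          void temporal_mean(const uint8_t *const *frames, int n_frames,
                             size_t n_pixels, uint8_t *out)
          {
              for (size_t i = 0; i < n_pixels; i++) {
                  unsigned sum = 0;
                  for (int f = 0; f < n_frames; f++)
                      sum += frames[f][i];
                  out[i] = (uint8_t)(sum / n_frames);
              }
          }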

        > This scheme loses a major opportunity for new productions unless the director can provide a clean master and an accompanying "grain track."

        Right, that's what you're proposing. But it doesn't exist. And it's probably never going to exist, for good reason.

        Production houses generally provide digital masters in IMF format (which is basically JPEG2000), or sometimes ProRes. At a technical level, a grain track could be invented. But it basically flies in the face of the idea that the pixel data itself is the final "master". In the same way, color grading and vector graphics aren't provided as metadata either, even though they could be in theory.

        Once you get away from the idea that the source pixels are the ultimate source of truth and put additional postprocessing into metadata, it opens up a whole can of worms where different streamers interpret the metadata differently, like some streamers might choose to never add noise and so the shows look different and no longer reflect the creator's intent.

        So it's almost less of a technical question and more of a philosophical question about what represents the finished product. And the industry has long decided that the finished product is the pixels themselves, not layers and effects that still need to be composited.

        > I wish Netflix could go back and remove the hideous soft-focus filtration from The West Wing, but nope; that's baked into the footage forever.

        In case you're not aware, it's not a postproduction filter -- the soft focus was done with diffusion filters on the cameras themselves, as well as choice of film stock. And that was the creative intent at the time. Trying to "remove" it would be like trying to pretend it wasn't the late-90's network drama that it was.

  • bob1029 2 hours ago

    Actual film grain (i.e., photochemical) is arguably a valid source of information. You can frame it as noise, but it does provide additional information content that our visual system can work with.

    Removing real film grain from content and then recreating it parametrically on the other side is not the same thing as directly encoding it. You are killing a lot of information. It is really hard to quantify exactly how we perceive this sort of information so it's easy to evade the consequences of screwing with it. Selling the Netflix board on an extra X megabits/s per streamer to keep genuine film grain that only 1% of the customers will notice is a non-starter.
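
    To make "recreating it parametrically" concrete, here's a toy C sketch of what a decoder-side grain synthesizer does. This is not the actual AV1 film-grain model (that one uses an autoregressive noise field and piecewise-linear scaling); the scaling_lut here is a made-up stand-in for the transmitted grain parameters.

      #include <stdint.h>
      #include <stdlib.h>

      static int clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

      /* Toy parametric grain synthesis: add pseudo-random, zero-mean noise
       * whose strength is looked up from the local luma value. None of the
       * original grain's actual structure survives; only its statistics
       * (captured in the LUT) do. */
      void add_synthetic_grain(uint8_t *luma, size_t n_pixels,
                               const uint8_t scaling_lut[256], unsigned seed)
      {
          srand(seed);  /* per-frame seed keeps playback deterministic */
          for (size_t i = 0; i < n_pixels; i++) {
              int noise  = (rand() % 65) - 32;                  /* crude noise source */
              int scaled = (noise * scaling_lut[luma[i]]) >> 8; /* luma-dependent strength */
              luma[i] = (uint8_t)clamp8(luma[i] + scaled);
          }
      }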

pbw 18 hours ago

There's an HDR war brewing on TikTok and other social apps. A fraction of posts that use HDR are just massively brighter than the rest; the whole video shines like a flashlight. The apps are eventually going to have to detect HDR abuse.

  • thrdbndndn 17 hours ago

    The whole HDR scene still feels like a mess.

    I know how bad the support for HDR is on computers (particularly Windows and cheap monitors), so I avoid consuming HDR content on them.

    But I just purchased a new iPhone 17 Pro, and I was very surprised at how these HDR videos on social media still look like shit on apps like Instagram.

    And even worse, the HDR video I shoot with my iPhone looks like shit even when playing it back on the same phone! After a few trials I had to just turn it off in the Camera app.

    • johncolanduoni 17 hours ago

      I wonder if it fundamentally only really makes sense for film, video games, etc. where a person will actually tune the range per scene. Plus, only when played on half decent monitors that don’t just squash BT.2020 so they can say HDR on the brochure.

      • Dylan16807 4 hours ago

        Even without tuning it shouldn't look worse than squishing to SDR at capture time. There are significant ecosystem failures that could be fixed.

    • theshackleford 16 hours ago

      The HDR implementation in Windows 11 is fine. And it's not even that bad in 11 in terms of titles and content officially supporting HDR. Most of the notion that it's "bad" comes from the "cheap monitor" part, not Windows.

      I have zero issues and only an exceptional image on W11 with a PG32UQX.

      • lwkl 4 hours ago

        Also, if you get flashbanged by SDR content on Windows 11, there is a slider in the HDR settings that lets you turn down the brightness of SDR content. I didn't know about this at first and had HDR disabled because of it for a long time.

      • RealStickman_ 9 hours ago

        IIRC Windows still uses the sRGB curve for tone mapping of SDR content in HDR, so you have to toggle it on and off all the time.

        KDE Wayland went the better route and uses Gamma 2.2

    • Forgeties79 17 hours ago

      The only time I shoot HDR on anything is because I plan on crushing the shadows/raising the highlights after the fact. S-curves all the way. Get all the dynamic range you can and then dial in the look. Otherwise it just looks like a flat, washed-out mess most of the time.

  • munificent 18 hours ago

    Just what we need, a new loudness war, but for our eyeballs.

    https://en.wikipedia.org/wiki/Loudness_war

    • morshu9001 14 hours ago

      What if they did HDR for audio? So an audio file can tell your speakers to output at 300% of the normal max volume, even more than what compression can do.

      • Cthulhu_ 11 hours ago

        Isn't that just done by having generally low volume levels? I'm being pedantic, but audio already supports a kind of HDR like that. That said, I wonder if the "volume normalisation" tech that Spotify definitely has (and presumably other media apps/players do too) can be abused into thinking a song is really quiet.

    • eru 18 hours ago

      Interestingly, the loudness war was essentially fixed by the streaming services. They were in a similar situation to the one TikTok is in now.

      • Demiurge 14 hours ago

        You would think, but not in a way that matters. Everyone still compresses their mixes. People try to get around normalization algorithms by clever hacks. The dynamics still suffer, and bad mixes still clip. So no, I don’t think streaming services fixed the loudness wars.

      • aoeusnth1 16 hours ago

        What's the history on the end to the loudness war? Do streaming services renormalize super compressed music to be quieter than the peaks of higher dynamic range music?

        • eru 16 hours ago

          Yes. Basically the streaming services started using a decent model of perceived loudness, and normalise tracks to roughly the same perceived level. I seem to remember that Apple (the computer company, not the music company) was involved as well, but I need to re-read the history here. Their music service and mp3 players were popular back in the day.

          So all music producers got out of compressing their music was clipping, and not extra loudness when played back.
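
          The playback adjustment itself is trivial once loudness has been measured; a minimal sketch in C, assuming a platform target around -14 LUFS (a commonly cited figure). The hard part, measuring integrated loudness per ITU-R BS.1770, is omitted here.

            #include <math.h>

            /* Sketch: given a track's measured integrated loudness (LUFS) and a
             * platform target, compute the linear gain applied at playback.
             * A track mastered "hot" just gets turned down, so crushing the
             * dynamics buys no extra perceived loudness. */
            double playback_gain(double measured_lufs, double target_lufs)
            {
                double gain_db = target_lufs - measured_lufs;  /* negative => attenuate */
                return pow(10.0, gain_db / 20.0);              /* dB to linear amplitude */
            }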

          • cdash 15 hours ago

            It hasn't really changed the mastering process much; they are still doing the same old compression. Maybe not to the same extremes, but dynamic range is still usually terrible. They master to a higher LUFS target than the streaming platforms normalize to, because each streaming platform has a different limit and could change it at any time, so it's better to be on the safe side. There's also the fact that the majority of music listening doesn't happen on good speakers or in a good environment.

            • account42 10 hours ago

              > There's also the fact that the majority of music listening doesn't happen on good speakers or in a good environment.

              Exactly this. I usually do not want high-dynamic-range audio because that means it's either too quiet sometimes, or loud enough to annoy the neighbors at other times, or both.

      • irae 3 hours ago

        I hope they end up removing HDR from videos with HDR text. Recording video in sunlight etc. is OK; it can be sort of "normalized brightness" or something. But HDR text on top is always terrible.

  • crazygringo 18 hours ago

    This is one of the reasons I don't like HDR support "by default".

    HDR is meant to be so much more intense, it should really be limited to things like immersive full-screen long-form-ish content. It's for movies, TV shows, etc.

    It's not what I want for non-immersive videos you scroll through, ads, etc. I'd be happy if it were disabled by the OS whenever not in full screen mode. Unless you're building a video editor or something.

    • JoshTriplett 18 hours ago

      Or a photo viewer, which isn't necessarily running in fullscreen.

  • baby_souffle 4 hours ago

    > The apps are eventually going to have to detect HDR abuse

    The latest android release has a setting that is the HDR version of “volume leveling”.

  • jsheard 18 hours ago

    Sounds like they need something akin to audio volume normalization but for video. You can go bright, but only in moderation, otherwise your whole video gets dimmed down until the average is reasonable.

    • solarkraft 6 hours ago

      Actually I don’t even agree with that. I don’t want to be flash banged.

    • illiac786 11 hours ago

      I was about to write that. The algorithm needs to be chosen; what is mostly used for audio gain normalization? A rolling average?

  • recursive 18 hours ago

    My phone has this cool feature where it doesn't support HDR.

    • illiac786 11 hours ago

      Every phone has it, it’s called “power save mode” on most devices and provides additional advantages like preventing apps from doing too much stuff in the background. =)

  • JoshTriplett 18 hours ago

    That's true on the web, as well; HDR images on web pages have this problem.

    It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR". But you could probably catch the most egregious cases, like "every single pixel in the video has brightness above 80%".

    • kmeisthax 17 hours ago

      Funnily enough, HDR hardware already has to deal with this problem, because most HDR monitors literally do not have the power circuitry or cooling to deliver a completely white screen at maximum brightness.

      My idea is: for each frame, grayscale the image, then count what percentage of the screen is above the standard white level. If more than 20% of the image is >SDR white level, then tone-map the whole video to the SDR white point.
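
      Something like this minimal sketch, assuming per-pixel luminance is already available in nits and taking 203 nits as SDR reference white (the 20% threshold is the one from above):

        #include <stdbool.h>
        #include <stddef.h>

        /* Flag a frame for SDR tone-mapping when too much of it sits above
         * SDR reference white. Assumes luma_nits holds per-pixel luminance
         * in cd/m^2; 203 nits is one common choice for SDR reference white. */
        bool should_tonemap_to_sdr(const float *luma_nits, size_t n_pixels)
        {
            const float sdr_white_nits = 203.0f;
            const float max_fraction   = 0.20f;   /* the 20% threshold above */
            size_t bright = 0;

            for (size_t i = 0; i < n_pixels; i++)
                if (luma_nits[i] > sdr_white_nits)
                    bright++;

            return (double)bright / (double)n_pixels > max_fraction;
        }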

      • Koffiepoeder 12 hours ago

        I now present you: HDRbooster. The tool to boost your image to 19.99% BOOSTED highlights and 80.01% MAX brightness (99.99% of SDR white)!

      • JoshTriplett 17 hours ago

        That needs a temporal component as well: games and videos often use HDR for sudden short-lived brightness.

    • eru 18 hours ago

      > It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR".

      That sounds like a job our new AI overlords could probably handle. (But that might be overkill.)

  • morshu9001 14 hours ago

    HDR has a slight purpose, but the way it was rolled out was so disrespectful that I just want it permanently gone everywhere. Even the rare times it's used in a non-abusive way, it can hurt your eyes or make things display weirdly.

  • ElasticBottle 18 hours ago

    Can someone explain what the war is about?

    Like HDR abuse makes it sound bad, because the video is bright? Wouldn't that just hurt the person posting it since I'd skip over a bright video?

    Sorry if I'm phrasing this all wrong, don't really use TikTok

    • JoshTriplett 18 hours ago

      > Wouldn't that just hurt the person posting it since I'd skip over a bright video?

      Sure, in the same way that advertising should never work since people would just skip over a banner ad. In an ideal world, everyone would uniformly go "nope"; in our world, it's very much analogous to the https://en.wikipedia.org/wiki/Loudness_war .

    • johncolanduoni 17 hours ago

      Not everything that glitters (or blinds) is gold.

  • dylan604 18 hours ago

    sounds like every fad that came before it, where it was overused by all of the people copying it with no understanding of what it is or why. remember all of the HDR still images that pushed everything to look post-apocalyptic? remember all of the people pushing washed-out videos because they didn't know how to grade images recorded in log and it became a "thing"?

    eventually, it'll wear itself out just like every other overuse of the new

  • kmeisthax 17 hours ago

    I would love to know who the hell thought adding "brighter than white" range to HDR was a good idea. Or, even worse, who the hell at Apple thought implementing that should happen by way of locking UI to the standard range. Even if you have a properly mastered HDR video (or image), and you've got your brightness set to where it doesn't hurt to look at, it still makes all the UI surrounding that image look grey. If I'm only supposed to watch HDR in fullscreen, where there's no surrounding UI, then maybe you should tone-map to SDR until I fullscreen the damn video?

    • crazygringo 17 hours ago

      Yup, totally agreed. I said the same thing in another comment -- HDR should be reserved only for full-screen stuff where you want to be immersed in it, like movies and TV shows.

      Unless you're using a video editor or something, everything should just be SDR when it's within a user interface.

  • hbn 18 hours ago

    HDR videos on social media look terrible because the UI isn’t in HDR while the video is. So you have this insanely bright video that more or less ignores your brightness settings, and then dim icons on top of it that almost look incomplete or fuzzy because of their surroundings. It looks bizarre and terrible.

    • crazygringo 17 hours ago

      The alternative is even worse, where the whole UI is blinding you. Plus, that level of brightness isn't meant to be sustained.

      The solution is for social media to be SDR, not for the UI to be HDR.

      • miladyincontrol 15 hours ago

        Imo the real solution is for luminance to scale appropriately even in the HDR range, kinda like how gain-map HDR images can. Scaled both with regard to the display's capabilities and to the user's/app's intent.

        • solarkraft 6 hours ago

          My personal solution would be to cap video brightness to the brightness I selected.

    • NathanielK 18 hours ago

      It's good if you have black text on a white background, since your app can have good contrast without searing your eyes. People started switching to dark themes to avoid having their eyeballs seared by monitors with the brightness set high.

      For things filmed with HDR in mind it's a benefit. Bummer things always get taken to the extreme.

      • hbn 6 hours ago

        I only use light themes for the most part, and HDR videos look insane and out of place. If you scroll past an HDR video on Instagram you have an eyeball-searing section of your screen because your eyes aren't adjusted to looking at that brightness, and then once you scroll it off the screen and you have no HDR content, everything looks dim and muted because you just got flashbanged.

    • hombre_fatal 17 hours ago

      Not sure how it works on Android, but it's such amateur UX on Apple's part.

      99.9% of people expect HDR content to get capped / tone-mapped to their display's brightness setting.

      That way, HDR content is just magically better. I think this is already how HDR works on non-HDR displays?

      For the 0.1% of people who want something different, it should be a toggle.

      Unfortunately I think this is either (A) amateur enshittification like with their keyboards 10 years ago, or (B) Apple specifically likes how it works since it forces you to see their "XDR tech" even though it's a horrible experience day to day.

    • nine_k 18 hours ago

      But isn't it the point? Try looking at a light bulb; everything around it is so much less bright.

      OTOH pointing a flashlight at your face is at least impolite. I would put a dark filter on top of HDR videos until a video is clicked for watching.

      • hbn 6 hours ago

        I'm not trying to watch videos or read text on my light bulb

        • nine_k 3 hours ago

          A video of a sunrise, or a firework, or metal being cast, etc feels much more real in HDR. There are legitimate uses.

bofaGuy 18 hours ago

Netflix has been the worst-performing and lowest-quality video stream of any of the streaming services. Fuzzy video, lots of visual noise and artifacts. Just plain bad, and this is on the 4K plan, on gigabit fiber, on a 4K Apple TV. I can literally tell when someone is watching Netflix without knowing, because it looks like shit.

  • mapontosevenths 17 hours ago

    It's not AV1's fault though, I'm pretty sure it's that they cheap out on the bitrate. Apple is among the highest bitrates (other than Sony's weird hardware locked streaming service).

    I actually blamed AV1 for the macro-blocking and generally awful experience of watching horror films on Netflix for a long time. Then I realized other sources using AV1 were better.

    If you press Ctrl-Alt-Shift-D while the video is playing, you'll see that most of the time the bitrate is appallingly low, and also that Netflix plays their own original content using higher-bitrate HEVC rather than AV1.

    That's because they actually want it to look good. For partner content they often default back to lower bitrate AV1, because they just don't care.

  • odo1242 17 hours ago

    This is actually their DRM at work. If you watch it on a Linux device, or basically anything that isn’t a smart TV on the latest OS, they limit you to a 720p low-bitrate stream, even if you pay for 4K. (See Louis Rossmann’s video on the topic.)

    • jsheard 17 hours ago

      OP said they're using an Apple TV, which most definitely supports the 4K DRM.

      • array_key_first 17 hours ago

        The bit rate is unfortunately crushed to hell and back, leading to blockiness on 4K.

    • misiek08 11 hours ago

      I have the same experience as OP on the newest Apple TV 4K. Good to know it's not only me who wonders how it's possible that they describe such great approaches to encoding, yet the final result is just so bad.

      It is good that the OCAs really work; they are very inspiring in the content delivery domain.

  • bombela 15 hours ago

    Yep, and they also silently downgrade resolution and audio channels on an ever-changing and hidden list of browsers/OSes/devices over time.

    Meanwhile pirated movies are in Blu-ray quality, with all audio and language options you can dream of.

  • mulderc 17 hours ago

    I also find Netflix video quality shockingly bad and oddly inconsistent. I think they just don’t prioritize video quality in the same way as, say, Apple or Disney do.

  • deanylev 6 hours ago

    I was able to improve things somewhat by going to https://www.netflix.com/settings/playback/<myprofileid> and changing "Data usage per screen" from Auto to High

  • pcchristie 16 hours ago

    I cancelled Netflix for this exact reason. 4K Netflix looks worse than 720p YouTube, yet I pay (paid) for Netflix 4K, at roughly 2x what I paid for Netflix when it launched. It's genuinely a disgrace how they can even claim with a straight face that you're actually watching 4K. The last price rise was the tipping point and I tapped out after 11 years.

  • prhn 16 hours ago

    Netflix on Apple TV has an issue if "Match Content" is "off" where it will constantly downgrade the video stream to a lower bitrate unnecessarily.

    Even fixing that issue the video quality is never great compared to other services.

  • mtoner23 18 hours ago

    Probably some function of your proximity to data centers. I find HBO Max to be abysmal these days. But I've learned to just stop caring about this stuff since no one else in my life does.

  • not_a_bot_4sho 16 hours ago

    Oddly enough, I observe something to the opposite effect.

    I wonder if it has more to do with proximity to edge delivery nodes than anything else.

Eduard 18 hours ago

I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that Netflix clients without AV1 hardware acceleration capabilities would be overwhelmed by it?

  • FrostKiwi 18 hours ago

    Thanks to dav1d's [1] lovingly hand-crafted SIMD assembly it's actually possible to play back AV1 reasonably without hardware acceleration, but basically yes: hardware decoding only arrived from Snapdragon 8 onwards, Google Tensor G3 onwards, NVIDIA RTX 3000 series onwards. All relatively new.

    [1] https://code.videolan.org/videolan/dav1d

  • adgjlsfhk1 18 hours ago

    There are a lot of 10-year-old TVs/Fire Sticks still in use that have a CPU that maxes out just running the UI and rely exclusively on hardware decoding for all codecs (e.g. they couldn't software-decode h264 either). Imagine a super-budget phone from ~2012 and you'll have some idea of the hardware capability we're dealing with.

  • johncolanduoni 17 hours ago

    Compression gains will mostly be for the benefit of the streaming platform’s bills/infra unless you’re trying to stream 4K 60fps on hotel wifi (or if you can’t decode last-gen codecs on hardware either ). Apparently streaming platforms still favor user experience enough to not heat their rooms for no observable improvement. Also a TV CPU can barely decode a PNG still in software - video decoding of any kind is simply impossible.

    • solarkraft 6 hours ago

      > Apparently streaming platforms still favor user experience enough to not heat their rooms for no observable improvement

      It’s more like “why does Netflix kill my battery within an hour when I used to be able to play for 20”

  • MaxL93 11 hours ago

    I'd love to watch Netflix AV1 streams but they just straight up don't serve it to my smart TV or my Windows computers despite hardware acceleration support.

    The only way I can get them to serve me an AV1 stream is if I block "protected content IDs" through browser site settings. Otherwise they're giving me an H.264 stream... It's really silly, to say the least

  • eru 18 hours ago

    If you are on a mobile device, decoding without hardware assistance might not overwhelm the processors directly, but it might drain your battery unnecessarily fast?

  • solarkraft 6 hours ago

    Absolutely. Playing back any video codec is a terrible experience without acceleration.

  • boterock 18 hours ago

    tv manufacturers don't want high end chips for their tv sets... hardware decoding is just a way to make cheaper chips for tvs.

  • dd_xplore 13 hours ago

    They would be served h.265

liampulles 9 hours ago

I'm a hobbyist video encoder (mostly I like to experiment with backing up my DVD collection), and I recently switched to using AV1 over HEVC.

I've found the quality-versus-CPU-load tradeoff to be better, and I've found it is reasonably good at retaining detail rather than smoothing things out when compared to HEVC. And the ability to add generated "pseudo grain" works pretty well to give the perception of detail. The performance of GPU encoders (while still not good enough for my more stringent standards) is better.

aperture147 15 hours ago

AV1 is not new anymore and I think most modern devices support it natively. Some devices, like Apple's, even have a dedicated AV1 hardware decoder. Netflix has been pushing AV1 for a while now, so I thought the adoption rate would be more like 50%, but it seems AV1 requires better hardware and newer software, which a lot of people don't have.

  • smallstepforman 14 hours ago

    Don't forget that people also view Netflix on TVs, and a large number of physical TVs were made before AV1 was specced. So 30% overall may also mean 70% on modern devices.

  • brnt 7 hours ago

    > AV1 is not new anymore

    Uh what. (Embedded) hardware lasts a long time (and it should!). TVs around the globe are not all built after 2018. H264 is still the gold standard if you want to be sure a random device has hardware acceleration.

    I make use of this by taking a USB hard drive with me on trips. Random TVs rarely have issues with my H264 catalogue. It'll be a while before I look at AV1 for this. Sure, I wish I could benefit faster, but I don't want people to throw out perfectly good hardware either!

resolutefunctor 18 hours ago

This is really cool. Props to the team that created AV1. Very impressive

tr45872267 18 hours ago

>AV1 sessions use one-third less bandwidth than both AVC and HEVC

Sounds like they set HEVC to higher quality then? Otherwise how could it be the same as AVC?

  • pornel 18 hours ago

    There are other possible explanations, e.g. AVC and HEVC are set to the same bitrate, so AVC streams lose quality, while AV1 targets HEVC's quality. Or they compare AV1 traffic to the sum of all mixed H.26x traffic. Or the rates vary in more complex ways and that's an (over)simplified summary for the purpose of the post.

    Netflix developed VMAF, so they're definitely aware of the complexity of matching quality across codecs and bitrates.

    • tr45872267 18 hours ago

      I have no doubt they know what they are doing. But it's a strange metric no matter how you slice it. Why compare AV1's bandwidth to the average of H.264 and H.265, without any more details about resolution or compression ratio? Reading between the lines, it sounds like they use AV1 for low bandwidth and H.265 for high bandwidth, with H.264 as a fallback. If that is the case, why bring up this strange average bandwidth comparison?

      • slhck 7 hours ago

        Yeah, it's a weird comparison to be making. It all depends on how they selected the quality (VMAF) target during encoding. You could easily end up with other results had they, say, decided to keep the bandwidth the same but improve quality using AV1.

  • dylan604 18 hours ago

    definitely reads like "you're holding it wrong" to me as well

_puk 13 hours ago

I imagine that's a big part of the drive behind discontinuing Chromecast support..

https://www.androidcentral.com/streaming-tv/chromecast/netfl...

  • StrLght 12 hours ago

    I doubt that. Netflix has an app on TVs as old as 8-10 years now. The SoCs in such TVs aren't powerful enough to decode AV1 in software. They'll be stuck with H.264 for a long time.

  • kotaKat 2 hours ago

    Nah, that's more "we can't get ad injection working on the old Chromecast client" because it still works on early Chromecasts for ad-free plans.

vitorgrs 12 hours ago

Weirdly, Netflix on my Samsung TV has been using only H.264 for a few months now, not AV1. When they first launched AV1, it worked there...

Honestly I'm not complaining, because they were using AV1 at 800-900 kbps for 1080p content, which is clearly not enough compared to their 6 Mbps H.264 bitrate.

  • SG- 8 hours ago

    They may have determined that the TV's AV1 decoding was too poor, or that software-decoding AV1 wasn't a good idea.

ls612 18 hours ago

On a related note, why are release groups not putting out AV1 WEB-DLs? Most 4K stuff is H.265 now, but if AV1 is supplied without re-encoding, surely that would be better?

  • avidiax 18 hours ago

    I looked into this before, and the short answer is that release groups would be allowed to release in AV1, but the market seems to prefer H264 and H265 because of compatibility and release speed. Encoding AV1 to an archival quality takes too long, reduces playback compatibility, and doesn't save that much space.

    There are also no scene rules for AV1, only for H.265 [1]

    [1] https://scenerules.org/html/2020_X265.html

    • MaxL93 11 hours ago

      AV1 is the king of ultra-low bitrates, but as you go higher — and not even that much higher — HEVC becomes just as good, if not better. Publicly-available AV1 encoders (still) have a tendency to over-flatten anything that is low-contrast enough, while x265 is much better at preserving visual energy.

      This problem is only just now starting to get solved in SVT-AV1 with the addition of community-created psychovisual optimizations... features that x264 had over 15 years ago!

    • aidenn0 16 hours ago

      I'm surprised it took so long for CRF to dethrone 2-pass. We used to use 2-pass primarily so that files could be made to fit on CDs.
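
      For anyone who missed that era: the appeal of 2-pass was hitting an exact file size. A rough back-of-the-envelope (illustrative numbers, a 2-hour movie on one 700 MB CD with ~128 kbps audio):

        total budget  = 700 MiB x 8          ~ 5,870 Mbit
        total bitrate = 5,870 Mbit / 7200 s  ~ 815 kbps
        video bitrate ~ 815 kbps - 128 kbps  ~ 687 kbps

      2-pass lets the encoder spend exactly that budget; CRF targets a quality level and lets the file size land wherever it lands.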

    • breve 17 hours ago

      > Encoding AV1 to an archival quality takes too long

      With the SVT-AV1 encoder you can achieve better quality in less time than with the x265 encoder. You just have to use the right presets. See the encoding results section:

      https://www.spiedigitallibrary.org/conference-proceedings-of...
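
      As a rough illustration of what "the right presets" means (hypothetical flags and CRF values, assuming ffmpeg builds with libsvtav1 and libx265; the linked paper describes the actual test conditions):

        # SVT-AV1 presets run 0 (slowest) to 13 (fastest); mid presets trade speed for efficiency
        ffmpeg -i input.mkv -c:v libsvtav1 -crf 30 -preset 5 -an svtav1_out.mkv

        # x265 presets are named; "slow" is a rough quality-oriented counterpart
        # (CRF scales are not comparable across encoders, hence the different numbers)
        ffmpeg -i input.mkv -c:v libx265 -crf 22 -preset slow -an x265_out.mkv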

    • ls612 18 hours ago

      Yeah, I'm talking about WEB-DLs though, not rips, so there's no encoding necessary.

  • chrisfosterelli 18 hours ago

    Player compatibility. Netflix can use AV1 and send it to the devices that support it while sending H265 to those that don't. A release group puts out AV1 and a good chunk of users start avoiding their releases because they can't figure out why it doesn't play (or plays poorly).

  • mrbluecoat 16 hours ago

    H.264 has near-universal device support and almost no playback issues at the expense of slightly larger file sizes. H.265 and AV1 give you 10-bit 4K, but playback on even modest laptops can become choppy or produce render artifacts. I tried all three, desperately wanting AV1 to win, but Jellyfin on a small streaming server just couldn't keep up.

  • aidenn0 16 hours ago

    I'm not in the scene anymore, but for my own personal encoding, at higher quality settings, AV1 (rav1e or SVT; AOM was crazy slow) doesn't significantly beat out x265 for most sources.

    FGS (film grain synthesis) makes a huge difference at moderately high bitrates for movies that are very grainy, but many people seem to really not want it for HQ sources (see sibling comments). With FGS off, it's hard to find any sources that benefit at bitrates that you will torrent rather than stream.

  • Dwedit 18 hours ago

    Because pirates are unaffected by the patent situation with H.265.

    • homebrewer 13 hours ago

      Everyone is affected by that mess. Did you miss the recent news about Dell and HP dropping HEVC support in hardware they have already shipped? Encoders might not care about the legal purity of the encoding process, but they do have to care about how it's going to be decoded. I like using proper software to view my videos, but that's a rarity afaik.

    • ls612 18 hours ago

      But isn’t AV1 just better than h.265 now regardless of the patents? The only downside is limited compatibility.

      • bubblethink 16 hours ago

        Hardware support for AV1 is still behind H.265. There's a lot of 5-10 year old hardware that can play H.265 but not AV1. Second, there is also a split between Dolby Vision and HDR10(+). Is AV1 + Dolby Vision a thing? Blu-rays are obviously H.265. Overall, H.265 is the common denominator for all UHD content.

        • HelloUsername 11 hours ago

          > Blu-rays are obviously H.265

          Most new UHD discs, yes, but otherwise Blu-rays primarily use H.264/AVC.

      • BlaDeKke 17 hours ago

        Encoding my 40TB library to AV1 in software without losing quality would take more than a year, if not multiple years, and consume lots of power while doing it, all to save a little bit of storage. Granted, after a year of non-stop encoding I would save a few TB of space. But I think it is cheaper to buy a new 20TB hard drive than to pay for the electricity used for the encoding.

      • phantasmish 17 hours ago

        I avoid av1 downloads when possible because I don’t want to have to figure out how to disable film grain synthesis and then deal with whatever damage that causes to apparent quality on a video that was encoded with it in mind. Like I just don’t want any encoding that supports that, if I can stay away from it.

        • coppsilgold 16 hours ago

          In mpv it's just "F1 vf toggle format:film-grain=no" in the input config. And I prefer AV1 because of this; almost everything looks better without that noise.

          You can also include "vf=format:film-grain=no" in the config itself to start with no film grain by default.
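
          Concretely, a minimal sketch based on the lines above (the F1 binding is just an example, and it assumes a reasonably recent mpv where the format filter exposes film-grain):

            # ~/.config/mpv/input.conf: toggle film grain synthesis with F1
            F1 vf toggle format:film-grain=no

            # ~/.config/mpv/mpv.conf: start with grain synthesis off by default
            vf=format:film-grain=no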

          • phantasmish 16 hours ago

            I watch almost everything in Infuse on Apple TV or in my browser, though.

        • adgjlsfhk1 16 hours ago

          What's wrong with film grain synthesis? Most film grain in modern films is "fake" anyway (the modern VFX pipeline first removes grain, then adds effects, and lastly re-adds fake grain), so instead of forcing the codec to try to compress lots of noise (and end up blurring lots of it away), we can just have the codec encode the noiseless version and put the noise on after.

          • phantasmish 16 hours ago

            I watch a lot of stuff from the first 110ish years of cinema. For the most recent 25, and especially 15… yeah I dunno, maybe, but easier to just avoid it.

            I do sometimes end up with av1 for streaming-only stuff, but most of that looks like shit anyway, so some (more) digital smudging isn’t going to make it much worse.

            • adgjlsfhk1 15 hours ago

              Even for pre-digital era movies, you want film grain. You just want it done right (which not many places do to be fair).

              The problem you see with AV1 streaming isn't the film grain synthesis; it's the bitrate. Netflix is using film grain synthesis to save bandwidth (e.g. 2-5 Mbps for 1080p, ~20 Mbps for 4K), while a 4K Blu-ray is closer to 100 Mbps.

              If AV1+FGS is given anything close to a comparable bitrate to other codecs (especially if it's encoding from an uncompressed source like a high-res film scan), it will absolutely demolish a codec that doesn't have FGS on both bitrate and detail. The tech is just getting a bad rap because Netflix is aiming for minimal cost to deliver "good enough" rather than maximal quality.

        • Wowfunhappy 16 hours ago

          With HEVC you just don't have the option to disable film grain because it's burned into the video stream.

          • phantasmish 16 hours ago

            I’m not looking to disable film grain, if it’s part of the source.

            • Mashimo 12 hours ago

              Does AV1 add it if it's not part of the source?

              • phantasmish 6 hours ago

                I dunno, but if there is grain in the source it may erase it (discarding information) then invent new grain (noise) later.

                • Wowfunhappy 5 hours ago

                  I'm skeptical of this (I think they avoid adding grain to the AV1 stream which they add to the other streams--of course all grain is artificial in modern times), but even if true--like, all grain is noise! It's random noise from the sensor. There's nothing magical about it.

                  • phantasmish 4 hours ago

                    The grain's got randomness because the distribution and size of grains are random, but it's not noise; it's the "resolution limit" (if you will) of the picture itself. The whole picture is grain. The film is grain. Displaying that is accurately displaying the picture. Erasing it for compression's sake is tossing out information, and adding it back later is just an effect to add noise.

                    I'm OK with that for things where I don't care that much about how it looks (do I give a shit if I lose just a little detail on Happy Gilmore? Probably not). And I agree that faking the grain probably gets you a closer look to the original if you're going to erase the grain for better compression. But if I want actual high quality for a film source, then faked grain is no good: if you're having to fake it, you've definitely already sacrificed a lot of picture quality (because, again, the grain is the picture; you only get rid of it by discarding information from the picture).

  • Drybones 10 hours ago

    Smaller PT sites usually allow it

    Bigger PT sites with strict rules do not allow it yet and are actively discussing/debating it. Netflix WEB-DLs being AV1 is definitely pushing that. The codec has to be a selectable option during upload.

  • hapticmonkey 14 hours ago

    I've seen some on private sites. My guess is they are not popular enough yet. Or pirates are using specific hardware to bypass Widevine encryption (like an Nvidia Shield, burning keys periodically) that doesn't easily get the AV1 streams.

  • qingcharles 12 hours ago

    I'm seeing releases pop up on Pirate Bay with AV1 this year.

ksec 7 hours ago

Worth noting: H.264 High Profile is patent-free in most countries, and it will soon be patent-free in the US too.

  • sylware 7 hours ago

    Isn't AV1 on the level of H.265? And are H.265 and the future H.266 (which will face the upcoming AV2) free of charge forever and everywhere, like AV1/AV2?

    They could do it the Big Tech way: make it all 'free' for a good while, extinguish/calm down any serious competition, then make it not 'free' anymore.

    In the end, you cannot trust them.

    • anordal 3 hours ago

      Absolutely not.

      I wish everyone knew the difference between patents and copyright.

      You can download an open source HEVC codec and use it for all they care according to their copyright. But! You also owe MPEG LA 0.2 USD per unit if you want to use it, not to mention an undisclosed sum to actors like HEVC Advance and all the other patent owners I don't remember, because they have their own terms, and it's not their problem that you compiled an open source implementation.

    • adzm 3 hours ago

      VP9 is more on the level of H.265, really, and VVC/H.266 is closer to AV1. It's not an exact comparison, but it's close. The licensing for VVC is just as awful as for HEVC, and now that AV1 has proved itself, everyone is pivoting away from VVC/H.266, especially on the consumer side. Pretty much all VVC adoption is entirely internal (studios, set-top boxes, etc.) and it is not used by any major consumer streaming service, afaik.

      • ksec 22 minutes ago

        I guess most people have forgotten x264 developer Dark Shikari's post already.

        VP9 isn't at H.265's level. That is marketing spin from AOM. And even AOM members admit VVC is better than AV1.

        Liking one codec, or whether it is royalty-free, is one thing; whether it performs better is another.

conartist6 17 hours ago

For a second there I wasn't looking very close and I thought it said that 30% of Netflix was running on .AVI files

philipallstar 7 hours ago

This is a great result from Google, Netflix, Cisco, etc.

nrhrjrjrjtntbt 13 hours ago

> At Netflix, our top priority is delivering the best possible entertainment experience to our members.

I don't think that is true of any streamer. Otherwise they wouldn't provide the UI equivalent of a shopping centre that tries to get you lost and unable to find your way out.

  • sen 13 hours ago

    Or compression that makes a streamed 4K video look worse than a 1080p video played locally.

techpression 10 hours ago

Compression is great and all, but Netflix is overdoing it, and their content looks like an over-sharpened mess with Lego blocks in high-intensity scenes. And no, it's not my connection; Apple TV does it far better and so does Prime.

It's really sad that most people never get to experience a good 4K Blu-ray, where the grain is actually part of the image as mastered and there's enough bitrate to not rely on sharpening.

TMWNN 4 hours ago

The one big hardware deficiency of my Nvidia Shield TV is its lack of YouTube AV1 support.

forgotpwd16 13 hours ago

Am I the only one who thought this was an old article from the title? AV1 is now 10 years old, and AV2 was announced a few months ago for a year-end release. If anything, the news is that AV1 powers only 30% by now. At least HEVC, released around the same time, has gotten quite popular in the warez scene (movies/TV/anime) for small encodes, whereas AV1 releases are still considered a rarity. (Though, to be fair, 30% of Netflix & YT means AV1 usage in total is much higher.) I would've expected a royalty-free codec to have been embraced more, but it seems the difficulty, for a long time, of playing it on low-power devices hindered its adoption.

testdelacc1 9 hours ago

Something doesn’t quite add up to me. The post says “AV1 powers approximately 30% of all Netflix viewing”. Impressive, but I’m wondering why it isn’t higher. I’m guessing most devices should support AV1 software decoding: 88% of devices certified in the last 4 years support AV1, all browsers support AV1 software decoding, and the Netflix apps on Android (since 2021) and iOS (since 2023) obviously do.

So why isn’t AV1 higher? The post doesn’t say, so we can only speculate. It feels like they’re preferring hardware decoding to software decoding, even if that means an older codec. If this is true, it would make sense: it’s better for the client’s power and battery consumption.

But then why start work on AV2 before AV1 has even reached a majority of devices? I’m sure they have the answer but they’re not sharing here.

  • jhugo 9 hours ago

    Smart TVs, TV sticks, and a lot of mobile devices will not be capable of decoding AV1 in software in realtime, given their low-spec CPUs. I imagine that Netflix is only serving AV1 to devices with hardware decoding support.

shmerl 17 hours ago

Qualcomm seems to be lagging behind and doesn't have an AV1 decoder except in high-end SoCs.

notatoad 16 hours ago

I understand that sometimes the HN titles get edited to be less descriptive and more generic in order to match the actual article title.

What’s the logic of changing the title here from the actual article title it was originally submitted with, “AV1 — Now Powering 30% of Netflix Streaming”, to the generic and not at all representative title it currently has, “AV1: a modern open codec”? That is neither the article title nor representative of the article content.

  • tomhow 13 hours ago

    OK guys, my screwup.

    We generally try to remove numbers from titles, because numbers tend to make a title more baity than it would otherwise be, and quite often (e.g., when reporting benchmark test results) a number is cherry-picked or dialed up for maximum baitiness. In this case, the number isn't exaggerated, but any number tends to grab the eye more than words, so it's just our convention to remove number-based titles where we can.

    The thing with this title is that the number isn't primarily what the article is about, and in fact it under-sells what the article really is, which is a quite-interesting narrative of Netflix's journey from H.264/AVC, to the initial adoption of AV1 on Android in 2020, to where it is now: 30% adoption across the board.

    When we assess that an article's original title is baity or misleading, we try to find a subtitle or a verbatim sentence in the article that is sufficiently representative of the content.

    The title I chose is a subtitle, but I didn't take enough care to ensure it was adequately representative. I've now chosen a different subtitle which I do think is the most accurate representation of what the whole article is about.

  • pants2 16 hours ago

    Though in the original title AV1 could be anything if you don't know it's a codec. How about:

    "AV1 open video codec now powers 30% of Netflix viewing, adds HDR10+ and film grain synthesis"

    • nerdsniper 15 hours ago

      AV1 is fine as-is. Plenty of technical titles on HN would need to be googled if you didn't know them. Even in yours, HDR10+ "could be anything if you don't know it". Play this game if you want, but it's unwinnable. The only people who care about AV1 already know what it is.

      • pants2 15 hours ago

        Well, I'm interested in AV1 as a videographer but hadn't heard of it before. Without 'codec' in the title I would have thought it was networking related.

        Re: HDR - not the same thing. HDR has been around for decades and every TV in every electronics store blasts you with HDR10 demos. It's well known. AV1 is extremely niche and deserves 2 words to describe it.

        • cyphar 14 hours ago

          AV1 has been around for a decade (well, it was released 7 years ago but the Alliance for Open Media was formed a decade ago).

          It's fine that you haven't heard of it before (you're one of today's lucky 10,000!) but it really isn't that niche. YouTube and Netflix (from TFA) also started switching to AV1 several years ago, so I would expect it to have similar name recognition to VP9 or WebM at this point. My only interaction with video codecs is having to futz around with ffmpeg to get stuff to play on my TV, and I heard about AV1 a year or two before it was published.

        • edoceo 14 hours ago

          I'm old (50) and have heard of AV1 before. My modern TV didn't say HDR or HDR10 (it did say 4K). Agree that AV1 should include "codec".

          One word, or acronym, just isn't enough to describe anything in this modern world.

    • lII1lIlI11ll 13 hours ago

      > Though in the original title AV1 could be anything if you don't know it's a codec.

      I'm not trying to be elitist, but this is "Hacker News", not CNN or BBC. It should be safe to assume some level of computer literacy.

      • averageRoyalty 13 hours ago

        Knowledge of all available codecs is certainly not the same tier as basic computer literacy. I agree it doesn't need to be dumbed down for the general user, but we also shouldn't assume everyone here knows every technical abbreviation.

    • efitz 14 hours ago

      The article barely mentioned “open”, and certainly gave no insight as to what “open” actually means wrt AV1.

  • VerifiedReports 16 hours ago

    Amen. The mania for obscurity in titles here is infuriating. This one is actually replete with information compared to many you see on the front page.

    • tomhow 10 hours ago

      If there really was a “mania for obscurity in titles” we’d see a lot more complaints than we do.

      Our title policy is pretty simple and attuned for maximum respect to the post’s author/publisher and the HN audience.

      We primarily just want to retain the title that was chosen by the author/publisher, because it’s their work and they are entitled to have such an important part of their work preserved.

      The only caveat is that if the title is baity or misleading, we’ll edit it, but only enough that it’s no longer baity or misleading. That’s because clickbait and misleading titles are disrespectful to the audience.

      Any time you see a title that doesn’t conform to these principles, you’re welcome to email us and ask us to review it. Several helpful HN users do this routinely.

    • CyberDildonics 14 hours ago

      Hacker News loves low-information clickbait titles. The shorter and more vague, the better.

  • cortesoft 15 hours ago

    It is usually Dang using his judgment.

    • big-and-small 13 hours ago

      I really like the moderation on HN in general, but honestly this inconsistent policy of editorializing titles is bad. There were plenty of times when submitter-editorialized titles (e.g. for GitHub code dumps of some project) were changed back to useless and vague (without context) original titles.

      And now the HN administration tends to editorialize in its own way.

  • wltr 14 hours ago

    For me that’s a FU moment that reminds me ‘TF am I doing here?’ I genuinely see this resource as a censoring-plus-advertising platform (both for YC, obviously), where there are generic things, but also things someone doesn’t want you to read or know. The titles are constantly being changed to gibberish, like right here; adequate comments or posts get killed, yet absolutely irrelevant or offensive things can stay untouched. Etc.

  • 7e 16 hours ago

    Also, it’s not the whole picture. AV1 is open because it leaves out the good stuff (newly patented techniques), and as such I also wouldn’t say it’s the most modern.

    • adgjlsfhk1 15 hours ago

      AV1 has plenty of good stuff. AOM (the alliance that developed AV1) has a patent pool https://www.stout.com/en/insights/article/sj17-the-alliance-... comprising video hardware/software patents from Netflix, Google, Nvidia, Arm, Intel, Microsoft, Amazon, and a bunch of other companies. AV1 has a bunch of patents covering it, but also a guarantee that you're allowed to use those patents as you see fit (as long as you don't sue AOM members over media patents).

      AV1 definitely is missing some techniques patented for H.264 and H.265, but AV2 is coming around now that the H.264 innovations are patent-free (and now that there's been another decade of research into new cutting-edge techniques).

    • bawolff 16 hours ago

      Just because something is patented doesn't necessarily mean it's good. I think head-to-head comparisons matter more. (Admittedly I don't know how AV1 holds up.)

      • parl_match 16 hours ago

        Yes, but in this case, it does.

        AV1 is good enough that the savings from not licensing might outweigh the cost of higher bandwidth. And it sounds like Netflix agrees with that.

endorphine 16 hours ago

Is it just me, or does this post have LLM vibes?

kvirani 18 hours ago

Top post without a single comment and only 29 points. Clearly my mental model of how posts bubble to the top is broken.

  • yjftsjthsd-h 18 hours ago

    IIRC, there's a time/recency factor. If we assume that most people don't browse /newest (without commenting on whether they should, I suspect this is true), then that seems like a reasonable way to help surface things; enough upvotes to indicate interest means a story gets a chance at the front page.