What You Need to Know Before Touching a Video File

(gist.github.com)

106 points | by qbow883 5 days ago

15 comments

  • embedding-shape 2 hours ago
    It seems really weirdly written. It's written with a lot of authority, like saying "Don't use VLC" and "Don't use Y", yet provides no reasoning for those things. Just putting "Trust me, just don't" doesn't suddenly mean I trust the author more; it probably has the opposite effect. Some sections seem to differ based on whether the reader knows or doesn't know something, but I thought the article was supposed to be for the latter.

    Would have been nice if this "MUST KNOW BEFORE" advice were structured in a way so one could easily come back and use it as a reference, like just a list. Instead it's like an over-dinner conversation with your "expert and correct but socially annoying" work colleague, who refuses to elaborate on the hows and whys but still has very strong opinions.

    • Etheryte 1 hour ago
      Exactly, very hard to take the rest of it seriously after the VLC bit. VLC has literally never left me hanging, across I don't know how many decades. It's gonna take more than a trust me bro to challenge that.
      • ffsm8 23 minutes ago
        You're talking about VLC for video playback, TFA is talking about video editing.

        VLC ignores a lot for its outstanding video playback support, which is great if you want playback to just work... But that's the player perspective, not the editing/encoding one.

      • ErroneousBosh 28 minutes ago
        VLC is great for playing stuff back, but can produce some horribly incorrect video files especially if you're dealing with stuff for editing.

        There's a reason why VLC isn't used in broadcast stuff and ffmpeg is.

      • MallocVoidstar 1 hour ago
        IIRC VLC used the wrong primaries for converting to RGB for a long time (years), even after it was reported to them as wrong
    • ramesh31 2 hours ago
      technically correct is the best kind. who cares if it's obnoxious? take the opinions and agree or disagree with them.
      • dylan604 1 hour ago
        How do you know it is technically correct without explanation? It's not much different from someone getting blown off for being annoying because they constantly question simple answers when seeking better understanding. I was fortunate to work with a group of engineers when I was very young who accepted my constant use of "why?" not as disrespectful questioning but realized I was actually learning, so they naturally just provided more details, leading to fewer "why?"s being asked. This eventually got to the point where I would ask a question, and the answer would be to read a specific book on the shelf. This was way before the internet. I received a better education on the job than I ever was going to get in school.

        So no, I'm not just going to take an opinion without more information. I don't change my mind just on say so.

      • yaur 7 minutes ago
        When we switched from x264 to hardware based encoders it saved something like 90% on our customers' power and cooling bills.

        So while this essay might be "technically correct" in some very narrow sense the author is speaking with far more authority than they have the experience to justify, which is what makes it obnoxious in the first place.

      • snakeboy 1 hour ago
        It works if you know the person and have a baseline for how much confidence you give their opinions. If it's just a random person on the internet, they need to support their argument.
  • EdNutting 2 hours ago
    Interesting read, it’s a shame the ranty format makes it 3x longer than necessary.

    Not sure why it takes a dump on VLC - it’s been the most stable and friendly video player for Windows for a long time (it matters that ordinary users, like school teachers, can use it without special training. I don’t care how ideological you are about Linux or video players or whatever lol).

    • jamesnorden 1 hour ago
      I don't believe it's the case anymore, but it was very common for VLC to cause video corruption in the past (see [1] for an example of what it looked like); the hate just stuck around and I don't think it's ever going away.

      [1] https://www.reddit.com/r/glitch_art/comments/144vjl/vlc_star...

      • EdNutting 1 hour ago
        13 years since that post and this is the first time I’m hearing of this long-past issue.

        Haters gonna hate I guess.

      • nickthegreek 57 minutes ago
        It has never been very common for VLC to cause video corruption.
    • throwaway2046 2 hours ago
      VLC works great on Linux too! It's one of the few programs where I expect the exact same look and feel regardless of the underlying OS.

      mpv is okay but its complete reliance on command line flags and manually written config files makes it a bore.

      • embedding-shape 2 hours ago
        > where I expect the exact same look and feel regardless of the underlying OS

        Slightly ironic, as I think a new UI is underway (and coming soon?). Not sure what version it's planned for, but I think some beta already has it enabled by default; I was surprised when I saw it. So the consistent UI is here today, and will be in the future, but there will be a slice of time where different users run different versions, some with the new UI and some without. Hopefully it'll be a brief period, and of course it's still cross-platform :)

      • binarygit 21 minutes ago
        VLC is pretty much one of the default things I download on any of my computers. Right now I use a Mac and it's my default video player here too!
    • howenterprisey 2 hours ago
      In the anime fansubbing community (which this document is likely from), it's very common to hate on VLC for a variety of imagined (and occasionally real but marginal) issues.
      • bcye 1 hour ago
        Why is that?
        • amlib 37 minutes ago
          At least for the real part, there was the great 10-bit encoding switchover around 2012, when it seemed like the whole anime encoding scene decided to move to encoding just about everything with "10-bit h264" in order to preserve more detail at the same bitrate. VLC didn't have support for it, and for a long time (5+ years?) it remained without proper support. Every time you tried playing such files they would exhibit corruption at some interval. It was like watching a scrambled cable channel with brief moments of respite.

          The kicker is that many, many other players broke too. Very few hardware decoders could deal with this format, so it was fairly common to get dropped frames due to software decoding fallback even if your device or player could play it. And, about devices: if you were previously playing h264 anime stuff on your nice pre-smart TV, forget about doing so with the 10-bit stuff.

          Years passed and most players could deal with 10-bit encoding, people bought newer devices that could hardware decode it and so on, but afaik VLC remained incompatible a while longer.

          Eventually it all became moot because the anime scene switched to h265...

        • zdw 35 minutes ago
          Mostly that VLC has had noticeable issues with displaying some kinds of subtitles made with Advanced SubStation (especially ones taking up much of the frame, or that pan/zoom), which MPV-based players handle better.

          If you want an MPV-based player GUI on macOS, https://github.com/iina/iina is quite good.

    • EdNutting 1 hour ago
      Follow-up comment: I love how the author’s one brief take-down shot at VLC is currently the dominant criticism in the HN comments (inc. mine). 10,000+ words and the entire lot is being questioned because of one dubious throwaway comment about VLC.

      A lesson to learn in that.

      Lol

  • weinzierl 53 minutes ago
    "Don't use Topaz AI, Anime4k, RealESRGAN, RIFE, etc. Trust me, just don't."

    Why? I only know Topaz and I always thought it had its narrow but legitimate use cases for upscaling and equalizing quality?

    • perching_aix 39 minutes ago
      Can't mind read the guy obviously, but the usual motivation that I'm aware of is that you pretty much fuck over everyone else that comes later. Upscalers improve over time, but in terms of distribution, recency bias is strong and visual treats are inviting. So when those much better upscalers eventually come around, what's more likely to still be available is the secondary source you distributed, which is already upscaled once with a then-inferior upscaler. This leads to a form of generational rot.

      Other likely explanations are:

      - them not liking how these upscalers look: you can imagine if they can nitpick minor differences between different encodes that most people don't notice, they'll hate the glaring artifacts these filters usually produce

      - boycotting AI

    • ErroneousBosh 25 minutes ago
      Topaz looks bloody awful. Instead of big blocky upscaled pixels you've got weird artifacty "oil painting effect" smeary blobs.
    • echelon 49 minutes ago
      If someone has anti-AI opinions and spends zero effort explaining their position, I assume they have "AI Derangement" [1], a hate/fear perpetuated by the Big Scare media and podcasters seeking likes/subscribes.

      You can spend a tremendous amount of time using these tools to accomplish pretty stunning results already:

      - https://www.youtube.com/watch?v=Tii9uF0nAx4 - Made by a film school grad as a demo of real filmmaking combined with AI VFX.

      - https://www.youtube.com/watch?v=FAQWRBCt_5E - Created by a Hollywood TV writer for an FX show you've probably seen. Not the best animation or voicing, but you can see how it gives a writer more than just a blank page to convey their thoughts.

      - https://www.youtube.com/watch?v=wWZYP5jn5w4 - Music video. Slightly MAGA-coded, but made by a Hollywood VFX person.

      - https://www.youtube.com/watch?v=tAAiiKteM-U - Made by a film school grad as a Robot Chicken homage. If you're going to tell them "don't use AI", then are you going to get them a job at Disney? Also, all the pieces are hand-rotoscoped, the mouth animations are hand-animated, and every voice is from a hired (and paid) voice actor.

      - https://www.youtube.com/watch?v=H4NFXGMuwpY - Made by a film school grad as a Robot Chicken homage. See previous comment.

      - https://www.youtube.com/watch?v=d_KXYpaTe_8 - Another slightly MAGA-coded music video. Made by the same Hollywood VFX person.

      - https://www.youtube.com/watch?v=9hlx5Rslrzk - Amazing Spider-Man vs. Carnage anime created with ComfyUI and other models.

      - https://www.youtube.com/watch?v=oqoCWdOwr2U - Christmas Grinch anime.

      - https://www.youtube.com/watch?v=uKYeDIiqiHs - Totally 100% cursed. Made by a teenager following the comic book's plot. Instead of this teenager spending 100 hours on Fortnite, they made this.

      - https://www.youtube.com/watch?v=Ps5Dhc3Lh8U - A Pixar-like short film

      [1] Creators and artists using AI have been harassed, doxxed, sent death threats, name-called, sent shock images, etc. by anti-AI folks at such volume that we've become bitter about these people. This is literally just a new tool in the toolbox. We shouldn't be treated like this.

      • cnntth 33 minutes ago
        OP makes zero comments about content generation, and the complaint is about upscaling introducing artifacts not in the original source. No different than hating a bad 4k remaster / sharpening.
  • happytoexplain 1 hour ago
    I'm always amazed when I see how many people are unfamiliar with VLC hate. It was notorious (to the point of it being a popular meme topic) for video artifacts, slow/buggy seeking, bloated/clumsy UI/menus, having very little format support out of the box, and buggy subtitles. I assume nowadays it's much better, since it seems popular, but its reputation will stick with me forever.
    • gruez 1 hour ago
      >It was notorious (to the point of it being a popular meme topic) for [...] having very little format support out of the box

      ???

      I thought the meme was that it played basically everything? At least compared to Windows Media Player or whatever.

      The other items I can't say I've noticed, but then again I only play the most common of files (e.g. h.264/h.265 with English subtitles in an mkv) so maybe it's something that only happens with unusual formats/encodes.

      edit: based on other comments (eg. https://news.ycombinator.com/item?id=46465349), it looks like it might indeed be caused by uncommon files that I haven't encountered.

    • jlarocco 1 hour ago
      I've never had problems with VLC, and I've used it off and on for 20 years.

      I don't doubt that there's some obscure, elite videophile hate towards it, but I'm hardly going to stop using it because a few random internet strangers hate on it.

      • perching_aix 49 minutes ago
        Kinda the problem with anecdotes isn't it? :)

        My own anecdotal experience with VLC was that while every update fixed something, it also broke something in return - and these updates were common. This got annoying enough at some point for me to jump ship, so I switched to mpc-hc and never looked back.

        I've since also tried (and keep trying) the to-me still newfangled mpv, but I'm not a fan of the GUI or the keybinds. I know it can be customized, but that's not something I'm interested in doing. I know there are alternative frontends as well, but I checked and they're not to my liking either. So I mostly just use it for when mpc-hc gives out, such as really badly broken media files, or anything HDR.

        • jlarocco 46 minutes ago
          And you think everybody else should stop using it because you had problems?

          I'll make up my own mind on it.

          • perching_aix 35 minutes ago
            Do you think everyone else should start or continue using it because you never had problems?

            Let's be kind. Clearly not what either of us were thinking or intending to convey.

          • gpvos 37 minutes ago
            Where did perching_aix say that?
    • dooglius 1 hour ago
      What year was this? I don't know that there has ever been a normal format it doesn't support, and I think that has been the case for at least 15 years.
      • dotancohen 33 minutes ago
        Up until just last month I had never had a problem with VLC. But I don't pirate content, so maybe I just hadn't encountered the problematic files. However, recording voice notes in Opus format on my phone, it turns out that VLC has a bug playing Opus files at certain bit rates. For me this is easily worked around by just using MPV.
      • u_sama 1 hour ago
        I dropped VLC circa 2019 for all the reasons mentioned and ever since I use exclusively MPV, both on Windows and Linux.

        So the issues date at least from those times.

    • EdNutting 1 hour ago
      For a long time it was the only graphical user-friendly option for non-technical Windows users that had decent support for a wide range of formats. I don’t know about its early years, but friends, family and I have been using it for a good 15+ years without encountering the issues folks are describing in these comments.

      It seems there’s a lot of open-source lovers that haven’t also accepted that bugs can get fixed, projects can improve, etc. They’d rather treat a project as though it was stuck at version 0 from 20 something years ago. Deeply ironic.

    • miladyincontrol 1 hour ago
      Agree. Never mind how far they were behind on the more power user options like scaling, dealing with mismatches in video framerate and monitor refresh rate, etc.

      Haven't used it in ages, but a decade ago it felt like a joke for all the video artifacts and subtitle glitches.

      The one part that does get me about people who still blindly praise it as THE video player, at least outside of more technically inclined spaces like this, is that so many people assume it exists as some monolith: clearly library-free, entirely the original work of VideoLAN, how gracious of them to give it all away for free.

    • gpvos 39 minutes ago
      Do you have sources for that? As far as I know VLC has actually always been famous for supporting basically every format.
  • craftkiller 40 minutes ago
    Something I've never been able to find satisfactory information on (and unfortunately this article also declares it out of scope) is what the actual hard on-the-wire and on-disk differences between SDR and HDR are. Like yes, I know HDR = high dynamic range = bigger difference between light and dark, but what technical changes were needed to accomplish this?

    The way I understand it, we've got the YCbCr that is being converted to an RGB value which directly corresponds to how bright we drive the R, G, and B subpixels. So wouldn't the entire range already be available? As in, post-conversion to RGB you've got 256 levels for each channel which can be anywhere from 0 to 255 or 0% to 100%? We could go to 10-bit color which would then give you finer control with 1024 levels per channel instead of 256, but you still have the same range of 0% to 100%. Does the YCbCr -> RGB conversion not use the full 0-255 range in RGB?

    Naturally, we can stick brighter backlights in our monitors to make the difference between light and dark more significant, but that wouldn't change the on-disk or on-the-wire formats. Those formats have changed (video files are specifically HDR or SDR and operating systems need to support HDR to drive HDR monitors), so clearly I am missing something but all of my searches only find people comparing the final image without digging into the technical details behind the shift. Anyone care to explain or have links to a good source of information on the topic?

    • mafuyu 9 minutes ago
      The keywords you're missing are color spaces and gamma curves. For a given bandwidth, we want to efficiently allocate color encoding as well as brightness (logarithmically to capture the huge dynamic range of perceptible light). sRGB is one such standard that we've all agreed upon, and output devices all ostensibly shoot for the sRGB target, but may also interpret the signal however they'd like. This is inevitable, to account for the fact that not all output devices are equally capable. HDR is another set of standards that aims to expand the dynamic range, while also pinning those values to actual real-life brightness values. But again, TVs and such may interpret those signals in wildly different ways, as evidenced by the wide range of TVs that claim to have "HDR" support.

      This was probably not the most accurate explanation, but hopefully it's enough to point you in the right direction.
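
      To make this a bit more concrete in tooling terms: converting an HDR (PQ / BT.2020) file down to SDR BT.709 means explicitly undoing and re-applying exactly those transfer curves and primaries. A rough sketch with ffmpeg's zscale/tonemap filters (needs an ffmpeg build with zimg; the exact settings and filenames are illustrative, not a recommendation):

        # linearize the PQ signal, convert primaries to BT.709, tone-map the range down, then re-encode as gamma'd BT.709 SDR
        ffmpeg -i hdr_in.mkv -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" -c:v libx264 -c:a copy sdr_out.mkv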

    • memoriuaysj 4 minutes ago
      YCbCr can also be better or worse for HDR depending on chroma subsampling - 4:2:2 vs 4:4:4.

      If you expand limited-range YCbCr to a large HDR range you'll get a "blurred" output.

      Imagine converting a 1-bit image (0 or 1, black or white pixels) to full-range HDR RGB - it's still black and white.

    • dylan604 31 minutes ago
      > Naturally, we can stick brighter backlights in our monitors to make the difference between light and dark more significant,

      It's actually the opposite that makes the biggest difference with the physical monitor. CRTs always had a residual glow that caused blacks to be grays. It was very hard to get true black on a CRT unless it was off and had been for some time. It wasn't until you could actually have no light from a pixel that black was actually black.

      Sony did a demo when they released their OLED monitors where they had their top model of each monitor type side by side: CRT, LCD, OLED. The CRT was just gray while the OLED was actually black. To the point that I was thinking in my head that surely this was a joke and the OLED wasn't actually on. That's precisely when the narrator said "and just to show that the monitors are all on" as the video switched to a test pattern.

      As for the true question you're getting at, TFA mentions things like the color matrix, primaries, and transfer settings in the file. Depending on those values, the decoder makes decisions about the math used to calculate the output. You can use any of the values on the same video and arrive at different results. Using the wrong ones will make your video look bad, so ensuring your file has the correct values is important.

      From TFA: https://gist.github.com/arch1t3cht/b5b9552633567fa7658deee5a...
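
      If you want to see what a given file actually claims for those, ffprobe can print the tags (input.mkv is a placeholder; fields may simply come back unset, which is its own problem):

        # show the pixel format and color metadata carried by the first video stream
        ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries -of default=noprint_wrappers=1 input.mkv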

    • fenwick67 15 minutes ago
      here you go

      > 10 bits per sample Rec. 2020 uses video levels where the black level is defined as code 64 and the nominal peak is defined as code 940. Codes 0–3 and 1,020–1,023 are used for the timing reference. Codes 4 through 63 provide video data below the black level while codes 941 through 1,019 provide video data above the nominal peak.

      https://en.wikipedia.org/wiki/Rec._2020

      Compare to

      https://en.wikipedia.org/wiki/Rec._709
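
      As a quick worked example of what those numbers mean: in that 10-bit limited-range scheme the nominal level is (code - 64) / (940 - 64), so code 64 is 0.0, code 502 is exactly 0.5, code 940 is 1.0, and codes 941 through 1,019 carry data above the nominal peak, i.e. "whiter than white".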

  • jokoon 1 hour ago
    I wish he talked about avidemux.

    It's a simple tool which is great for many things; it has filters and supports most formats. I think it uses ffmpeg under the hood.

    It's an old tool but it's fine for most things, when ffmpeg is to fastidious to use. ffmpeg is still what I use, but some more complex tasks are just more comfortable with avidemux.

    • dylan604 23 minutes ago
      Some of the simple tools might not be using ffmpeg per se, but the libav* libraries (libavcodec, libavformat, and so on). ffmpeg itself is just a tool built to utilize the functionality of multiple libraries like this.
    • globnomulous 24 minutes ago
      > to fastidious

      Do you mean "too fussy?"

      • dylan604 22 minutes ago
        using a sledge to drive a finishing nail? yes, it's still a hammer and it's still a nail, but still the wrong tool for the job
  • weinzierl 32 minutes ago
    The article talks about image comparisons but does not say what the best way to extract an image is.

    If I want the best possible quality image at a precisely specified time, what would I do?

    Can I increase quality if I have some leeway regarding the time (to use the closest keyframe)?

    Is there a way to "undo" motion blur and get a sharp picture?

    • latexr 9 minutes ago
      I usually use a shortcut in mpv to extract the screenshot. If I want to do it via the command-line:

        ffmpeg -ss 00:00:12.435 -i '/Users/weinzieri/videofile.mp4' -vframes 1 '/Users/weinzieri/image.png'
      
      That means "go to 00:00:12.435 on the file /Users/weinzieri/videofile.mp4 and extract one frame to the file /Users/weinzieri/image.png".
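
      If you have some leeway on the exact time and just want the nearest keyframe (keyframes decode without depending on neighbouring frames, so they often make the cleanest stills), a variation like this should work; treat it as a sketch, not a guarantee:

        ffmpeg -ss 00:00:12 -noaccurate_seek -i '/Users/weinzieri/videofile.mp4' -frames:v 1 '/Users/weinzieri/keyframe.png'
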
    • ErroneousBosh 26 minutes ago
      > Is there a way to "undo" motion blur and get a sharp picture?

      Not really, no, any more than there is a way to unblur something that was shot out of focus.

      You can play clever tricks with motion estimation and neural networks but really all you're getting is a prediction of what it might have been like if the data had really been present.

      Once the information is gone, it's gone.

      • weinzierl 20 minutes ago
        If the estimation is good it might be enough for some use cases. Is there any software out there that specializes in this? Similar to AI colorizing or upscaling, perhaps, which both guess at information that is no longer there.
  • kwar13 2 hours ago
    Pretty good writeup but not sure why VLC is not recommended...?
  • Jabrov 2 hours ago
    What's wrong with VLC?
    • Waterluvian 2 hours ago
      Making such a bold, unsubstantiated claim is a curious item in an otherwise detailed document. I went looking for other explanations and found this gem: https://www.reddit.com/r/mpv/comments/m1sxjo/it_is_better_mp...

      I think it might be one of those classic “everyone should just get good like me” style opinions you find polluting some subject matter communities.

    • dylan604 1 hour ago
      my biggest pet peeve was that VLC was always considered a streamer and treated local files as streams as well. for the longest time, stepping within the video was not possible. reverse play was also a bane, even with i-frame only content. i have long found players that are better for me, but still find myself using VLC frequently because it still has features these other players do not.
  • swiftcoder 2 hours ago
    Really good quickstart guide
    • gruez 1 hour ago
      >Really good quickstart guide

      It really isn't. You have to scroll 75% of the way through the document before it tells you what to actually type in. Everything before that (9000+ words) is just ranty exposition that might be relevant, but is hardly "quick".

      • swiftcoder 49 minutes ago
        Nah, see, I maintain a commercial video platform, and half the battle is people typing things in before they understand what a codec is. Theory first, practice after.
        • dylan604 20 minutes ago
          that's not a quick start guide. not once has the quick start guide that comes with a printer explained the workings of the inkjet nozzle and the ability to precisely control the position of the head. it just says plug it in, hit this button to join wifi, open this app on your device, hit print.
        • wizhi 26 minutes ago
          Do you have any recommendations for literature on the subject of video encoding etc? I really want to learn more theory.
        • fbias 27 minutes ago
          The discussions in this thread are amusing. It’s a pretty great beginner guide. Almost a parallel to “how to ask questions the smart way” applied to videos.
  • g4zj 1 hour ago
    I'm curious what the issue is with using Handbrake? I use it all the time on macOS and it's generally a simple and effective tool for my purposes.
    • xp84 1 hour ago
      Handbrake is fine if you truly need to reencode (aka “transcode”) your video, but if you find yourself with a video that your player can’t read, you might be able to just change the container format (remux it) using ffmpeg, copying the video and audio streams directly across.

      With video there are 3 formats: the video stream itself, the audio stream itself, and the container (only the container is knowable from the extension). The three could technically appear in any combination.

      The video stream especially is costly in CPU to encode, and transcoding can degrade quality significantly, so it's just a shame to re-encode if the original codec is usable.

      Container format mkv is notorious for not being supported out of the box on lots of consumer devices, even if they might have codecs for the audio and video streams it typically contains. (It has cool features geeks like, but for some reason it gets less support.)
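
      For what it's worth, the remux itself is a one-liner (filenames are placeholders, and it only succeeds if the target container actually supports the codecs inside):

        # copy the existing video/audio streams into a new container, no re-encoding
        ffmpeg -i input.mkv -c copy output.mp4

      If that errors out, it's usually the cue that a real transcode (Handbrake territory) is unavoidable.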

    • SG- 1 hour ago
      the author can't stand how it simply re-encodes videos instead of extracting the video tracks and putting them in new containers.
    • dspillett 1 hour ago
      If you search the page you'll find a reference to having “numerous foot guns”.

      I can't say I've experienced either of the ones mentioned, but I have had trouble in the past with output resolution selection (ending up with a larger file than expected with the encoding resolution much larger than the intended display resolution). User error, of course, but that tab is a bit non-obvious so it might be fair to call it a footgun.

  • perching_aix 1 hour ago
    I've had a lot of misconceptions to contend with over the years myself. Maybe this thread is a good opportunity to air the biggest one of those. Additionally, I'll touch on subbing at the end, since the post specifically calls it out.

    My biggest misconception, bar none, was around what a codec is exactly, and how well specified they are. I'd keep hearing downright mythical sounding claims, such as how different hardware and software encoders, and even decoders, produce different quality outputs.

    This sounded absolutely mental to me. I thought that when someone said AVC / H.264, there was some specification somewhere, that was then implemented, and that's it. I could not for the life of me even begin to fathom where differences in quality might seep in. Chief among these was when somebody claimed that single-threaded encoding was superior to multi-threaded encoding. I legitimately considered that I was being messed with, or that the person I was talking to simply didn't know what they were talking about.

    My initial thoughts on this were that okay, maybe there's a specification, and the various codec implementations just "creatively interpret" these. This made intuitive sense to me because "de jure" and "de facto" distinctions are immensely common in the real world, be it for laws, standards, what have you. So I'd start differentiating and going "okay so this is H.264 but <implementation name>". I was pretty happy with this, but eventually, something felt off enough to make me start digging again.

    And then, not even a very long time ago, the mystery unraveled. What the various codec specifications actually describe, and what these codecs actually "are", is the on-disk bitstream format, and how to decode it. Just the decode. Never the encode. This applies to video, image, and sound formats; all lossy media formats. Except for telephony, all these codecs only ever specify the end result and how to decode that, but not the way to get there.

    And so suddenly, the differences between implementations made sense. It isn't that they're flaunting the standard: for the encoding step, there simply isn't one. The various codec implementations are to compete on finding the "best" way to compress information to the same cross-compatibly decode-able bitstream. It is the individual encoders' responsibility to craft a so-called psychovisual or psychoacoustic model, and then build a compute-efficient encoder that can get you the most bang for the buck. This is how you get differences between different hardware and software encoders, and how you can get differences even between single and multi-threaded codepaths of the same encoder. Some of the approaches they chose might simply not work or work well with multi threading.

    One question that escaped me then was how e.g. "HEVC / H.265" can be "more optimal" than "AVC / H.264" if all these standards define is the end result and how to decode that end result. The answer is actually kinda trivial: more features. Literally just more knobs to tweak. These of course introduce some overhead, so the question becomes whether you can reliably beat this overhead to achieve parity or gain efficiency. The OP claims this is not a foregone conclusion, but doesn't substantiate that. In my anecdotal experience, it is: parity or even an efficiency gain is pretty much guaranteed.
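
    If you want to sanity-check any of this yourself, the crude way is to encode the same source with different encoders at the same bitrate and compare each result against the original. ffmpeg's ssim filter is a blunt instrument, but it makes the point. Filenames and the bitrate below are placeholders:

        ffmpeg -i source.y4m -c:v libx264 -b:v 2M h264.mp4
        ffmpeg -i source.y4m -c:v libx265 -b:v 2M h265.mp4
        # first input = distorted, second input = reference; each run prints an SSIM score
        ffmpeg -i h264.mp4 -i source.y4m -lavfi ssim -f null -
        ffmpeg -i h265.mp4 -i source.y4m -lavfi ssim -f null -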

    Finally, I mentioned differences between decoder output quality. That is a bit more boring. It is usually a matter of fault tolerance, and indeed, standards violations, such as supporting a 10 bit format in H.264 when the standard (supposedly, never checked) only specifies 8-bit. And of course, just basic incorrectness / bugs.

    Regarding subbing then, unless you're burning in subs (called hard-subs), all this malarkey about encoding doesn't actually matter. The only thing you really need to know about is subtitle formats and media containers. OP's writing is not really for you.

    • dylan604 10 minutes ago
      I was a DVD programmer for 10 years. There was a defined DVD spec. The problem is that not every DVD device adhered to the spec. Specs contain words like shall/must and other words that can be misinterpreted, and then you have people who build an MVP as a product and don't worry about the more advanced portions of the spec.

      As a specific example, the DVD format had a random-playback feature that could be used. There was one brand of player that had a preset list of random numbers, so every time you played a disc that used random, the "random" order would be exactly the same. This made designing DVD-Video games "interesting", as not all players behaved the same.

      This was when I first became aware that just because there's a spec doesn't mean you can count on it being followed the same way everywhere. As you mentioned, video decoders also play fast and loose with specs. That's why some players cannot decode the 10-bit encodes, as that's an "advanced" feature. Some players could not decode all of the profiles/levels a codec could use according to the spec. Apple's QuickTime Player could not decode the more advanced profiles/levels, just to show that it's not only "small" devs making limited decoders.

  • tmaly 2 hours ago
    This is a great write up. Thank you for sharing.
  • webdevver 2 hours ago
    the video format world is one where you nope out pretty quick once you realize how many moving pieces there are.

    ffmpeg seems ridiculously complicated, but in fact it's amazing the amount of work that happens under the hood when you do

        ffmpeg -i input.mp4 output.webm
    
    and tbh they've made the interface about as smooth as can be given the scope of the problem.
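
    spelled out, an explicit version of that conversion might look something like this (codec defaults vary by ffmpeg version and build, and the quality flags are just a starting point, so treat this as a sketch):

        ffmpeg -i input.mp4 -c:v libvpx-vp9 -crf 32 -b:v 0 -c:a libopus output.webm
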
    • dylan604 1 hour ago
      this complication causing people to nope out has made my career. for everyone that decides it is too complicated and is only the realm of experts, my career has been made that much more secure. sadly, i've worked with plenty of video that has clearly been made by someone that should have "noped out"
  • pandemic_region 1 hour ago
    Could have used this in the nineties, when hunting down a specific codec to play that video you downloaded off a BBS was an actual thing.
    • cruffle_duffle 34 minutes ago
      Oh man, it extended well past the 90s. Finding some weird Windows video codec on a dodgy .ru domain was a time-honored tradition for quite some time.

      I remember all the weird repackaged video codec installers that put mystery goo all over the machine.

      The article bashes VLC but I tell you what… VLC plays just about everything you feed it without complaint. Even horribly corrupt files it will attempt to handle. It might not be perfect by any means but it does almost always work.