
Discussion: 10-bit h264

(Pic not related)

Remember the transition from XviD to h264? Well, soon, 10-bit h264 will replace 8-bit h264.

Advantages
Without going into encoder speak, 10-bit h264 can achieve the same level of quality as 8-bit h264, but at a smaller file size.

Disadvantages
10-bit h264 is not widely supported yet.

More info on 10-bit h264 | More | More

This is simply a discussion for your opinions on 10-bit h264.

810 comments to Discussion: 10-bit h264

  • Xcalibur

    It seems like the majority of people prefer to watch via their computer. For those of us who use a media player such as the WD TV to watch on our TVs, Hi10P is a much scarier proposition, in that it renders our players utterly worthless from this point on. Even if it were possible to update the players via firmware, we all know the companies involved won’t bother. Going from my 50″ TV back to my PC to watch anime again would be… less than enjoyable.

    I’m all for progress, but to be fair to the people who would be completely left out in the cold by this change, I think both an 8-bit and a 10-bit version of the MKV encodes should be offered.

    I also have to wonder why, in this age where internet speeds and bandwidth keep improving, file size continues to be such a major issue for so many people.

    • DmonHiro

      But why don’t you just hook up your 50″ to your PC?

      You are asking for double the encoding work for something you are already getting for free. Do you think that’s fair?

      Also, for the last (yeah right) time, the REAL benefit of 10-bit is better quality through higher color accuracy. Less banding is what we want. The lower file size is the effect of the higher color accuracy; it is just a bonus. Sure, it’s a really nice bonus, but the 10-bit switch would happen even if the size stayed the same, for the simple fact that it would decrease color banding. How many more times will I have to explain this?
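
      Here’s a rough way to see why the extra code values matter (a toy Python sketch of the quantization step only, nothing to do with what x264 actually does internally):

        import numpy as np

        # A slow gradient across a 1920-pixel-wide frame, spanning only 10%
        # of full brightness (the kind of sky/shadow ramp where banding shows).
        gradient = np.linspace(0.45, 0.55, 1920)

        steps_8bit = np.unique(np.round(gradient * 255)).size    # distinct 8-bit codes
        steps_10bit = np.unique(np.round(gradient * 1023)).size  # distinct 10-bit codes

        print(steps_8bit, steps_10bit)  # roughly 26 vs 104 distinct code values

      Fewer distinct codes means wider, more visible bands; four times the codes means each step is a quarter of the size.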

      • Guest1

        Forget it, people won’t understand that no matter how many times someone says it (and it is sad).

      • Because some of us don’t WANT our PCs connected to our TVs. They are in different rooms and used for different things. Connecting a PC would solve the problem, but my point remains: why should I accept that a device I bought, which is specifically designed to play this content, is now worthless because of one encoding method change?

        A second and equally important point: have you compared picture quality between a PC and a dedicated media device? Media players are basically a Blu-ray SoC in a box with some custom firmware for codec support. The scaling and de-interlacing are much better than the software-based solutions on the PC. 720p anime encodes look barely passable when viewed on my PC, but they look really good through the WDTV.

        In short, I really like watching via the WDTV and would like to keep doing so if possible.

        • Desuwa

          Maybe you should stop using VLC. Most people prefer software scaling because it allows them to choose the algorithm and the settings. In fact, I actively avoid hardware decoding or processing because it often produces subtly (or not-so-subtly) different output compared to what it should be.

          For instance, despite its inherent ringing, I prefer Lanczos for my upscaling because of the sharpness. I have friends who swear by bicubic because it has absolutely no ringing or aliasing. (A quick comparison sketch is at the end of this comment.)

          While I can’t say I’ve done a side-by-side comparison of a properly configured PC and something pre-assembled like a WDTV, I have compared my PC running DXVA (with tweaked upscaling) against it playing the same video with CoreAVC + ffdshow upscaling, and the difference was noticeable. Not massive, especially if you never looked at them side-by-side, but I preferred the software-decoded result. Some colours, especially reds, looked… odd through DXVA, at least compared with what they should be.

          It also varies between hardware decoders; some may handle some things better or worse than others. Not to mention some encodes simply being incompatible with hardware decoding (not 10-bit, just using settings the hardware doesn’t support).

          In short, software is flexible, offers choice, and does not vary in quality based on what hardware it is run on.
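
          To see the scaler difference for yourself, here is a quick Pillow sketch (the file names are just placeholders; dump a frame from your player first):

            from PIL import Image

            # Upscale the same 720p frame with two different filters and compare.
            src = Image.open("frame_720p.png")
            target = (1920, 1080)

            # Lanczos: sharper, but can ring slightly on hard edges.
            src.resize(target, Image.LANCZOS).save("lanczos.png")

            # Bicubic: softer, without the ringing.
            src.resize(target, Image.BICUBIC).save("bicubic.png")

          Flip between the two outputs in an image viewer and pick whichever you prefer; that choice is exactly what a hardware box takes away from you.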

      • Xcalibur

        Also, I do understand the color argument for 10-bit, but I really don’t buy it. This is in the same category as Deep Color support in the HD spec. The color performance in terms of gradients and dithering is better with 10-bit, which reduces banding. This is only true, however, when the original source material was encoded at 10-bit or higher, and this is *not* the case in our scenario. Nothing is currently authored or mastered at more than 8-bit, including Blu-ray discs. So re-encoding 8-bit to 10-bit only has the effect of reducing file size because of the superior color sampling efficiency; it can’t add color gradients that were not previously there. If there is reduced banding, it could only be because the re-encode technique is helping to mask it through some form of dithering or sub-sampling technique. If it helps, then that is good I suppose, but it can’t add color information that was never there in the first place.

        • Desuwa

          That would be completely true except for the fact that it isn’t, at all.

          Actually, I lied: “it can’t add color gradients that were not previously there” is true. The thing is, that’s not what it is doing. It is preserving the ones that are there. If we were trading raw video back and forth, 10-bit would be pointless, but with compression you get the unintuitive result that generating more colour data before compressing gives you both lower file sizes, as encoders can afford to throw more information away, and better quality, as the decoded video ends up much closer to the original.

          Dithering is used to compensate for not having enough colours. I’m unsure of the exact details, but sometimes encoders apply dithering to their encodes (or something similar, maybe just noise) to mask banding that occurs because of the compression. This helps to preserve the appearance of gradients, at the cost of a lot of extra data that adds to the file size, though still less than simply turning up the quality a lot to get the same apparent result. (There’s a toy sketch of that trade-off at the end of this comment.)

          tl;dr: Be sure you know what you’re talking about, especially if you’re claiming that the file size and quality benefits, which we can already check for ourselves, are lies.
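
          Here’s that toy sketch (pure illustration, not what any particular encoder literally does):

            import numpy as np

            rng = np.random.default_rng(0)
            gradient = np.linspace(0.45, 0.55, 1920)   # a slow, smooth ramp

            # Plain rounding to 8-bit codes: long flat runs = visible bands.
            banded = np.round(gradient * 255)

            # Add up to half a code value of noise before rounding: the bands
            # break up into grain, which looks smoother to the eye, but it is
            # extra "detail" the encoder then has to spend bits on or smear away.
            dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, gradient.size))

            def flat_runs(x):
                # fraction of adjacent pixel pairs that are identical
                return np.mean(x[1:] == x[:-1])

            print(flat_runs(banded), flat_runs(dithered))  # ~0.99 vs much lower

          More bits per channel sidestep the whole problem: the codes are fine enough that you don’t need the noise in the first place.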

          • Xcalibur

            “Lies” is not what I said. I said no benefit, which is not the same meaning at all.

            Also, I agree with you…. I am indeed wrong. That the 10-bit encoding process throws away less color data hadn’t occurred to me for some crazy reason. Your post has been helpful.

    • Anonymous

      Hopefully those boxes will get software updates to allow them to play Hi10P, but I wouldn’t hold my breath if I were you.

      You can’t ask people to hold back because you bought a piece of uncooperative, inflexible hardware. The wonder of computers and the internet is the ability for things to improve on the fly, and every upgrade that can happen should happen as soon as possible.

      • Xcalibur

        I’m not asking anybody to hold back. Progress is good. So is backwards compatibility, another PC hallmark.

        I’m hoping people will be generous and understanding enough to consider scenarios such as mine and continue to offer something for us for at least a while longer.

        • UE

          Dear god, man, stop your incessant whining and re-encode your files… As for video quality, given the image that comes out of my computer, I am just not sure how the hell your system is set up.

    • nagato

      I’m gonna go out on a limb here and guess that even when Doki adopts 10-bit they’ll still offer 8-bit as well. The most logical option would be 10-bit 1080p for perfectionists and 8-bit 720p (with 10-bit if 1080p isn’t a good option) for everyone who either doesn’t care or can’t run 10-bit.

      I say this because Doki donations go up pretty fast and they’re not an obscure group. It wouldn’t make sense to cater only to the vocal minority who want ONLY 10-bit. They’d lose a lot of people that way, and they’re not stupid.

  • c = Em/Bm

    If nobody pioneered the use of this Hi10P profile, then who would add support for 10-bit? When Doki posted this, 10-bit got direct support from the ffv decoder… FYI, this kind of technology was invented years ago, but no one dared to use it until Doki popped out the suggestion… My praise to Doki! Be the first to fully adopt this 10-bit technology and others will surely follow you guys!

  • martinez

    this thread has been running wild 😛

  • Clannad Man

    Since this discussion is so popular, I’m for having this thread take the place of the forums until the forums return.

    Otherwise, we can continue to have a few posters who will rage & rage for the sake of raging. That’s entertaining too.

    Understand that nothing new is being stated at this point, and that the only likely reason this discussion is being bumped again and again is so that Doki can get an idea of how many users they will gain/lose by making the switch. They understand very well the stability of the software as well as the benefits of switching; they’re simply trying to ascertain whether or not switching now, as opposed to in December or next spring, will provide a maximum user bump. I doubt they see the willingness of people to upgrade on their computers to be an issue.

    Issues of primary concern are probably:

    1) Will most users be willing to download the new stable version of CCCP [7-30-11] to get 10-bit compatibility? (Highly Probable)

    2) What percentage of users prefer to watch releases on a home media center [such as WD Live Plus] as opposed to directly off of their computer/off of their computer connected to their TV? (Probably a sizable minority)

    3) Assuming issue two is a sizable minority or greater, what is the likelihood of major media players updating to 10-bit support? (Low to Moderate, posing the risk of a major user exodus)

    4) How much effort/resources will need to be put in for replacing iconic releases? [Ex. Will users demand Clannad/Kanon/Air etc. be re-released to coincide with the superior playback format?] (Moderate to High | Possible for iconic releases, highly unlikely for seasonal shows)

    5) How much installation and troubleshooting support is Doki willing to provide, and will this support prove to be a distraction from more important duties? (Moderate | Yes)

    The weight of the listed factors will doubtless constitute the basis for Doki’s decision. As much as I am sure they would like to accommodate everyone, ultimately they will choose the path that creates the most avenues for future growth; whether that growth stems from casual viewers who demand a highly compatible release or from quality/size aficionados remains to be seen.

    My guess is that Doki begins to incorporate 10-bit h264 by Winter. This would give them adequate time to analyze the likelihood of major media center updates, as well as giving the impression of ‘starting a new year with something new & wonderful’. (Fall is a possibility, but is less likely from what I can tell.) I also predict that for their next BD release, they will decide to release 2 versions: an 8-bit and a 10-bit version in 3 different qualities. Why would they do this, and why with a BD release? People tend to save BD-quality shows to an HD archive, as opposed to seasonal shows, which many people watch once and subsequently delete. (Yes, of course there are also many who archive everything they download, but we’re talking about the average user here.) Therefore, tracking the DL numbers for a popular BD release will provide a more accurate representation of ‘user willingness to convert’.

    Why is this? Many people DL lower-quality, highly compatible releases, watch the show once or twice, and then delete it. However, for a well-done BD show, people are much more likely to permanently save it and opt for the highest quality release. Since 10-bit h264 would be the highest quality release available to them, it is logical to assume most people with compatible software will choose the 10-bit 1080p version over the 8-bit 1080p version. Likewise, those whose software is incompatible with 10-bit will choose the highest quality encode they can play, that being the 8-bit 1080p release. In this way, Doki can determine with minimal error the number of users who can and are willing to use the 10-bit platform over the more common 8-bit platform.

    ‘But Clannad Man, won’t releasing 2 versions of a show be a lot of work? Why can’t they test 10-bit software with 1080p and continue using 8-bit software for 720p?’ The reason two versions of the same quality release should be released is so that numbers are not skewed and then misinterpreted. Releasing only a 10-bit 1080p release and an 8-bit 720p release would push all people with incompatible software to 720p, coupling them with those with compatible software who simply want a smaller release. The only accurate way of determining the demand for one format over another is to release both 10-bit & 8-bit 1080p releases as well as 10-bit & 8-bit 720p releases and track the DL numbers.

    It would also be best to provide both formats in each periodic release. The first few shows released will have people DL the 10-bit version to try it out, though not all of them will stick with it. By the time a BD show’s final episodes are released (or the final batch), the DL numbers should stabilize to the point where they can be considered accurate.

    For the most scientifically accurate method, this process would be preferable to a poll. While a poll would require many users to answer many questions about their preferences to approach accurate results, releasing a BD show as mentioned above would provide exact & accurate numbers to support or discredit the use of future 10-bit releases in both 1080p and 720p format.

  • E = mc^2

    Coalgirls already released in 10-bit format… When Hi10P becomes popular, I’m very sure the hardware player providers will support 10-bit, because they will be forced to if everyone uses it…

    This 10-bit technology has been around since 2005, and with Doki suggesting it and Coalgirls doing it, I believe other fansub groups will consider using 10-bit, as they can deliver the same quality in a SMALLER file size… For those of us who live in countries where internet speed is not very fast, it will be a relief to be able to watch HD with small file sizes 😀

    • nagato

      Downloading a coalgirls file is already like downloading from a third-world country anyway ヽ(´ー`)┌

    • 8bit

      >>I’m very sure the hardware player providers will support 10-bit, because they will be forced to if everyone uses it…

      It’s going to take at least another generation before that happens, at least with set-top boxes like the WDTV or Boxee Box. These things are mostly made to play a wide variety of video, but with HD content they are usually happiest when videos are encoded to Blu-ray spec, since it’s a standard. Hi10P is not currently a standard; right now it’s a fringe encoding method. Anime fansubbers love to be on the bleeding edge.

      AMD and Nvidia could implement hardware 10-bit decoding in their next-gen video cards (or if possible do it via software for the current gen).

  • Glitch

    I personally am very excited about 10-bit becoming more widespread. Why? Because file size is important to me, but so is high quality. Here in Australia we have slow internet connections, and generally also a limit on how much we can download per month. 600MB or so per episode is fine with me – but I want 1080p stuff, and with 8-bit the file sizes are just too darn large.
    I have a WDTV, but I only use it for about 2 weeks every year when I visit my family. The subtitle rendering is just too crappy. I’m buying a new laptop soon, and then I can hook that up to the TV, and my WDTV will not be used again.
    So, overall, I think 10-bit is the way to go, especially for BD rips 🙂

  • HaloGuy

    Hi10P plays perfectly with the latest K-Lite Mega pack. I’m currently using that… no problem at all.
    Also, doing 1080p in Hi10P is surely a splendid idea, as it’ll save a lot of space while giving better quality at the same time. But Doki should reconsider doing 720p in Hi10P too.
    I mean, admit it, what’s the point in watching the same quality (or sometimes less) at a bigger file size? If people whine about codec support, then they should know there’s a thing called an “update” for every piece of software.
    Hi10P has limited support in many codec packs and players for now, so it’s only a matter of time before it gets fully supported. Hopefully, it’ll be within a few months.

  • vrs

    So what in the hell am I supposed to do with the Popcorn Hour A210 that I got to watch anime with? Throw it in the trash? It won’t handle 10-bit encodes, and I don’t believe it can be enabled to do so with a firmware upgrade. Thanks, anime fansubbers.

    • HaloGuy

      It’s just that newer technology replaces older ones… that’s a universal truth 😛

      • Glitch

        That’s exactly right. Anyone who watches anime must know that formats and technologies will change over time – there’s no use staying with the old formats when better ones are available. It’s better to stay on the forefront of change, methinks. Anyway, Doki has announced now that “TV will remain as is, HD and SD h264. Blu-Ray 1080p is Hi10P, 720p and 480p is h264”.
        I’m happy about this decision because 1080p file sizes will be smaller and I can finally start collecting them.
        I think, though, that a lot of the major fansubbers will be changing for next season, or if not it will be the season after that. Nutbladder has already changed, though they are also making 8-bit available.
        I’m getting a new laptop soon myself 😉 So it doesn’t bother me too much.
        And, no, it’s not that I’m super rich and that’s how I can afford these things. In fact, I’m a student who lives on government handouts. I’m living off canned soup, bread, vegemite toast, and whatever fruit is on special, so I can afford my new laptop.

    • Index

      Connect ur WDTV to ur laptop/desktop computer?

      😛

  • Jim

    It’s not a matter of Western Digital “adding” support for 10-bit to existing WDTV players (or other manufacturers adding it to their players).

    The chips come from Sigma with 8-bit colour inputs to the processor, and extra codec support can’t be added to existing chips.

    Sigma has to build new 10-bit chips that are designed to handle 10-bit codecs, and then WD and the other manufacturers have to start building new players around the new chips.

    And, since most graphics cards are still only 8-bit, and many LCD panels are only 6-bit and add dithering to approximate 8-bit colour visually, perhaps I’m missing something, but I fail to see the point in reducing the dithering through a 10-bit encode, only to have the graphics card add some back in and then the display panel add even more. It seems that the actual way to prevent dithering is to have an actual 10-bit card hooked to an actual 10-bit display, before you worry about what’s in the file itself.

  • In order to run Hi10P properly I had to remove MPC-HC + CoreAVC; however, installing the latest CCCP, which came with MPC-HC (MPC-HC was the entire reason I used CoreAVC in the first place, as they seemed to work well together), fixed everything.
    As I have a relatively cheap 24″ Full HD monitor (via DVI), I don’t see any difference between 8-bit and 10-bit.
    What do I vote? 10-bit. It’s a step forward. When I get new hardware, this will keep up as well. Above all, same quality (or better, or so I have been told) for less hard drive space.

  • nagato

    Thanks, Doki, for leaving 8-bit an option for everything under 1080p. Now the people who want absolute top quality have it and no one else gets alienated. Seems like a no-brainer really, though I’m sure people will still find reasons to complain.

  • Guest1

    I just want to say that some programming groups are working on custom firmware for hardware media players to make them able to play 10-bit encodes. When the project starts moving I’ll post the links, but they are only going to do it for a few hardware players.

  • Fenel

    Ehhh.

    Why so early hi10p, why?

    Yes, many codecs can now play Hi10P, but they don’t support DXVA or other hardware decoding methods.
    So, for people with slow CPUs who handle 1080p with DXVA (many HTPC computers), watching a 10-bit MKV is impossible.

    I know many of us said “then watch 720p”, but that isn’t the best solution.

    I think you, Holo, should wait some more time to use 10bit.

    • Guest1

      2GB of RAM + a 1.8GHz dual core (one of the old ones, not the new ones) and 128MB of graphics memory is enough to play 10-bit 1080p; please don’t tell me that the HTPCs you guys talk about are weaker than that 😉

    • Himmelia

      My computer is powered by an Intel Core 2 Duo and a 9400GT, so it’s not a problem for me. My only problem is my mobo… it can’t support a 2TB drive, since these drives are only a year old. So, I’ve been trying to raise enough funds to buy a better mobo.

  • LostLogia4

    If you’ll excuse me, how much is the file size reduction from changing from h264 to Hi10P?

  • Angel

    Requirements:
    10-bit monitor
    Video card with 10-bit support (Nvidia Quadro or AMD FirePro)

  • max

    I use a WD TV Live.

    My reason for using it: I have a PC that could easily play two 10-bit encoded full HD videos at the same time, but I spent a multiple of what the PC is worth on audio equipment, and the one thing I will not do while watching anything on my TV is put up with fan noise in the background. (Actually, when it comes to movies I copy the file from my NAS to a flash drive to avoid the hard disk sounds from the edge of the room.)

    I know it’s possible to build fanless PCs, but they are always inferior in performance compared to PCs using normal air cooling AND far more expensive, and in ‘many’ cases far less stable.

    So as long as no affordable (~200 EUR) standalone player (which also does not fail in other functions ^^) supports it I will not download such releases. (Or delete them right away should I download them without noticing the “flaw” beforehand.)

  • 8bit

    For some people who are worried about playing 10-bit/hi10p videos on shiny, plastic toys – it is possible, if you have one of the more powerful Android phones/tablets on the market.

    MX Video Player (free on the marketplace) can decode Hi10P videos. I tried it out on my phone. It was too slow to decode in real time, but there were no funny visuals going on (blockiness or pink pixels all over the place).

    While it is able to decode it in software, the high number of ref frames fansubbers like to use is going to kill software playback on “shiny, plastic” toys that run Android. It might be better with the next-gen quad-core chips or more powerful dual-core chips coming out next year (I would not be surprised, though, if the Samsung Galaxy S/S2 phones can play back 720p Hi10P videos fine).

  • Ridley

    While I couldn’t care less about file size or 10-bit versus 8-bit for TV releases (and assuming someone is still reading this thread), I would much appreciate leaving the 720p and 1080p Blu-ray versions as 8-bit when you release them.

    For 2 reasons: 1) Blu-rays are 8-bit, so there is no benefit other than file size in making them 10-bit. 2) Because Blu-rays are 8-bit, I cannot author 10-bit files onto a BD, which is my main method of archiving until a licensed NA release is available.

  • 10bit_sucks

    Thank you… FOR BREAKING COMPATIBILITY WITH ALL HARDWARE PLAYERS (WD TV Live, Boxee Box, Popcorn Hour, etc., plus the whole range of TVs with integrated MKV support) ON THE MARKET. There is no system-on-chip (SoC) solution with Hi10P profile support, and there won’t be any any time soon (if ever…). Good quote from mpclub.com: “Some dude stated ~ wow, look, a New widely unsuported x264 Profile, lets use it ~ and more and more groups following the monkeys call.”
    I’m pissed off at how irresponsible some people can be.

  • W. Anton

    Some groups releasing only 10 bit versions was a stupid call to make.

    The anime scene always seems to do this.
    First it was warp frames over 1 with XviD, breaking hardware compatibility with DivX players.
    Then it was a high number of ref frames (over the Blu-ray spec).
    Now it’s this stupidity of releasing using only the High 10 profile.

    If you want to release RealMedia files, what do I care, but release hardware-compatible files as well.

    I used to be an encoder for a fansub group, and I know that when you compare still frames against still frames you can get a little bit crazy.
    It’s like, wow, this new encoding parameter makes these pixels look so much better, let’s use it even if it takes twice the CPU power for decoding.

    Sure, there is less banding with 10-bit encodes, but most 720p h264 encodes are already so good that in 99.9% of cases you can’t see or don’t care about the difference if there is movement on the screen.

    The question is not whether 10-bit looks better and compresses better, because it does. It helps some scenes more than others.

    Blu-rays are already encoded in 8-bit, so encoding those in 10-bit is kind of stupid, because what you get out of 10-bit is probably just more compression.
    Someone might argue that, well, after filtering, if the filtering is done in 10-bit, blah blah.

    Sure, if the source material is encoded in 10-bit, or if the source is MPEG-2, which has better gradients, you can get a much nicer-looking encode. But for f’s sake, release 8-bit as well.

    I’d like to know how many complaints anime groups got about banding issues when there were no 10-bit encodes at all.
    I bet the amount of complaints was really low.
    That would show how many people actually care about the whole banding issue with 8-bit h264.
    And how many of those actually want to lose hardware compatibility.

    Maybe groups should release 1080p 10-bit for those guys who REALLY care about quality and leave 720p as 8-bit, like I read Doki is planning to do.

  • Kinoko

    While there are 8-bit encodes that don’t have serious banding issues, others do. For example the encode of Steins;Gate on Crunchyroll has the most horrid banding I’ve seen in a long time. If selecting “Hi10p” is the idiot-proof solution to banding, then I’m all for it.

    If nothing else, it’s great that it reduces the size significantly. I’ve been getting so fed up with these 500+ MB episodes. Ironically, the quality craze caused this problem, and it will likely be the same thing that ends up helping to solve it.

  • W. Anton

    So you think Crunchyroll is going to switch to the Hi10P profile?
    It’s a streaming service; by that I mean it has nothing to do with the 10-bit encodes done by the anime fansub scene. Maybe if Adobe Flash Player starts supporting 10-bit h264. But there will probably be a new compression standard out, like HEVC/H.265, before that happens 🙂
    So dream on, dream on 🙂

  • Kinoko

    I didn’t say anything about expecting anything from CR. It was just an example of something encoded in h264 that looks like garbage. Maybe you should read over posts twice before responding.

  • Daniel

    10 bit…. 10 bit!!!!!! 10 bit??????? 10 bit bahhahahahahahah 10 bit ALLL THE WAYYYYYYYYYYYYY

  • W. Anton

    Kinoko, OK, you lost me here. What was the point of bringing up that Crunchyroll crap then? I can make a bad h264 encode just by encoding from a bad source. 10-bit is not going to help with bad source material.
    Did the anime scene’s Steins;Gate encodes have bad banding? Well, did they?

    Sorry, I didn’t mean to be mean, but your messages don’t make sense.

    The whole question here is not about whether something looks bad in a given encode; it’s about whether something looks bad because of 8-bit h264.

    Sure, you can make smaller files with this 10-bit High profile that is largely unsupported, at least by hardware vendors. But is it worth losing hardware compatibility, and in some places software compatibility?
    I think not.

    I’m fed up with these side-by-side still picture comparisons. When the video is moving it’s very, I mean extremely, hard to see any differences between properly encoded 8-bit and 10-bit, at least with the LCD panels I’m using.

    • Kinoko

      I highly doubt the source that CR used for Steins;Gate looked like total shit in the first place. The BD rips look great, after all. Any original source that had terrible banding would be thrown in the garbage and people would get fired. I don’t know how you could think the original source would be the cause.

      There are plenty of people who only know how to avoid massive banding by cranking up the bitrate. If 10-bit can make it possible for these people to make encodes that look at least decent without just cranking the bitrate, then that would be great. The smaller file size is simply an extra bonus.

      As far as breaking hardware compatibility goes, suck it up. When h264 came out I had to mostly stop using my modded xbox to play video because it isn’t powerful enough to handle h264. It sucked, but I dealt with it.

      You are right that PROPERLY encoded 8-bit and 10-bit videos look the same, but not everyone can do things properly. The world is full of failure. I haven’t tested this myself, since my computer sucks too much to make encoding h264 very practical, but from what I understand, if a video has a lot of banding when encoded in 8-bit, the same video encoded in 10-bit with the rest of the settings identical will show significantly less banding. This is what I meant by “idiot-proof solution.”

      Honestly I don’t know what has been so confusing. I think maybe you’re replaying things that others have said in your head and having a whole argument play out. Don’t get so wrapped up in frustration, because you’re adding things that weren’t said and as a result not even understanding what was actually said. Sorry to have to be a dick about it.
