
Discussion: 10-bit h264


Remember the transition from XviD to h264? Well, soon, 10-bit h264 will replace 8-bit h264.

Advantages
Without going into encoder speak, 10-bit h264 can achieve the same level of quality as 8-bit h264, but at a smaller filesize.
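The intuition: more bits per sample means finer steps between brightness levels, so smooth gradients (especially in dark scenes) band less, and the encoder wastes fewer bits on dither to hide that banding. A toy sketch of the quantization effect, in Python with NumPy (purely illustrative; not how x264 works internally):

    import numpy as np

    # A smooth, dark luma ramp: the kind of gradient that shows banding.
    ramp = np.linspace(0.10, 0.12, 1920)    # normalized values in [0, 1]

    # Quantize the same ramp at 8-bit and at 10-bit sample depth.
    q8 = np.round(ramp * 255) / 255
    q10 = np.round(ramp * 1023) / 1023

    print("8-bit steps: ", len(np.unique(q8)))    # ~6 coarse bands
    print("10-bit steps:", len(np.unique(q10)))   # ~22 much finer steps

The same gradient gets roughly four times as many representable levels, so each step is far less visible and needs far less dither noise to mask.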

Disadvantages
10-bit h264 is not widely supported yet.

More info on 10-bit h264 | More | More

This is simply a discussion of your opinions on 10-bit h264.

810 comments to Discussion: 10-bit h264

  • Julio

    Can somebody post a 480p 10-bit sample? I want to see how much more CPU is needed compared to 8-bit.

  • Fignuts

    10bit is okay, but with little support as of now, it’s just a curiosity. I’d hold off until there is more universal support for it. Then let ‘er rip!

  • rando

    I’d vote against for a very simple reason: accessibility.

    You’ll lose a lot more people by going 10-bit than you’ll gain at the moment. Reasons range from the minuscule difference in filesize to the fact that there is no way around the CPU problems (even modern laptops and netbooks have problems with 8-bit 720p, due to battery-saving options gimping the CPU, or simply because they use a low-power CPU like the Atom).

    The filesize difference simply isn’t big enough to justify the problems caused to the audience. Wait until at least two more generations of CPUs have passed and people with a year-old laptop can reliably watch your stuff while sitting on the beach with no access to an AC outlet. Then go ahead and change.

    • Although I agree that this doesn’t have to be done immediately, waiting for two more generations of CPUs would be a bit much. For those who are capable of running 10bit, and I believe that’s quite a few people, it would be a straight improvement.

      Maybe there will be some groups that re-release shows in 8bit like those people who do so for the PSP and gaming consoles.

    • adsl

      Those who have trouble playing 720p should just play 480p. Two generations of CPUs won’t be necessary: a simple dual core (my 2-year-old laptop) can play 720p Hi10P, 8-bit 1080p, and even a Blu-ray without stressing the system too much. Those who watch on the go should simply use 480p (in h264 or XviD).

    • glassd

      Unless you’re running a netbook or really old hardware, I don’t see any laptops not being able to run 720p video (especially ones running Sandy Bridge, thanks to Quick Sync). My laptop (Core 2 Duo) can handle 1080p content on battery, even in extreme power-saving mode. Waiting too long could also be bad: if no one uses 10bit, no one will move over to it. People need to be using it and asking for support in order for the majority of devices to support it.

      I believe using it to distribute files on the web is the way to get products to support it.

      • When I talked about my laptop, I meant in power-saving mode too (except for BD playback, which lags a bit in power-saving mode on battery).

        • The extremely obvious question here is WHY people who can reliably watch 8bit 720p should suddenly have to massively downgrade to 480p, or even 480p XviD. What is so revolutionarily awesome about 10bit that makes it worth forcing so many people to lose so much quality?

          Reality is, there is absolutely nothing to 10bit except the bragging rights and small filesize savings, neither of which is worth forcing people to downgrade their experience to 480p.

          • adsl

            Have you read the description of the PC I used to test Hi10P? If not, here it is:

            dual core 1.6? GHz (I’m too lazy to go check if it’s 1.6 or not)
            2GB DDR2 800MHz RAM
            256 MB graphics card
            other specs aren’t needed 😛
            any computer nowadays (except netbooks) has these specs

          • Zdm321

            Actually, my netbook can play 720p Hi10P videos fine.

          • adsl

            That only confirms what I wanted to say: you don’t need a “super computer” to play Hi10P.

          • adsl

            btw what are your netbook specs?

          • Zdm321

            Atom N550 dual core @ 1.5GHz, 1GB RAM, NVIDIA Ion graphics, running Linux, using MPlayer2 with SMPlayer for 10bit playback.

  • I wonder how long till it is fully supported.

  • Classic

    I’m fine with 8-bit h264. Filesize isn’t a problem for me.

    (I have tried 10-bit, but it didn’t turn out well. Guess I need to tweak some settings or get a better comp.)

  • Des

    Wait until it’s more widely supported.

  • tumdedumdedum

    I’ve been doing tests all week; it’s not only Hi10P that came out this last year.
    http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Profiles

    There’s Hi422P and Hi444PP as well. Hi444PP can go up to a 14-bit processing mode.

    While doing tests with Hi444PP in 10-bit mode, I’ve gotten even smaller results than Hi10P at the same CRF, even with the input only being YV12. (I only use CRF for testing, tbh; I encode in 4-pass.)

    In contrast to Hi10P, Hi444PP does use significantly more CPU time than the 8-bit Hi profile, even when only using 10-bit processing.

    But thinking in the long run: since it’s out there, a standard, and implemented (the CCCP from 22 July worked well), fansub groups might as well consider going all the way instead of using Hi10P as a stepping stone. It’s going to be a whinefest either way, so instead of potentially facing two such incidents (today and again a few months from now), why not rip off the band-aid in one go.

    Usagi Drop from Doki, redone in 360p Hi444PP (Predictive).

    DDL:
    http://www.megaupload.com/?d=PMDBZ3NU

    Use cccp beta from 22 july or newer:
    http://www.cccp-project.net/beta/
    Or Mplayer2 from here:
    http://mplayer2.srsfckn.biz/

    The video is 40656 KB. It used up to 16% CPU time on a 4-core P4. The same encode in 8-bit HiP used about a half to a third of that.

    The “uses more CPU time” argument is valid for Hi444PP, but imo that fact makes it worth having duplicate releases: a standard 8-bit HiP one, and a Hi444PP one that does indeed need more processing power.

    • Unlike going from 8-bit to 10-bit, going from 4:2:0 to 4:4:4 is not actually beneficial.

      Also, you’re a terrible encoder, so stop posting about encoding already.

      • tumdedumdedum

        Kid, go play with your Legos.

      • tumdedumdedum

        Augmenting the colourspace, especially when scaling down, is beneficial: inherent artefacts created by dithering don’t need to be reintroduced, hence reducing the bytes needed.

        I’ve been working with video since the NewTek Video Toaster… But really, starting a discussion with a sad excuse for a human being like you isn’t going to amount to anything.

        • My homie D is correct. 10-bit does help: upping the bit depth internally is good for dithering and rounding error.

          Colorspace conversion isn’t always going to help, especially when the original isn’t 4:4:4 either.

          Plus you haven’t seen anything till you try 64bit[ch] encoding; here’s a 40mb 720p episode… compatible with the shiny plastic toys too. http://www.mediafire.com/file/kt3uym020bmljll/Clannad%20%7EAfter%20Story%7E%2007.mp4

          • tumdedumdedum

            When you downscale, yes. Broadcast fansubs are scaled down from 1440×1080, telecined at 60i, to 1280×720 at 24p. Colour information from neighbouring pixels creates averages, i.e. more colours. In simple terms: if you have a yellow and a green pixel next to each other, scaling down gives you a light-green result that may not be reproducible in a YV12 colourspace. If the output then again becomes 4:2:0, you use less chroma info per pixel; in fact 4:2:0 only uses 1 chroma sample per 4 luma samples. Using full YV24 when encoding something scaled down thus (1) reduces artificially introduced pixel colour bleeding and (2) improves gradients. Gradients that were dithered in the source may be smoother in the destination, and a smooth gradient compresses a lot more easily than introduced dither. (See the numeric sketch after this comment.)

            As you can see in this example, the image indicated as “ori” is a scaled-down version of the original, then reverted to a 4:2:0 colourspace.

            The other image is a frame from the 50-meg encode mentioned above, using 4:2:0 input, moved to a 4:4:4 colourspace, scaled down, and then used as the source for x264.

            http://screenshotcomparison.com/comparison/68691

            The original uncompressed scale, put back in YV12, shows more colour bleeding around the buttons on the tree than the encoded version does.
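            A toy version of that chroma averaging, in Python with NumPy (a sketch of the 4:2:0 idea using approximate BT.601 chroma values; not any encoder’s actual resampling filter):

                import numpy as np

                # Approximate BT.601 (Cb, Cr) chroma of two neighbouring pixels.
                yellow = np.array([1.0, 149.0])   # chroma of a pure yellow pixel
                green = np.array([44.0, 21.0])    # chroma of a pure green pixel

                # 4:4:4 keeps one chroma sample per pixel; 4:2:0 keeps one chroma
                # sample per 2x2 block of pixels. Toy version: average two neighbours.
                shared = (yellow + green) / 2
                print("shared 4:2:0 chroma:", shared)   # [22.5, 85.0]: a colour
                                                        # neither pixel had

            Both pixels end up sharing one averaged chroma sample, which is exactly the colour bleeding around sharp edges that the screenshot comparison is about.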

          • Man, you must be playing; the only thing people notice in that screenshot is all your extra artifacts, especially in places that should be flat in color.

            You’re downscaling to a ridiculous resolution and putting artifacts in everything, so all that extra space for the colors isn’t doing much good, is it.

            I stand by my earlier words; your screenshot just proved me and my homie D were right all along.

  • Natsu3

    Just because someone has been fiddling with toys for 20 years doesn’t mean he’s any more knowledgeable about lossy encoding than someone who’s been clicking GKnot buttons for 8 years.
    4-pass encoding is ridiculous, assuming you’re not using some terrible encoder for some antiquated format;
    CRF is not a measurement of perceptual quality, especially not across different profiles; it’s pointless to use it as a comparison basis;
    Hi10P is not a stepping stone; there are several good reasons why the x264 devs didn’t take it further (at least read the documentation before you test something);
    Your encode looks horrible and is in no way to be considered a pro argument for whatever you’re trying to sell;
    Lol @ 640×352 (get over mod16 encoding already);
    Lol2 @ 640×352 (nobody cares about such low resolutions);
    Lol3 @ transcoding lossy audio (unrelated to 10bit video, but it confirms you’re a terrible encoder);
    Lmao @ editing the header to hide the encoding settings.

    But don’t feel obliged to respond to yet another “sad excuse for a human being”, “old fart”.

    • tumdedumdedum

      640×352 is a bug in mkvmerge atm; demux it and see, the h264 inside is 640×360 and set to AR 16:9. Investigate before talking out of your ass.

      CRF is a quick-fix encode, period. Multi-pass takes a lot more time, but the available data is used in a much more efficient way. Passes 3 and 4 only improve this very marginally, but I’ll take any bit of improvement I can get when limiting myself to tiny sizes.

      The x264 devs did take it further. Read up on your facts, windbag.

      My encode is between 50 and 60 meg, which is why I use lossy audio; that is the result I am aiming for. Vorbis is one of the best lossy codecs for low bandwidth.

      Do better with your quick-fix CRF and 50 meg, then we’ll talk.

      The only thing you have been able to prove is your arrogance.

      • The one I gave you above is already better than yours. And it’s 40MiB in HD; that’s HD in 720p. Quit your bitching and either say yes or no to Doki doing 10-bit.

        Heh, that applies to this guy too, though; my bad.

  • First «Blood-C» released above the High@L4.1 AVC profile (the first episode is in High@L5.0, the second is in High@L4.2), now 10bit to replace 8bit?

    Damnit, HDD space is much cheaper than a new CPU and motherboard and memory and Windows, and probably a video card as well.

  • johnc

    lolwut…? There’s no hardware-accelerated decoding for 10-bit h264? Yeah, that’s pretty much a non-starter for HTPCs.

    Will we be forced to use PS/2 peripherals too?

    • adsl

      Depending on what you use, PS/2 can be better than USB.

    • 8it

      Same problem here. I was going to upgrade an old 6-year-old desktop I have lying around (upgrade the RAM and throw in a new video card for video acceleration) to act as my HTPC. No point in that, really, if people are moving towards Hi10P encodes, especially for the 1080p stuff most likely to be released next year.

      I’ll also have to wait for XBMC to support the new shit.

  • Rokudaime

    Jesus Christ… this discussion topic has become HUUUUGE! Over 400 comments!? ಠ_ಠ The heck!? This must be like a record for Doki or something… I personally don’t understand why this is going on so strongly, and being dragged out so much, though.

    • adsl

      because 10bit does not have GPU decoding or whatever, and only a “few” media players support it

      • Bob

        If by “few” you mean all bar CoreAVC, then yes.

        • adsl

          That’s exactly what I wanted to say: only CoreAVC isn’t supporting Hi10P right now, but a patch is coming soon (info on their forums).
          I was against it because only a few players supported it, but now that so many support it, I have only one question: “why not use it?”

          • johnc

            Why not use it? Simple… because it’s absolutely retarded to move video decoding off the GPU and onto the CPU, especially in HTPC applications where heat and fan noise are legitimate concerns. And on that note, I really don’t know what the obsession is with having a thousand hacked-together libraries and programs just to play media files. The Matroska splitter was responsible for much of this, but it appears to be more of a mindset in the fansubbing world, and this is just another step in that wrong direction. If you need to bring several programs and packages together just to get playback, you’re doing it wrong.

            For Windows users it should be this simple: 1) Download MPC-HC. 2) Open the file and hit “Play”. If you’re doing anything more than that, you’re getting hosed.

          • adsl

            Linux users (I am a Linux user too) can use MPlayer2 or VLC (I don’t like VLC, but it plays Hi10P perfectly), and a simple dual core can play 720p Hi10P with 0 problems.

          • adsl is right; I use Linux and it works no problem, but try moving that shit to the shiny plastic toys and… well, it actually crashed my phone, so yeah.

  • mosh

    I believe there is a great PotPlayer build with madVR + LAV Filters that supports Hi10P.
    http://imouto.my/configuring-potplayer-for-gpu-accelerated-video-playback-with-dxva-or-cuda-and-also-high-performance-software-decoding
    Try it out and enjoy.

  • john

    My 8-year-old HTPC (2.2 GHz single core, 1 GB RAM, 64 MB graphics; lol, I know) can play 720p fine, and even 99% of BD 720p plays fine. I don’t see why people would complain, as most would have at least dual-core PCs, which shouldn’t have problems with it. If your PC isn’t up to date, you should settle for anything you can get. Regardless, I still say wait a few months till it’s the norm.

  • bob barker

    I think folks are missing the more obvious point. None of our current machines will do well with the update, and not many have enough cash to drop on a whole new machine to play files that work fine as is.

    The ones saying the update works fine are either flat-out lying or have machines that are new enough to get over the bump.

    After reading all of these comments, I say it’s more likely people will go with cheap memory over a whole new machine PLUS more memory.

    I don’t think it’s reasonable to tell the majority of users to upgrade a whole machine for something like this. Supported or otherwise, it’s a wasted effort when the majority of users won’t be able to play the files comfortably.

    • adsl

      Have you read the description of the PC I used to test Hi10P? If not, here it is:

      dual core 1.6? GHz (I’m too lazy to go check if it’s 1.6 or not)
      2GB DDR2 800MHz RAM
      256 MB graphics card
      other specs aren’t needed
      any computer nowadays (except netbooks) has these specs

      Zdm321
      July 24, 2011 at 7:27 PM

      Actually, my netbook can play 720p Hi10P videos fine.

      adsl
      July 24, 2011 at 7:28 PM

      That only confirms what I wanted to say: you don’t need a “super computer” to play Hi10P.

    • Zdm321

      No, actually. My 2-year-old netbook plays Hi10P fine (dual-core Atom with Ion graphics), as does my 2+-year-old desktop computer (quad-core, but with crappy Intel integrated graphics). MPlayer2 works wonders.

      • johnc

        Sure, it’ll play fine. And personally, I won’t be affected. But how about 1080p on that dual-core Atom? There are probably some people on the cusp there who will be affected. My understanding is that the dual-core Intel Atom 330 is only clocked at 1.6 GHz. Will that be enough to decode 1080p now that you have no access at all to the onboard ION graphics?

        I think there will be a lot of people in the HTPC world who are going to get shafted on 1080p simply because they rely on GPU video decoding. I’m particularly thinking of the lower-end (not OLD) boxes, like the Zotac, etc. It’s not that 10-bit requires much more processing power to decode; it’s the fact that you lose the option to decode in hardware.

        I’m amazed by the people here who say, “It works for me! So who cares???”

        Just the fact that it’s a completely retarded idea should be enough reason to be against it.

        • Bob

          I’m amazed at people who say “it doesn’t work for me” so let’s deny everyone better quality output.

        • adsl

          People who say “It works for me! So who cares???” are douches, but decoding 10-bit only used 2% or 3% more CPU, so people don’t need to worry much about it.

        • Zdm321

          Why would I play 1080p on my netbook? The screen isn’t even 720p!
          But anyway, I have the Atom N550, not the 330.

          I haven’t tested 1080p 10bit playback because I hadn’t seen any 1080p 10bit files until now (Coalgirls’ BMG v2). I’ll test it later tonight for teh lulz.

          The point I was trying to make is that “None of our current machines will do well with the update and …” is totally incorrect.

          • 8bit

            The resolution of a video and the resolution of your screen don’t have to match when playing back a video. The video will just be scaled to fit properly.

            Just like johnc, it’s the 1080p videos that I’m worried about when trying to play them on a low-power HTPC, or even an old desktop that can’t software-decode 1080p but can with the right GPU.

            Then again, for a desktop, I wouldn’t be surprised if NVIDIA or ATI offered a software update for Hi10P decoding. ATI for the longest time couldn’t GPU-decode past L4.1, but after one of the driver revisions, you could decode L5.1 videos fine.

          • Zdm321

            Yes, I know it gets scaled, but what’s the point of playing 1080p on it if it doesn’t even have a 720p screen? 1080p would just be a waste of space.

            Edit: Unsurprisingly, 1080p 10bit doesn’t play back well on a 2-year-old dual-core Atom netbook (too many dropped frames). But like I’ve been saying, 1080p 10bit is overkill for such a small screen, and 720p 10bit is more than enough.

        • Index

          >But how about 1080p on that dual-core atom?

          That makes you look stupid, sir.
          Could you give a smarter reason, please?

          ┐(￣ー￣)┌

          • 8bit

            Dual-core ATOM plus ION(2) is a great platform for HTPC use. Smart enough reason?

          • Index

            >Dual-core ATOM plus ION(2) is a great platform for HTPC use. Smart enough reason?

            A 1024×600 px monitor to play 1920×1080 px video?
            Sorry, sir. Not smart enough.
            =========
            Don’t bother downloading 1080p; you just have to download 720p.

            ┐(￣ー￣)┌

          • 8bit

            >A 1024×600 px monitor to play 1920×1080 px video?
            >Sorry, sir. Not smart enough.
            >=========
            >Don’t bother downloading 1080p; you just have to download 720p.

            Re-read what I posted. HTPC use is what I was talking about.

            Try again, good sir.

          • Index

            >Dual-core ATOM plus ION(2) is a great platform for HTPC use.

            So you said that’s a great platform for HTPC use, but you’re worried that your platform can’t play 1080p?

            That’s not GREAT, then.

            Fyuhhh. I think I’ll rest for a bit, then ┐( ¯3¯)┌

          • 8bit

            >So you said that’s a great platform for HTPC use, but you’re worried that your platform can’t play 1080p?

            Um, I’m not concerned with 1080p playback, but with Hi10P-encoded 1080p videos. I’m not worried, since I don’t run such a platform, but it is a concern for some people.

          • johnc

            Index,

            I don’t think you understand the HTPC / home-entertainment market. You know: people who want to watch video on their TVs, not on their laptops. These systems are GREAT for 1080p, and that’s exactly what they’re marketed for. They accomplish 1080p playback through hardware acceleration, aka “modern technology”. To say they’re somehow deficient is inaccurate.

            It’s retarded for anyone these days to decode MKV in software, unless they have an ancient system and are required to. This new encoding scheme is a step backwards, and 10-bit color was not designed for this purpose, which is why no non-million-dollar video card even supports it.

          • adsl

            Nice thing I have CUDA to help my CPU, then.

          • johnc

            ok… how is CUDA going to help your CPU?

  • odinigh

    Honestly, it’s Holo’s decision; if any of you have a problem with it, ah well.

    If he wants to change his release format, he will, and there’s nothing that can be done about it, unless someone goes through the trouble of re-encoding every release.

    If he doesn’t, he doesn’t.

  • Desuwa

    The problem with the “my 5-year-old computer can’t play it” argument is that it will always be true. No matter when the jump to 10bit is made, someone’s 5-, 10-, or 15-year-old computer won’t be able to play it.

    Now that all the important decoders, save only CoreAVC, support 10bit, there’s no real compelling argument not to switch. While it’s true it takes marginally more processing power, if a netbook can handle 720p, well, I don’t see the argument. It seems like any decent CPU from the past couple of years can handle it, so if you bought something really underpowered that can’t, well, you knew it was a weak CPU when you bought it.

    Current hardware decoders simply won’t ever support 10bit. This will always be true: whether we wait for more “support/faster CPUs/black magic” or make the jump now, current HTPCs and other computers relying on hardware decoding will need to be replaced.

    Most people won’t upgrade their HTPCs until it doesn’t work anymore, so those will always be around. Yes, they will lose out if the CPU can’t handle the load, but, again, waiting years to make the jump won’t put them in a better position.

    • johnc

      wtf… People will have to upgrade their HTPCs? We’re not talking about 5, 10, 15-yo hardware here. We’re talking about HTPCs being sold today and the most powerful APUs available TODAY. People have to throw that out and get a 3 GHz processor generating 60-degree heat to watch 1080p in their living rooms?

      Are people even thinking this through — or do they just assume that everyone is watching 720p on their netbooks, and if they can do it, everyone else can too?

      • Desuwa

        Yes, just like having to replace your DVD player with a BD player no matter how new it was, or replacing your VCR with a DVD player before that.

        While I did engage in a little bit of hyperbole myself, as no one would seriously expect a 10- or 15-year-old computer to keep up, we don’t need to exaggerate everything. If a netbook can play 10bit 720p at a reasonable bitrate, some of the more budget-oriented desktop CPUs ought to be able to handle 1080p without breaking a sweat.

        The incompatibilities with 10bit don’t come from a difference in the processing power required, but from the fact that the hardware is hardwired for certain h.264 profiles and can’t handle anything outside of those.

        We already have cases of encodes that can’t be decoded by DXVA or other hardware decoders, though those have become much less common as the hardware has gotten more flexible, and encoders have changed their settings to more standard ones.

        If we were to wait 5 years to make the jump to 10bit, what would that accomplish? These current HTPCs wouldn’t have gotten any newer, or been replaced for other reasons, since there would be no reason to upgrade, so we’d still have the same problem.

        Now that proper software support is largely here, except for CoreAVC, I don’t see the reason to stick with 8bit when the benefits are obvious, and since we’re talking about fansubbed anime here, most people affected will have the ability to follow an installation guide for the new version of CCCP/mplayer.

        The only reason I can see to wait would be if, by waiting, we could make one larger jump instead of several smaller ones, say to 14- or 16-bit colour, but that is so far in the future that it’s not worth waiting for, since nothing is mastered in more than 10 bits right now.

        • adsl

          I completely agree with you: a lot of free software already supports 10bit, and you don’t need a “super computer” to play it well (looks like netbooks can do it), and of course 5-year-old PCs would have trouble playing it (I’m surprised they handle current 1080p with no problems).

        • johnc

          I guess it’s a matter of what you gain vs. what you lose. For me, hardware acceleration is pretty much a non-negotiable, more on the principle of it because my CPU could certainly run circles around Hi10p 1080p. Whereas 8- vs 10-bit brings very little to the table, comparatively speaking. If we actually had 10-bit color display devices in consumer land, and the variance in color was reasonably noticeable, maybe it’d be worth it. But as it is now, those benefits are nowhere near happening. The only thing this change brings is better compression methods and slightly better video quality because of those compression changes. (I’m presuming one could get just as good video quality with less aggressive compression.)

          At first, reading the thread, I thought it seemed like a good idea… but when I read that it pretty much cancels out acceleration, it just seems like a major step backwards. Not forwards — backwards. If we were talking about a driver update or a patch to the VDPAU libraries or something, I’d be on board. And maybe that is the case. But if it’s physically impossible on the hardware, I think it’s a dumb decision and I even think it’s clear-cut obvious. Advocates are talking about this like it’s the next greatest thing, whereas the benefits seem minimal, if not practically non-existent. Meanwhile the potential cost is to eliminate everyone who relies on hardware acceleration, which is essentially the entire HTPC market.

          • Desuwa

            It’s not going to break hardware acceleration forever. Just until the next generation of video processors that can handle 10bit comes out. It’s not a matter of “hardware acceleration will never work again” or anything like that.

            The benefits, however, are very obvious, even discounting the decrease in file size. Any dark scene, or any scene otherwise prone to banding, sees massive benefits. The only way to get around that with 8bit is to jack up the quality (and therefore file size) to obscene levels, whereas now we can get that same benefit with even smaller files.

            The only major difference with this jump, compared to the previous XviD-to-h.264 jump, is that it’s still the same codec, just used in a different way. People used to have DivX players; should we have avoided h.264 forever for their sakes?

            At some point you have to decide whether it’s worth it to make the logical iterative jump or not. Since the next thing after 10bit h.264 seems far enough out that it’s not worth waiting for, and it looks like the fansubbing community is going to make the jump eventually, it’s now a matter of picking sooner or later. Either way, there will always be those people who were just barely able to handle the old stuff; they will have to upgrade. In fact, since the actual processing power required is not increased by any significant amount, it will only affect computers with very old or very weak CPUs that rely on hardware acceleration.

          • Desuwa

            Also, I should add: for the majority of HTPCs, which are custom-built (lacking any specific numbers, but from my personal experience most HTPCs are custom-built instead of pre-assembled), it will simply be a matter of tossing in a new $50 video card once those support 10bit. The CPU, motherboard, RAM, etc. from the HTPC will still work; it’ll just be a matter of replacing one budget part with a newer model. In fact, it’ll be cheaper than buying a new DVD player to replace my VCR was, way back when.

            Not to mention that that new $50 video card will still be able to handle 8bit h.264, whereas my DVD player could not play any of my old VHS tapes.

          • johnc

            I think you’re being a bit optimistic. One has to put the two images side-by-side, put his face 2 inches from the screen, and then analyze every pixel to find any noticeable difference. Anyone who watches anime for the sake of watching anime — and not to masturbate to picture quality — isn’t going to notice a single thing sitting 10 feet away.

            The disk space savings are nice, but there are already plenty of easy ways to deal with disk space. As far as I know, there is no consumer-level Hi10P video hardware expected to be released even in the near term. So telling people to upgrade seems a bit hollow.

            Of course fansubbers will do whatever they want, and maybe a transition period of releasing both 8- and 10-bit encodes would work. But the whole thing just comes off as e-stroking as each subber tries to out-d!ck the other. Hi10p has been around for years, but out of nowhere now all the fansubbers are falling into line like dominoes, promising something as great as the transition to DVD.

            And I’m telling you… if the fansubbers came out and said that everyone had to install Windows 95 to view their encodes going forward, 90% of the people here would be lined up at Best Buy tomorrow morning looking to buy their copies. It’s probably one of the more conformist communities I’ve ever come across.

          • Desuwa

            Ah, ad hominem, I should have been expecting that.

            Well, the reason “all the fansubbers are falling into line like dominoes” is that a beta CCCP came out with support for 10bit, which makes it easy for the vast majority of people here to play. The reason this community seems, to you, so conformist is simply that fansubbers as a whole have made largely logical changes, and there’s no reason to oppose them. It’s one thing to be different, and it’s another to be different for the sake of being different.

            I think you put far too much weight on the small minority of HTPC users. Most people who watch fansubbed anime do so on a computer, and I’d venture (without any concrete numbers) that more use software solutions than hardware, since hardware acceleration is harder to set up and has other drawbacks.

            For the 10bit Steins;Gate ED encode from Commie, I hardly had to do frame-by-frame comparisons, or lean in close to my monitor, to see the difference. The banding especially, which will, I admit, be masked by distance for the minority of people who watch on a TV (though even then, I believe it can still be noticeable), is very noticeable and jarring for the majority who are using computer monitors. And that’s ignoring the significant gains in file size, which is a lesser benefit.

            Hell, as far as personal experience goes, I know more people who use their full sized desktop PCs with a TV than use HTPCs.

          • Index

            Finally, there’s someone that understands me 😀

  • innerchihiro

    I’m all for it as long as it works on most people’s computers (including my two-year-old MacBook Pro).
    I’d love to see it tested in an ordinary Doki release in addition to an 8bit version.
    I trust you to begin using it when it makes the most sense to you.

  • redcero

    I agree with many others in that, while going to 10-bit is good, we should give it time. Personally, I dislike VLC and MPlayer2. I like using Zoom Player (which I think is nice, although many will think it sucks horribly), and since CoreAVC doesn’t support 10-bit yet, I pretty much have no way to play it.

  • innerchihiro

    Can anyone explain to me how to download and run MPlayer2? Maybe a point in the right direction? I can’t make heads or tails of their website. I have a Mac.

  • adsl

    Download http://www.fileserve.com/file/XYhKWK7
    extract it to the folder you want
    add a shortcut to the .exe file
    add “ -vo direct3d -ass” (without the quotes) to the end of the shortcut’s target path (the space between the .exe and -vo is important)
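    Assuming you extracted it to C:\mplayer2 (that path is only an example), the shortcut’s Target field would end up looking something like:

        "C:\mplayer2\mplayer2.exe" -vo direct3d -ass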

  • sid

    Why don’t you drop XviD and use both 8-bit and 10-bit h264? That way you can move forward and still keep the ‘compatibility’ tag.

    • Chunghwa

      But then all the XviD friends with computers from 1998 will say “WHY U DO DIS? YOU MAKE LIFE DIFFICULT, NOT EVERYBODY UPGRADES THEIR COMPUTAH EVERY 2 YEARS ;_;”. And apparently there are a lot of 1998-friends still around.

    • Zdm321

      We did drop XviD. The only shows we still do XviD for are ones that we originally started in XviD (Qwaser). We aren’t going to switch the SD over to h264, for consistency reasons.

      • ‘Xactly.

        If bandwidth is a concern, I seriously beseech you, my homies, to check out the 40MB HD file I linked above. It’s 8-bit and plays on a 900MHz Celeron. In HD. Just sayin’.

        • Relgoshan

          My Eee 701 confirms this; I know that ~After Story~ shit is all soft tones and low motion, but GODDAMN, WHAT DID YOU DO??

          It reminds me how early encodes like Elfen Lied ep13 sucked piles of CPU, but now we get files like the one you linked up.

  • I just have one question: will Doki always provide 480p for ongoing/current TV projects?

  • Clannad Man

    I just wanted to post in the Epic thread.

    … That is all.

  • Patoriku

    If the software decoders can handle 10-bit on netbooks with 720p screens, but HTPC owners with 1080p big screens rely on hardware decoding, the answer seems obvious:

    720p = 10-bit
    1080p = 8-bit

    You release both formats anyway.

    Just my 2 Yen.

  • Julio

    Ugh, I tried playing a 360p 10-bit video and it’s too much for my computer to handle. There is no way I’m going to support this at all.

    • Zdm321

      How old is your computer, and what kind of processor does it have? 720p 10bit plays fine on my netbook, so I find it hard to believe that any modern desktop/processor would have trouble with 360p 10bit playback. Did you make sure to upgrade to the CCCP beta or MPlayer2?

      • Julio

        My desktop is now 10 years old. The processor is an AMD 1.3 GHz, with 512 MB RAM and 128 MB video. It can handle 720p h.264 videos, depending on the encoding compression. I’m using the latest beta ffdshow with Zoom Player. I will try the CCCP beta and see how it handles things.

        • martinez

          Maybe it’s time to upgrade your comp. My sister has a Celeron M notebook, and you know what?

          It can’t decode FLAC, and it has a very hard time playing 720p MKV.

        • Relgoshan

          VLC works pretty well for me. A big problem with older AMD chips is the lack of next-gen SSE support. Even an ’02 Clawhammer Athlon at the lowest speed would perform much better due to improved instruction decoding.

          I just saw a new netbook on the shelf: dual-core with Win7 Starter, the new Intel hardware video decoder, 1GB total RAM, only $200 USD. Or you could find a 5-year-old PC for about $100 ($0-$50 if it needs a new HDD).
