Friday, August 3, 2007

The Truth about Upconverting DVD Players

HD and related technologies aren't that well understood by the general public. We've all seen the hype about "true HD" displays and about "upconverting/HD compatible" DVD players, and not many people care about or understand what these things mean to their viewing experience. I had a friend recommend that I get a 1080i compatible DVD player instead of getting BD or HD-DVD right now while I wait out the format wars. He claimed his DVD picture was as good as HDTV. I've read advice that says that for fixed panel displays, upconverting DVD players match the native pixel count of the display, and thus provide a better picture. I'm here to say this:
all of the above is bull. Whether we're talking about interlaced vs. progressive, or upscaling DVD players, it's all fancy marketing meant to make you think you're getting a better picture. The truth is that both rely on real-time interpolation, and it's a question of which device has the best algorithm.

What is interpolation?
Interpolation is any method of guessing at information you don't have. For example, an interlaced signal shows you all the even lines of one frame, then all the odd lines of the next. On a CRT, the phosphor persistence (the time a pixel stays bright) plus the human visual phenomenon called persistence of vision effectively blur the picture so that we don't notice we're only seeing half the data in any given frame.

Fixed-resolution panels (LCD, PDP) are different. Each has a native resolution (usually 1920x1080 or 1280x720 pixels), and what you see on the screen is in that resolution, ALWAYS. For example, I have a 1920x1080 display. When I view broadcast 1080i HD from my cable box, somewhere along the path from where the cable enters the box to where the image gets rendered on the screen, the picture has been converted to 1080p. Period. The cable box could do the conversion, or the TV could. In my case, the TV does it: it deinterlaces. It separates the even and odd lines of the picture, guesses at the missing pixels, and shows you the two reconstructed frames back to back.

What about native 720p content from my cable box? In this case, my cable box is set to upconvert it to a 1080i signal (so my TV isn't constantly switching modes from channel to channel). How does it do this? Interpolation. It takes the pixels it has, spaces them further apart (with unknown pixels in between), then guesses at the unknowns. The 1080i signal reaches the TV, where it is deinterlaced (again by interpolation) to produce my 1080p picture. DVDs? Same thing. A regular DVD player outputs 480p, which my TV upconverts to display at 1920x1080.
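The line-guessing step of deinterlacing can be sketched in a few lines of code. This is a toy model with made-up brightness values and a hypothetical function name, not any real deinterlacing API; actual TVs use far more elaborate (often motion-adaptive) versions of the same idea.

```python
# A minimal sketch of "bob"-style deinterlacing: rebuild a full frame from
# one field (the even scanlines) by interpolating the missing odd lines.

def deinterlace_even_field(field):
    """field[i] holds scanline 2*i of the original frame.
    Missing odd lines are guessed as the average of their neighbors."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)                          # known even line
        if i + 1 < len(field):
            below = field[i + 1]
            # guess the odd line between two known even lines
            frame.append([(a + b) / 2 for a, b in zip(line, below)])
        else:
            frame.append(list(line))                # last line: just repeat
    return frame

even_field = [[0, 0, 0], [100, 100, 100]]           # scanlines 0 and 2
print(deinterlace_even_field(even_field))           # guessed middle line is 50.0s
```

Note that the guessed line is pure invention by the algorithm; the real odd scanline may have looked nothing like the average of its neighbors.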

Wrong Focus in the Consumer Market
So while people worry about upconverting DVD players giving a better picture, they don't realize that their TVs do the same scaling themselves. What they should be asking is HOW each device does its interpolation. There are various interpolation algorithms (zero-order, bilinear, bicubic, and all sorts of proprietary motion-adaptive ones), and consumers should want to know how each one works, along with side-by-side comparisons. This comes up with my regular old DVDs: do I let the PS3 playing the DVD do the scaling/upconverting to 1080p, or do I let my Sony TV? The answer is "whichever one is better at it," but at this point such comparisons are all but impossible for consumers to find. You might think that since both are Sony products they'd behave the same, but the truth is there's no way of knowing short of a side-by-side test. I suspect the PS3 has more raw DSP capability, and my own tests confirm that letting the PS3 do the upconversion gives visibly better picture quality.
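For the curious, here is roughly what the two simplest families (zero-order and linear) do to a single scanline upscaled 2x. The scanline values are invented for illustration, and real scalers work in two dimensions with fancier kernels like bicubic, but the core guesswork looks like this.

```python
# Toy 1-D upscalers: zero-order hold vs. linear interpolation.

def upscale_nearest(line, factor=2):
    """Zero-order hold: repeat each known pixel (blocky result)."""
    return [p for p in line for _ in range(factor)]

def upscale_linear(line, factor=2):
    """Linear interpolation: blend between neighboring known pixels."""
    out = []
    for i, p in enumerate(line):
        nxt = line[i + 1] if i + 1 < len(line) else p
        for k in range(factor):
            t = k / factor
            out.append(p * (1 - t) + nxt * t)   # weighted average
    return out

scanline = [0, 100, 0]
print(upscale_nearest(scanline))   # → [0, 0, 100, 100, 0, 0]
print(upscale_linear(scanline))    # → [0.0, 50.0, 100.0, 50.0, 0.0, 0.0]
```

The nearest-neighbor version keeps hard edges; the linear version smooths them. Neither is "right": they are just different guesses at pixels that were never there.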

The bottom line is this: a fundamental result of information theory says that no algorithm can recover 100% of the missing information for all inputs. A particular algorithm might do really well on some inputs, but it MUST fare horribly on others. It all comes down to which devices have the best algorithms for the kind of signal we feed them, namely moving pictures. And that is something you have to judge with your own two eyes.
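That claim has a quick pigeonhole-style sketch, shown here with toy frames: if two different frames produce the identical interlaced field, then no deinterlacer, however clever, can reconstruct both of them correctly.

```python
# Two distinct frames that yield the exact same even field. Any algorithm
# that reconstructs one of them perfectly is guaranteed to get the other wrong.

def even_field(frame):
    """Keep only the even-numbered scanlines, as an interlaced signal does."""
    return [line for i, line in enumerate(frame) if i % 2 == 0]

frame_a = [[10, 10], [10, 10], [10, 10], [10, 10]]   # flat gray frame
frame_b = [[10, 10], [99, 99], [10, 10], [99, 99]]   # striped frame

assert frame_a != frame_b
assert even_field(frame_a) == even_field(frame_b)    # identical fields!
```

Since both originals collapse to the same field, the missing information is genuinely gone; the deinterlacer can only guess which one it was.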


Poseidon said...

Excellent advice, Raj! You're spot on with your data too; good to know I'm not the only one out there wondering why this is such a convoluted topic.

A more personally interesting topic, once yours is understood, is the best way to STORE that data. I have ripped most of our DVDs (the lifespan of an unscratched DVD is short in our household with three young boys) and have played around with both x264/AAC (more CPU-intensive for playback) and MPEG-4 via Xvid (not quite as good, but DivX is probably more pervasive these days than x264) encoding.

But I don't want to wish I'd chosen a different encoding algorithm down the line (I'm still happy about ripping my CDs to MP3s, though of course it would have been better to wait for AAC).

So to limit the amount of interpolation, you want to use a sane compression algorithm; less lossy means less "interpolation" in the end. For DVDs, maybe just leave them MPEG2 (which is STILL compressed, and not too bad overall if you think about it). And Blu-ray, ugh, gonna need more hard drive space...


Raj said...

I know this is months after your comment, but better late than never :-)

I'm usually interested in fidelity when it comes to video. I don't mind compression and loss in my audio; I'm no expert, and I don't have any kind of sound system capable of high-fidelity playback.

All of the content we consume at home that can be stored for later is already compressed: cable TV, OTA digital, and DVD are all based on MPEG2 standards, and Blu-ray supports MPEG2 compression in addition to H.264 and Microsoft's VC-1. Nothing comes down the pipes fully uncompressed.

So, in the interest of fidelity, I never re-encode any of the above formats. If I rip a DVD, I leave it in the original MPEG-PS format. If I record OTA HD or shows from my cable box, I leave them in MPEG-TS. If I were to rip a Blu-ray (I haven't done that yet), I'd leave it in H.264 or whatever it came in.

You gain fidelity and save the time of re-encoding. The tradeoff, of course, is much heavier storage requirements.
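The generation-loss argument can be made concrete with a toy model. The "codec" here is plain coarse quantization, not a real MPEG encoder, but the moral carries over: a straight copy of the stream is exact, while each lossy re-encode throws away more detail.

```python
# Toy generation loss: copying is lossless, repeated lossy encodes are not.

def lossy_pass(samples, step):
    """Quantize samples to multiples of `step`, a stand-in for a lossy encode."""
    return [round(s / step) * step for s in samples]

original = list(range(20))          # stand-in for raw sample values
copied = list(original)             # a straight rip: bits copied, zero loss

reencoded = original
for step in (7, 5, 3):              # three generations through different "codecs"
    reencoded = lossy_pass(reencoded, step)

copy_error = sum(abs(a - b) for a, b in zip(original, copied))
gen_error = sum(abs(a - b) for a, b in zip(original, reencoded))
print(copy_error)                   # 0: copying loses nothing
print(gen_error)                    # > 0: that detail is gone for good
```

This is exactly why leaving a rip in its original MPEG-PS/MPEG-TS/H.264 form preserves fidelity: the one lossy encode the studio or broadcaster already did is the only one the material ever suffers.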

For my money, though, having the original-quality material is worth the cost of a few terabyte-sized drives. They go for around $100 these days, so I deem it worthwhile to pay up for storage and keep full-quality video.
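For anyone who wants to check the math, here is the back-of-the-envelope version, assuming a dual-layer DVD-9 at roughly 8.5 GB per disc and the drive makers' decimal definition of a terabyte:

```python
# How many untouched DVD rips fit on one terabyte drive?

DVD9_GB = 8.5                  # approximate capacity of a dual-layer DVD-9
DRIVE_BYTES = 10**12           # marketing terabyte: 10^12 bytes
drive_gb = DRIVE_BYTES / 10**9 # 1000 decimal gigabytes

discs_per_drive = int(drive_gb // DVD9_GB)
print(discs_per_drive)         # → 117 full-quality rips per drive
```

At roughly $100 per drive, that works out to well under a dollar of storage per movie kept at full quality.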

That's my two cents.