08.20.05
2001 and Monster Cables
I’ve been reading Arthur C. Clarke’s “2001: A Space Odyssey” series (2001, 2010, 2061, 3001) lately, so I thought “what the hell” and decided to watch the movie of 2001 again. Well, my opinion hasn’t changed. Compared to the book, it’s totally and utterly boring. It does absolutely nothing aside from self-indulgent effects. When jogging around a spaceship is the most interesting part of a movie that runs over two hours, that really does say something about its quality. Reading the book first makes it slightly better (you can then understand the huge gobs of stuff that come from left field, which the book explains but the movie doesn’t), but it’s still a rather painful experience. Definitely on my “don’t bother” list.
I will say that one scene gave me a chuckle for the wrong reasons: HAL’s monitor screen displays “OMG” at one stage.
Changing subject totally – I went to Radio Shack to buy a DVI cable the other day for my flat panel monitor and saw the $100 offering from Monster for a single channel cable with the following recommendation:
Most ordinary DVI cables are prone to impedance variances and loss of signal strength, while poor shielding causes the cable to radiate radio frequency and electromagnetic interference out to the rest of your system. As a result, you’ll notice blurred images, and a loss of color fidelity and image detail.
I can’t help it. I have to laugh at exactly what sort of morons they think their customers are. It’s a DIGITAL video interface (you know, what DVI stands for?). Short of outright static, you don’t get signal loss. You don’t get blurred images. You don’t lose color fidelity. A bit is either ONE or ZERO; it’s not somewhere in between. I guess if you’re not technical you could be fooled, but wow, someone sue them for false advertising already!
Best Buy got my business – they had a perfectly good dual channel DVI cable for $30. I have perfect picture, zero blurring and zero color fading.
Fred Flintstone Said:
May 25, 2006 at 7:17 am
Hi,
I was just reading through your opinion on how binary signals work. You seem misinformed:
All electricity is analog. A perfect digital signal is only theoretical; in practice it’s just a voltage that jumps between two values (like 0 V and 5 V). In reality the signal might reach 4.98 V on high and 0.1 V on low. A cable with the lowest possible resistance stops the cable being the weakest link. Another problem is the rise time of the pulse. A real digital signal doesn’t go from low to high in 0 seconds; that would be impossible because of the inductance and capacitance in the cable. Again, the cable plays a role here. And interference? Why do you think there are optical cables? They’re immune to electrical interference; electrical cables aren’t. The more interference a cable picks up, the less likely the receiver at the end will read the correct value (low or high), which results in either an error or time wasted on error correction.
Does this help?
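[Editor’s note: the rise-time point above is real physics, whatever it means for cable marketing. A cable’s capacitance together with the driver’s source resistance forms a first-order low-pass filter, so edges take finite time. A minimal sketch of that RC model; the 5 V swing and 1 ns time constant are illustrative values, not DVI figures:

```python
import math

def rc_step(t, v_high=5.0, tau=1.0e-9):
    """Voltage on an RC-loaded line after an ideal 0 -> v_high step at t = 0."""
    return v_high * (1.0 - math.exp(-t / tau))

# The 10%-90% rise time of a first-order RC edge is tau * ln(9), about 2.2 * tau.
t_rise = 1.0e-9 * math.log(9.0)
print(round(t_rise * 1e9, 2), "ns")  # ~2.2 ns for tau = 1 ns
```

A worse cable raises the capacitance, hence tau, hence the rise time; the receiver still reads the bit correctly so long as the edge settles before the sampling point.]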
jw Said:
May 28, 2006 at 1:21 am
Actually, I’m very well informed on how electrical signals work – especially binary ones. An honors degree in Electrical Engineering majoring in Computer Systems teaches you a decent amount, especially when you have to design digital circuits that deal with exactly this sort of thing. Digital signals are typically *very* analog in nature, but all that matters is whether they are above or below a cutoff (1.5-2 V is typical for non-differential logic, though in reality if your signal is even approaching the cutoff you’ve got big problems).
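The cutoff idea can be put in a few lines of toy code. The voltages, noise level, and 1.75 V threshold here are illustrative, not any particular logic family:

```python
import random

def transmit(bits, high=5.0, low=0.0, noise=0.4):
    """Model a 'digital' line: nominally 0 V / 5 V, but with analog noise added."""
    rng = random.Random(42)
    return [(high if b else low) + rng.uniform(-noise, noise) for b in bits]

def receive(voltages, threshold=1.75):
    """The receiver only cares which side of the cutoff each sample lands on."""
    return [1 if v > threshold else 0 for v in voltages]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert receive(transmit(bits)) == bits  # noise well clear of the cutoff: recovered exactly
```

Every transmitted voltage is “wrong” in analog terms, yet the recovered bits are perfect – which is exactly why a merely adequate cable gives the same picture as a $100 one.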
In terms of a DVI link, the concept of “blurred images” or “loss of color strength” is blatantly ridiculous. A digital signal doesn’t get “blurred” or “lose strength” – it’s either good or bad, 1 or 0, correct or scrambled, picture or static. Digital signals degrade to static, not to blurs or faded colors.
If you really want to get technical though, DVI uses an electrically balanced differential coding scheme, which nominally transmits at +/- 4.0 volts and registers a signal at +/- 0.5 volts on receiving. Rise/fall times can each be up to 40% of the bit period, and sampling is done at the nominal peak of the digital waveform. No error correction is done on a DVI link – if it fails, you get static. No wasted time.
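The differential scheme sketches the same way. The +/- 4.0 V swing and +/- 0.5 V sensitivity are the figures quoted above; the decoder itself is a toy model, not the actual TMDS receiver:

```python
def diff_decode(pairs, sensitivity=0.5):
    """Differential receiver: a bit registers on the sign of (D+ - D-),
    provided the magnitude exceeds the sensitivity; anything weaker is
    indeterminate and shows up as static on screen, never as blur."""
    out = []
    for d_plus, d_minus in pairs:
        delta = d_plus - d_minus
        if abs(delta) < sensitivity:
            out.append(None)  # indeterminate symbol -> static
        else:
            out.append(1 if delta > 0 else 0)
    return out

# Nominal swing, an attenuated-but-in-spec sample, and an out-of-spec one:
samples = [(4.0, -4.0), (-1.0, 1.0), (0.6, -0.6), (0.2, -0.1)]
print(diff_decode(samples))  # [1, 0, 1, None]
```

Note that the attenuated sample decodes identically to the full-swing one; a cable only matters once it drags the difference under the receiver’s sensitivity, at which point you get garbage, not a softer picture.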
So, my point stands. Monster Cable’s advertising is blatantly false. While an analog signal may exhibit ghosting, blurring, or fading from poor-quality cables, a digital signal will remain perfect until your cable becomes *so* bad that you start getting static. As long as your cable is in spec, you’re good to go.
DVI spec is here if you’re interested: http://www.ddwg.org/lib/dvi_10.pdf