He talks as if he actually understood HDMI cable manufacturing and test specifications. Before arguing about whether there is any effect, one should first read the HDMI white paper and the measurement guides published by the test-equipment vendors. A small difference does not mean no difference.
HDMI is, at the physical layer, an analog signal: a digital bit stream is modulated onto an analog waveform. Think of Wi-Fi: it might seem digital, but the radio signal is an analog waveform. HDMI is not Morse code; to pack billions of bits per second over each wire, a modulation scheme is used, and bits can most definitely be lost in the cable. This data loss is measured as the Bit Error Rate (BER).

HDMI carries video data over three separate twisted pairs (R, G and B, or Y, Cb and Cr), with a fourth twisted pair carrying the clock. What is interesting is that each pixel is composed of information from all three differential pairs, so when a bit error occurs it is nearly impossible to see. For example, in Y Cb Cr, if one bit of Cr is lost, the luminance of the pixel is unchanged and only a small portion of one color component shifts slightly. Such an error is almost invisible unless the error rate is very large.

Certified HDMI cables must pass compliance testing that requires a Bit Error Rate many, many times lower than what the best human eye could perceive under the best possible conditions. Uncertified, low-quality cables are merely "good enough," and higher bit error rates can occur. So the answer to your question is: there DEFINITELY is a difference between one HDMI cable and another, but you may have difficulty seeing it. If you see sparkles or horizontal lines on your TV, try a better HDMI cable.
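To make the Cr example concrete, here is a minimal sketch (not from the original post) that flips one low-order bit of the Cr component and compares the reconstructed RGB values. The BT.601 full-range conversion matrix and the sample pixel values are my own assumptions, chosen only for illustration:

def ycbcr_to_rgb(y, cb, cr):
    # BT.601 full-range YCbCr -> RGB, all values 0-255
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return tuple(max(0, min(255, round(v))) for v in (r, g, b))

y, cb, cr = 120, 100, 140          # an arbitrary example pixel
corrupted_cr = cr ^ 0b00000001     # single-bit error in the least significant bit of Cr

print(ycbcr_to_rgb(y, cb, cr))            # original pixel
print(ycbcr_to_rgb(y, cb, corrupted_cr))  # pixel after the bit error

The luminance Y is untouched, and R and G each move by at most about one code value, which is far below what the eye can notice in a single pixel of a moving image; only when the error rate gets large do artifacts like sparkles become visible.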