AQ Diamond Ethernet

Good to eat humble pie when it deserves to be eaten!

I know this has been done to death, but can anyone point me to a convincing article/blog/post which explains the science/theory/thinking/explanation behind how it is possible that digital Ethernet cables can produce different sounds? I have spent a few hours googling this, and the weight of 'opinion' is that digital is 1s and 0s: the packets arrive or they don't, a semi-decent Ethernet cable will self-correct for jitter, and if there is jitter it will manifest itself as skipping or something similar, not as an improvement in the sound that is received and produced. We can ignore 'quality' for this purpose. My IT mate can't get his head around it. He has not blind tested, but I and another mate have, and we both picked it 4/4 times, taking turns to switch the cables. So SOMETHING is going on, and I wonder whether theory needs to catch up to reality. I promise we didn't eat dodgy cookies :rolleyes:
 
Ethernet cables carry an analog signal that represents a digital 1 or 0. The receiver needs to turn the analog signal back into digital. If the cable degrades the analog signal, then the point where it should change from a 1 to a 0, or a 0 to a 1, can shift. This would mean there is now a slight timing error in the reconstructed digital signal. At least that is my hypothesis. :)
 
Thanks! That is what I thought. But I read that if the packet of 1s and 0s does not arrive precisely in the right sequence then either the packet gets re-sent or there is an audible skip/glitch.
 
It does arrive in the right sequence. Just because the timing between bits is off does not mean the bit sequence is incorrect, or that the CRC check will fail.
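The point above can be illustrated with a toy simulation (not a model of any real PHY; the bit period, jitter amount and sampling scheme below are made up for illustration): even when every transition edge is displaced by random jitter, sampling at the centre of each bit cell recovers exactly the same bits, so the CRC over the recovered data is identical.

```python
import random
import zlib

# Toy model: each bit is a voltage level held for one bit period T,
# and every transition edge is shifted by random jitter. As long as
# the jitter stays well under half a bit period, a receiver sampling
# at the centre of each bit cell recovers the exact same bits.

random.seed(1)
T = 1.0                      # bit period (arbitrary units)
jitter = 0.2 * T             # worst-case edge displacement (< T/2)
bits = [random.randint(0, 1) for _ in range(1000)]

# Edge positions: nominally i*T, each displaced by up to +/- jitter
edges = [i * T + random.uniform(-jitter, jitter) for i in range(len(bits) + 1)]
edges[0], edges[-1] = 0.0, len(bits) * T   # pin the two ends

def level_at(t):
    """Voltage level of the jittered waveform at time t."""
    for i in range(len(bits)):
        if edges[i] <= t < edges[i + 1]:
            return bits[i]
    return bits[-1]

# Receiver samples at the centre of each nominal bit cell
recovered = [level_at(i * T + T / 2) for i in range(len(bits))]

def crc_of(bitlist):
    """Pack the bits into bytes and take a CRC-32 over them."""
    data = bytes(int(''.join(map(str, bitlist[i:i + 8])), 2)
                 for i in range(0, len(bitlist), 8))
    return zlib.crc32(data)

assert recovered == bits                      # sequence unchanged
assert crc_of(recovered) == crc_of(bits)      # CRC unchanged
```

The jitter only becomes a data problem once an edge wanders more than half a bit period, at which point a sample lands in the wrong cell and the CRC check fails, so the frame is discarded rather than played back "slightly wrong".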
 
Since it's an analog signal as well, the cable itself allows noise to be introduced into the system that can be carried through the rest of the chain.
 
The fact is: all digital communications in the modern audio world are absolutely 100% bit-perfect regardless of cable brand or quality. Ethernet (or USB) delivers the bits fully intact from the source to the DAC. There is enough integrity in the waveform to withstand severe degradation of the signal before any corruption of the bits occurs. Other industries use these interfaces willy-nilly and move terabytes of data around without anyone complaining of errors.

In the audio space, the distinguishing factor and crucial difference in our systems is that the DAC ultimately has to convert a reference voltage level into an analog waveform ...which we hear through our speakers. It's at this point of conversion from bits to volts in the DAC that everything audible is manifested. At any instant of a song, the DAC hardware dutifully tries to set the analog output level as per the input bit pattern, but it is only possible to do this with total precision if the reference voltage to the DAC and the power and ground rails are at a perfectly unchanging level. If they are changing (even at the microvolt level), well then damn it, we hear it, and our brain dissolves the belief that the recording is reality.

And this is the 'hard to grasp' but quite obvious fact: nasty analog noise from the source player or the AC mains is conducted along anything metal and rides the digital cable (without affecting its data integrity) to the DAC to do its damage. Sure, there are preventative measures in place such as RF shields, component and trace layout, improvements to the power supplies and thick metal barriers ...but audiophiles with resolving systems and good ears still hear the errors in the waveforms. Bit errors are pops, clicks and crackles. RF-induced errors are what you would expect: loss of staging, depth, clarity, etc.

So, yes, better digital cables (Ethernet, USB) do make a sonic difference ...but only because they let less RF noise through, thanks to their metallurgy, construction or design. It's my opinion that cable manufacturers perpetuate the ignorance of what is really going on, and in my own case, I've transformed my mid-priced, well-constructed USB cable into a rock star by adding a few dozen clamp-on ferrites. And saved many hundreds of dollars in the process.
 
I know some have had great results with ferrites, but when I tried them in my system the soundstage collapsed on itself. I've never had that happen with better cables - much the opposite, in fact.

I'm not discounting your results, but sharing my journey down that road.

Cheers
 
The fact is: all digital communications in the modern audio world are absolutely 100% bit-perfect regardless of cable brand or quality. Ethernet (or USB) delivers the bits fully intact from the source to the DAC. There is enough integrity in the waveform to withstand severe degradation of the signal before any corruption of the bits occurs.

Interesting. I have never seen anyone post that the data was corrupted by an Ethernet cable.
 
The fact is: all digital communications in the modern audio world are absolutely 100% bit-perfect regardless of cable brand or quality. Ethernet (or USB) delivers the bits fully intact from the source to the DAC. There is enough integrity in the waveform to withstand severe degradation of the signal before any corruption of the bits occurs. Other industries use these interfaces willy-nilly and move terabytes of data around without anyone complaining of errors.

Thanks for this. Makes a lot of sense. How about the timing point made by Bud? That makes a lot of sense to me too. It would also explain why my IT friend can't wrap his head around it - he is used to dealing with other (non-audio) data, which is perhaps less timing-critical.
 
I know this has been done to death, but can anyone point me to a convincing article/blog/post which explains the science/theory/thinking/explanation behind how it is possible that digital Ethernet cables can produce different sounds?

Have often wondered that myself... Here's a post from elsewhere on the AQ Diamond LAN cable that includes measurements of certain parameters that might explain the differences between LAN cables, and how such parameters might affect the sound in different systems.

Ironically, while measuring better, the poster found that the AQ Diamond made no audible difference in his system. For context, he has the Vivaldi stack.

[quote author=AndrewC link=topic=264926.msg1242455#msg1242455 date=1516497130]
Lots of fun testing out the Audioquest RJ/E Ethernet cable yesterday. I actually thought it’d take me longer to come to some conclusion, but I’ll cut to the chase first; the Audioquest RJ/E Ethernet cable makes no audible difference in my system that I can tell…. BUT! Call me a convert, it’s staying in my system :)… Read on for why;

I started off with a few basic checks on the RJ/E as a standard Ethernet data cable using my Ethernet test set and switch at home, including a Bit Error Rate (BER) test, throughput, jitter, latency etc; the thing works perfectly fine and measures like any other 1.5m-long Ethernet cable. In any case, even a $2 Ethernet cable is unlikely to demonstrate any problems - as an FYI, my main Juniper Ethernet switch at home currently has an uptime of 221 days, and there are ZERO errors on any of its 8 connected ports, some literally with junk freebie Ethernet cables connected. So, as far as 1s & 0s are concerned, the Audioquest RJ/E is no different from any standard shielded Ethernet cable - non-audiophile, computer “data is data” types and Amazon reviewers can rejoice :P
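Those zero-error counters aren't luck: every Ethernet frame carries a CRC-32 frame check sequence, so a frame is either received bit-perfect or fails the check, gets dropped, and bumps an error counter. A minimal sketch (the frame contents below are arbitrary stand-in data; Python's `zlib.crc32` uses the same polynomial as the Ethernet FCS):

```python
import zlib

# A stand-in for a frame's payload bytes (contents are arbitrary)
frame = bytes(range(256)) * 4
fcs = zlib.crc32(frame)            # check value computed by the sender

# Flip a single bit anywhere in the frame ...
corrupted = bytearray(frame)
corrupted[100] ^= 0x01

# ... and the receiver's check fails: the frame is discarded and
# counted as an error, never passed along as "slightly wrong" audio.
assert zlib.crc32(bytes(corrupted)) != fcs
assert zlib.crc32(frame) == fcs    # the intact frame passes
```

This is why cable problems on Ethernet show up as error counters and dropouts rather than as a gradual loss of fidelity in the delivered data.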

(Totally error-free Ethernet transmission stats from my switch)


But, as quite a few of us audiophiles have come to increasingly realise, with digital cables in an audiophile context, it’s in the area of “noise suppression” quality where things start to make a big difference, especially when connected to noisy compute platforms with poor PSUs, noisy grounds, etc. My Ethernet test set can also perform a Time-Domain Reflectometry (TDR) test - typically designed to measure cable lengths and/or locate cable breaks by launching a digital pulse into an open cable and measuring the reflection response in the analog domain. Just for kicks, I ran a TDR test on the AQ RJ/E vs. the Nexans LANMark CAT6A, which produced an interesting result. As these cables are pretty short (1-1.5 meters), my test set could only show the summed reflected portion of the test pulse… have a look;

[TDR response plot, cable 1]

[TDR response plot, cable 2]

(X-axis is the estimated cable length in Meters, Y-axis is the summation of the test pulse & reflected response amplitude, normalised to +/- 100).

Any guesses which is which? ;D Two big differences: even with such a short cable, one clearly has a better “slew rate” than the other, suggesting better (lower) capacitive characteristics. It also has a tight grouping of the impulse reflection responses, suggesting equal performance of the 4 twisted pairs inside the cable. Of course this doesn’t automatically mean that the RJ/E is going to “sound” better, but it does suggest significantly better materials/construction.
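For readers unfamiliar with TDR, the length estimate mentioned above comes from simple arithmetic: the pulse travels down the cable and back, so length = velocity factor × c × round-trip time / 2. A quick sketch with hypothetical numbers (the 0.65 velocity factor and the 15.4 ns timing are illustrative assumptions, not measurements from the post):

```python
# TDR length estimation from a reflection's round-trip time.
C = 299_792_458            # speed of light in vacuum, m/s
velocity_factor = 0.65     # fraction of c in twisted pair (assumed)

def cable_length(round_trip_seconds):
    """One-way length: the pulse goes down and back, so divide by 2."""
    return velocity_factor * C * round_trip_seconds / 2

# A ~1.5 m cable would reflect in roughly 15.4 ns round trip
print(cable_length(15.4e-9))   # ≈ 1.50 m
```

With cables this short, the reflection returns within nanoseconds of the launch, which is why the test set could only show the summed pulse-plus-reflection rather than two separate events.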

That gave me the idea to use my Ethernet toner test kit to compare how well each cable’s shield works. Big difference! With any cheapo UTP Ethernet cable, the toner probe - basically an inductive amplifier - can pick up the square pulse’s harmonics from almost a foot away from the cable. With the Nexans, it’s about 10-15cm.

(Ethernet Toner Probe sounding off near the Nexans CAT6A).


With the AQ RJ/E though, I virtually had to touch the cable with the probe before it sounded off. Not only that, the AQ RJ/E performed differently depending on which end was plugged into the tone generator… meaning the cable is objectively directional (exactly as advertised by Audioquest, at least with respect to its shielding anyway). Of course, properly connected quality shielded Ethernet cables don’t generally radiate harmonics, but cheapo ones do!! And they will impact imaging/separation if they’re lying near small-signal analog cables.

Once again, this doesn’t automatically mean that the RJ/E will “sound” better in a system, but it does prove that the cable is well shielded, likely somewhat immune to EMI/RFI, and will not radiate Ethernet bitstream harmonics out to the rest of my cables/system. Whether this better shielding is due to cable construction or Audioquest’s DBS, I can’t really tell - while testing, I totally missed that I could actually unplug the DBS battery and see how it performs… but now that it’s in my system, I’m not touching it for a few days. The point, though: the Nexans is a properly shielded/foiled cable (S/FTP CAT6A), so something’s definitely better with the AQ RJ/E.

(My regular go-to test tracks for DSD128 streaming over Ethernet, at a mere 11.28Mbps, over a 10Gb/600MHz-capable Ethernet cable... talk about overkill :P)


All that said, the RJ/E doesn’t appear to deliver any sonic benefits in my audio system, I suspect because my Upsampler/DAC’s Ethernet port is already well isolated via fibre and an Ethernet isolator dongle, plus it’s quite far from the other system cables, which are mostly well shielded and well dressed anyway - so perhaps not surprising. But I’m impressed enough with what I’ve seen/heard that I’m keeping it in my system (might as well, for its crazy price! ;D)

The RJ/E reviews online are mostly at the extremes: those like Ars Technica or Amazon calling the RJ/E snake oil or ridiculing it, vs. audiophiles raving about it... none bothered to include any empirical test in either the digital or analog domain to support their conclusions... ::). As usual, the truth lies somewhere in between. Like most things, its impact is totally dependent on your personal setup.

So, YMMV, but highly recommended for anyone serious about this stuff [emoji4]
[/quote]
 
All this Ethernet cable stuff is the main reason why I stick with direct-attached USB on my Lumin instead of getting a NAS. :)
 
The network protocols responsible for transmitting the data - TCP/IP - take care of ensuring data packets arrive and are reconstructed in the correct order, no different than in any other data networking; in fact, it is data networking that gets our music from one place to another. Memory buffers in the receiving network hardware allow the software to perform this function very reliably; errors of this type, IMO, would be extremely obvious - as in, the music stopping. So I don’t buy the theory that jitter is somehow involved in diminishing the quality of the data transmission.
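The reordering described above can be sketched in a few lines (a simplified illustration of the receive-buffer idea, not how any real TCP stack is implemented): packets may arrive in any order, but a buffer keyed on sequence numbers releases data to the player strictly in order - or stalls waiting for a missing packet, which is an obvious dropout, never a subtle degradation.

```python
import heapq

def reassemble(packets):
    """Release payloads in sequence order.

    packets: iterable of (seq, data) pairs in arrival order.
    Out-of-order arrivals are held in a buffer (a min-heap here)
    until every earlier sequence number has been released.
    """
    heap, expected, out = [], 0, []
    for seq, data in packets:
        heapq.heappush(heap, (seq, data))
        # Release everything that is now contiguous
        while heap and heap[0][0] == expected:
            out.append(heapq.heappop(heap)[1])
            expected += 1
    return out

# Arrival order scrambled by the network ...
arrived = [(2, b'C'), (0, b'A'), (3, b'D'), (1, b'B')]
# ... but the playback order is still exact
assert reassemble(arrived) == [b'A', b'B', b'C', b'D']
```

If sequence 1 never arrived, nothing after it would be released: in a streamer that manifests as the buffer draining and the music stopping, matching the "extremely obvious" failure mode described above.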

However, I do subscribe to the hypothesis - proposed articulately in this thread, and by almarg on Audiogon, another very smart guy - that interference external to the data transmission can surf the Ethernet cable and make its way into the circuitry of the audio component, with the effects we try so hard to guard against. It’s this that a physically well-constructed (Ethernet) cable prevents. IMO.
 
Just did a small test adding another high-quality AQ Ethernet cable.

Replacing a regular Cat5 IT Ethernet cable with an AQ Vodka between the NAS and router had a really significant impact on SQ. Clarity, transparency, detail, soundstage and even the oomph pressurizing the room jumped to another level - surprisingly much so. I already had an AQ Vodka between the router and the Nyquist DAC, yet this other high-quality Ethernet cable had maybe an even larger impact.

Thinking about it, it seems a no-brainer that it is really important to get a high-bandwidth, high-quality signal out of the NAS and into the router. Once corrupted, the SQ cannot be corrected at a later stage in the chain. Old wisdom: good sound starts with the source - meaning even before the DAC.

Whatever the reason (I'm not sure how fruitful the discussion is): the only really important thing to me is whether there is an effect on the sound. And there was.


Sent from my iPad using Tapatalk
 
I just did a test at Mike’s store with my off-the-shelf Cat7 Ethernet cable and the AQ Diamond. I couldn’t believe how quickly I heard the difference. Needless to say, the Diamond has been ordered.
 
I would look further than just an upgraded network cable; in my experience, an "audiophile" network switch brings more improvement.
My setup consists of 12 meters of (certified) Blue Jeans Cat6A from my router/NAS to my stereo rig, then an AQVOX network switch, with two AQ Vodkas connected to the Aurender and dCS Rossini.
 
I tend to agree with the previous statement, to the extent that the AQVOX Network Switch SE makes a significant difference.

That said, I would argue that the quality of Ethernet cables remains important. Otherwise there is a risk that you piss away the improvement achieved through the switch.

 