Mac AIR v MiND 180 v Auralic ARIES v totaldac d1 server
Also, what does the "bit perfect" claim actually refer to: the output of AIR, or the stream as received at the Devialet end? It would be a brave claim that ticking "bit perfect" guarantees perfect bits at the Devialet 100% of the time in all situations. In a normally functioning network I suspect that, much like the disk drive analogy, there will be errors, but at a level so small as to be insignificant; those errors could occur at several layers of the protocol stack. The network performs to a "good enough" level error-wise, and I'd be surprised if a normally functioning network could give rise to significant differences in sound quality, unless other factors such as electrical effects in the physical layer are at work.
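
One way to make the "bit perfect" question testable is to checksum the raw audio payload at each end and compare. A minimal sketch in Python, assuming you can get the audio out as WAV at both ends (the filenames are placeholders, not part of anything AIR actually exposes):

import hashlib
import wave

def pcm_sha256(path):
    # Hash only the raw PCM frames, ignoring header/metadata differences.
    with wave.open(path, "rb") as w:
        pcm = w.readframes(w.getnframes())
    return hashlib.sha256(pcm).hexdigest()

# "source.wav" / "received.wav" stand in for the track as sent and the
# stream as captured at the Devialet end.
print(pcm_sha256("source.wav") == pcm_sha256("received.wav"))

Matching digests would show the payload survived intact end to end; a mismatch would at least tell you where to start looking.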

Superior clocking sounds like a plausible explanation for all or most of the difference, but it would be great to have a definitive answer. Thumb5's work on reverse-engineering the AIR protocol is really worthwhile and will go a long way towards understanding this. I've been looking at comparing packet streams from two separate source players (e.g. jRiver or VLC) to determine whether the packet checksums really are identical, as sketched below.
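
For that comparison, something along these lines should work. A sketch using scapy, assuming the AIR stream rides on UDP; the port number and capture filenames are placeholders, since I haven't confirmed the actual transport details:

import hashlib
from scapy.all import rdpcap, UDP  # scapy assumed installed (pip install scapy)

def payload_digest(pcap_path, port):
    # Concatenate the UDP payloads sent to `port`, in capture order, and hash them.
    h = hashlib.sha256()
    for pkt in rdpcap(pcap_path):
        if UDP in pkt and pkt[UDP].dport == port:
            h.update(bytes(pkt[UDP].payload))
    return h.hexdigest()

# "jriver.pcap" / "vlc.pcap" and port 9000 are placeholders; capture with
# tcpdump or Wireshark while each player streams the same track.
print(payload_digest("jriver.pcap", 9000) == payload_digest("vlc.pcap", 9000))

Hashing the concatenated payloads rather than whole packets deliberately ignores differences in the IP/UDP headers (TTL, checksums, source ports) that can't affect the audio bits.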