A question about MQA

The source file is the original from which copies are made.

As you are also an analog aficionado, for vinyl that would be the master tape used to produce the vinyl stampers.

With every medium, whether digital or analog, copying always deteriorates the quality.



I understand that with digital files, integrity checks (like checksums) are run to ensure that the original file and the copy are identical. So, assuming these checks are done, how can a digital copy of a digital file be different from, or worse than, the original?
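For instance, here is a minimal sketch (Python 3, standard library only; the file paths are hypothetical) of the kind of integrity check I mean:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large files don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original file and a copy of it.
if sha256_of("original/track01.flac") == sha256_of("copy/track01.flac"):
    print("bit-for-bit identical")
else:
    print("the copy differs from the original")
```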
 

The quality of a source file deteriorates just as an analog master's does when you make a copy of a copy of a copy, etc. The deterioration is just slower.

That does not, of course, happen when lots of users download the same source file from a site, only when copying multiple times. Think of it like making a photocopy of a photocopy.


 

Seriously?
 
I am also surprised by this.
You are saying that an audio file, when copied in, say, Windows (copy/paste), deteriorates in sound quality?

Isn't downloading a file a kind of copying it locally?
 

It’s nonsense.
 

That makes no sense to me either.
 

Nobody is trying to make sense to you, just stating things as they are. Unfortunately, SW deteriorates. You usually recognize the effect when you try to use hand-me-down SW: at some point, as it is passed on, it just stops working.

If you have a minimal understanding of how software works, these are absolute no-brainers.


 

As explained, if everyone downloads the same file, each download is original +1: the same quality for everyone who downloads it.

If, however, one person downloads the file and shares it with a buddy, the quality starts to deteriorate. The more often you make a copy of a copy, the higher the likelihood it stops working.


 
i would say that while it's possible for a digital file to suffer corruption from being manipulated in the mastering process, i've not seen or even heard of a file being diminished by generations of properly executed copies on the creation side.

the exception is ripping software for redbook/CDs, where there are lossy and lossless processes and various disc media that introduce variables. if you add a generation of variables to each copy event, the errors will build up. but the problem is not inherent in making data file copies, or in transmitting files with less than robust methods, buddy to buddy as you mention. lots of variables in each sharing event.

so we are talking about flawed methods of copying.

https://en.wikipedia.org/wiki/Generation_loss
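to make this concrete, here's a small illustrative sketch (python 3, standard library only; the quantization step is just a stand-in for any lossy re-encode, not a real codec). the lossy "copy" drifts a little with every generation, while an exact byte copy never does:

```python
import hashlib

samples = [0.123456, -0.654321, 0.333333, -0.999999]  # toy "audio" samples

def lossy_copy(xs, levels=64):
    # stand-in for any lossy re-encode: a tiny gain change followed by
    # coarse re-quantization, so error accumulates with each generation
    return [round(x * 0.99 * levels) / levels for x in xs]

gen = samples
for i in range(3):
    gen = lossy_copy(gen)
    print(f"lossy generation {i + 1}: {gen}")

# an exact digital copy is just the same bytes again; no number of
# generations changes the checksum
data = b"any file contents at all"
copy = data
for _ in range(1000):
    copy = bytes(copy)
print(hashlib.sha256(copy).digest() == hashlib.sha256(data).digest())  # True
```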
 

For digital files?


 
Yes Mike, indeed.

You are a SW guy, so just humor me and try this at home: take some simple SW program and copy it. Test whether it still works. If it does, make another copy of the copy and test again.

At some point your SW will cease to work, and you will need to go back to the previous version to use it.


 
Al, thank you for taking the time to respond.

The question arising in this thread is: how many people on the forum actually have the technical proficiency to discuss coding differences between MPEG-1 Layer 3, pulse-code modulation, Direct Stream Digital, or Master Quality Authenticated? My guess would be very few.

Pulse-code modulation is a compression mechanism, filed for patent in the US in 1946 and 1952; the patents were granted a few years later. NHK developed the first PCM recording device in 1967. Please note that this was before the moon landing. From today's perspective it would be a bit strange to believe the world has not evolved since.

From a technical perspective, pulse-code modulation simply could not handle today's communications requirements; it is just too inefficient. That is why it was superseded in professional applications by time-division modulation, then code-division modulation, and wideband code-division modulation after that. And the next step is already on the horizon with millimeter wave-division.

Technical evolution typically occurs when a new method is developed to overcome the insufficiencies of an old one. Sometimes the new method is successful and completely overcomes the limitations of the old one. In other cases it resolves one challenge while introducing a new one in the process. This is what happened with DSD.

DSD was successful in improving the fidelity of the signal by multiplying the modulation frequency. But unfortunately that increased the signal bandwidth to an extent that rendered DSD unusable for many use cases. Enter MQA. It addresses the bandwidth issue with a logic partly similar to DSD's, working in the inaudible signal band, while also correcting amplitude and phase distortions (as introduced, e.g., by R2R recording machines).

So, is MQA the final solution and the bee's knees for all things audio? Probably not. It's just another evolutionary step in an ongoing process.

Thank you for your reply. A couple of points:

1. As for pulse-code modulation (PCM) predating the moon landing: general relativity and quantum mechanics, cornerstones of modern science and technology, were discovered in 1915 and between 1900 and 1910, respectively. Your point?

2. Code division modulation was invented in 1935, time division multiplexing in the late 19th century.

3. Apples and oranges. Code division modulation and time division multiplexing are transmission schemes in telecommunication; PCM is a coding scheme. The latter is used by the two former. PCM is still standard in digital telephony.

4. MQA uses PCM. Your point again?

5. PCM is not a compression mechanism, as you claim, but a coding mechanism (see the sketch after this list). There are PCM schemes that use compression -- such as MQA.

6. Communication bandwidth is not an issue for most people these days. Qobuz streams uncompressed "high res" PCM without problems for most users. MQA was a solution in search of a temporary problem.
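To illustrate point 5, a minimal sketch (Python 3; the 440 Hz tone, sample rate, and bit depth are just illustrative choices): plain PCM is nothing more than sampling plus quantization, with no compression step anywhere:

```python
import math

SAMPLE_RATE = 44100                    # samples per second (a Redbook-style choice)
BIT_DEPTH = 16                         # bits per sample
FULL_SCALE = 2 ** (BIT_DEPTH - 1) - 1  # 32767 for 16-bit signed samples

def pcm_encode(freq_hz=440.0, duration_s=0.001):
    # "Plain" PCM is nothing but this: sample the waveform at a fixed
    # rate and quantize each sample to a signed integer. No compression.
    n = int(SAMPLE_RATE * duration_s)
    return [
        round(FULL_SCALE * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
        for t in range(n)
    ]

print(pcm_encode()[:8])  # the first few 16-bit PCM samples of a 440 Hz tone
```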
 

Yes Mike, that's correct. The copying process frequently yields errors, even with bit-perfect copying. A lot of variables in every sharing event, as you say.

I'm a bit surprised this is news to anyone.


 

PCM is essentially the PSTN codec; there are far more advanced alternatives available today.
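For reference, the classic PSTN codec is G.711: 8 kHz PCM with µ-law (or A-law) companding. Here is a minimal sketch of the textbook µ-law curve (Python 3; a sketch of the formula only, not a vetted G.711 implementation):

```python
import math

MU = 255  # µ-law parameter used in the North American/Japanese PSTN (G.711)

def mu_law_compress(x: float) -> float:
    # Map a sample in [-1, 1] to [-1, 1] with logarithmic companding.
    # Quiet signals get proportionally more resolution, which is why
    # 8-bit µ-law PCM beats plain 8-bit linear PCM for speech.
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_expand(y: float) -> float:
    # Inverse of mu_law_compress.
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

x = 0.01  # a quiet sample
y = mu_law_compress(x)
print(y, mu_law_expand(y))  # compressed value, then round-trip back to ~0.01
```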


 
As for MQA using PCM:

"The MQA stream is PCM".

Quote from answer 38 in:

"A Comprehensive Q&A With MQA's Bob Stuart"

https://audiophilestyle.com/ca/ca-academy/A-Comprehensive-Q-A-With-MQA-s-Bob-Stuart/

MQA still uses PCM, that's correct. But they are trying to take advantage of lessons learned along the way in their coding scheme. That's all I'm saying. Personally, I think that is a good thing.

I do agree the MQA approach has been a bit problematic politically. But I enjoy any effort to improve things where necessary.

I'm format agnostic and have no stake in any of the formats. I'm just trying to keep an open mind.

For anyone who followed my Brinkmann Nyquist thread a couple of years back: when I got the DAC, I tested MQA against various PCM and DSD formats. The result was inconclusive. Sometimes MQA sounded better, sometimes it did not.

 
If copying of digital files were not reliable, there would be a danger of data being altered during the sharing of scientific files. But that is not the case, as I can confirm from my daily work as a biochemist with large data files from mass spectrometry and the analysis of that data. If digital transfer of scientific data, which involves copying, could not be trusted, it would create a big problem. But there isn't one.


From the Wikipedia link that Mike Lavigne posted:

https://en.m.wikipedia.org/wiki/Generation_loss

"Used correctly, digital technology can eliminate generation loss. Copying a digital file gives an exact copy if the equipment is operating properly. This trait of digital technology has given rise to awareness of the risk of unauthorized copying. Before digital technology was widespread, a record label, for example, could be confident knowing that unauthorized copies of their music tracks were never as good as the originals."

[....]

"However, copying a digital file itself incurs no generation loss—the copied file is identical to the original, provided a perfect copying channel is used."
 