I am not sure if you guys have seen this thread on the RuneAudio forums, but it discusses substantial clipping and THD on the SPDIF output of the Digi+ as well as the Pi's USB output.
Is there an issue with the Pi and/or the Digi+ that would cause clipping?
Here is a portion of the findings:
I have found that, by enabling software volume control and setting the Pi's volume to 89%, the issue with the altered test signal goes away and the results in RMAA look normal for my DAC. However, by doing this you are essentially removing 2 bits of resolution from the signal. Using the RMAA 16/44 test with the Pi at 87% (the sweet spot, see results up-thread) gives a measured dynamic range of 85 dB, which is the equivalent of roughly 14 bits. Disable the volume mixer (so output is 100%) and you get ~16 bits of resolution, but 15% THD.
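To make the "85 dB ≈ 14 bits" arithmetic above concrete, here is a small sketch of the standard conversion (each bit contributes about 6.02 dB, i.e. 20·log10(2), of dynamic range). Note the linear volume-fraction mapping below is an assumption for illustration; a software mixer such as ALSA's softvol may map its percentage onto a dB scale instead.

```python
import math

# Each bit of resolution is worth 20 * log10(2) ~= 6.02 dB of dynamic range.
DB_PER_BIT = 20 * math.log10(2)

def effective_bits(dynamic_range_db: float) -> float:
    """Convert a measured dynamic range in dB to effective bits."""
    return dynamic_range_db / DB_PER_BIT

def attenuation_db(volume_fraction: float) -> float:
    """dB attenuation of a *linear* software volume scaler (assumed mapping)."""
    return 20 * math.log10(volume_fraction)

# The 85 dB measured at the 87% setting corresponds to roughly 14 bits:
print(round(effective_bits(85), 1))          # ~14.1 bits
# An ideal 16-bit signal would measure about 96 dB:
print(round(16 * DB_PER_BIT, 1))             # ~96.3 dB
# A linear scaler at 87% would, by itself, only cost about 1.2 dB:
print(round(attenuation_db(0.87), 1))        # ~-1.2 dB
```

The last figure is the interesting one: a linear 87% scaler alone accounts for barely a fifth of a bit, so the ~2-bit loss reported in RMAA suggests the mixer's percentage-to-gain mapping (or some other processing) is doing more than simple linear scaling.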
So, my question is: is the Pi doing something at the I2S (or earlier) rendering stage that results in a clipped signal? At this point I would be inclined to say 'yes'. To repeat what I wrote earlier: the analogue output of a single file from a single DAC should be exactly the same regardless of the device used to deliver the digital data to it, so long as that device is operating in a bit-perfect fashion.