Hi!
Here's the story:
Different AD converters, as well as different DA converters, have different latencies. This is because the internal filtering in the converters differs from type to type and from manufacturer to manufacturer.
Straight SPDIF does not need such filters.
As a consequence, it is impossible to match the latency of the analog converters to SPDIF by default. Even if we compensated for the latency of our outputs, we could not anticipate the latency of the analog input of an arbitrary audio interface.
Since 2018, the variable sample rates in the Profiler have used a sample rate converter that is built into the DSP hardware. This is again comparable to the filters in AD/DA converters: we have no access to or control over its latency either.
In a future firmware we are planning to bypass the sample rate converter when the sample rate is set to 44.1 kHz and the Profiler is the master. This would bring back the low latency of earlier firmwares for that specific setting.
Background story:
The filters mentioned above in the AD/DA converters are digital anti-aliasing filters. Their latency is a result of the linear-phase design of these filters.
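As a rough sketch of why linear-phase design costs latency: a symmetric (linear-phase) FIR filter always delays the signal by exactly (N - 1) / 2 samples, no matter what its frequency response is. The filter lengths below are illustrative, not the actual tap counts used in any particular converter:

```python
# Illustrative only: group delay of an N-tap linear-phase FIR filter.
# A symmetric FIR delays every frequency by (N - 1) / 2 samples.

def linear_phase_latency_ms(num_taps: int, sample_rate_hz: float) -> float:
    """Latency in milliseconds added by an N-tap linear-phase FIR filter."""
    delay_samples = (num_taps - 1) / 2
    return delay_samples * 1000.0 / sample_rate_hz

# Example: a hypothetical 64-tap anti-aliasing filter at 44.1 kHz
print(round(linear_phase_latency_ms(64, 44100), 3))  # 0.714 (ms)
```

Two such filters in a row (AD plus DA) simply add their delays, which is why the round trip through the analog converters cannot line up with a straight SPDIF path.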
To my knowledge, latency-free filters would have been perfectly appropriate and would have allowed time-aligning signals with an error of only a few samples, while bringing down the latency by a significant amount.
But unfortunately, audiophile cork sniffers once criticized AD/DA manufacturers for using non-linear-phase filters, fearing phase distortion. As a consequence, the manufacturers, serving their users, have mostly used linear-phase filters, baked into their hardware, for more than two decades.
To my knowledge, such phase distortions can be seen on a scope or detected by other means, but cannot be perceived by ear.
Same with 96 kHz ...