Level into Logic is sooo low


  • I never read an article or heard from an engineer who codes digital signal processes stating that, in general, higher sample rates on a DAW will produce better sound quality. The good engineer knows that if a process produces better results at a higher sample rate, then it is mandatory to run that specific process at the higher rate. Most processes don't differ when changing the sample rate.
    All the articles I have read were written by people with great knowledge of audio technology, but no experience writing audio code.

  • That article is interesting, but has very little to do with the original issue. A good read, though...
    My focus is still on figuring out why I'm getting such a difference in levels within my sequencer when using the KPA vs. my other guitar processors.
    These forums are dangerous; one could spend all day on here!

  • I've got no idea... the weird thing is that the level going into my system (the RME Fireface 800) seems to be alright. It's just that when it gets into Logic that it's so low.


    What strikes me as odd in your pic of the RME mixer is that you have lowered the level of the software-out (middle row) AND the hardware-out (bottom row) by -7.7 dB. So you are losing 15.4 dB altogether from the Logic playback device to your monitors. That could point to a pretty strange routing or environment setting that made you do it. It doesn't mean that it really is the culprit, it just strikes me as odd. Usually you would leave the hardware-out at 0 dB and control the volume going to your monitors with the software outs that are routed to the hardware outs, OR you do it just the other way 'round. But you wouldn't dial back both faders if it were not for a special reason.

    The last time I used Logic is too many years ago to remember many details of that darned environment anymore. All I remember is that it was mostly the reason when something went horribly wrong. So it would be very interesting to see a picture of your environment window and check whether something is going wrong in there.


    One last thing:
    Is there a reason why you don't use S/PDIF? It's the easiest way of getting the Kemper into a DAW, and if you ever want to reamp anything or process your sampled bass it is so convenient. Why not use it?
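    For reference, gain stages in series simply add in dB (equivalently, their linear gains multiply). A minimal sketch of the 15.4 dB figure above:

```python
# Two faders pulled down by 7.7 dB each in the RME mixer (software-out and
# hardware-out). Cascaded gains add in dB; the linear gain is 10^(dB/20).
faders_db = [-7.7, -7.7]
total_db = sum(faders_db)
linear_gain = 10 ** (total_db / 20)

print(round(total_db, 1))     # -15.4
print(round(linear_gain, 3))  # ~0.17 of the original amplitude
```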

  • I never read an article or heard from an engineer who codes digital signal processes stating that, in general, higher sample rates on a DAW will produce better sound quality. The good engineer knows that if a process produces better results at a higher sample rate, then it is mandatory to run that specific process at the higher rate. Most processes don't differ when changing the sample rate.
    All the articles I have read were written by people with great knowledge of audio technology, but no experience writing audio code.

    I'm not sure we're referring to the same issues here. The article I mentioned (Director's Editorial by Paolo Nuti, AudioReview n° 333, May 2012, p. 6) was about the difference between an MP3, a 44.1/16 CD, and a 96/24 master.


    The limits of the CD standard are reached, of course, at the lower levels. Think, for example, of the ambience in a concert hall: audience noises, late reflections, the musicians' own noises... Nuti shows that the early reflections in a mid-sized hall, for a listener sitting a third of the way back, are attenuated by about 15-18 dB. With the CD standard, all you have to describe this is no more than 6-7 bits, which for the later (weaker) reflections drops to 4 or fewer: that means just 16, 8 or 4 levels.
    Considering that 24-bit coding gives no more than four additional "real" bits compared to 16-bit coding, it still leaves us 6-8 bits (64-128 levels) for describing the ambience, the air, the stage perception, the localization of each instrument... It's meaningful.


    It's not that the sounds are not there in a (well-recorded) CD; it's just that there's something different, which for the audiophile can make the difference between listening pleasure and the feeling that something is missing.
    Nuances, I know. But if a DAW is not able to manage 24 (or 36) bits, I don't see how it could capture the magic in that concert hall.
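    As a back-of-the-envelope check on that bit arithmetic: each bit of a linear PCM word covers roughly 6.02 dB of dynamic range, so the bits left for a signal sitting some dB below full scale can be sketched like this (a rough model that ignores dither and noise shaping):

```python
def effective_bits(total_bits, level_dbfs):
    # Rule of thumb: one bit per ~6.02 dB of dynamic range.
    return total_bits - abs(level_dbfs) / 6.02

# Early reflections 18 dB below full scale, in 16-bit coding:
print(round(effective_bits(16, -18)))  # ~13 bits
# Ambience sitting way down around -60 dBFS:
print(round(effective_bits(16, -60)))  # ~6 bits, i.e. about 64 levels
```

    So the "6-7 bits" figure corresponds to material sitting far below full scale, not merely 15-18 dB down.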

  • Hey Garrincha,
    The reason I had those faders down was to control the level of my speaker controller, a Mackie Big Knob. I wanted to be able to move the overall level knob without my speakers blowing my ears off at the slightest nudge.
    However, when I raised the levels in the RME mixer, it didn't affect the level going into Logic that much. RME tech support wanted me to find out the output of the KPA, either +4 or -10. I just set my RME settings to +4.

  • Hey Garrincha,
    The reason I had those faders down was to control the level of my speaker controller, a Mackie Big Knob. I wanted to be able to move the overall level knob without my speakers blowing my ears off at the slightest nudge.


    Ah, I see, that makes sense of course.



    However, when I raised the levels in the RME mixer, it didn't affect the level going into Logic that much. RME tech support wanted me to find out the output of the KPA, either +4 or -10. I just set my RME settings to +4.


    Hm, that strikes me as odd again, as the level going into Logic shouldn't be affected at all. If there is a change in level, even the slightest bit, that would indicate you are not getting the signal straight from the hardware input of the RME. As I said, I haven't worked with Logic since they dumped the PC version, which was like 10 years ago, but I vaguely remember you could route everything back and forth and around in circles in the environment. That always buggered the hell out of me, since I used to get lost in the process ;) In Cubase the hardware input is always routed straight to the input channel of the software, and that seemed more straightforward to me.


    I use an RME Multiface, and the levels you see at the top row of faders are exactly what I get in Cubase at the input channel and "on tape". The screenshot of your RME mixer looks good in that regard. So it should be possible to get that level in Logic as well.


    As for the line-level setting, +4 dBu is the usual professional line level. It would really take me by surprise if the Kemper were running at -10 dBV. But you never know ;)
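    For anyone wondering why "+4" and "-10" aren't simply 14 dB apart: they use different references (0 dBu = 0.775 V RMS, 0 dBV = 1 V RMS). A small sketch of the actual voltages:

```python
import math

DBU_REF = 0.7746   # volts RMS at 0 dBu (1 mW into 600 ohms)
DBV_REF = 1.0      # volts RMS at 0 dBV

pro = DBU_REF * 10 ** (4 / 20)         # "+4 dBu" professional line level
consumer = DBV_REF * 10 ** (-10 / 20)  # "-10 dBV" consumer line level

print(round(pro, 3))                              # ~1.228 V RMS
print(round(consumer, 3))                         # ~0.316 V RMS
print(round(20 * math.log10(pro / consumer), 1))  # ~11.8 dB apart
```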

  • Ya, you're obviously knowledgeable with this stuff; thanks for your input!
    I just got the specs of the KPA's outputs from KPA:
    Analog Outputs
    Master L and R Outputs: XLR balanced, ¼-inch TRS unbalanced with ground lift
    max output level: XLR +22 dBu, TRS +16 dBu

    Monitor Output: ¼-inch TRS unbalanced with ground lift
    max output level: +16 dBu

    Direct Output/Send: ¼-inch TRS unbalanced with ground lift
    max output level: +16 dBu


    I'm gonna call RME once more to see if they've got any ideas. I'll never be totally comfortable seeing the level going into Logic so much lower than what I'm used to and expect. If I hear anything, I'll keep you in the loop. Thanks again!
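    Read against a +4 dBu nominal reference (the interface setting mentioned above), those max levels translate directly into headroom figures, since headroom is just a difference in dBu. A trivial sketch:

```python
nominal_dbu = 4.0  # "+4 dBu" professional reference level
max_out_dbu = {"Master XLR": 22.0, "Master TRS": 16.0,
               "Monitor": 16.0, "Direct/Send": 16.0}

for name, level in max_out_dbu.items():
    # Headroom above nominal is simply the difference in dBu.
    print(f"{name}: {level - nominal_dbu:.0f} dB of headroom over +4 dBu")
```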

  • Film and TV work is done at 48 kHz (except when it's HD and 96 kHz has been requested, which is very rare)...


    ...and while we're on the subject, the KPA really needs a "slave" mode for its clocking. Many studios use Big Bens or other word clocks and aren't going to want to disable that purely for the sake of the KPA, which may reduce its usage and desirability, especially for reamping. I love my KPA, but the approach feels very isolated, as if the KPA were imagined to be the only piece of outboard in the studio, which just isn't the case in the real world.

  • +1 for the slave mode.


    Indeed. Having to reset my interface to slave when I turn on the Kemper is an annoyance. I'd like to use the S/PDIF, but for the time being I am avoiding the hassle and going in analog.

  • I use the S/PDIF mode exclusively, but it is really annoying having to constantly switch my audio interface to clock to the Kemper, and then switch everything back once the KPA is off...


    Granted, it's a small annoyance... but after all we're lazy guitarists, always looking for a perfect world :P

  • The limits of the CD standard are reached, of course, at the lower levels. Think, for example, of the ambience in a concert hall: audience noises, late reflections, the musicians' own noises... Nuti shows that the early reflections in a mid-sized hall, for a listener sitting a third of the way back, are attenuated by about 15-18 dB. With the CD standard, all you have to describe this is no more than 6-7 bits, which for the later (weaker) reflections drops to 4 or fewer: that means just 16, 8 or 4 levels.
    Considering that 24-bit coding gives no more than four additional "real" bits compared to 16-bit coding, it still leaves us 6-8 bits (64-128 levels) for describing the ambience, the air, the stage perception, the localization of each instrument... It's meaningful.


    It's not that the sounds are not there in a (well-recorded) CD; it's just that there's something different, which for the audiophile can make the difference between listening pleasure and the feeling that something is missing.
    Nuances, I know. But if a DAW is not able to manage 24 (or 36) bits, I don't see how it could capture the magic in that concert hall.


    Did the expert say what the sonic difference between 16 and 24 bit actually is?

  • :huh:


    ... I've described it in the post you've just replied to...


    That is: "it's just that there's something different"
    No sonic differences are described in detail?


    The theory you are citing is a bit weird. How can parts of a CD's signal be left with only 6 to 7 bits when they are at -18 dB?

  • That is: "it's just that there's something different"
    No sonic differences are described in detail?

    This is what I wrote:

    ... the ambience, the air, the stage perception, the localization of each instrument... It's meaningful. It's not that the sounds are not there in a (well-recorded) CD, it's just that there's something different, which for the audiophile can make the difference between listening pleasure and the feeling that something is missing.

    Nuances, I know. But if a DAW is not able to manage 24 (or 36) bits, I don't see how it could capture the magic in that concert hall.

    Quote

    The theory you are citing is a bit weird. How can parts of a CD's signal be left with only 6 to 7 bits when they are at -18 dB?

    The level variations of a weak sound on a CD have fewer than 16 bits to describe them; I guess we agree on this.
    Which figures would you consider accurate instead?

  • The KPA has a lower output level than other modelers because, due to the clean sound compensation, it keeps some headroom.


    I might be missing something since this thread went a bit off-topic, but I thought the OP had a different problem. As I understand it, he is getting a good level at the hardware input of his RME (and that would be the Kemper's output), but he CAN'T get the same (good) level in Logic when he records the Kemper.


    This seems really strange, and I have no idea what is going on there. I've got some experience with RME hardware and their TotalMix software, but I'm using Cubase for recording and it works perfectly for me. So there must be something different going on in his system.


    Personally, I did notice the additional headroom of the Kemper, but I like to keep a bit of headroom when recording anyway. But then I don't have the problem the OP had.


  • The first effect that should be noticeable with too few bits is quantization noise. This is similar to what our Bit Shaper generates on purpose. If the CD is well mastered, then dithering is applied, which converts the quantization noise into an analog-style noise floor. After dithering, the digital recording has the same quality as a tape recording with an S/N of about 90 dB.
    It is strange to me that people claim to hear differences in the sound, but never mention noise or quantization noise.


    Where can I find an A/B comparison on the web?
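    The effect of dither can be demonstrated in a few lines. This toy sketch (assuming numpy; not mastering-grade dither) quantizes a 997 Hz tone whose peak is only 0.4 LSB: undithered, it truncates to pure silence, while TPDF dither lets the tone survive inside the noise floor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 16
step = 2.0 / (1 << 16)               # one 16-bit LSB over a [-1, 1) range
t = np.arange(n)
x = 0.4 * step * np.sin(2 * np.pi * 997 * t / 48000)  # tone below 1 LSB

def quantize(sig, dither):
    if dither:                       # TPDF dither: difference of two uniforms
        sig = sig + step * (rng.random(n) - rng.random(n))
    return np.round(sig / step) * step

plain = quantize(x, dither=False)
dithered = quantize(x, dither=True)

print(np.abs(plain).max() == 0.0)    # True: the quiet tone rounds to silence
# Normalized correlation with the original tone is ~1: the signal survived.
print(round(float(np.dot(dithered, x) / np.dot(x, x)), 1))  # ~1.0
```

    The quantization error of the dithered version is noise-like rather than correlated with the signal, which is exactly the "analog noise floor" behavior described above.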

    Edited once, last by ckemper.