How to prevent noise that exceeds Nyquist frequency of input signal when upsampling
Posted: Sun Jun 08, 2025 10:41 am
Does anyone know how to prevent noise above the Nyquist frequency of the input signal when upsampling?
If the input and output of VB-CABLE can be set to the same sample rate, this problem does not occur.
However, if the sending application (the input side of VB-CABLE) changes its sample rate dynamically,
the receiving application (the output side of VB-CABLE) cannot detect the change,
so the largest possible sample rate must be set on the output side of VB-CABLE.
In that case, VB-CABLE automatically upsamples,
but imaging noise (spectral images of the signal) appears in the frequency band above the Nyquist frequency of the input signal.
This phenomenon is inherent to the principle of upsampling,
and I understand that the images are normally suppressed by applying a low-pass filter (an interpolation filter).
However, it seems that VB-CABLE does not apply a low-pass filter when upsampling.
(Is this to avoid signal degradation, reduce processing load, or minimize latency?)
It would be ideal if the receiving application could apply the low-pass filter itself,
but since the receiving application cannot know the sample rate of the original input signal,
it does not know what cutoff frequency to use.
Does anyone know a solution to this problem?
(For example, specific approaches such as applying a low-pass filter inside VB-CABLE,
using software other than VB-CABLE (such as Voicemeeter),
detecting the sample rate of the input signal with an external application, etc.
I am not sure whether any of these are actually possible...)
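On the last idea, here is a rough heuristic sketch of how an external tool might guess the original sample rate from the received audio: estimate the spectral rolloff and snap the implied Nyquist to a list of common rates. The rate list, the `frac` threshold, and the simulated signal are all my own assumptions, and this only works if the upsampler attenuates the images at least partially:

```python
import numpy as np

# Common sample rates a sender is likely to use (assumption).
COMMON_RATES = (8000, 16000, 22050, 32000, 44100, 48000)

def estimate_source_rate(x, fs_out, frac=0.99):
    """Guess the original sample rate from the spectral rolloff.

    Finds the frequency below which `frac` of the spectral energy lies,
    then snaps twice that value (the implied Nyquist) to the nearest
    common rate not exceeding the output rate.
    """
    mag = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs_out)
    cum = np.cumsum(mag) / mag.sum()
    rolloff = freqs[np.searchsorted(cum, frac)]
    candidates = [r for r in COMMON_RATES if r <= fs_out]
    return min(candidates, key=lambda r: abs(r / 2 - rolloff))

# Simulated example: a 48 kHz stream whose content lies entirely
# below 4 kHz suggests the source was running at 8 kHz.
fs_out = 48000
t = np.arange(fs_out) / fs_out
x = sum(np.sin(2 * np.pi * f * t) for f in (1000, 3000, 3900))
rate = estimate_source_rate(x, fs_out)
```

The estimated rate could then be used as the cutoff for a low-pass filter in the receiving application, though quiet or narrow-band program material would obviously fool this heuristic.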
Note: My English is not very good,
so I am writing this topic with the assistance of machine translation.
Please forgive me if there are any rude expressions.