About a sentence mentioned in the blog #4
Comments
It is hard to enumerate these cases exhaustively; you can experiment yourself. Some papers also report that adding BN makes results worse.
You can also look at this explanation: removing BN layers has been shown to improve performance and reduce computational complexity in various PSNR-oriented tasks, including super-resolution (SR) and deblurring. BN layers normalize features using the mean and variance of the current batch during training, and use estimated mean and variance over the whole training dataset during testing. When the statistics of the training and testing datasets differ greatly, BN layers tend to introduce unpleasant artifacts and limit the model's generalization ability. It has been empirically observed that BN layers are more likely to produce artifacts when the network is deep and trained under a GAN framework. These artifacts occasionally appear across iterations and across different settings, violating the need for stable performance throughout training. BN layers are therefore removed for training stability and consistency. In addition, removing BN layers helps improve generalization and reduces computational cost and memory usage.
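The train/test statistics mismatch described above can be sketched in plain Python (this is an illustrative example, not code from the thread): we estimate "running" statistics on one distribution, then apply them to test data drawn from a shifted distribution, mimicking what a BN layer does in evaluation mode.

```python
# Sketch of BatchNorm's train/test statistics mismatch (illustrative only).
# At test time BN normalizes with running estimates collected on the training
# distribution; if the test distribution differs, the normalized features are
# no longer zero-mean / unit-variance, which is one source of the artifacts
# and accuracy drops discussed above.
import random
import statistics

random.seed(0)

def bn_normalize(xs, mean, var, eps=1e-5):
    """Normalize values with fixed statistics, as BN does in eval mode."""
    return [(x - mean) / (var + eps) ** 0.5 for x in xs]

# "Training" data ~ N(0, 1): running statistics are estimated from it.
train = [random.gauss(0.0, 1.0) for _ in range(10_000)]
running_mean = statistics.fmean(train)
running_var = statistics.pvariance(train)

# "Test" data ~ N(2, 3): its statistics differ strongly from training.
test = [random.gauss(2.0, 3.0) for _ in range(10_000)]
normalized = bn_normalize(test, running_mean, running_var)

# Downstream layers now see activations far outside the range they were
# trained on: mean near 2 (not 0) and variance near 9 (not 1).
print(statistics.fmean(normalized))      # ~2.0, not 0
print(statistics.pvariance(normalized))  # ~9.0, not 1
```

When the training and test distributions match, the same code yields roughly zero mean and unit variance, which is why the problem only shows up under distribution shift.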
Hello, I recently ran into a problem when using BN layers: convergence did indeed speed up, but accuracy dropped by about 3%. This is not the first time it has happened, which puzzles me. I happened to see this sentence in your blog: "Note: do not add BN casually; for some problems it will make the loss larger." May I ask what kinds of problems could lead to this result?