Getting the learning rate from a cyclical learning rate schedule in PyTorch

Published 2024-10-01 02:30:44


I am trying to implement the cyclical learning rate method on top of the PyTorch reimplementation of StyleGAN by rosinality. To do so, I am building on what is suggested in this blog post.

To check how the loss varies as a function of the learning rate, I need to plot how (LR, loss) evolves. Here you can find a modified version of my train.py. These are the main changes I made:

  1. Defined cyclical_lr, a function regulating the cyclical learning rate

import math  # needed for math.floor below

def cyclical_lr(stepsize, min_lr, max_lr):

    # Scaler: we can adapt this if we do not want the triangular CLR
    scaler = lambda x: 1.

    # Additional function to see where on the cycle we are
    def relative(it, stepsize):
        cycle = math.floor(1 + it / (2 * stepsize))
        x = abs(it / stepsize - 2 * cycle + 1)
        return max(0, (1 - x)) * scaler(cycle)

    # Lambda function to calculate the LR
    lr_lambda = lambda it: min_lr + (max_lr - min_lr) * relative(it, stepsize)

    return lr_lambda
  2. Implemented the cyclical learning rate for both the discriminator and the generator

    step_size = 5*256
    end_lr = 10**-1
    factor = 10**5
    clr = cyclical_lr(step_size, min_lr=end_lr / factor, max_lr=end_lr)
    # g_optimizer has two parameter groups, hence one lambda per group.
    # Note: LambdaLR multiplies the lambda's output by the optimizer's base lr.
    scheduler_g = torch.optim.lr_scheduler.LambdaLR(g_optimizer, [clr, clr])

    d_optimizer = optim.Adam(discriminator.parameters(), lr=args.lr, betas=(0.0, 0.99))
    scheduler_d = torch.optim.lr_scheduler.LambdaLR(d_optimizer, [clr])
    

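As a quick sanity check of the triangular schedule, the `cyclical_lr` function can be evaluated standalone (it is repeated here so the snippet is self-contained; the small `stepsize` is just for illustration):

```python
import math

def cyclical_lr(stepsize, min_lr, max_lr):
    scaler = lambda x: 1.  # constant scaler -> plain triangular CLR

    def relative(it, stepsize):
        # Which cycle we are in, and where on the triangle within it
        cycle = math.floor(1 + it / (2 * stepsize))
        x = abs(it / stepsize - 2 * cycle + 1)
        return max(0, (1 - x)) * scaler(cycle)

    return lambda it: min_lr + (max_lr - min_lr) * relative(it, stepsize)

clr = cyclical_lr(stepsize=4, min_lr=1e-3, max_lr=1e-1)
# Rises linearly from min_lr at it=0 to max_lr at it=stepsize,
# then falls back to min_lr at it=2*stepsize, and repeats.
lrs = [clr(it) for it in range(9)]
print(lrs)
```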
Do you have any suggestions on how to plot the loss as a function of the learning rate? Ideally I would like to do it with TensorBoard; at the moment I am plotting the generator loss, the discriminator loss, and the size of the generated images as a function of the iteration number:

    if (i + 1) % 100 == 0:
        writer.add_scalar('Loss/G', gen_loss_val, i * args.batch.get(resolution))
        writer.add_scalar('Loss/D', disc_loss_val, i * args.batch.get(resolution))
        writer.add_scalar('Step/pixel_size', (4 * 2 ** step), i * args.batch.get(resolution))
        print(args.batch.get(resolution))
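One way to get the (LR, loss) pairs is to read the learning rate back from the scheduler via `get_last_lr()` (available in recent PyTorch versions) or directly from the optimizer's `param_groups`, and log it with an extra `add_scalar` call at the same step. A minimal standalone sketch, with a toy optimizer standing in for the ones in the training loop:

```python
import torch

# Toy model/optimizer/scheduler standing in for g_optimizer / scheduler_g above.
model = torch.nn.Linear(2, 2)
# Base lr of 1.0: LambdaLR multiplies the lambda's output by the base lr,
# so with lr=1.0 the lambda's value becomes the actual learning rate.
optimizer = torch.optim.SGD(model.parameters(), lr=1.0)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda it: 0.5)

# Two equivalent ways to read the current LR:
lr_from_scheduler = scheduler.get_last_lr()[0]       # value after the last .step()
lr_from_optimizer = optimizer.param_groups[0]['lr']  # what the optimizer will use

# In the logging block above, this could then be written next to the losses, e.g.:
# writer.add_scalar('LR/G', scheduler_g.get_last_lr()[0], i * args.batch.get(resolution))
# writer.add_scalar('LR/D', scheduler_d.get_last_lr()[0], i * args.batch.get(resolution))
```

Logging LR and loss at the same step lets you export both series from TensorBoard and plot loss against LR offline.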
