Bender Python Client



The bender client for Python

:warning: The full DOCUMENTATION on bender-python-client can be found HERE.

Setup

  1. Create an account at bender.dreem.com
  2. Install bender in your Python environment with pip install bender-client (a quick import check is sketched right after this list)
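
If you want a quick sanity check that the installation worked, here is a minimal sketch; the only assumption is that the package is importable as benderclient, which is the import path used in the examples below:

# Quick check that bender-client is installed and importable.
# We only import here; instantiating Bender() would already prompt
# for your bender.dreem.com credentials (see below).
from benderclient import Bender

print("bender-client imported successfully")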

Usage example

Let's use the famous MNIST example where we try to recognize handwritten digits in images.

The algorithm, written with PyTorch, looks like this:

To use this example, do not forget to pip install numpy torch torchvision.

from __future__ import print_function
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms


class Net(nn.Module):
    def __init__(self, dropout=True, activation="relu", kernel_size=5, conv_depth=10, linear_depth=50):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, conv_depth, kernel_size=kernel_size)
        self.conv2 = nn.Conv2d(conv_depth, 20, kernel_size=kernel_size)
        self.conv2_drop = nn.Dropout2d() if dropout is True else lambda x: x
        self.fc1 = nn.Linear(320, linear_depth)
        self.fc2 = nn.Linear(linear_depth, 10)
        self.activation = getattr(F, activation)

    def forward(self, x):
        x = self.activation(F.max_pool2d(self.conv1(x), 2))
        x = self.activation(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = self.activation(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)


def train(model, device, train_loader, optimizer, epoch):
    model.train()
    for batch_idx, (data, target) in enumerate(train_loader):
        data, target = data.to(device), target.to(device)
        optimizer.zero_grad()
        output = model(data)
        loss = F.nll_loss(output, target)
        loss.backward()
        optimizer.step()
        if batch_idx % 10 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))


def test(model, device, test_loader):
    model.eval()
    test_loss = 0
    correct = 0
    with torch.no_grad():
        for data, target in test_loader:
            data, target = data.to(device), target.to(device)
            output = model(data)
            test_loss += F.nll_loss(output, target, reduction='sum').item()
            pred = output.max(1, keepdim=True)[1]
            correct += pred.eq(target.view_as(pred)).sum().item()
    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))
    return correct / len(test_loader.dataset)


def run(epochs=3, lr=0.01, momentum=0.5, dropout=True, activation="relu",
        kernel_size=5, conv_depth=10, linear_depth=50):
    torch.manual_seed(1)
    device = torch.device("cpu")
    train_loader = torch.utils.data.DataLoader(
        datasets.MNIST('../data', train=True, download=True,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))])),
        batch_size=32, shuffle=True,
    )
    test_loader = torch.utils.data.DataLoader(
        datasets.MNIST('../data', train=False,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))])),
        batch_size=1000, shuffle=True,
    )
    model = Net(dropout, activation).to(device)
    optimizer = optim.SGD(model.parameters(), lr=lr, momentum=momentum)
    accuracy = 0
    for epoch in range(1, int(epochs) + 1):
        train(model, device, train_loader, optimizer, epoch)
        accuracy = test(model, device, test_loader)
    return accuracy


if __name__ == '__main__':
    # HYPERPARAMETERS (That's what bender is interested in)
    # Here we select them on our own in an arbitrary way
    hyperparameters = {
        "kernel_size": 5,
        "epochs": 3,
        "lr": 0.05,
        "momentum": 0.2,
        "dropout": True,
        "activation": "relu",
        "conv_depth": 10,
        "linear_depth": 50,
    }
    run(
        epochs=hyperparameters["epochs"],
        lr=hyperparameters["lr"],
        momentum=hyperparameters["momentum"],
        dropout=hyperparameters["dropout"],
        activation=hyperparameters["activation"],
        kernel_size=hyperparameters["kernel_size"],
        conv_depth=hyperparameters["conv_depth"],
        linear_depth=hyperparameters["linear_depth"],
    )

Now let's plug bender into it!

  1. Import Bender
from benderclient import Bender

bender = Bender()

This will ask for your email and password. The client uses these to log in and retrieve a TOKEN. This TOKEN is personal and should not be shared; it is stored in your home folder as ".bender_token", and you will not be asked for your login/password again until it expires. :warning: Again, your TOKEN is personal. You should not give it away or add it to any public repository :warning:
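
If you ever need to force a fresh login (for example on a shared machine), here is a minimal sketch; the only assumption is that the cached token lives in your home folder under the name ".bender_token", as described above:

# Check for a cached bender token and optionally remove it so that the
# next Bender() call asks for email/password again.
from pathlib import Path

token_path = Path.home() / ".bender_token"
if token_path.exists():
    print("Cached bender token found at", token_path)
    # token_path.unlink()  # uncomment to log out / force re-authentication
else:
    print("No cached token; Bender() will ask for your credentials")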

  2. Create an Experiment

An experiment is related to the problem you are trying to solve, here: MNIST classification.

bender.new_experiment(
    name='MNIST Classification',
    description='Simple image classification on handwritten digits',
    metrics=[{
        # It's just a name and there can be multiple watched metrics.
        "metric_name": "algorithm_accuracy",
        # The type can either be "reward" or "loss" depending on if you want to maximize or minimize it.
        "type": "reward",
    }],
    dataset='MNIST'
)
  3. Create an Algo

An algo simply corresponds to ONE solution to an Experiment's problem: here, as we saw, a neural net built with PyTorch.

bender.new_algo(
    name='PyTorch_NN',
    # The parameters below are actually the Hyper-Parameters of your algo, described in a list
    parameters=[
        {
            "name": 'kernel_size',
            "category": "categorical",
            "search_space": {"values": [3, 5, 7]},
        },
        {
            "name": 'conv_depth',
            "category": "uniform",
            "search_space": {"low": 1, "high": 100, "step": 1},
        },
        {
            "name": 'linear_depth',
            "category": "uniform",
            "search_space": {"low": 1, "high": 100, "step": 1},
        },
        {
            "name": 'epochs',
            "category": "uniform",
            "search_space": {"low": 1, "high": 4, "step": 1},
        },
        {
            "name": 'lr',
            "category": "loguniform",
            "search_space": {"low": 1e-5, "high": 1e-1, "step": 1e-6},
        },
        {
            "name": 'momentum',
            "category": "uniform",
            "search_space": {"low": 0, "high": 1, "step": 0.05},
        },
        {
            "name": 'dropout',
            "category": "categorical",
            "search_space": {"values": [True, False]},
        },
        {
            "name": 'activation',
            "category": "categorical",
            "search_space": {"values": ["relu", "softmax", "sigmoid", "tanh"]},
        },
    ]
)
  4. Get a Hyperparameter Set suggestion from bender

The whole point of the setup above is to let Bender suggest a new set of Hyperparameters to try, based on the settings of your Experiment and Algo.

suggestion = bender.suggest()

# suggestion would, for example, contain something like:
# {
#     "kernel_size": 5,
#     "epochs": 3,
#     "lr": 0.05,
#     "momentum": 0.2,
#     "dropout": True,
#     "activation": "tanh",
#     "conv_depth": 10,
#     "linear_depth": 50,
# }
  5. Feed a Trial to bender

A Trial is simply one run of your algorithm with a given Hyperparameter Set, together with the result metrics obtained. If you want bender to improve over time, feed him every trial you make.

bender.new_trial(
    parameters={
        "kernel_size": 5,
        "epochs": 3,
        "lr": 0.05,
        "momentum": 0.2,
        "dropout": True,
        "activation": "tanh",
        "conv_depth": 10,
        "linear_depth": 50,
    },
    results={
        "algorithm_accuracy": 0.7,  # We put an arbitrary value here just for the example.
    },
)
  6. All the code put together

Psssssst... The bender magic starts in init_bender() and the while True loop near the end... ;)

To use this example, do not forget to pip install numpy torch torchvision bender-client.

from __future__ import print_function
import argparse
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms
from benderclient import Bender


class Net(nn.Module):
    def __init__(self, dropout=True, activation="relu", kernel_size=5, conv_depth=10, linear_depth=50):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, conv_depth, kernel_size=kernel_size)
        self.conv2 = nn.Conv2d(conv_depth, 20, kernel_size=kernel_size)
        self.conv2_drop = nn.Dropout2d() if dropout is True else lambda x: x
        self.fc1 = nn.Linear(320, linear_depth)
        self.fc2 = nn.Linear(linear_depth, 10)
        self.activation = getattr(F, activation)

    def forward(self, x):
        x = self.activation(F.max_pool2d(self.conv1(x), 2))
        x = self.activation(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = self.activation(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)


def train(model, device, train_loader, optimizer, epoch):
    model.train()
    for batch_idx, (data, target) in enumerate(train_loader):
        data, target = data.to(device), target.to(device)
        optimizer.zero_grad()
        output = model(data)
        loss = F.nll_loss(output, target)
        loss.backward()
        optimizer.step()
        if batch_idx % 10 == 0:
            print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
                epoch, batch_idx * len(data), len(train_loader.dataset),
                100. * batch_idx / len(train_loader), loss.item()))


def test(model, device, test_loader):
    model.eval()
    test_loss = 0
    correct = 0
    with torch.no_grad():
        for data, target in test_loader:
            data, target = data.to(device), target.to(device)
            output = model(data)
            test_loss += F.nll_loss(output, target, reduction='sum').item()
            pred = output.max(1, keepdim=True)[1]
            correct += pred.eq(target.view_as(pred)).sum().item()
    test_loss /= len(test_loader.dataset)
    print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
        test_loss, correct, len(test_loader.dataset),
        100. * correct / len(test_loader.dataset)))
    return correct / len(test_loader.dataset)


def run(epochs=3, lr=0.01, momentum=0.5, dropout=True, activation="relu",
        kernel_size=5, conv_depth=10, linear_depth=50):
    torch.manual_seed(1)
    device = torch.device("cpu")
    train_loader = torch.utils.data.DataLoader(
        datasets.MNIST('../data', train=True, download=True,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))])),
        batch_size=32, shuffle=True,
    )
    test_loader = torch.utils.data.DataLoader(
        datasets.MNIST('../data', train=False,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))])),
        batch_size=1000, shuffle=True,
    )
    model = Net(dropout, activation).to(device)
    optimizer = optim.SGD(model.parameters(), lr=lr, momentum=momentum)
    accuracy = 0
    for epoch in range(1, int(epochs) + 1):
        train(model, device, train_loader, optimizer, epoch)
        accuracy = test(model, device, test_loader)
    return accuracy


def init_bender():
    bender = Bender()
    bender.create_experiment(
        name='MNIST Classification',
        description='Simple image classification on handwritten digits',
        metrics=[{"metric_name": "algorithm_accuracy", "type": "reward"}],
        dataset='MNIST'
    )
    bender.create_algo(
        name='PyTorch_NN',
        hyperparameters=[
            {
                "name": 'kernel_size',
                "category": "categorical",
                "search_space": {"values": [3, 5, 7]},
            },
            {
                "name": 'conv_depth',
                "category": "uniform",
                "search_space": {"low": 1, "high": 100, "step": 1},
            },
            {
                "name": 'linear_depth',
                "category": "uniform",
                "search_space": {"low": 1, "high": 100, "step": 1},
            },
            {
                "name": 'epochs',
                "category": "uniform",
                "search_space": {"low": 1, "high": 4, "step": 1},
            },
            {
                "name": 'lr',
                "category": "loguniform",
                "search_space": {"low": 1e-5, "high": 1e-1, "step": 1e-6},
            },
            {
                "name": 'momentum',
                "category": "uniform",
                "search_space": {"low": 0, "high": 1, "step": 0.05},
            },
            {
                "name": 'dropout',
                "category": "categorical",
                "search_space": {"values": [True, False]},
            },
            {
                "name": 'activation',
                "category": "categorical",
                "search_space": {"values": ["relu", "softmax", "sigmoid", "tanh"]},
            },
        ]
    )
    return bender


if __name__ == '__main__':
    # Create experiment and algo if they don't exist yet. Else, load them from the config file ./.benderconf
    bender = init_bender()

    while True:
        # Get a set of Hyperparameters to test
        suggestion = bender.suggest(metric="algorithm_accuracy")

        # Get algo result with them
        result = run(
            epochs=suggestion["epochs"],
            lr=suggestion["lr"],
            momentum=suggestion["momentum"],
            dropout=suggestion["dropout"],
            activation=suggestion["activation"],
            kernel_size=suggestion["kernel_size"],
            conv_depth=suggestion["conv_depth"],
            linear_depth=suggestion["linear_depth"],
        )

        # Feed Bender a Trial, AKA => suggestion + result
        bender.create_trial(hyperparameters=suggestion, results={"algorithm_accuracy": result})
        print('New trial sent -----------------------------------------------------\n\n')
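
The script above loops forever. As a hypothetical variation (not part of the original example; it reuses run() and init_bender() from the script above and an arbitrary trial budget), you could stop after a fixed number of trials:

# Hypothetical variation: a bounded optimization loop instead of `while True`.
N_TRIALS = 20  # arbitrary budget, chosen only for illustration

bender = init_bender()
for _ in range(N_TRIALS):
    suggestion = bender.suggest(metric="algorithm_accuracy")
    # The suggestion contains exactly the hyperparameters declared in
    # init_bender(), so it can be unpacked straight into run().
    result = run(**suggestion)
    bender.create_trial(hyperparameters=suggestion, results={"algorithm_accuracy": result})
print("Done: {} trials sent to bender".format(N_TRIALS))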
