TabNet for fastai

This is an adaptation of TabNet (an attention-based network for tabular data) for the fastai (>=2.0) library. Original paper: https://arxiv.org/pdf/1908.07442.pdf

Installation

pip install fast_tabnet

How to use

model = TabNetModel(emb_szs, n_cont, out_sz, embed_p=0., y_range=None, n_d=8, n_a=8, n_steps=3, gamma=1.5, n_independent=2, n_shared=2, epsilon=1e-15, virtual_batch_size=128, momentum=0.02)

The parameters emb_szs, n_cont, out_sz, embed_p, and y_range are the same as in fastai's TabularModel.

  • n_d: int, dimension of the prediction layer (usually between 4 and 64)
  • n_a: int, dimension of the attention layer (usually between 4 and 64)
  • n_steps: int, number of successive steps in the network (usually between 3 and 10)
  • gamma: float, above 1, scaling factor for attention updates (usually between 1.0 and 2.0)
  • momentum: float, between 0 and 1, momentum for all batch norm layers
  • n_independent: int, number of independent GLU layers in each GLU block (default 2)
  • n_shared: int, number of shared GLU layers in each GLU block (default 2)
  • epsilon: float, guards against log(0); should be kept very low
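
As a quick sanity check of these arguments, here is a minimal forward pass with toy sizes. This is illustrative only; the shapes, values, and the (x_cat, x_cont) call convention are assumptions following fastai's TabularModel, not taken from the original README.

import torch

# two categorical variables as (cardinality, embedding size), three continuous ones
emb_szs = [(10, 5), (7, 4)]
model = TabNetModel(emb_szs, 3, 2, n_d=8, n_a=8, n_steps=3)

x_cat = torch.randint(0, 7, (32, 2))   # batch of 32 rows, 2 categorical columns
x_cont = torch.randn(32, 3)            # 3 continuous columns
print(model(x_cat, x_cont).shape)      # expected: torch.Size([32, 2])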

Example

Below is an example from the fastai library, but with TabNet as the model.

from fastai.basics import *
from fastai.tabular.all import *
from fast_tabnet.core import *
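The code block that loaded the data did not survive extraction. Judging from the table below and the later references to df_main and df_test, it was the standard fastai ADULT_SAMPLE setup, roughly (a hedged reconstruction, not the verbatim original):

# likely reconstruction: load the adult sample shipped with fastai
path = untar_data(URLs.ADULT_SAMPLE)
df = pd.read_csv(path/'adult.csv')
df_main, df_test = df.iloc[:-1000].copy(), df.iloc[-1000:].copy()
df_main.head()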
   age  workclass         fnlwgt  education    education-num  marital-status      occupation       relationship   race                sex     capital-gain  capital-loss  hours-per-week  native-country  salary
0  49   Private           101320  Assoc-acdm   12.0           Married-civ-spouse  NaN              Wife           White               Female  0             1902          40              United-States   >=50k
1  44   Private           236746  Masters      14.0           Divorced            Exec-managerial  Not-in-family  White               Male    10520         0             45              United-States   >=50k
2  38   Private           96185   HS-grad      NaN            Divorced            NaN              Unmarried      Black               Female  0             0             32              United-States   <50k
3  38   Self-emp-inc      112847  Prof-school  15.0           Married-civ-spouse  Prof-specialty   Husband        Asian-Pac-Islander  Male    0             0             40              United-States   >=50k
4  42   Self-emp-not-inc  82297   7th-8th      NaN            Married-civ-spouse  Other-service    Wife           Black               Female  0             0             50              United-States   <50k
cat_names = ['workclass', 'education', 'marital-status', 'occupation', 'relationship', 'race', 'native-country', 'sex']
cont_names = ['age', 'fnlwgt', 'education-num']
procs = [Categorify, FillMissing, Normalize]
splits = RandomSplitter()(range_of(df_main))

to = TabularPandas(df_main, procs, cat_names, cont_names, y_names="salary", y_block=CategoryBlock(), splits=splits)
dls = to.dataloaders(bs=32)
dls.valid.show_batch()
to_tst = to.new(df_test)
to_tst.process()
to_tst.all_cols.head()
       workclass  education  marital-status  occupation  relationship  race  native-country  sex  education-num_na  age        fnlwgt     education-num  salary
31561  5          2          5               9           3             3     40              2    1                 -1.505833  -0.559418  -1.202170      0
31562  2          12         5               2           5             3     40              1    1                 -1.432653   0.421241  -0.418032      0
31563  5          7          3               4           1             5     40              2    1                 -0.115406   0.132868  -1.986307      0
31564  8          12         3               9           1             5     40              2    1                  1.494561   0.749805  -0.418032      0
31565  1          12         1               1           5             3     40              2    1                 -0.481308   7.529798  -0.418032      0
emb_szs = get_emb_sz(to)

With the data prepared, we can now create the model:

model = TabNetModel(emb_szs, len(to.cont_names), dls.c, n_d=8, n_a=8, n_steps=5, mask_type='entmax')
learn = Learner(dls, model, CrossEntropyLossFlat(), opt_func=Adam, lr=3e-2, metrics=[accuracy])
learn.lr_find()
SuggestedLRs(lr_min=0.2754228591918945, lr_steep=1.9054607491852948e-06)

(learning rate finder plot)

learn.fit_one_cycle(5)
epoch  train_loss  valid_loss  accuracy  time
0      0.446274    0.414451    0.817015  00:30
1      0.380002    0.393030    0.818916  00:30
2      0.371149    0.359802    0.832066  00:30
3      0.349027    0.352255    0.835868  00:30
4      0.355339    0.349360    0.836819  00:30

TabNet interpretability

# feature importance for 2k rows
dl = learn.dls.test_dl(df.iloc[:2000], bs=256)
feature_importances = tabnet_feature_importances(learn.model, dl)

# per sample interpretation
dl = learn.dls.test_dl(df.iloc[:20], bs=20)
res_explain, res_masks = tabnet_explain(learn.model, dl)

plt.xticks(rotation='vertical')
plt.bar(dl.x_names, feature_importances, color='g')
plt.show()

(feature importance bar chart)

def plot_explain(masks, lbls, figsize=(12, 12)):
    "Plots masks with `lbls` (`dls.x_names`)"
    fig = plt.figure(figsize=figsize)
    ax = fig.add_axes([0.1, 0.1, 0.8, 0.8])
    plt.yticks(np.arange(0, len(masks), 1.0))
    plt.xticks(np.arange(0, len(masks[0]), 1.0))
    ax.set_xticklabels(lbls, rotation=90)
    plt.ylabel('Sample Number')
    plt.xlabel('Variable')
    plt.imshow(masks)

plot_explain(res_explain, dl.x_names)

(per-sample mask heatmap)
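
res_explain aggregates the attention over all decision steps; res_masks holds each step separately. If, as in pytorch_tabnet's explain, it is a dict keyed by step index (an assumption about the return type), each step's mask can be plotted the same way:

# assumption: res_masks maps step index -> mask matrix, as in pytorch_tabnet
for step, mask in res_masks.items():
    plot_explain(mask, dl.x_names)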

Hyperparameter search with Bayesian optimization

If your dataset isn't very large, you can tune the hyperparameters of the tabular model with Bayesian optimization. If your metric is sensitive enough, you can optimize it directly with this approach (that isn't the case in our example, so we use validation loss instead). You should also create a second validation set, because the first one effectively becomes a training set for the Bayesian optimization.

You may need to install the optimizer: pip install bayesian-optimization

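A setup block was lost here. Since the function below uses lru_cache and opt_func, it presumably contained something along these lines (an assumption, not the original code):

from functools import lru_cache

opt_func = Adam  # assumed: the fastai optimizer function passed to the Learner below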
# The function we'll optimize
@lru_cache(1000)
def get_accuracy(n_d: Int, n_a: Int, n_steps: Int):
    model = TabNetModel(emb_szs, len(to.cont_names), dls.c, n_d=n_d, n_a=n_a, n_steps=n_steps, gamma=1.5)
    learn = Learner(dls, model, CrossEntropyLossFlat(), opt_func=opt_func, lr=3e-2, metrics=[accuracy])
    learn.fit_one_cycle(5)
    return float(learn.validate(dl=learn.dls.valid)[1])

This implementation of Bayesian optimization doesn't handle discrete values natively. That's why we use the wrapper below together with lru_cache: the continuous powers collapse to a small set of integer arguments, and repeated configurations are not retrained.

def fit_accuracy(pow_n_d, pow_n_a, pow_n_steps):
    n_d, n_a, n_steps = map(lambda x: 2**int(x), (pow_n_d, pow_n_a, pow_n_steps))
    return get_accuracy(n_d, n_a, n_steps)

from bayes_opt import BayesianOptimization

# Bounded region of parameter space
pbounds = {'pow_n_d': (0, 8), 'pow_n_a': (0, 8), 'pow_n_steps': (0, 4)}

optimizer = BayesianOptimization(f=fit_accuracy, pbounds=pbounds)

optimizer.maximize(init_points=15, n_iter=100)
(target is the validation accuracy returned by fit_accuracy. Each iteration also printed a five-epoch training table like the one above; those are omitted here along with the ANSI color codes, and iterations whose rows were lost in extraction are skipped.)

|  iter  |  target  |  pow_n_a  |  pow_n_d  |  pow_n_steps  |
-------------------------------------------------------------
|  1     |  0.8235  |  0.9408   |  1.898    |  1.652        |
|  2     |  0.8273  |  4.262    |  5.604    |  0.2437       |
|  3     |  0.8172  |  7.233    |  6.471    |  2.508        |
|  4     |  0.8311  |  5.935    |  1.241    |  0.3809       |
|  6     |  0.8189  |  4.592    |  2.138    |  2.824        |
|  7     |  0.8058  |  6.186    |  7.016    |  3.316        |
|  8     |  0.7616  |  2.018    |  1.316    |  3.675        |
|  10    |  0.7983  |  5.203    |  7.719    |  3.407        |
|  11    |  0.8308  |  6.048    |  4.376    |  0.08141      |
|  12    |  0.8161  |  7.083    |  1.385    |  2.806        |
|  13    |  0.8238  |  4.812    |  3.785    |  1.396        |
|  14    |  0.8172  |  7.672    |  6.719    |  2.72         |
|  15    |  0.8215  |  6.464    |  7.954    |  2.647        |
|  16    |  0.8259  |  0.1229   |  7.83     |  0.3708       |
|  17    |  0.8259  |  0.03098  |  3.326    |  0.007025     |
|  18    |  0.8294  |  7.81     |  7.976    |  0.0194       |
|  19    |  0.2489  |  0.4499   |  0.138    |  0.001101     |
|  20    |  0.8221  |  0.0      |  6.575    |  4.0          |
|  21    |  0.2489  |  8.0      |  0.0      |  0.0          |
|  22    |  0.8251  |  0.0      |  4.502    |  2.193        |
|  23    |  0.789   |  8.0      |  3.702    |  4.0          |
|  24    |  0.7549  |  6.009    |  0.0      |  4.0          |
|  25    |  0.8292  |  3.522    |  8.0      |  0.0          |
|  27    |  0.7983  |  0.0      |  0.0      |  4.0          |
|  31    |  0.8016  |  8.0      |  8.0      |  4.0          |
|  32    |  0.8294  |  5.864    |  8.0      |  0.0          |
|  33    |  0.8208  |  1.776    |  8.0      |  2.212        |
|  34    |  0.8289  |  5.777    |  2.2      |  1.31         |
|  35    |  0.8016  |  2.748    |  5.915    |  4.0          |
|  37    |  0.821   |  5.093    |  0.172    |  1.64         |
|  38    |  0.8243  |  8.0      |  5.799    |  0.0          |
|  39    |  0.8278  |  1.62     |  3.832    |  1.151        |
|  40    |  0.7966  |  2.198    |  8.0      |  4.0          |
|  41    |  0.7641  |  8.0      |  1.03     |  4.0          |
|  42    |  0.8232  |  0.0      |  2.504    |  2.135        |
|  43    |  0.8294  |  0.0      |  5.441    |  0.0          |
|  44    |  0.8276  |  4.636    |  1.476    |  0.0          |
|  45    |  0.8262  |  0.0      |  7.071    |  2.071        |
|  47    |  0.8279  |  6.579    |  6.485    |  0.0          |
|  48    |  0.8265  |  8.0      |  4.293    |  1.74         |
|  49    |  0.8308  |  7.909    |  7.827    |  1.323        |
|  50    |  0.8303  |  4.946    |  1.246    |  1.589        |
|  51    |  0.8314  |  5.664    |  2.626    |  0.003048     |
|  54    |  0.82    |  4.579    |  5.017    |  2.928        |
|  55    |  0.8259  |  0.02565  |  3.699    |  0.9808       |
|  56    |  0.8015  |  1.927    |  5.92     |  2.53         |
|  57    |  0.8197  |  0.7796   |  4.576    |  3.952        |
|  58    |  0.8297  |  3.525    |  4.198    |  0.02314      |
|  59    |  0.8194  |  6.711    |  3.848    |  2.395        |
|  61    |  0.8279  |  7.962    |  6.151    |  1.119        |
|  66    |  0.7836  |  3.68     |  3.977    |  3.919        |
|  67    |  0.8131  |  5.907    |  0.9452   |  2.168        |
|  70    |  0.8254  |  0.009375 |  5.081    |  3.79         |
|  79    |  0.8264  |  3.438    |  7.982    |  1.829        |
|  81    |  0.8238  |  0.03221  |  1.306    |  3.909        |
|  82    |  0.8306  |  1.575    |  2.689    |  0.8684       |
|  83    |  0.831   |  2.765    |  5.439    |  0.04047      |
|  84    |  0.8253  |  0.1961   |  4.123    |  0.02039      |
|  85    |  0.8213  |  7.937    |  7.939    |  2.895        |
|  86    |  0.8235  |  0.06921  |  5.7      |  2.778        |
|  87    |  0.8184  |  7.965    |  5.261    |  2.661        |

The best result observed was target 0.8314 at iteration 51 (pow_n_a = 5.664, pow_n_d = 2.626, pow_n_steps = 0.003048, i.e. n_a = 32, n_d = 4, n_steps = 1).
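
To recover the best configuration programmatically (not shown in the original run), bayes_opt keeps it on optimizer.max; converting the powers back gives the actual TabNet arguments:

best = optimizer.max  # e.g. {'target': 0.8314, 'params': {'pow_n_a': 5.664, ...}}
n_d, n_a, n_steps = (2**int(best['params'][k]) for k in ('pow_n_d', 'pow_n_a', 'pow_n_steps'))
print(best['target'], n_d, n_a, n_steps)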

Datasets that don't fit in memory

If your dataset is too large to fit in memory, you can load one chunk of it per epoch.

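The original code for this section was lost in extraction. Below is a minimal sketch of the idea, reusing the procs fitted earlier via to.new; the file name, chunk size, and batch size are hypothetical:

import pandas as pd

# hypothetical file; one 100k-row chunk is read per epoch
chunk_iter = pd.read_csv('large_dataset.csv', chunksize=100_000)

for epoch_df in chunk_iter:
    to_chunk = to.new(epoch_df)          # reuse the preprocessing fitted on df_main
    to_chunk.process()
    learn.dls = to_chunk.dataloaders(bs=1024)
    learn.fit_one_cycle(1)               # one epoch on this chunk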
