deepctr.models.autoint module

Author:
Weichen Shen, wcshen1994@163.com
Reference:
[1] Song W, Shi C, Xiao Z, et al. AutoInt: Automatic Feature Interaction Learning via Self-Attentive Neural Networks[J]. arXiv preprint arXiv:1810.11921, 2018. (https://arxiv.org/abs/1810.11921)
deepctr.models.autoint.AutoInt(feature_dim_dict, embedding_size=8, att_layer_num=3, att_embedding_size=8, att_head_num=2, att_res=True, hidden_size=(256, 256), activation='relu', l2_reg_deep=0, l2_reg_embedding=1e-05, use_bn=False, keep_prob=1.0, init_std=0.0001, seed=1024, final_activation='sigmoid')[source]

Instantiates the AutoInt Network architecture.

Parameters:
  • feature_dim_dict – dict indicating the sparse and dense fields, e.g. {'sparse': {'field_1': 4, 'field_2': 3, 'field_3': 2}, 'dense': ['field_4', 'field_5']}
  • embedding_size – positive integer, embedding size of the sparse features
  • att_layer_num – int. The number of InteractingLayer blocks to use.
  • att_embedding_size – int. The embedding size in the multi-head self-attention network.
  • att_head_num – int. The number of heads in the multi-head self-attention network.
  • att_res – bool. Whether to use standard residual connections before the output.
  • hidden_size – list of positive integers, or an empty list; the number of layers and the units in each layer of the deep net
  • activation – Activation function to use in the deep net
  • l2_reg_deep – float. L2 regularizer strength applied to the deep net
  • l2_reg_embedding – float. L2 regularizer strength applied to the embedding vectors
  • use_bn – bool. Whether to use BatchNormalization before activation in the deep net
  • keep_prob – float in (0,1]. Keep probability of dropout used in the deep net
  • init_std – float, standard deviation used to initialize the embedding vectors
  • seed – integer, to use as the random seed
  • final_activation – output activation, usually 'sigmoid' or 'linear'
Returns:

A Keras model instance.
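
Example (a minimal usage sketch; the field names, vocabulary sizes, and hyperparameter values below are hypothetical, and the exact format of the training inputs may differ between DeepCTR versions):

    from deepctr.models import AutoInt

    # Vocabulary sizes of the sparse fields and names of the dense fields,
    # following the feature_dim_dict structure described above.
    feature_dim_dict = {
        'sparse': {'field_1': 4, 'field_2': 3, 'field_3': 2},
        'dense': ['field_4', 'field_5'],
    }

    model = AutoInt(feature_dim_dict,
                    embedding_size=8,
                    att_layer_num=3,
                    att_embedding_size=8,
                    att_head_num=2,
                    att_res=True,
                    hidden_size=(256, 256),
                    final_activation='sigmoid')

    # Standard Keras workflow from here on; for model.fit the inputs are
    # typically a list with one array per field (check the docs of your
    # DeepCTR version for the exact input ordering).
    model.compile('adam', 'binary_crossentropy', metrics=['binary_crossentropy'])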