deepctr.models.afm module

Author:
Weichen Shen, wcshen1994@163.com
Reference:
[1] Xiao J, Ye H, He X, et al. Attentional factorization machines: Learning the weight of feature interactions via attention networks[J]. arXiv preprint arXiv:1708.04617, 2017. (https://arxiv.org/abs/1708.04617)
deepctr.models.afm.AFM(feature_dim_dict, embedding_size=8, use_attention=True, attention_factor=8, l2_reg_linear=1e-05, l2_reg_embedding=1e-05, l2_reg_att=1e-05, keep_prob=1.0, init_std=0.0001, seed=1024, final_activation='sigmoid')

Instantiates the Attentional Factorization Machine architecture.

Parameters:
  • feature_dim_dict – dict indicating the sparse and dense fields, e.g. {'sparse': {'field_1': 4, 'field_2': 3, 'field_3': 2}, 'dense': ['field_4', 'field_5']}
  • embedding_size – positive integer, embedding size of the sparse features
  • use_attention – bool, whether to use attention; if set to False, the model is equivalent to a standard Factorization Machine
  • attention_factor – positive integer, number of units in the attention network
  • l2_reg_linear – float, L2 regularizer strength applied to the linear part
  • l2_reg_embedding – float, L2 regularizer strength applied to the embedding vectors
  • l2_reg_att – float, L2 regularizer strength applied to the attention network
  • keep_prob – float in (0, 1], keep probability applied after the attention network
  • init_std – float, standard deviation used to initialize the embedding vectors
  • seed – integer, random seed
  • final_activation – str, output activation, usually 'sigmoid' or 'linear'
Returns:

A Keras model instance.
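
Example:

A minimal usage sketch based on the signature documented above (it assumes the deepctr package and its Keras dependency are installed; the field names and dimensions are illustrative):

    from deepctr.models import AFM

    # Sparse fields map feature names to vocabulary sizes;
    # dense fields are listed by name.
    feature_dim_dict = {
        'sparse': {'field_1': 4, 'field_2': 3, 'field_3': 2},
        'dense': ['field_4', 'field_5'],
    }

    # Build the AFM model; with use_attention=False it would reduce
    # to a standard Factorization Machine.
    model = AFM(feature_dim_dict, embedding_size=8, use_attention=True,
                attention_factor=8, final_activation='sigmoid')
    model.compile('adam', 'binary_crossentropy')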