deepctr.models.xdeepfm module

Author:
Weichen Shen, weichenswc@163.com
Reference:
[1] Lian J, Zhou X, Zhang F, et al. xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems[J]. arXiv preprint arXiv:1803.05170, 2018. (https://arxiv.org/pdf/1803.05170.pdf)
deepctr.models.xdeepfm.xDeepFM(linear_feature_columns, dnn_feature_columns, dnn_hidden_units=(256, 256), cin_layer_size=(128, 128), cin_split_half=True, cin_activation='relu', l2_reg_linear=1e-05, l2_reg_embedding=1e-05, l2_reg_dnn=0, l2_reg_cin=0, seed=1024, dnn_dropout=0, dnn_activation='relu', dnn_use_bn=False, task='binary')[source]

Instantiates the xDeepFM architecture.

Parameters:
  • linear_feature_columns – An iterable containing all the features used by the linear part of the model.
  • dnn_feature_columns – An iterable containing all the features used by the deep part of the model.
  • dnn_hidden_units – list of positive integers or empty list, the number of layers and units in each layer of the deep net.
  • cin_layer_size – list of positive integers or empty list, the number of feature maps in each hidden layer of the Compressed Interaction Network (CIN).
  • cin_split_half – bool. If set to True, half of the feature maps in each hidden layer will connect to the output unit.
  • cin_activation – Activation function used on the CIN feature maps.
  • l2_reg_linear – float. L2 regularizer strength applied to the linear part.
  • l2_reg_embedding – float. L2 regularizer strength applied to the embedding vectors.
  • l2_reg_dnn – float. L2 regularizer strength applied to the deep net.
  • l2_reg_cin – float. L2 regularizer strength applied to the CIN.
  • seed – integer, to use as random seed.
  • dnn_dropout – float in [0,1), the probability of dropping out a given DNN coordinate.
  • dnn_activation – Activation function to use in the DNN.
  • dnn_use_bn – bool. Whether to use BatchNormalization before activation in the DNN.
  • task – str, "binary" for binary log loss or "regression" for regression loss.
Returns:

A Keras model instance.
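
A minimal usage sketch, not part of the original page: it assumes SparseFeat and DenseFeat are imported from deepctr.feature_column (the import path differs across DeepCTR versions), and the feature names, vocabulary sizes, and random toy data below are illustrative only.

    import numpy as np
    from deepctr.feature_column import SparseFeat, DenseFeat
    from deepctr.models import xDeepFM

    # Hypothetical toy features: two sparse (categorical) fields and one dense field.
    sparse_features = ['user_id', 'item_id']
    dense_features = ['price']

    feature_columns = [SparseFeat(name, vocabulary_size=100, embedding_dim=8)
                       for name in sparse_features]
    feature_columns += [DenseFeat(name, 1) for name in dense_features]

    # The same columns are commonly passed to both the linear and the deep part.
    model = xDeepFM(linear_feature_columns=feature_columns,
                    dnn_feature_columns=feature_columns,
                    dnn_hidden_units=(256, 256),
                    cin_layer_size=(128, 128),
                    task='binary')

    model.compile('adam', 'binary_crossentropy', metrics=['AUC'])

    # DeepCTR models take a dict of arrays keyed by feature name.
    n = 256
    model_input = {
        'user_id': np.random.randint(0, 100, n),
        'item_id': np.random.randint(0, 100, n),
        'price': np.random.random(n),
    }
    labels = np.random.randint(0, 2, n)
    model.fit(model_input, labels, batch_size=64, epochs=1, verbose=0)

The returned object is an ordinary Keras model, so it can be compiled, trained, saved, and evaluated with the standard Keras workflow.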