deepctr.models.sequence.bst module

Author:
Zichao Li, 2843656167@qq.com
Reference:
Qiwei Chen, Huan Zhao, Wei Li, Pipei Huang, and Wenwu Ou. 2019. Behavior sequence transformer for e-commerce recommendation in Alibaba. In Proceedings of the 1st International Workshop on Deep Learning Practice for High-Dimensional Sparse Data (DLP-KDD '19). Association for Computing Machinery, New York, NY, USA, Article 12, 1–4. DOI: https://doi.org/10.1145/3326937.3341261
deepctr.models.sequence.bst.BST(dnn_feature_columns, history_feature_list, transformer_num=1, att_head_num=8, use_bn=False, dnn_hidden_units=(256, 128, 64), dnn_activation='relu', l2_reg_dnn=0, l2_reg_embedding=1e-06, dnn_dropout=0.0, seed=1024, task='binary')

Instantiates the BST architecture.

Parameters:
  • dnn_feature_columns – An iterable containing all the features used by the deep part of the model.
  • history_feature_list – list, field names of the sparse features that form the user behavior sequence.
  • transformer_num – int, the number of transformer layers.
  • att_head_num – int, the number of heads in multi-head self-attention.
  • use_bn – bool. Whether to use BatchNormalization before activation in the deep net.
  • dnn_hidden_units – list of positive integers or empty list, the layer number and units in each layer of the DNN.
  • dnn_activation – Activation function to use in the DNN.
  • l2_reg_dnn – float. L2 regularizer strength applied to the DNN.
  • l2_reg_embedding – float. L2 regularizer strength applied to the embedding vectors.
  • dnn_dropout – float in [0,1), the probability of dropping out a given DNN coordinate.
  • seed – integer, to use as random seed.
  • task – str, "binary" for binary logloss or "regression" for regression loss
Returns:

A Keras model instance.
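
Example

A minimal usage sketch. The feature names (user, item_id, pay_score, hist_item_id, seq_length), vocabulary sizes, and toy data below are illustrative placeholders, not part of this API; the feature-column helpers (SparseFeat, VarLenSparseFeat, DenseFeat, get_feature_names) come from deepctr.feature_column, and the behavior-sequence feature is assumed to follow the "hist_" + base-field-name convention used elsewhere in DeepCTR.

import numpy as np
from deepctr.models import BST
from deepctr.feature_column import SparseFeat, VarLenSparseFeat, DenseFeat, get_feature_names

# Illustrative feature definitions: a user id, an item id, a dense score,
# and the user's padded click sequence over items (maxlen=4).
feature_columns = [
    SparseFeat('user', vocabulary_size=3, embedding_dim=8),
    SparseFeat('item_id', vocabulary_size=4, embedding_dim=8),
    DenseFeat('pay_score', 1),
    VarLenSparseFeat(
        SparseFeat('hist_item_id', vocabulary_size=4, embedding_dim=8,
                   embedding_name='item_id'),  # share the embedding table with 'item_id'
        maxlen=4, length_name='seq_length'),
]
# history_feature_list names the base sparse fields; the matching sequence
# features carry the 'hist_' prefix (assumption following DeepCTR examples).
behavior_feature_list = ['item_id']

# Toy inputs: three samples, sequences padded with 0 to maxlen=4.
feature_dict = {
    'user': np.array([0, 1, 2]),
    'item_id': np.array([1, 2, 3]),
    'pay_score': np.array([0.1, 0.2, 0.3]),
    'hist_item_id': np.array([[1, 2, 3, 0], [1, 2, 3, 0], [1, 2, 0, 0]]),
    'seq_length': np.array([3, 3, 2]),
}
x = {name: feature_dict[name] for name in get_feature_names(feature_columns)}
y = np.array([1, 0, 1])

# att_head_num is chosen so it divides the sequence embedding dimension (8 here),
# which the multi-head self-attention layer generally requires.
model = BST(feature_columns, behavior_feature_list,
            transformer_num=1, att_head_num=4)
model.compile('adam', 'binary_crossentropy', metrics=['binary_crossentropy'])
history = model.fit(x, y, batch_size=3, epochs=1, verbose=1)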