deepctr.models.multitask.mmoe module
- Author:
Mincai Lai, laimc@shanghaitech.edu.cn
Weichen Shen, weichenswc@163.com
- Reference:
- [1] Ma J, Zhao Z, Yi X, et al. Modeling task relationships in multi-task learning with multi-gate mixture-of-experts[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2018.(https://dl.acm.org/doi/abs/10.1145/3219819.3220007)
deepctr.models.multitask.mmoe.MMOE(dnn_feature_columns, num_experts=3, expert_dnn_hidden_units=(256, 128), tower_dnn_hidden_units=(64,), gate_dnn_hidden_units=(), l2_reg_embedding=1e-05, l2_reg_dnn=0, seed=1024, dnn_dropout=0, dnn_activation='relu', dnn_use_bn=False, task_types=('binary', 'binary'), task_names=('ctr', 'ctcvr'))

Instantiates the Multi-gate Mixture-of-Experts multi-task learning architecture.
Parameters: - dnn_feature_columns – An iterable containing all the features used by the deep part of the model.
- num_experts – integer, number of experts.
- expert_dnn_hidden_units – list of positive integers (or empty list), the number of layers and the units in each layer of the expert DNN.
- tower_dnn_hidden_units – list of positive integers (or empty list), the number of layers and the units in each layer of the task-specific tower DNN.
- gate_dnn_hidden_units – list of positive integers (or empty list), the number of layers and the units in each layer of the gate DNN.
- l2_reg_embedding – float, L2 regularizer strength applied to embedding vectors.
- l2_reg_dnn – float, L2 regularizer strength applied to the DNN.
- seed – integer, to use as random seed.
- dnn_dropout – float in [0,1), the probability we will drop out a given DNN coordinate.
- dnn_activation – activation function to use in the DNN.
- dnn_use_bn – bool, whether to use BatchNormalization before activation in the DNN.
- task_types – list of str, indicating the loss of each task: "binary" for binary logloss, "regression" for regression loss. e.g. ['binary', 'regression']
- task_names – list of str, indicating the prediction target of each task.
Returns: a Keras model instance
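To make the multi-gate weighting concrete, below is a minimal pure-Python sketch of the core MMoE computation, independent of the Keras implementation above: each task has its own gate, which produces softmax weights over the shared experts, and the task representation fed to its tower is the weighted sum of the expert outputs. The function names (`softmax`, `mmoe_forward`) and the toy experts/gates are illustrative, not part of the deepctr API.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of raw gate scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def mmoe_forward(x, experts, gates):
    """Mix shared expert outputs per task with task-specific gates.

    x       : input feature vector (list of floats)
    experts : list of callables, each mapping x -> a shared representation
    gates   : one callable per task, mapping x -> one raw score per expert
    Returns one mixed representation per task (input to that task's tower).
    """
    expert_outs = [e(x) for e in experts]          # num_experts vectors
    task_reprs = []
    for gate in gates:
        weights = softmax(gate(x))                 # one weight per expert
        mixed = [
            sum(w * out[i] for w, out in zip(weights, expert_outs))
            for i in range(len(expert_outs[0]))
        ]
        task_reprs.append(mixed)
    return task_reprs

# Toy example: two 1-d "experts" and two task gates on a 2-d input.
experts = [lambda x: [x[0] + x[1]], lambda x: [x[0] - x[1]]]
gates = [lambda x: [1.0, 0.0], lambda x: [0.0, 1.0]]
ctr_repr, ctcvr_repr = mmoe_forward([3.0, 1.0], experts, gates)
```

In the real model, both the experts and the gates are DNNs (configured by `expert_dnn_hidden_units` and `gate_dnn_hidden_units`), and each task's mixed representation is passed through its own tower DNN (`tower_dnn_hidden_units`) to produce that task's prediction.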