deepctr.activations module

class deepctr.activations.Dice(axis=-1, epsilon=1e-09, **kwargs)[source]

The Data Adaptive Activation Function in DIN, which can be viewed as a generalization of PReLU and can adaptively adjust the rectified point according to the distribution of the input data.

Input shape
  • Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape
  • Same shape as the input.
Arguments
  • axis : Integer, the axis that should be used to compute data distribution (typically the features axis).
  • epsilon : Small float added to variance to avoid dividing by zero.
References
  • [Zhou G, Zhu X, Song C, et al. Deep interest network for click-through rate prediction[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 2018: 1059-1068.](https://arxiv.org/pdf/1706.06978.pdf)
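A minimal usage sketch (not part of the library documentation), assuming a TensorFlow 2 / tf.keras setup, showing Dice applied to the output of a Dense layer:

```python
import tensorflow as tf
from deepctr.activations import Dice

inputs = tf.keras.Input(shape=(16,))
hidden = tf.keras.layers.Dense(8)(inputs)        # pre-activation values
activated = Dice(axis=-1, epsilon=1e-9)(hidden)  # adaptively rectified output
model = tf.keras.Model(inputs, activated)

x = tf.random.normal((4, 16))
print(model(x).shape)                            # (4, 8): same shape as the input to Dice
```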
build(input_shape)[source]

Creates the variables of the layer.
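For Dice in particular, the variables typically amount to a parameter-free batch standardization plus a learnable per-feature slope (as in PReLU). The following is a hedged simplification, not the library source:

```python
def build(self, input_shape):
    # Standardize the input along `axis` without learnable center/scale,
    # so only the batch statistics shift the rectified point.
    self.bn = tf.keras.layers.BatchNormalization(
        axis=self.axis, epsilon=self.epsilon, center=False, scale=False)
    # One learnable slope per feature for the negative branch, as in PReLU.
    self.alphas = self.add_weight(
        shape=(input_shape[-1],), initializer='zeros', name='dice_alpha')
    super(Dice, self).build(input_shape)
```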

call(inputs, **kwargs)[source]

This is where the layer’s logic lives.

Arguments:
  • inputs : Input tensor, or list/tuple of input tensors.
  • **kwargs : Additional keyword arguments.
Returns:
A tensor or list/tuple of tensors.
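For Dice, the forward computation follows the DIN paper: a gate p(x) = sigmoid(BN(x)) interpolates between the identity and the alpha-scaled branch. A sketch assuming the `bn` and `alphas` attributes from the build sketch above:

```python
def call(self, inputs, training=None, **kwargs):
    # p(x) = sigmoid(BN(x)): the rectified point adapts to the batch distribution.
    x_p = tf.sigmoid(self.bn(inputs, training=training))
    # f(x) = p(x) * x + (1 - p(x)) * alpha * x
    return x_p * inputs + (1.0 - x_p) * self.alphas * inputs
```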
get_config()[source]

Returns the config of the layer.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Container (one layer of abstraction above).

Returns:
Python dictionary.
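A small round-trip sketch (assuming only axis and epsilon are added to the base config by Dice): the returned dictionary can rebuild an equivalent, untrained layer.

```python
layer = Dice(axis=-1, epsilon=1e-9)
config = layer.get_config()          # includes 'axis' and 'epsilon' plus base Layer fields
restored = Dice.from_config(config)  # same configuration, freshly initialized weights
```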