# Thresholded Rectified Linear Unit

It follows: `f(x) = x` for `x > theta`, `f(x) = 0` otherwise.
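
To make the rule concrete, here is a minimal plain-R sketch of the elementwise behavior; the `thresholded_relu()` helper below is illustrative, not part of the package:

```r
# Illustrative helper (not the package function): keeps x where
# x > theta and zeroes it elsewhere. Note the strict inequality,
# so x == theta maps to 0.
thresholded_relu <- function(x, theta = 1) {
  ifelse(x > theta, x, 0)
}

thresholded_relu(c(-2, 0.5, 1, 1.5, 3))
#> [1] 0.0 0.0 0.0 1.5 3.0
```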

## Usage

```r
layer_activation_thresholded_relu(
  object,
  theta = 1,
  input_shape = NULL,
  batch_input_shape = NULL,
  batch_size = NULL,
  dtype = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)
```
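
A minimal sketch of typical use, assuming the keras R package is attached; the layer sizes here are arbitrary:

```r
library(keras)

# Dense layer followed by a thresholded ReLU that zeroes any
# activation at or below theta = 1.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(16)) %>%
  layer_activation_thresholded_relu(theta = 1)
```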

## Arguments

| Argument | Description |
|---|---|
| `object` | Model or layer object. |
| `theta` | Float >= 0. Threshold location of activation. |
| `input_shape` | Input shape (list of integers, does not include the samples axis), required when using this layer as the first layer in a model. |
| `batch_input_shape` | Shape, including the batch size. For instance, `batch_input_shape = c(10, 32)` indicates that the expected input will be batches of 10 32-dimensional vectors; `batch_input_shape = list(NULL, 32)` indicates batches of an arbitrary number of 32-dimensional vectors. |
| `batch_size` | Fixed batch size for the layer. |
| `dtype` | The data type expected by the input, as a string (`float32`, `float64`, `int32`, ...). |
| `name` | An optional name string for the layer. Should be unique in a model (do not reuse the same name twice); it will be autogenerated if not provided. |
| `trainable` | Whether the layer weights will be updated during training. |
| `weights` | Initial weights for the layer. |
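
When the layer opens a model, `input_shape` (or `batch_input_shape`) must be supplied. A sketch under the same assumptions, with an illustrative shape:

```r
library(keras)

# As the first layer: input_shape gives the per-sample shape,
# so the batch axis is omitted.
model <- keras_model_sequential() %>%
  layer_activation_thresholded_relu(theta = 0.5, input_shape = c(32)) %>%
  layer_dense(units = 1)
```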

## See also

Other activation layers: `layer_activation_elu()`, `layer_activation_leaky_relu()`, `layer_activation_parametric_relu()`, `layer_activation_relu()`, `layer_activation_selu()`, `layer_activation_softmax()`, `layer_activation()`