How to use advanced activation layers in Keras?

The correct way to use an advanced activation such as PReLU is to add it to the model as its own layer via add(), rather than wrapping it in the Activation class. Example:

from keras.models import Sequential
from keras.layers import Dense, PReLU

model = Sequential()
model.add(Dense(64, input_dim=14, kernel_initializer="uniform"))
model.add(PReLU(alpha_initializer="zeros"))
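For intuition about what the layer does: PReLU passes positive inputs through unchanged and scales negative inputs by a learned coefficient alpha (initialized to zero above, so it starts out behaving like ReLU). A minimal NumPy sketch of the forward computation, with a fixed alpha for illustration:

```python
import numpy as np

def prelu(x, alpha):
    # PReLU: f(x) = x for x > 0, alpha * x otherwise
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x, alpha=0.25))  # [-0.5   -0.125  0.     1.     3.   ]
```

In the Keras layer, alpha is a trainable weight (one per channel by default) rather than a constant, which is why PReLU must be added as a layer instead of passed as a plain activation function.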
