The correct way to use an advanced activation like PReLU is to add it to the model as its own layer with the add() method, rather than wrapping it in the Activation class. Example:
from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import PReLU

model = Sequential()
act = PReLU(init="zero", weights=None)
model.add(Dense(64, input_dim=14, init="uniform"))
model.add(act)  # PReLU is added as a separate layer, not passed to Activation
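
If you are on a newer Keras release (2.x), the same pattern applies, but the layer is exposed as keras.layers.PReLU and the initializer arguments are named differently (kernel_initializer for Dense, alpha_initializer for PReLU). A minimal sketch, assuming Keras 2:

from keras.models import Sequential
from keras.layers import Dense, PReLU

model = Sequential()
model.add(Dense(64, input_dim=14, kernel_initializer="uniform"))
# Still added as its own layer; alpha_initializer replaces the old init argument
model.add(PReLU(alpha_initializer="zeros"))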