Add dropout layers between pretrained dense layers in Keras

I found an answer myself by using the Keras functional API:

```python
from keras.applications import VGG16
from keras.layers import Dropout
from keras.models import Model

model = VGG16(weights="imagenet")

# Store the fully connected layers
fc1 = model.layers[-3]
fc2 = model.layers[-2]
predictions = model.layers[-1]

# Create the dropout layers
dropout1 = Dropout(0.85)
dropout2 = Dropout(0.85)

# Reconnect the layers …
```

Read more
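The reconnection step is truncated above. The general pattern (call each stored layer on the previous output tensor, then build a new `Model`) can be sketched on a toy functional model instead of VGG16, so nothing needs downloading; the small layer sizes and names here are my own illustration, not the original answer's code:

```python
import numpy as np
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

# A small stand-in for VGG16's classifier head.
inp = Input(shape=(8,))
fc1 = Dense(16, activation="relu", name="fc1")
fc2 = Dense(16, activation="relu", name="fc2")
predictions = Dense(4, activation="softmax", name="predictions")
base = Model(inp, predictions(fc2(fc1(inp))))

# Rewire: insert dropout between the stored dense layers by
# re-calling each layer on the previous output tensor.
x = Dropout(0.85)(fc1.output)
x = fc2(x)
x = Dropout(0.85)(x)
out = predictions(x)
model2 = Model(inputs=base.input, outputs=out)
```

The key point is that layers are callables: calling `fc2(x)` on a new tensor reuses `fc2`'s (pretrained) weights while creating a new graph path that includes the dropout layers.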

Keras: class weights (class_weight) for one-hot encoding

Here's a solution that's a bit shorter and faster. If your one-hot encoded y is a np.array:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

y_integers = np.argmax(y, axis=1)
class_weights = compute_class_weight(class_weight="balanced", classes=np.unique(y_integers), y=y_integers)
d_class_weights = dict(enumerate(class_weights))
```

d_class_weights can then be passed to class_weight in .fit.
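For intuition, the "balanced" heuristic that compute_class_weight applies is weight_c = n_samples / (n_classes * count_c), which can be reproduced in plain NumPy; the toy one-hot labels below are mine:

```python
import numpy as np

# Toy one-hot labels with a 3:1 class imbalance.
y = np.array([[1, 0], [1, 0], [1, 0], [0, 1]])
y_integers = np.argmax(y, axis=1)

# "balanced": n_samples / (n_classes * per-class count),
# so rarer classes get proportionally larger weights.
counts = np.bincount(y_integers)
weights = y_integers.size / (counts.size * counts)
d_class_weights = dict(enumerate(weights))
print(d_class_weights)  # {0: 0.666..., 1: 2.0}
```

The minority class (1 of 4 samples) gets weight 2.0, the majority class 2/3, so each class contributes equally to the loss in expectation.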

Why does Keras LSTM batch size used for prediction have to be the same as fitting batch size?

Unfortunately, what you want to do is impossible with Keras … I've also struggled with this problem for a long time, and the only way is to dive into the rabbit hole and work with TensorFlow directly to do LSTM rolling prediction. First, to be clear on terminology: batch_size usually means the number of sequences that … Read more
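The terminology point is worth making concrete: batch_size is the number of sequences processed in parallel per weight update, not the sequence length. A quick NumPy sketch (shapes are my own example) showing how 10 sequences split into batches of 4:

```python
import numpy as np

# 10 sequences, each 5 timesteps long with 3 features:
# shape is (num_sequences, timesteps, features).
data = np.zeros((10, 5, 3))
batch_size = 4

# Each slice along the first axis is one batch of sequences.
batches = [data[i:i + batch_size] for i in range(0, len(data), batch_size)]
print([b.shape[0] for b in batches])  # [4, 4, 2]
```

A stateful Keras LSTM bakes this first dimension into its internal state tensors, which is why fitting with one batch_size and predicting with another fails.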

Is there an easy way to get something like Keras model.summary in Tensorflow?

Looks like you can use Slim. Example:

```python
import numpy as np
import tensorflow as tf
import tensorflow.contrib.slim as slim

x = np.zeros((1, 4, 4, 3))
x_tf = tf.convert_to_tensor(x, np.float32)
z_tf = tf.layers.conv2d(x_tf, filters=32, kernel_size=(3, 3))

def model_summary():
    model_vars = tf.trainable_variables()
    slim.model_analyzer.analyze_vars(model_vars, print_info=True)

model_summary()
```

Output:

```
---------
Variables: name (type shape) [size]
---------
conv2d/kernel:0 (float32_ref 3x3x3x32) [864, …
```

Read more
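The `[864]` reported for the conv kernel is just the product of its shape dimensions, which is all a summary tool computes per variable; checking by hand:

```python
# Kernel shape is (kernel_h, kernel_w, in_channels, filters)
# for the conv2d above: 3x3 kernel, 3 input channels, 32 filters.
kernel_shape = (3, 3, 3, 32)

size = 1
for d in kernel_shape:
    size *= d
print(size)  # 864
```

(The layer also has a `conv2d/bias:0` variable of size 32, which the truncated output would list next, for 896 trainable parameters total.)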

Error!: SQLSTATE[HY000] [1045] Access denied for user 'divattrend_liink'@'localhost' (using password: YES)