[FIXED] Custom kernel initialization in Keras

Issue

I have the following code for implementing a standard classification problem in Keras:

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import load_breast_cancer

X,y = load_breast_cancer(return_X_y=True)

from sklearn.model_selection import train_test_split
X_train,X_test,y_train,y_test = train_test_split(X,y,test_size=0.3,random_state=1)

model = Sequential()
model.add(Dense(units=30,input_dim=X_train.shape[1],activation='relu',kernel_initializer='uniform'))
model.add(Dense(units=20,activation='relu',kernel_initializer='uniform'))
model.add(Dense(units=1,activation='sigmoid',kernel_initializer='uniform'))

model.compile(optimizer='sgd',loss='binary_crossentropy',metrics=['accuracy'])
model.fit(X_train,y_train,validation_data=(X_test,y_test),epochs=200,batch_size=40,verbose=1)

Everything works fine, but I want to use a custom initialization function as the kernel_initializer.

For example, in this line:

model.add(Dense(units=20,activation='relu',kernel_initializer='uniform'))

I would prefer to write something like this:

def my_custom_initialization():
    return here

model.add(Dense(units=20,activation='relu',kernel_initializer=my_custom_initialization()))

How can I generate numbers from a custom distribution or method in a way that fits into the Keras framework?

Solution

To create a custom kernel initializer in Keras, define a function that accepts a shape and dtype and returns a tensor of that shape, then pass the function itself (not a call to it) as kernel_initializer:

import numpy as np
import tensorflow as tf
import keras
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import load_breast_cancer

X,y = load_breast_cancer(return_X_y=True)

from sklearn.model_selection import train_test_split
X_train,X_test,y_train,y_test = train_test_split(X,y,test_size=0.3,random_state=1)


# A custom initializer must accept `shape` and `dtype` and return a tensor
def my_custom_initialization(shape, dtype=None):
    return tf.random.normal(shape, dtype=dtype)

model = Sequential()
model.add(Dense(units=30,input_dim=X_train.shape[1],activation='relu',
                kernel_initializer=my_custom_initialization))
model.add(Dense(units=20,activation='relu',
                kernel_initializer=my_custom_initialization))
model.add(Dense(units=1,activation='sigmoid',
                kernel_initializer=my_custom_initialization))

model.compile(optimizer='sgd',loss='binary_crossentropy',metrics=['accuracy'])
model.fit(X_train,y_train,validation_data=(X_test,y_test),epochs=3,batch_size=40,verbose=1)

Output:

Epoch 1/3
10/10 [==============================] - 2s 65ms/step - loss: 598388.8750 - accuracy: 0.5553 - val_loss: 0.6913 - val_accuracy: 0.6316
Epoch 2/3
10/10 [==============================] - 0s 9ms/step - loss: 0.6907 - accuracy: 0.6256 - val_loss: 0.6898 - val_accuracy: 0.6316
Epoch 3/3
10/10 [==============================] - 0s 8ms/step - loss: 0.6893 - accuracy: 0.6256 - val_loss: 0.6883 - val_accuracy: 0.6316
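If you also want the initializer to carry parameters (for example, the bounds of a distribution) and to survive model saving/loading, Keras supports subclassing the Initializer base class and implementing get_config. The sketch below assumes TensorFlow 2.x; the class name MyUniform and its bounds are illustrative, not part of the original answer:

```python
import tensorflow as tf

class MyUniform(tf.keras.initializers.Initializer):
    """Hypothetical example: draw weights uniformly from [minval, maxval]."""

    def __init__(self, minval=-0.05, maxval=0.05):
        self.minval = minval
        self.maxval = maxval

    def __call__(self, shape, dtype=None):
        # Called by Keras when the layer builds its weights
        return tf.random.uniform(shape, self.minval, self.maxval, dtype=dtype)

    def get_config(self):
        # Makes the initializer serializable with the model
        return {"minval": self.minval, "maxval": self.maxval}

# Pass an instance (configured with parameters) instead of a bare function
layer = tf.keras.layers.Dense(20, activation='relu',
                              kernel_initializer=MyUniform(-0.1, 0.1))
```

Compared with a plain function, this form lets you reuse the same initializer with different parameters per layer, and models using it can be reloaded (pass the class via custom_objects when loading).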

Answered By – I'mahdi

Answer Checked By – Timothy Miller (Easybugfix Admin)
