Pickle dump huge file without memory error

I was having the same issue. I used joblib and it worked. Posting this in case someone wants to know about other possibilities.

Save the model to disk:

import joblib  # in scikit-learn < 0.23 this was: from sklearn.externals import joblib

filename = "finalized_model.sav"
joblib.dump(model, filename)

Some time later, load the model from disk:

loaded_model = joblib.load(filename)
result = loaded_model.score(X_test, Y_test) 

print(result)
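
For very large models, joblib can also compress the file as it writes, which keeps the on-disk pickle much smaller. Below is a minimal, self-contained sketch; the random-forest model, synthetic dataset, and file name are illustrative assumptions, not part of the original answer:

# Minimal sketch: persist a large scikit-learn model with joblib compression.
# The dataset, model, and file name are stand-ins for your own.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Example data and model (replace with your own training code).
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# compress=3 trades a little CPU time for a much smaller file,
# which helps when the serialized object would otherwise be huge.
joblib.dump(model, "finalized_model.sav", compress=3)

# Later: load it back and evaluate on held-out data.
loaded_model = joblib.load("finalized_model.sav")
print(loaded_model.score(X_test, y_test))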

