Will scikit-learn use the GPU?

TensorFlow only uses the GPU if it is built against CUDA and cuDNN. By default it runs on the CPU, especially inside Docker, unless you use nvidia-docker and an image with built-in GPU support.
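As a quick check, TensorFlow 2.x can list the devices it sees. A minimal sketch (assuming the `tensorflow` package is installed; the helper returns None otherwise — the function name `visible_gpus` is ours, not part of any API):

```python
def visible_gpus():
    """Return the list of GPUs TensorFlow can see, or None if TF is not installed."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    # An empty list means TensorFlow was not built with, or cannot find, CUDA/cuDNN.
    return tf.config.list_physical_devices("GPU")

print(visible_gpus())
```

Inside a plain (non-nvidia) Docker container this will typically print an empty list even on a machine that has a GPU.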

Scikit-learn is not intended to be used as a deep-learning framework, and it does not provide any GPU support.

Why is there no support for deep or reinforcement learning / Will there be support for deep or reinforcement learning in scikit-learn?

Deep learning and reinforcement learning both require a rich
vocabulary to define an architecture, with deep learning additionally
requiring GPUs for efficient computing. However, neither of these fit
within the design constraints of scikit-learn; as a result, deep
learning and reinforcement learning are currently out of scope for
what scikit-learn seeks to achieve.

Extracted from http://scikit-learn.org/stable/faq.html#why-is-there-no-support-for-deep-or-reinforcement-learning-will-there-be-support-for-deep-or-reinforcement-learning-in-scikit-learn

Will you add GPU support in scikit-learn?

No, or at least not in the near future. The main reason is that GPU
support will introduce many software dependencies and introduce
platform specific issues. scikit-learn is designed to be easy to
install on a wide variety of platforms. Outside of neural networks,
GPUs don’t play a large role in machine learning today, and much
larger gains in speed can often be achieved by a careful choice of
algorithms.

Extracted from http://scikit-learn.org/stable/faq.html#will-you-add-gpu-support
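The FAQ's point about algorithm choice can be seen in a toy timing comparison: a linear-time learner such as SGDClassifier versus a kernel SVM (SVC), whose fit cost grows roughly quadratically with the number of samples. A sketch, assuming scikit-learn is installed (the helper name `compare_fit_times` and the dataset sizes are ours):

```python
import time

def compare_fit_times(n_samples=2000):
    """Fit a linear model and a kernel SVM on the same data.
    Returns (sgd_seconds, svc_seconds), or None if scikit-learn is missing."""
    try:
        from sklearn.datasets import make_classification
        from sklearn.linear_model import SGDClassifier
        from sklearn.svm import SVC
    except ImportError:
        return None
    X, y = make_classification(n_samples=n_samples, n_features=50, random_state=0)
    t0 = time.perf_counter()
    SGDClassifier(random_state=0).fit(X, y)   # roughly linear in n_samples
    t1 = time.perf_counter()
    SVC().fit(X, y)                           # roughly quadratic in n_samples
    t2 = time.perf_counter()
    return t1 - t0, t2 - t1

print(compare_fit_times())
```

On larger datasets the gap widens quickly, which is the kind of algorithmic speedup the FAQ is referring to — no GPU required.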
