Why is PyTorch called PyTorch? [closed]

Here's a short answer, formed as another question:

Torch, SMORCH ???

PyTorch developed from Torch7. A precursor to the original Torch was a library called SVM-Torch, developed around 2001; the SVM stands for Support Vector Machines.

According to this paper, SVM-Torch is a decomposition algorithm similar to SVM-Light, but adapted to regression problems.

Also around this time, G. W. Flake described a variation of John Platt's sequential minimal optimization (SMO) algorithm that could be used to train SVMs on sparse data sets, and this was incorporated into NODElib.

Interestingly, this was called the SMORCH algorithm.

You can find out more about SMORCH in the NODElib docs:

> Optimization of the SVMs is performed by a variation of John Platt's sequential minimal optimization (SMO) algorithm. This version of SMO is generalized for regression, uses kernel caching, and incorporates several heuristics; for these reasons, we refer to the optimization algorithm as SMORCH.

So SMORCH =

Sequential
Minimal
Optimization
Regression
Caching
Heuristics

I can’t answer definitively, but my thinking is that “Torch” is a riff on, or an evolution of, the “Light” in SVM-Light, combined with a large helping of SMORCHiness. You’d need to check with the authors of SVM-Torch and SVM-Light to confirm that this is indeed what “sparked” the name. It is also plausible that the “TO” in Torch stands for some optimization other than SMO, such as Tensor Optimization, but I haven’t found any direct reference… yet.
