Running Neural Networks on Different Frameworks - [Comparison]

in #machine-learning · 7 years ago


Ilia Karmanov has created a GitHub repository comparing the performance of neural networks across popular platforms/frameworks.

He ran the Jupyter notebooks containing the NN code on an Nvidia K80 GPU on a Microsoft Azure VM. As Ilia puts it:

"The notebooks are not specifically written for speed, instead they aim to create an easy comparison between the frameworks." [source]

Ilia tested a convolutional neural network on the CIFAR-10 dataset for image recognition across several libraries. Contrary to what some practitioners might expect, the top 3 libraries in terms of training time in seconds were:

  • Caffe2 - took 149 seconds to train the CNN for an accuracy of 79%
  • MXNet - took 149 seconds to train the CNN for an accuracy of 77%
  • Gluon - took 157 seconds to train the CNN for an accuracy of 77%

Libraries such as TensorFlow, Keras, and PyTorch performed slightly worse, though the results were very close.
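To make the comparison concrete, here is a minimal sketch of what such a CIFAR-10-style CNN benchmark looks like in PyTorch. The layer sizes and the `SmallCNN`/`time_one_epoch` names are my own illustrative choices, not Ilia's actual notebook code, and random tensors stand in for the dataset so the snippet runs without a download.

```python
import time
import torch
import torch.nn as nn

# A small VGG-style CNN (two conv blocks + a dense head); the exact
# layer widths here are illustrative assumptions, not the benchmark's.
class SmallCNN(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 50, 3, padding=1), nn.ReLU(),
            nn.Conv2d(50, 50, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2), nn.Dropout(0.25),
            nn.Conv2d(50, 100, 3, padding=1), nn.ReLU(),
            nn.Conv2d(100, 100, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2), nn.Dropout(0.25),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(100 * 8 * 8, 512), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(512, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def time_one_epoch(model, loader, device="cpu"):
    """Return wall-clock seconds for one training pass over `loader`."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train().to(device)
    start = time.time()
    for x, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(x.to(device)), y.to(device))
        loss.backward()
        opt.step()
    return time.time() - start

# Random tensors stand in for CIFAR-10 (3x32x32 images, 10 classes).
x = torch.randn(128, 3, 32, 32)
y = torch.randint(0, 10, (128,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, y), batch_size=64)
print(f"one epoch: {time_one_epoch(SmallCNN(), loader):.2f}s")
```

Timing one full epoch on identical data, as above, is the fair way to compare frameworks: it captures the data loading, forward, backward, and optimizer overhead together rather than any single operation.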

Ilia also tested a Recurrent Neural Network on the IMDB dataset for sentiment analysis. The top 3 performing libraries were:

  • CNTK - took 29 seconds to train the RNN for an accuracy of 86%
  • MXNet - took 29 seconds to train the RNN for an accuracy of 86% (same as CNTK)
  • PyTorch - took 32 seconds to train the RNN for an accuracy of 85%
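For context, an IMDB-style sentiment RNN boils down to an embedding layer feeding a recurrent layer and a binary classifier. The sketch below uses a GRU in PyTorch; the vocabulary size, embedding dimension, hidden size, and the `SentimentGRU` name are all illustrative assumptions, and random token ids stand in for padded movie reviews.

```python
import time
import torch
import torch.nn as nn

# Illustrative sentiment classifier: embedding -> GRU -> binary logit.
# Hyperparameters are assumptions, not the benchmark's exact settings.
class SentimentGRU(nn.Module):
    def __init__(self, vocab=20000, emb=125, hidden=100):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)
        self.gru = nn.GRU(emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, tokens):               # tokens: (batch, seq_len)
        _, h = self.gru(self.emb(tokens))    # h: (1, batch, hidden)
        return self.head(h[-1]).squeeze(-1)  # (batch,) logits

# Random token ids stand in for padded IMDB reviews.
tokens = torch.randint(0, 20000, (32, 150))
labels = torch.randint(0, 2, (32,)).float()

model = SentimentGRU()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Time a single training step.
start = time.time()
opt.zero_grad()
loss = loss_fn(model(tokens), labels)
loss.backward()
opt.step()
print(f"one step: {time.time() - start:.3f}s, loss={loss.item():.3f}")
```

RNN training times are especially sensitive to how each framework implements the recurrent cell (e.g. whether it can use a fused cuDNN kernel), which is one plausible reason the RNN rankings differ from the CNN ones.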

I'd say that the performance of these libraries is really close, with at most a few hundred seconds separating the top and bottom performers. That gap might matter a lot on very large datasets and complicated NN designs, but it is far less important on smaller datasets and simpler designs. For the technical and insightful details, you can check out the repo below:

Running Neural Networks on Different Frameworks - [Comparison]


To stay in touch with me, follow @cristi


Cristi Vlad Self-Experimenter and Author


Hello :)

So since I'm planning to learn programming, I should hang out on your channel more often :))

Thanks! Have a great day!

What do you want to focus on?

Android app development first, then I want to move on to Java and HTML :D

Good luck!

Thanks a lot!

I looked at your link; full respect for what you do. Most likely in the near future I'll buy your Mastering Udemy book from Amazon :D

I'm taking some courses on Udemy and the platform has great potential. Do you have courses there too?

Yes, I have one course, on ketosis and intermittent fasting. I'll give you a free coupon if you want.

The reported accuracy metrics seem very low, and the training times seem too fast, in comparison with what I recall from recent research papers. Do you really think this comes from a good test approach?

I am not sure, though if someone really wants to double-check, they can run the tests and see if the results are reproducible.

True. I just wanted to give you and the community a heads-up since the numbers seem inconsistent with other research.

Humanity has advanced and invested so much into parallel processing, which has led to better artificial intelligence designs in the last couple of years. I wonder where this could lead in the near future!

I enjoyed reading your post. There is a lot of good stuff.

Good job @cristi

I have upvoted.

Good performance Cristi ;)