It depends on the task, but basically you are right. The training stage requires a lot of computational power. Someone running deep learning on Amazon's servers could easily spend a few million dollars on server infrastructure alone.
It does require "a lot" of computing power. But considering the leaps and bounds made in the last few years, you would be amazed at what you can do with a normal desktop PC or even a smartphone these days.
Yeah. Computers and processors are developing rapidly. Moreover, there are some amazing parallelization technologies that can substantially reduce computational requirements.
I think in a few years, training a neural network will hardly be a problem.
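To make the parallelization point concrete, here is a minimal sketch (not tied to any specific deep learning framework) of splitting a workload across CPU cores with Python's standard-library `multiprocessing` module. Data-parallel training frameworks apply the same idea across GPUs: divide the work, compute partial results independently, then combine them.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker computes its share of the work independently.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the data into roughly equal chunks, one per worker,
    # then combine the partial results at the end.
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1000))
    print(parallel_sum_of_squares(data))
```

With perfectly divisible work like this, the wall-clock time shrinks roughly in proportion to the number of workers, which is why scaling out hardware has made training so much more accessible.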