Neural Nets on the CPU Are (as Expected) Pointless

I finally got my OpenDeep Dataset implementation for the TEDLIUM dataset feeding into a basic generative RNN... and as expected, it's pointlessly slow. Training should run for, say, 100 epochs... and the first epoch alone has already taken ~2 hours. It seems like I really do need to get an Amazon account and use it to run any tests I want to run.

This isn't news, of course; I knew it wasn't going to be workable to run on the CPU. I'm just running the test to verify that the dataset really does work as a mechanism for feeding audio into the RNN. Still, some compulsion makes me want to see a single epoch complete so that I can quantify just how much faster it is on a GPU-accelerated machine. But that's for another day.
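As a rough sketch of what that measurement might look like, here's a minimal epoch timer. The `train_batch` function and the `batches` data are placeholders standing in for the real OpenDeep/TEDLIUM pipeline, not actual OpenDeep API calls:

```python
import time

def train_batch(batch):
    # Placeholder for one forward/backward pass over a minibatch.
    # In the real run this would be the RNN's training step.
    return sum(x * x for x in batch)

def time_one_epoch(batches):
    """Run one full pass over the data and return wall-clock seconds."""
    start = time.perf_counter()
    for batch in batches:
        train_batch(batch)
    return time.perf_counter() - start

# Dummy minibatches standing in for the audio data.
batches = [[float(i) for i in range(100)] for _ in range(50)]
seconds = time_one_epoch(batches)
print(f"one epoch took {seconds:.3f}s")
```

Recording the same number on a CPU run and a GPU run gives the speedup factor directly, which is all this experiment needs.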
