
Deep learning trend - Compressing Networks

I was discussing deep learning trends with some friends a few weeks ago, and most of them think NVIDIA is today's king of deep learning hardware; its stock price seems to say the same thing. But as Tombone's blog puts it: "there is a new player lurking in the shadows. You see, GPU-based mining of bitcoin didn’t last very long once people realized the economic value of owning bitcoins. Bitcoin very quickly transitioned into specialized FPGA hardware for running the underlying bitcoin computations, and the FPGAs of Deep Learning are right around the corner. Will NVIDIA remain the King? I see a fork in NVIDIA's future. You can continue producing hardware which pleases both gamers and machine learning researchers, or you can specialize. There is a plethora of interesting companies like Nervana Systems, Movidius, and most importantly Google, that don’t want to rely on power-hungry heatboxes known as GPUs, especially when it comes to scaling already trained deep learning models. Just take a look at Fathom by Movidius or the Google TPU."

Deep learning is becoming a very interesting battlefield for both big tech companies and new startups. I believe that as network compression evolves, brand new players will come out of the woodwork, and it is only those willing to transfer their academic skills into a product-focused, world-facing effort, the upcoming wave of deep learning entrepreneurs, who will make serious money.
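Since the whole premise here is network compression, a minimal sketch may help make the idea concrete: magnitude pruning (zeroing out small weights) followed by 8-bit quantization of a weight matrix. This is a generic illustration, not the method of any company mentioned above; the 90% sparsity target, the int8 range, and the random stand-in weights are all arbitrary assumptions.

```python
import numpy as np

# Stand-in for a trained layer's weight matrix (assumption: random data for illustration).
rng = np.random.default_rng(0)
weights = rng.normal(size=(256, 256)).astype(np.float32)

# Magnitude pruning: keep only the largest 10% of weights by absolute value.
threshold = np.quantile(np.abs(weights), 0.90)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)
print(f"sparsity after pruning: {np.mean(pruned == 0):.2%}")

# 8-bit linear quantization of the surviving weights: map [-max, max] onto int8.
scale = np.abs(pruned).max() / 127.0
quantized = np.round(pruned / scale).astype(np.int8)
dequantized = quantized.astype(np.float32) * scale

# Rough storage comparison (ignoring sparse-index overhead) and reconstruction error.
print(f"float32 size: {weights.nbytes} bytes, int8 size: {quantized.nbytes} bytes")
print(f"max reconstruction error: {np.abs(pruned - dequantized).max():.4f}")
```

Smaller, lower-precision models like this are exactly what makes specialized inference hardware such as Fathom or the TPU attractive compared to a full-power GPU.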

