The machine learning battle is heating up.
The term “Deep Learning” is relatively recent, and its popularity is definitely on the rise. At its root, though, the concept has been around for well over 40 years under the umbrella of Artificial Neural Networks (ANNs). The idea was compelling, but applications were long limited by the availability of computing power. With the evolution of GPUs (Graphics Processing Units), where each card packs thousands of cores, deep learning is becoming far more practical and applicable.
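To make the idea concrete, here is a minimal sketch of the kind of ANN that has been around for decades: a tiny two-layer network trained on XOR with plain NumPy. The layer sizes, learning rate, and iteration count are illustrative choices on my part, not anything prescribed by a particular platform.

```python
import numpy as np

np.random.seed(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A 2-4-1 network: randomly initialized weights and zero biases (sizes are illustrative)
W1, b1 = np.random.randn(2, 4), np.zeros((1, 4))
W2, b2 = np.random.randn(4, 1), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass: two matrix multiplications, exactly the kind of work GPUs parallelize
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of squared error through both sigmoid layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Plain gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # predictions should approach [0, 1, 1, 0]
```

Nothing here is conceptually new; what changed is that modern GPUs can run these matrix multiplications across millions of parameters at once, which is what finally made deep networks practical.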
Thanks to their large-scale data centers, Google and Amazon have both been able to polish their deep learning platforms. Now that everyone from the financial industry to healthcare is researching deep learning implementations for their own gains, both players are competing for attention.
After open-sourcing TensorFlow in November of last year, Google has taken another step forward and announced TPUs, or Tensor Processing Units: custom ASICs built specifically for machine learning and tailored for TensorFlow.
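For readers who have not tried it yet, here is a minimal sketch of what TensorFlow code looks like in the graph-and-session style of the 1.x releases; the values are purely illustrative.

```python
import tensorflow as tf

# Build a tiny computation graph: y = W * x + b
x = tf.placeholder(tf.float32, shape=[None])
W = tf.Variable(2.0)
b = tf.Variable(1.0)
y = W * x + b

# Run the graph in a session, feeding in concrete inputs
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]}))  # -> [3. 5. 7.]
```

The appeal is that the same graph definition can be executed on CPUs, GPUs, or now Google’s own TPUs without changing the model code.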
Meanwhile, Amazon is coming out with its own ML platform (DSSTNE, available on GitHub). You can read more about it at GeekWire.
As with every shiny new toy, efforts tend to be overdone at first, and we may see a few “Deep Learning” vending machines before things stabilize. And although it is hard for me to predict who will win the ML arms race, I think deep learning is definitely here to stay. It should soon graduate from buzzword to common computing practice.