The frenzy around deep learning (multi-level neural networks) has reached a new level during this trip to Silicon Valley. It has gone beyond "we automatically want to fund it" to "it is in the water we drink and the air we breathe" out here, which makes me wonder when the investment backlash for all of these "2% better on test X" results is going to come.
I am convinced of two things, though:
A very smart friend provided me today with a new lens for looking at these quirky multi-layered neural networks: he claimed that they are the Great Unification of Statistical Computing, in that they should, over time, subsume classic machine learning by being more generally applicable to a wide class of classification and prediction problems.
If this is true, then I wonder how long it will be before we get a programming language that supports deep learning methods natively. I'm not sure this won't end up as a really well-thought-out library or framework for your favorite data-crunching language, but to continue the thought experiment for a moment: what if, along with lists, hashes, strings, and number types, we got a DNN data type that could easily encapsulate the setup of a deep neural network, plus a series of methods that perform the first 80% of the steps needed to train it? Tweaks could be made by extending the basic methods, and training and classification could then be done by calling still more methods.
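To make the thought experiment a little more concrete, here is one way such a built-in type might feel to use. Everything below is invented for illustration: the `DNN` name, the constructor and method signatures, and the toy "training" logic (which just memorizes the majority label) are stand-ins, not a real implementation.

```python
class DNN:
    """A sketch of the hypothetical built-in DNN type: declare the layer
    sizes, and sensible defaults cover the first 80% of setup."""

    def __init__(self, *layer_sizes, activation="relu"):
        self.layer_sizes = layer_sizes   # e.g. (4, 8, 2): input, hidden, output
        self.activation = activation
        self.majority_label = None       # toy stand-in for learned weights

    def train(self, examples, labels, epochs=10):
        # A real implementation would run backpropagation here; this toy
        # version just remembers the most common label it saw.
        self.majority_label = max(set(labels), key=labels.count)
        return self                      # allow method chaining

    def classify(self, example):
        if self.majority_label is None:
            raise RuntimeError("call train() first")
        return self.majority_label

net = DNN(4, 8, 2).train([[0, 1, 0, 1], [1, 0, 1, 0], [1, 1, 0, 0]],
                         ["spam", "ham", "spam"])
print(net.classify([0, 1, 1, 0]))  # → spam
```

The point of the sketch is the shape of the API, not the internals: construction, training, and classification each collapse to one call, with the tweakable knobs (activation, epochs) exposed as keyword arguments.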
To take it one step further, such a programming language might completely abstract away where any particular step runs (server, client, GPU, etc.) and ensure that models, once they grow big enough, can be moved between computers completely transparently.
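That placement abstraction could be as small as a scheduler the programmer never sees. The sketch below is again purely hypothetical: `run_on`, the `prefer` hint, and the `AVAILABLE` list are invented names, standing in for a runtime that discovers targets and moves work between them on its own.

```python
# Hypothetical placement layer: the programmer states *what* to run,
# and the runtime decides *where*. In a real system AVAILABLE would be
# discovered at startup, not hard-coded.
AVAILABLE = ["gpu", "server", "client"]

def run_on(step, prefer="auto"):
    """Pretend scheduler: honor a placement hint if it exists,
    otherwise pick the first available target transparently."""
    target = prefer if prefer in AVAILABLE else AVAILABLE[0]
    return step(), target

result, where = run_on(lambda: 2 + 2)
print(result, where)  # → 4 gpu
```

The design choice worth noting is that `prefer` is only advisory: code written against this interface keeps working unchanged when the set of available machines changes, which is exactly the transparency the paragraph above is asking for.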
I need to dig into the problem domain to see whether any of what I just wrote remotely makes sense. In a sense, I wrote it here at the beginning of that journey to serve as a record of how I might want this to work before pesky implementation details get in my way. Because I sure would love for deep learning to be this simple for me.