Will We Ever Truly Understand Machine Learning?
Good article on an ongoing debate about the inherent “interpretability” of machine learning systems. It provides some background on Deep Visualization, showing how it’s not just for generating eerie images with dogs and birds embedded in everything, but is also a tool for interpreting individual neurons in a machine learning network.
It’s not a given, however, that these kinds of tools will get us to full-blown interpretability, for a number of reasons laid out by the author.
In short, these systems exhibit the same kinds of inscrutability as Mother Nature. In fact, I think it is precisely because of systems like these that we may need to expand our notion of exactly what nature is:
Artificial Intelligence as Force of Nature
A special thanks to Oleg Moskalensky for flagging this one for me. Great find, Oleg. Thank you.
#machinelearning #intelligence #knowledge