
Showing posts from 2018

I have removed blog comments to support GDPR compliance

One of the great things about writing a blog is interacting with readers through the comments they leave. At least for now, I have disabled comments on my blog. Sorry about that!

Is too much emphasis in Artificial Intelligence given to Deep Learning?

Deep learning has revolutionized the way we do Natural Language Processing, image recognition, translation, and other very specific tasks. I have used deep learning most days at work for about four years. Currently I do no work in image recognition, but I still use convolutional networks for NLP, and in the last year I have mostly used RNNs and GANs. While I agree that deep learning is important in many practical use cases, a lot of data science still revolves around simpler models. I feel that other important techniques like probabilistic graphical models (PGMs) and discrete optimization models (the MiniZinc language is a good place to start) don't get the attention in universities and industry that they deserve. On a scale of 1 to 10, I estimate the hype level of deep learning to be approximately 15. I started working in the AI field in 1982 (back then, mostly "symbolic AI" and neural networks), and to me artificial intelligence has always meant a very long-term goal of build…

Java JDK 10

Java 10 was released for general availability a week ago. I just installed it on macOS from http://jdk.java.net/10/. I un-tar'ed the distribution, set JAVA_HOME to the Home directory in the distribution, and put Home/bin first on my PATH. I used Java 10 with the latest version of IntelliJ with no problems: I opened an existing Java project and switched the JDK to the newly installed JDK 10. There are several nice features, but the only ones I am likely to use are local variable type inference, better Docker container awareness on Linux, and improved REPL support. I have mixed feelings about the rapid new six-month release cycle, but I understand the desire to compete with other JVM languages like Kotlin, Scala, Clojure, etc. Updating both of my Java books (Power Java and Practical Artificial Intelligence Programming With Java) is on my schedule for the next 18 months. Java 11 is slated for release in September 2018, and I will probably use Java 11 (whatever it will be!) for t…

Trying out the Integrated Gradients library to explain predictions made by a deep learning classification model

"Explainable AI" for deep learning is required for some applications, at least explainable enough to get some intuition for how a model works. The recent paper  "Axiomatic Attribution for Deep Networks"  describes how to determine which input features have the most effect on a specific prediction by a deep learning classification model. I used the library  IntegratedGradients  that works with Keras and another version is available for TensorFlow. I modified my two year old example model using the University of Wisconsin cancer data set today. If you want to experiment with the ideas in this paper and the IntegratedGradients library, then using my modified example might save you some time and effort.