I found a great channel by Professor Nando de Freitas at the University of Oxford. Most of the videos are good, but the series on Neural Networks and Deep Learning is outstanding:
I needed to create a series of diagnostic plots for a recent Data Mining project. I created the plots by hand using R — by "by hand" I mean that I wrote a script to generate them rather than using a tool such as Tableau. The reason is that the data for the plots came from the UCI Machine Learning Repository, and it just so happened that the particular datasets come bundled with R's standard library. :)
A recent assignment in a machine learning class called for drawing the k-nearest-neighbor decision boundary for some given values of k, starting with k=1. The task involved using standard Euclidean distance between the given points to determine each point's nearest neighbors, and then drawing (by hand) the resulting figure.
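To make the k=1 case concrete, here is a minimal Python sketch (not from the original assignment; the points and labels are made up for illustration) that classifies a query point by Euclidean distance to its single nearest labeled neighbor:

```python
import math

# Hypothetical labeled training points: (x, y) coordinates and a class label.
train = [((1.0, 1.0), "A"), ((2.0, 1.5), "A"),
         ((5.0, 4.0), "B"), ((6.0, 5.0), "B")]

def euclidean(p, q):
    """Standard Euclidean distance between two 2-D points."""
    return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def nearest_neighbor_class(query, train):
    """k=1 case: the query point takes the class of its closest neighbor."""
    _, label = min(train, key=lambda pl: euclidean(query, pl[0]))
    return label

print(nearest_neighbor_class((1.5, 1.2), train))  # → A (closest to the "A" cluster)
```

For k=1, the decision boundary this induces is made up of segments equidistant from the nearest points of different classes — in effect, the class-separating edges of the Voronoi diagram of the training points.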
I've been keeping my eye on Spark for a while now, and decided to take the plunge recently after having to do some brief R analyses that were not that complicated and were perfect for Spark. I use TextMate as my R IDE, and I wanted to run my scripts from TextMate right into Spark. The following are a couple of tips & tricks I found on how to set everything up so that you can start Spark from a Command-R (⌘-R) shortcut.
Weka is a great resource for data mining and machine learning. You can get a lot done with the standalone GUI workbench, but sometimes you need to use it as part of a script in a custom R analysis pipeline. Yes, you could create a shell script that makes use of the Weka command-line tools, and invoke said script from R using a 'system' call, but that could get out of hand really quickly.