Good fences may make good neighbors, but in this session, machine learning will make even better ones. We will look at the theory, the mechanics, and the nuances of building a data science experiment using k-nearest neighbor (k-NN). The session will be a combination of lecture and lab using R, so bring your laptop if you are interested in doing some hands-on work. One thing we will not be addressing in this session is the age-old controversy of whether the algorithm's acronym is k-NN or kNN.
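To give a flavor of what the lab covers, here is a minimal sketch of the distance-and-vote idea behind k-NN. The session's lab uses R; this Python version (with made-up toy data) just illustrates the core algorithm, not the session's actual code.

```python
# Minimal k-nearest-neighbor classifier: rank training points by distance
# to the query, then take a majority vote among the k closest labels.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; query: a feature list."""
    # Sort training rows by Euclidean distance to the query point.
    by_dist = sorted(train, key=lambda row: math.dist(row[0], query))
    # Majority vote among the labels of the k nearest rows.
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Toy data: two small clusters, labeled "a" and "b".
train = [([1.0, 1.0], "a"), ([1.2, 0.8], "a"),
         ([5.0, 5.0], "b"), ([4.8, 5.2], "b")]
print(knn_predict(train, [1.1, 0.9], k=3))  # query near the "a" cluster
```

The choice of k, the distance metric, and feature scaling are exactly the kinds of nuances the session digs into.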
Jamie has a BS in Computer Science and a Master's in Public Health. He is the former Chair of his town's Information Services Advisory Board and an outspoken advocate for Open Data. He is also involved with his local .NET User Group (TRINUG), with an emphasis on data analytics, machine learning, and the Internet of Things (IoT). He is the author of Mastering .NET Machine Learning.
Jamie lives in Cary, North Carolina with his wonderful wife Jill and their three awesome children: Sonoma, Sawyer, and Sloan. He blogs weekly at http://jamessdixon.wordpress.com and can be found on Twitter as @jamie_dixon.