Data Science: An Introduction to k-nearest neighbor (k-NN)

2017-08-29
Organizer: Triangle SQL Server User Group

Good fences may make good neighbors, but in this session, machine learning will make even better neighbors. We will take a look at the theory, the mechanics, and the nuances of building a data science experiment using k-nearest neighbor (k-NN). The session will be a combination of lecture and lab using R, so bring your laptop if you are interested in doing some hands-on work. One thing we will not be addressing in this session is the age-old controversy of whether the algorithm's acronym is k-NN or kNN.
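The lab itself will be in R, but the core idea of k-NN is compact enough to sketch in a few lines. As a rough illustration only (this is a minimal Python sketch, not material from the session): classify a query point by taking a majority vote among the labels of its k closest training points.

```python
from collections import Counter
import math

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point.
    dists = [(math.dist(point, query), label) for point, label in zip(train, labels)]
    dists.sort(key=lambda pair: pair[0])
    # Majority vote among the k closest neighbors.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters.
train = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(train, labels, (2, 2), k=3))  # query sits in the "a" cluster
```

The choice of k and of the distance metric are exactly the kinds of nuances the session covers; this sketch hard-codes Euclidean distance and a simple vote.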

Jamie Dixon has been writing code for as long as he can remember and has been getting paid to do it since 1995. He was using C# and JavaScript almost exclusively until discovering F#, and now combines all three languages as the problem at hand demands. He has a passion for discovering overlooked gems in data sets and for bringing software engineering techniques to scientific computing. An aspiring polyglot, he has recently been doing projects in R, Python, and MATLAB.
Jamie has a bachelor's degree in Computer Science and a Master's in Public Health. He is the former Chair of his town's Information Services Advisory Board and is an outspoken advocate for Open Data. He is also involved with his local .NET User Group (TRINUG), with an emphasis on data analytics, machine learning, and the Internet of Things (IoT). He is the author of Mastering .NET Machine Learning.
Jamie lives in Cary, North Carolina with his wonderful wife Jill and their three awesome children: Sonoma, Sawyer, and Sloan. He blogs weekly at http://jamessdixon.wordpress.com and can be found on Twitter @jamie_dixon.

----

Poster: triangletech