East Carolina University
Department of Computer Science
CSCI 4120
Machine Learning
Standard Syllabus
3 credits
Prepared by Venkat Gudivada, May 2018
Catalog entry
P: CSCI 2540; MATH 2228 or MATH 2283. Machine learning and
statistical pattern recognition algorithms and their application to
data analytics, bioinformatics, speech recognition, natural language
processing, robotic control, autonomous navigation, and text and web
data processing.
Course summary
Do you wonder how IBM Watson, a question-answering system that
responds to questions posed in natural language, won the Jeopardy!
championship in 2011?
Can you automatically generate textual descriptions that reflect the content of digital images?
How do you automatically colorize black and white movies?
How do you achieve real-time translation of speech from one language to another?
How do you discover patterns hidden in large datasets and use them to improve sales?
How do you make predictions based on historical data?
All of the above and more are possible with machine learning. In
this course, you will learn the theory, algorithms, and tools that
enable you to solve problems like these. What are you waiting for?
Course topics
- Basic concepts of machine learning and example applications
- Learning theory: bias/variance trade-offs, the union bound, and
  Chernoff/Hoeffding bounds
- Supervised learning: logistic regression, the perceptron, naive
  Bayes, generative learning algorithms, Gaussian discriminant
  analysis, support vector machines, model and feature selection,
  and ensemble methods
- Unsupervised learning: clustering, K-means, EM, mixtures of
  Gaussians, factor analysis, principal components analysis, and
  independent components analysis
- Reinforcement learning and control: MDPs, value iteration and
  policy iteration, linear quadratic regulation, Q-learning, and
  value function approximation
- Deep learning
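To give a taste of the material, the K-means clustering listed under
unsupervised learning can be sketched in a few lines of Python. This
is an illustrative sketch of Lloyd's algorithm only, not course-provided
code; the naive "first k points" initialization is an assumption made
for brevity (practical implementations use k-means++ or random restarts).

```python
def kmeans(points, k, iters=20):
    """Minimal Lloyd's-algorithm sketch: cluster n-dimensional points."""
    # Naive deterministic initialization: first k points as centroids.
    centroids = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(
                    sum(coords) / len(members) for coords in zip(*members)
                )
    return centroids, clusters
```

On two well-separated blobs, e.g. points near (0, 0) and near (10, 10)
with k = 2, the centroids converge to the two group means.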
Student learning outcomes
- Demonstrate a broad understanding of various machine learning and
  statistical pattern recognition algorithms and their application
  to diverse practical problems.
- Apply supervised learning algorithms (parametric/non-parametric
  algorithms, support vector machines, kernels, and neural networks)
  to solve practical problems.
- Apply unsupervised learning algorithms (clustering, dimensionality
  reduction, recommender systems, and deep learning) to solve
  practical problems.
- Apply best practices in machine learning (bias/variance theory) to
  solve diverse problems in domains ranging from computer vision to
  speech recognition, information retrieval, and natural language
  processing.
Textbook
Gareth James et al. An Introduction to Statistical Learning: with Applications in R.
New York, NY: Springer, 2013. ISBN: 978-1461471370
Other required material
- David Barber. Bayesian Reasoning and Machine Learning.
  New York, NY: Cambridge University Press, 2012. ISBN: 978-0521518147.
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman.
  The Elements of Statistical Learning: Data Mining, Inference, and Prediction.
  2nd ed. New York, NY: Springer, 2013. ISBN: 978-0387848570.
- Yaser S. Abu-Mostafa, Malik Magdon-Ismail, and Hsuan-Tien Lin.
  Learning From Data. AMLBook, 2012. ISBN: 978-1600490064.
- Kevin Murphy. Machine Learning: A Probabilistic Perspective.
  Adaptive Computation and Machine Learning series.
  Cambridge, MA: The MIT Press, 2012. ISBN: 978-0262018029.
- Christopher Bishop. Pattern Recognition and Machine Learning.
  New York, NY: Springer, 2007. ISBN: 978-0387310732.
Grading
The course grade is based on four components:

Activity                        | Weight (%)
--------------------------------|-----------
Assignments (paper-and-pencil)  | 20
Assignments (programming)       | 30
Midterm exam                    | 20
Final exam                      | 30
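The weights above combine as a simple weighted average. The sketch
below illustrates the arithmetic; the component key names and the
0-100 score scale are assumptions for illustration, and only the
weights themselves come from the table.

```python
# Weights from the grading table (assumed key names, 0-100 score scale).
WEIGHTS = {
    "paper_assignments": 0.20,        # Assignments (paper-and-pencil)
    "programming_assignments": 0.30,  # Assignments (programming)
    "midterm": 0.20,                  # Midterm exam
    "final": 0.30,                    # Final exam
}

def course_grade(scores):
    """Weighted average of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)
```

For example, scores of 90, 85, 80, and 88 on the four components give
0.20*90 + 0.30*85 + 0.20*80 + 0.30*88 = 85.9.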
Grade meanings
Grade      | Meaning
-----------|-------------------------------------------------------------
A, A−, B+  | Achievement substantially exceeds basic course expectations
B, B−, C+  | Achievement exceeds basic course expectations
C, C−, D+  | Achievement adequately meets basic course expectations
D, D−      | Achievement falls below basic course expectations
F          | Failure – achievement does not justify credit for course