ECE 8528: Advanced Topics in Statistical Modeling
for Engineering Applications

Syllabus

Contact Information:

  Lecture     TBD
  Lecturer     Joseph Picone, Professor  
  Office: EA 703A  
  Office Hours: MWF, 11:00 AM - 12:00 PM  
  Phone: 215-204-4841  
  Email: picone@temple.edu  
  Skype: joseph.picone  
  Social Media     temple.engineering.ece8528@groups.facebook.com  
  Website     http://www.isip.piconepress.edu/courses/temple/ece_8528  
  Required Textbook     None  
  Reference Textbooks     C.M. Bishop  
  Pattern Recognition and Machine Learning  
  Springer, ISBN: 978-0387310732, 2006.  

  D.J.C. MacKay  
  Information Theory, Inference, and Learning Algorithms  
  Cambridge University Press, ISBN: 978-0521642989, 2003.

  R.J. Thibaux  
  Nonparametric Bayesian Models for Machine Learning  
  ProQuest, ISBN: 978-1243992130, 2011.

Also, see the course web site for additional reading materials.
  Prerequisites     ENGR 5022 (minimum grade: B-)
  ENGR 5033 (minimum grade: B-)  


Grading Policies:

  Item     Weight  
  Exam No. 1     20%  
  Exam No. 2     20%  
  Exam No. 3     20%  
  Final Exam     20%  
  Project     20%  
  TOTAL:     100%  


This course builds on a basic knowledge of machine learning and reviews recent advances in the field. It is a research-oriented course intended to complement a student's thesis or dissertation research. The course will focus on a selection of emerging machine learning algorithms and analyze contemporary publications on these techniques. The emphasis will be on algorithms suited to large, complex data sets. Both supervised and unsupervised learning methodologies will be discussed. Applications will be drawn from several disciplines, including speech processing, image processing, and bioengineering.

It is understood that the specific topic list in this course will evolve as new methods emerge. Currently, two topics that will be emphasized are nonparametric Bayesian methods and deep learning.
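
For a concrete sense of the first emphasis, the short sketch below illustrates the kind of model covered under Dirichlet process mixtures (Class 9). It is a minimal illustration only, assuming Python with NumPy and scikit-learn; the synthetic data and the component-weight threshold are hypothetical choices made for this example.

  # Minimal sketch: a truncated Dirichlet process Gaussian mixture fit by
  # variational inference. The model lets the data determine the effective
  # number of mixture components rather than fixing it in advance.
  import numpy as np
  from sklearn.mixture import BayesianGaussianMixture

  rng = np.random.default_rng(0)

  # Hypothetical two-cluster data; in practice the features would come
  # from a student's own research problem.
  X = np.vstack([
      rng.normal(loc=-3.0, scale=0.5, size=(200, 2)),
      rng.normal(loc=+3.0, scale=0.5, size=(200, 2)),
  ])

  # Truncate the Dirichlet process at 10 components; components that are
  # not needed are driven toward negligible weight during inference.
  dpgmm = BayesianGaussianMixture(
      n_components=10,
      weight_concentration_prior_type="dirichlet_process",
      covariance_type="full",
      random_state=0,
  ).fit(X)

  # Count the components that carry appreciable posterior weight.
  print("effective components:", int(np.sum(dpgmm.weights_ > 0.01)))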

The course requirements include three in-class exams and a final exam. In addition, students will be expected to complete a course project that involves application of an existing contemporary statistical modeling approach to a real-world problem. Students are encouraged to draw on examples from their own research. The specifics of this assignment will be negotiated in writing with the course instructor, and fully defined by the fifth week of the course.

Lecture Schedule:

The lecture component will cover the following topics:

  Class     Topic(s)  
  1     (a) Course Overview and Introduction  
  (b) Parametric Statistical Models  
  (c) The Expectation Maximization Algorithm  
  2     (a) Maximum Likelihood Approaches  
  (b) Discriminative Training  
  (c) Cross-Validation, Bagging, and Jackknifing  
  3     (a) Nonparametric Bayesian Approaches  
  (b) Inference Algorithms  
  (c) Hybrid Models and Temporal Structure  
  4     (a) Deep Learning  
  (b) Random Fields and Bayesian Networks  
  (c) Deep Belief Networks  
  5     (a) Dimensionality Reduction  
  (b) Kernel Theory  
  (c) Exam No. 1  
  6     (a) Variational Methods  
  (b) Variational EM  
  (c) Monte Carlo Methods  
  7     (a) Graphical Models  
  (b) Latent Semantic Analysis  
  (c) Social Network Analysis  
  8     (a) Statistical Learning Theory  
  (b) Fisher Information  
  (c) Bounds and Frequentist Theory  
  9     (a) Gaussian Processes  
  (b) Dirichlet Process Mixture Models  
  (c) Exam No. 2  
  10     (a) Supervised Learning  
  (b) Unsupervised Learning  
  (c) Adaptation  
  11     (a) Sparsity  
  (b) Greedy Algorithms  
  (c) Compressed Sensing  
  12     (a) Online Learning  
  (b) Active Learning  
  (c) Nonparametric Learning  
  13     (a) Learning with Humans in the Loop  
  (b) Prediction of Complex Data  
  (c) Exam No. 3  
  14     (a) Applications: Search Engines  
  (b) Applications: Speech Recognition  
  (c) Applications: Predicting User Preferences (Netflix)  
  15     (a) Final Exam  


Please note that the schedule above, including the exam dates, is fixed; it has been arranged to satisfy a number of constraints. You will need to adjust your own schedules, including job interviews and site visits, accordingly.