Lecturer(s)

- Černý Antonín, Ing.
- Zmeškal Ladislav, Ing. Ph.D.
Course content
The topics itemized below outline the subject matter covered; they do not correspond exactly to the scheduled lectures:

1. Introductory information: organization of the course, recommended literature and sources of study materials; basic notions and definitions and their mutual relations; the relationship among data, information, and knowledge; definitions of machine intelligence and machine learning (hypothesis and knowledge, state space, parameter space).
2. Introduction to machine learning: supervised and unsupervised learning; applications, examples, and case studies; the purpose and procedure of each part of a learning system; machine learning operators; the canonical machine learning task, its preconditions and goals; regression and classification; a primitive linear classifier.
3. Bayesian learning: Bayes' theorem, the optimal and the naive Bayesian classifier, hypothesis selection strategies, applications of the naive Bayesian classifier (NBC).
4. Linear regression: derivation of the cost function and techniques to minimize it; derivation of gradient descent; the gradient descent algorithm.
5. Multivariate linear regression: gradient descent in multidimensional space, problems and limitations of gradient descent; polynomial regression; the normal equation.
6. Logistic regression: the logistic regression hypothesis model, interpretation of the results, the decision boundary, multi-class classification with the One-vs-All algorithm.
7. Regularization: overfitting (overtraining) and its symptoms, techniques to avoid or suppress it, a naive derivation of regularization, the regularization algorithm, regularized linear and logistic regression.
8. Support Vector Machines: the optimization goal as an alternative perspective on logistic regression, the mathematical model of the SVM, the hypothesis with a safety margin, kernels.
9. Neural networks: history, the biological model behind artificial neural networks, the mathematical model of a neuron, MLP-type layered networks, classification with an ANN, the cost function of an ANN and its optimization, learning, the backpropagation algorithm.
10. Clustering: general remarks on unsupervised learning, the K-means method, its optimization criterion, centroid selection, selection of the number of clusters, the K-means algorithm.
11. Dimensionality reduction: Principal Component Analysis (PCA), description of PCA functionality and the PCA algorithm, properties of PCA, its mathematical background, applications and case studies.
12. Blind source separation: motivation and definition of the blind source separation problem, Independent Component Analysis (ICA), description of ICA functionality and the ICA algorithm, properties of ICA, its mathematical background, applications and case studies.
13. Evolutionary and genetic algorithms: metaheuristic strategies for state-space search; genotype encoding techniques; operators and parameters of a GA, the fitness function; the general canonical form of a genetic algorithm, SOEA/MOEA; strategies for selecting the new generation.
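To give a flavour of the material, topics 4 and 5 can be illustrated by a minimal sketch of batch gradient descent for univariate linear regression. This is not course material, only an illustration; the function name, data, and hyperparameters are chosen for the example.

```python
# Illustrative sketch: batch gradient descent for univariate linear regression.
# Hypothesis: h(x) = theta0 + theta1 * x
# Cost: J(theta0, theta1) = (1 / (2 m)) * sum_i (h(x_i) - y_i)^2

def gradient_descent(xs, ys, alpha=0.05, iterations=5000):
    """Minimize the squared-error cost by simultaneous parameter updates."""
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iterations):
        # Residuals h(x_i) - y_i under the current parameters.
        errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Partial derivatives of J with respect to theta0 and theta1.
        grad0 = sum(errors) / m
        grad1 = sum(e * x for e, x in zip(errors, xs)) / m
        # Simultaneous update, stepping against the gradient.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Data generated by y = 2x + 1; gradient descent should recover
# an intercept close to 1 and a slope close to 2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
t0, t1 = gradient_descent(xs, ys)
print(round(t0, 3), round(t1, 3))
```

The same update rule generalizes to the multivariate case of topic 5 by computing one partial derivative per parameter and updating all parameters simultaneously.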
Learning activities and teaching methods
Lecture supplemented with a discussion, Lecture with practical applications, Discussion, Laboratory work, Task-based study method, Individual study, Self-study of literature, Lecture with visual aids
- Graduate study programme term essay (40-50): 40 hours per semester
- Preparation for an examination (30-60): 30 hours per semester
- Presentation preparation (report) (1-10): 6 hours per semester
- Practical training (number of hours): 26 hours per semester
- Contact hours: 39 hours per semester
- Preparation for laboratory testing; outcome analysis (1-8): 15 hours per semester
Prerequisites

Knowledge
A good command of mathematical analysis, calculus, probability and statistics, and numerical methods. Active programming skills in a high-level language such as C/C++, Object Pascal, Java, or C#; familiarity with MATLAB or Octave is welcome. The ability to study scientific literature independently, and a satisfactory level of English (study is presumed to rely mostly on English resources).
Skills
The ability to write non-trivial programs in a high-level language such as C/C++, Object Pascal, Java, or C#; familiarity with MATLAB or Octave is welcome. The ability to study scientific literature written in English.
Competences
N/A
Learning outcomes

Knowledge
By passing the course, a student gains a general overview of the paradigms of artificial cognitive systems, focused mainly on their practical application in the field of artificial intelligence and intelligent software. He/she acquires a deep understanding of the basic techniques of machine learning and of the representation, derivation, and recording of knowledge and rational behaviour, i.e. decision making and problem solving. This allows him/her to become involved in research and development tasks both in subsequent study and in industrial practice.
Skills
A student can implement basic machine learning techniques, or modify them with a deep understanding. He/she can also design his/her own well-grounded approaches to solving problems in the field of artificial intelligence and machine learning.
Competences
N/A
Teaching methods

Knowledge

- Lecture with visual aids
- Lecture supplemented with a discussion
- Practicum
- Task-based study method
- Self-study of literature
- Individual study
- Interactive lecture
- Discussion

Skills

- Practicum
- Individual study

Competences

- Lecture supplemented with a discussion
- Discussion
- Task-based study method
- Self-study of literature
Assessment methods

Knowledge

- Test
- Skills demonstration during practicum
- Oral exam

Skills

- Oral exam
- Test
- Skills demonstration during practicum

Competences

- Oral exam
- Test
- Skills demonstration during practicum
Recommended literature

- Barber, David. Bayesian Reasoning and Machine Learning. Cambridge: Cambridge University Press, 2012. ISBN 978-0-521-51814-7.
- Bishop, Christopher M. Pattern Recognition and Machine Learning. Springer, 2006. ISBN 978-0-387-31073-2.
- Murphy, Kevin P. Machine Learning: A Probabilistic Perspective. Cambridge: MIT Press, 2012. ISBN 978-0-262-01802-9.
- Nilsson, Nils J. Introduction to Machine Learning. Stanford: Stanford University, 2005.
- Smola, A. J., Vishwanathan, S. V. N. Introduction to Machine Learning. Cambridge: Cambridge University Press, 2008. ISBN 0-521-82583-0.
- Hastie, Trevor, Tibshirani, Robert, Friedman, Jerome. The Elements of Statistical Learning. Springer, 2009. ISBN 978-0-387-84857-0.