Contents
Week 1:
Basic properties of neurons, connection patterns between neurons. Main mathematical models proposed for neurons and their distinguishing features.
Week 2:
Preliminaries of Graph Theory applicable to the topological characterization of ANNs (particular classes of graphs, directed graphs, topological invariants, Voronoi diagrams and Delaunay tessellations, etc.).
Week 3:
Basics of algorithm characterization I: P- and NP-complete problems, the shortest-path problem, interconnection and routing algorithms.
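For illustration only (not part of the course materials), a minimal sketch of Dijkstra's shortest-path algorithm, one of the classical routing algorithms in this unit; the graph encoding is my own choice:

    import heapq

    def dijkstra(adj, source):
        # adj maps each node to a list of (neighbor, weight) pairs;
        # all weights must be nonnegative.
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in adj[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (dist[v], v))
        return dist

    adj = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
    print(dijkstra(adj, "a"))   # {'a': 0, 'b': 1, 'c': 3}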
Week 4:
Basics of algorithm characterization II: Placement and partitioning. Associative Memory (AM), linear associators, AM implementation.
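As a sketch of the linear associator idea (illustrative code assuming the standard Hebbian outer-product rule; function names are my own):

    import numpy as np

    def train_linear_associator(X, Y):
        # Hebbian outer-product rule: W = sum_k y_k x_k^T.
        # Rows of X and Y are the input/output pattern pairs.
        return Y.T @ X

    # With orthonormal input patterns, recall W @ x_k is exact.
    X = np.eye(3)                                   # three orthonormal keys
    Y = np.array([[1., 0.], [0., 1.], [1., 1.]])    # associated outputs
    W = train_linear_associator(X, Y)
    print(W @ X[0])                                 # -> [1. 0.]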
Week 5:
Perceptrons as the simplest learning machines. Main definitions (decision function, pattern space, decision surface). Linear separability of training patterns. Correction increment and perceptron learning algorithms as implementations of the gradient descent method.
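A minimal sketch of the fixed-increment perceptron learning rule described above (illustrative only; parameter values and names are assumptions, not the course's reference code):

    import numpy as np

    def perceptron_train(X, y, eta=1.0, max_epochs=100):
        # X: (n, d) patterns; y: labels in {-1, +1}.
        # Augment inputs with a bias component so the decision
        # surface need not pass through the origin.
        Xa = np.hstack([X, np.ones((len(X), 1))])
        w = np.zeros(Xa.shape[1])
        for _ in range(max_epochs):
            errors = 0
            for x, t in zip(Xa, y):
                if t * (w @ x) <= 0:       # misclassified pattern
                    w += eta * t * x       # correction increment
                    errors += 1
            if errors == 0:                # all patterns correct: stop
                break
        return w

    # The AND function is linearly separable, so training converges.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1])
    w = perceptron_train(X, y)
    Xa = np.hstack([X, np.ones((4, 1))])
    print(np.sign(Xa @ w))                 # -> [-1. -1. -1.  1.]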
Week 6:
The perceptron convergence theorem. Convergence speed. The Widrow-Hoff LMS algorithm. The order of predicates and perceptrons.
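The Widrow-Hoff LMS update is compact enough to sketch directly (a hand-rolled illustration; the learning rate and data are arbitrary):

    import numpy as np

    def lms_train(X, d, eta=0.05, epochs=50):
        # LMS rule: w <- w + eta * (d_k - w.x_k) * x_k, i.e.
        # stochastic gradient descent on the squared error.
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for x, target in zip(X, d):
                w += eta * (target - w @ x) * x
        return w

    # Noiseless linear target d = 2*x1 - x2: LMS recovers the weights.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    d = 2 * X[:, 0] - X[:, 1]
    print(lms_train(X, d))   # approaches [ 2. -1.]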
Week 7:
Exact representation using Feedforward Networks. Kolmogorov's theorem and its consequences. Approximate representations. Fixed Multilayer Feedforward Network training by backpropagation.
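A compact backpropagation sketch for a fixed two-layer network with sigmoid hidden units and a linear output (illustrative; the architecture, step size, and seed are my own choices):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop_step(W1, W2, x, t, eta=0.5):
        h = sigmoid(W1 @ x)                            # forward pass
        yhat = W2 @ h
        delta_out = yhat - t                           # gradient of 0.5*||t - yhat||^2
        delta_hid = (W2.T @ delta_out) * h * (1 - h)   # chain rule through sigmoid
        W2 -= eta * np.outer(delta_out, h)
        W1 -= eta * np.outer(delta_hid, x)
        return W1, W2

    # XOR with real-valued targets; 2 inputs + bias, 4 hidden units.
    rng = np.random.default_rng(1)
    W1 = rng.normal(size=(4, 3))
    W2 = rng.normal(size=(1, 4))
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(5000):
        for xin, t in data:
            x = np.array(xin + [1.0])                  # append bias input
            W1, W2 = backprop_step(W1, W2, x, np.array([float(t)]))
    for xin, t in data:
        x = np.array(xin + [1.0])
        print(xin, float(W2 @ sigmoid(W1 @ x)))        # typically near 0, 1, 1, 0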
Week 8:
Midterm exam |
Week 9:
Structural training of Multilayer Feedforward Networks (algorithm, robustness, and size issues). Unsupervised and reinforcement learning (Principal Component Analysis networks, self-organization in networks). Probabilistic neural networks.
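For the PCA-network topic, a minimal sketch of Oja's rule, under which a single linear unit's weight vector tends to the first principal component direction (illustrative parameters):

    import numpy as np

    def oja_pca(X, eta=0.01, epochs=200):
        # Oja's rule: w <- w + eta * y * (x - y * w), with y = w.x.
        # The decay term keeps ||w|| near 1 while the Hebbian term
        # rotates w toward the direction of maximum variance.
        rng = np.random.default_rng(2)
        w = rng.normal(size=X.shape[1])
        for _ in range(epochs):
            for x in X:
                y = w @ x
                w += eta * y * (x - y * w)
        return w / np.linalg.norm(w)

    # Data elongated along (1, 1): w aligns with it (up to sign).
    rng = np.random.default_rng(3)
    t = rng.normal(size=500)
    X = np.column_stack([t, t]) + 0.1 * rng.normal(size=(500, 2))
    print(oja_pca(X))   # close to +/-[0.707, 0.707]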
Week 10:
Complexity of learning using Feedforward Networks: learnability in ANNs, generalizability of learning, space complexity of Feedforward Networks.
Week 11:
Growth algorithms (the upstart algorithm, learning by divide and conquer, etc.). Networks with nonlinear synapses and nonlinear synaptic contacts.
Week 12:
Symmetric Hopfield networks (convergence proof, capacity and spurious memory, correlated patterns). Symmetric networks with analog units (convergence proof, cellular NNs).
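A sketch of a symmetric Hopfield network with Hebbian storage and asynchronous updates (illustrative; the pattern size and update order are arbitrary). Under these dynamics the energy E(s) = -0.5 * s^T W s never increases, which is the core of the convergence proof:

    import numpy as np

    def hopfield_store(patterns):
        # Hebbian storage: W = sum_k p_k p_k^T, symmetric, zero diagonal.
        P = np.array(patterns, dtype=float)
        W = P.T @ P
        np.fill_diagonal(W, 0.0)
        return W

    def hopfield_recall(W, state, sweeps=10):
        rng = np.random.default_rng(4)
        s = np.array(state, dtype=float)
        for _ in range(sweeps):
            for i in rng.permutation(len(s)):   # asynchronous updates
                s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        return s

    p = np.array([1, -1, 1, -1, 1, -1])
    W = hopfield_store([p])
    noisy = p.copy()
    noisy[0] = -noisy[0]                        # flip one bit
    print(hopfield_recall(W, noisy))            # recovers p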
Week 13:
Seeking the global minimum. A learning algorithm for the Boltzmann machine. Asymmetric recurrent networks.
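To illustrate the stochastic-unit idea behind the Boltzmann machine, a sketch of Gibbs sampling with a cooling schedule (the schedule and network here are arbitrary choices of mine, not the course's algorithm):

    import numpy as np

    def boltzmann_sweep(W, s, T, rng):
        # Gibbs update for 0/1 stochastic units: P(s_i = 1) = sigmoid(net_i / T).
        # W must be symmetric with zero diagonal.
        for i in rng.permutation(len(s)):
            net = W[i] @ s
            s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-net / T)) else 0.0
        return s

    def anneal(W, s, schedule=(4.0, 2.0, 1.0, 0.5, 0.1), sweeps=20):
        # Lowering T slowly lets the state escape local minima of
        # E(s) = -0.5 * s^T W s, unlike deterministic descent.
        rng = np.random.default_rng(7)
        for T in schedule:
            for _ in range(sweeps):
                s = boltzmann_sweep(W, s, T, rng)
        return s

    rng = np.random.default_rng(8)
    W = rng.normal(size=(8, 8))
    W = (W + W.T) / 2.0
    np.fill_diagonal(W, 0.0)
    s = (rng.random(8) < 0.5).astype(float)
    print(anneal(W, s))   # a low-energy state of the random network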
Week 14:
Unsupervised competitive learning. Adaptive resonance networks. Self-organizing feature maps.
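A one-dimensional self-organizing feature map sketch in the Kohonen style (all parameter values are illustrative):

    import numpy as np

    def som_train(X, n_units=10, eta=0.3, sigma=2.0, epochs=100):
        # Units sit on a line; the winner and its neighbors are pulled
        # toward each input with a Gaussian falloff that shrinks over time.
        rng = np.random.default_rng(5)
        W = rng.uniform(X.min(), X.max(), size=(n_units, X.shape[1]))
        idx = np.arange(n_units)
        for epoch in range(epochs):
            decay = 1.0 - epoch / epochs
            for x in X:
                winner = np.argmin(np.linalg.norm(W - x, axis=1))
                h = np.exp(-(idx - winner) ** 2 / (2 * (sigma * decay + 1e-3) ** 2))
                W += eta * decay * h[:, None] * (x - W)
        return W

    # 1-D inputs uniform on [0, 1]: the weights spread out and order
    # themselves along the line.
    X = np.random.default_rng(6).uniform(size=(200, 1))
    print(np.round(som_train(X).ravel(), 2))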
Week 15*:
NN approaches to solving hard problems (multitarget tracking, time series prediction, speech generation and recognition, etc.).
Week 16*:
Final exam |
Textbooks and materials: |
Recommended readings: |
N. K. Bose and P. Liang, "Neural Network Fundamentals with Graphs, Algorithms, and Applications", McGraw-Hill, 1996.
L. Fausett, "Fundamentals of Neural Networks: Architectures, Algorithms, and Applications", Prentice Hall, 1994.
|
* Between the 15th and 16th weeks there is a free week for students to prepare for the final exam.