
Syllabus (MATH 587)


   Basic information
Course title: Mathematical Background and Applications of Neural Networks
Course code: MATH 587
Lecturer: Assoc. Prof. Dr. Selçuk TOPAL
ECTS credits: 7.5
GTU credits: 3 (3+0+0)
Year, Semester: 1/2, Fall and Spring
Level of course: Second Cycle (Master's)
Type of course: Area Elective
Language of instruction: English
Mode of delivery: Face to face
Pre- and co-requisites: Calculus I, II
Professional practice: No
Purpose of the course: To provide an understanding of data processing (e.g. classification and recognition) through the use of appropriate architectures and mathematical models of artificial neural networks (ANNs), and to enable students to assess such network properties as convergence, capacity, and stability.
   Learning outcomes

Upon successful completion of this course, students will be able to:

  1. Explain the essence of the main mathematical models used for neurons and their networks

    Contribution to Program Outcomes

    1. Relate mathematics to other disciplines and develop mathematical models for multidisciplinary problems

    Method of assessment

    1. Written exam
  2. Use the capabilities of different ANN models for data processing effectively, and justify the selection of an appropriate model

    Contribution to Program Outcomes

    1. Define and manipulate advanced concepts of Mathematics
    2. Design and conduct research projects independently

    Method of assessment

    1. Written exam
    2. Homework assignment
  3. Apply the ANN techniques studied to the correct formulation and solution of practical tasks

    Contribution to Program Outcomes

    1. Design and conduct research projects independently
    2. Develop mathematical, communicative, problem-solving, and brainstorming skills

    Method of assessment

    1. Seminar/presentation
    2. Term paper
   Contents
Week 1: Basic properties of neurons, connection patterns between neurons. Main mathematical models proposed for neurons and their distinguishing features.
Week 2: Preliminaries of graph theory applicable to the topological characterization of ANNs (particular classes of graphs, directed graphs, topological invariants, Voronoi diagrams and Delaunay tessellations, etc.).
Week 3: Basics of algorithm characterization I: P and NP-complete problems; the shortest path problem; interconnection and routing algorithms.
Week 4: Basics of algorithm characterization II: placement and partitioning. Associative memory (AM), linear associators, AM implementation.
Week 5: Perceptrons as the simplest learning machines. Main definitions (decision function, pattern space, decision surface). Linear separability of training patterns. The correction-increment and perceptron learning algorithms as implementations of the gradient descent method (a code sketch appears after this schedule).
Week 6: The perceptron convergence theorem. Convergence speed. The Widrow-Hoff LMS algorithm. Order of predicates and perceptrons.
Week 7: Exact representation using feedforward networks. Kolmogorov's theorem and its consequences. Approximate representations. Training fixed multilayer feedforward networks by backpropagation (a sketch appears after this schedule).
Week 8: Midterm exam
Week 9: Structural training of multilayer feedforward networks (algorithm, robustness, and size issues). Unsupervised and reinforcement learning (principal component analysis networks, self-organization in networks). Probabilistic neural networks.
Week 10: Complexity of learning using feedforward networks: learnability in ANNs, generalizability of learning, space complexity of feedforward networks.
Week 11: Growth algorithms (the upstart algorithm, learning by divide and conquer, etc.). Networks with nonlinear synapses and nonlinear synaptic contacts.
Week 12: Symmetric Hopfield networks (convergence proof, capacity and spurious memories, correlated patterns; a sketch appears after this schedule). Symmetric networks with analog units (convergence proof, cellular neural networks).
Week 13: Seeking the global minimum. A learning algorithm for the Boltzmann machine. Asymmetric recurrent networks.
Week 14: Unsupervised competitive learning. Adaptive resonance networks. Self-organizing feature maps.
Week 15*: NN approaches to solving hard problems (multitarget tracking, time series prediction, speech generation and recognition, etc.; a competitive-learning sketch appears after this schedule under Week 14's topic).
Week 16*: Final exam
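The sketch referenced in Week 5 follows. It is a minimal illustration, in Python with NumPy, of the correction-increment (perceptron learning) rule as gradient descent on misclassified patterns; the AND data set, bipolar targets, and learning rate are illustrative assumptions, not part of the course materials.

import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Correction-increment rule on bipolar targets y in {-1, +1}.

    X is assumed to carry a constant bias column; both X and y here are
    illustrative inputs, not course data.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:   # pattern misclassified (or on the surface)
                w += lr * y_i * x_i         # correction-increment step
                errors += 1
        if errors == 0:                     # all patterns separated: converged
            break
    return w

# Linearly separable AND function; the first column is the bias input.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]])
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
print("weights:", w, "predictions:", np.sign(X @ w))

The Widrow-Hoff LMS rule of Week 6 differs only in the update: it applies w += lr * (y_i - np.dot(w, x_i)) * x_i to every pattern, minimizing squared error instead of counting misclassifications.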
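Likewise for Week 7, a minimal sketch of training a fixed multilayer feedforward network by backpropagation: one hidden layer of sigmoid units fitted to XOR by batch gradient descent on squared error. The layer sizes, learning rate, iteration count, and random seed are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)    # input -> hidden weights
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)    # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                      # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)                    # forward pass, output layer
    d_out = (out - y) * out * (1 - out)           # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)            # error propagated back to hidden units
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))                   # approaches [0, 1, 1, 0]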
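The Week 12 reference: a minimal sketch of a symmetric Hopfield network with Hebbian (outer-product) storage and asynchronous updates. The energy E = -(1/2) x^T W x cannot increase under such updates, which is the heart of the convergence proof named in the schedule; the stored patterns below are illustrative.

import numpy as np

def store(patterns):
    """Hebbian outer-product rule; the zero diagonal keeps updates stable."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, x):
    return -0.5 * x @ W @ x                       # never increases under async updates

def recall(W, x, sweeps=10, seed=0):
    x = x.copy()
    rng = np.random.default_rng(seed)
    for _ in range(sweeps):
        for i in rng.permutation(len(x)):         # update one unit at a time
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

patterns = np.array([[ 1,  1,  1,  1, -1, -1, -1, -1],
                     [ 1, -1,  1, -1,  1, -1,  1, -1]])
W = store(patterns)
noisy = patterns[0].copy(); noisy[0] = -noisy[0]  # corrupt one bit
fixed = recall(W, noisy)
print("recovered first pattern:", np.array_equal(fixed, patterns[0]),
      "energy:", energy(W, fixed))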
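Finally, for Week 14's competitive learning: a minimal winner-take-all sketch, the building block behind self-organizing feature maps (which add a neighborhood function around the winning unit). The synthetic two-cluster data, unit count, and learning rate are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: two clusters, one near (0, 0) and one near (3, 3).
labels = rng.integers(0, 2, size=(200, 1))
data = rng.normal(size=(200, 2)) + labels * np.array([[3.0, 3.0]])

W = rng.normal(size=(2, 2))        # one weight vector per competitive unit
lr = 0.1
for x in data:
    winner = np.argmin(np.linalg.norm(W - x, axis=1))   # closest unit wins
    W[winner] += lr * (x - W[winner])                   # move winner toward input

print(W)                           # weight vectors approach the cluster centres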
Textbooks and materials:
Recommended readings: N. K. Bose and P. Liang, "Neural Network Fundamentals with Graphs, Algorithms, and Applications";
Laurene Fausett, "Fundamentals of Neural Networks: Architectures, Algorithms, and Applications".
  * Between the 15th and 16th weeks there is a free week for students to prepare for the final exam.
Assessment
Method of assessment      Week number           Weight (%)
Mid-term exam:            8                     30
Other in-term studies:    -                     0
Project:                  14                    20
Homework:                 3, 5, 7, 9, 11, 13    10
Quiz:                     -                     0
Final exam:               16                    40
  Total weight:                                 100
   Workload
Activity                             Duration (hours per week)   Total number of weeks   Total hours in term
Courses (face-to-face teaching):     3                           14                      42
Own studies outside class:           0                           0                       0
Practice, recitation:                1                           14                      14
Homework:                            2                           6                       12
Term project:                        5                           14                      70
Term project presentation:           5                           1                       5
Quiz:                                0                           0                       0
Own study for mid-term exam:         10                          2                       20
Mid-term exam:                       2                           1                       2
Own study for final exam:            10                          2                       20
Final exam:                          2                           1                       2
    Total workload: 187 hours
    Total ECTS credits: 187 / 25 = 7.48 ≈ 7.5
  * ECTS credits are calculated by dividing the total workload by 25 (1 ECTS = 25 work hours).