This will be an introductory graduate-level course in neural networks for signal processing. The course starts with a motivation of how the human brain is inspirational to building artificial neural networks. The neural networks are viewed as directed graphs with various network topologies towards learning tasks driven by optimization techniques. The course covers Rosenblatt's perceptron, regression modeling, the multilayer perceptron (MLP), kernel methods and radial basis functions (RBF), support vector machines (SVM), regularization theory, and principal component analysis (Hebbian and kernel based). Towards the end, topics such as convolutional neural networks, which build on the MLP, will be touched upon. The course will have assignments that are both theoretical and computer-based, working with actual data.

INTENDED AUDIENCE: Graduate students; senior undergraduates can also participate, as can engineers and scientists in related industries.

PREREQUISITES: Basic mathematical background in probability, linear algebra, and signals and systems, or equivalent.

INDUSTRY SUPPORT: AI-based and machine-learning-based industries.

Week 1: Introduction, the human brain, models of a neuron, neural communication, neural networks as directed graphs, network architectures (feed-forward, feedback, etc.), knowledge representation.

Week 2: Learning processes, learning tasks, the perceptron, the perceptron convergence theorem, the relationship between the perceptron and Bayes classifiers, the batch perceptron algorithm.

Week 3: Modeling through regression, linear and logistic regression for multiple classes.

Week 4: The multilayer perceptron, batch and online learning, derivation of the back-propagation algorithm, the XOR problem, the role of the Hessian in online learning, annealing and optimal control of the learning rate.

Week 5: Approximations of functions, cross-validation, network pruning and complexity regularization, convolutional networks, non-linear filtering.
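To give a flavor of the Week 2 material, here is a minimal sketch of Rosenblatt's perceptron learning rule, run on a toy linearly separable problem (an AND gate). The dataset, learning rate, and epoch budget are illustrative choices, not taken from the course; the update `w <- w + lr * y_i * x_i` on each mistake is the standard online rule whose termination on separable data is the subject of the perceptron convergence theorem.

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=20):
    """Online perceptron: update w on each misclassified sample.

    Labels are expected in {-1, +1}. A bias input of 1 is appended,
    so the bias is learned as the last weight component.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:      # misclassified (or on the boundary)
                w += lr * yi * xi       # Rosenblatt's update rule
                errors += 1
        if errors == 0:                 # a full pass with no mistakes: done
            break
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xb @ w)

# Toy problem: AND gate, labels in {-1, +1} (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = train_perceptron(X, y)
```

Because AND is linearly separable, the convergence theorem guarantees the loop exits with zero errors after finitely many updates.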
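Likewise, the Week 4 topics (the MLP, back-propagation, and the XOR problem) can be sketched in a few lines. This is a hedged illustration, not the course's prescribed implementation: the hidden-layer width, learning rate, iteration count, and random seed are all arbitrary choices, and the constant factor of the mean-squared-error gradient is absorbed into the learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a single perceptron fails,
# but a two-layer network trained by back-propagation can fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer: 4 units (arbitrary)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)       # hidden activations
    out = sigmoid(h @ W2 + b2)     # network output in (0, 1)
    return h, out

_, out = forward(X)
mse_before = ((out - y) ** 2).mean()

lr = 1.0
for _ in range(5000):              # batch gradient descent
    h, out = forward(X)
    # Deltas via the chain rule; sigmoid'(z) = s(z) * (1 - s(z)).
    d_out = (out - y) * out * (1 - out)     # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)      # delta propagated back to hidden layer
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
mse_after = ((out - y) ** 2).mean()
```

After training, `mse_after` should be well below `mse_before`, showing the back-propagated gradients descending the error surface on a problem the single-layer perceptron cannot solve.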