EVENT: FACULTY DEVELOPMENT PROGRAM (FDP)

FDP DAY ONE

25 June 2021, 10:00am – 11:30am
Topic: Neural Networks for Pattern Classification
Speaker: Prof. (Dr.) A.R. Garg (HOD, Dept. of Electrical Engineering)

SUMMARY

Prof. (Dr.) A.R. Garg explained the following:

- Supervised learning: the input and its corresponding output are given, and the system learns to produce the correct output for new inputs.
- Reinforcement learning: learning from feedback, as in animal learning.
- Unsupervised learning: only inputs are given as training data, and the system finds the structure to which the observed inputs belong.
- Applications: handwriting recognition, OCR, speech recognition, speaker recognition/verification, text classification, security (face detection and recognition), computer vision, diagnosis, adaptive control, fraud detection, database marketing, spam filtering, games, and financial prediction.
- Unsupervised learning applications: clustering, embedding, and compression.
- Problems of interest for computer scientists and engineers: pattern classification, clustering, function approximation, prediction/forecasting, optimization, and content-addressable memory.
- Huge simplification: a neuron computes a weighted sum of its inputs and fires when that sum exceeds a threshold.
- Hebbian learning: synaptic weights change as a function of the pre- and post-synaptic activities.
- Specialities of biological neural systems: massive parallelism; distributed representation and computation; learning ability; generalization ability; adaptivity; inherent contextual information processing; fault tolerance; and low energy consumption.
- Neural network examples: prediction in banking, and optimizing ad choice in advertising.
- Artificial Neural Network (ANN): an information-processing paradigm inspired by the biological nervous system.
- Activation function (which transforms a neuron's input into its output) and its features, such as its squashing effect and simple calculation.
- Decision boundaries: divide the feature space by drawing a hyperplane across it.
- Discriminant function: returns different values on opposite sides of the decision boundary.
- A perceptron consists of input values, weights and a bias, a weighted sum, and an activation function (a minimal sketch follows this list).
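The perceptron described above can be written in a few lines of code. Below is a minimal Python sketch (using NumPy); the AND-gate weights and inputs are illustrative assumptions, not examples from the session. It computes a weighted sum of the inputs plus a bias and fires (outputs 1) when the sum exceeds the threshold, and the equation w·x + b = 0 is the hyperplane decision boundary mentioned above.

    import numpy as np

    def perceptron(x, w, b):
        # Weighted sum of inputs plus bias, passed through a step activation:
        # the neuron "fires" (returns 1) when the sum exceeds the threshold 0.
        s = np.dot(w, x) + b
        return 1 if s > 0 else 0

    # Illustrative weights realising a logical AND; the decision boundary
    # w.x + b = 0 is the hyperplane that separates the two classes.
    w = np.array([1.0, 1.0])
    b = -1.5
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, '->', perceptron(np.array(x), w, b))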

SESSION 2

25 June 2021, 11:45am – 1:30pm

Topic: Convolutional Neural Networks
Speaker: Prof. (Dr.) A.R. Garg (HOD, Dept. of Electrical Engineering, MBM Engineering College)

SUMMARY

Prof. (Dr.) A.R. Garg explained the following:
- Neural representation of reality: physical reality → energy holoflux, hypervolume → conceptual reality.
- Specialities of biological neural systems, revisited: massive parallelism; distributed representation and computation; learning ability; generalization ability; adaptivity; inherent contextual information processing; fault tolerance; and low energy consumption.
- Raw input: input data → neural network, used for function approximation, forecasting, optimization, pattern classification, feature extraction, clustering, and content-addressable memory.
- Processed input: input data → feature extractor → feature vector → neural network. Processing the input treats the curse of dimensionality, reduces the computational requirements of the NN, and improves its generalization ability.
- Retinal receptive field: the area of visual space in which a visual stimulus can change the spiking response of the cell.
- Components of a typical convolutional neural network: the input, the first layer, the subsequent layers, and the fully connected layer.
- CNN (Convolutional Neural Network) with transfer learning: the feature extraction block (all the convolutional layers) of a pre-trained network such as VGG16, ResNetV2, InceptionResNetV2, or MobileNetV2 transfers its learned knowledge to the new task (a minimal sketch follows this list).
- Parameter sharing: a feature detector that is useful in one part of an image is useful in another part of the image.
- Sparsity of connections: in each layer, each output value depends on only a small number of inputs.
- Applications of CNNs: supervised ConvNets for object detection, ConvNets for detecting faces and licence plates, ConvNet-based systems for tracking customers in supermarkets, etc.
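As an illustration of the transfer-learning setup above, here is a minimal Keras sketch in which the convolutional layers of a pre-trained MobileNetV2 act as a frozen feature extraction block, with a new fully connected classifier on top. The input size, two-class head, and optimizer are illustrative assumptions, not details given in the session.

    import tensorflow as tf

    # Feature Extraction Block: MobileNetV2 convolutional layers pre-trained
    # on ImageNet, with the original classification head removed.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights='imagenet')
    base.trainable = False  # reuse the learned features; do not retrain them

    # New fully connected classifier for the target task (2 classes assumed).
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(2, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.summary()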

SESSION 3

25 June 2021, 2:30pm – 4:30pm
Topic: Practical Implementation of Neural Networks
Speaker: Mr. Saurabh M. Sharma, Research Scholar (MTech, CSE), Sir Padampat Singhania University

SUMMARY

Mr. Saurabh Sharma explained the following:
- Linear regression implemented using a neural network (a minimal sketch follows this list).
- Architecture of a single neuron and of a neural network.
- Data pre-processing: checking for and filling missing values, normalization (for multivariate problems), and encoding categorical values as dummy variables (for multivariate problems).
- Problems from the dataset: how to extract the exact values needed to plot the graphs and how to assign the values accurately.
- Hyperparameters: the parameters we set before training.
- Architecture of a CNN classifier: cat vs. dog.
- Implementation: CNN implementation with a practical example.
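For the linear-regression-as-a-neural-network demonstration, a minimal Keras sketch might look as follows; the synthetic data, learning rate, and epoch count are illustrative assumptions (hyperparameters set before training, as noted above). A single neuron with a linear activation computes exactly w·x + b, which is linear regression.

    import numpy as np
    import tensorflow as tf

    # Illustrative synthetic data: y = 3x + 2 plus noise.
    x = np.random.rand(200, 1).astype('float32')
    y = (3 * x + 2 + 0.1 * np.random.randn(200, 1)).astype('float32')

    # Pre-processing: normalize the input feature (zero mean, unit variance).
    x = (x - x.mean()) / x.std()

    # A single neuron with a linear (identity) activation is exactly
    # linear regression: output = w * x + b.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
                  loss='mse')
    model.fit(x, y, epochs=100, verbose=0)  # epochs: a hyperparameter
    print(model.layers[0].get_weights())    # learned weight and bias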