Prof. Manu Pratap Singh
Dr. B.R. Ambedkar University, Agra, India
Title: Soft Computing Techniques of Multi-Objective Optimization for Pattern Recognition
Abstract: 

Pattern recognition is a dominant research area in the field of machine intelligence, and it is addressed with various techniques of soft computing. In several soft-computing approaches, pattern recognition is treated as an unconstrained multi-objective optimization problem. Pattern storage and recall, i.e. pattern association, is one of the prominent methods for the pattern recognition task, and one would like to realize it with an artificial neural network (ANN) acting as an associative memory. Pattern storage is generally accomplished by a feedback network of processing units with non-linear bipolar output functions, whose stable states represent the memorized (stored) patterns. Since the Hopfield neural network with associative memory was introduced, various modifications have been developed for storing and retrieving memory patterns as fixed-point attractors. The dynamics of these networks has been studied extensively because of their potential applications, since it determines the retrieval quality of the associative memory for the already stored patterns. The pattern information is encoded in an unsupervised manner, as a sum of correlation weight matrices, into the connection strengths between the processing units of the feedback neural network, using only the locally available information of the pre- and post-synaptic units; the result is taken as the final, or parent, weight matrix.
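As an illustration, the sum-of-correlations (Hebbian) encoding described above can be sketched in a few lines of Python; the 1/N scaling and the zeroed diagonal are common Hopfield conventions assumed here, not details taken from this abstract.

    import numpy as np

    def hebbian_weights(patterns):
        # Encode bipolar (+1/-1) patterns as a sum of correlation
        # (outer-product) matrices -- the "parent" weight matrix.
        n = patterns.shape[1]              # number of units
        w = np.zeros((n, n))
        for p in patterns:                 # one correlation term per pattern
            w += np.outer(p, p)
        np.fill_diagonal(w, 0)             # no self-connections
        return w / n                       # conventional 1/N scaling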

Hopfield proposed a fully connected neural network model of associative memory in which information is stored by distributing it among the neurons and recalled from the dynamically relaxed neuron states. If these states are mapped to the desired memory vectors, the time evolution of the dynamics leads to a stable state; these stable states of the network represent the stored patterns. Hopfield used the Hebbian learning rule to prescribe the weight matrix that establishes these stable states. A major drawback of this type of neural network is that the memory attractors are accompanied by a huge number of spurious attractors, so the network dynamics is very likely to be trapped in them, which prevents retrieval of the memory attractors. Hopfield-type networks are also likely to be trapped in non-optimal local minima close to the starting point, which is undesirable: the presence of false minima increases the probability of error in recall of the stored patterns. The problem of false minima can be reduced by adopting an evolutionary algorithm to search for the global minima, and many researchers have applied evolutionary techniques (simulated annealing and genetic algorithms) to this end. Imada and Araki have applied evolutionary computation to Hopfield neural networks in various ways, and a rigorous treatment of the capacity of the Hopfield associative memory is available in the literature. The genetic algorithm has been identified as one of the prominent search techniques for exploring the global minima of the Hopfield neural network.
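A minimal sketch of the retrieval dynamics just described, assuming bipolar threshold units and asynchronous updates, is given below; whether the fixed point reached is a stored pattern or a spurious attractor depends on the basin of attraction in which the initial state falls.

    import numpy as np

    def hopfield_recall(w, state, max_sweeps=100, rng=None):
        # Asynchronously relax a bipolar state vector under weights w
        # until it reaches a fixed point (a stored or spurious attractor).
        rng = rng or np.random.default_rng()
        s = state.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(s)):       # random update order
                new = 1 if w[i] @ s >= 0 else -1    # bipolar threshold unit
                if new != s[i]:
                    s[i] = new
                    changed = True
            if not changed:                         # fixed point reached
                return s
        return s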

Developed by Holland, the genetic algorithm is a biologically inspired search technique. In simple terms, the technique generates a random initial population of individuals, each of which represents a potential solution to the problem. Each member of the population is evaluated with a fitness function against some known criteria, and members are then selected for reproduction, producing a new population through the operators of the genetic algorithm. This process of evaluation, selection, and recombination is iterated until the population converges to an acceptable optimal solution. Genetic algorithms (GAs) require only fitness information, not gradient information or other internal knowledge of the problem, as neural networks do. GAs have traditionally been used in optimization but, with a few enhancements, can perform classification, prediction, and pattern association as well. The GA has been used very effectively for function optimization, and it can search efficiently for approximate global minima. It has been observed that pattern recall in Hopfield-type neural networks can be performed efficiently with a GA; in this case the GA is expected to yield alternative globally optimal values of the weight matrix corresponding to all the stored patterns. The conventional Hopfield neural network suffers from non-convergence and local minima as the complexity of the network increases, whereas the GA is particularly good at searching large and complex spaces for the global optima and at converging. Considerable research into the Hopfield network has shown that the model may be trapped by four well-identified classes of spurious attractors: mixture states, spin-glass states, complement states, and alien attractors. As the complexity of the search space increases, the GA presents an increasingly attractive alternative for pattern storage and recall in Hopfield-type neural networks of associative memory.
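For concreteness, a generic sketch of the evaluate-select-recombine loop just described might look as follows; the crossover and mutate operators are user-supplied, and the elitism and truncation-selection choices are illustrative assumptions rather than features fixed by the text.

    import random

    def genetic_algorithm(init_population, fitness, crossover, mutate,
                          generations=200, elite=2):
        # Generic GA loop: evaluate, select, recombine, iterate.
        # Only fitness values are required -- no gradient information.
        pop = list(init_population)
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)     # evaluate and rank
            next_pop = pop[:elite]                  # keep the best (elitism)
            while len(next_pop) < len(pop):
                a, b = random.choices(pop[:max(2, len(pop) // 2)], k=2)
                next_pop.append(mutate(crossover(a, b)))
            pop = next_pop
        return max(pop, key=fitness)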

Neural network applications address problems in pattern classification, prediction, financial analysis, control, and optimization. In most current applications, neural networks are best used as aids to human decision makers rather than as substitutes for them. Genetic algorithms have helped market researchers perform market segmentation analysis, and genetic algorithms and neural networks can be integrated into a single application to take advantage of the best features of both technologies.

Much work has been done on the evolution of neural networks with GAs. There has been much research applying evolutionary techniques to layered neural networks, but their applications to fully connected neural networks remain few so far. The first attempts to combine evolutionary algorithms with Hopfield neural networks dealt with training of connection weights, design of the network architecture, or both. Evolution has been introduced into neural networks at three levels: architectures, connection weights, and learning rules. The evolution of connection weights proceeds at the lowest level, on the fastest time scale, in an environment determined by the architecture, a learning rule, and the learning tasks; it introduces an adaptive and global approach to training, especially in the reinforcement-learning and recurrent-network learning paradigms. Training of neural networks using evolutionary algorithms started in the early 1990s, and reviews are available in the literature. Cardenas et al. presented architecture optimization of neural networks using parallel genetic algorithms for pattern recognition based on human faces, comparing the results of the training stage for sequential and parallel implementations. Genetic evolution has also been used to process data structures for image classification.

The work on which we focus, because of its scientific importance and social relevance, is the use of a GA for efficient recall of memorized patterns, as an auto-associative memory, from the Hopfield neural network in response to a presented input pattern vector of handwritten Hindi 'SWARS' (vowel) characters. Recall in this associative memory network is performed with the aim of reducing the effect of false minima by using an evolutionary search method, the genetic algorithm. In this approach the GA starts from the suboptimal weight matrix as its initial population of solutions. This suboptimal weight matrix reflects the encoded pattern information of the training set obtained with the unsupervised Hebbian learning rule, i.e. the sum of correlation weight matrices, where each correlation term corresponds to an individual pattern. Hence, the GA starts from the sum of correlation matrices for the training set, which we call the parent weight matrix, and it determines the optimal weight matrix for the presented noisy prototype input patterns of the handwritten 'SWARS' of the Hindi language. The performance of the pattern storage network is evaluated as the rate of success in recalling the correct memorized pattern corresponding to the presented prototype input pattern of a handwritten 'SWAR', using a GA that starts from a suboptimal solution, i.e. a suboptimal GA. The simulation results indicate the better performance of the suboptimal genetic algorithm (SGA), compared with the Hebbian rule, in the success rate for recall of the correct memorized 'SWARS' characters.
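The abstract does not give implementation details, but the seeding idea can be sketched as follows; the energy-based fitness and the Gaussian perturbation of the parent matrix are assumptions made for illustration only.

    import numpy as np

    def recall_fitness(w_candidate, pattern):
        # Assumed fitness: negative Hopfield energy of the pattern under a
        # candidate weight matrix (higher means the pattern is more stable).
        return 0.5 * pattern @ w_candidate @ pattern

    def initial_population(parent_w, size=20, sigma=0.05, rng=None):
        # Seed the GA near the suboptimal Hebbian "parent" matrix rather
        # than from random matrices, as the abstract describes.
        rng = rng or np.random.default_rng()
        pop = [parent_w.copy()]                    # keep the parent itself
        for _ in range(size - 1):
            noise = rng.normal(0.0, sigma, parent_w.shape)
            noise = (noise + noise.T) / 2          # keep candidates symmetric
            pop.append(parent_w + noise)
        return pop

Such a population can then be fed to a standard GA loop like the one sketched earlier, with the fitness evaluated against the noisy prototype patterns.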
Biography: 
Prof. Manu Pratap Singh received his Ph.D. from Kumaun University, Nainital, Uttarakhand, India, in 2001. He completed his Master of Science in Computer Science at Allahabad University, Allahabad, in 1995. He has been working as Professor in the Department of Computer Science, Institute of Engineering and Technology, Dr. B.R. Ambedkar University, Agra, UP, India, since 2014. He has been engaged in teaching and research for the last 20 years and has more than 90 research papers in journals of international and national repute; his work has been recognized widely around the world in the form of citations of his research papers. He received the Young Scientist Award in computer science from the International Academy of Physical Sciences, Allahabad, in 2005. He has guided 18 students to their doctorates in computer science. He is also a referee for various international and national journals, such as the International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems published by World Scientific, the International Journal of Engineering (Iran), IEEE Transactions on Fuzzy Systems, and the European Journal of Operational Research published by Elsevier. He has developed a feed-forward neural network simulator for handwritten character recognition of the English alphabet, and he has also developed a hybrid evolutionary algorithm for handwritten character recognition of English as well as for Hindi-language classification. In this hybrid approach, the genetic algorithm is incorporated with the back-propagation learning rule to train feed-forward neural networks: the genetic algorithm starts from a suboptimal solution and converges to the optimal solutions, which leads to a multi-objective optimization. Another hybrid evolutionary algorithm has been developed for the feedback neural network of Hopfield type for efficient recall of memorized patterns; here too the randomness of the genetic algorithm is reduced by starting it from a suboptimal solution, in the form of the parent weight matrix, and moving toward globally optimal solutions, i.e. correct weight matrices for the network to use for efficient pattern recall. His research interests are focused on neural networks, pattern recognition and machine intelligence, soft computing, and quantum computing. He has been a member of the technical committee of IASTED, Canada, since 2004, and a regular member of Machine Intelligence Research Labs (MIR Labs), Scientific Network for Innovation and Research Excellence (SNIRE), Auburn, Washington, USA, http://www.mirlabs.org, since 2012. His Google Scholar indices are an h-index of 14 and an i10-index of 16, with 434 citations. He has been invited as a keynote speaker and guest speaker at various institutions in India and abroad.