
Fundamentals of Artificial Neural Networks

By: Hassoun, Mohamad.
Material type: Book
Publisher: USA: A Bradford Book, 2003
Description: 511 p.
ISBN: 0262514672 (paperback); 9780262514675 (paperback)
Subject(s): Neural Networks; Artificial Intelligence; Computer Science
DDC classification: 006.3 Hassoun 1st 2003 29289
Item type: Books
Current location: UVAS Library, Computer Science
Collection: Social Science
Call number: 006.3 Hassoun 1st 2003 29289 Computer.Science
Status: Available
Barcode: 29289
Total holds: 0

Summary:

As book review editor of the IEEE Transactions on Neural Networks, Mohamad Hassoun has had the opportunity to assess the multitude of books on artificial neural networks that have appeared in recent years. Now, in Fundamentals of Artificial Neural Networks, he provides the first systematic account of artificial neural network paradigms by clearly identifying the fundamental concepts and major methodologies underlying most of the current theory and practice employed by neural network researchers.

Such a systematic and unified treatment, although sadly lacking in most recent texts on neural networks, makes the subject more accessible to students and practitioners. Here, important results are integrated in order to more fully explain a wide range of existing empirical observations and commonly used heuristics. There are numerous illustrative examples, over 200 end-of-chapter analytical and computer-based problems that will aid in the development of neural network analysis and design skills, and a bibliography of nearly 700 references.

Proceeding in a clear and logical fashion, the first two chapters present the basic building blocks and concepts of artificial neural networks and analyze the computational capabilities of the basic network architectures involved. Supervised, reinforcement, and unsupervised learning rules in simple nets are brought together in a common framework in chapter three. The convergence and solution properties of these learning rules are then treated mathematically in chapter four, using the "average learning equation" analysis approach. This organization makes it natural to move on to learning in multilayer nets using backprop and its variants, described in chapter five. Chapter six covers most of the major neural network paradigms, while associative memories and energy-minimizing nets are given detailed coverage in the next chapter. The final chapter takes up Boltzmann machines and Boltzmann learning, along with other global search/optimization algorithms such as stochastic gradient search, simulated annealing, and genetic algorithms.
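To give a flavor of the simple supervised learning rules surveyed in the book's early chapters, the sketch below implements the classic perceptron learning rule in Python. It is an illustrative example only, not material from the book; the function name, toy dataset, learning rate, and epoch count are assumptions chosen for demonstration.

    # Illustrative sketch: the classic perceptron learning rule on a toy,
    # linearly separable problem. All names and parameters are assumptions
    # for demonstration; none of this code comes from the book.
    import numpy as np

    def perceptron_train(X, y, epochs=20, lr=1.0):
        """Train a single threshold unit on bipolar targets y in {-1, +1}."""
        w = np.zeros(X.shape[1])          # weight vector
        b = 0.0                           # bias
        for _ in range(epochs):
            for xi, target in zip(X, y):
                pred = 1 if xi @ w + b >= 0 else -1
                if pred != target:        # update only on a misclassification
                    w += lr * target * xi
                    b += lr * target
        return w, b

    # Toy problem: logical OR with bipolar targets.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, 1, 1, 1])
    w, b = perceptron_train(X, y)
    print(w, b)                           # learned weights and bias

Because the OR problem is linearly separable, the rule converges to weights that classify all four patterns correctly; problems that a single unit cannot separate (e.g., XOR) motivate the multilayer networks and backprop training covered in chapter five.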


