Neural Networks and Deep Learning: A Textbook
Charu C. Aggarwal. Springer International Publishing: Imprint: Springer, 2nd ed., 2023
English [en] · RAR · 59.2MB · 2023 · 📘 Nonfiction book · 🚀/lgli/lgrs/nexusstc
Description
Neural networks were developed to simulate the human nervous system for machine learning tasks by treating the computational units in a learning model in a manner similar to human neurons. The grand vision of neural networks is to create artificial intelligence by building machines whose architecture simulates the computations in the human nervous system. Although the biological model of neural networks is an exciting one and evokes comparisons with science fiction, neural networks have a much simpler and more mundane mathematical basis than a complex biological system. The neural network abstraction can be viewed as a modular approach to enabling learning algorithms that are based on continuous optimization on a computational graph of mathematical dependencies between the input and output. These ideas are strikingly similar to classical optimization methods in control theory, which historically preceded the development of neural network algorithms.
Neural networks were developed soon after the advent of computers in the fifties and sixties. Rosenblatt’s perceptron algorithm was seen as a cornerstone of neural networks, which caused an initial period of euphoria — it was soon followed by disappointment as the initial successes were somewhat limited. Eventually, at the turn of the century, greater data availability and increasing computational power led to increased successes of neural networks, and this area was reborn under the new label of “Deep Learning.” Although we are still far from the day that Artificial Intelligence (AI) is close to human performance, there are specific domains like image recognition, self-driving cars, and game playing, where AI has matched or exceeded human performance. It is also hard to predict what AI might be able to do in the future. For example, few computer vision experts would have thought two decades ago that any automated system could ever perform an intuitive task like categorizing an image more accurately than a human. The large amounts of data available in recent years together with increased computational power have enabled experimentation with more sophisticated and deep neural architectures than was previously possible. The resulting success has changed the broader perception of the potential of Deep Learning. This book discusses neural networks from this modern perspective.
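The perceptron update rule mentioned above is simple enough to sketch in a few lines. The following is a minimal pure-Python illustration (the function name and toy data are our own, not from the book): whenever a point is misclassified, the weight vector is nudged toward the correct side of the separating hyperplane.

```python
def perceptron_train(X, y, epochs=100):
    """Rosenblatt's perceptron: labels y must be in {-1, +1};
    X is a list of feature vectors. On each misclassified point,
    apply the update w <- w + y*x, b <- b + y."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            activation = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * activation <= 0:          # misclassified (or on boundary)
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
                errors += 1
        if errors == 0:                        # converged on separable data
            break
    return w, b

# Linearly separable toy data: positive class roughly where x0 + x1 > 1
X = [[0.0, 0.0], [0.0, 2.0], [2.0, 0.0], [2.0, 2.0]]
y = [-1, 1, 1, 1]
w, b = perceptron_train(X, y)
preds = [1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else -1 for xi in X]
```

On linearly separable data the algorithm is guaranteed to converge; on non-separable data it oscillates, which is one of the limitations that fueled the early disappointment described above.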
The chapters of the book are organized as follows:
1. The basics of neural networks: Chapters 1, 2, and 3 discuss the basics of neural network design and the backpropagation algorithm. Many traditional machine learning models can be understood as special cases of neural learning. Understanding the relationship between traditional machine learning and neural networks is the first step to understanding the latter. The simulation of various machine learning models with neural networks is provided in Chapter 3. This will give the analyst a feel for how neural networks push the envelope of traditional machine learning algorithms.
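As a concrete instance of a traditional model expressed as a neural network, logistic regression can be viewed as a single neuron — a linear layer followed by a sigmoid — trained by gradient descent on the cross-entropy loss (i.e., backpropagation through a depth-1 computational graph). A minimal sketch, with one-dimensional toy data of our own choosing:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, steps=2000):
    """Logistic regression as a one-neuron network with scalar input.
    The gradient of the cross-entropy loss with respect to the
    pre-activation is simply (prediction - target)."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, t in zip(xs, ys):
            p = sigmoid(w * x + b)   # forward pass
            g = p - t                # backpropagated gradient at the neuron
            gw += g * x
            gb += g
        w -= lr * gw / len(xs)       # gradient-descent step
        b -= lr * gb / len(xs)
    return w, b

# Toy data: class flips between x = 1 and x = 2
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0, 0, 1, 1]
w, b = train_logistic(xs, ys)
preds = [1 if sigmoid(w * x + b) >= 0.5 else 0 for x in xs]
```

Swapping the sigmoid and cross-entropy for other activations and losses recovers other classical models in the same way — this is the kind of unification Chapter 3 develops.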
2. Fundamentals of neural networks: Although Chapters 1, 2, and 3 provide an overview of the training methods for neural networks, a more detailed understanding of the training challenges is provided in Chapters 4 and 5. Chapters 6 and 7 present radial-basis function (RBF) networks and restricted Boltzmann machines.
3. Advanced topics in neural networks: Much of the recent success of deep learning is a result of specialized architectures for various domains, such as recurrent neural networks and convolutional neural networks. Chapters 8 and 9 discuss recurrent and convolutional neural networks. Graph neural networks are discussed in Chapter 10. Several advanced topics like deep reinforcement learning, attention mechanisms, neural Turing machines, and generative adversarial networks are discussed in Chapters 11 and 12.
Alternate filename
lgli/neural-networks-deep-learning-textbook-2nd.rar
Alternate filename
lgrsnf/neural-networks-deep-learning-textbook-2nd.rar
Alternate publisher
Springer International Publishing AG
Alternate publisher
Springer Nature Switzerland AG
Alternate edition
Springer Nature (Textbooks & Major Reference Works), [N.p.], 2023
Alternate edition
Second edition, Cham, Switzerland
Alternate edition
Switzerland, Switzerland
Alternate edition
2nd ed. 2023, Cham
Comments in metadata
{"edition":"2","isbns":["3031296419","3031296427","9783031296413","9783031296420"],"last_page":541,"publisher":"Springer","source":"crossref"}
Alternate description
This book covers both classical and modern models in deep learning. The primary focus is on the theory and algorithms of deep learning, which are essential for understanding the design principles of neural architectures in different applications. Why do neural networks work? When do they work better than off-the-shelf machine-learning models? When is depth useful? Why is training neural networks so hard? What are the pitfalls? The book is also rich in discussing different applications in order to give the practitioner a flavor of how neural architectures are designed for different types of problems. Deep learning methods for various data domains, such as text, images, and graphs, are presented in detail. The chapters of this book span three categories:
The basics of neural networks: The backpropagation algorithm is discussed in Chapter 2. Many traditional machine learning models can be understood as special cases of neural networks. Chapter 3 explores the connections between traditional machine learning and neural networks. Support vector machines, linear/logistic regression, singular value decomposition, matrix factorization, and recommender systems are shown to be special cases of neural networks.
Fundamentals of neural networks: A detailed discussion of training and regularization is provided in Chapters 4 and 5. Chapters 6 and 7 present radial-basis function (RBF) networks and restricted Boltzmann machines.
Advanced topics in neural networks: Chapters 8, 9, and 10 discuss recurrent neural networks, convolutional neural networks, and graph neural networks. Several advanced topics like deep reinforcement learning, attention mechanisms, transformer networks, Kohonen self-organizing maps, and generative adversarial networks are introduced in Chapters 11 and 12.
The textbook is written for graduate students and upper undergraduate-level students. Researchers and practitioners working in this field will also find it useful. Where possible, an application-centric view is highlighted in order to provide an understanding of the practical uses of each class of techniques. The second edition is substantially reorganized and expanded, with separate chapters on backpropagation and graph neural networks. Many chapters have been significantly revised over the first edition. Greater focus is placed on modern deep learning ideas such as attention mechanisms, transformers, and pre-trained language models.
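Since attention is the centerpiece of the transformer architectures emphasized in this edition, a minimal sketch of scaled dot-product attention may help fix ideas. This is a pure-Python illustration with toy matrices of our own choosing: each output row is a softmax-weighted average of the value rows, weighted by query-key similarity.

```python
import math

def attention(Q, K, V):
    """Scaled dot-product attention. Q, K, V are lists of equal-length
    rows; the scores are scaled by sqrt(d) to keep the softmax from
    saturating as the dimension d grows."""
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]      # numerically stable softmax
        total = sum(exps)
        weights = [e / total for e in exps]            # weights sum to 1
        row = [sum(wt * v[j] for wt, v in zip(weights, V))
               for j in range(len(V[0]))]
        out.append(row)
    return out

# Toy example: 2 queries, 3 key/value pairs, dimension 2
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = attention(Q, K, V)
```

Because the softmax weights are nonnegative and sum to one, every output row is a convex combination of the rows of V — attention interpolates among the values rather than extrapolating.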
Alternate description
No description available.
Publication date: 30.06.2023
Date open sourced
2023-07-02
More information…
We strongly recommend that you support the author by buying the book, donating on their personal website, or borrowing it from your local library.
