Overview

Deep learning has become the dominant approach to a wide range of tasks in Natural Language Processing (NLP). Although text inputs are typically represented as sequences of tokens, many NLP problems are best expressed with a graph structure. As a result, there has been a surge of interest in developing new deep learning techniques on graphs for a large number of NLP tasks. In this monograph, the authors present a comprehensive overview of Graph Neural Networks (GNNs) for Natural Language Processing. They propose a new taxonomy of GNNs for NLP, which systematically organizes existing research along three axes: graph construction, graph representation learning, and graph-based encoder-decoder models. They further introduce a large number of NLP applications that exploit the power of GNNs and summarize the corresponding benchmark datasets, evaluation metrics, and open-source code. Finally, they discuss outstanding challenges in making full use of GNNs for NLP, as well as future research directions. This is the first comprehensive overview of Graph Neural Networks for Natural Language Processing. It provides students and researchers with a concise and accessible resource to quickly get up to speed with an important area of machine learning research.

Full Product Details

Author: Lingfei Wu, Yu Chen, Kai Shen, Xiaojie Guo
Publisher: now publishers Inc
Imprint: now publishers Inc
Weight: 0.322 kg
ISBN-13: 9781638281429
ISBN-10: 1638281424
Pages: 224
Publication Date: 25 January 2023
Audience: Professional and scholarly; Professional & Vocational
Format: Paperback
Publisher's Status: Active
Availability: In Print. This item will be ordered in from one of our suppliers and dispatched promptly upon receipt. For in-store availability, please contact us.

Table of Contents

1. Introduction
2. Graph Based Algorithms for NLP
3. Graph Neural Networks
4. Graph Construction Methods for NLP
5. Graph Representation Learning for NLP
6. GNN Based Encoder-Decoder Models
7. Applications
8. General Challenges and Future Directions
9. Conclusions
References

Countries Available: All regions