Machine reading comprehension (MRC) is a cutting-edge technology in natural language processing (NLP). MRC has recently advanced significantly, surpassing human parity on several public datasets, and it has been widely deployed by industry in search engines and question answering systems. Machine Reading Comprehension: Algorithms and Practice performs a deep dive into MRC, offering a resource on the complex tasks this technology involves. The title presents the fundamentals of NLP and deep learning before introducing the task, models, and applications of MRC. The volume gives theoretical treatment to solutions, provides detailed analysis of code, and considers applications in real-world industry. The book covers basic concepts, tasks, datasets, NLP tools, deep learning models and architectures, and insights from hands-on experience, and it presents the latest advances from the past two years of research. Structured into three parts and eight chapters, the book presents the basics of MRC, MRC models, and hands-on issues in application. It offers a comprehensive resource for researchers in industry and academia who are looking to understand and deploy machine reading comprehension within natural language processing.
Researchers working on NLP, and particularly on MRC, in both industry and academia; postgraduate and advanced students in machine learning, deep learning, NLP, and related areas of computer science.
Part I: Foundation
Chapter 1 Introduction to Machine Reading Comprehension
1.1 The Machine Reading Comprehension Task
1.1.1 History of Machine Reading Comprehension
1.1.2 Application of Machine Reading Comprehension
1.2 Natural Language Processing
1.2.1 The Status Quo of NLP
1.2.2 Existing Issues
1.3 Deep Learning
1.3.1 Features of Deep Learning
1.3.2 Achievements of Deep Learning
1.4 Evaluation of Machine Reading Comprehension
1.4.1 Answer Forms
1.4.2 ROUGE: Metric for Evaluating Freestyle Answers
1.5 MRC Datasets
1.5.1 Single-paragraph Datasets
1.5.2 Multi-paragraph Datasets
1.5.3 Corpus-based Datasets
1.6 How to Make an MRC Dataset
1.6.1 Generation of Articles and Questions
1.6.2 Generation of Correct Answers
1.6.3 How to Build a High-quality MRC Dataset
1.7 Summary
Chapter 2 The Basics of Natural Language Processing
2.1 Tokenization
2.1.1 Byte Pair Encoding
2.2 The Cornerstone of NLP: Word Vectors
2.2.1 Word Vectorization
2.2.2 Word2vec
2.3 Linguistic Tagging
2.3.1 Named Entity Recognition
2.3.2 Part-of-Speech Tagging
2.4 Language Model
2.4.1 N-gram Model
2.4.2 Evaluation of Language Models
2.5 Summary
Chapter 3 Deep Learning in Natural Language Processing
3.1 From Word Vector to Text Vector
3.1.1 Using the Final State of RNN
3.1.2 CNN and Pooling
3.1.3 Parametrized Weighted Sum
3.2 Answer Multiple-choice Questions: Natural Language Understanding
3.2.1 Network Structure
3.2.2 Implementing Text Classification
3.3 Write an Article: Natural Language Generation
3.3.1 Network Architecture
3.3.2 Implementing Text Generation
3.3.3 Beam Search
3.4 Keep Focused: Attention Mechanism
3.4.1 Attention Mechanism
3.4.2 Implementing Attention Function
3.4.3 Sequence-to-sequence Model
3.5 Summary
Part II: Architecture
Chapter 4 Architecture of MRC Models
4.1 General Architecture of MRC Models
4.2 Encoding Layer
4.2.1 Establishing the Dictionary
4.2.2 Character Embeddings
4.2.3 Contextual Embeddings
4.3 Interaction Layer
4.3.1 Cross-Attention
4.3.2 Self-Attention
4.3.3 Contextual Embeddings
4.4 Output Layer
4.4.1 Construct the Question Vector
4.4.2 Generate Multiple-choice Answers
4.4.3 Generate Extractive Answers
4.4.4 Generate Freestyle Answers
4.5 Summary
Chapter 5 Common MRC Models
5.1 Bi-Directional Attention Flow Model
5.1.1 Encoding Layer
5.1.2 Interaction Layer
5.1.3 Output Layer
5.2 R-Net
5.2.1 Gated Attention-based Recurrent Network
5.2.2 Encoding Layer
5.2.3 Interaction Layer
5.2.4 Output Layer
5.3 FusionNet
5.3.1 History of Word
5.3.2 Fully-aware Attention
5.3.3 Encoding Layer
5.3.4 Interaction Layer
5.3.5 Output Layer
5.4 Essential-term-aware Retriever-Reader
5.4.1 Retriever
5.4.2 Reader
5.5 Summary
Chapter 6 Pre-trained Language Model
6.1 Pre-trained Models and Transfer Learning
6.2 Translation-based Pre-trained Language Model: CoVe
6.2.1 Machine Translation Model
6.2.2 Contextual Embeddings
6.3 Pre-trained Language Model ELMo
6.3.1 Bi-directional Language Model
6.3.2 How to Use ELMo
6.4 The Generative Pre-Training Language Model: GPT
6.4.1 Transformer
6.4.2 GPT
6.4.3 Apply GPT
6.5 The Phenomenal Pre-Trained Language Model: BERT
6.5.1 Masked Language Model
6.5.2 Next Sentence Prediction
6.5.3 Configurations of BERT Pre-training
6.5.4 Fine-tuning BERT
6.5.5 Improving BERT
6.5.6 Implementing BERT Fine-tuning in MRC
6.6 Summary
Part III: Application
Chapter 7 Code Analysis of SDNet Model
7.1 Multi-turn Conversational MRC Model: SDNet
7.1.1 Encoding Layer
7.1.2 Interaction Layer and Output Layer
7.2 Introduction to Code
7.2.1 Code Structure
7.2.2 How to Run the Code
7.2.3 Configuration File
7.3 Pre-processing
7.3.1 Initialization
7.3.2 Pre-processing
7.4 Training
7.4.1 Base Class
7.4.2 Subclass
7.5 Batch Generator
7.5.1 Padding
7.5.2 Preparing Data for BERT
7.6 SDNet Model
7.6.1 Network Class
7.6.2 Network Layers
7.6.3 Generate BERT Embeddings
7.7 Summary
Chapter 8 Applications and Future of Machine Reading Comprehension
8.1 Intelligent Customer Service
8.1.1 Building Product Knowledge Base
8.1.2 Intent Understanding
8.1.3 Answer Generation
8.1.4 Other Modules
8.2 Search Engine
8.2.1 Search Engine Technology
8.2.2 MRC in Search Engine
8.2.3 Challenges and Future of MRC in Search Engine
8.3 Health Care
8.4 Laws
8.4.1 Automatic Judgement
8.4.2 Crime Classification
8.5 Finance
8.5.1 Predicting Stock Prices
8.5.2 News Summarization
8.6 Education
8.7 The Future of Machine Reading Comprehension
8.7.1 Challenges
8.7.2 Commercialization
8.8 Summary
Appendices
Appendix A Machine Learning Basics
A.1 Types of Machine Learning
A.2 Model and Parameters
A.3 Generalization and Overfitting
Appendix B Deep Learning Basics
B.1 Neural Network
B.1.1 Definition
B.1.2 Loss Function
B.1.3 Optimization
B.2 Common Types of Neural Network in Deep Learning
B.2.1 Convolutional Neural Network
B.2.2 Recurrent Neural Network
B.2.3 Dropout
B.3 The Deep Learning Framework PyTorch
B.3.1 Installing PyTorch
B.3.2 Tensor
B.3.3 Gradient Computation
B.3.4 Network Layer
B.3.5 Custom Network