Filter
Data provider
Material type
Language
  • English  (1)
  • Hungarian
Publication period
  • 2010-2014  (1)
  • 2005-2009
  • 2000-2004
Year
Publisher
  • Berlin, Heidelberg : Springer-Verlag Berlin Heidelberg  (1)
Subject areas (RVK)
  • Computer science  (1)
  • Military science
  • 1
    Online resource
    Berlin, Heidelberg : Springer-Verlag Berlin Heidelberg
    ISBN: 9783642227431
    Language: English
    Pages: Online resource (XXII, 105 p., 33 illus., 13 illus. in color, digital)
    Series: Theory and Applications of Natural Language Processing
    Series: SpringerLink
    Series: Bücher
    Parallel title: Print edition under the title: Petrov, Slav: Coarse-to-fine natural language processing
    Subject heading(s): Computer Science ; Computational linguistics ; Statistical methods ; Natural language ; Syntactic analysis ; Grammar ; Latent variable ; Machine learning ; Automatic speech recognition ; Machine translation
    Abstract: 1. Introduction -- 2. Latent Variable Grammars for Natural Language Parsing -- 3. Discriminative Latent Variable Grammars -- 4. Structured Acoustic Models for Speech Recognition -- 5. Coarse-to-Fine Machine Translation Decoding -- 6. Conclusions and Future Work -- Bibliography
    Abstract: The impact of computer systems that can understand natural language will be tremendous. To develop this capability we need to be able to automatically and efficiently analyze large amounts of text. Manually devised rules are not sufficient to provide coverage to handle the complex structure of natural language, necessitating systems that can automatically learn from examples. To handle the flexibility of natural language, it has become standard practice to use statistical models, which assign probabilities, for example, to the different meanings of a word or the plausibility of grammatical constructions. This book develops a general coarse-to-fine framework for learning and inference in large statistical models for natural language processing. Coarse-to-fine approaches exploit a sequence of models which introduce complexity gradually. At the top of the sequence is a trivial model in which learning and inference are both cheap. Each subsequent model refines the previous one, until a final, full-complexity model is reached. Applications of this framework to syntactic parsing, speech recognition and machine translation are presented, demonstrating the effectiveness of the approach in terms of accuracy and speed. This book is intended for students and researchers interested in statistical approaches to Natural Language Processing. "Slav's work Coarse-to-Fine Natural Language Processing represents a major advance in the area of syntactic parsing, and a great advertisement for the superiority of the machine-learning approach." Eugene Charniak (Brown University)
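    The abstract sketches the coarse-to-fine idea: a sequence of increasingly refined models, where each cheap model prunes the candidate space before the next, more expensive model sees it. A minimal illustrative sketch of that pruning cascade (the function and the toy scoring models are hypothetical, not taken from the book):

    ```python
    # Coarse-to-fine pruning sketch: each model in the sequence ranks the
    # surviving candidates and discards the low-scoring ones, so later
    # (more expensive) models only score a shrinking shortlist.
    def coarse_to_fine(candidates, models, keep_fraction=0.5):
        """Run each scoring model in order, pruning between stages.

        models: list of scoring functions, ordered coarse -> fine.
        Returns the best candidate under the finest model.
        """
        survivors = list(candidates)
        for score in models:
            ranked = sorted(survivors, key=score, reverse=True)
            keep = max(1, int(len(ranked) * keep_fraction))
            survivors = ranked[:keep]
        return survivors[0]

    # Toy example: scoring functions standing in for models of increasing cost.
    coarse = lambda c: len(c)       # trivial proxy score, cheap to compute
    fine = lambda c: c.count("a")   # pretend this one is expensive
    best = coarse_to_fine(["banana", "apple", "kiwi", "avocado"], [coarse, fine])
    ```

    Here the coarse pass keeps only the two longest strings, and the fine model is never evaluated on the rest; in the book this same structure is applied to parse charts, acoustic models, and translation decoding.
    
    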
    Description / Table of contents: Coarse-to-Fine Natural Language Processing; Foreword; Preface; Acknowledgements; Contents; List of Figures; List of Tables; Chapter 1 Introduction; 1.1 Coarse-to-Fine Models; 1.2 Coarse-to-Fine Inference; Chapter 2 Latent Variable Grammars for Natural Language Parsing; 2.1 Introduction; 2.1.1 Experimental Setup; 2.2 Manual Grammar Refinement; 2.2.1 Vertical and Horizontal Markovization; 2.2.2 Additional Linguistic Refinements; 2.3 Generative Latent Variable Grammars; 2.3.1 Hierarchical Estimation; 2.3.2 Adaptive Refinement; 2.3.3 Smoothing; 2.3.4 An Infinite Alternative; 2.4 Inference
    Description / Table of contents: 2.4.1 Hierarchical Coarse-to-Fine Pruning; 2.4.1.1 Projections; 2.4.1.2 Estimating Projected Grammars; 2.4.1.3 Calculating Projected Expectations; 2.4.1.4 Hierarchical Projections; 2.4.1.5 Pruning Experiments; 2.4.2 Objective Functions for Parsing; 2.4.2.1 Minimum Bayes Risk Parsing; 2.4.2.2 Alternative Objective Functions; 2.5 Additional Experiments; 2.5.1 Experimental Setup; 2.5.2 Baseline Grammar Variation; 2.5.3 Final Results WSJ; 2.5.4 Multilingual Parsing; 2.5.5 Corpus Variation; 2.5.6 Training Size Variation; 2.6 Analysis; 2.6.1 Lexical Subcategories; 2.6.2 Phrasal Subcategories
    Description / Table of contents: 2.6.3 Multilingual Analysis; 2.7 Summary and Future Work; Chapter 3 Discriminative Latent Variable Grammars; 3.1 Introduction; 3.2 Log-Linear Latent Variable Grammars; 3.3 Single-Scale Discriminative Grammars; 3.3.1 Efficient Discriminative Estimation; 3.3.1.1 Hierarchical Estimation; 3.3.1.2 Feature-Count Approximation; 3.3.2 Experiments; 3.3.2.1 Efficiency; 3.3.2.2 Regularization; 3.3.2.3 Final Test Set Results; 3.4 Multi-scale Discriminative Grammars; 3.4.1 Hierarchical Refinement; 3.4.2 Learning Sparse Multi-scale Grammars; 3.4.2.1 Hierarchical Training
    Description / Table of contents: 3.4.2.2 Efficient Multi-scale Inference; 3.4.2.3 Feature Count Approximations; 3.4.3 Additional Features; 3.4.3.1 Unknown Word Features; 3.4.3.2 Span Features; 3.4.4 Experiments; 3.4.4.1 Sparsity; 3.4.4.2 Accuracy; 3.4.4.3 Efficiency; 3.4.4.4 Final Results; 3.4.5 Analysis; 3.5 Summary and Future Work; Chapter 4 Structured Acoustic Models for Speech Recognition; 4.1 Introduction; 4.2 Learning; 4.2.1 The Hand-Aligned Case; 4.2.2 Splitting; 4.2.3 Merging; 4.2.4 Smoothing; 4.2.5 The Automatically-Aligned Case; 4.3 Inference; 4.4 Experiments; 4.4.1 Phone Recognition; 4.4.2 Phone Classification
    Description / Table of contents: 4.5 Analysis; 4.6 Summary and Future Work; Chapter 5 Coarse-to-Fine Machine Translation Decoding; 5.1 Introduction; 5.2 Coarse-to-Fine Decoding; 5.2.1 Related Work; 5.2.2 Language Model Projections; 5.2.3 Multipass Decoding; 5.3 Inversion Transduction Grammars; 5.4 Learning Coarse Languages; 5.4.1 Random Projections; 5.4.2 Frequency Clustering; 5.4.3 HMM Clustering; 5.4.4 JCluster; 5.4.5 Clustering Results; 5.5 Experiments; 5.5.1 Clustering; 5.5.2 Spacing; 5.5.3 Encoding Versus Order; 5.5.4 Final Results; 5.5.5 Search Error Analysis; 5.6 Summary and Future Work
    Description / Table of contents: Chapter 6 Conclusions and Future Work
    Note: Description based upon print version of record
    URL: Full text  (license required)