Encoding Linguistic Structures with Graph Convolutional Networks
Graph Convolutional Networks (GCNs) are an effective tool for modeling graph-structured data. We investigate their applicability in the context of natural language processing, specifically semantic role labeling and machine translation. We introduce a version of GCNs suited to modeling syntactic and/or semantic dependency graphs and use it to construct linguistically-informed sentence encoders. We demonstrate that these encoders yield state-of-the-art performance on semantic role labeling of English and Chinese and a substantial boost in machine translation quality.
In this tutorial I present several neural architectures for the task of semantic role labeling. I begin with hybrid symbolic/neural models, move on to syntax-agnostic neural models, and conclude with syntax-aware neural models.
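To make the syntax-aware encoder concrete, here is a minimal sketch of one GCN layer operating over a dependency graph. It assumes a direction-specific parameterization (separate weight matrices for incoming arcs, outgoing arcs, and self-loops) and omits refinements such as per-label weights and edge gates; all names and shapes are illustrative, not the authors' exact formulation.

```python
import numpy as np

def gcn_layer(H, edges, W_in, W_out, W_self, b):
    """One simplified syntactic GCN layer.

    H: (n, d) array of node features, one row per word.
    edges: list of (head, dependent) dependency arcs.
    W_in, W_out, W_self: (d, d) weight matrices for incoming arcs,
    outgoing arcs, and self-loops; b: (d,) bias.
    """
    out = H @ W_self + b             # self-loop term for every word
    for head, dep in edges:
        out[dep] += H[head] @ W_in   # message along the arc head -> dep
        out[head] += H[dep] @ W_out  # message along the reverse direction
    return np.maximum(out, 0.0)      # ReLU nonlinearity

# Toy example: a 3-word sentence whose root (word 1) governs words 0 and 2.
rng = np.random.default_rng(0)
d = 4
H = rng.normal(size=(3, d))
W_in, W_out, W_self = (rng.normal(size=(d, d)) for _ in range(3))
b = np.zeros(d)
H2 = gcn_layer(H, [(1, 0), (1, 2)], W_in, W_out, W_self, b)
print(H2.shape)
```

Stacking several such layers lets information flow between words that are multiple arcs apart in the dependency tree, which is what makes the resulting sentence encoder syntax-aware.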