Summary: Semantic Role Labeling with BERT-Based Transformers

Get a quick recap of what we covered in this chapter.

In this chapter, we explored semantic role labeling (SRL). SRL tasks are difficult for both humans and machines, yet transformer models have shown that they can approach human baselines on many NLP tasks.

We found that a simple BERT-based transformer can perform predicate sense disambiguation: it identified the meaning of a verb (predicate) without any lexical or syntactic labeling. Shi and Lin (2019) trained their BERT-based transformer using a standard sentence + verb input format.
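The snippet below is a minimal sketch of that sentence + verb input format, assuming the Hugging Face Transformers library. The example sentence, the predicate, and the use of bert-base-uncased as the checkpoint are illustrative; a real run would load a model fine-tuned for predicate sense disambiguation rather than the base weights used here.

```python
# Sketch of Shi and Lin's (2019) sentence + verb input format.
# Assumes Hugging Face Transformers; bert-base-uncased stands in
# for a hypothetical checkpoint fine-tuned on predicate senses.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

sentence = "The bank will not lend money to the startup."
predicate = "lend"

# Encoding a sentence pair produces [CLS] sentence [SEP] predicate [SEP];
# token_type_ids separate the two segments, so no lexical or syntactic
# features are required.
inputs = tokenizer(sentence, predicate, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_sense = logits.argmax(dim=-1).item()
print(predicted_sense)  # index of the predicted predicate sense class
```

Feeding the predicate as the second segment lets the model attend to the verb in the context of the whole sentence, which is what makes the format work without extra annotation.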
