Customizing the Tokenizer and Sentence Segmentation

Let's learn how we can add special case rules to an existing Tokenizer instance.

When we work with a specific domain, such as medicine, insurance, or finance, we often come across words, abbreviations, and entities that need special attention. Most domains we process have characteristic words and phrases that call for custom tokenization rules. Here's how to add a special case rule to an existing Tokenizer instance:
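As a minimal sketch, assuming the examples use spaCy and that the `en_core_web_sm` pipeline is installed, a special case rule can be registered through the tokenizer's `add_special_case` method. The token `"gimme"` below is an illustrative stand-in for whatever domain-specific term needs custom handling:

```python
import spacy
from spacy.symbols import ORTH

nlp = spacy.load("en_core_web_sm")

# Default tokenization keeps "gimme" as a single token
print([token.text for token in nlp("gimme that")])   # ['gimme', 'that']

# Define a special case: split "gimme" into two tokens
special_case = [{ORTH: "gim"}, {ORTH: "me"}]
nlp.tokenizer.add_special_case("gimme", special_case)

# The rule now applies whenever "gimme" appears in a text
print([token.text for token in nlp("gimme that")])   # ['gim', 'me', 'that']
```

Note that the subtoken texts must concatenate back to the original string ("gim" + "me" = "gimme"), so a special case changes how a token is split without altering the underlying text.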
