
Tokenization

Tokenization is the act of breaking up a sequence of strings into pieces such as words, keywords, phrases, and other elements called tokens. The tokens then become the input for another process, such as parsing or text mining.
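A minimal sketch of the idea in Python, using a simple regular-expression tokenizer (real systems often use more sophisticated rules or learned subword vocabularies):

```python
import re

def tokenize(text):
    # Extract runs of word characters (letters, digits, underscore),
    # lowercased; punctuation and whitespace are dropped.
    return re.findall(r"\w+", text.lower())

tokens = tokenize("Tokenization breaks text into pieces, called tokens.")
print(tokens)
# → ['tokenization', 'breaks', 'text', 'into', 'pieces', 'called', 'tokens']
```

The resulting token list is what a downstream step (a parser, a bag-of-words counter, or a language model's vocabulary lookup) would consume.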

Made by AI Summer Internship ☀️