| Function | Description |
|---|---|
| as_tokens | Create a list of tokens |
| bind_lr | Bind the importance of bigrams |
| bind_tf_idf2 | Bind term frequency and inverse document frequency |
| build_sys_dic | Build system dictionary |
| build_user_dic | Build user dictionary |
| collapse_tokens | Collapse sequences of tokens by condition |
| dictionary_info | Get dictionary information |
| gbs_tokenize | Tokenize sentences using 'MeCab' |
| get_dict_features | Get dictionary features |
| get_transition_cost | Get the transition cost between part-of-speech (POS) attributes |
| ginga | Whole text of 'Ginga Tetsudo no Yoru' by Miyazawa Kenji, from Aozora Bunko |
| is_blank | Check if scalars are blank |
| lex_density | Calculate lexical density |
| mute_tokens | Mute tokens by condition |
| ngram_tokenizer | N-gram tokenizer |
| pack | Pack a data.frame of tokens |
| prettify | Prettify tokenized output |
| tokenize | Tokenize sentences using 'MeCab' |
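
For orientation, here is a minimal sketch of the core workflow (tokenize, prettify, pack), assuming a local 'MeCab' installation with its default system dictionary; the `doc_id`/`text` column names rely on the functions' default arguments and the subset of `ginga` used here is only illustrative.

``` r
library(gibasa)

# 'ginga' ships with the package: the full text of 'Ginga Tetsudo no Yoru'.
dat <- data.frame(
  doc_id = seq_along(ginga[1:5]),
  text = ginga[1:5]
)

toks <- tokenize(dat)    # parse with 'MeCab'; expects 'doc_id' and 'text' columns by default
toks <- prettify(toks)   # split the raw 'feature' string into part-of-speech columns
pack(toks)               # collapse tokens back into one space-delimited string per document
```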