wex  v21.04.0
wex::tokenizer Class Reference

Offers a class that allows you to tokenize a string into substrings or into some container. More...

#include <wex/tokenizer.h>

Public Member Functions

 tokenizer (const std::string &text, const std::string &delimiters=WHITESPACE_DELIMITERS, bool skip_empty_tokens=true)
 Constructor. More...
 
size_t count_tokens () const
 Returns total number of tokens in the string.
 
const std::string get_next_token ()
 Returns the next token; returns the empty string if !has_more_tokens().
 
const std::string get_string () const
 Returns the not yet tokenized part of the string.
 
const std::string get_token () const
 Returns the current token.
 
bool has_more_tokens () const
 Returns true if the string still contains delimiters, and so can be tokenized. More...
 
auto last_delimiter () const
 Returns the delimiter that terminated the most recently retrieved token.
 
template<typename T >
T tokenize ()
 Tokenizes the complete string into a templatized container (e.g. std::vector<std::string>). More...
 
auto tokenize ()
 Tokenizes the complete string into a vector of integers (size_t). More...
 

Detailed Description

Offers a class that allows you to tokenize a string into substrings or into some container.

Constructor & Destructor Documentation

◆ tokenizer()

wex::tokenizer::tokenizer ( const std::string &  text,
const std::string &  delimiters = WHITESPACE_DELIMITERS,
bool  skip_empty_tokens = true 
)

Constructor.

Parameters
    text                string to tokenize
    delimiters          delimiter characters; if no delimiter is given, whitespace is used
    skip_empty_tokens   specify whether to skip empty tokens

Member Function Documentation

◆ has_more_tokens()

bool wex::tokenizer::has_more_tokens ( ) const

Returns true if the string still contains delimiters, and so can be tokenized.

A sequence of delimiters is skipped: an empty token is not returned.

◆ tokenize() [1/2]

template<typename T >
T wex::tokenizer::tokenize ( )
inline

Tokenizes the complete string into a templatized container, e.g. std::vector<std::string>.

Always restarts, so you can use has_more_tokens beforehand. Returns the filled container.

◆ tokenize() [2/2]

auto wex::tokenizer::tokenize ( )
inline

Tokenizes the complete string into a vector of integers (size_t).

Always restarts, so you can use has_more_tokens beforehand. Returns the filled vector.