class Linguist::Tokenizer

Generic programming language tokenizer.
Tokens are designed for use in the language Bayes classifier.
It strips any data strings or comments and preserves significant
language symbols.

Public: Extract tokens from data

data - String to tokenize

Returns Array of token Strings.
def self.tokenize(data)
  new.extract_tokens(data)
end
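
A minimal usage sketch, assuming the github-linguist gem is installed and
loaded with require 'linguist'; the token list shown is illustrative only,
since the exact output depends on the tokenizer's rules.

require 'linguist'

# Comments and string contents are dropped; keywords, identifiers and
# significant punctuation come back as tokens.
tokens = Linguist::Tokenizer.tokenize(<<~'RUBY')
  # Say hello
  def greet(name)
    puts "Hello, #{name}!"
  end
RUBY

tokens # => e.g. ["def", "greet", "(", "name", ")", "puts", "end"]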