class Crass::Tokenizer

def tokenize

Experimental RBS support (using type sampling data from the type_fusion project).

def tokenize: () -> (Array[{ node: Symbol, pos: Integer, raw: String, type: Symbol, value: String }] | Array[{ node: Symbol, pos: Integer, raw: String, value: String }])

This signature was generated from 2 samples collected in 1 application.

Tokenizes the input stream and returns an array of tokens.
def tokenize
  @s.reset
  tokens = []
  while token = consume
    tokens << token
  end
  tokens
end
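Per the signature above, each returned token is a Hash keyed by `:node`, `:pos`, and `:raw`, with `:value` and `:type` present on some node kinds. A minimal sketch of consuming such a token array; the literal tokens below are hand-written sample data in that shape, not actual Crass output:

```ruby
# Hypothetical tokens matching the shape described by the RBS signature;
# in practice they would come from Crass::Tokenizer#tokenize.
tokens = [
  { node: :ident,      pos: 0, raw: "color", value: "color" },
  { node: :colon,      pos: 5, raw: ":" },
  { node: :whitespace, pos: 6, raw: " " },
  { node: :hash,       pos: 7, raw: "#fff", type: :id, value: "fff" }
]

# Collect the node kinds in order.
node_kinds = tokens.map { |t| t[:node] }

# Reconstruct the original input by joining the raw slices.
source = tokens.map { |t| t[:raw] }.join
```

Because every token carries its `:raw` text, joining the `:raw` values round-trips the original input.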