A function that accepts input in its original form and separates it into tokens will be called a lexical analyzer. These analyzers are functional: call one with the original input, and it returns a function of 0 arguments (with mutable internal state) that returns the next token each time it is called, until none are left. More precisely, the analyzer returns a pair (pos, token), where pos indicates the position at which token was found in the input. A position may be any sort of thing that can be converted to a string with toString (for printing error messages) and that can be sorted.
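The protocol above can be sketched as follows. This is an illustrative example in Python rather than Macaulay2; the whitespace-splitting rule, the use of character offsets as positions, and the None end marker are assumptions made for the sketch, not part of the specification.

```python
import re

def make_analyzer(text):
    # Pre-compute (position, token) pairs; here positions are character
    # offsets, which can be printed in error messages and sorted.
    matches = [(m.start(), m.group()) for m in re.finditer(r"\S+", text)]
    state = {"i": 0}  # mutable internal state of the closure

    def next_token():
        # The zero-argument function returned by the analyzer: each call
        # yields the next (pos, token) pair, or None when tokens run out.
        if state["i"] >= len(matches):
            return None
        pair = matches[state["i"]]
        state["i"] += 1
        return pair

    return next_token

lexer = make_analyzer("x = 42")
print(lexer())  # (0, 'x')
print(lexer())  # (2, '=')
print(lexer())  # (4, '42')
print(lexer())  # None
```

The essential point is that calling the analyzer on the input produces a stateful zero-argument function, so several independent token streams over different inputs can coexist.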
The object Analyzer is a self-initializing type, with ancestor classes FunctionClosure < Function < Thing.