Challenge solved! 😄
First, I write the test asserting the expected tokens:
```python
def test_one_character_operator(self) -> None:
    source: str = "=+-/*<>!"
    lexer: Lexer = Lexer(source)

    tokens: List[Token] = []
    for _ in range(len(source)):
        tokens.append(lexer.next_token())

    expected_tokens: List[Token] = [
        Token(TokenType.ASSIGN, "="),
        Token(TokenType.PLUS, "+"),
        Token(TokenType.MINUS, "-"),
        Token(TokenType.DIVISION, "/"),
        Token(TokenType.MULTIPLICATION, "*"),
        Token(TokenType.LT, "<"),
        Token(TokenType.GT, ">"),
        Token(TokenType.NEGATION, "!"),
    ]

    self.assertEqual(tokens, expected_tokens)
```
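For the list comparison in the test to pass, Token has to compare by value, not by identity. A minimal sketch of how Token and TokenType might be defined to support that, assuming a frozen dataclass and `enum.auto` (the member and field names come from the test; the rest is an assumption, not the repo's actual code):

```python
from dataclasses import dataclass
from enum import Enum, auto


class TokenType(Enum):
    ASSIGN = auto()
    PLUS = auto()


# A dataclass auto-generates __eq__, so two tokens with the same
# type and literal compare equal, which the test's assertion relies on.
@dataclass(frozen=True)
class Token:
    token_type: TokenType
    literal: str


assert Token(TokenType.PLUS, "+") == Token(TokenType.PLUS, "+")
```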
Next, the new tokens are added to the list of token types:
```python
DIVISION = auto()
GT = auto()              # greater than (>)
MINUS = auto()           # subtraction
MULTIPLICATION = auto()
NEGATION = auto()        # negation (!)
```
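With `auto()`, each member gets a distinct value assigned automatically, so no numbers have to be maintained by hand when new operators are added. A sketch of how the enum might look with the new members in place (any members beyond those named in this post are assumptions):

```python
from enum import Enum, auto, unique


@unique  # guards against two members accidentally sharing a value
class TokenType(Enum):
    ASSIGN = auto()
    DIVISION = auto()
    GT = auto()              # greater than (>)
    LT = auto()              # less than (<)
    MINUS = auto()           # subtraction
    MULTIPLICATION = auto()
    NEGATION = auto()        # negation (!)
    PLUS = auto()


# every member received a distinct auto-assigned value
assert len({member.value for member in TokenType}) == len(TokenType)
```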
Then the matching conditions are added to the lexer so it returns the correct tokens (note that the multiplication sign has to be escaped in the regex, since `*` is a quantifier):
```python
elif match(r"^-$", self._character):
    token = Token(TokenType.MINUS, self._character)
elif match(r"^/$", self._character):
    token = Token(TokenType.DIVISION, self._character)
elif match(r"^\*$", self._character):
    token = Token(TokenType.MULTIPLICATION, self._character)
elif match(r"^<$", self._character):
    token = Token(TokenType.LT, self._character)
elif match(r"^>$", self._character):
    token = Token(TokenType.GT, self._character)
elif match(r"^!$", self._character):
    token = Token(TokenType.NEGATION, self._character)
```
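Putting the pieces together, the whole chain can be exercised end to end. This is a minimal runnable sketch under my own assumptions, not the repo's actual code: the `_read_character` helper and the `ILLEGAL` fallback token are things I added to make it self-contained.

```python
from dataclasses import dataclass
from enum import Enum, auto
from re import match
from typing import List


class TokenType(Enum):
    ASSIGN = auto()
    DIVISION = auto()
    GT = auto()
    ILLEGAL = auto()   # assumption: fallback for unrecognized characters
    LT = auto()
    MINUS = auto()
    MULTIPLICATION = auto()
    NEGATION = auto()
    PLUS = auto()


@dataclass(frozen=True)
class Token:
    token_type: TokenType
    literal: str


class Lexer:
    def __init__(self, source: str) -> None:
        self._source = source
        self._position = 0
        self._character = ""
        self._read_character()

    def _read_character(self) -> None:
        # advance one character; "" signals end of input
        if self._position >= len(self._source):
            self._character = ""
        else:
            self._character = self._source[self._position]
        self._position += 1

    def next_token(self) -> Token:
        if match(r"^=$", self._character):
            token = Token(TokenType.ASSIGN, self._character)
        elif match(r"^\+$", self._character):
            token = Token(TokenType.PLUS, self._character)
        elif match(r"^-$", self._character):
            token = Token(TokenType.MINUS, self._character)
        elif match(r"^/$", self._character):
            token = Token(TokenType.DIVISION, self._character)
        elif match(r"^\*$", self._character):  # * must be escaped in the regex
            token = Token(TokenType.MULTIPLICATION, self._character)
        elif match(r"^<$", self._character):
            token = Token(TokenType.LT, self._character)
        elif match(r"^>$", self._character):
            token = Token(TokenType.GT, self._character)
        elif match(r"^!$", self._character):
            token = Token(TokenType.NEGATION, self._character)
        else:
            token = Token(TokenType.ILLEGAL, self._character)
        self._read_character()
        return token


source = "=+-/*<>!"
lexer = Lexer(source)
tokens: List[Token] = [lexer.next_token() for _ in source]
print([t.literal for t in tokens])  # → ['=', '+', '-', '/', '*', '<', '>', '!']
```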
And done! 😄
https://github.com/RetaxMaster/lpp/commit/b935aa10e833565492fea8935a68e4dd87067c66