test: cover lexer errors and token display #45
Conversation
Bring src/arx/lexer.py to full line coverage: Token, TokenList, LexerError, docstrings, escapes, and numeric edge cases.
Pull request overview
This PR expands tests/test_lexer.py to drive src/arx/lexer.py toward full line coverage by exercising previously untested branches and error paths (Token/TokenList helpers, LexerError formatting, docstring parsing, numeric edge cases, and string/char literal escapes).
Changes:
- Add unit tests for `Token.__hash__`, `Token.get_display_value()`, and `TokenList` iteration behavior.
- Add unit tests validating `LexerError` message formatting and location propagation.
- Add lexer tests for `false`, `and`/`or` operators, lone `.` handling, multiple-decimal rejection, docstring delimiter/termination cases, and quoted literal escape/error cases.
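The iteration-behavior test mentioned above can be sketched as follows. Note that the `TokenList` shown here is a stand-in written for illustration, not the actual class from `src/arx/lexer.py`, whose constructor and fields are not shown in this thread:

```python
class TokenList:
    """Stand-in for arx's TokenList; the real class lives in src/arx/lexer.py."""

    def __init__(self, tokens):
        self.tokens = list(tokens)
        self.position = 0

    def __iter__(self):
        # Restart iteration from the first token each time.
        self.position = 0
        return self

    def __next__(self):
        if self.position >= len(self.tokens):
            raise StopIteration
        tok = self.tokens[self.position]
        self.position += 1
        return tok


def test_tokenlist_iteration():
    tl = TokenList(["fn", "main", "("])
    # Iterating yields the tokens in order.
    assert list(tl) == ["fn", "main", "("]
    # A second pass restarts from the beginning rather than being exhausted.
    assert list(tl) == ["fn", "main", "("]
```

The second assertion is the branch such tests typically miss: an iterator that does not reset its position would yield an empty list on the second pass.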
| """ | ||
| title: String literals honor common backslash escapes and pass-through. | ||
| """ | ||
| ArxIO.string_to_buffer('"line\\n\\t\\\\\\"end"\n') |
`test_lexer_string_escape_sequences` builds the input string with too many backslashes (e.g., `\\n`/`\\t` and the `...\\\"end...` segment). With the current lexer escape handling, this will not produce a newline/tab and will likely terminate the string before `end`, causing the assertion to fail. Update the test input so the buffer contains the intended source text: `\n` and `\t` (single-backslash escapes) and `\\\"` (a literal backslash followed by an escaped quote), and end the buffer with a real newline (`\n`) rather than a literal backslash-n sequence.
```diff
- ArxIO.string_to_buffer('"line\\n\\t\\\\\\"end"\n')
+ ArxIO.string_to_buffer('"line\n\t\\\"end"\n')
```
Thanks @omsherikar |
Solves #41