Add datetime quickcheck coverage and exponent-like key parsing fixes #63

Open
bobzhang wants to merge 7 commits into main from codex/qc-datetime-exponent-cherrypicks

Conversation

Collaborator

@bobzhang bobzhang commented Mar 27, 2026

Summary

  • cherry-pick quickcheck coverage for datetime values and exponent-like keys into the refactored internal/qc_model layout
  • apply the tokenizer fix so negative exponent-like bare keys are preserved in key position
  • update parser expectations for exponent-like key cases on current main

Testing

  • moon test

devin-ai-integration[bot]

This comment was marked as resolved.

@devin-ai-integration[bot] left a comment

Devin Review found 1 new potential issue.

View 10 additional findings in Devin Review.

```diff
  // (underscores, exponent notation), produce an Identifier, including for negative bare keys like "-1e2".
  let has_lossy_chars = raw_s != s || (is_float && !raw_s.contains("."))
- if !is_negative && !has_sign && has_lossy_chars {
+ if !has_sign && has_lossy_chars {
```

🔴 Removing !is_negative guard causes negative numbers with underscores in arrays to be rejected

The change from if !is_negative && !has_sign && has_lossy_chars to if !has_sign && has_lossy_chars at internal/tokenize/tokenize.mbt:883 allows negative numbers with underscores (e.g., -1_000) to enter the key-position heuristic. When such a number is the last element in an array (e.g., arr = [-1_000]), the ] at line 888 triggers is_key = true, causing the tokenizer to emit Identifier("-1_000") instead of IntegerToken(-1000).

The parser's parse_exponent_like_identifier at parser.mbt:409-418 only handles identifiers containing e/E, so it returns None for "-1_000", resulting in an "Expected value" error.

Before this PR, !is_negative was false for negative numbers, so the key-position check was skipped entirely and -1_000 always became IntegerToken(-1000). This regression affects any negative number with underscores as the last or only element of an array (e.g., [-1_000], [1, -2_000], [-1.5_0]).
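To make the failure mode concrete, here is a toy Python model of the heuristic described above. The function name, its parameters, and the lookahead set are hypothetical stand-ins for the MoonBit code, not the actual implementation:

```python
def classify(lexeme: str, is_negative: bool, has_sign: bool, lookahead: str) -> str:
    """Toy stand-in for the tokenizer's key-position heuristic.

    lexeme is the digit run as written (the sign is tracked separately,
    as the is_negative flag suggests); lookahead is the next character.
    """
    stripped = lexeme.replace("_", "")    # the value the numeric token would carry
    has_lossy_chars = lexeme != stripped  # underscores would be lost
    sign = "-" if is_negative else ""
    # The PR's broadened guard: the `!is_negative` check is gone, so
    # negative numbers now reach the key-position branch.
    if not has_sign and has_lossy_chars:
        if lookahead in ("=", "]"):       # "]" mimics last-element-in-array
            return f'Identifier("{sign}{lexeme}")'
    return f"IntegerToken({sign}{stripped})"

# The regression: -1_000 as the last array element becomes an identifier.
print(classify("1_000", True, False, "]"))  # Identifier("-1_000")
# Elsewhere in the array it still tokenizes as a number.
print(classify("1_000", True, False, ","))  # IntegerToken(-1000)
```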

Prompt for agents
The fix needs to address two locations:

1. In internal/tokenize/tokenize.mbt at line 883: The condition `if !has_sign && has_lossy_chars` is too broad. It should either restore the `!is_negative` check and handle the negative exponent key case differently, or the parser's fallback needs to be extended.

2. In parser.mbt at lines 409-418 (parse_exponent_like_identifier): If keeping the tokenizer change, this function needs to handle the underscore case too. It should strip underscores from the name before attempting parse_double, and also handle pure integer identifiers (no 'e'/'E') by trying parse_int64 after stripping underscores. For example:
   - If the name contains underscores, strip them and try parsing as integer (via @string.parse_int64) or float (via @string.parse_double)
   - Return TomlInteger for integer-like identifiers and TomlFloat for float-like ones
   - This would require changing the return type or having the caller handle both cases

The simplest fix is option 1: change line 883 back to `if !is_negative && !has_sign && has_lossy_chars` and handle the negative exponent-like key case separately in the tokenizer, perhaps by adding a specific check for negative exponent patterns before the general has_lossy_chars check.
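If option 2 were chosen instead, the extended fallback could look roughly like this Python sketch. The function name mirrors parse_exponent_like_identifier, but the tag strings and (tag, value) return shape are illustrative, not the parser's actual API:

```python
def parse_exponent_like_identifier(name: str):
    """Sketch of option 2: strip underscores, then try integer
    parsing before float parsing. Returns (type-tag, value) or None."""
    stripped = name.replace("_", "")
    try:
        return ("TomlInteger", int(stripped))   # pure integers like "-1_000"
    except ValueError:
        pass
    try:
        return ("TomlFloat", float(stripped))   # exponent/float forms like "-1e2"
    except ValueError:
        return None                             # genuinely not numeric

print(parse_exponent_like_identifier("-1_000"))  # ('TomlInteger', -1000)
print(parse_exponent_like_identifier("-1e2"))    # ('TomlFloat', -100.0)
print(parse_exponent_like_identifier("-1.5_0"))  # ('TomlFloat', -1.5)
```

Note that a real implementation would also need to validate underscore placement (TOML only permits underscores between digits); the blanket strip here is purely for illustration.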
