
Conversation

@pranavrajpal
Contributor

Having --check-untyped-defs off by default is rather surprising behavior, since most people don't expect a function's annotations to determine whether mypy checks its body. See #11201 and #3948 (comment) for previous discussions of this.

I'm mainly creating this to see what mypy_primer says about the impact of changing the default and if there are other changes we can make to reduce the number of new errors. I haven't looked at whether the test suite passes yet.
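
A minimal illustration of the surprise (hypothetical code, not taken from any project below): by default mypy skips the body of the unannotated function entirely, so the same bad assignment is only reported in the annotated version.

```python
# With default settings, mypy does not check this body at all,
# because the function has no annotations:
def shout(message):
    result = 0                 # would be inferred as int if checked
    result = message.upper()   # str assigned to int: unreported by default
    return result

# Annotating the signature makes mypy check the body, so the same
# assignment is flagged here as an [assignment] error:
def shout_typed(message: str) -> str:
    result = 0
    result = message.upper()
    return result
```

Both run fine at runtime; only the static checking differs.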

Test Plan

I made sure mypy --check-untyped-defs, mypy --no-check-untyped-defs, and mypy --strict all behaved the same as before, and mypy with no options reported errors in untyped functions. I also looked at the mypy --help output and made sure it seemed reasonable.

Changing the default of an option apparently isn't as straightforward as
just changing the default argument to add_invertible_flag. We need to
also specify dest, change the name of the flag, and change the default
in options.py.

Remove strict_flag=True to prevent check-untyped-defs from being
disabled when --strict is passed. Also update the help text for the
untyped group to remove the statement referring to --check-untyped-defs
being disabled by default.
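
The commit notes above can be sketched with plain argparse. This is a hypothetical stand-in for mypy's actual add_invertible_flag helper, not its real signature:

```python
import argparse

# Hypothetical sketch of an "invertible flag" pair (NOT mypy's actual
# helper): the positive flag and its --no-... inverse share one dest,
# which is why flipping the default means touching the flag name, the
# dest, and the default together, as the commit message describes.
def add_invertible_flag(parser, flag, *, default):
    dest = flag[2:].replace("-", "_")
    parser.add_argument(flag, action="store_true", dest=dest, default=default)
    parser.add_argument("--no-" + flag[2:], action="store_false", dest=dest)

parser = argparse.ArgumentParser()
add_invertible_flag(parser, "--check-untyped-defs", default=True)

print(parser.parse_args([]).check_untyped_defs)                           # True
print(parser.parse_args(["--no-check-untyped-defs"]).check_untyped_defs)  # False
```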

@pranavrajpal
Contributor Author

My analysis of some of the mypy_primer errors (see below for summary):

Stone

  • Most of the Optional[Any] errors seem to be true positives because there are attributes that are only given a non-None value in a method separate from __init__.
  • There are some errors involving numbers.Integral, which seem somewhat similar to int is not a Number? #3186
  • The first 2 errors in stone_serializers.py are examples of Inner method discards type narrowing done in outer method #2608, which should be solvable using the heuristic pyright uses (see Inner method discards type narrowing done in outer method #2608 (comment))
  • The last 2 errors in stone_serializers.py have a variable being assigned different types at different points. The right fix for that is probably just adding a type annotation or using an overload for the function.
  • The test_stone_internal.py errors all seem solvable by enabling --allow-redefinition.
  • The dict entry has incompatible type errors are caused by heterogeneous dicts with values of type str and List[Dict[str, str]]. Inferring a union for the value type might help with that, but I don't know if that's going to result in more errors due to values of dicts being invariant.
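
The Optional[Any] pattern from the first bullet, as a hypothetical sketch (not Stone's actual source): the attribute only receives a non-None value in a method separate from __init__, so every later use must narrow it first.

```python
from typing import Optional

class Cache:
    def __init__(self) -> None:
        self.data: Optional[str] = None  # only given a real value later

    def load(self) -> None:
        self.data = "payload"

    def size(self) -> int:
        # Without narrowing, mypy reports:
        # Item "None" of "Optional[str]" has no attribute "__len__"
        assert self.data is not None
        return len(self.data)
```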

Pandera

Most of these errors look like genuine errors. Some notes about some of those errors:

  • The __setstate__ error seems like a typeshed bug: __setstate__ exists on BaseException at runtime, but typeshed doesn't declare it.
  • The singledispatch error in engine.py is technically correct but relatively easy to fix (the 2 classes are incompatible, but the right fix is just changing Dispatch to a Protocol since it's only being used for type checking purposes).
  • The strategies.py errors are more examples of Inner method discards type narrowing done in outer method #2608.
  • io.py is another example of mypy getting confused by heterogeneous dicts.
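
A hedged sketch of the Protocol fix mentioned above (not pandera's actual code): when a class exists only for type checking, turning it into a Protocol lets structurally compatible objects satisfy it without sharing a base class.

```python
from typing import Protocol

class Dispatch(Protocol):
    # Structural interface: anything callable with this shape matches.
    def __call__(self, value: object) -> object: ...

def handler(value: object) -> object:
    return value

# No inheritance needed: mypy accepts this because 'handler'
# matches Dispatch structurally.
d: Dispatch = handler
```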

pyjwt

  • The first 2 errors in algorithms.py are caused by redefinition of the same variable in different branches.
  • The next 3 errors in algorithms.py can easily be fixed by adding a type annotation.
  • The help.py errors are because typeshed doesn't have sys.pypy_version_info.
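
The branch-redefinition pattern from the first bullet, as a hypothetical example (not pyjwt's code): mypy pins a variable to the type of its first assignment, so the other branch errors unless an annotation widens the type or --allow-redefinition is enabled.

```python
from typing import Union

def pick(flag):
    # With body checking on, mypy infers 'result: str' from the first
    # branch, then reports the int assignment in the second branch.
    if flag:
        result = "yes"
    else:
        result = 0
    return result

# An explicit union annotation is one easy fix:
def pick_annotated(flag) -> Union[str, int]:
    result: Union[str, int] = "yes" if flag else 0
    return result
```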

Summary

  • Enabling --allow-redefinition would probably help reduce the number of errors a decent amount. More lenient rules around allowing redefinition than --allow-redefinition might help even more, but that comes at the cost of hiding true errors and worse error messages.
  • Inner method discards type narrowing done in outer method #2608 also seems to come up a decent amount, so fixing that could help reduce errors more.

@hauntsaninja
Collaborator

Thanks for looking into this! You probably know this already, but just in case, note that ^ is only a fraction of the complete mypy_primer output (... (truncated 3325383 chars) ...). If you're interested in seeing all of it, you can download the "mypy_primer_diffs" artifact at https://github.com/python/mypy/actions/runs/2403480711


@github-actions
Contributor

github-actions bot commented Jan 2, 2026

Diff from mypy_primer, showing the effect of this PR on open source code:

beartype (https://github.com/beartype/beartype)
+ beartype/typing/_typingcache.py:130: error: Exception must be derived from BaseException  [misc]
+ beartype/typing/_typingpep544.py:520: error: "__class_getitem__" undefined in superclass  [misc]

pyinstrument (https://github.com/joerick/pyinstrument)
+ pyinstrument/vendor/appdirs.py:497: error: Module has no attribute "OpenKey"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:498: error: Module has no attribute "HKEY_CURRENT_USER"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:501: error: Module has no attribute "QueryValueEx"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:515: error: Module has no attribute "windll"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:526: error: Module has no attribute "windll"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:537: error: Module has no attribute "zeros"  [attr-defined]
+ pyinstrument/vendor/appdirs.py:550: error: Module has no attribute "zeros"  [attr-defined]
- pyinstrument/magic/_utils.py:73: error: Unused "type: ignore" comment  [unused-ignore]
- pyinstrument/magic/_utils.py:74: error: Unused "type: ignore" comment  [unused-ignore]
- pyinstrument/frame.py:346: error: Unused "type: ignore" comment  [unused-ignore]
- pyinstrument/frame.py:348: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- pyinstrument/frame.py:352: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- pyinstrument/frame.py:394: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ pyinstrument/vendor/decorator.py:57: error: Module has no attribute "getargspec"  [attr-defined]
+ pyinstrument/vendor/decorator.py:103: error: Need type annotation for "arg"  [var-annotated]
+ pyinstrument/vendor/decorator.py:253: error: Incompatible types in assignment (expression has type "str | None", variable has type "str")  [assignment]
+ pyinstrument/vendor/decorator.py:259: error: Incompatible types in assignment (expression has type "tuple[Any, ...] | None", variable has type "tuple[()]")  [assignment]
+ pyinstrument/vendor/decorator.py:294: error: Missing positional arguments "args", "kwds" in call to "__init__" of "_GeneratorContextManagerBase"  [call-arg]
+ pyinstrument/vendor/decorator.py:353: error: Need type annotation for "typemap" (hint: "typemap: dict[<type>, <type>] = ...")  [var-annotated]
+ pyinstrument/vendor/decorator.py:360: error: Need type annotation for "ras"  [var-annotated]
- pyinstrument/profiler.py:246: error: Unused "type: ignore" comment  [unused-ignore]
- pyinstrument/context_manager.py:71: error: Unused "type: ignore" comment  [unused-ignore]
- pyinstrument/__main__.py:35: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ pyinstrument/__main__.py:324: error: Incompatible types in assignment (expression has type "TextIO | Any", variable has type "TextIOWrapper[_WrappedBuffer]")  [assignment]
- pyinstrument/middleware.py:53: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]

spack (https://github.com/spack/spack)
+ lib/spack/spack/util/unparse/unparser.py:729: error: Incompatible types in assignment (expression has type "str", variable has type "expr")  [assignment]
+ lib/spack/spack/util/unparse/unparser.py:732: error: "expr" has no attribute "replace"; maybe "__replace__"?  [attr-defined]
+ lib/spack/spack/llnl/util/tty/color.py:256: error: Argument 1 to "_escape" has incompatible type "int"; expected "str"  [arg-type]
+ lib/spack/spack/llnl/util/tty/color.py:256: error: Argument 2 to "_escape" has incompatible type "bool | None"; expected "bool"  [arg-type]
+ lib/spack/spack/llnl/util/tty/color.py:264: error: Argument 2 to "_escape" has incompatible type "bool | None"; expected "bool"  [arg-type]
+ lib/spack/spack/llnl/util/tty/color.py:266: error: Argument 1 to "_escape" has incompatible type "int"; expected "str"  [arg-type]
+ lib/spack/spack/llnl/util/tty/color.py:266: error: Argument 2 to "_escape" has incompatible type "bool | None"; expected "bool"  [arg-type]
+ lib/spack/spack/llnl/util/tty/color.py:428: error: Too many arguments for "write"  [call-arg]
+ lib/spack/spack/llnl/util/tty/color.py:428: error: "ColorStream" has no attribute "color"; maybe "_color"?  [attr-defined]
- lib/spack/spack/llnl/util/lang.py:441: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ lib/spack/spack/llnl/util/lang.py:92: error: Need type annotation for "result" (hint: "result: dict[<type>, <type>] = ...")  [var-annotated]
+ lib/spack/spack/llnl/util/lang.py:264: error: Unsupported left operand type for < ("object")  [operator]
+ lib/spack/spack/llnl/util/lang.py:620: error: Item "None" of "Match[str] | None" has no attribute "groups"  [union-attr]
+ lib/spack/spack/llnl/util/lang.py:634: error: Unsupported operand types for - ("None" and "timedelta")  [operator]
+ lib/spack/spack/llnl/util/lang.py:634: note: Left operand is of type "datetime | None"
+ lib/spack/spack/llnl/util/lang.py:653: error: Incompatible types in assignment (expression has type "float", variable has type "int")  [assignment]
+ lib/spack/spack/llnl/util/lang.py:655: error: Incompatible types in assignment (expression has type "float", variable has type "int")  [assignment]
+ lib/spack/spack/llnl/util/lang.py:657: error: Incompatible types in assignment (expression has type "float", variable has type "int")  [assignment]
+ lib/spack/spack/llnl/util/lang.py:734: error: Incompatible types in assignment (expression has type "object", variable has type "None")  [assignment]
- lib/spack/spack/llnl/util/lang.py:993: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ lib/spack/spack/llnl/util/lang.py:816: error: Argument 1 to "module_from_spec" has incompatible type "ModuleSpec | None"; expected "ModuleSpec"  [arg-type]
+ lib/spack/spack/llnl/util/lang.py:821: error: Item "None" of "ModuleSpec | None" has no attribute "name"  [union-attr]
+ lib/spack/spack/llnl/util/lang.py:823: error: Item "None" of "ModuleSpec | None" has no attribute "loader"  [union-attr]
+ lib/spack/spack/llnl/util/lang.py:823: error: Item "None" of "Loader | Any | None" has no attribute "exec_module"  [union-attr]
+ lib/spack/spack/llnl/util/lang.py:826: error: Item "None" of "ModuleSpec | None" has no attribute "name"  [union-attr]
+ lib/spack/spack/llnl/util/tty/__init__.py:204: error: Item "None" of "Any | None" has no attribute "write"  [union-attr]
+ lib/spack/spack/llnl/util/tty/__init__.py:206: error: Item "None" of "Any | None" has no attribute "write"  [union-attr]
+ lib/spack/spack/llnl/util/tty/__init__.py:207: error: Item "None" of "Any | None" has no attribute "flush"  [union-attr]
+ lib/spack/spack/llnl/util/tty/__init__.py:291: error: "dict[str, Any]" has no attribute "iterkeys"  [attr-defined]
+ lib/spack/spack/util/windows_registry.py:84: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:90: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:99: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:107: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:113: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:126: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:148: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:170: error: Name "winreg" is not defined  [name-defined]
+ lib/spack/spack/util/windows_registry.py:249: error: Cannot determine type of "_reg"  [has-type]
+ lib/spack/spack/util/windows_registry.py:249: error: Cannot determine type of "root"  [has-type]
+ lib/spack/spack/util/windows_registry.py:249: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:259: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:269: error: Cannot determine type of "_reg"  [has-type]
+ lib/spack/spack/util/windows_registry.py:271: error: Cannot determine type of "_reg"  [has-type]
+ lib/spack/spack/util/windows_registry.py:276: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:282: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:288: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:301: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/util/windows_registry.py:321: error: Cannot determine type of "key"  [has-type]
+ lib/spack/spack/llnl/util/lock.py:738: error: "AbstractContextManager[Any, bool | None]" not callable  [operator]
+ lib/spack/spack/llnl/util/lock.py:738: error: Incompatible types in assignment (expression has type "bool | Any", variable has type "None")  [assignment]
+ lib/spack/spack/llnl/util/lock.py:740: error: "None" has no attribute "__enter__"  [attr-defined]
+ lib/spack/spack/llnl/util/lock.py:754: error: Too many arguments  [call-arg]
+ lib/spack/spack/llnl/util/lock.py:754: error: "AbstractContextManager[Any, bool | None]" not callable  [operator]
+ lib/spack/spack/llnl/util/filesystem.py:142: error: Module has no attribute "_copyxattr"  [attr-defined]
+ lib/spack/spack/llnl/util/filesystem.py:1242: error: No overload variant of "__call__" of "_RmtreeType" matches argument types "str", "dict[str, bool]"  [call-overload]
+ lib/spack/spack/llnl/util/filesystem.py:1242: note: Possible overload variants:
+ lib/spack/spack/llnl/util/filesystem.py:1242: note:     def __call__(self, path: str | bytes | PathLike[str] | PathLike[bytes], ignore_errors: bool, onerror: Callable[[Callable[..., Any], str, tuple[type[BaseException], BaseException, TracebackType]], object] | None, *, onexc: None = ..., dir_fd: int | None = ...) -> None
+ lib/spack/spack/llnl/util/filesystem.py:1242: note:     def __call__(self, path: str | bytes | PathLike[str] | PathLike[bytes], ignore_errors: bool = ..., *, onerror: Callable[[Callable[..., Any], str, tuple[type[BaseException], BaseException, TracebackType]], object] | None, onexc: None = ..., dir_fd: int | None = ...) -> None
+ lib/spack/spack/llnl/util/filesystem.py:1242: note:     def __call__(self, path: str | bytes | PathLike[str] | PathLike[bytes], ignore_errors: bool = ..., *, onexc: Callable[[Callable[..., Any], str, BaseException], object] | None = ..., dir_fd: int | None = ...) -> None
+ lib/spack/spack/llnl/util/filesystem.py:2520: error: Need type annotation for "files" (hint: "files: list[<type>] = ...")  [var-annotated]
- lib/spack/spack/operating_systems/windows_os.py:52: error: Unused "type: ignore" comment  [unused-ignore]
+ lib/spack/spack/util/ctest_log_parser.py:326: error: Incompatible types in assignment (expression has type "Callable[[Any, Any, Any, Any, Any], Any]", variable has type "Callable[[Any, Any, Any], Any]")  [assignment]
+ lib/spack/spack/util/ctest_log_parser.py:342: error: Incompatible types in assignment (expression has type "BuildWarning", variable has type "BuildError")  [assignment]
+ lib/spack/spack/util/module_cmd.py:171: error: Need type annotation for "path_occurrences" (hint: "path_occurrences: dict[<type>, <type>] = ...")  [var-annotated]
+ lib/spack/spack/util/gcs.py:92: error: Item "None" of "Any | None" has no attribute "get_blob"  [union-attr]
+ lib/spack/spack/util/gcs.py:97: error: Item "None" of "Any | None" has no attribute "blob"  [union-attr]
+ lib/spack/spack/util/gcs.py:116: error: Item "None" of "Any | None" has no attribute "list_blobs"  [union-attr]
+ lib/spack/spack/util/archive.py:269: error: Item "None" of "IO[bytes] | None" has no attribute "read"  [union-attr]
+ lib/spack/spack/util/archive.py:277: error: Item "None" of "IO[bytes] | None" has no attribute "read"  [union-attr]
+ lib/spack/spack/test/util/environment.py:28: error: Argument 1 to "is_system_path" has incompatible type "None"; expected "str"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:176: error: Item "None" of "Any | None" has no attribute "rank"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:179: error: Item "None" of "Any | None" has no attribute "bcast"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:188: error: Item "None" of "Any | None" has no attribute "barrier"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:190: error: Item "None" of "Any | None" has no attribute "rank"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:192: error: Argument 1 to "__call__" of "_RmtreeType" has incompatible type "Any | str | None"; expected "str | bytes | PathLike[str] | PathLike[bytes]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:203: error: Item "None" of "Any | None" has no attribute "rank"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:225: error: Argument "_wait_times" to "_poll_interval_generator" of "Lock" has incompatible type "list[int]"; expected "tuple[float, float, float] | None"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:258: error: Item "None" of "Any | None" has no attribute "size"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:261: error: Item "None" of "Any | None" has no attribute "Barrier"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:263: error: Item "None" of "Any | None" has no attribute "rank"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:264: error: Item "None" of "Any | None" has no attribute "Split"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:281: error: Item "None" of "Any | None" has no attribute "Abort"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:284: error: Item "None" of "Any | None" has no attribute "Barrier"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:648: error: Item "None" of "IO[bytes] | None" has no attribute "mode"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:653: error: Item "None" of "IO[bytes] | None" has no attribute "mode"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:658: error: Item "None" of "IO[bytes] | None" has no attribute "mode"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:681: error: Item "None" of "IO[bytes] | None" has no attribute "mode"  [union-attr]
+ lib/spack/spack/test/llnl/util/lock.py:896: error: Need type annotation for "vals"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:942: error: Need type annotation for "vals"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:1004: error: Need type annotation for "vals"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:1087: error: Need type annotation for "vals"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:1091: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1093: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1102: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1108: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1112: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1121: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1125: error: Argument "release" to "WriteTransaction" has incompatible type "Callable[[Any, Any, Any], Any]"; expected "Callable[[], bool] | None | AbstractContextManager[Any, bool | None]"  [arg-type]
+ lib/spack/spack/test/llnl/util/lock.py:1138: error: Need type annotation for "vals"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:1260: error: Need type annotation for "q1"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lock.py:1260: error: Need type annotation for "q2"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lang.py:208: error: Unsupported left operand type for < ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:209: error: Unsupported left operand type for > ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:211: error: Unsupported left operand type for <= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:212: error: Unsupported left operand type for >= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:214: error: Unsupported left operand type for <= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:215: error: Unsupported left operand type for <= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:216: error: Unsupported left operand type for >= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:217: error: Unsupported left operand type for >= ("KeyComparable")  [operator]
+ lib/spack/spack/test/llnl/util/lang.py:355: error: Need type annotation for "m"  [var-annotated]
+ lib/spack/spack/test/llnl/util/lang.py:362: error: Need type annotation for "m"  [var-annotated]
+ lib/spack/spack/report.py:167: error: Signature of "succeed" incompatible with supertype "SpecRecord"  [override]
+ lib/spack/spack/report.py:167: note:      Superclass:
+ lib/spack/spack/report.py:167: note:          def succeed(self) -> Any
+ lib/spack/spack/report.py:167: note:      Subclass:
+ lib/spack/spack/report.py:167: note:          def succeed(self, externals: Any) -> Any
+ lib/spack/spack/test/relocate_text.py:15: error: Item "None" of "Match[bytes] | None" has no attribute "group"  [union-attr]
+ lib/spack/spack/test/relocate_text.py:27: error: Item "None" of "Match[bytes] | None" has no attribute "group"  [union-attr]
+ lib/spack/spack/test/relocate_text.py:31: error: Item "None" of "Match[bytes] | None" has no attribute "group"  [union-attr]
+ lib/spack/spack/test/relocate_text.py:35: error: Item "None" of "Match[bytes] | None" has no attribute "group"  [union-attr]
+ lib/spack/spack/util/spack_yaml.py:171: error: "syaml_str" has no attribute "override"  [attr-defined]
+ lib/spack/spack/util/spack_yaml.py:520: error: Argument 2 to "StringMark" has incompatible type "None"; expected "int"  [arg-type]
+ lib/spack/spack/util/spack_yaml.py:520: error: Argument 3 to "StringMark" has incompatible type "None"; expected "int"  [arg-type]
+ lib/spack/spack/util/spack_yaml.py:520: error: Argument 4 to "StringMark" has incompatible type "None"; expected "int"  [arg-type]
+ lib/spack/spack/test/spack_yaml.py:83: error: Need type annotation for "aliased_list_2" (hint: "aliased_list_2: list[<type>] = ...")  [var-annotated]

... (truncated 850 lines) ...

hydpy (https://github.com/hydpy-dev/hydpy)
+ hydpy/docs/sphinx/defaultlinks_extension.py:18: error: Need type annotation for "viewlist"  [var-annotated]
+ hydpy/core/variabletools.py:1929: error: Argument 1 to "Variable" has incompatible type "None"; expected "SubVariables[Any, Any, Any]"  [arg-type]
+ hydpy/core/testtools.py:1216: error: Incompatible types in assignment (expression has type "type[_Open]", variable has type overloaded function)  [assignment]
+ hydpy/core/testtools.py:1430: error: Cannot assign to a type  [misc]
+ hydpy/core/testtools.py:1433: error: Cannot assign to a type  [misc]
+ hydpy/core/parametertools.py:121: error: Item "None" of "FrameType | None" has no attribute "f_back"  [union-attr]
+ hydpy/core/parametertools.py:122: error: Item "None" of "FrameType | Any | None" has no attribute "f_locals"  [union-attr]
+ hydpy/core/parametertools.py:122: error: Incompatible types in assignment (expression has type "Any | None", variable has type "str")  [assignment]
+ hydpy/core/parametertools.py:4790: error: "None" not callable  [misc]
+ hydpy/auxs/networktools.py:242: error: Need type annotation for "tree" (hint: "tree: dict[<type>, <type>] = ...")  [var-annotated]
+ hydpy/models/arma/arma_derived.py:70: error: "Parameter" has no attribute "thresholds"  [attr-defined]
+ hydpy/models/arma/arma_derived.py:96: error: "Parameter" has no attribute "thresholds"  [attr-defined]
+ hydpy/models/arma/arma_derived.py:118: error: "Parameter" has no attribute "ar_orders"  [attr-defined]
+ hydpy/models/arma/arma_derived.py:140: error: "Parameter" has no attribute "ma_orders"  [attr-defined]
+ hydpy/models/arma/arma_derived.py:169: error: "Parameter" has no attribute "ar_coefs"  [attr-defined]
+ hydpy/models/arma/arma_derived.py:201: error: "Parameter" has no attribute "ma_coefs"  [attr-defined]
+ hydpy/models/whmod/whmod_sequences.py:17: error: "Sequence_" has no attribute "model"  [attr-defined]
+ hydpy/models/whmod/whmod_derived.py:100: error: No overload variant of "__call__" of "_UFunc_Nin2_Nout1" matches argument types "Parameter", "float"  [call-overload]
+ hydpy/models/whmod/whmod_derived.py:100: note: Possible overload variants:
+ hydpy/models/whmod/whmod_derived.py:100: note:     def __call__(complex | str | bytes | generic[Any], complex | str | bytes | generic[Any], /, out: None = ..., *, dtype: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None = ..., where: _SupportsArray[dtype[numpy.bool[builtins.bool]]] | _NestedSequence[_SupportsArray[dtype[numpy.bool[builtins.bool]]]] | builtins.bool | _NestedSequence[builtins.bool] | None = ..., casting: Literal['no', 'equiv', 'safe', 'same_kind', 'unsafe'] = ..., order: Literal['K', 'A', 'C', 'F'] | None = ..., subok: bool = ..., signature: tuple[str | None, str | None, str | None] | str | None = ...) -> Any
+ hydpy/models/whmod/whmod_derived.py:100: note:     def __call__(Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], ndarray[tuple[Any, ...], dtype[generic[Any]]], /, out: ndarray[tuple[Any, ...], dtype[generic[Any]]] | tuple[ndarray[tuple[Any, ...], dtype[generic[Any]]]] | None = ..., *, dtype: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None = ..., where: _SupportsArray[dtype[numpy.bool[builtins.bool]]] | _NestedSequence[_SupportsArray[dtype[numpy.bool[builtins.bool]]]] | builtins.bool | _NestedSequence[builtins.bool] | None = ..., casting: Literal['no', 'equiv', 'safe', 'same_kind', 'unsafe'] = ..., order: Literal['K', 'A', 'C', 'F'] | None = ..., subok: bool = ..., signature: tuple[str | None, str | None, str | None] | str | None = ...) -> ndarray[tuple[Any, ...], dtype[Any]]
+ hydpy/models/whmod/whmod_derived.py:100: note:     def __call__(ndarray[tuple[Any, ...], dtype[generic[Any]]], Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], /, out: ndarray[tuple[Any, ...], dtype[generic[Any]]] | tuple[ndarray[tuple[Any, ...], dtype[generic[Any]]]] | None = ..., *, dtype: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None = ..., where: _SupportsArray[dtype[numpy.bool[builtins.bool]]] | _NestedSequence[_SupportsArray[dtype[numpy.bool[builtins.bool]]]] | builtins.bool | _NestedSequence[builtins.bool] | None = ..., casting: Literal['no', 'equiv', 'safe', 'same_kind', 'unsafe'] = ..., order: Literal['K', 'A', 'C', 'F'] | None = ..., subok: bool = ..., signature: tuple[str | None, str | None, str | None] | str | None = ...) -> ndarray[tuple[Any, ...], dtype[Any]]
+ hydpy/models/whmod/whmod_derived.py:100: note:     def __call__(Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], /, out: ndarray[tuple[Any, ...], dtype[generic[Any]]] | tuple[ndarray[tuple[Any, ...], dtype[generic[Any]]]], *, dtype: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None = ..., where: _SupportsArray[dtype[numpy.bool[builtins.bool]]] | _NestedSequence[_SupportsArray[dtype[numpy.bool[builtins.bool]]]] | builtins.bool | _NestedSequence[builtins.bool] | None = ..., casting: Literal['no', 'equiv', 'safe', 'same_kind', 'unsafe'] = ..., order: Literal['K', 'A', 'C', 'F'] | None = ..., subok: bool = ..., signature: tuple[str | None, str | None, str | None] | str | None = ...) -> ndarray[tuple[Any, ...], dtype[Any]]
+ hydpy/models/whmod/whmod_derived.py:100: note:     def __call__(Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], Buffer | _SupportsArray[dtype[Any]] | _NestedSequence[_SupportsArray[dtype[Any]]] | complex | bytes | str | _NestedSequence[complex | bytes | str], /, out: ndarray[tuple[Any, ...], dtype[generic[Any]]] | tuple[ndarray[tuple[Any, ...], dtype[generic[Any]]]] | None = ..., *, dtype: type[Any] | dtype[Any] | _SupportsDType[dtype[Any]] | tuple[Any, Any] | list[Any] | _DTypeDict | str | None = ..., where: _SupportsArray[dtype[numpy.bool[builtins.bool]]] | _NestedSequence[_SupportsArray[dtype[numpy.bool[builtins.bool]]]] | builtins.bool | _NestedSequence[builtins.bool] | None = ..., casting: Literal['no', 'equiv', 'safe', 'same_kind', 'unsafe'] = ..., order: Literal['K', 'A', 'C', 'F'] | None = ..., subok: bool = ..., signature: tuple[str | None, str | None, str | None] | str | None = ...) -> ndarray[tuple[Any, ...], dtype[Any]] | Any
+ hydpy/models/conv/conv_model.py:694: error: "Parameter" has no attribute "nodes"  [attr-defined]
+ hydpy/models/conv/conv_model.py:697: error: "Parameter" has no attribute "nodes"  [attr-defined]
+ hydpy/models/conv/conv_model.py:703: error: "Parameter" has no attribute "nodes"  [attr-defined]

spark (https://github.com/apache/spark)
+ python/pyspark/find_spark_home.py:59: error: Item "None" of "ModuleSpec | None" has no attribute "origin"  [union-attr]
+ python/pyspark/find_spark_home.py:59: error: Argument 1 to "dirname" has incompatible type "str | Any | None"; expected "PathLike[Any]"  [arg-type]
+ python/pyspark/rddsampler.py:37: error: Item "None" of "Any | None" has no attribute "random"  [union-attr]
+ python/pyspark/rddsampler.py:45: error: Item "None" of "Any | None" has no attribute "random"  [union-attr]
+ python/pyspark/rddsampler.py:49: error: Item "None" of "Any | None" has no attribute "random"  [union-attr]
+ python/pyspark/rddsampler.py:52: error: Item "None" of "Any | None" has no attribute "expovariate"  [union-attr]
+ python/pyspark/rddsampler.py:56: error: Item "None" of "Any | None" has no attribute "expovariate"  [union-attr]
+ python/pyspark/install.py:101: error: Item "None" of "Match[str] | None" has no attribute "group"  [union-attr]
+ python/pyspark/install.py:102: error: Item "None" of "Match[str] | None" has no attribute "group"  [union-attr]
+ python/pyspark/join.py:114: error: Need type annotation for "bufs"  [var-annotated]
+ python/pyspark/serializers.py:362: error: "namedtuple()" expects a string literal as the first argument  [misc]
+ python/pyspark/serializers.py:362: error: Name "cls" already defined on line 360  [no-redef]
+ python/pyspark/serializers.py:364: error: "None" not callable  [misc]
+ python/pyspark/serializers.py:392: error: Name "_old_namedtuple" is not defined  [name-defined]
+ python/pyspark/serializers.py:393: error: Name "_old_namedtuple_kwdefaults" is not defined  [name-defined]
+ python/pyspark/serializers.py:396: error: Name "_old_namedtuple_kwdefaults" is not defined  [name-defined]
+ python/pyspark/serializers.py:398: error: Name "_old_namedtuple" is not defined  [name-defined]
+ python/pyspark/serializers.py:404: error: Name "_old_namedtuple_kwdefaults" is not defined  [name-defined]
+ python/pyspark/serializers.py:405: error: Name "_old_namedtuple" is not defined  [name-defined]
+ python/pyspark/serializers.py:408: error: "def namedtuple(typename: str, field_names: str | Iterable[str], *, rename: bool = ..., module: str | None = ..., defaults: Iterable[Any] | None = ...) -> type[tuple[Any, ...]]" has no attribute "__hijack"  [attr-defined]
+ python/pyspark/pandas/missing/__init__.py:36: error: "property" used with a non-method  [misc]
+ python/pyspark/pandas/missing/__init__.py:42: error: "property" used with a non-method  [misc]
+ python/pyspark/shuffle.py:490: error: Need type annotation for "chunks" (hint: "chunks: list[<type>] = ...")  [var-annotated]
+ python/pyspark/shuffle.py:580: error: Item "None" of "Any | None" has no attribute "write"  [union-attr]
+ python/pyspark/shuffle.py:591: error: Item "None" of "Any | None" has no attribute "load_stream"  [union-attr]
+ python/pyspark/shuffle.py:629: error: Item "None" of "Any | None" has no attribute "tell"  [union-attr]
+ python/pyspark/shuffle.py:630: error: Item "None" of "Any | None" has no attribute "dump_stream"  [union-attr]
+ python/pyspark/shuffle.py:633: error: Item "None" of "Any | None" has no attribute "tell"  [union-attr]
+ python/pyspark/sql/pandas/serializers.py:543: error: Signature of "arrow_to_pandas" incompatible with supertype "ArrowStreamPandasSerializer"  [override]
+ python/pyspark/sql/pandas/serializers.py:543: note:      Superclass:
+ python/pyspark/sql/pandas/serializers.py:543: note:          def arrow_to_pandas(self, arrow_column: Any, idx: Any, struct_in_pandas: Any = ..., ndarray_as_list: Any = ..., spark_type: Any = ...) -> Any
+ python/pyspark/sql/pandas/serializers.py:543: note:      Subclass:
+ python/pyspark/sql/pandas/serializers.py:543: note:          def arrow_to_pandas(self, arrow_column: Any, idx: Any) -> Any
+ python/pyspark/sql/pandas/serializers.py:1645: error: Need type annotation for "state_pdfs" (hint: "state_pdfs: list[<type>] = ...")  [var-annotated]
+ python/pyspark/sql/pandas/serializers.py:1776: error: Incompatible types in assignment (expression has type "float", variable has type "int")  [assignment]
+ python/pyspark/sql/pandas/serializers.py:1811: error: Item "None" of "Any | None" has no attribute "__iter__" (not iterable)  [union-attr]
+ python/pyspark/sql/pandas/serializers.py:1951: error: Item "None" of "Any | None" has no attribute "__iter__" (not iterable)  [union-attr]
+ python/pyspark/sql/pandas/serializers.py:1955: error: Item "None" of "Any | None" has no attribute "__iter__" (not iterable)  [union-attr]
+ python/pyspark/sql/pandas/serializers.py:2050: error: Item "None" of "Any | None" has no attribute "__iter__" (not iterable)  [union-attr]
+ python/pyspark/pandas/utils.py:604: error: "property" used with a non-method  [misc]
+ python/pyspark/pandas/plot/core.py:185: error: Argument 1 to "len" has incompatible type "ndarray[Any, Any] | generic[Any]"; expected "Sized"  [arg-type]
+ python/pyspark/pandas/plot/core.py:220: error: Value of type "ndarray[Any, Any] | generic[Any]" is not indexable  [index]
+ python/pyspark/pandas/plot/core.py:222: error: Value of type "ndarray[Any, Any] | generic[Any]" is not indexable  [index]
+ python/pyspark/pandas/plot/core.py:222: error: Argument 1 to "len" has incompatible type "ndarray[Any, Any] | generic[Any]"; expected "Sized"  [arg-type]
+ python/pyspark/pandas/plot/core.py:223: error: Value of type "ndarray[Any, Any] | generic[Any]" is not indexable  [index]
+ python/pyspark/pandas/plot/core.py:300: error: Argument 1 to "len" has incompatible type "ndarray[Any, Any] | generic[Any]"; expected "Sized"  [arg-type]
+ python/pyspark/pandas/plot/core.py:542: error: Name "module" already defined (by an import)  [no-redef]
+ python/pyspark/pandas/frame.py:596: error: Need type annotation for "data_df"  [var-annotated]
+ python/pyspark/pandas/series.py:421: error: Unused "type: ignore" comment  [unused-ignore]
+ python/pyspark/pandas/series.py:422: error: Unused "type: ignore" comment  [unused-ignore]
+ python/pyspark/pandas/series.py:445: error: Need type annotation for "anchor"  [var-annotated]
+ python/pyspark/pandas/series.py:466: error: No overload variant of "Series" matches argument types "Any", "Any", "Any", "Any", "Any", "Any"  [call-overload]
+ python/pyspark/pandas/series.py:466: note: Possible overload variants:
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[Never], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[Any]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[list[str]], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[list[str]]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[str], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: ExtensionDtype | str | dtype[generic[Any]] | type[complex] | type[bool] | type[object] | type[str] | None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[str]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta, HashableT1: Hashable] __new__(cls, data: DatetimeIndex | Sequence[datetime64[date | int | None] | datetime | date] | dict[HashableT1, datetime64[date | int | None] | datetime | date] | datetime64[date | int | None] | datetime | date, index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: Literal['datetime64[s, UTC]', 'datetime64[ms, UTC]', 'datetime64[us, UTC]', 'datetime64[ns, UTC]'] | DatetimeTZDtype | Literal['datetime64[s]', 'datetime64[ms]', 'datetime64[us]', 'datetime64[ns]', 'M8[s]', 'M8[ms]', 'M8[us]', 'M8[ns]', '<M8[s]', '<M8[ms]', '<M8[us]', '<M8[ns]'] | Literal['date32[pyarrow]', 'date64[pyarrow]', 'timestamp[s][pyarrow]', 'timestamp[ms][pyarrow]', 'timestamp[us][pyarrow]', 'timestamp[ns][pyarrow]'] = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[Timestamp]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[datetime | timedelta64[timedelta | int | None]] | ndarray[tuple[Any, ...], dtype[datetime64[date | int | None]]] | DatetimeArray, index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., *, dtype: Literal['datetime64[s, UTC]', 'datetime64[ms, UTC]', 'datetime64[us, UTC]', 'datetime64[ns, UTC]'] | DatetimeTZDtype | Literal['datetime64[s]', 'datetime64[ms]', 'datetime64[us]', 'datetime64[ns]', 'M8[s]', 'M8[ms]', 'M8[us]', 'M8[ns]', '<M8[s]', '<M8[ms]', '<M8[us]', '<M8[ns]'] | Literal['date32[pyarrow]', 'date64[pyarrow]', 'timestamp[s][pyarrow]', 'timestamp[ms][pyarrow]', 'timestamp[us][pyarrow]', 'timestamp[ns][pyarrow]'], name: Hashable = ..., copy: bool | None = ...) -> Series[Timestamp]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | dict[str, ndarray[tuple[Any, ...], dtype[Any]]] | SequenceNotStr[Any], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., *, dtype: CategoricalDtype | Literal['category'], name: Hashable = ..., copy: bool | None = ...) -> Series[CategoricalDtype]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: PeriodIndex | Sequence[Period], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: PeriodDtype = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[Period]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[BaseOffset], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: PeriodDtype = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[BaseOffset]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta, HashableT1: Hashable] __new__(cls, data: TimedeltaIndex | Sequence[timedelta64[timedelta | int | None] | timedelta] | dict[HashableT1, timedelta64[timedelta | int | None] | timedelta] | timedelta64[timedelta | int | None] | timedelta, index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: Literal['timedelta64[s]', 'timedelta64[ms]', 'timedelta64[us]', 'timedelta64[ns]', 'm8[s]', 'm8[ms]', 'm8[us]', 'm8[ns]', '<m8[s]', '<m8[ms]', '<m8[us]', '<m8[ns]'] | Literal['duration[s][pyarrow]', 'duration[ms][pyarrow]', 'duration[us][pyarrow]', 'duration[ns][pyarrow]'] = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[Timedelta]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta, _OrderableT: int | float | Timestamp | Timedelta, HashableT1: Hashable] __new__(cls, data: IntervalIndex[Interval[_OrderableT]] | Interval[_OrderableT] | Sequence[Interval[_OrderableT]] | dict[HashableT1, Interval[_OrderableT]], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: Literal['Interval'] = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[Interval[_OrderableT]]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta, HashableT1: Hashable] __new__(cls, data: str | bytes | date | datetime | timedelta | <16 more items> | None, index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., *, dtype: type[S1], name: Hashable = ..., copy: bool | None = ...) -> Series[S1]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[builtins.bool | numpy.bool[builtins.bool]], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: Literal['bool', 'boolean', '?', 'b1', 'bool_', 'bool[pyarrow]', 'boolean[pyarrow]'] | type[builtins.bool] | BooleanDtype | type[numpy.bool[builtins.bool]] | None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[bool]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[int | integer[Any]], index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: Literal['int', 'Int8', 'Int16', 'Int32', 'Int64', 'b', 'i1', 'int8', 'byte', 'h', 'i2', 'int16', 'short', 'i', 'i4', 'int32', 'intc', 'l', 'i8', 'int64', 'int_', 'long', 'q', 'longlong', 'p', 'intp', 'int8[pyarrow]', 'int16[pyarrow]', 'int32[pyarrow]', 'int64[pyarrow]', 'UInt8', 'UInt16', 'UInt32', 'UInt64', 'B', 'u1', 'uint8', 'ubyte', 'H', 'u2', 'uint16', 'ushort', 'I', 'u4', 'uint32', 'uintc', 'L', 'u8', 'uint', 'ulong', 'uint64', 'Q', 'ulonglong', 'P', 'uintp', 'uint8[pyarrow]', 'uint16[pyarrow]', 'uint32[pyarrow]', 'uint64[pyarrow]'] | type[int] | Int8Dtype | Int16Dtype | Int32Dtype | Int64Dtype | <14 more items> | None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[int]
+ python/pyspark/pandas/series.py:466: note:     def [S1: str | bytes | bool | int | float | complex | <6 more items> | type[str] | list[str] | date | time | datetime | timedelta] __new__(cls, data: Sequence[float | floating[Any]] | ndarray[tuple[Any, ...], dtype[floating[Any]]] | FloatingArray, index: Mapping[Any, Any] | ExtensionArray | ndarray[tuple[Any, ...], dtype[Any]] | Index[Any] | Series[Any] | SequenceNotStr[Any] | range | KeysView[Any] | None = ..., dtype: None = ..., name: Hashable = ..., copy: bool | None = ...) -> Series[float]

... (truncated 79 lines) ...

prefect (https://github.com/PrefectHQ/prefect)
+ src/prefect/_vendor/croniter/croniter.py:44: error: Module has no attribute "maxint"  [attr-defined]
+ src/prefect/_vendor/croniter/croniter.py:905: error: Need type annotation for "nth_weekday_of_month" (hint: "nth_weekday_of_month: dict[<type>, <type>] = ...")  [var-annotated]
+ src/prefect/_vendor/croniter/croniter.py:959: error: Incompatible types in assignment (expression has type "str | Any", variable has type "int | None")  [assignment]
+ src/prefect/_vendor/croniter/croniter.py:1076: error: List comprehension has incompatible type List[str]; expected List[int]  [misc]
+ src/prefect/_vendor/croniter/croniter.py:1091: error: Incompatible types in assignment (expression has type "int", variable has type "str")  [assignment]
+ src/prefect/_vendor/croniter/croniter.py:1112: error: Incompatible types in assignment (expression has type "set[Any]", variable has type "list[Any]")  [assignment]
+ src/prefect/_vendor/croniter/croniter.py:1317: error: Incompatible types in assignment (expression has type "type[float]", variable has type "type[datetime]")  [assignment]
+ src/prefect/_internal/concurrency/cancellation.py:148: error: Incompatible types in assignment (expression has type "float | None", variable has type "None")  [assignment]
+ src/prefect/_internal/concurrency/cancellation.py:150: error: Incompatible types in assignment (expression has type "float", variable has type "None")  [assignment]
+ src/prefect/_internal/concurrency/calls.py:98: error: "None" has no attribute "add_cancel_callback"  [attr-defined]
- src/prefect/utilities/visualization.py:116: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/utilities/visualization.py:117: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/utilities/visualization.py:118: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_internal/concurrency/services.py:442: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ src/prefect/runner/server.py:76: error: Value of type "Coroutine[Any, Any, R?]" must be used  [unused-coroutine]
+ src/prefect/runner/server.py:76: note: Are you missing an await?
- src/prefect/client/subscriptions.py:92: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/client/subscriptions.py:95: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:77: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:78: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:79: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:82: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:83: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/task_runs.py:84: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:81: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:82: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:83: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:86: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:87: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/_waiters.py:88: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ src/prefect/filesystems.py:200: error: Argument 1 to "relative_to" of "PurePath" has incompatible type "str | None"; expected "str | PathLike[str]"  [arg-type]
+ src/prefect/filesystems.py:287: error: Argument 1 to "relative_to" of "PurePath" has incompatible type "str | None"; expected "str | PathLike[str]"  [arg-type]
- src/prefect/task_runners.py:70: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/deployments/runner.py:321: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
- src/prefect/server/services/task_run_recorder.py:408: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
+ src/prefect/flow_engine.py:705: error: Incompatible types in assignment (expression has type "float | int", variable has type "int | None")  [assignment]
+ src/prefect/flow_engine.py:737: error: Item "Abort" of "Abort | Pause" has no attribute "state"  [union-attr]
+ src/prefect/flow_engine.py:795: error: Item "None" of "FlowRun | None" has no attribute "name"  [union-attr]
+ src/prefect/flow_engine.py:812: error: R? has no attribute "__await__"  [attr-defined]
+ src/prefect/flow_engine.py:812: error: Argument 2 to "call_with_parameters" has incompatible type "dict[str, Any] | None"; expected "dict[str, Any]"  [arg-type]
+ src/prefect/flow_engine.py:1293: error: Incompatible types in assignment (expression has type "float | int", variable has type "int | None")  [assignment]
+ src/prefect/flow_engine.py:1325: error: Item "Abort" of "Abort | Pause" has no attribute "state"  [union-attr]
+ src/prefect/flow_engine.py:1385: error: Item "None" of "FlowRun | None" has no attribute "name"  [union-attr]
+ src/prefect/task_engine.py:331: error: "Logger" has no attribute "extra"  [attr-defined]
+ src/prefect/task_engine.py:333: error: Item "None" of "TaskRun | None" has no attribute "name"  [union-attr]
+ src/prefect/task_engine.py:335: error: Item "None" of "TaskRun | None" has no attribute "name"  [union-attr]

... (truncated 3560 lines) ...
```
