refactor: add missing explicit (non-dynamic) imports to silence linter errors (#719)
anatoly-ryabchenko wants to merge 2 commits into roboflow:develop
Conversation
….py to silence linter errors
Codecov Report

✅ All modified and coverable lines are covered by tests.
❌ Your project check has failed because the head coverage (65%) is below the target coverage (95%). You can increase the head coverage or adjust the target coverage.

@@ Coverage Diff @@
##           develop   #719   +/-  ##
=====================================
  Coverage       65%    65%
=====================================
  Files           56     56
  Lines         7208   7210     +2
=====================================
+ Hits          4708   4711     +3
+ Misses        2500   2499     -1
Pull request overview

This PR adds explicit imports for ast in main.py and torch in datasets/_develop.py to resolve linter errors. While these imports are technically used in both files, adding the module-level torch import in _develop.py conflicts with the intentional lazy-loading design of the _SimpleDataset class.

Changes:
- Added import ast to src/rfdetr/main.py (used for ast.literal_eval in argument parsing)
- Added import torch to src/rfdetr/datasets/_develop.py (needed for the torch.Tensor type hint)
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| src/rfdetr/main.py | Adds missing ast import needed for ast.literal_eval call on line 1016 |
| src/rfdetr/datasets/_develop.py | Adds module-level torch import for type hint, but conflicts with lazy loading design |
from typing import Any, Generator, Optional, Tuple
from urllib.request import urlretrieve

+ import torch
Adding a module-level torch import conflicts with the documented design intention of _SimpleDataset. The class docstring explicitly states that it "does not pull in torch at module load time" (lines 44-46), and torch was intentionally imported locally within __getitem__ (line 72) to achieve lazy loading.
While the module-level import satisfies linters for the torch.Tensor type hint on line 70, it breaks the lazy loading behavior. Consider one of these alternatives:
- Use a string literal for the type hint: def __getitem__(self, idx: int) -> Tuple["torch.Tensor", dict], combined with from __future__ import annotations (already present)
- Import torch conditionally using TYPE_CHECKING: if TYPE_CHECKING: import torch
- Update the docstring to reflect that torch is now imported at module load time
Since this file already uses from __future__ import annotations (line 13), option 1 (string literal) would preserve the lazy loading behavior while satisfying linters.
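To illustrate the pattern under discussion, here is a minimal, runnable sketch of the TYPE_CHECKING approach combined with a lazy in-method import. The class and method names are hypothetical stand-ins for _SimpleDataset.__getitem__, and the stdlib fractions module stands in for torch so the example runs without torch installed:

```python
from __future__ import annotations

import sys
from typing import TYPE_CHECKING, Tuple

if TYPE_CHECKING:
    # Evaluated only by type checkers (mypy, PyCharm, etc.), never at runtime,
    # so the linter sees the name without the module being loaded.
    import fractions


class _LazyDataset:
    """Sketch of the lazy-loading pattern; `fractions` stands in for torch."""

    def __init__(self, items):
        self._items = items

    def __getitem__(self, idx: int) -> Tuple["fractions.Fraction", dict]:
        # Local import: the heavy dependency loads on first item access,
        # not when the module itself is imported.
        import fractions

        return fractions.Fraction(self._items[idx]), {}


ds = _LazyDataset([1, 2, 3])
loaded_before = "fractions" in sys.modules  # typically False before first access
value, meta = ds[0]
print(loaded_before, value, meta)
```

Because `from __future__ import annotations` makes all annotations lazily evaluated strings, the type hint never triggers an import at runtime; the dependency is only pulled in when `__getitem__` actually runs.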
I have accepted the license agreement several times, but it still shows up as not signed. This is my first PR for an open-source project, so I'm a bit new to this.
a39bebd to a6e6ca0
Could you please share the screenshot :)
What does this PR do?
import ast was missing from main.py
import torch was missing from datasets/_develop.py
This may not have caused runtime crashes, since the dependencies were imported upstream, but it is bad practice and produces linter errors. This PR fixes that.
Related Issue(s): N/A
Type of Change
Testing
Test details:
Linter errors disappear (tested with PyCharm); functionality is not affected (training, export).
Checklist
Additional Context