2 changes: 2 additions & 0 deletions src/rfdetr/datasets/_develop.py
@@ -21,6 +21,8 @@
from typing import TYPE_CHECKING, Any, Generator, Optional, Tuple
from urllib.request import urlretrieve

import torch
Copilot AI Feb 22, 2026

Adding a module-level `torch` import conflicts with the documented design intent of `_SimpleDataset`. The class docstring explicitly states that it "does not pull in torch at module load time" (lines 44-46), and `torch` was intentionally imported locally inside `__getitem__` (line 72) to keep the import lazy.

While the module-level import satisfies linters for the `torch.Tensor` type hint on line 70, it breaks the lazy-loading behavior. Consider one of these alternatives:

1. Use a string literal for the type hint, `def __getitem__(self, idx: int) -> Tuple["torch.Tensor", dict]:`, combined with `from __future__ import annotations` (already present).
2. Import `torch` only for type checkers: `if TYPE_CHECKING: import torch`.
3. Update the docstring to state that `torch` is now imported at module load time.

Since this file already uses `from __future__ import annotations` (line 13), option 1 (string literal) would preserve the lazy-loading behavior while satisfying linters.


from rfdetr.util.logger import get_logger

logger = get_logger()