releasing 1.6.2 [rebase & merge] #884

Merged: Borda merged 6 commits into `release/stable` from `releasing/1.6.2` on Mar 27, 2026
Conversation

@Borda (Member) commented Mar 27, 2026

🚀 Added

- `RFDETR.predict()` now accepts an optional `shape=(height, width)` parameter for non-square inference (#682):

  ```python
  detections = model.predict("image.jpg", shape=(480, 640))
  ```

🌱 Changed

- `RFDETR.train()` and `ModelConfig.device` now accept `torch.device` objects and indexed device strings such as `"cuda:1"` (#872):

  ```python
  from rfdetr import RFDETRSmall
  from torch import device

  model = RFDETRSmall(...)

  model.train(..., device=device("cuda:1"))
  model.train(..., device="cuda:0")
  ```

🔧 Fixed

- ONNX export for dynamic batch dimensions
- ONNX export shape validation and `patch_size` resolution (#876)
🏆 Contributors

Welcome to our new contributors, and thank you to everyone who helped with this release:

  • zhaoshuo (@zhaoshuo1223) — ONNX export shape validation and patch_size fixes
  • Sven Goluza (@svengoluza) — ONNX export dynamic batch fix
  • Jirka Borovec (@Borda) — shape inference, torch.device support, release coordination

Full changelog: 1.6.1...1.6.2

svengoluza and others added 5 commits March 27, 2026 10:55
* Fix ONNX export for dynamic batch dimensions
* fix: restore Python int pairs for gen_encoder_output_proposals
* test: add dynamic_batch coverage for CLI and RFDETR.export()
* fix: replace H_.expand(N_) with torch.full for Python int spatial dims
* Apply suggestions from code review

---------

Co-authored-by: jirka <6035284+Borda@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
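The `expand` → `torch.full` fix in the commit above can be sketched as follows; the function name here is hypothetical, and the real change lives inside `gen_encoder_output_proposals`:

```python
import torch

def valid_height_per_batch(batch_size: int, height: int) -> torch.Tensor:
    # Sketch of the fix: when masks are None, the spatial height is a
    # plain Python int, so `height.expand(batch_size)` fails (ints have
    # no .expand) and traces badly under ONNX dynamic-batch export.
    # torch.full builds the per-batch tensor from the static int directly.
    return torch.full((batch_size,), height, dtype=torch.long)

print(valid_height_per_batch(4, 60))  # tensor([60, 60, 60, 60])
```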
… `RFDETR.train()` (#872)

* feat: accept torch.device and indexed device strings
* fix: normalize device input in ModelConfig validator
* test: cover torch.device and indexed train device mapping
* fix: raise ValueError not TypeError; warn on unmapped device
* refactor: extract private helpers for device parsing
* refactor: simplify private train device helper call
* test: cover train device invalid and unmapped branches

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: Borda <6035284+Borda@users.noreply.github.com>
Co-authored-by: Claude Code <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
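A minimal sketch of the device handling described above, under stated assumptions: the helper names `normalize_device` and `to_ptl_args` are hypothetical stand-ins for rfdetr's private helpers, which may differ in detail:

```python
import warnings

def normalize_device(device):
    # a torch.device stringifies to e.g. "cuda:1", so str() covers it
    # here without importing torch in this sketch
    if not isinstance(device, str):
        device = str(device)
    return device

def to_ptl_args(device):
    """Map a device spec to (accelerator, devices) for PyTorch Lightning."""
    base, _, index = normalize_device(device).partition(":")
    if base == "cuda":
        # an indexed string like "cuda:1" selects that specific GPU
        return "gpu", [int(index)] if index else 1
    if base in ("cpu", "mps"):
        if index:
            # per the commit above: warn (do not fail) on an unmapped index
            warnings.warn(f"device index {index!r} ignored for {base!r}")
        return base, 1
    # per the commit above: invalid devices raise ValueError, not TypeError
    raise ValueError(f"unrecognized device: {device!r}")

print(to_ptl_args("cuda:1"))  # ('gpu', [1])
```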
* feat(predict): add shape parameter for non-square inference (#682)

RFDETR.predict() previously ignored a shape kwarg and always resized
inputs to the square (resolution, resolution) — making it impossible to
run inference matching a non-square ONNX export. Add an explicit
shape: Optional[tuple[int, int]] parameter that, when provided, overrides
the default square resize, consistent with the existing export(shape=…) API.

Also fix the _optimized_resolution mismatch guard to compare both height
and width (previously only height was checked), and validate that any
provided shape has both dimensions divisible by 14 (same constraint as
export()).

- Add Raises: section to predict() docstring covering all four
  ValueError paths introduced by the shape parameter
- Document the divisible-by-14 constraint in the shape param description
- Reject bool values in shape validation (bool is a subclass of int
  but semantically invalid as a dimension)
- Add blank line between shape validation block and the inference
  optimisation warning for readability
- Remove stale # type: ignore[misc] — Optional[tuple[int, int]] is
  sufficiently typed after the `if shape is not None:` guard
- Add 8 parametrized test cases covering float dims, wrong arity
  (1- and 3-element), zero/negative, and bool dimensions
- Add CHANGELOG entry under [Unreleased] ### Added (closes #682)
- Add match="shape" to test_predict_shape_invalid_raises so any
  ValueError unrelated to the shape parameter cannot silently pass
- Add negative_width case (-14 width) to complete the negative-dim
  symmetry with existing negative_height
- Parametrize test_predict_shape_not_divisible_by_14_raises to cover
  height-not-divisible path alongside the existing width path
- Apply ruff type-annotation modernisation (Optional[...] → ... | None,
  bare tuple → tuple[int | float | bool, ...])
- Add support for integer-like types (e.g., numpy, torch) via the `__index__` protocol.
- Improve shape validation error messages and clarify bool rejection.
- Update test coverage to ensure acceptance of integer-like dimensions.

---------

Co-authored-by: OpenAI Codex <codex@openai.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
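The validation rules listed above can be sketched as a single helper; the name mirrors `_validate_shape_dims` from the diff, but the body and error messages here are illustrative, not the library's actual code:

```python
import operator

def validate_shape_dims(shape, divisor: int = 14) -> tuple:
    # must be an (H, W) pair
    if not isinstance(shape, tuple) or len(shape) != 2:
        raise ValueError(f"shape must be an (H, W) tuple, got {shape!r}")
    dims = []
    for dim in shape:
        # bool is a subclass of int but is semantically invalid here
        if isinstance(dim, bool):
            raise ValueError(f"shape dims must not be bool, got {dim!r}")
        try:
            # __index__ admits int-like types (numpy/torch scalars)
            dim = operator.index(dim)
        except TypeError:
            raise ValueError(f"shape dims must be integers, got {dim!r}") from None
        if dim <= 0:
            raise ValueError(f"shape dims must be positive, got {dim}")
        if dim % divisor:
            raise ValueError(f"shape dims must be divisible by {divisor}, got {dim}")
        dims.append(dim)
    return tuple(dims)

print(validate_shape_dims((476, 644)))  # (476, 644)
```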
…876)

- Resolve patch_size from model_config by default (with fallback) and ensure
  explicit values match configuration across both export() and predict()
- Validate patch_size strictly:
  - must be a positive integer
  - reject bool values (True/False)
- Use block_size = patch_size * num_windows for all shape divisibility checks,
  matching backbone expectations

- Add robust shape validation:
  - enforce (H, W) arity and reject invalid types (bool, float)
  - require positive dimensions (prevent modulo passing negative/zero values)
  - ensure divisibility by block_size
  - validate default resolution when shape is not provided

- Improve error handling with clear, consistent ValueError messages
  across all validation paths

- Refactor validation logic:
  - extract _validate_shape_dims() for shared shape checks
  - extract _resolve_patch_size() for consistent patch_size handling
  - simplify export() and predict() by delegating to shared helpers

- Expand and clean up test coverage:
  - cover patch_size resolution, mismatch, and invalid values
  - cover shape validation (arity, type, positivity, divisibility)
  - add regression tests for rectangular input shapes
  - validate num_windows interactions and default resolution behavior
  - simplify and reorganize test structure for clarity

- Update docstrings to document patch_size behavior and validation rules

---------

Co-authored-by: jirka <6035284+Borda@users.noreply.github.com>
Co-authored-by: Claude Code <noreply@anthropic.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
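The resolution and block-size flow described above can be sketched like this; helper names mirror `_resolve_patch_size` from the diff, while the default of 14 and everything else in the bodies are assumptions for illustration:

```python
def resolve_patch_size(config_patch_size, explicit=None, default: int = 14):
    # resolve from model config by default, with a fallback
    resolved = config_patch_size if config_patch_size is not None else default
    # strict validation: positive int, bool rejected explicitly
    if isinstance(resolved, bool) or not isinstance(resolved, int) or resolved <= 0:
        raise ValueError(f"patch_size must be a positive int, got {resolved!r}")
    # an explicitly passed value must match the configuration
    if explicit is not None and explicit != resolved:
        raise ValueError(
            f"patch_size={explicit!r} does not match model config ({resolved})"
        )
    return resolved

def check_block_divisibility(height, width, patch_size, num_windows):
    # all divisibility checks use block_size = patch_size * num_windows,
    # matching the backbone's windowed-attention expectations
    block_size = patch_size * num_windows
    for name, dim in (("height", height), ("width", width)):
        if dim <= 0 or dim % block_size:
            raise ValueError(f"{name}={dim} must be a positive multiple of {block_size}")

print(resolve_patch_size(None))            # 14
check_block_divisibility(448, 560, 14, 4)  # 448 = 56*8, 560 = 56*10 -> OK
```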
* add LitServe deployment tutorial card to tutorials index
* Apply suggestions from code review

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: Borda <6035284+Borda@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Borda force-pushed the releasing/1.6.2 branch from d0e7c63 to a57e420 on March 27, 2026 11:20

codecov bot commented Mar 27, 2026

Codecov Report

❌ Patch coverage is 98.03922% with 2 lines in your changes missing coverage. Please review.
✅ Project coverage is 76%. Comparing base (bddf21f) to head (ce998f7).
⚠️ Report is 6 commits behind head on release/stable.

Additional details and impacted files

```
@@              Coverage Diff              @@
##           release/stable   #884   +/-   ##
=============================================
  Coverage              76%    76%
=============================================
  Files                  92     92
  Lines                7158   7241   +83
=============================================
+ Hits                 5406   5489   +83
  Misses               1752   1752
```

Borda force-pushed the releasing/1.6.2 branch from 29bf301 to e5ecd2a on March 27, 2026 12:28
Borda marked this pull request as ready for review March 27, 2026 12:38
Copilot AI review requested due to automatic review settings March 27, 2026 12:38
Borda requested a review from isaacrob as a code owner March 27, 2026 12:38
Copilot AI (Contributor) left a comment

Pull request overview

Release prep for rfdetr==1.6.2, expanding inference/export shape handling, improving device selection for training, and fixing ONNX export edge-cases (dynamic batch + patch/window divisibility).

Changes:

  • Add/enable RFDETR.predict(shape=(H, W)) with strict shape validation and correct non-square resizing behavior.
  • Accept torch.device and indexed device strings (e.g. "cuda:1") in ModelConfig.device and map RFDETR.train(device=...) to PTL accelerator/devices.
  • Fix ONNX export correctness: dynamic-batch dynamic_axes, patch/window divisibility validation, and tracing robustness for dynamic batch dimensions.

Reviewed changes

Copilot reviewed 14 out of 14 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| src/rfdetr/detr.py | Adds shared shape/patch-size validation helpers; implements predict(shape=...), patch/window divisibility checks, dynamic batch export, and improved device-to-PTL mapping in train(). |
| src/rfdetr/config.py | Normalizes device via a Pydantic validator to support torch.device and indexed device strings. |
| src/rfdetr/export/main.py | Fixes rectangular Resize((H, W)) and forwards dynamic_axes when dynamic_batch is set. |
| src/rfdetr/models/transformer.py | Adjusts spatial shape handling to better support ONNX tracing and avoid dynamic-batch tracer failures. |
| src/rfdetr/models/backbone/projector.py | Fixes LayerNorm to use self.normalized_shape for traceability. |
| tests/models/test_predict.py | Adds regression coverage for predict(shape=...) resizing and for patch/window-based shape validation. |
| tests/models/test_validate_shape_dims.py | New focused unit tests for _validate_shape_dims and _resolve_patch_size. |
| tests/models/test_export.py | Adds tests for dynamic batch export forwarding and patch/window shape validation; adds regression for rectangular infer-image creation. |
| tests/models/test_transformer.py | Regression test ensuring gen_encoder_output_proposals handles Python-int spatial shapes when masks=None. |
| tests/models/test_config.py | Adds tests for ModelConfig.device accepting indexed strings and torch.device. |
| tests/training/test_detr_shim.py | Updates training shim tests to reflect new device forwarding/mapping behavior and warnings/errors. |
| CHANGELOG.md | Adds 1.6.2 release notes. |
| pyproject.toml | Bumps package version to 1.6.2. |
| docs/tutorials/index.md | Adds LitServe deployment tutorial link to tutorials index. |

Borda force-pushed the releasing/1.6.2 branch from 91b809a to a06ff08 on March 27, 2026 12:48
Borda force-pushed the releasing/1.6.2 branch from a06ff08 to ce998f7 on March 27, 2026 14:28
Borda merged commit a35e680 into release/stable on Mar 27, 2026
23 of 25 checks passed
Borda deleted the releasing/1.6.2 branch March 27, 2026 16:11
Borda self-assigned this Mar 27, 2026