releasing 1.6.2 [rebase & merge] #884
Merged
Borda merged 6 commits into release/stable on Mar 27, 2026
Conversation
* Fix ONNX export for dynamic batch dimensions
  * fix: restore Python int pairs for gen_encoder_output_proposals
  * test: add dynamic_batch coverage for CLI and RFDETR.export()
  * fix: replace H_.expand(N_) with torch.full for Python int spatial dims
  * Apply suggestions from code review

  Co-authored-by: jirka <6035284+Borda@users.noreply.github.com>
  Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
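The `H_.expand(N_)` fix can be sketched in isolation. When `masks=None`, the spatial dims in `gen_encoder_output_proposals` arrive as plain Python ints, which have no `.expand()` method, so the tensor has to be built explicitly; `torch.full` does that and also traces cleanly for ONNX export with a dynamic batch dimension. This is a minimal stand-alone sketch using variable names from the commit message, not the actual rfdetr code:

```python
import torch

N_, H_ = 4, 50  # hypothetical batch size and spatial height (plain Python ints)

# The old tensor-style call `H_.expand(N_)` raises AttributeError here,
# because a Python int has no .expand(). torch.full builds the (N_,)
# constant tensor directly from the int values.
valid_H = torch.full((N_,), H_, dtype=torch.long)

print(valid_H.shape)  # torch.Size([4])
```

A side benefit for export: `torch.full` keeps the value a graph-time constant instead of routing it through tensor ops that the ONNX tracer may specialize to a fixed batch size.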
* Accept `torch.device` and indexed device strings in `ModelConfig` and `RFDETR.train()` (#872)
  * feat: accept torch.device and indexed device strings
  * fix: normalize device input in ModelConfig validator
  * test: cover torch.device and indexed train device mapping
  * fix: raise ValueError not TypeError; warn on unmapped device
  * refactor: extract private helpers for device parsing
  * refactor: simplify private train device helper call
  * test: cover train device invalid and unmapped branches

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: Borda <6035284+Borda@users.noreply.github.com>
  Co-authored-by: Claude Code <noreply@anthropic.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
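The device-normalization flow described above can be sketched as a stand-alone helper. Note this is hypothetical: `to_ptl_device` is not the real rfdetr function name, and the real implementation may differ; the sketch only shows one way to map `torch.device` objects and indexed strings onto a PyTorch Lightning-style `accelerator`/`devices` pair, raising `ValueError` (not `TypeError`) on invalid input as the commits describe.

```python
import re

def to_ptl_device(device) -> tuple[str, object]:
    """Map a device spec to an (accelerator, devices) pair.

    Hypothetical sketch: accepts "cpu", "cuda", "cuda:1", or anything
    whose str() looks like those (e.g. torch.device("cuda:1")).
    """
    name = str(device)  # torch.device("cuda:1") stringifies to "cuda:1"
    match = re.fullmatch(r"(cpu|cuda|mps)(?::(\d+))?", name)
    if match is None:
        # The commits raise ValueError (not TypeError) for invalid input.
        raise ValueError(f"Unsupported device: {name!r}")
    kind, index = match.group(1), match.group(2)
    if index is not None:
        return kind, [int(index)]  # an indexed string pins a specific device
    return kind, "auto"            # let Lightning pick within the backend

print(to_ptl_device("cuda:1"))  # ('cuda', [1])
```

Stringifying first is what lets `torch.device` objects and plain strings share one code path, since `str(torch.device("cuda:1"))` is exactly `"cuda:1"`.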
* feat(predict): add shape parameter for non-square inference (#682)

  RFDETR.predict() previously ignored a shape kwarg and always resized inputs to the square (resolution, resolution), making it impossible to run inference matching a non-square ONNX export. Add an explicit shape: Optional[tuple[int, int]] parameter that, when provided, overrides the default square resize, consistent with the existing export(shape=…) API. Also fix the _optimized_resolution mismatch guard to compare both height and width (previously only height was checked), and validate that any provided shape has both dimensions divisible by 14 (same constraint as export()).

  * Add a Raises: section to the predict() docstring covering all four ValueError paths introduced by the shape parameter
  * Document the divisible-by-14 constraint in the shape param description
  * Reject bool values in shape validation (bool is a subclass of int but semantically invalid as a dimension)
  * Add a blank line between the shape validation block and the inference optimisation warning for readability
  * Remove stale # type: ignore[misc]: Optional[tuple[int, int]] is sufficiently typed after the `if shape is not None:` guard
  * Add 8 parametrized test cases covering float dims, wrong arity (1- and 3-element), zero/negative, and bool dimensions
  * Add CHANGELOG entry under [Unreleased] ### Added (closes #682)
  * Add match="shape" to test_predict_shape_invalid_raises so any ValueError unrelated to the shape parameter cannot silently pass
  * Add negative_width case (-14 width) to complete the negative-dim symmetry with the existing negative_height
  * Parametrize test_predict_shape_not_divisible_by_14_raises to cover the height-not-divisible path alongside the existing width path
  * Apply ruff type-annotation modernisation (Optional[...] → ... | None, bare tuple → tuple[int | float | bool, ...])
  * Add support for integer-like types (e.g. numpy, torch) via the `__index__` protocol
  * Improve shape validation error messages and clarify bool rejection
  * Update test coverage to ensure acceptance of integer-like dimensions

  Co-authored-by: OpenAI Codex <codex@openai.com>
  Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
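The validation rules listed in that commit can be condensed into one stand-alone sketch (the helper name below is hypothetical; the real checks live inside `RFDETR.predict()`): require a 2-tuple, reject bools, accept any integer-like via `__index__`, and require positive dims divisible by 14.

```python
def validate_predict_shape(shape, divisor: int = 14) -> tuple[int, int]:
    """Validate an (H, W) inference shape. Hypothetical sketch."""
    if not isinstance(shape, tuple) or len(shape) != 2:
        raise ValueError(f"shape must be a (height, width) 2-tuple, got {shape!r}")
    dims = []
    for dim in shape:
        # bool is a subclass of int but is semantically invalid as a size,
        # so it must be rejected before the integer conversion below.
        if isinstance(dim, bool):
            raise ValueError(f"shape dims must not be bool, got {dim!r}")
        try:
            dim = dim.__index__()  # accepts numpy/torch integer-likes, rejects floats
        except AttributeError:
            raise ValueError(f"shape dims must be integers, got {dim!r}") from None
        if dim <= 0:
            raise ValueError(f"shape dims must be positive, got {dim}")
        if dim % divisor:
            raise ValueError(f"shape dims must be divisible by {divisor}, got {dim}")
        dims.append(dim)
    return (dims[0], dims[1])

print(validate_predict_shape((560, 784)))  # (560, 784)
```

Using `__index__` rather than `isinstance(dim, int)` is what makes `numpy.int64` or a 0-dim torch integer tensor acceptable while still rejecting floats, which lack that protocol.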
* fix: ONNX export shape ignored; validate divisibility by patch_size (#876)
  * Resolve patch_size from model_config by default (with fallback) and ensure explicit values match configuration across both export() and predict()
  * Validate patch_size strictly:
    * must be a positive integer
    * reject bool values (True/False)
  * Use block_size = patch_size * num_windows for all shape divisibility checks, matching backbone expectations
  * Add robust shape validation:
    * enforce (H, W) arity and reject invalid types (bool, float)
    * require positive dimensions (prevent modulo passing negative/zero values)
    * ensure divisibility by block_size
    * validate the default resolution when shape is not provided
  * Improve error handling with clear, consistent ValueError messages across all validation paths
  * Refactor validation logic:
    * extract _validate_shape_dims() for shared shape checks
    * extract _resolve_patch_size() for consistent patch_size handling
    * simplify export() and predict() by delegating to shared helpers
  * Expand and clean up test coverage:
    * cover patch_size resolution, mismatch, and invalid values
    * cover shape validation (arity, type, positivity, divisibility)
    * add regression tests for rectangular input shapes
    * validate num_windows interactions and default resolution behavior
    * simplify and reorganize test structure for clarity
  * Update docstrings to document patch_size behavior and validation rules

  Co-authored-by: jirka <6035284+Borda@users.noreply.github.com>
  Co-authored-by: Claude Code <noreply@anthropic.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
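The patch-size resolution and block-size rule above can be illustrated with a small sketch. The function names echo the helpers mentioned in the commit (`_resolve_patch_size`, `_validate_shape_dims`), but the bodies here are illustrative assumptions, not the rfdetr implementation:

```python
def resolve_patch_size(explicit, config_patch_size: int) -> int:
    """Sketch: prefer the model_config value; an explicit value must match it."""
    if explicit is None:
        return config_patch_size  # default: fall back to the configuration
    # bool must be checked first, since isinstance(True, int) is True.
    if isinstance(explicit, bool) or not isinstance(explicit, int) or explicit <= 0:
        raise ValueError(f"patch_size must be a positive int, got {explicit!r}")
    if explicit != config_patch_size:
        raise ValueError(
            f"patch_size {explicit} does not match model config ({config_patch_size})"
        )
    return explicit

def check_block_divisibility(shape: tuple[int, int], patch_size: int, num_windows: int) -> None:
    """Sketch: each dim must divide by patch_size * num_windows (the backbone block)."""
    block_size = patch_size * num_windows
    for dim in shape:
        # Positivity is checked first so a negative dim cannot slip
        # through the modulo test (e.g. -28 % 28 == 0).
        if dim <= 0 or dim % block_size:
            raise ValueError(
                f"shape dims must be positive and divisible by {block_size} "
                f"(patch_size {patch_size} x num_windows {num_windows}); got {dim}"
            )

check_block_divisibility((448, 672), patch_size=14, num_windows=2)  # 28 divides both dims
```

The key difference from the plain divisible-by-14 rule in predict() is the `num_windows` factor: with windowed attention the backbone consumes blocks of `patch_size * num_windows` pixels, so that product, not the patch size alone, is the granularity the shape must respect.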
* add LitServe deployment tutorial card to tutorials index
  * Apply suggestions from code review

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: Borda <6035284+Borda@users.noreply.github.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Codecov Report

❌ Patch coverage is

@@            Coverage Diff             @@
##          release/stable   #884   +/- ##
=============================================
  Coverage            76%    76%
=============================================
  Files                92     92
  Lines              7158   7241   +83
=============================================
+ Hits               5406   5489   +83
  Misses             1752   1752
Pull request overview
Release prep for rfdetr==1.6.2, expanding inference/export shape handling, improving device selection for training, and fixing ONNX export edge-cases (dynamic batch + patch/window divisibility).
Changes:
- Add/enable `RFDETR.predict(shape=(H, W))` with strict shape validation and correct non-square resizing behavior.
- Accept `torch.device` and indexed device strings (e.g. `"cuda:1"`) in `ModelConfig.device` and map `RFDETR.train(device=...)` to PTL `accelerator`/`devices`.
- Fix ONNX export correctness: dynamic-batch `dynamic_axes`, patch/window divisibility validation, and tracing robustness for dynamic batch dimensions.
Reviewed changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 1 comment.
Summary per file:

| File | Description |
|---|---|
| src/rfdetr/detr.py | Adds shared shape/patch-size validation helpers; implements predict(shape=...), patch/window divisibility checks, dynamic batch export, and improved device-to-PTL mapping in train(). |
| src/rfdetr/config.py | Normalizes device via a Pydantic validator to support torch.device and indexed device strings. |
| src/rfdetr/export/main.py | Fixes rectangular Resize((H, W)) and forwards dynamic_axes when dynamic_batch is set. |
| src/rfdetr/models/transformer.py | Adjusts spatial shape handling to better support ONNX tracing and avoid dynamic-batch tracer failures. |
| src/rfdetr/models/backbone/projector.py | Fixes LayerNorm to use self.normalized_shape for traceability. |
| tests/models/test_predict.py | Adds regression coverage for predict(shape=...) resizing and for patch/window-based shape validation. |
| tests/models/test_validate_shape_dims.py | New focused unit tests for _validate_shape_dims and _resolve_patch_size. |
| tests/models/test_export.py | Adds tests for dynamic batch export forwarding and patch/window shape validation; adds regression for rectangular infer-image creation. |
| tests/models/test_transformer.py | Regression test ensuring gen_encoder_output_proposals handles Python-int spatial shapes when masks=None. |
| tests/models/test_config.py | Adds tests for ModelConfig.device accepting indexed strings and torch.device. |
| tests/training/test_detr_shim.py | Updates training shim tests to reflect new device forwarding/mapping behavior and warnings/errors. |
| CHANGELOG.md | Adds 1.6.2 release notes. |
| pyproject.toml | Bumps package version to 1.6.2. |
| docs/tutorials/index.md | Adds LitServe deployment tutorial link to tutorials index. |
🚀 Added
- `RFDETR.predict(shape=...)`: pass an explicit `(height, width)` tuple to run inference at a non-square resolution, matching the resolution used when exporting the model. Both dimensions must be positive integers divisible by 14. (feat(predict): add shape parameter for non-square inference #866)

🌱 Changed
- `ModelConfig.device` and `RFDETR.train(device=...)` now accept `torch.device` objects and indexed device strings (`"cuda:0"`, `"cuda:1"`). Existing string values (`"cpu"`, `"cuda"`) are unchanged. `RFDETR.train()` warns when a valid but unmapped device type is passed to PyTorch Lightning auto-detection. (Accept `torch.device` and indexed device strings in `ModelConfig` and `RFDETR.train()` #872)

🔧 Fixed
- `patch_size` argument: `export()` and `predict()` now resolve `patch_size` from `model_config` by default, validate it strictly (must be a positive integer, not bool), and enforce that `(H, W)` dimensions are divisible by `patch_size × num_windows`. (fix: ONNX export shape ignored; validate divisibility by patch_size #876)
- `torch.full` is now used for Python-int spatial dims to avoid `H_.expand(N_)` tracer failures. (Fix ONNX export for dynamic batch dimensions #871)

🏆 Contributors
Welcome to our new contributors, and thank you to everyone who helped with this release:
Full changelog: 1.6.1...1.6.2