
Fix DPT decoder bugs #26

Open

bingyic wants to merge 2 commits into add-decoders-module from fix-dpt-decoders

Conversation


bingyic (Collaborator) commented Apr 22, 2026

This PR applies fixes to the DPT decoders introduced in PR #24.

Changes:

  • Add F.relu() after DPTHead project conv to match Scenic's output_activation=True default.
  • Fix DepthDecoder to route through parent's nn.Linear head instead of bypassing it (see the sketch after this list).
  • Register bin_centers as a buffer with configurable num_depth_bins.
  • Add weight key remapping for all decoder types.
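
A minimal sketch of the intended head routing and buffer registration. This is not the PR's actual code: the class name, constructor arguments, and depth range below are illustrative assumptions; only the bin_centers buffer, num_depth_bins, and the nn.Linear head are taken from the change list.

```python
import torch
import torch.nn as nn


class DepthDecoderSketch(nn.Module):
    """Illustrative stand-in for the fixed DepthDecoder."""

    def __init__(self, feature_dim: int, num_depth_bins: int = 256,
                 min_depth: float = 0.001, max_depth: float = 10.0):
        super().__init__()
        # Parent-style per-pixel classification head over depth bins.
        self.head = nn.Linear(feature_dim, num_depth_bins)
        # register_buffer: bin_centers moves with .to(device) and is stored
        # in state_dict, but is not a trainable parameter.
        centers = torch.linspace(min_depth, max_depth, num_depth_bins)
        self.register_buffer("bin_centers", centers)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (B, H, W, C). Route through the nn.Linear head instead of
        # bypassing it, then take the expectation over the bin centers.
        logits = self.head(features)                    # (B, H, W, num_bins)
        probs = logits.softmax(dim=-1)
        depth = (probs * self.bin_centers).sum(dim=-1)  # (B, H, W)
        return depth
```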

bingyic added 2 commits April 22, 2026 08:19
- Add F.relu() after DPTHead project conv to match Scenic's output_activation=True default
- Fix DepthDecoder to route through parent's nn.Linear head instead of bypassing it
- Register bin_centers as a buffer with configurable num_depth_bins
- Add weight key remapping for all decoder types

Three fixes verified by numerical parity tests (max abs diff < 1e-4):

1. GELU approximation: JAX defaults to the tanh approximation, while PyTorch
   uses the exact formulation. Use F.gelu(x, approximate='tanh') for numerical
   parity (see the sketch after this list).

2. Remove spurious ReLU: Scenic DPT defaults to output_activation=False,
   so no ReLU should be applied after the project conv.

3. ConvTranspose kernel flip: Flax ConvTranspose uses
   transpose_kernel=False (no kernel flip), while PyTorch ConvTranspose2d
   always flips. Pre-flip the weights 180 degrees during loading to
   compensate (sketched at the end of this note).
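
To illustrate fix 1, here is a small parity check, assuming JAX is installed alongside PyTorch; the tensor shape is arbitrary:

```python
import numpy as np
import torch
import torch.nn.functional as F
import jax
import jax.numpy as jnp

x = np.random.randn(8, 1024).astype(np.float32)

# jax.nn.gelu defaults to the tanh approximation (approximate=True).
y_jax = np.asarray(jax.nn.gelu(jnp.asarray(x)))

# torch.nn.functional.gelu defaults to the exact formulation; pass
# approximate='tanh' to match the JAX default.
y_exact = F.gelu(torch.from_numpy(x)).numpy()
y_tanh = F.gelu(torch.from_numpy(x), approximate="tanh").numpy()

print("exact vs JAX:", np.abs(y_exact - y_jax).max())  # small but above float32 noise
print("tanh  vs JAX:", np.abs(y_tanh - y_jax).max())   # ~ float32 rounding error
```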

These fixes bring the max absolute difference between Scenic and PyTorch
decoder outputs below 1e-4 across all three heads (depth, normals,
segmentation). No checkpoint re-export needed — all fixes are in the
inference code/weight loading path.
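
For the weight-loading side of fix 3, a sketch of the ConvTranspose kernel conversion, assuming the usual Flax kernel layout (H, W, in_features, out_features); the function name and the example layer shape are illustrative:

```python
import numpy as np
import torch


def flax_convtranspose_kernel_to_torch(kernel: np.ndarray) -> torch.Tensor:
    # Flax ConvTranspose (transpose_kernel=False) applies the kernel as stored,
    # while PyTorch ConvTranspose2d effectively applies a spatially flipped
    # kernel, so flip 180 degrees here to compensate.
    w = torch.from_numpy(np.asarray(kernel))  # (H, W, in, out)
    w = w.permute(2, 3, 0, 1).contiguous()    # -> (in, out, H, W)
    return torch.flip(w, dims=(-2, -1))       # 180-degree spatial flip

# Example: load a converted kernel into a matching ConvTranspose2d layer.
kernel = np.random.randn(2, 2, 256, 256).astype(np.float32)
layer = torch.nn.ConvTranspose2d(256, 256, kernel_size=2, stride=2, bias=False)
with torch.no_grad():
    layer.weight.copy_(flax_convtranspose_kernel_to_torch(kernel))
```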