
Fix: Add null check for num_features parameter in BatchNorm1D#161

Open
prjanitor wants to merge 1 commit into imdeepmind:main from prjanitor:prjanitor/9bf6a95eb0557c691c65bdcb7740baa68f44053d

Conversation

@prjanitor

Bug Description

The num_features parameter in BatchNorm1D defaults to None, which causes PyTorch's BatchNorm1d to raise an error when instantiated without this required parameter.

Root Cause

The validation logic only checked if num_features was an integer when it was not None:

if num_features is not None and not isinstance(num_features, int):
    raise ValueError("Please provide a valid num_features")

This allowed num_features=None to pass validation, but PyTorch requires this parameter.
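The short-circuit behavior can be seen in a minimal sketch: when `num_features` is `None`, the first operand of the `and` is `False`, so the whole condition is `False` and no error is raised.

```python
num_features = None

# Old validation: both conditions must hold for the check to trigger.
# With None, `num_features is not None` is False, so the `and`
# short-circuits and None passes through to PyTorch unvalidated.
old_check_raises = num_features is not None and not isinstance(num_features, int)
print(old_check_raises)  # False -> no ValueError, None reaches BatchNorm1d
```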

Fix

Changed the validation to also check for None:

if num_features is None or not isinstance(num_features, int):
    raise ValueError("Please provide a valid num_features")

Testing

Users will now receive a clear error message when attempting to create a BatchNorm1D layer without specifying num_features.
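The new behavior can be exercised with a standalone sketch; the `BatchNorm1D` class below is a simplified stand-in for the project's actual wrapper, not the real implementation:

```python
class BatchNorm1D:
    """Simplified stand-in for the project's BatchNorm1D wrapper."""

    def __init__(self, num_features=None):
        # Corrected check: reject None as well as non-integer values,
        # since PyTorch's nn.BatchNorm1d requires num_features.
        if num_features is None or not isinstance(num_features, int):
            raise ValueError("Please provide a valid num_features")
        self.num_features = num_features


# Omitting num_features now fails fast with a clear message:
try:
    BatchNorm1D()
except ValueError as e:
    print(e)  # Please provide a valid num_features

# A valid integer still passes validation:
layer = BatchNorm1D(num_features=64)
print(layer.num_features)  # 64
```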


This PR was generated by PRJanitor — an automated tool that finds and fixes small bugs in open-source projects.

We respect your contribution guidelines — if your project doesn't accept bot PRs, we won't send more. You can also add a .github/prjanitor.yml file with enabled: false to opt out explicitly.

The num_features parameter is required by PyTorch's BatchNorm1d but was defaulting to None.
Added a null check to ensure num_features is provided before instantiation.