
Some better error messages #189

Open
wants to merge 5 commits into master

Conversation

@opfromthestart (Contributor) commented Jan 31, 2023

What does this PR accomplish?

Makes error messages more readable

  • Bug Fix
  • Feature
  • Documentation

Closes #182 .

Changes proposed by this PR:

This replaces some operations that can fail with versions that produce more readable error messages.
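
For illustration only (this is not code from the diff; the function and names are hypothetical), the general pattern is to replace an operation that panics with an unhelpful built-in message, such as a bare index or unwrap(), with an explicit check that states what was expected and what was actually received:

// Hypothetical sketch of the pattern, not the PR's actual code.
fn first_input(inputs: &[f32]) -> f32 {
    // Before: `inputs[0]` would only panic with "index out of bounds".
    // After: say what was expected and what was actually received.
    assert!(
        !inputs.is_empty(),
        "Expected at least 1 input, but got {}.",
        inputs.len()
    );
    inputs[0]
}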

Notes to reviewer:

I'm not sure how to test this per se, since the changes only affect error cases.
I don't know all the places where things could go wrong, so I need help identifying them.

📜 Checklist

  • Add explicit bounds checks when needed
  • Add help messages for common errors
  • Test coverage is excellent
  • All unit tests pass
  • The juice-examples run just fine
  • Documentation is thorough, extensive and explicit

"Input Shape Mismatch\nExpected {:?}\nActual {:?}",
reshaped_shape, old_shape
);
if reshaped_shape[1..] == old_shape[1..] {
Member:

I think it'd make more sense to add an assert per dimension, and just fail with the first.

// Check each dimension and fail on the first mismatch.
for dim in 0..old_shape.len() {
    let old = old_shape[dim];
    let reshaped = reshaped_shape[dim];
    assert_eq!(old, reshaped, "Dimension {dim} mismatches {old} != {reshaped}");
}
unreachable!("If the total size is not equal, one dimension mismatch must exist. qed");

Contributor (author):

I did it this way to catch errors caused by mismatched batch sizes and make them a bit clearer.
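
As a rough sketch of that idea (illustrative only, with hypothetical names; the PR's actual code differs): when every dimension except the first matches, the difference must be the batch size, which can then be reported specifically.

// Illustrative sketch, not the PR's exact code. Assumes both shapes are
// non-empty and batch-major (dimension 0 is the batch size), and that a
// mismatch has already been detected before this is called.
fn report_shape_mismatch(old_shape: &[usize], reshaped_shape: &[usize]) {
    if reshaped_shape[1..] == old_shape[1..] {
        // Only dimension 0 differs, so this is a batch size problem.
        panic!(
            "Batch size mismatch: expected {} but got {} (shapes {:?} vs {:?})",
            reshaped_shape[0], old_shape[0], reshaped_shape, old_shape
        );
    }
    panic!(
        "Input shape mismatch: expected {:?} but got {:?}",
        reshaped_shape, old_shape
    );
}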

Member:

I think it should always show the full dimensions as well. That way it stays clear where the error could originate in your code, since these errors can arise from rather deep inside the code.
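
A rough sketch of combining the two suggestions (fail on the first mismatching dimension, but also print the full shapes); the names are illustrative and the code assumes both shapes have the same rank:

// Illustrative sketch, not the PR's actual code.
// Assumes old_shape and reshaped_shape have the same number of dimensions.
fn assert_shapes_match(old_shape: &[usize], reshaped_shape: &[usize]) {
    for dim in 0..old_shape.len() {
        assert_eq!(
            old_shape[dim],
            reshaped_shape[dim],
            "Dimension {dim} mismatches {} != {} (full shapes {:?} vs {:?})",
            old_shape[dim],
            reshaped_shape[dim],
            old_shape,
            reshaped_shape
        );
    }
}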

Contributor (author):

I believe that my recent commit should have done this.

@@ -293,6 +293,9 @@ impl<B: IBackend + LayerOps<f32> + 'static> ILayer<B> for Sequential<B> {
    output_data: &mut [ArcLock<SharedTensor<f32>>],
) {
    for layer in &self.layers {
        if layer.borrow().input_blob_names.len() < input_data.len() {
            panic!("Layer {} expected {} input(s) but got {}.", layer.borrow().name, layer.borrow().input_blob_names.len(), input_data.len());
        }
Member:

👍

Development

Successfully merging this pull request may close these issues:
  • Unhelpful panic message on (presumably?) invalid inputs to forward