
Fix Frontend Failing Test: torch - creation_ops.torch.full #28052

Closed
sgalpha01 opened this issue Jan 25, 2024 · 2 comments · May be fixed by #28053
Labels
Sub Task a sub task which is stemming from a ToDo list issue

Comments

@sgalpha01
Contributor

ToDo - 27498
Type - Priority: Open

@sgalpha01 sgalpha01 added the Sub Task a sub task which is stemming from a ToDo list issue label Jan 25, 2024
@sgalpha01
Contributor Author

sgalpha01 commented Jan 25, 2024

It was failing for two backends:

  1. paddle: The error was - TypeError: float() argument must be a string or a real number, not 'complex'. So, it can be solved by adding the @with_unsupported_dtypes decorator.
  2. torch: The error is -
E       TypeError: full() received an invalid combination of arguments - got (tuple, Tensor, out=NoneType, device=torch.device, dtype=torch.dtype), but expected one of:
E        * (tuple of ints size, Number fill_value, *, tuple of names names, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
E        * (tuple of ints size, Number fill_value, *, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
E       
E       Falsifying example: test_torch_full(
E           on_device='cpu',
E           frontend='torch',
E           backend_fw='torch',
E           shape=(1,),
E           fill_value=array(-1.),
E           dtype=['float64'],
E           test_flags=FrontendFunctionTestFlags(
E               num_positional_args=0,
E               with_out=False,
E               with_copy=False,
E               inplace=False,
E               as_variable=[True],
E               native_arrays=[False],
E               test_trace=False,
E               generate_frontend_arrays=False,
E               transpile=False,
E               precision_mode=False,
E           ),
E           fn_tree='ivy.functional.frontends.torch.full',
E       )

Complete Log -
frontend_torch_creation_ops_full.log
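As a side note on the paddle fix, the idea behind Ivy's @with_unsupported_dtypes decorator can be sketched with a simplified stand-in (the real decorator lives in Ivy's function wrappers and is keyed by backend version; the names below are illustrative, not Ivy's actual implementation):

```python
import functools

def with_unsupported_dtypes(unsupported):
    """Simplified stand-in for Ivy's decorator: reject dtypes the backend
    cannot handle instead of letting the backend fail with a cryptic error."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, dtype=None, **kwargs):
            if dtype in unsupported:
                raise TypeError(f"{fn.__name__} does not support dtype {dtype!r}")
            return fn(*args, dtype=dtype, **kwargs)
        return wrapper
    return decorator

# Hypothetical frontend function guarded against complex dtypes,
# mirroring the paddle failure described above.
@with_unsupported_dtypes({"complex64", "complex128"})
def full(size, fill_value, dtype=None):
    return [fill_value] * size[0]
```

With this guard, full((2,), 1.0, dtype="float32") succeeds, while a complex dtype is rejected up front with a clear TypeError rather than failing deep inside the backend's float() conversion.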

I didn't understand this, as I'm unable to reproduce it outside the test suite, so I tried adding print statements to the test function.

Compare these two examples:
a)

---
Starting Frontend Function Test...
input_dtypes:  ['float16']
test_flags:  num_positional_args=1. with_out=False. with_copy=False. inplace=False. native_arrays=[False]. as_variable=[True]. test_trace=False. generate_frontend_arrays=False. transpile=False.precision_mode=True. 
backend_to_test:  torch
on_device:  cpu
frontend:  torch
fn_tree:  ivy.functional.frontends.torch.full
gt_fn_tree:  None
rtol:  None
atol:  1e-06
tolerance_dict:  None
test_values:  True
all_as_kwargs_np:  {'size': (1,), 'fill_value': array(-0.4304, dtype=float16), 'dtype': 'float16', 'device': 'cpu'}
Checkpoint 1
passing args to get_frontend_ret:  [(1,)]
passing kwargs to get_frontend_ret:  {'fill_value': ivy.array(-0.43041992), 'dtype': 'float16', 'device': 'cpu'}
I'm inside full function
These are the arguments: 
size:  (1,)
fill_value:  ivy.array(-0.43041992)
type(fill_value):  <class 'ivy.data_classes.array.array.Array'>
dtype of fill value:  float16
dtype:  float16
^^^
Frontend Function Issue
full() received an invalid combination of arguments - got (tuple, Tensor, out=NoneType, device=torch.device, dtype=torch.dtype), but expected one of:
 * (tuple of ints size, Number fill_value, *, tuple of names names, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)
 * (tuple of ints size, Number fill_value, *, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool pin_memory, bool requires_grad)

^^^
---

b)

---
Starting Frontend Function Test...
input_dtypes:  ['float16']
test_flags:  num_positional_args=0. with_out=False. with_copy=False. inplace=False. native_arrays=[False]. as_variable=[False]. test_trace=False. generate_frontend_arrays=False. transpile=False.precision_mode=False. 
backend_to_test:  torch
on_device:  cpu
frontend:  torch
fn_tree:  ivy.functional.frontends.torch.full
gt_fn_tree:  None
rtol:  None
atol:  1e-06
tolerance_dict:  None
test_values:  True
all_as_kwargs_np:  {'size': (10,), 'fill_value': array(-1., dtype=float16), 'dtype': 'float16', 'device': 'cpu'}
Checkpoint 1
passing args to get_frontend_ret:  []
passing kwargs to get_frontend_ret:  {'size': (10,), 'fill_value': ivy.array(-1.), 'dtype': 'float16', 'device': 'cpu'}
I'm inside full function
These are the arguments: 
size:  (10,)
fill_value:  ivy.array(-1.)
type(fill_value):  <class 'ivy.data_classes.array.array.Array'>
dtype of fill value:  float16
dtype:  float16
This is the return value:  ivy.array([-1., -1., -1., -1., -1., -1., -1., -1., -1., -1.])
I'm leaving full function
ret:  ivy.frontends.torch.Tensor([-1., -1., -1., -1., -1., -1., -1., -1., -1., -1.])
type of ret:  <class 'ivy.functional.frontends.torch.tensor.Tensor'>
dtype of ret:  float16
Checkpoint 2
frontend_fw:  <module 'torch' from '/home/r0gue_shinobi/.miniconda3/envs/ivy_dev/lib/python3.10/site-packages/torch/__init__.py'>
frontend_fw_fn:  <built-in method full of type object at 0xb348b269a40>
frontend_ret:  tensor([-1., -1., -1., -1., -1., -1., -1., -1., -1., -1.], dtype=torch.float16)
type of frontend_ret:  <class 'torch.Tensor'>
dtype of frontend_ret:  torch.float16
Checkpoint 3
Checkpoint 4
Frontend Test Finished
Initiating value test...
Value test finished
---

What I don't understand is why, even with similar arguments inside the full function, one example fails while the other passes.

Complete Log -
frontend_torch_creation_ops_full_verbose.log
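For reference, the TypeError in example (a) matches torch.full's documented overloads, which accept only a Number as fill_value, so a wrapped 0-dim array reaching the native call is rejected. A minimal self-contained sketch of that failure mode and the usual fix (unwrapping scalar arrays before forwarding; native_full and frontend_full are hypothetical stand-ins, not Ivy code):

```python
import numpy as np

def native_full(size, fill_value):
    # Stand-in for torch.full's argument check: only Python Numbers
    # are accepted as fill_value, not array/tensor objects.
    if not isinstance(fill_value, (int, float, complex, bool)):
        raise TypeError(
            "full() received an invalid combination of arguments - got "
            f"({type(size).__name__}, {type(fill_value).__name__})"
        )
    return [fill_value] * size[0]

def frontend_full(size, fill_value):
    # The fix: unwrap a 0-dim array (the shape seen in the failing test)
    # to a plain Python scalar before calling the native function.
    if isinstance(fill_value, np.ndarray) and fill_value.ndim == 0:
        fill_value = fill_value.item()
    return native_full(size, fill_value)
```

Here native_full((1,), np.array(-1.0)) reproduces the "invalid combination of arguments" error, while frontend_full((1,), np.array(-1.0)) succeeds because the scalar is extracted first.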

@ivy-leaves

Hello @contributor,

Great news! The test 'torch.full' is now passing. We are closing this issue. Feel free to choose and work on another failing test.

Thank you for your contribution!

Best,
Ivy Team
