Conversation

@haoyang9804 (Contributor)

This patch fixes issue #15004. Please refer to my comments in that issue for the root cause.
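
For context, here is a minimal sketch of the kind of model that exercises the conversion path this PR touches; the input name, shapes, and output size below are illustrative, not taken from the patch or the issue.

```python
import torch
from tvm import relay

# Build and trace the operator this PR is about (illustrative shapes).
model = torch.nn.AdaptiveMaxPool1d(output_size=4).eval()
inp = torch.rand(1, 3, 16)
traced = torch.jit.trace(model, inp)

# Convert the traced module to Relay; per the linked issue, this is the
# step where the conversion bug surfaced before the fix.
mod, params = relay.frontend.from_pytorch(traced, [("input0", (1, 3, 16))])
print(mod)
```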

@tvm-bot (Collaborator) commented Jul 23, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@haoyang9804 (Contributor, Author)

cc @jikechao

@jikechao (Member) left a comment

@haoyang9804
LGTM, thanks for the fix.

@jikechao (Member)

@echuraev @Hzfengsy @vvchernov
Please take another look.

@Hzfengsy changed the title from "Fix an adaptive_max_pool1d operator conversion bug" to "[Relay] Fix an adaptive_max_pool1d operator conversion bug" on Jul 24, 2023.
@Hzfengsy (Member)

Please add a regression test for it.
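
One possible shape for such a test, assuming the `verify_model` helper pattern used by the PyTorch frontend tests in tests/python/frontend/pytorch/test_forward.py; the shapes and output size are illustrative:

```python
import torch

# Sketch of a regression test following the existing verify_model pattern;
# verify_model traces the module, converts it with the Relay frontend, and
# compares TVM's output against PyTorch's.
def test_forward_adaptive_max_pool1d():
    input_data = torch.rand(1, 3, 16).float()
    verify_model(torch.nn.AdaptiveMaxPool1d(output_size=4).eval(), input_data=input_data)
```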

@vvchernov (Contributor)

On the one hand, LGTM; on the other, it confuses me that the fix for a specific op was done at a very high level (and the CI test was constructed for that specific op, not for all cases). It looks like it has been assumed that outputs exist for every ret_name in ret_names. I need time to analyze this myself, or an explanation of what is wrong with the initial assumption.
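
To make the assumption under discussion concrete, here is a hypothetical sketch (made-up names for illustration, not the actual TVM source) of gathering converted outputs by return name at the end of conversion:

```python
# Hypothetical: `outputs` maps names to converted Relay expressions, filled
# while translating the TorchScript graph; `ret_names` lists the names the
# graph returns.
outputs = {"pool_out": "<relay expr>"}    # converter registered one output
ret_names = ["pool_out", "pool_indices"]  # but the graph returns two names

# The initial assumption: every ret_name has an entry in `outputs`. An op
# whose converter registers fewer outputs than its TorchScript node
# produces breaks that assumption:
for name in ret_names:
    if name not in outputs:
        print(f"no converted output registered for {name!r}")
```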

@haoyang9804 (Contributor, Author)

> On the one hand, LGTM; on the other, it confuses me that the fix for a specific op was done at a very high level (and the CI test was constructed for that specific op, not for all cases). It looks like it has been assumed that outputs exist for every ret_name in ret_names. I need time to analyze this myself, or an explanation of what is wrong with the initial assumption.

I agree that the fix for a specific op was done at a very high level; that is why I made the fix specific to the buggy PyTorch op. This is only a workaround for now. I believe a deeper issue is causing this bug.

@jikechao (Member) commented Sep 2, 2023

cc @vvchernov

@echuraev (Contributor) left a comment

LGTM. Thanks!

@echuraev merged commit d75083c into apache:main on Sep 4, 2023.