
Conversation

@wonjoo-wj
Collaborator

Lower LogSigmoid, LogSigmoidBackward, Elu, and EluBackward

As part of #3527, this lowers existing ops that rely on IR-level computation and removes the use of ScopePusher.

Testing: unit tests succeeded locally.

@wonjoo-wj
Collaborator Author

Unit tests failing:

[  FAILED  ] AtenXlaTensorTest.TestSelu
[  FAILED  ] AtenXlaTensorTest.TestSeluInPlace
[  FAILED  ] AtenXlaTensorTest.TestCelu
[  FAILED  ] AtenXlaTensorTest.TestCeluInPlace

Seems related to this PR, looking into it.

@wonjoo-wj
Collaborator Author

This is a bit odd; all of these unit tests succeed locally.

@JackCaoG
Collaborator

Can you do a full cpp test run using https://github.com/pytorch/xla/blob/master/test/cpp/run_tests.sh? Running `./test/cpp/run_tests.sh` should work.

@wonjoo-wj
Collaborator Author

> can you do a full cpp test run using https://github.com/pytorch/xla/blob/master/test/cpp/run_tests.sh (./test/cpp/run_tests.sh) should work

Thanks! Doing that enabled me to reproduce the failures locally.

I'm a bit confused here: according to https://github.com/pytorch/xla/blob/master/xla_native_functions.yaml, we don't lower selu or celu, yet we have cpp tests for these ops. Not sure how this PR affected those tests. I'll try building locally on master and check the difference.

@JackCaoG
Collaborator

seems like torch::selu will call elu under the hood.
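For context, this is why an elu lowering can break the selu (and celu) tests even though those ops aren't listed in xla_native_functions.yaml: selu is just elu with a fixed alpha and an output scale, so torch::selu dispatches into elu. A minimal pure-Python sketch of that relationship (the constants are the ones documented for torch.nn.SELU; this is an illustration, not the actual ATen dispatch code):

```python
import math

# SELU's fixed constants, as documented for torch.nn.SELU.
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def elu(x, alpha=1.0):
    # Reference ELU: identity for x > 0, alpha * (exp(x) - 1) otherwise.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x):
    # SELU expressed as a scaled ELU with fixed alpha --
    # mirroring how torch::selu calls elu under the hood.
    return SELU_SCALE * elu(x, alpha=SELU_ALPHA)
```

So a bug in the elu lowering surfaces in the selu tests first, which matches the failures above.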

@wonjoo-wj wonjoo-wj force-pushed the ops branch 2 times, most recently from 8fd5fdc to ed9f8a3 Compare May 4, 2022 03:47
@wonjoo-wj
Collaborator Author

Seemingly unrelated tests failing:

======================================================================
FAIL: test_addcmul_promotion_xla (__main__.TestTypePromotionXLA)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 376, in instantiated_test
    result = test(self, **param_kwargs)
  File "/tmp/pytorch/xla/test/../../test/test_type_promotion.py", line 849, in test_addcmul_promotion
    self._ternary_promotion_common(device, op1, op2)
  File "/tmp/pytorch/xla/test/../../test/test_type_promotion.py", line 830, in _ternary_promotion_common
    self.assertEqual(res1, res2)
  File "/tmp/pytorch/xla/test/pytorch_test_base.py", line 628, in assertEqual
    return DeviceTypeTestBase.assertEqual(self, x, y, *args, **kwargs)
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_utils.py", line 2240, in assertEqual
    msg=msg,
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_comparison.py", line 1074, in assert_equal
    raise error_metas[0].to_error()
AssertionError: The values for attribute 'dtype' do not match: torch.float64 != torch.complex128.

----------------------------------------------------------------------

Looking into it.

@JackCaoG
Collaborator

JackCaoG commented May 4, 2022

You can rerun the test; the upstream PR should be reverted by now.

@wonjoo-wj
Collaborator Author

@JackCaoG, this should be ready for a quick review now. Thanks!

Collaborator

@JackCaoG JackCaoG left a comment


Thanks!

@wonjoo-wj wonjoo-wj merged commit cc312e2 into master May 9, 2022
@wonjoo-wj wonjoo-wj deleted the ops branch May 9, 2022 17:33
ysiraichi added a commit that referenced this pull request May 22, 2025
- `SgnOp` and `SignOp`
    - Full codegen migration: #3577
    - Mistakenly re-introduced: #3572
- `LogSigmoid`
    - Introduced: #3539
    - Full codegen migration: #3743
- `SiLU`
    - Introduced: #2721
    - Full codegen migration: #3780
- `SiLUBackward`
    - Introduced: #3195
    - Full codegen migration: #3780
- `SeLU`
    - Introduced: #3547
    - Full codegen migration: #3780
- `Sigmoid`
    - Introduced: 6a73deb (no PR record)
    - Full codegen migration: #6342
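For reference, the scalar definitions behind the activations listed above can be sketched as follows. This is a hedged pure-Python illustration of the math (with the usual numerically stable forms), not the XLA lowerings themselves:

```python
import math

def sigmoid(x):
    # Logistic function, arranged to avoid overflow for large |x|.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def log_sigmoid(x):
    # log(sigmoid(x)) = -softplus(-x); the min/abs form is stable
    # for both large positive and large negative x.
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def silu(x):
    # SiLU (a.k.a. swish): x * sigmoid(x).
    return x * sigmoid(x)

def sign(x):
    # sign for real inputs: -1, 0, or 1, as in torch.sign.
    return (x > 0) - (x < 0)
```

These identities are what the IR-level lowerings compose out of simpler ops.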
