
Test PARQ with torchao activation quantization #2370


Merged · 2 commits · Jun 16, 2025

Conversation

@lisjin (Contributor) commented on Jun 12, 2025

Added a test case to show numerical equivalence between quantizing:

  1. intx weights with PARQ's UnifTorchaoQuantizer + int8 activations with torchao's FakeQuantizeConfig
  2. intx weights + int8 activations with torchao's Int8DynamicActivationIntxWeightConfig

A rough sketch of this comparison appears below. Next steps are to target EmbeddingQuantizer and PackedLinearInt8DynamicActivationIntxWeightLayout.
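A minimal sketch of the comparison being tested (hedged: the import paths, the Int8DynamicActivationIntxWeightConfig arguments, and the apply_parq_path helper are illustrative assumptions, not the PR's actual test code):

import copy
import torch
from torchao.quantization import Int8DynamicActivationIntxWeightConfig, quantize_
from torchao.quantization.qat import FakeQuantizeConfig  # import path assumed

torch.manual_seed(0)
model = torch.nn.Sequential(torch.nn.Linear(256, 256))
x = torch.randn(4, 256)

# Path 2: torchao's fused config quantizes intx weights and dynamically
# quantizes activations to int8 in a single step.
ref = copy.deepcopy(model)
quantize_(ref, Int8DynamicActivationIntxWeightConfig(weight_dtype=torch.int4))

# Path 1: PARQ quantizes the intx weights via UnifTorchaoQuantizer, with
# activations fake-quantized to int8 by a config along these lines.
act_config = FakeQuantizeConfig(torch.int8, "per_token", is_symmetric=False)
# parq_model = apply_parq_path(model, act_config)  # hypothetical helper
# torch.testing.assert_close(parq_model(x), ref(x))  # the equivalence check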

@lisjin requested review from @andrewor14 and @metascroy on Jun 12, 2025 at 23:29
pytorch-bot (bot) commented on Jun 12, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2370

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 1 Pending

As of commit ec68ca9 with merge base 5bdc25d:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jun 12, 2025
@lisjin added the topic: improvement label on Jun 12, 2025
@andrewor14 (Contributor) left a comment


Looks great, thanks!

    self._quantize = quantize_affine
    self._dequantize = dequantize_affine
elif zero_point_domain == ZeroPointDomain.NONE:
    # No zero point: dispatch to the variants that skip the zero-point term.
    self._quantize = quantize_affine_no_zero_point
    self._dequantize = dequantize_affine_no_zero_point
Contributor:

These changes aren't required to make the test pass, right? They're just cleanup? (We can keep them in this PR; I just wanted to ask for my understanding.)

@lisjin (Contributor, Author):

Yes, this is just for simplification!
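For context, a minimal sketch of the dispatch pattern under discussion. The import path is assumed, the INT branch is inferred from the visible elif, and the wrapper class is illustrative only, not PARQ's actual code:

from torchao.quantization.quant_primitives import (  # import path assumed
    ZeroPointDomain,
    dequantize_affine,
    dequantize_affine_no_zero_point,
    quantize_affine,
    quantize_affine_no_zero_point,
)

class _AffineDispatch:
    # Pick the (de)quantize kernels once, at construction time, rather than
    # branching on zero_point_domain at every call site.
    def __init__(self, zero_point_domain: ZeroPointDomain):
        if zero_point_domain == ZeroPointDomain.INT:
            # Integer zero point: standard affine (de)quantization.
            self._quantize = quantize_affine
            self._dequantize = dequantize_affine
        elif zero_point_domain == ZeroPointDomain.NONE:
            # No zero point: variants that drop the zero-point term.
            self._quantize = quantize_affine_no_zero_point
            self._dequantize = dequantize_affine_no_zero_point
        else:
            raise ValueError(f"Unsupported zero_point_domain: {zero_point_domain}")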

@lisjin merged commit 488c856 into main on Jun 16, 2025 (19 checks passed)
@lisjin deleted the parq branch on Jun 16, 2025 at 17:24