Add examples with Intel optimizations #1579
Conversation
@patrickvonplaten @anton-l @patil-suraj please review. Thanks.
The documentation is not available anymore as the PR was closed or merged.
examples/README.md (Outdated)

> ## Research Projects
>
> We also provide **research_projects** examples that can be used to accelerate the training (fine-tuning) and inference. These examples are useful and offer the extended capablities which are complementary to the official examples. You may refer to [research_projects](https://github.com/huggingface/diffusers/tree/main/examples/research_projects) for details.
Suggested change:
- We also provide **research_projects** examples that can be used to accelerate the training (fine-tuning) and inference. These examples are useful and offer the extended capablities which are complementary to the official examples. You may refer to [research_projects](https://github.com/huggingface/diffusers/tree/main/examples/research_projects) for details.
+ We also provide **research_projects** examples that are maintained by the community as defined in the respective research project folders. These examples are useful and offer the extended capabilities which are complementary to the official examples. You may refer to [research_projects](https://github.com/huggingface/diffusers/tree/main/examples/research_projects) for details.
Done.
@@ -0,0 +1,49 @@
+ import torch
Could we add a README.md to research_projects/intel_opts?
Done. Added some initial ones. Please review. Thanks.
@patrickvonplaten please let me know if you have additional comments.
anton-l
left a comment
Looks nice as a research example! @hshen14 some benchmark results would probably give the examples more visibility, but I'll leave the decision to include them up to you :)
Thanks @anton-l. This is on my list to add later, since BFloat16 support relies on the 4th Gen Intel Xeon Scalable processors, which will be generally available soon and are currently in early preview on AWS.
@@ -0,0 +1,15 @@
+ ## Diffusers examples with Intel optimizations
Suggested change:
- ## Diffusers examples with Intel optimizations
+ ## Diffusers examples with Intel optimizations
+ **This research project is not actively maintained by the diffusers team. For any questions or comments, please make sure to tag @hshen14.**
Done
@patrickvonplaten The notes are updated accordingly per your comments. Please check. Thanks.
patrickvonplaten
left a comment
Thanks a lot!
* Add examples with Intel optimizations (BF16 fine-tuning and inference)
* Remove unused package
* Add README for intel_opts and refine the description for research projects
* Add notes of intel opts for diffusers
Per the comments on #1499, this PR adds examples with Intel optimizations for fine-tuning and inference. BFloat16 fine-tuning is enabled for textual_inversion, while BFloat16 inference is generally applicable to Stable Diffusion.
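The BFloat16 inference pattern discussed here can be sketched as follows. This is a minimal illustration, not the PR's actual example code: it uses a tiny stand-in module instead of a full Stable Diffusion pipeline (which would require downloading model weights), and it assumes a PyTorch version that supports `torch.autocast` on CPU. The idea is the same: run the forward pass under CPU autocast so eligible ops execute in bfloat16.

```python
import torch

# Minimal sketch of BFloat16 inference on CPU via autocast.
# A small stand-in network is used here in place of a Stable
# Diffusion pipeline; the autocast pattern is what matters.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 4),
)
model.eval()

x = torch.randn(2, 8)

# Inside the autocast region, Linear layers run in bfloat16.
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model(x)

print(out.dtype)
```

In the real examples, the same `with torch.autocast(...)` (or an IPEX-optimized equivalent) would wrap the pipeline call; the speedup depends on hardware BF16 support such as the 4th Gen Xeon's AMX instructions.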