Conversation

@giterator
Contributor

Added LLama4 implementation, new rope implementation

custom rope for llama4

custom rope for llama4 and ops

cleanup rope dead code

remove debugging cases

reformat

moved ml_dtypes import

lint fix

lint fix

remove imports

fix tests

black format

undo lint for 2 lines
@gemini-code-assist
Contributor

Summary of Changes

Hello @giterator, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request primarily introduces support for the Llama 4 model, focusing on its unique Rotary Position Embedding (RoPE) scaling mechanism. It includes new functions to handle Llama 4's frequency computation and position mapping within the RoPE implementation. Additionally, the PR expands the frontend operator set with log, floor, and arange, and enhances data type support by integrating ml_dtypes for bf16 and fp8.
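
To make the scaling concrete, below is a minimal Python sketch of the smooth-interpolation RoPE frequency scaling scheme (popularized by Llama 3) that this kind of frequency computation follows; the function name and default parameters are illustrative assumptions, not the signature of the rope_freq_llama4 added in this PR.

```python
import math

# Illustrative defaults; the actual rope_freq_llama4 parameters may differ.
def scale_rope_freqs(head_dim, theta=500000.0, scale_factor=8.0,
                     low_freq_factor=1.0, high_freq_factor=4.0,
                     original_max_position=8192):
    """Per-dimension RoPE frequencies, smoothly interpolated between the
    unscaled high-frequency band and the fully scaled low-frequency band."""
    low_freq_wavelen = original_max_position / low_freq_factor
    high_freq_wavelen = original_max_position / high_freq_factor
    scaled = []
    for i in range(0, head_dim, 2):
        freq = 1.0 / (theta ** (i / head_dim))  # standard RoPE frequency
        wavelen = 2 * math.pi / freq
        if wavelen < high_freq_wavelen:
            scaled.append(freq)                 # high frequency: keep as-is
        elif wavelen > low_freq_wavelen:
            scaled.append(freq / scale_factor)  # low frequency: fully scaled
        else:
            # In between: interpolate smoothly between the two regimes.
            smooth = (original_max_position / wavelen - low_freq_factor) / (
                high_freq_factor - low_freq_factor)
            scaled.append((1 - smooth) * freq / scale_factor + smooth * freq)
    return scaled
```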

Highlights

  • Llama 4 RoPE Implementation: New functions rope_freq_llama4 and llama4_rope_with_position_map have been introduced to support Llama 4's specific Rotary Position Embedding (RoPE) scaling, including logic for smooth interpolation (sketched above, after the summary).
  • New Frontend Operators: The frontend API has been expanded with new operators: log, floor, and arange, providing additional mathematical and tensor-creation capabilities (see the first sketch after this list).
  • Enhanced Data Type Support: The ml_dtypes library is now imported to enable support for bf16 and fp8 data types within the const function (see the second sketch after this list).
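
A rough usage sketch of the new operators, assuming they land in tvm.relax.frontend.nn.op alongside the existing elementwise ops; the exact signatures and defaults may differ from what is shown here.

```python
from tvm.relax.frontend import nn
from tvm.relax.frontend.nn import op

class LogFloorArangeDemo(nn.Module):
    def forward(self, x: nn.Tensor) -> nn.Tensor:
        # arange creates a 1-D tensor, e.g. for position indices;
        # the argument order is assumed to follow numpy.arange.
        positions = op.arange(0, 16, dtype="float32")  # shown for illustration only
        # log and floor apply elementwise to the input tensor.
        return op.floor(op.log(x))
```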
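
And a minimal sketch of what ml_dtypes provides on the NumPy side (this uses ml_dtypes' documented dtype objects; how the PR threads them through const is not shown).

```python
import numpy as np
import ml_dtypes

x = np.array([1.0, 2.5, -3.75], dtype=np.float32)
x_bf16 = x.astype(ml_dtypes.bfloat16)       # bfloat16 as a NumPy dtype
x_fp8 = x.astype(ml_dtypes.float8_e4m3fn)   # one of the fp8 variants
print(x_bf16.dtype, x_fp8.dtype)            # bfloat16 float8_e4m3fn
```
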
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

[Supported-commands table (Feature / Command / Description) not captured in this page extract.]

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

Contributor

@gemini-code-assist bot left a comment

Code Review

This pull request introduces support for Llama4, including a new RoPE implementation, and adds log, floor, and arange operators to the nn frontend. The changes are generally well-implemented. However, the new Llama4 RoPE implementation contains some copy-pasted code from the Llama3 implementation that should be updated for clarity. More importantly, there is significant code duplication in llama4_rope_with_position_map, which should be refactored to improve maintainability.

@MasterJH5574 changed the title from "LLama 4 support" to "[Relax] Operator and RoPE support for Llama4" on Sep 23, 2025
Contributor

@MasterJH5574 left a comment

LGTM, thank you @giterator!

@MasterJH5574 merged commit f971595 into apache:main on Sep 23, 2025
11 checks passed