Fix RoPE inner product equation & add note on the difference in implementation #265
Hi, thank you for your work. I noticed an error in the RoPE inner product equation. Additionally, this implementation uses a different feature-pairing strategy for rotating the feature subspaces than the original paper does, which I believe is worth noting to avoid confusion.
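For reference, this is the relative-position property the corrected equation should express (standard RoPE as stated in the RoFormer paper; the notation below follows the paper, not this repo's code):

```math
\langle f_q(\mathbf{x}_m, m),\, f_k(\mathbf{x}_n, n) \rangle
= \left(R^{d}_{\Theta,m} W_q \mathbf{x}_m\right)^{\top} \left(R^{d}_{\Theta,n} W_k \mathbf{x}_n\right)
= \mathbf{x}_m^{\top} W_q^{\top}\, R^{d}_{\Theta,\,n-m}\, W_k \mathbf{x}_n
```

i.e. the attention score between the rotated query and key depends on the token positions only through the relative offset $n - m$.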
Ref: https://github.com/pytorch/torchtune/blob/main/torchtune/modules/position_embeddings.py#L117
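To make the pairing difference concrete, here is a minimal sketch of the two common conventions: the paper-style interleaved pairing (rotate adjacent features $(x_0, x_1), (x_2, x_3), \dots$) versus the half-split "rotate_half" pairing (rotate feature $i$ with feature $i + d/2$). The function names are illustrative only, not taken from this repo or from torchtune; both assume `cos`/`sin` of shape `(seq_len, head_dim // 2)`.

```python
import torch

def build_angles(seq_len: int, head_dim: int, base: float = 10000.0):
    # Per-pair rotation angles; shape (seq_len, head_dim // 2).
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    angles = torch.outer(torch.arange(seq_len).float(), inv_freq)
    return angles.cos(), angles.sin()

def rotate_interleaved(x, cos, sin):
    # Paper-style pairing: rotate adjacent feature pairs (x0, x1), (x2, x3), ...
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

def rotate_half_split(x, cos, sin):
    # "rotate_half"-style pairing: rotate feature i together with feature i + d/2.
    d = x.shape[-1]
    x1, x2 = x[..., : d // 2], x[..., d // 2:]
    return torch.cat((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)

# Same angles, different pairing -> different rotated vectors.
cos, sin = build_angles(seq_len=4, head_dim=8)
x = torch.randn(4, 8)
print(torch.allclose(rotate_interleaved(x, cos, sin), rotate_half_split(x, cos, sin)))  # False
```

Both conventions satisfy the relative-position property above, but they are not interchangeable: weights trained with one pairing cannot be used with the other unless the features (or frequencies) are permuted to match.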
Cheers,