
Conversation


@copybara-service copybara-service bot commented Dec 3, 2025

Move bias add to after the dot operation.

Previously, the bias was added before the dot operation by initializing the accumulator along with the quantization offsets. This assumed that the bias had the same scale as the inputs.

The new behavior adds the bias after the dot operation, using the scale attached to the bias node rather than the scale of the inputs. Moving the bias add out of the accumulator also lets it fuse with other post-dot scaling operations.

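The difference can be sketched in NumPy. This is an illustrative example, not the actual implementation: all function and variable names here are made up, and it assumes per-tensor scales with int8 quantized values. In the old scheme the bias is folded into the accumulator, which forces it onto the accumulator scale (`x_s * w_s`); in the new scheme the dot runs first and the bias is added afterward using its own scale `b_s`.

```python
import numpy as np

def quantize(x, scale):
    """Quantize a float array to integers at the given scale."""
    return np.round(x / scale).astype(np.int32)

def dot_bias_before(x_q, w_q, bias, x_s, w_s):
    # Old behavior: initialize the accumulator with the bias, which
    # requires quantizing the bias at the accumulator scale x_s * w_s.
    acc = quantize(bias, x_s * w_s)
    acc = acc + x_q.astype(np.int32) @ w_q.astype(np.int32)
    return acc * (x_s * w_s)  # dequantize the combined result

def dot_bias_after(x_q, w_q, b_q, x_s, w_s, b_s):
    # New behavior: run the dot first, then add the bias using the
    # scale attached to the bias itself. The trailing multiply-add can
    # fuse with other post-dot scaling operations.
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32)
    return acc * (x_s * w_s) + b_q * b_s
```

When the bias scale happens to equal the accumulator scale, the two formulations agree; the new form additionally handles biases whose scale differs from the inputs'.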

PiperOrigin-RevId: 839642272
@copybara-service copybara-service bot changed the title Move bias add to after dot Move bias add to after the dot operation. Dec 3, 2025
