
Conversation

@Limerances
Contributor

Motivation

Remove the add_bias option; it is no longer supported.

Modifications

Usage or Command

Accuracy Tests

Checklist

  • Add at least one tag in the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code and run pre-commit before committing.
  • Add unit tests. If no unit tests are added, please explain the reason in this PR.
  • Provide accuracy results.
  • If the current PR targets the release branch, make sure it has first been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

Copilot AI review requested due to automatic review settings December 8, 2025 05:31
@paddle-bot

paddle-bot bot commented Dec 8, 2025

Thanks for your contribution!

@paddle-bot paddle-bot bot added the contributor External developers label Dec 8, 2025
Contributor

Copilot AI left a comment


Pull request overview

This PR removes the deprecated add_bias option from linear layers throughout the codebase. The parameter was previously used to control whether bias should be added in the current layer or in pre/post layers, but is no longer supported.

Key changes:

  • Removed add_bias parameter from all linear layer classes (LinearBase, ColumnParallelLinear, MergedColumnParallelLinear, ReplicatedLinear, GatedLinear, QKVParallelLinear, RowParallelLinear)
  • Updated quantization methods to unconditionally pass layer.bias instead of checking add_bias flag
  • Cleaned up test mocks that referenced the removed parameter
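A minimal sketch of the quantization-method change described above (the class, kernel, and layer names here are illustrative stand-ins, not the actual FastDeploy API): the method now always forwards layer.bias, and a layer built without bias simply carries None instead of relying on an add_bias flag.

```python
def gemm(x, weight, bias=None):
    """Toy stand-in for the quantized matmul kernel: y = x @ weight (+ bias)."""
    y = [[sum(a * b for a, b in zip(row, col)) for col in zip(*weight)] for row in x]
    if bias is not None:
        y = [[v + b for v, b in zip(row, bias)] for row in y]
    return y


class Layer:
    """Minimal linear layer holding a weight and an optional bias."""

    def __init__(self, weight, bias=None):
        self.weight = weight
        self.bias = bias  # None when the layer was constructed without bias


class WeightOnlyLinearMethod:
    """Sketch of the post-PR apply(): bias is forwarded unconditionally."""

    def apply(self, layer, x):
        # No add_bias flag check anymore; gemm treats bias=None as "no bias".
        return gemm(x, layer.weight, bias=layer.bias)
```

The kernel's own handling of bias=None replaces the former conditional in the quantization method.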

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.

Show a summary per file

  • fastdeploy/model_executor/layers/linear.py: Removed add_bias parameter from all linear layer class constructors and docstrings; simplified bias handling in RowParallelLinear
  • fastdeploy/model_executor/layers/quantization/weight_only.py: Changed bias parameter to always pass layer.bias instead of conditioning on add_bias
  • fastdeploy/model_executor/layers/quantization/w4afp8.py: Changed bias parameter to always pass layer.bias instead of conditioning on add_bias
  • fastdeploy/model_executor/models/gpt_oss.py: Removed add_bias=True argument from RowParallelLinear instantiation
  • tests/quantization/test_w4afp8.py: Removed mock setup of the add_bias attribute from test fixtures
Comments suppressed due to low confidence (1)

tests/quantization/test_w4afp8.py:146

  • After removing the add_bias parameter, this test no longer properly exercises the "without_bias" scenario. The setUp method sets self.layer.bias = "bias" (line 43), so the test now actually runs with a bias present.

To properly test the case when bias is None, you should add:

self.layer.bias = None

at the beginning of this test method (after line 141), so that it correctly verifies the behavior when no bias is provided.

    def test_apply_without_bias(self, mock_gemm):
        mock_gemm.return_value = "out"
        x = "x"

        result = self.method.apply(self.layer, x)
        self.assertEqual(result, "out")
        args = mock_gemm.call_args.kwargs
        self.assertIsNone(args["bias"])
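As a self-contained sketch of the reviewer's suggestion (W4AFP8Method, gemm_kernel, and the fixture below are illustrative stand-ins, not the actual FastDeploy test code), the corrected no-bias test could look like this, with self.layer.bias explicitly set to None before calling apply:

```python
import unittest
from unittest import mock


def gemm_kernel(x, weight, bias=None):
    """Stand-in for the quantized matmul kernel; patched out in the test."""
    raise NotImplementedError


class W4AFP8Method:
    """Illustrative stand-in for the quantization method under test."""

    def apply(self, layer, x):
        return gemm_kernel(x, layer.weight, bias=layer.bias)


class TestApplyWithoutBias(unittest.TestCase):
    def setUp(self):
        self.method = W4AFP8Method()
        self.layer = mock.Mock()
        self.layer.bias = "bias"  # fixture default, as in the real setUp

    @mock.patch(f"{__name__}.gemm_kernel")
    def test_apply_without_bias(self, mock_gemm):
        mock_gemm.return_value = "out"
        self.layer.bias = None  # the reviewer's fix: exercise the no-bias path
        x = "x"

        result = self.method.apply(self.layer, x)
        self.assertEqual(result, "out")
        args = mock_gemm.call_args.kwargs
        self.assertIsNone(args["bias"])
```

Overriding the fixture's bias inside the test ensures the assertion on args["bias"] actually verifies the None-bias path.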

assert with_bias, "with_bias must be True when add_bias is True."
if self.tp_size > 1 and self.reduce_results:
    set_weight_attrs(self.bias, {"tp_row_bias": True})

Copilot AI Dec 8, 2025


The code now unconditionally accesses self.bias when self.tp_size > 1 and self.reduce_results, but self.bias is None when with_bias=False (see lines 173-179 in LinearLayer.__init__). This will cause an AttributeError when trying to set weight attributes on a None object.

Consider adding a condition to check if bias exists:

if self.tp_size > 1 and self.reduce_results and self.with_bias:
    set_weight_attrs(self.bias, {"tp_row_bias": True})
Suggested change:
- if self.tp_size > 1 and self.reduce_results:
+ if self.tp_size > 1 and self.reduce_results and with_bias:
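To illustrate why the guard matters (set_weight_attrs and the names below are simplified stand-ins for the FastDeploy helpers, not the real implementation): calling setattr on a None bias raises AttributeError, while the guarded form skips the tagging safely.

```python
def set_weight_attrs(param, attrs):
    """Simplified stand-in: attach metadata attributes to a parameter object."""
    for key, value in attrs.items():
        setattr(param, key, value)


class FakeBias:
    """Minimal object standing in for a bias parameter tensor."""


def tag_row_bias(bias, tp_size, reduce_results, with_bias):
    # The guarded form from the suggestion: only tag when a bias actually exists.
    if tp_size > 1 and reduce_results and with_bias:
        set_weight_attrs(bias, {"tp_row_bias": True})
    return bias
```

Without the with_bias guard, tag_row_bias(None, 2, True, False) would reach setattr(None, ...), which raises AttributeError.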

@codecov-commenter

Codecov Report

❌ Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (develop@3066a0c). Learn more about missing BASE report.

Files with missing lines Patch % Lines
fastdeploy/model_executor/layers/linear.py 0.00% 1 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             develop    #5425   +/-   ##
==========================================
  Coverage           ?   59.51%           
==========================================
  Files              ?      327           
  Lines              ?    40643           
  Branches           ?     6170           
==========================================
  Hits               ?    24188           
  Misses             ?    14588           
  Partials           ?     1867           
Flag Coverage Δ
GPU 59.51% <0.00%> (?)

Flags with carried forward coverage won't be shown.


