
Commit a1c7584

block info
1 parent f668c2e commit a1c7584

File tree

1 file changed: 3 additions, 0 deletions


comfy/ldm/lumina/model.py

Lines changed: 3 additions & 0 deletions
@@ -625,7 +625,10 @@ def _forward(self, x, timesteps, context, num_tokens, attention_mask=None, trans
         img, mask, img_size, cap_size, freqs_cis = self.patchify_and_embed(x, cap_feats, cap_mask, t, num_tokens, transformer_options=transformer_options)
         freqs_cis = freqs_cis.to(img.device)
 
+        transformer_options["total_blocks"] = len(self.layers)
+        transformer_options["block_type"] = "double"
         for i, layer in enumerate(self.layers):
+            transformer_options["block_index"] = torch.tensor(i, dtype=torch.uint8, device=img.device)
             img = layer(img, mask, freqs_cis, adaln_input, transformer_options=transformer_options)
             if "double_block" in patches:
                 for p in patches["double_block"]:
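The keys added by this commit (`total_blocks`, `block_type`, `block_index`) are published into `transformer_options` before and inside the layer loop, so any registered `"double_block"` patch can tell which block it is running on. Below is a minimal, torch-free sketch of that pattern; the patch function, the identity "layers", and the data are hypothetical stand-ins (not ComfyUI's actual classes), and the real commit stores `block_index` as `torch.tensor(i, dtype=torch.uint8, device=img.device)` rather than a plain int.

```python
# Hypothetical sketch: a patch callback consuming the block info that the
# model loop publishes via transformer_options. Stand-ins only; the real
# code wraps block_index in a uint8 torch tensor on img's device.

seen = []  # records (block_type, block_index, total_blocks) per call

def inspect_block_patch(img, transformer_options):
    total = transformer_options["total_blocks"]
    block_type = transformer_options["block_type"]
    index = transformer_options["block_index"]
    seen.append((block_type, index, total))
    return img  # a real patch would transform img here

# Simulate the patched layer loop from the diff with identity "layers".
layers = [lambda x: x for _ in range(3)]
transformer_options = {
    "total_blocks": len(layers),   # set once, before the loop
    "block_type": "double",
}
img = [0.0, 0.0]  # stand-in for the image token tensor
for i, layer in enumerate(layers):
    transformer_options["block_index"] = i  # updated every iteration
    img = layer(img)
    img = inspect_block_patch(img, transformer_options)
```

This lets a single patch function behave differently per block (e.g. act only when `index == total - 1`) without the model passing extra positional arguments through every layer.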

0 commit comments