Replies: 5 comments 6 replies
-
I love the Power Lora node, but yes - being able to have the lora stack output would be great! For my own use, I would be happy if the node could output a text string of all the enabled loras (to append to my prompt before saving - just for convenience and flexibility). I can accomplish this if it outputs the lora stack, since there are stack -> string nodes already out there.
-
WOW!! 🤩 Though I don’t know any python, your instructions were crystal clear and worked like a charm. I cannot thank you enough. I’m so grateful and over the moon with your modification 🙏🏼🙏🏼🙏🏼

On Mar 4, 2025, at 12:21 PM, Symbiomatrix wrote:
@SebAnt333 Just did that in my side version, it's very easy. In py/power_lora_loader.py:
Add the class attribute OUTPUT_IS_LIST = (False, False, True) (at class level, next to INPUT_TYPES) and modify the variables RETURN_TYPES = ("MODEL", "CLIP", "STRING") and RETURN_NAMES = ("MODEL", "CLIP", "LORA_NAMES").
In load_loras, create a variable, say lnames = [], then fill it inside the loop (say after if lora is not None:, add lnames.append(value["lora"])).
Finally, return (model, clip, lnames).
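
Put together, a minimal sketch of that change, assuming a typical ComfyUI node layout (the class name, the kwargs/loop structure, and the "on" check are placeholders here, not the actual rgthree-comfy code):

```python
# Sketch only: names and structure are assumed, not copied from rgthree-comfy.
class PowerLoraLoader:
    RETURN_TYPES = ("MODEL", "CLIP", "STRING")
    RETURN_NAMES = ("MODEL", "CLIP", "LORA_NAMES")
    # Class attribute: the third entry marks LORA_NAMES as a list output.
    OUTPUT_IS_LIST = (False, False, True)
    FUNCTION = "load_loras"

    def load_loras(self, model, clip, **kwargs):
        lnames = []
        for key, value in kwargs.items():
            if not isinstance(value, dict) or not value.get("on", False):
                continue
            lora = value.get("lora")
            if lora is not None:
                # ...existing code that applies the lora to model/clip...
                lnames.append(value["lora"])
        return (model, clip, lnames)
```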
-
Using Claude, which I just subscribed to, I further modified it to exclude the .safetensors suffix but add the LoRA strength rounded to 2 decimal places. You have opened my eyes to a whole new world of modifying nodes, because I'm more of an artist than a programmer.
if lora_name.endswith('.safetensors'):
    lora_name = lora_name[:-12]  # Remove last 12 characters ('.safetensors')
# Concatenate strength_model to the lora_name (with 2 decimal places)
formatted_strength = f"{strength_model:.2f}".rstrip('0').rstrip('.') if '.' in f"{strength_model:.2f}" else f"{strength_model:.2f}"
lora_name = f"{lora_name}_{formatted_strength}"
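
With illustrative values (not from the thread), the snippet above produces names like this:

```python
# Illustrative values only; runs the same logic as the snippet above.
lora_name = "detailTweaker.safetensors"
strength_model = 0.85

if lora_name.endswith('.safetensors'):
    lora_name = lora_name[:-12]
formatted_strength = f"{strength_model:.2f}".rstrip('0').rstrip('.') if '.' in f"{strength_model:.2f}" else f"{strength_model:.2f}"
lora_name = f"{lora_name}_{formatted_strength}"
print(lora_name)  # -> detailTweaker_0.85
```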
On Tue, Mar 4, 2025 at 1:36 PM Symbiomatrix wrote:
You're welcome. Output works great, could probably just as easily pass around the lora stack itself in some form.
Problem is the input - this node seems to have a custom loading function which discards anything not related to loras, so whenever you reload, the new widget isn't loaded or created. I don't even know how to make it optional, so it just throws an error.
-
Thanks for the tip - I am making the change!
On Tue, Mar 4, 2025 at 2:50 PM Symbiomatrix wrote:
Small improvement: you can remove the .safetensors suffix more dynamically using lora_name.rsplit(".", 1)[0] instead of the fixed 12-character slice and the endswith check. This will remove everything after the last period.
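
Applied to the earlier snippet, that suggestion looks roughly like this (illustrative value, not the actual node code):

```python
lora_name = "detailTweaker.safetensors"  # illustrative value

# Before: fixed-length slice that only handles '.safetensors'
# lora_name = lora_name[:-12]

# After: drop everything after the last period, whatever the extension
lora_name = lora_name.rsplit(".", 1)[0]
print(lora_name)  # -> detailTweaker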
-
Heya, I also love the Power Lora and I truly believe a string output of its Loras would be a great addition. I did everything as @Symbiomatrix said and I kinda made it work. My problem is twofold: 1. it outputs only one Lora name rather than all of them, and 2. it just gives me the path including the name, so I have to run it through a text "stripper" to get only the Lora name.
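
For the second point, the path and extension can also be stripped in Python before the name is collected; a minimal sketch, assuming the value coming out of the node is a relative path like the hypothetical subfolder/myLora.safetensors:

```python
import os

lora_value = "subfolder/myLora.safetensors"  # hypothetical value from the node

# Keep only the file name, then drop the extension.
lora_name = os.path.splitext(os.path.basename(lora_value))[0]
print(lora_name)  # -> myLora
```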
-
Imagine you want to use a lora stack with two different checkpoints in the same workflow, say, for creating the original image and then using it in img2img. There's not really any way to do that with the Power Lora Loader node since it can't turn any of its widgets into inputs, and you also can't give it inputs from 2 different models.
If it could output a lora stack and send it to a combiner node that takes 'model', 'clip', and 'lora stack' as input and outputs 'model' and 'clip', then a single Power Lora Loader could be used with as many KSamplers as you want.
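
As a rough illustration of that idea, a combiner/applier node might look something like the sketch below. The "LORA_STACK" type name and the (lora_name, strength_model, strength_clip) tuple format are assumptions borrowed from other stacker nodes, and the loading calls mirror ComfyUI's built-in LoraLoader; none of this is existing Power Lora Loader code.

```python
# Sketch of the proposed combiner node, assuming a lora stack is a list of
# (lora_name, strength_model, strength_clip) tuples as other stacker nodes use.
import folder_paths
import comfy.sd
import comfy.utils


class ApplyLoraStack:
    RETURN_TYPES = ("MODEL", "CLIP")
    FUNCTION = "apply"
    CATEGORY = "loaders"

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "model": ("MODEL",),
            "clip": ("CLIP",),
            "lora_stack": ("LORA_STACK",),
        }}

    def apply(self, model, clip, lora_stack):
        for lora_name, strength_model, strength_clip in lora_stack:
            lora_path = folder_paths.get_full_path("loras", lora_name)
            lora = comfy.utils.load_torch_file(lora_path, safe_load=True)
            model, clip = comfy.sd.load_lora_for_models(
                model, clip, lora, strength_model, strength_clip)
        return (model, clip)
```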