similar samplers #1
Comments
Not the same, though much of the code for these variants is the same.
Can you do the other two samplers from the euler-smea repo? The negative-dy one is one of the better ones from those, imo. It would also be nice to figure out the rest of the CFG++ stuff from the ComfyUI side. For the cond scale multiplier, like you said it could be put into the cfg_denoiser script (or whatever it's called); I'm not sure exactly how it needs to be done, but it would let you still use any CFG scale, since the scale would be adjusted on a per-sampler basis as it is currently. As for the Comfy versions, I'm not really sure how they work, even though I tried to do the same thing in the Forge scripts, since a lot of it was already there.
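For context on where a cond-scale multiplier would slot in, here is a minimal hedged sketch of the standard CFG combination; `combine_cfg` and `multiplier` are illustrative names, not the actual cfg_denoiser API:

```python
# Hedged sketch, not the real extension code. Standard CFG combines the
# conditional and unconditional predictions as
#     denoised = uncond + cfg_scale * (cond - uncond)
# CFG++-style samplers expect a much smaller effective scale, so a
# per-sampler multiplier (hypothetical) would let the user keep their
# usual cfg_scale while the sampler sees the reduced value.
def combine_cfg(cond, uncond, cfg_scale, multiplier=1.0):
    eff = cfg_scale * multiplier  # e.g. a CFG++ variant might register multiplier < 1
    return [u + eff * (c - u) for c, u in zip(cond, uncond)]
```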
You can try the 'neg' branch. I'm not totally convinced of the value, so it's not in the main branch (yet).
Yeah, not sure. It was from the DDIM CFG++ implementation that was initially added to A1111; that's how they did it there, and it got moved over to Forge as well (without the cfg_denoiser modifications). I figured it would be better to be able to use the scale normally with a multiplier instead of being restricted to sub-2 values, but I wasn't able to test it out in the jerry-rig I tried. I didn't see the other branch; I'll try it later.
yoinked these from @yoinked-h https://pastebin.com/nW2s2uKQ

also from comfy:

```python
import torch
from k_diffusion.sampling import append_zero  # appends a final 0.0 sigma

def get_sigmas_vp(n, beta_d=19.9, beta_min=0.1, eps_s=1e-3, device='cpu'):
    """Constructs a continuous VP noise schedule."""
    t = torch.linspace(1, eps_s, n, device=device)
    sigmas = torch.sqrt(torch.exp(beta_d * t ** 2 / 2 + beta_min * t) - 1)
    return append_zero(sigmas)

def get_sigmas_laplace(n, sigma_min, sigma_max, mu=0., beta=0.5, device='cpu'):
    """Constructs the noise schedule proposed by Tiankai et al. (2024)."""
    epsilon = 1e-5  # avoid log(0)
    x = torch.linspace(0, 1, n, device=device)
    clamp = lambda x: torch.clamp(x, min=sigma_min, max=sigma_max)
    lmb = mu - beta * torch.sign(0.5 - x) * torch.log(1 - 2 * torch.abs(0.5 - x) + epsilon)
    sigmas = clamp(torch.exp(lmb))
    return sigmas
```

just some stuff to look at
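A dependency-free sketch of those same two schedules in pure Python (no torch), assuming the same defaults, just to see the curves they produce; the names `vp_sigmas` and `laplace_sigmas` are illustrative:

```python
import math

def vp_sigmas(n, beta_d=19.9, beta_min=0.1, eps_s=1e-3):
    # continuous VP schedule: sigma(t) = sqrt(exp(beta_d*t^2/2 + beta_min*t) - 1)
    ts = [1 + (eps_s - 1) * i / (n - 1) for i in range(n)]  # linspace(1, eps_s, n)
    return [math.sqrt(math.exp(beta_d * t * t / 2 + beta_min * t) - 1) for t in ts]

def laplace_sigmas(n, sigma_min, sigma_max, mu=0.0, beta=0.5):
    eps = 1e-5  # avoid log(0)
    out = []
    for i in range(n):
        x = i / (n - 1)
        s = (0.5 - x > 0) - (0.5 - x < 0)  # sign, matching torch.sign semantics
        lmb = mu - beta * s * math.log(1 - 2 * abs(0.5 - x) + eps)
        out.append(min(max(math.exp(lmb), sigma_min), sigma_max))
    return out
```

Both produce a decreasing sigma sequence: VP starts steep and flattens, while Laplace concentrates steps around the middle of the schedule and clamps the extremes to [sigma_min, sigma_max].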
cfgpp on a1111-like.... it's over..... (cfgpp is not possible on A1111 without severe modifications)
Thanks for those.
Why not help make it work 4head |
I'd help, but I don't work with much inference-side code, more so sampler-side.
Some samplers from clybius, haven't tried them though |
Refined Exponential Solver looks good from initial tests. TTM doesn't work for me, fails when JIT compiling. Haven't explored others yet. |
https://github.com/Kittensx/Simple_KES: a new schedule; it has been tested and can be used.
Wonderfully over-engineered way to generate a list of decreasing numbers. |
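For comparison, the boring way to generate a decreasing list of sigmas is a plain log-linear (exponential) schedule; this is an illustrative sketch, not the Simple_KES code:

```python
import math

def exp_sigmas(n, sigma_min=0.03, sigma_max=14.6):
    # interpolate linearly in log-space from sigma_max down to sigma_min
    lo, hi = math.log(sigma_min), math.log(sigma_max)
    return [math.exp(hi + (lo - hi) * i / (n - 1)) for i in range(n)]
```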
Any idea how to get this working in A1111?
There's a webUI extension for TCD. I haven't tried it; it looks like DDPM, so I tried that with TCD LoRAs and it worked very well.
Interesting.
Can you add samplers from this repository?
Judging by the name, this is the same sampler you added yesterday, plus a couple more with the same structure.
https://github.com/rabidcopy/Euler-Smea-Dyn-Sampler
rabidcopy/Euler-Smea-Dyn-Sampler@d78ff55