
Merging the latest master changes from the original repo #3

Draft
wants to merge 712 commits into master

Conversation

@ghost commented Jul 17, 2024

Merging the latest master changes from the original repo

@ghost (Author) commented Jul 17, 2024

@fksato, why can't I add reviewers to this PR?

@ghost self-assigned this on Jul 23, 2024
@ghost marked this pull request as draft on July 23, 2024 at 09:18
huchenlei and others added 27 commits November 15, 2024 20:17
* Update web content to release v1.3.44

* nit
This one should work for skipping the single layers of models like Flux
and AuraFlow.

If you want to see how these models work and how many double/single layers
they have, see the "ModelMerge*" nodes for the specific model.
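
For illustration only, a minimal sketch of what skipping single-stream layers could look like; the `single_blocks` attribute and the `skip_indices` argument are assumptions made for this example, not the actual node code:

```python
def run_single_blocks(model, x, skip_indices=()):
    # Hypothetical forward pass over a model's single-stream blocks
    # (Flux/AuraFlow style); blocks whose index appears in skip_indices
    # are left out entirely.
    for i, block in enumerate(model.single_blocks):
        if i in skip_indices:
            continue  # skip this single layer
        x = block(x)
    return x
```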
* Update UI screenshot in README

* Remove legacy UI screenshot file

* nit

* nit
Add a way to reshape LoRA weights.

Allow weight patches on all weights, not just .weight and .bias.

Add a way for a LoRA to set a weight to a specific value.
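
As a rough sketch only (the patch dict format and the helper below are invented for illustration, not ComfyUI's real patch types), applying such patches might look like this:

```python
import torch

def apply_weight_patch(weight: torch.Tensor, patch: dict) -> torch.Tensor:
    # Hypothetical patch application covering the three cases above:
    # reshape the target weight, overwrite it with a fixed tensor, or
    # add a low-rank (LoRA) update.
    if "reshape" in patch:
        weight = weight.reshape(patch["reshape"])
    if "set" in patch:
        return patch["set"].reshape(weight.shape)  # force the weight to a given value
    if "lora" in patch:
        up, down, alpha = patch["lora"]
        weight = weight + alpha * (up @ down).reshape(weight.shape)
    return weight
```

Patching any tensor by name (rather than only `.weight` and `.bias`) would then just mean looking the key up in the model's state dict before calling a helper like this.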
At length=1, the LTX model can do txt2img and img2img with no other changes required.
comfyanonymous and others added 30 commits January 16, 2025 14:54
* Use `torch.special.expm1`

This function provides greater precision than `exp(x) - 1` for small values of `x`.

Found with TorchFix https://github.com/pytorch-labs/torchfix/

* Use non-alias
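
A quick demonstration of the precision gain, runnable with stock PyTorch:

```python
import torch

x = torch.tensor([1e-10], dtype=torch.float32)

naive = torch.exp(x) - 1           # cancellation near zero: evaluates to 0.0
precise = torch.special.expm1(x)   # ~1e-10, retains the small-value precision

print(naive.item(), precise.item())
```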

* [i18n] Add /i18n endpoint to provide all custom node translations

* Sort glob result for deterministic ordering

* Update comment
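
A minimal sketch of such an endpoint, assuming an aiohttp server and a hypothetical `custom_nodes/*/locales/*.json` layout (neither is taken from the PR); the glob result is sorted so the merged translations come out in a deterministic order:

```python
import glob
import json
import os

from aiohttp import web

routes = web.RouteTableDef()

@routes.get("/i18n")
async def all_translations(request: web.Request) -> web.Response:
    # Merge every custom node's translation files into one JSON payload,
    # keyed by locale; sorting the glob keeps the merge order stable.
    translations: dict[str, dict] = {}
    for path in sorted(glob.glob("custom_nodes/*/locales/*.json")):
        locale = os.path.splitext(os.path.basename(path))[0]
        with open(path, encoding="utf-8") as f:
            translations.setdefault(locale, {}).update(json.load(f))
    return web.json_response(translations)
```

Registering the route would then just be `app.add_routes(routes)` on the server's `web.Application`.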