
problems when running bash setup.sh #15

Open
Qinmayyear opened this issue Nov 28, 2024 · 3 comments
Comments

Qinmayyear commented Nov 28, 2024

I got the following errors:

ERROR: Invalid requirement: 'torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0': Expected end or semicolon (after version specifier)
    torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0
         ~~~~~~~~^ (from line 17 of requirements.txt)

After I edited line 17 of requirements.txt, splitting the packages onto separate lines, I got:

ERROR: Could not find a version that satisfies the requirement omegaconf==2.3.0 (from versions: none)
ERROR: No matching distribution found for omegaconf==2.3.0
Qinmayyear changed the title from "problem using bash setup.sh" to "problems when running bash setup.sh" on Nov 28, 2024
princepride (Contributor) commented:

You cannot put torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121 on a single line in your requirements file; install those packages from the terminal instead: pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
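If you would rather keep everything inside requirements.txt, pip also accepts index options as standalone lines in a requirements file. A minimal sketch of what line 17 could become, assuming the rest of the file is unchanged (--extra-index-url keeps PyPI available for the other pinned packages, such as omegaconf):

```
# requirements.txt — pip reads one requirement per line, so the three
# PyTorch packages must be split up. The CUDA 12.1 wheel index is added
# alongside the default PyPI index.
--extra-index-url https://download.pytorch.org/whl/cu121
torch==2.1.0
torchvision==0.16.0
torchaudio==2.1.0
```

Note that using --index-url instead of --extra-index-url would replace PyPI entirely, which is likely why omegaconf==2.3.0 could not be found ("from versions: none") after the first edit.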

Qinmayyear (Author) commented:

Thank you for your assistance. There is another conflict in requirements.txt:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torch 2.1.0+cu121 requires triton==2.1.0, but you have triton 2.2.0 which is incompatible.

Should I downgrade to triton 2.1.0?

Also, will I be able to perform inference on videos around 1 minute long using the 7B model with an RTX 4090 GPU?

princepride (Contributor) commented:

  • I didn't encounter this issue; perhaps you should try downgrading the Triton version to 2.1.0.
  • This depends on how many keyframes you select from a video. If the default is 8 frames, then the RTX 4090 can run it normally.
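The downgrade suggested above is a one-liner; a sketch, assuming the CUDA 12.1 PyTorch wheels are already installed:

```shell
# Pin triton to the version torch 2.1.0+cu121 declares as a dependency;
# pip will uninstall triton 2.2.0 and install 2.1.0 in its place.
pip install triton==2.1.0

# Ask pip to re-check installed packages for unsatisfied dependencies.
pip check
```

If pip check reports no broken requirements afterwards, the conflict warning should no longer appear.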
