
lib/bb/fetch2/git.py: added handling of password #16

Open
wants to merge 1 commit into master

Conversation

@SvOlli commented Sep 22, 2017

I had problems in Yocto trying to download a password-protected git repository hosted via http.

Tracking down the problem, I noticed that bitbake's git fetcher only handles the username but drops the password. This is the change that fixed the problem for me. I hope it will be of use for others as well.

Trying to check out source code which is username/password protected fails.
This is a fix that also hands down the password to the downloader.
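
For context, a sketch of what the failing scenario boils down to (the URL and credentials below are placeholders, not taken from the patch): for a repository behind HTTP(S) basic auth, the password has to survive into the URL the fetcher hands to git, as it would in a manual clone:

git clone https://my_user:my_password@my-server.com/path/to/repo.git
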
@alkino commented Jun 12, 2018

This patch seems interesting, but you should read the README to see how to contribute; pull requests here are never read, and they cannot be disabled on GitHub.

@leiflm (Contributor) commented Sep 23, 2020

I just stumbled over this PR and wanted to let other users know that bitbake can make use of git-credentials.

Simply put your credentials into a file (e.g. .git-credentials):

https://my_https_user:[email protected]

and configure git (and bitbake) to use it:

git config --global credential.helper "store --file $PWD/.git-credentials"
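
To double-check that the helper is actually wired up (a sketch; the host is a placeholder), you can ask git's credential machinery directly, which should print the stored username and password back:

printf 'protocol=https\nhost=my-server.com\n' | git credential fill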

@timblaktu

@leiflm I stumbled here trying to figure out why bitbake's git fetcher is not only failing to authenticate to my private, token-based Bitbucket URI, but is also deleting my .git-credentials file!

I'm running bitbake in a container, which worked great until I set SRCREV to AUTOREV. The .git-credentials file works correctly in every manual use case (cloning on the host and in a container shell), but not for the bitbake fetcher in the AUTOREV case, which fails early when it runs ls-remote to check whether a new rev needs to be pulled. I have cleaned sstate for the recipe in question (a kernel bbappend using my kernel fork), and the problem persists.

I've also tried putting the creds in a .netrc in the container to no avail, so I'm thinking it's something specific to the fetcher use case when SRCREV = AUTOREV. I haven't been able to debug the git.py code beyond confirming that the .git-credentials file exists before the git ls-remote call and does not exist afterwards.

Any ideas?
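
One way to narrow this down (just a sketch; the URI is a placeholder for the private Bitbucket repo) is to run the same ls-remote the fetcher performs, with git's tracing enabled, both from a container shell and from within the fetcher's environment, and compare the credential-helper calls:

GIT_TRACE=1 git ls-remote https://bitbucket.org/path/to/my/private/repo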

@timblaktu commented Dec 20, 2024

Using GIT_TRACE=1 to compare git ls-remote traces in my two scenarios, in the working case (manually running the command on the host or in the container) I see this sequence:

.
.
09:03:29.428844 run-command.c:668       trace: run_command: git-remote-https https://[email protected]/path/to/my/private/repo https://[email protected]/path/to/my/private/repo
09:03:29.967662 run-command.c:668       trace: run_command: 'git credential-store get'
09:03:29.972777 git.c:455               trace: built-in: git credential-store get
09:03:30.237799 run-command.c:668       trace: run_command: 'git credential-store store'
09:03:30.242330 git.c:455               trace: built-in: git credential-store store

...followed by successful output.

But in the failed case, with the bb fetcher running, the sequence is the same except that it never progresses from git credential-store get to git credential-store store; instead it transitions to a git credential-store erase, which seems to be a smoking gun, but I don't know who's holding it. Also, I see an extra "exec" line, which seems suspicious:

.
.
17:15:08.878086 run-command.c:657       trace: run_command: git remote-https https://[email protected]/path/to/my/private/repo https://[email protected]/path/to/my/private/repo
17:15:08.884345 git.c:750               trace: exec: git-remote-https https://[email protected]/path/to/my/private/repo https://[email protected]/path/to/my/private/repo
17:15:08.884535 run-command.c:657       trace: run_command: git-remote-https https://[email protected]/path/to/my/private/repo https://[email protected]/path/to/my/private/repo
17:15:09.381721 run-command.c:657       trace: run_command: 'git credential-store get'                                                                                                         
17:15:09.384345 git.c:463               trace: built-in: git credential-store get                                                                                                              
17:15:09.531045 run-command.c:657       trace: run_command: 'git credential-store erase'                                                                                                       
17:15:09.533000 git.c:463               trace: built-in: git credential-store erase                                                     
remote: Invalid credentials
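
For context on that erase: when the server rejects the supplied credential (the 401 / "Invalid credentials" above), git asks the configured helpers to erase the rejected credential, and for the store helper that means rewriting the credentials file without the matching entry. A small sketch with a placeholder file, host, and credentials shows the effect:

# store a placeholder credential, then erase it the way git does after a rejection
printf 'protocol=https\nhost=my-server.com\nusername=user\npassword=secret\n' | git credential-store --file /tmp/test-creds store
cat /tmp/test-creds    # contains https://user:secret@my-server.com
printf 'protocol=https\nhost=my-server.com\nusername=user\n' | git credential-store --file /tmp/test-creds erase
cat /tmp/test-creds    # the matching entry has been removed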

@timblaktu commented Dec 20, 2024

I was finally able to reproduce this outside bitbake, using the OE environment's version of git (the one bb was using, 2.44.0), and will go digging in the git repo/issues for the root cause. Cranking up the verbosity in my ls-remote "invalid credentials" case (a sketch of that invocation follows the list below), I can see:

  1. git remote-https
    1. Successfully connects to Bitbucket
    2. Successfully conducts TLS handshake
  2. git credential-store get
    1. Reuses existing socket/session
    2. Sends GET /xilica/linux-imx/info/refs?service=git-upload-pack HTTP/1.1
    3. Receives response header with 401 Unauthorized and some more data
  3. git credential-store erase
    1. This appears to be happening due to the above auth failure.
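
For reference, a sketch of the kind of invocation that produces this level of detail (GIT_CURL_VERBOSE here is illustrative of one way to raise the verbosity; the URI is a placeholder for the private repo):

GIT_TRACE=1 GIT_CURL_VERBOSE=1 git ls-remote https://bitbucket.org/path/to/my/private/repo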

So, the main question I will ask the git devs is why the git credential-store get step does not seem to be using the configured credential helper store (with its existing default file ~/.git-credentials) before requesting refs from Bitbucket. I would expect the request to fail if git did not get the credential from the store and put it in the auth header, but I do not see any evidence of it doing that.

EDIT: I have confirmed that git 2.44.0 (running on termux aarch64) does NOT exhibit this misbehavior.
