Gcloud - cannot provision many VMs using one Service Account
I am using Gcloud to run Prow (a continuous-integration server). One of my jobs creates a virtual machine, performs some tests, and then deletes that instance. I use a service account to create the VM and run the tests:

```
#!/bin/bash
set -o errexit

cleanup() {
  gcloud compute instances delete kyma-integration-test-${RANDOM_ID}
}

gcloud config set project ...
gcloud auth activate-service-account --key-file ...
gcloud compute instances create \
  --metadata enable-oslogin=TRUE \
  --image debian-9-stretch-v20181009 \
  --image-project debian-cloud \
  --machine-type n1-standard-4 \
  --boot-disk-size 20

trap cleanup exit

gcloud compute scp --strict-host-key-checking=no --quiet :~/
gcloud compute ssh --quiet -- ./
```

After some time, I got the following error:

```
ERROR: (gcloud.compute.scp) INVALID_ARGUMENT: Login profile size exceeds 32 KiB. Delete profile values to make additional space.
```

Indeed, for that service account, the describe command returns a lot of data, for example ~70 entries in the sshPublicKeys section:

```
gcloud auth activate-service-account --key-file ...
gcloud compute os-login describe-profile
```

Most of these public keys refer to already-removed VM instances. How can I clean up this list? Or is it possible to not store those public keys at all?
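To see how close a profile is to the 32 KiB limit before scp starts failing, you can count the sshPublicKeys entries and measure the serialized size of the profile. The JSON below is a fabricated stand-in for the output of `gcloud compute os-login describe-profile --format=json`; in a real run, substitute that command for the hard-coded string:

```shell
# Fabricated sample profile; a stand-in for:
#   profile=$(gcloud compute os-login describe-profile --format=json)
profile='{"name":"1234","sshPublicKeys":{"aa11":{"key":"ssh-rsa AAAA a"},"bb22":{"key":"ssh-rsa AAAA b"}}}'

# Number of stored public keys: the sshPublicKeys field is a map keyed
# by fingerprint, so its length is the key count.
count=$(printf '%s' "$profile" | python3 -c 'import json,sys; print(len(json.load(sys.stdin)["sshPublicKeys"]))')

# Rough size of the whole profile in bytes (the error fires at 32 KiB).
size=$(printf '%s' "$profile" | wc -c)

echo "keys=$count bytes=$size"
```

With ~70 stale entries, each carrying a full RSA public key, the profile easily approaches the limit, which matches the error in the question.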
Answers
The permanent solution is to use --ssh-key-expire-after 30s. You still need to clean up the existing keys with the solutions above, or with a little more command kung fu like this (without grep):

```
for i in $(gcloud compute os-login ssh-keys list --format="table[no-heading](value.fingerprint)"); do
  echo $i
  gcloud compute os-login ssh-keys remove --key $i || true
done
```

NOTE: you have to be using the offending account:

```
gcloud config set account ACCOUNT
```

and/or

```
gcloud auth activate-service-account --key-file=FILE
```

or

```
gcloud auth login
```

If you need a new ssh key in a script:

```
# KEYNAME should be something like $HOME/.ssh/google_compute_engine
ssh-keygen -t rsa -N "" -f "${KEYNAME}" -C "${USERNAME}" || true
chmod 400 ${KEYNAME}*
cat > ssh-keys <
```
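Parsing the table output works, but the fingerprints are also available as the map keys in the JSON form of the profile, which is less sensitive to column layout. The sample JSON below is fabricated; in practice, substitute `gcloud compute os-login describe-profile --format=json` and uncomment the remove command:

```shell
# Fabricated sample; stands in for:
#   json=$(gcloud compute os-login describe-profile --format=json)
json='{"sshPublicKeys":{"deadbeef01":{"key":"ssh-rsa AAA a"},"deadbeef02":{"key":"ssh-rsa AAA b"}}}'

# The keys of the sshPublicKeys map are the fingerprints that
# `ssh-keys remove --key` expects.
for fp in $(printf '%s' "$json" | python3 -c 'import json,sys; print("\n".join(json.load(sys.stdin)["sshPublicKeys"]))'); do
  echo "removing $fp"
  # gcloud compute os-login ssh-keys remove --key "$fp" || true
done
```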
A very crude way to do the above that worked for me was:

```
for i in $(gcloud compute os-login ssh-keys list); do
  echo $i
  gcloud compute os-login ssh-keys remove --key $i
done
```

I stopped this (with Control-C) after deleting a few tens of keys, and then it worked again. Actually, in the project metadata in the GUI, I do not see a lot of keys. Only:

```
gke...cidr : network-name...
sshKeys : gke-e9...
SSH Keys => peter_v : ssh-rsa my public key
```
In my case, I was using another service account to run ssh, so basically I'm using impersonation. If you are using impersonation too, you need to delete the ssh key list from the service account which you're impersonating:

```
for i in $(gcloud compute os-login ssh-keys list --impersonate-service-account="your_sc@serviceaccount.com" --format="table[no-heading](value.fingerprint)"); do
  echo $i
  gcloud compute os-login ssh-keys remove --key $i --impersonate-service-account="your_sc@serviceaccount.com" || true
done
```

Then add "--ssh-key-expire-after=7m" (the amount of time is defined by your needs):

```
gcloud compute ssh ${MY_VM} --zone ${GKE_ZONE} --project ${PROJECT_ID} --tunnel-through-iap --ssh-key-expire-after=7m --impersonate-service-account="your_sc@serviceaccount.com"
```
These keys are stored in your project metadata; you can remove them through the Google Cloud Console UI.
Seeing as you were mentioning OS Login in your question: there is a way to delete specific SSH keys from a user's profile using this command. Alternatively, instead of performing SCP, I'd advise you, much like John Hanley has, to put the file you're copying to the instance into Storage and retrieve it via a startup script (you could also use a custom Compute image).
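The Storage-plus-startup-script approach mentioned above can be sketched like this. The bucket and file names are hypothetical, and the final `gcloud` call is left commented out since it needs a real project and zone:

```shell
# Write a startup script that the VM runs at boot. It fetches the test
# payload from a (hypothetical) Cloud Storage bucket instead of using
# scp, so no per-job SSH key ever gets added to the OS Login profile.
cat > startup.sh <<'EOF'
#!/bin/bash
# Runs on the instance at first boot.
gsutil cp gs://my-ci-bucket/run-tests.sh /tmp/run-tests.sh
chmod +x /tmp/run-tests.sh
/tmp/run-tests.sh
EOF

# Attach it at create time (commented out; requires a real project):
# gcloud compute instances create test-vm \
#   --metadata-from-file startup-script=startup.sh

echo "wrote startup.sh"
```

Because the instance pulls its own workload, the CI job only needs to create the VM and poll for results, sidestepping the login-profile size limit entirely.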
Thanks @peter_v, got it resolved. :)



