I'm running an Ansible playbook and it works fine on one machine.
On a new machine, when I try it for the first time, I get the following error:
17:04:34 PLAY [appservers] *************************************************************
17:04:34
17:04:34 GATHERING FACTS ***************************************************************
17:04:34 fatal: [server02.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
17:04:34 fatal: [server01.cit.product-ref.dev] => {'msg': "FAILED: (22, 'Invalid argument')", 'failed': True}
17:04:34
17:04:34 TASK: [common | remove old ansible-tmp-*] *************************************
17:04:34 FATAL: no hosts matched or all hosts have already failed -- aborting
17:04:34
17:04:34
17:04:34 PLAY RECAP ********************************************************************
17:04:34 to retry, use: --limit @/var/lib/jenkins/site.retry
17:04:34
17:04:34 server01.cit.product-ref.dev : ok=0 changed=0 unreachable=1 failed=0
17:04:34 server02.cit.product-ref.dev : ok=0 changed=0 unreachable=1 failed=0
17:04:34
17:04:34 Build step 'Execute shell' marked build as failure
17:04:34 Finished: FAILURE
This error can be resolved if I first go to the source machine (the one from which I'm running the Ansible playbook), manually SSH to the target machine as the given user, and answer "yes" so that the host key is added to the known_hosts file.
Now, if I run the same Ansible playbook a second time, it works without an error.
So, how can I suppress the prompt that SSH shows when it adds a host's key to a given user's known_hosts file (~/.ssh/known_hosts) for the first time?
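(As an aside, one way to avoid the interactive prompt without disabling host key checking would be to pre-populate the known_hosts file before the first playbook run, for example with ssh-keyscan. This is only a sketch; the hostnames are copied from the log above purely for illustration.)

# Fetch the target hosts' public keys (hashed with -H) and append them
# to the current user's known_hosts file on the source machine
ssh-keyscan -H server01.cit.product-ref.dev server02.cit.product-ref.dev >> ~/.ssh/known_hosts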
I found I can suppress the prompt by using the following config entries in the ~/.ssh/config file.
~/.ssh/config
# For vapp virtual machines
Host *
    # Do not prompt to confirm unknown host keys on first connection
    StrictHostKeyChecking no
    # Do not record host keys at all
    UserKnownHostsFile=/dev/null
    User kobaloki
    # Silence the "Warning: Permanently added ..." messages
    LogLevel ERROR
That is, if I place the above entries in the user's ~/.ssh/config file on the source machine (the one running the playbook) and run the Ansible playbook against a new machine for the first time, I won't be prompted to enter "yes" and the playbook will run successfully (without requiring the user to manually create a known_hosts entry on the source machine for the target/remote machine).
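The same settings can also be passed per connection as plain ssh -o options, which is essentially what question 2 below asks for on the Ansible side. A sketch, using the first server from the log above purely as an example:

# Equivalent one-off SSH invocation with the options from the config file
ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o LogLevel=ERROR kobaloki@server01.cit.product-ref.dev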
My questions:
1. What security issues should I be aware of if I go the ~/.ssh/config way?
2. How can I pass these settings (the contents of the config file) as parameters/options to Ansible on the command line, so that the playbook runs the first time on a new machine without prompting or depending on a known_hosts entry on the source machine for the target machine? (See the sketch below.)
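A minimal sketch of what question 2 might look like in practice, assuming a reasonably recent Ansible; the inventory and playbook names are placeholders, and the ansible.cfg entries simply mirror the ~/.ssh/config options above rather than being a confirmed answer:

# Option A: disable host key checking for a single run via an environment variable
ANSIBLE_HOST_KEY_CHECKING=False ansible-playbook -i hosts site.yml

# Option B: make it persistent in ansible.cfg
# (next to the playbook, in ~/.ansible.cfg, or in /etc/ansible/ansible.cfg)
[defaults]
host_key_checking = False

[ssh_connection]
ssh_args = -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null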