How do you guys quickly sync your settings (especially bash aliases and ssh keys) across your machines?
Ideally I want a simple script to run on every new server I work with. Any suggestions?
I suggest you don’t sync SSH keys. That’s just increasing the blast radius of any one of those machines being compromised.
Exactly this. Don’t move private keys between machines. Generate them where you need them, it’s not like they cost anything
Right. Use some kind of centralized authentication like freeipa.
For bash aliases, I just pull down a .bashrc from github gists.
OP should just generate a unique SSH key per device (+ user).
Agreed. I’ve probably got 100 keys registered with GitHub and 98 of them the private key is long destroyed due to OS reinstalls or whatnot. Format machine, new key. New machine, new key.
FYI: You can remove the old keys from GitHub.
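Since a fresh key really is just one command, here's a minimal sketch (file path and comment format are just examples; you may want a passphrase instead of -N ""):

```shell
# Generate a new Ed25519 keypair for this machine
ssh-keygen -t ed25519 -C "$(whoami)@$(hostname)" -f ~/.ssh/id_ed25519 -N ""

# Then register only the *public* half with GitHub or the target server, e.g.:
# ssh-copy-id -i ~/.ssh/id_ed25519.pub user@server
```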
Is the url easy to remember?
I mean, you want to copy the public keys that represent your machines, right?
I’m surprised no one mentioned ansible yet. It’s meant for this (and more).
By ssh keys I assume you’re talking about authorized_keys, not private keys. I agree with other posters that private keys should not be synced, just generate new ones and add them to the relevant servers authorized_keys with ansible.
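For what it's worth, the core of what an Ansible authorized_key task does can be sketched in plain shell (paths assumed; Ansible additionally handles remote execution, templating, and idempotent state for you):

```shell
# Idempotently append this machine's public key to authorized_keys (sketch only)
pub="$(cat ~/.ssh/id_ed25519.pub)"
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# grep -qxF: only append if the exact line isn't already present
grep -qxF "$pub" ~/.ssh/authorized_keys 2>/dev/null \
    || echo "$pub" >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```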
If the keys are password protected… eh why not sync them.
Also ssh certificates are a thing, they make doing that kind of stuff way easier instead of updating known hosts and authorized keys all the time
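For anyone curious, the certificate workflow looks roughly like this (all file names here are illustrative):

```shell
# One-time: create a CA keypair (keep the private half somewhere safe)
ssh-keygen -t ed25519 -f user_ca -N "" -C "user-ca"

# Sign a user's public key: -I is the cert identity (shows up in server logs),
# -n the allowed principals, -V the validity window
ssh-keygen -s user_ca -I alice@laptop -n alice -V +52w id_ed25519.pub
# produces id_ed25519-cert.pub next to the public key
```

Servers then trust every key the CA has signed via a single sshd_config line (`TrustedUserCAKeys /etc/ssh/user_ca.pub`), so you stop editing authorized_keys per machine.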
I use Ansible for this as well. It’s great. I encrypt secrets with Ansible vault and then use it to set keys, permissions, config files, etc. across my various workstations. Makes setup and troubleshooting a breeze.
Dotfiles go in git, SSH keys are state.
I’m looking to migrate to home-manager though because I use Nix on all my devices anyways.
Home manager is great
I also have multiple versions of my bash_profile with syntax specific to the OS. It checks whether we’re on macOS or Linux with a kernel check and then reads the appropriate ancillary bash_profile for that platform. Anything that can live in the main bash_profile with the same command on both platforms lives there, and anything that needs to be system-specific is in the other one.
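That kernel check can be as simple as a case on uname; a sketch (the ancillary file names are made up):

```shell
# In the shared ~/.bash_profile: dispatch to an OS-specific file
case "$(uname -s)" in
    Darwin) [ -f "$HOME/.bash_profile.macos" ] && . "$HOME/.bash_profile.macos" ;;
    Linux)  [ -f "$HOME/.bash_profile.linux" ] && . "$HOME/.bash_profile.linux" ;;
esac
```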
I have all my important functions as individual files that get loaded with the following:
function loadfuncs() {
    local funcdir="$HOME/.dotfiles/functions/"
    [ -e "${funcdir}.DS_Store" ] && rm "${funcdir}.DS_Store"
    local n=0
    for i in "${funcdir}"*; do
        source "$(realpath "$i")"
        n=$(( n + 1 ))
    done
}
loadfuncs
This looks popular: www.chezmoi.io
+1 this, it is amazing. The scripting features are the cherry on top.
Git and GNU stow.
I love this solution, I’ve been using it for years. I had previously just been using the home directory is a git repo approach, and it never quite felt natural to me and came with quite a few annoyances. Adding stow to the mix was exactly what I needed.
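For anyone who hasn't seen it, stow's whole job is maintaining a symlink farm from a directory that mirrors your home; usage is roughly this (layout assumed):

```shell
# Layout: one subdirectory per "package" whose contents mirror $HOME
#   ~/dotfiles/bash/.bashrc
#   ~/dotfiles/git/.gitconfig
cd ~/dotfiles
stow -t "$HOME" bash git   # creates ~/.bashrc -> ~/dotfiles/bash/.bashrc, etc.
stow -D -t "$HOME" bash    # -D removes the symlinks again
```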
This is the only answer for me. Bonus points if your .login file does a background git pull.
Ditto – I’ve been keeping a central-to-me git repo for my settings for years. On any new machine it’s ‘git clone ; ./settings/setup.sh’, then the pulled .profile does a git pull on login.
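In other words, something like this (repo URL and paths are placeholders):

```shell
# New machine bootstrap:
git clone git@example.com:me/settings.git ~/settings
~/settings/setup.sh

# And at the end of ~/.profile, refresh quietly in the background on each login:
(cd ~/settings && git pull --quiet >/dev/null 2>&1 &)
```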
Syncthing. If you want flatpak, syncthingy.
It’s simply the best; it does all the annoying background things like the web UI, machines, versioning, verifying, etc. If you disable global discovery you can use it through LAN only.
yadm
Yet Another Stow-Based Dotfile Sync Manager
yas-bdsm
On my devices like PCs, laptops or phones, syncthing syncs all my .rc files, configs, keys, etc.
For things like servers, routers, etc. I rely on OpenSSH’s ability to pass environment variables, and use that to ship my aliases and functions over.
On the remote I have
[ -n "$SSH_CONNECTION" ] && eval "$(echo "$LC_RC" | { { base64 -d || openssl base64 -d; } | gzip -d; } 2>/dev/null)"
in whatever is loaded when I connect (.bashrc, usually)
On the local machine
alias ssh="$([ -z "$SSH_CONNECTION" ] && echo 'LC_RC=$(gzip < ~/.rc | base64 -w 0)') ssh"
That’s not the best way to do that by any means (it doesn’t work with dropbear, for example), but for cases like that I have other non-generic, one-off solutions.
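One detail worth noting: this rides on sshd accepting LC_* variables, which many distros allow in their default config. The relevant settings look roughly like this (sketch, not a complete config):

```shell
# Server side: /etc/ssh/sshd_config must accept the variable.
# Many distros already ship something like:
#   AcceptEnv LANG LC_*
#
# Client side: instead of the alias, ~/.ssh/config can send it explicitly:
#   Host *
#       SendEnv LC_RC
```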
Have you considered a shared folder with Syncthing?
I use a git repo combined with the basic install utility. Clone the repo, run the app installer, then run the install script. For symlinks I just use a zsh script.
Thanks that’s a good idea.
I keep my dotfiles in a git repo and just do a
git pull
to update them. That could definitely be a cron job if you needed.
SSH keys are a little trickier. I’d like to tell you I have a unique key for each of my desktop machines, since that would be best practice, but that’s not the case. Instead I have a Syncthing shared folder. When I get around to cleaning that up, I’ll probably do just that and keep an
authorized_keys
and
known_hosts
file in git so I can pull them to the hosts that need them, with a cron job to keep them updated.
Several good suggestions on here already. Home manager might be another approach.
My solution is not ideal:
I created a directory called ~/config_sync and create symlinks for config files, like ~/.bashrc to ~/config_sync/bashrc.
However, I need to record the symlinks I’ve created and repeat the process on new machines.
Look into using GNU stow! It’s exactly what you’re doing but it creates the symlinks for you.