Timeline for "Keep two entire Linux desktop systems in sync"
Current License: CC BY-SA 4.0
22 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Oct 14 at 9:33 | vote accept | daniele_athome | | |
| Sep 3 at 8:27 | answer added | Basile Starynkevitch | | timeline score: 1 |
| Sep 3 at 7:41 | answer added | jpa | | timeline score: 2 |
| Sep 2 at 16:59 | comment added | duct_tape_coder | | On the Windows side, online cloud providers like OneDrive offer disk syncing. The files may not be immediately available on both systems, but they are represented on both sides, and accessing a file on the system where it is not currently present triggers a seamless download in the background. I don't know if there's an official Linux equivalent, but there's a GitHub project for OneDrive support: github.com/abraunegg/onedrive (sketch below the table). |
| Sep 2 at 16:26 | comment added | Vladimir Cravero | | Well, keep in mind that a VM is a turnkey solution, @daniele_athome - it will just work and is super easy to set up; there is no need to manage what you sync and what you don't, it is literally the same computer every time you fire it up. It also makes backing up the system trivial. |
| Sep 2 at 13:30 | comment added | reinierpost | | Can you guarantee they will never be used at the same time? If not, the task is only doable if you share all data files across both systems, including databases, browser profiles, manually edited config files, etc., but excluding system logs. You could then use a distributed file system spanning both systems for redundancy. |
| Sep 2 at 13:20 | answer added | rugk | | timeline score: 3 |
| Sep 2 at 12:07 | comment added | daniele_athome | | @VladimirCravero OK, I understand what you mean now: I thought you meant accessing the VM disk image through a remote file system of sorts (while the VM is running). I'll keep your approach in mind; it certainly has its advantages. |
| Sep 2 at 11:44 | comment added | Vladimir Cravero | | @DanieleRicci well, you'd need to be "online" only to sync the disk between the machines. Regarding performance, I am hesitant to dismiss this with "too many reads/writes" - hypervisor software is really good nowadays and only gets tricky if you need graphics acceleration, to be honest. |
| Sep 2 at 8:45 | answer added | William Hay | | timeline score: 0 |
| Sep 2 at 8:26 | comment added | daniele_athome | | @VladimirCravero not suitable for a development environment, sorry (too many reads/writes); I also need to be able to work offline. |
| Sep 2 at 8:25 | comment added | daniele_athome | | @MarkSetchell actually I'm already a happy user of syncthing, but I use it only for a small subset of data that I need to sync among my devices. I was considering using it for this as well, but I'd rather stick to basic Linux tools (I would eventually publish the source, if I ever write it :-) |
| Sep 2 at 6:42 | comment added | Vladimir Cravero | | This is not worthy of an answer, but if you can accept the performance penalty you could use a virtual machine and keep the disk image in a shared folder of some sort (sketch below the table). |
| Sep 1 at 19:38 | comment added | Mark Setchell | | You might find syncthing useful for some of your $HOME data (sketch below the table). |
| Sep 1 at 18:47 | history became hot network question | | | |
| Sep 1 at 15:45 | history edited | Kusalananda♦ | | edited tags |
| Sep 1 at 13:31 | comment added | oldfred | | I have a laptop, a desktop, two full installs on an external SSD, and several flash drives. I use rsync, but really only for data. My /home is inside my / and I only sync it when I do a new install; the same goes for an exported list of apps, though that is now mostly in an install script. I regularly use rsync to copy data partitions. I have to be careful with --delete: my desktop is the main system, so I use --delete when copying to the others, but not when copying back to the desktop. The desktop also has a full backup from the NVMe drive to an HDD. All rsync commands are in script files unique to each device, as I mount by label (sketch below the table). |
| Sep 1 at 12:17 | answer added | Marcus Müller | | timeline score: 10 |
| Sep 1 at 11:47 | comment added | daniele_athome | | I'd like to avoid network latency if possible - also, one of my requirements is disaster recovery, so data must be duplicated on both machines. |
| Sep 1 at 11:15 | comment added | td211 | | What about placing your /home on a shared drive (maybe an external SSD for rsyncing)? Or mounting /home at boot from a network-shared disk? As for system-related files, I don't think it is a good idea to copy them back and forth; it may work, or fail in mysterious ways. Just install the same packages using apt (sketch below the table). |
| Sep 1 at 10:36 | review First questions (completed Sep 1 at 15:23) | | | |
| Sep 1 at 10:36 | history asked | daniele_athome | CC BY-SA 4.0 | |
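
Some of the approaches raised in the comments above can be sketched as shell commands; everything below is an illustration with placeholder paths and hostnames, not commands taken from the thread. First, oldfred's rsync scheme: designate one machine as the main system and only let `--delete` run in the direction away from it.

```sh
# Hypothetical sketch of oldfred's rsync workflow. /mnt/data is a placeholder
# for a data partition mounted by label; "laptop" and "desktop" are SSH hosts.

# Main desktop -> laptop: mirror the data, propagating deletions.
rsync -aHAX --delete /mnt/data/ laptop:/mnt/data/

# Laptop -> main desktop: copy new and changed files, but never delete,
# so a mistake on the laptop cannot wipe files on the main system.
rsync -aHAX /mnt/data/ desktop:/mnt/data/
```

The trailing slashes matter: they tell rsync to copy the contents of the directory rather than the directory itself.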
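
td211's suggestion of installing the same packages with apt instead of copying system files can be scripted; a minimal sketch, assuming both machines run the same Debian/Ubuntu release (file and host names are hypothetical):

```sh
# On the reference machine: record the manually installed packages.
apt-mark showmanual > manual-packages.txt

# On the other machine: install the same set.
xargs -a manual-packages.txt sudo apt-get install -y

# The other half of that comment, mounting /home from a network share
# instead of syncing it, would be an /etc/fstab entry along these lines
# (the export "fileserver:/export/home" is made up for the example):
#   fileserver:/export/home  /home  nfs  defaults,_netdev  0  0
```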
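
Mark Setchell's syncthing suggestion needs little more than installing the package on both machines and pairing them; a sketch, assuming the distribution ships Syncthing's systemd user unit (most do):

```sh
# Install and run Syncthing as a per-user service on each machine.
sudo apt-get install syncthing
systemctl --user enable --now syncthing.service

# Pair the two devices and choose which $HOME folders to share in the
# web UI, which listens on http://127.0.0.1:8384 by default.
```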
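
Vladimir Cravero's virtual machine idea boils down to booting the same disk image from whichever physical machine you are at; a QEMU/KVM sketch with placeholder paths and sizes (the image must only ever be running on one host at a time):

```sh
# Create the shared disk image once, on storage both machines can reach
# (a synced folder, an external SSD, or a network share).
qemu-img create -f qcow2 /shared/dev-vm.qcow2 100G

# Boot the same VM from either machine.
qemu-system-x86_64 \
  -enable-kvm \
  -m 8G -smp 4 \
  -drive file=/shared/dev-vm.qcow2,format=qcow2,if=virtio \
  -nic user,model=virtio-net-pci
```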
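
For completeness, the unofficial OneDrive client mentioned by duct_tape_coder (github.com/abraunegg/onedrive) is driven from the command line. Flag names have changed between releases, so treat the following as an assumption and check `onedrive --help` on your version:

```sh
# One-shot synchronisation of the local OneDrive folder
# (--sync in newer releases, --synchronize in older ones).
onedrive --synchronize

# Keep running and pick up changes as they happen.
onedrive --monitor
```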