I have computer 1 logging voltage data to a file volts.json every second.
My second computer connects via ssh and grabs that file every 5 minutes. Splunk indexes that file for a dashboard.
Is scp efficient for this? If so, fine. Next question: how do I manage the file and keep it small, say under 2 MB, instead of letting it grow forever? Is there a command to roll off the earlier logs and keep only the newest?
the json looks like this right now:
{ "measuredatetime": "2022-06-27T18:00:10.915668", "voltage": 207.5, "current_A": 0.0, "power_W": 0.0, "energy_Wh": 2, "frequency_Hz": 60.0, "power_factor": 0.0, "alarm": 0 }
{ "measuredatetime": "2022-06-27T18:00:11.991936", "voltage": 207.5, "current_A": 0.0, "power_W": 0.0, "energy_Wh": 2, "frequency_Hz": 59.9, "power_factor": 0.0, "alarm": 0 }
scp is ok; you might want to add -C if you're not doing it already, since that kind of data will compress a lot. The other questions depend on the program doing the logging and the program doing the rendering. You could also mount via sshfs and follow the file directly, for example. rsync, as stated here, supports lots of compression options.
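As for rolling off the earlier records: since the file is one JSON object per line, keeping only the newest N lines bounds its size. A minimal sketch below — the KEEP value and the seq/sed lines that fake 2000 records are assumptions standing in for your real logger, not part of your setup:

```shell
#!/bin/sh
# Sketch: cap the log at the newest KEEP lines. Run on computer 1
# (e.g. from cron) before each 5-minute pull.
LOG=volts.json
KEEP=1000   # assumed cap; tune so KEEP lines stays under ~2 MB

# stand-in for the real logger: write 2000 one-line JSON records
seq 1 2000 | sed 's/^/{"voltage": 207.5, "n": /; s/$/}/' > "$LOG"

# roll: copy the newest KEEP lines aside, then swap the file in place
tail -n "$KEEP" "$LOG" > "$LOG.tmp" && mv "$LOG.tmp" "$LOG"
```

One caveat: the mv swap is only safe if the logger opens, appends, and closes the file on each write (typical for a once-a-second script). If the logger holds the file open, its writes will keep going to the old, now-unlinked inode; in that case use logrotate with its copytruncate option instead, which truncates the original file in place.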
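For the transfer itself, a command-line sketch of both options mentioned above — the user, host, and paths are hypothetical placeholders, so adjust them to your machines:

```shell
# scp with -C compresses in transit; JSON text compresses well
scp -C pi@computer1:/home/pi/volts.json /opt/splunk/incoming/

# rsync -z also compresses, and its delta transfer means a 5-minute
# pull of an append-mostly log sends little more than the new lines
rsync -az pi@computer1:/home/pi/volts.json /opt/splunk/incoming/
```

Either one works from cron on the second computer; rsync is the better fit once the file gets larger, since scp always resends the whole file.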