I have computer 1 logging voltage data to a file, volts.json, every second. My second computer connects via SSH and grabs that file every 5 minutes, and Splunk indexes the file for a dashboard.

Is scp efficient for this? If so, fine. Next, how do I manage the file and keep it small, say under 2 MB? Is there a command to roll off the earlier logs and keep only the newest?

The JSON looks like this right now:
{
"measuredatetime": "2022-06-27T18:00:10.915668",
"voltage": 207.5,
"current_A": 0.0,
"power_W": 0.0,
"energy_Wh": 2,
"frequency_Hz": 60.0,
"power_factor": 0.0,
"alarm": 0
}
{
"measuredatetime": "2022-06-27T18:00:11.991936",
"voltage": 207.5,
"current_A": 0.0,
"power_W": 0.0,
"energy_Wh": 2,
"frequency_Hz": 59.9,
"power_factor": 0.0,
"alarm": 0
}
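One simple way to roll off older entries, assuming every record is exactly 10 lines (opening brace, eight fields, closing brace) as in the sample above, is to trim the file to the newest N records with `tail`. This is only a sketch: it is not atomic against the logger appending mid-trim, and rotation inside the Python script itself (or logrotate with `copytruncate`) would be more robust.

```shell
# Keep only the newest 100 records (10 lines per record = 1000 lines).
# Assumes a fixed 10-line record layout, as in the sample above.
tail -n 1000 volts.json > volts.json.tmp && mv volts.json.tmp volts.json
```

At ~200 bytes per record, 100 records keeps the file well under 2 MB; raise the line count if you want a longer window.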
scp is ok; you might want to add -C if you're not doing it already, since that kind of data will be compressed a lot. The other questions depend on the program doing the logging and the program doing the rendering. You could also mount via sshfs and follow the file directly, for example. – Eduardo Trápani Jun 28 '22 at 00:49

rsync, as stated here, supports lots of compression options. – k.Cyborg Jun 28 '22 at 11:57

@Kusalananda It's a Python script; the file is written in append mode. Thanks, I thought about rsync but wasn't sure. @k.Cyborg – Tom Jun 28 '22 at 12:21
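The transfer suggestions in the comments amount to something like the following sketch; the host and paths here are placeholders, not from the question.

```shell
SRC=user@computer1:/home/user/volts.json   # hypothetical host and path
DEST=/opt/splunk/incoming/                 # hypothetical Splunk input dir

# -C compresses over the wire; repetitive JSON shrinks a lot.
scp -C "$SRC" "$DEST"

# rsync alternative: -z compresses, -a preserves attributes, and rsync
# skips data the destination already has, so repeated pulls stay cheap.
rsync -az "$SRC" "$DEST"
```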