Timeline for "Calculate max TPS from log file"
Current License: CC BY-SA 3.0
8 events
| When | What | By | License | Comment |
|---|---|---|---|---|
| Nov 10, 2015 at 19:21 | vote: accept | Pancho | | |
| Nov 10, 2015 at 19:15 | answer: added | Jeff Schaller♦ | | timeline score: 0 |
| Nov 10, 2015 at 16:56 | comment: added | Pancho | | Yes, the input is sorted on the second column (c12-c19), so it's OK to have `uniq` before `sort` in this case... and yes, that solution works for me. I'm used to using `sort -u` and didn't know about the `-c` option for `uniq`. I tried it and it now takes minutes vs. hours previously! Thanks so much! If you post the answer I'll select it. |
| Nov 9, 2015 at 20:57 | comment: added | Jeff Schaller♦ | | I would normally `sort` before `uniq`, but I'm relying on the OP's claim that the input is sorted. |
| Nov 9, 2015 at 20:54 | comment: added | David King | | @JeffSchaller I think you want to switch the `uniq` and `sort` in your pipeline. `uniq` only combines (and counts) matching adjacent lines. `cut -c12-19 $log_file \| sort -rn \| uniq -c \| head -1` |
| Nov 9, 2015 at 20:49 | comment: added | Jeff Schaller♦ | | I may not understand your problem, but is `cut -c12-19 $log_file \| uniq -c \| sort -rn \| head -1` what you're looking for? |
| Nov 9, 2015 at 20:48 | review: First posts | | | Completed Nov 9, 2015 at 21:34 |
| Nov 9, 2015 at 20:43 | history: asked | Pancho | CC BY-SA 3.0 | |
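
The pipeline worked out in the comments above extracts the per-second timestamp from each log line, counts how many lines share each second, and reports the largest count, which is the peak TPS. A minimal sketch, assuming the timestamp (presumably `HH:MM:SS`) occupies character columns 12-19 of every line, as stated in the comments, and that the log file is already sorted by time so identical seconds are adjacent when `uniq -c` counts them; the log path below is a hypothetical placeholder:

```sh
#!/bin/sh
# Placeholder path; point this at the actual log file.
log_file=/var/log/app/transactions.log

# cut -c12-19 : keep only the timestamp portion of each line (columns 12-19)
# uniq -c     : count runs of identical seconds (valid because the log is time-sorted)
# sort -rn    : order the counts numerically, largest first
# head -1     : print the busiest second and its count, i.e. the max TPS
cut -c12-19 "$log_file" | uniq -c | sort -rn | head -1
```

If the input were not already sorted on the timestamp column, `uniq -c` would undercount, since it only merges adjacent duplicate lines, which is the point David King's comment raises; in that case sort the extracted timestamps first, e.g. `cut -c12-19 "$log_file" | sort | uniq -c | sort -rn | head -1`.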