[nailgun agent] untimely lshw command run observation
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
Fuel for OpenStack | Fix Released | High | Alexey Elagin |
8.0.x | Fix Released | High | Sergii Rizvan |
Mitaka | Fix Released | High | Sergii Rizvan |
Newton | Fix Committed | High | Sergii Rizvan |
Bug Description
Description
============
I frequently observe the lshw process consuming a full CPU core. On my environment it runs every minute for ~20 seconds (real hardware, MOS 8, controller node; I presume this is true for all nodes).
Why is this command run so frequently?
root 30762 0.0 0.0 4440 652 ? Ss 15:03 0:00 \_ /bin/sh -c flock -w 0 -o /var/lock/
.log | /usr/bin/logger -t nailgun-agent"
root 30763 0.0 0.0 5896 608 ? S 15:03 0:00 \_ flock -w 0 -o /var/lock/
/usr/bin/logger -t nailgun-agent
root 30764 0.0 0.0 4440 628 ? S 15:03 0:00 \_ /bin/sh -c /usr/bin/
root 30765 0.7 0.0 175408 23296 ? Sl 15:03 0:00 \_ ruby /usr/bin/
root 33668 96.1 0.1 501008 475844 ? R 15:03 0:17 | \_ /usr/bin/lshw -json
root 30766 0.0 0.0 5916 696 ? S 15:03 0:00 \_ tee -a /var/log/
root 30767 0.0 0.0 5908 704 ? S 15:03 0:00 \_ /usr/bin/logger -t nailgun-agent
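The tree above is the usual cron wrapper pattern: flock -w 0 aborts immediately if a previous run still holds the lock (preventing overlapping runs), tee appends the agent's output to a log file, and logger mirrors it to syslog under the nailgun-agent tag. A hedged reconstruction of that pattern, with the truncated lock, log, and agent paths replaced by placeholders:

# Sketch only; the real lock, log, and agent paths are truncated in the listing above.
flock -w 0 -o /var/lock/nailgun-agent.lock \
  -c "/usr/bin/nailgun-agent 2>&1 | tee -a /var/log/nailgun-agent.log | /usr/bin/logger -t nailgun-agent"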
Expected result/impact
======================
In my opinion this is overkill: it wastes CPU time for no benefit, because this kind of information is nearly static (hardware does not change every minute). The nailgun agent should run lshw once, or at least much less frequently if some use case requires fresh hardware information. That leaves aside the possible interference with the actual workload, which may be impacted as well.
Please, could somebody confirm and fix this if accurate.
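One hedged sketch of the suggested fix: compute a cheap hardware fingerprint first, and only invoke the expensive lshw -json when that fingerprint changes. The cache paths and fingerprint sources below are illustrative assumptions, not the agent's actual implementation:

# Sketch only: refresh the cached lshw report when a cheap hardware
# fingerprint (PCI devices + block devices) changes; paths are hypothetical.
hash=$( (lspci -n; cat /proc/partitions) | md5sum | cut -d' ' -f1)
if [ ! -f /var/cache/lshw.hash ] || [ "$(cat /var/cache/lshw.hash)" != "$hash" ]; then
    /usr/bin/lshw -json > /var/cache/lshw.json
    printf '%s\n' "$hash" > /var/cache/lshw.hash
fi
# Consumers read /var/cache/lshw.json instead of re-running lshw every minute.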
Steps to reproduce
==================
Deploy an environment with Fuel MOS 8.
Changed in fuel:
  milestone: none → 9.0
  assignee: nobody → Fuel Python Team (fuel-python)
tags: added: area-python
tags: added: module-nailgun-agent removed: area-python
tags: added: area-python
Changed in fuel:
  status: New → Confirmed
Changed in fuel:
  importance: Undecided → Medium
Changed in fuel:
  assignee: Fuel Python Team (fuel-python) → Andrey Danin (gcon-monolake)
Changed in fuel:
  assignee: Andrey Danin (gcon-monolake) → Alexey Elagin (aelagin)
Changed in fuel:
  milestone: 9.0 → 10.0
Changed in fuel:
  status: Confirmed → Incomplete
tags: added: 9.1-proposed
tags: added: on-verification
On scale-lab hardware the average CPU utilization for lshw is 62%. Here are the details collected by atop (10-minute interval):
$ atop -r atop_20160316 -P PRC | grep lshw | head -n 22
PRC node-293 1458086421 2016/03/16 00:00:21 20 28722 (lshw) R 100 1592 46 0 120 0 0 5 0
PRC node-293 1458086441 2016/03/16 00:00:41 20 28722 (lshw) E 100 2492 88 0 0 0 0 0 0
PRC node-293 1458086501 2016/03/16 00:01:41 20 37072 (lshw) R 100 1283 35 0 120 0 0 16 0
PRC node-293 1458086521 2016/03/16 00:02:01 20 37072 (lshw) E 100 2240 58 0 0 0 0 0 0
PRC node-293 1458086541 2016/03/16 00:02:21 20 40720 (lshw) R 100 758 19 0 120 0 0 31 0
PRC node-293 1458086561 2016/03/16 00:02:41 20 40720 (lshw) E 100 2300 62 0 0 0 0 0 0
PRC node-293 1458086601 2016/03/16 00:03:21 20 7711 (lshw) R 100 538 23 0 120 0 0 7 0
PRC node-293 1458086621 2016/03/16 00:03:41 20 7711 (lshw) R 100 1934 64 0 120 0 0 2 0
PRC node-293 1458086641 2016/03/16 00:04:01 20 7711 (lshw) E 100 2529 91 0 0 0 0 0 0
PRC node-293 1458086681 2016/03/16 00:04:41 20 15749 (lshw) R 100 1846 55 0 120 0 0 23 0
PRC node-293 1458086701 2016/03/16 00:05:01 20 15749 (lshw) E 100 676 41 0 0 0 0 0 0
PRC node-293 1458086741 2016/03/16 00:05:41 20 22965 (lshw) R 100 1399 45 0 120 0 0 8 0
PRC node-293 1458086761 2016/03/16 00:06:01 20 22965 (lshw) E 100 2596 103 0 0 0 0 0 0
PRC node-293 1458086781 2016/03/16 00:06:21 20 26855 (lshw) R 100 1176 25 0 120 0 0 5 0
PRC node-293 1458086801 2016/03/16 00:06:41 20 26855 (lshw) E 100 2236 43 0 0 0 0 0 0
PRC node-293 1458086841 2016/03/16 00:07:21 20 34339 (lshw) R 100 373 9 0 120 0 0 7 0
PRC node-293 1458086861 2016/03/16 00:07:41 20 34339 (lshw) R 100 1936 62 0 120 0 0 28 0
PRC node-293 1458086881 2016/03/16 00:08:01 20 34339 (lshw) E 100 2478 86 0 0 0 0 0 0
PRC node-293 1458086921 2016/03/16 00:08:41 20 1665 (lshw) R 100 1034 35 0 120 0 0 30 0
PRC node-293 1458086941 2016/03/16 00:09:01 20 1665 (lshw) E 100 2547 96 0 0 0 0 0 0
PRC node-293 1458086961 2016/03/16 00:09:21 20 5307 (lshw) R 100 1254 29 0 120 0 0 30 0
PRC node-293 1458086981 2016/03/16 00:09:41 20 5307 (lshw) E 100 2215 56 0 0 0 0 0 0
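For reference, the 62% figure can be re-derived from the parseable PRC records above. In atop's PRC output, field 10 is the machine's clock ticks per second (100 here) and fields 11 and 12 are the process's user- and system-mode CPU ticks for the sample; summing them over the window and dividing by the elapsed ticks gives the average share of one CPU. A sketch, assuming a full 600-second window as the timestamps above indicate:

$ atop -r atop_20160316 -P PRC | grep lshw | head -n 22 | \
    awk '{ ticks += $11 + $12 } END { printf "%.1f%%\n", 100 * ticks / (600 * 100) }'

This works out to roughly the reported average.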