HPE ILO6 InvalidOperationForSystemState during power state set

Bug #2016307 reported by Samuel Kunkel
This bug affects 1 person
Affects Status Importance Assigned to Milestone
Sushy
New
Undecided
Unassigned

Bug Description

During power state handling sushy fails to set power state for HPE ILO6.

So far this is only observable when setting the power state to down, and the error occurs only sporadically.

Component Version:
Ironic (Conductor, API): Zed
sushy==4.4.1

Traceback:
https://paste.debian.net/1277349/
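Since the failure is sporadic, one common client-side mitigation is to retry the reset request when the BMC momentarily reports it cannot accept the operation. The sketch below is illustrative only, not the actual sushy fix: the retry helper and the `TransientBMCError` stand-in are hypothetical names, simulating a BMC that rejects the first `Reset` request (as iLO 6 does with `InvalidOperationForSystemState`) and accepts a later one.

```python
import time


class TransientBMCError(Exception):
    """Stand-in for a transient Redfish error such as
    iLO.2.14.InvalidOperationForSystemState."""


def set_power_state_with_retry(reset_fn, attempts=3, delay=0.01):
    """Invoke reset_fn, retrying when the BMC transiently rejects the
    request; re-raise once the retry budget is exhausted."""
    for attempt in range(1, attempts + 1):
        try:
            return reset_fn()
        except TransientBMCError:
            if attempt == attempts:
                raise
            time.sleep(delay)


# Simulate a BMC that rejects the first request and accepts the second.
calls = {"n": 0}


def fake_reset():
    calls["n"] += 1
    if calls["n"] == 1:
        raise TransientBMCError("InvalidOperationForSystemState")
    return "ForceOff accepted"


result = set_power_state_with_retry(fake_reset)
```

In a real deployment the callable would wrap something like `system.reset_system(...)` on a sushy `System` resource; the point is only that a bounded retry absorbs a momentary bad-state rejection without masking a persistent failure.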

Revision history for this message
Samuel Kunkel (xshyve) wrote :

INFO ironic.conductor.task_manager [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Node cdef5b7c-2650-4175-9ebd-1a10686bcac2 moved to provision state "deleting" from state "active"; target provision state is "available"
INFO eventlet.wsgi.server [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] ::ffff:10.255.106.2 "POST / HTTP/1.1" status: 200 len: 211 time: 0.4214077
INFO ironic.conductor.utils [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Node cdef5b7c-2650-4175-9ebd-1a10686bcac2 current power state is 'power on', requested state is 'power off'.
INFO ironic.conductor.utils [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Successfully set node cdef5b7c-2650-4175-9ebd-1a10686bcac2 power state to power off by power off.
INFO ironic.conductor.manager [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Successfully unprovisioned node cdef5b7c-2650-4175-9ebd-1a10686bcac2 with instance None.
INFO ironic.conductor.task_manager [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Node cdef5b7c-2650-4175-9ebd-1a10686bcac2 moved to provision state "cleaning" from state "deleting"; target provision state is "available"
ERROR oslo.service.loopingcall [-] Dynamic backoff interval looping call 'ironic.conductor.utils.node_wait_for_power_state.<locals>._wait' failed: oslo_service.loopingcall.LoopingCallTimeOut: Looping call timed out after 69.25 seconds
ERROR oslo.service.loopingcall Traceback (most recent call last):
ERROR oslo.service.loopingcall File "/usr/venv-ironic/lib64/python3.8/site-packages/oslo_service/loopingcall.py", line 154, in _run_loop
ERROR oslo.service.loopingcall idle = idle_for_func(result, self._elapsed(watch))
ERROR oslo.service.loopingcall File "/usr/venv-ironic/lib64/python3.8/site-packages/oslo_service/loopingcall.py", line 349, in _idle_for
ERROR oslo.service.loopingcall raise LoopingCallTimeOut(
ERROR oslo.service.loopingcall oslo_service.loopingcall.LoopingCallTimeOut: Looping call timed out after 69.25 seconds
ERROR oslo.service.loopingcall
ERROR ironic.conductor.utils [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Timed out after 60 secs waiting for power on on node cdef5b7c-2650-4175-9ebd-1a10686bcac2.: oslo_service.loopingcall.LoopingCallTimeOut: Looping call timed out after 69.25 seconds
ERROR ironic.conductor.utils [None req-01c83d87-b8e2-469c-bdc0-3c5d51edb050 a1336539b1204341a2cc47d59cda0837 8299bb6b4a4b4af5a7ca24d31a9cc42d - - default default] Failed to prepare node cdef5b7c-2650-4175-9ebd-1a10686bcac2 for cleaning: Failed to set node power state to power on.: ironic.common.exception.PowerStateFailure: Failed to set node pow...


Revision history for this message
Julia Kreger (juliaashleykreger) wrote :

So, we've run into a similar issue with Dell hardware in the past. I'll propose a patch, but without more detailed logs (i.e. ones indicating the time of the events being logged), I'm unsure whether it will fix it or not.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix included in openstack/sushy 4.5.0

This issue was fixed in the openstack/sushy 4.5.0 release.

