The init_host call stacktraces on startup, killing the service thread:
http://logs.openstack.org/49/484949/14/check/gate-tempest-dsvm-ironic-ipa-wholedisk-bios-agent_ipmitool-tinyipa-ubuntu-xenial-nv/69b18d7/logs/screen-n-cpu.txt.gz?level=TRACE
Jul 24 22:26:26.423871 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service [None req-414af81a-44e7-4a67-9cbd-74e549deb41b None None] Error starting thread.: InternalServerError: Internal Server Error (HTTP 500)
Jul 24 22:26:26.423993 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service Traceback (most recent call last):
Jul 24 22:26:26.424084 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/oslo_service/service.py", line 721, in run_service
Jul 24 22:26:26.424167 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service service.start()
Jul 24 22:26:26.424250 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/service.py", line 143, in start
Jul 24 22:26:26.424330 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service self.manager.init_host()
Jul 24 22:26:26.424417 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/compute/manager.py", line 1100, in init_host
Jul 24 22:26:26.424498 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service self.driver.init_host(host=self.host)
Jul 24 22:26:26.424638 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/virt/ironic/driver.py", line 449, in init_host
Jul 24 22:26:26.424717 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service self._refresh_cache()
Jul 24 22:26:26.424798 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/virt/ironic/driver.py", line 608, in _refresh_cache
Jul 24 22:26:26.425096 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service for node in self._get_node_list(detail=True, limit=0):
Jul 24 22:26:26.425187 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/virt/ironic/driver.py", line 527, in _get_node_list
Jul 24 22:26:26.425278 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service node_list = self.ironicclient.call("node.list", **kwargs)
Jul 24 22:26:26.425359 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/opt/stack/new/nova/nova/virt/ironic/client_wrapper.py", line 146, in call
Jul 24 22:26:26.425440 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service return self._multi_getattr(client, method)(*args, **kwargs)
Jul 24 22:26:26.425536 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/ironicclient/v1/node.py", line 143, in list
Jul 24 22:26:26.425614 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service limit=limit)
Jul 24 22:26:26.425708 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/ironicclient/common/base.py", line 149, in _list_pagination
Jul 24 22:26:26.425788 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service resp, body = self.api.json_request('GET', url)
Jul 24 22:26:26.425866 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/ironicclient/common/http.py", line 558, in json_request
Jul 24 22:26:26.425945 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service resp = self._http_request(url, method, **kwargs)
Jul 24 22:26:26.426027 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/ironicclient/common/http.py", line 188, in wrapper
Jul 24 22:26:26.426114 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service return func(self, url, method, **kwargs)
Jul 24 22:26:26.426198 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service File "/usr/local/lib/python2.7/dist-packages/ironicclient/common/http.py", line 540, in _http_request
Jul 24 22:26:26.426277 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service error_json.get('debuginfo'), method, url)
Jul 24 22:26:26.426355 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service InternalServerError: Internal Server Error (HTTP 500)
Jul 24 22:26:26.426437 ubuntu-xenial-osic-cloud1-disk-10052658 nova-compute[13755]: ERROR oslo_service.service
I guess the Ironic API service isn't yet up at that point?
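If that is the case, one mitigation would be for the driver to retry the node listing for a short window instead of letting the first HTTP 500 kill the init_host thread. A minimal sketch of that idea (this is not the actual nova code; list_nodes_with_retry and FakeClient are made up for illustration, and a real fix would catch only connection/5xx errors from ironicclient):

import time


def list_nodes_with_retry(list_nodes, retries=10, delay=2):
    # Call list_nodes() and retry on failure, sleeping `delay` seconds between
    # attempts. list_nodes stands in for something like
    # ironicclient.call("node.list", detail=True, limit=0) in the driver.
    for attempt in range(1, retries + 1):
        try:
            return list_nodes()
        except Exception as exc:  # real code would catch only connection/5xx errors
            if attempt == retries:
                raise
            print("Ironic API not ready (%s), retrying in %ss" % (exc, delay))
            time.sleep(delay)


# Tiny usage example with a fake client that fails a few times before succeeding:
class FakeClient(object):
    def __init__(self, failures=3):
        self.failures = failures

    def list(self):
        if self.failures:
            self.failures -= 1
            raise RuntimeError("Internal Server Error (HTTP 500)")
        return ["node-1", "node-2"]


if __name__ == "__main__":
    print(list_nodes_with_retry(FakeClient().list, delay=0))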
Actually, this wasn't caused by the migrate instance flavors change mentioned above; it was happening before that as well:
http://logs.openstack.org/80/461480/5/check/gate-tempest-dsvm-ironic-ipa-wholedisk-bios-agent_ipmitool-tinyipa-ubuntu-xenial-nv/834477a/logs/screen-n-cpu.txt.gz?level=TRACE#_Jul_19_03_34_01_687299
So something has thrown off the timing of when nova-compute starts up and when the ironic-api is available.
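One quick way to confirm (or work around) the ordering would be to poll the Ironic API until it answers before nova-compute is started. A rough sketch, where the endpoint URL, timeout and interval are assumptions rather than values taken from the job config:

import time

try:
    from urllib.request import urlopen                 # Python 3
    from urllib.error import HTTPError, URLError
except ImportError:
    from urllib2 import urlopen, HTTPError, URLError   # Python 2


def wait_for_ironic_api(url="http://127.0.0.1:6385/", timeout=60, interval=2):
    # Poll `url` until it returns anything other than a 5xx/connection error,
    # or until `timeout` seconds have passed.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            urlopen(url, timeout=5)
            return True
        except HTTPError as exc:
            if exc.code < 500:      # API process is up, it just rejected this request
                return True
        except URLError:
            pass                    # nothing listening yet
        time.sleep(interval)
    return False


if __name__ == "__main__":
    print("ironic-api reachable: %s" % wait_for_ironic_api())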
This wasn't an issue in Ocata. From Ocata CI logs, the init_host in the driver happens around here:
http://logs.openstack.org/63/485263/2/check/gate-tempest-dsvm-ironic-ipa-wholedisk-bios-agent_ipmitool-tinyipa-ubuntu-xenial-nv/de2c924/logs/screen-n-cpu.txt.gz#_2017-07-21_12_07_20_885
And then get_available_nodes is here:
http://logs.openstack.org/63/485263/2/check/gate-tempest-dsvm-ironic-ipa-wholedisk-bios-agent_ipmitool-tinyipa-ubuntu-xenial-nv/de2c924/logs/screen-n-cpu.txt.gz#_2017-07-21_12_07_21_167
So that path isn't blowing up in Ocata the way init_host is blowing up in Pike.