Some of the neutron tempest tests are failing with `urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='10.0.0.5', port=13696): Read timed out. (read timeout=60)` in the periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby job.
The failing tests are not the same every time. For example:
https://logserver.rdoproject.org/62/40162/6/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/68f312e/logs/undercloud/var/log/tempest/stestr_results.html.gz
Failed test: tempest.api.network.test_ports.PortsTestJSON.test_update_port_with_two_security_groups_and_extra_attributes
Traceback:
Response - Headers: {'content-length': '0', 'x-openstack-request-id': 'req-1ef72d63-6c7c-473a-b3dd-c1f35e0120d5', 'date': 'Tue, 08 Mar 2022 13:40:10 GMT', 'connection': 'close', 'status': '204', 'content-location': 'https://10.0.0.5:13696/v2.0/subnets/b692b121-3564-4eaa-a2c8-b7b07b311252'}
Body: b''
```
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 426, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 421, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib64/python3.6/http/client.py", line 1365, in getresponse
    response.begin()
  File "/usr/lib64/python3.6/http/client.py", line 320, in begin
    version, status, reason = self._read_status()
  File "/usr/lib64/python3.6/http/client.py", line 281, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib64/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib64/python3.6/ssl.py", line 971, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib64/python3.6/ssl.py", line 833, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib64/python3.6/ssl.py", line 590, in read
    v = self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
```
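The bottom frames show a plain SSL socket read expiring before the server sends its status line. That failure mode can be reproduced in miniature with the standard library (this is a synthetic server I wrote for illustration; the 0.2 s timeout stands in for the 60 s read timeout in the CI job):

```python
import socket
import threading

# A server that accepts the connection but never replies, so the
# client's read timeout expires while it waits for the status line.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)

def silent_accept():
    conn, _ = server.accept()
    threading.Event().wait(1)  # hold the connection open without replying
    conn.close()

threading.Thread(target=silent_accept, daemon=True).start()

client = socket.create_connection(server.getsockname())
client.settimeout(0.2)         # stand-in for urllib3's read timeout=60
try:
    client.recv(1)             # blocks until the read timeout fires
    timed_out = False
except socket.timeout:
    timed_out = True
client.close()
server.close()
```

urllib3 wraps exactly this `socket.timeout` into the `ReadTimeoutError` seen at the top of the report.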
I looked for the uuid b692b121-3564-4eaa-a2c8-b7b07b311252 and the openstack request id in the neutron logs; nothing looked suspicious.
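That kind of search is easy to script; a minimal sketch (the helper name and the sample log lines are illustrative, only the request id comes from this report):

```python
def grep_request_id(log_text, needle):
    """Return the log lines that mention a given request id or uuid."""
    return [line for line in log_text.splitlines() if needle in line]

# Synthetic sample standing in for e.g. neutron's server.log from the job logs.
sample_log = (
    "2022-03-08 13:40:09 INFO neutron.wsgi "
    "[req-1ef72d63-6c7c-473a-b3dd-c1f35e0120d5] GET /v2.0/subnets\n"
    "2022-03-08 13:40:10 INFO neutron.wsgi [req-other] DELETE /v2.0/ports\n"
)

hits = grep_request_id(sample_log, "req-1ef72d63-6c7c-473a-b3dd-c1f35e0120d5")
```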
From compute log: https://logserver.rdoproject.org/62/40162/6/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/68f312e/logs/overcloud-novacompute-0/var/log/extra/errors.txt.gz
```
ERROR oslo_messaging.rpc.server [req-b0ff61dd-6342-40c5-a410-5feb89747a9a 12a5339b4ecf485a9b308c5419834e0d fa311d7daf3c4e47a25d4f3d606ac5cf - default default] Exception during message handling: libvirt.libvirtError: internal error: End of file from qemu monitor
```
Second example:
https://logserver.rdoproject.org/62/40162/5/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/2b08731/logs/undercloud/var/log/tempest/stestr_results.html.gz
Tests failed:
tempest.api.object_storage.test_object_expiry.ObjectExpiryTest.test_get_object_after_expiry_time
tempest.api.volume.admin.test_volumes_actions.VolumesActionsTest.test_volume_force_delete_when_volume_is_creating
tempest.api.network.test_security_groups.SecGroupIPv6Test.test_create_security_group_rule_with_remote_ip_prefix
Traceback:
Response - Headers: {'content-type': 'application/json', 'content-length': '2770', 'x-openstack-request-id': 'req-dc5edc9e-0cf3-41cb-b008-77723b4d88c3', 'date': 'Mon, 07 Mar 2022 20:08:50 GMT', 'connection': 'close', 'status': '200', 'content-location': 'https://10.0.0.5:13696/v2.0/security-groups'}
Body: b'{"security_groups": [{"id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "name": "default", "tenant_id": "ede5fd4c44894509af392e55aa36279c", "description": "Default security group", "security_group_rules": [{"id": "01d1e722-4d35-4d0f-a3dd-2b8466dec6cd", "tenant_id": "ede5fd4c44894509af392e55aa36279c", "security_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "ethertype": "IPv4", "direction": "egress", "protocol": null, "port_range_min": null, "port_range_max": null, "remote_ip_prefix": null, "remote_address_group_id": null, "normalized_cidr": null, "remote_group_id": null, "standard_attr_id": 9134, "description": null, "tags": [], "created_at": "2022-03-07T20:06:42Z", "updated_at": "2022-03-07T20:06:42Z", "revision_number": 0, "project_id": "ede5fd4c44894509af392e55aa36279c"}, {"id": "04b0f525-4511-43d1-ad76-47f44a3d4129", "tenant_id": "ede5fd4c44894509af392e55aa36279c", "security_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "ethertype": "IPv4", "direction": "ingress", "protocol": null, "port_range_min": null, "port_range_max": null, "remote_ip_prefix": null, "remote_address_group_id": null, "normalized_cidr": null, "remote_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "standard_attr_id": 9131, "description": null, "tags": [], "created_at": "2022-03-07T20:06:42Z", "updated_at": "2022-03-07T20:06:42Z", "revision_number": 0, "project_id": "ede5fd4c44894509af392e55aa36279c"}, {"id": "362dcdba-6d7b-466e-ba87-02def2c5a749", "tenant_id": "ede5fd4c44894509af392e55aa36279c", "security_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "ethertype": "IPv6", "direction": "egress", "protocol": null, "port_range_min": null, "port_range_max": null, "remote_ip_prefix": null, "remote_address_group_id": null, "normalized_cidr": null, "remote_group_id": null, "standard_attr_id": 9140, "description": null, "tags": [], "created_at": "2022-03-07T20:06:42Z", "updated_at": "2022-03-07T20:06:42Z", "revision_number": 0, "project_id": "ede5fd4c44894509af392e55aa36279c"}, 
{"id": "561c5dd2-9084-43f5-8d0b-cf3b5acfb99b", "tenant_id": "ede5fd4c44894509af392e55aa36279c", "security_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "ethertype": "IPv6", "direction": "ingress", "protocol": null, "port_range_min": null, "port_range_max": null, "remote_ip_prefix": null, "remote_address_group_id": null, "normalized_cidr": null, "remote_group_id": "0294eccf-6beb-4449-aa11-f64822c7ef93", "standard_attr_id": 9137, "description": null, "tags": [], "created_at": "2022-03-07T20:06:42Z", "updated_at": "2022-03-07T20:06:42Z", "revision_number": 0, "project_id": "ede5fd4c44894509af392e55aa36279c"}], "tags": [], "created_at": "2022-03-07T20:06:42Z", "updated_at": "2022-03-07T20:06:42Z", "revision_number": 1, "project_id": "ede5fd4c44894509af392e55aa36279c"}]}'
```
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 426, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 421, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib64/python3.6/http/client.py", line 1365, in getresponse
    response.begin()
  File "/usr/lib64/python3.6/http/client.py", line 320, in begin
    version, status, reason = self._read_status()
  File "/usr/lib64/python3.6/http/client.py", line 281, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib64/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib64/python3.6/ssl.py", line 971, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib64/python3.6/ssl.py", line 833, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib64/python3.6/ssl.py", line 590, in read
    v = self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
```
From compute logs:
https://logserver.rdoproject.org/62/40162/5/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/2b08731/logs/overcloud-novacompute-0/var/log/extra/errors.txt.gz
```
ERROR os_brick.initiator.linuxscsi [req-c8572753-3f25-4a8a-91b7-86c954607287 ed32aa672562462f82676066650fb8a5 912435b25b7a4e09b70ef1eb94d605bc - default default] multipathd is not running: exit code 1: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
ERROR oslo_messaging.rpc.server [req-0692deb8-4d8b-4d52-826d-31dc8f5dda15 c5ee3b85dcfa4d50b9814b1633b18671 84b6df098c554c06b518a6867c84b351 - default default] Exception during message handling: libvirt.libvirtError: internal error: End of file from qemu monitor
```
We triggered the above job via testproject.
Again the failing tempest tests are different, but the errors seem to be the same.
https://logserver.rdoproject.org/01/40101/2/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/8641138/logs/undercloud/var/log/tempest/tempest_run.log.txt.gz
```
{2} tempest.api.compute.volumes.test_attach_volume_negative.AttachVolumeNegativeTest.test_attach_attached_volume_to_same_server [619.735919s] ... FAILED

Captured traceback:
~~~~~~~~~~~~~~~~~~~
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tempest/common/waiters.py", line 380, in wait_for_volume_attachment_remove_from_server
    raise lib_exc.TimeoutException(message)
tempest.lib.exceptions.TimeoutException: Request timed out
Details: Volume 7ba1e256-194a-424d-9e71-c6f598b66a69 failed to detach from server a382b1fd-4015-45a3-b94c-b7c5f29837f8 within the required time (300 s) from the compute API perspective

Captured traceback-1:
~~~~~~~~~~~~~~~~~~~~~
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/tempest/common/waiters.py", line 312, in wait_for_volume_resource_status
    raise lib_exc.TimeoutException(message)
tempest.lib.exceptions.TimeoutException: Request timed out
Details: volume 7ba1e256-194a-424d-9e71-c6f598b66a69 failed to reach available status (current in-use) within the required time (300 s).
```
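The TimeoutException above comes from tempest's waiter loop, which polls the volume status until a deadline expires. A simplified sketch of that polling pattern (function and parameter names are illustrative, not tempest's actual code; the fake clock makes the 300 s budget elapse instantly):

```python
import time

class TimeoutException(Exception):
    """Stand-in for tempest.lib.exceptions.TimeoutException."""

def wait_for_volume_status(get_status, wanted, timeout=300, interval=1,
                           clock=time.monotonic, sleep=time.sleep):
    """Poll get_status() until it returns `wanted`; raise after `timeout` s."""
    deadline = clock() + timeout
    while True:
        current = get_status()
        if current == wanted:
            return current
        if clock() >= deadline:
            raise TimeoutException(
                f"volume failed to reach {wanted} status "
                f"(current {current}) within the required time ({timeout} s).")
        sleep(interval)

# Simulated clock so the example runs instantly: the volume stays
# 'in-use', so the waiter gives up once the fake 300 s budget is spent.
fake_now = [0.0]
def fake_clock():
    return fake_now[0]
def fake_sleep(seconds):
    fake_now[0] += seconds

try:
    wait_for_volume_status(lambda: "in-use", "available",
                           clock=fake_clock, sleep=fake_sleep)
    timed_out = False
except TimeoutException:
    timed_out = True
```

In the failing run the volume never left "in-use" because the detach on the compute side never completed, so the waiter burned its whole 300 s budget and raised.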
While taking a look at the nova-compute log:
https://logserver.rdoproject.org/01/40101/2/check/periodic-tripleo-ci-centos-8-ovb-3ctlr_1comp-featureset001-wallaby/8641138/logs/overcloud-novacompute-0/var/log/containers/nova/nova-compute.log.1.gz
```
nova.exception.DeviceDetachFailed: Device detach failed for vdb: Run out of retry while detaching device vdb with device alias virtio-disk1 from instance a382b1fd-4015-45a3-b94c-b7c5f29837f8 from the live domain config. Device is still attached to the guest.
```
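This error means nova re-issued the live-domain detach a bounded number of times, the guest never released the device, and nova gave up. Roughly, the pattern looks like this (the class and helper are illustrative stand-ins, not nova's actual implementation):

```python
class DeviceDetachFailed(Exception):
    """Stand-in for nova.exception.DeviceDetachFailed."""

def detach_with_retries(request_detach, device_gone,
                        dev="vdb", alias="virtio-disk1", max_attempts=8):
    """Ask the hypervisor to detach `dev`, re-issuing the request up to
    max_attempts times; raise if the device is still attached afterwards."""
    for _ in range(max_attempts):
        request_detach()   # e.g. a detach call against the live domain
        if device_gone():  # did the device disappear from the domain XML?
            return
    raise DeviceDetachFailed(
        f"Device detach failed for {dev}: Run out of retry while detaching "
        f"device {dev} with device alias {alias} from the live domain config. "
        f"Device is still attached to the guest.")

# A guest that never acknowledges the detach (the failure seen in the log):
try:
    detach_with_retries(lambda: None, lambda: False)
    failed = False
except DeviceDetachFailed:
    failed = True
```

Since the qemu monitor also reported "End of file" in both examples, a plausible reading is that the guest/qemu side was unhealthy, so the detach request was never acted on and the retries could not succeed.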