test_instances_have_networking fails to ping floating IP in migrate-ovn-* bundles

Bug #1951856 reported by Corey Bryant
Affects                                Status    Importance  Assigned to  Milestone
OpenStack Neutron Gateway Charm        Triaged   High        Unassigned
OpenStack Neutron Open vSwitch Charm   Triaged   High        Unassigned
Changed in charm-neutron-gateway: status: New → Triaged; importance: Undecided → High
Changed in charm-neutron-openvswitch: status: New → Triaged; importance: Undecided → High
Felipe Reyes (freyes) wrote:

https://review.opendev.org/c/openstack/charm-neutron-gateway/+/841975
https://openstack-ci-reports.ubuntu.com/artifacts/bbb/841975/5/check/focal-wallaby/bbba014/ (func-test log available at https://pastebin.ubuntu.com/p/98fvTrfdkb/ )

2022-07-18 13:11:24.147573 | focal-medium | 2022-07-18 13:11:24 [INFO] Assigned floating IP 172.16.82.209 to zaza-neutrontests-ins-1
2022-07-18 13:21:43.004996 | focal-medium | 2022-07-18 13:21:43 [ERROR] Pinging 172.16.82.209 failed with 1
2022-07-18 13:21:43.005908 | focal-medium | 2022-07-18 13:21:43 [ERROR] stdout: b'PING 172.16.82.209 (172.16.82.209) 56(84) bytes of data.\n\n--- 172.16.82.209 ping statistics ---\n1 packets transmitted, 0 received, 100% packet loss, time 0ms\n\n'
2022-07-18 13:21:43.006151 | focal-medium | 2022-07-18 13:21:43 [ERROR] stderr: b''
(The identical ping failure repeats roughly every ten minutes through 2022-07-18 14:24:41, at which point the excerpt is truncated.)
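
For context, the failing check boils down to assigning a floating IP to the instance and then pinging it once per retry window until it answers. A minimal shell sketch of that logic (this is not zaza's actual implementation; the address is the one from the log above):

# Assumed stand-in for the test's connectivity check: one ICMP echo
# per attempt, retried a fixed number of times before giving up.
FIP=172.16.82.209
for attempt in 1 2 3 4 5 6 7 8; do
    if ping -c 1 -W 5 "$FIP"; then
        echo "instance reachable on attempt $attempt"
        exit 0
    fi
    sleep 60
done
echo "pinging $FIP never succeeded" >&2
exit 1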


Liam Young (gnuoy) wrote:

I'm not sure if this is the only issue, but looking at the neutron-gateway wallaby functional tests there is a mismatch between the neutron versions being deployed, which is causing an RPC version mismatch.

$ juju config neutron-gateway openstack-origin
cloud:focal-wallaby
$ juju config neutron-api openstack-origin
cloud:focal-wallaby
$ juju config ovn-dedicated-chassis source
cloud:focal-xena

$ juju run --unit neutron-gateway/0 "dpkg -l | grep neutron-common"
ii neutron-common 2:19.3.0-0ubuntu2~cloud0 all Neutron is a virtual network service for Openstack - common

$ juju run --unit neutron-api/0 "dpkg -l | grep neutron-common"
ii neutron-common 2:18.4.0-0ubuntu1~cloud0 all Neutron is a virtual network service for Openstack - common

Looking at the bundle, the `source` charm config option is not set for ovn-dedicated-chassis, so it falls back to the charm's default of xena; that adds the xena UCA to the container and hence the version mismatch.
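
Assuming that analysis is right, the fix would be to pin `source` for ovn-dedicated-chassis in the migrate-ovn-* bundles to the same UCA pocket as the rest of the deployment (e.g. source: cloud:focal-wallaby for the focal-wallaby bundle). On an already-deployed model, an equivalent check and workaround would look something like this (the focal-wallaby value is an assumption based on the configs above, not a verified fix):

# Confirm the current value, then align ovn-dedicated-chassis with the
# rest of the wallaby deployment.
juju config ovn-dedicated-chassis source
juju config ovn-dedicated-chassis source=cloud:focal-wallaby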

This mismatch seems to be causing the failure: the newer (xena) openvswitch agent on neutron-gateway calls update_device_list with RPC version 1.9, which the wallaby neutron-api side does not support:

From neutron-gateway/0:/var/log/neutron/neutron-openvswitch-agent.log
/lib/python3/dist-packages/oslo_messaging/rpc/dispatcher.py", line 320, in dispatch\n raise UnsupportedVersion(version, method=method)\n', 'oslo_messaging.rpc.dispatcher.UnsupportedVersion: Endpoint does not support RPC version 1.9. Attempted method: update_device_list\n'].
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 2709, in rpc_loop
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent failed_devices = self.process_network_ports(
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 2149, in process_network_ports
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent failed_devices['added'] |= self._bind_devices(need_binding_devices)
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 1237, in _bind_devices
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent devices_set = self.plugin_rpc.update_device_list(
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python3/dist-packages/neutron/agent/rpc.py", line 189, in update_device_list
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent ret = cctxt.call(context, 'update_device_list',
2022-07-21 08:54:51.786 59505 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File ...

