Octavia GET request for members in pool takes a lot of time on huge Loadbalancer

Bug #2019311 reported by Sergey Kraynev
Affects: octavia
Status: In Progress
Importance: Undecided
Assigned to: Unassigned

Bug Description

Create an LB with a big topology, e.g.:
39 listeners, each with 1 pool and 13 members in that pool.

In total this comes to about 600 objects in the object graph:
39 listeners, 39 HMs, 39 pools, 13*39 members, the LB, the VIP, etc.
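
For reference, that count works out to roughly: 39 listeners + 39 health monitors + 39 pools + (39 * 13 = 507) members + 1 LB + 1 VIP ≈ 626 objects.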

Such a setup exposes a performance issue with GET requests.
Specifically, it shows up when fetching information about the members in a pool, i.e. calling:
GET /v2/lbaas/pools/<pool_id>/members/

This call takes around 2.5 seconds.

GET /v2/lbaas/pools/78b285be-48fa-4470-a838-8445ddf83774/members => generated 6893 bytes in 2390 msecs (HTTP/1.0 200) 4 headers in 158 bytes (1 switches on core 0)
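
For reproduction, the call can be timed directly against the Octavia API. A minimal sketch; the endpoint and token below are placeholders, 9876 is the default Octavia API port, and the pool ID is the one from the log above:

import time
import requests

OCTAVIA_URL = "http://<octavia-api-host>:9876"        # placeholder endpoint
POOL_ID = "78b285be-48fa-4470-a838-8445ddf83774"      # pool ID from the log above
HEADERS = {"X-Auth-Token": "<keystone-token>"}        # placeholder token

start = time.monotonic()
resp = requests.get(f"{OCTAVIA_URL}/v2/lbaas/pools/{POOL_ID}/members",
                    headers=HEADERS)
print(resp.status_code, len(resp.content), f"{time.monotonic() - start:.3f}s")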

After profiling the code that handles the GET request, I found that around 1 second is spent in the call:
model.to_data_model()

https://github.com/openstack/octavia/blob/master/octavia/db/repositories.py#L174
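
For reference, this kind of hotspot can be confirmed with a small cProfile wrapper around the repository call. A minimal sketch; member_repo, session and pool_id in the example are placeholders, not the exact wiring in the API handler:

import cProfile
import pstats

def profile_call(func, *args, **kwargs):
    # Profile a single call and print the 20 functions with the highest
    # cumulative time, which is where to_data_model() shows up.
    profiler = cProfile.Profile()
    profiler.enable()
    try:
        return func(*args, **kwargs)
    finally:
        profiler.disable()
        pstats.Stats(profiler).sort_stats("cumtime").print_stats(20)

# Example (placeholder objects):
# members = profile_call(member_repo.get_all, session, pool_id=pool_id)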

It looks like this happens due to the recursive iteration through all nodes in the graph (https://github.com/openstack/octavia/blob/master/octavia/db/base_models.py#L59).

This could probably be optimised by limiting the recursion to a single step, i.e. not walking all nodes in the graph (for a member that means converting only its 1 pool, instead of all 600+ nodes in the graph).
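
For illustration only, a minimal sketch of that idea using generic SQLAlchemy introspection (this is not Octavia's actual to_data_model() implementation; the depth parameter and helper name are hypothetical):

def to_dict_limited(orm_obj, depth=1, _seen=None):
    # Convert a SQLAlchemy-mapped object to a plain dict, descending at most
    # `depth` levels into its relationships instead of the whole object graph.
    _seen = set() if _seen is None else _seen
    if id(orm_obj) in _seen:
        return None  # break cycles such as member -> pool -> members -> ...
    _seen.add(id(orm_obj))

    data = {col.name: getattr(orm_obj, col.name)
            for col in orm_obj.__table__.columns}
    if depth > 0:
        for rel in orm_obj.__mapper__.relationships:
            value = getattr(orm_obj, rel.key)
            if isinstance(value, list):
                data[rel.key] = [to_dict_limited(v, depth - 1, _seen)
                                 for v in value]
            elif value is not None:
                data[rel.key] = to_dict_limited(value, depth - 1, _seen)
    return data

With depth=1, converting a member touches only the member itself and its parent pool, instead of recursively pulling in every listener, pool and member hanging off the load balancer.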

During local testing on the same topology, with only one recursive call it took around 1.2 sec:

GET /v2/lbaas/pools/78b285be-48fa-4470-a838-8445ddf83774/members => generated 6893 bytes in 1432 msecs (HTTP/1.0 200) 4 headers in 158 bytes (1 switches on core 0)

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to octavia (master)

Fix proposed to branch: master
Review: https://review.opendev.org/c/openstack/octavia/+/883063

Changed in octavia:
status: New → In Progress