httpd leaks open files
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| OpenStack Dashboard (Horizon) | Confirmed | Undecided | Unassigned | |
| keystoneauth | Won't Fix | Undecided | Unassigned | |
| python-keystoneclient | Confirmed | Medium | Unassigned | |
Bug Description
Horizon version 13.0.0-1.el7 (Queens) on CentOS 7.4.1708.
After some time of working in the dashboard, it stops working and throws this error in the error log:
[Wed Jul 04 22:49:33.744241 2018] [:error] [pid 23924] [remote 10.144.187.237:52] IOError: [Errno 24] Too many open files: '/usr/share/
If we check the open files of process 23924:
ls -l /proc/23924/fd | wc -l
1023
Even if we increase the nofile limit for this process it doesn't help, since the number of open files keeps growing.
The problem is cleared if we restart the httpd process, but then the open file count starts increasing again.
How to reproduce:
1- Log in to the dashboard.
2- Get the pid from the error_log file:
[Wed Jul 04 22:50:58.620832 2018] [:error] [pid 23924] INFO openstack_
3- Browse the different dashboard menus, especially the network topology tab.
4- Monitor the number of open files with ls -l /proc/<pid>/fd | wc -l (a small monitoring sketch follows below).
Observation:
The number of open files is always increasing.
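For convenience, something like the following snippet can poll the fd count of the worker instead of re-running the ls command by hand. It is only a sketch: the pid is the example value from the log above, and reading /proc/<pid>/fd requires running as the same user as the worker or as root.

```python
#!/usr/bin/env python
"""Poll the open-fd count of an httpd worker process."""
import os
import sys
import time


def fd_count(pid):
    # Every entry under /proc/<pid>/fd is one open file descriptor.
    return len(os.listdir('/proc/%d/fd' % pid))


if __name__ == '__main__':
    # Example pid taken from the error_log above; pass your own as argv[1].
    pid = int(sys.argv[1]) if len(sys.argv) > 1 else 23924
    while True:
        print('%s open fds: %d' % (time.strftime('%H:%M:%S'), fd_count(pid)))
        time.sleep(10)
```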
description: updated
Changed in horizon:
status: New → Confirmed
Changed in keystoneauth:
status: New → Confirmed
We have also noticed this problem.
We ran strace on horizon and found that the connections horizon makes to keystone were not getting closed.
Digging deeper, we found two very similar commits that fix connection-closure issues related to keystone:
https://github.com/openstack/python-keystoneclient/commit/8fcacdc7c74f5ac68e8e55ea8c15918c452411fe
and
https://github.com/openstack/keystoneauth/commit/dbcbf414ac8423e97d77d0bda8157be5350530f0
I think the keystoneauth commit is incomplete: it is missing the destination of the move of the _FakeRequestSession class, so it amounts to just a removal.
In Newton, horizon uses keystoneclient to get keystone sessions and we don't see the connection leaks.
Somewhere between Newton and Queens, horizon switched to keystoneauth and lost the _FakeRequestSession hack, so we are now seeing a regression where horizon leaks connections again.
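To illustrate the general mechanism (this is only an illustration, not horizon's actual code, and the endpoint URL is made up): a requests.Session keeps its pooled connections, and therefore their file descriptors, open until close() is called or the object is garbage-collected, so sessions that get created per request and never cleaned up accumulate open fds.

```python
import requests

s = requests.Session()
try:
    # Hypothetical keystone endpoint, used only to open a pooled connection.
    s.get('http://keystone.example.com:5000/v3', timeout=5)
except requests.RequestException:
    pass
# Until close() runs (or the Session is garbage-collected), the pooled
# socket stays open and counts against the process's nofile limit.
s.close()
```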
I am attaching a patch that we are currently testing; it re-introduces the _FakeRequestSession class into keystoneauth.
The patch is a little naive: I don't know what to do with the instance attribute "adapters" of _FakeRequestSession, so I just set it to an empty list because it is referenced by code elsewhere. It seems to be working for now.
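Roughly, the workaround has the following shape. This is only a sketch assuming the class delegates each call to requests.request(), as the removed keystoneclient code did; the actual patch may differ.

```python
import requests


class _FakeRequestSession(object):
    """Session-like object that does not keep a connection pool alive.

    Each call goes through requests.request(), which builds a throwaway
    Session internally and closes it when the call returns, so file
    descriptors are released promptly instead of accumulating.
    """

    def __init__(self):
        # Other code references session.adapters, so keep an empty
        # placeholder (the "naive" part mentioned above).
        self.adapters = []

    def request(self, *args, **kwargs):
        return requests.request(*args, **kwargs)
```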