ovs-vswitchd: possible memory leak in 2.10.1 (Alpine 3.9)
I’ve been running OVS on a Qemu/KVM virtualization host for a while now. The host runs around 12 VMs with modest network activity. The OVS configuration is very simple, nothing fancy: four VLANs, two bond ports with two slaves each, three internal ports, and one port per VM (I can provide the output of ovs-vsctl show if needed).
Recently I noticed that when the ovs-vswitchd process starts it consumes around 10 MB of RAM, but after two weeks (the server runs 24x7) it can be consuming around 2.5 GB, and after a month or so around 5 GB or even a bit more.
What I’m doing now to keep ovs-vswitchd memory usage under control is restarting the service every couple of weeks, but this is far from ideal.
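To avoid restarting on a blind schedule, I could automate it as a watchdog. Below is a minimal sketch; the 2 GB threshold and the OpenRC service name `openvswitch` are assumptions for this Alpine setup, while the pidfile path matches the command line of my daemon:

```shell
#!/bin/sh
# Watchdog sketch (assumptions: 2 GB limit, OpenRC service name):
# restart ovs-vswitchd only once its resident set crosses a limit.

PIDFILE=/var/run/openvswitch/ovs-vswitchd.pid
LIMIT_KB=$((2 * 1024 * 1024))   # 2 GB expressed in kB

# Read the VmRSS field (in kB) from a /proc/<pid>/status-style file.
rss_kb() {
    awk '/^VmRSS:/ { print $2 }' "$1"
}

if [ -r "$PIDFILE" ]; then
    pid=$(cat "$PIDFILE")
    rss=$(rss_kb "/proc/$pid/status")
    if [ "${rss:-0}" -gt "$LIMIT_KB" ]; then
        rc-service openvswitch restart   # OpenRC; adjust for other init systems
    fi
fi
```

Run from cron every few minutes; if the pidfile is absent the script does nothing.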
Please find below some additional information about my setup and feel free to ask for any other details I may have missed.
OVS command line: /usr/sbin/ovs-vswitchd
--pidfile=/var/run/openvswitch/ovs-vswitchd.pid --detach --monitor
--mlockall unix:/var/run/openvswitch/db.sock
OVS version: 2.10.1
Distro: Alpine Linux 3.9.2
Linux kernel version: 4.19.26
Qemu version: 3.1
Libvirt version: 4.10.0
OVS memory usage from log file:
2018-09-07T22:02:54+02:00 vmsvr01 ovs-vswitchd: ovs|00030|memory|INFO|10316 kB peak resident set size after 10.0 seconds
2018-09-07T22:02:54+02:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:23
2018-09-07T22:12:05+02:00 vmsvr01 ovs-vswitchd: ovs|00030|memory|INFO|10316 kB peak resident set size after 10.0 seconds
2018-09-07T22:12:05+02:00 vmsvr01 ovs-vswitchd: ovs|00031|memory|INFO|handlers:5 ports:2 revalidators:3 rules:5 udpif keys:21
2018-09-08T10:48:18+02:00 vmsvr01 ovs-vswitchd: ovs|00702|memory|INFO|peak resident set size grew 51% in last 3472.0 seconds, from 10300 kB to 15572 kB
2018-09-08T10:48:18+02:00 vmsvr01 ovs-vswitchd: ovs|00703|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2018-09-08T12:21:11+02:00 vmsvr01 ovs-vswitchd: ovs|00704|memory|INFO|peak resident set size grew 51% in last 5573.2 seconds, from 15572 kB to 23492 kB
2018-09-08T12:21:11+02:00 vmsvr01 ovs-vswitchd: ovs|00705|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:32
2018-09-08T14:46:16+02:00 vmsvr01 ovs-vswitchd: ovs|00706|memory|INFO|peak resident set size grew 51% in last 8705.3 seconds, from 23492 kB to 35372 kB
2018-09-08T14:46:16+02:00 vmsvr01 ovs-vswitchd: ovs|00707|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:16
2018-09-08T17:46:03+02:00 vmsvr01 ovs-vswitchd: ovs|00708|memory|INFO|peak resident set size grew 50% in last 10786.3 seconds, from 35372 kB to 53060 kB
2018-09-08T17:46:03+02:00 vmsvr01 ovs-vswitchd: ovs|00709|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2018-09-08T23:08:45+02:00 vmsvr01 ovs-vswitchd: ovs|00720|memory|INFO|peak resident set size grew 50% in last 19362.0 seconds, from 53060 kB to 79724 kB
2018-09-08T23:08:45+02:00 vmsvr01 ovs-vswitchd: ovs|00721|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:27
2018-09-09T06:41:39+02:00 vmsvr01 ovs-vswitchd: ovs|00722|memory|INFO|peak resident set size grew 50% in last 27174.2 seconds, from 79724 kB to 119588 kB
2018-09-09T06:41:39+02:00 vmsvr01 ovs-vswitchd: ovs|00723|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:26
2018-09-09T18:02:35+02:00 vmsvr01 ovs-vswitchd: ovs|00724|memory|INFO|peak resident set size grew 50% in last 40856.1 seconds, from 119588 kB to 179516 kB
2018-09-09T18:02:35+02:00 vmsvr01 ovs-vswitchd: ovs|00725|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:37
2018-09-10T10:58:41+02:00 vmsvr01 ovs-vswitchd: ovs|00727|memory|INFO|peak resident set size grew 50% in last 60965.9 seconds, from 179516 kB to 269276 kB
2018-09-10T10:58:41+02:00 vmsvr01 ovs-vswitchd: ovs|00728|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:39
2018-09-11T14:54:02+02:00 vmsvr01 ovs-vswitchd: ovs|00734|memory|INFO|peak resident set size grew 50% in last 100521.3 seconds, from 269276 kB to 403916 kB
2018-09-11T14:54:02+02:00 vmsvr01 ovs-vswitchd: ovs|00735|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:16
2018-09-13T08:06:22+02:00 vmsvr01 ovs-vswitchd: ovs|00740|memory|INFO|peak resident set size grew 50% in last 148339.4 seconds, from 403916 kB to 605876 kB
2018-09-13T08:06:22+02:00 vmsvr01 ovs-vswitchd: ovs|00741|memory|INFO|handlers:5 ports:15 revalidators:3 rules:5 udpif keys:15
2018-09-15T21:54:39+02:00 vmsvr01 ovs-vswitchd: ovs|00750|memory|INFO|peak resident set size grew 50% in last 222497.4 seconds, from 605876 kB to 908948 kB
2018-09-15T21:54:39+02:00 vmsvr01 ovs-vswitchd: ovs|00751|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:20
2018-09-19T18:15:25+02:00 vmsvr01 ovs-vswitchd: ovs|00763|memory|INFO|peak resident set size grew 50% in last 332445.8 seconds, from 908948 kB to 1363556 kB
2018-09-19T18:15:25+02:00 vmsvr01 ovs-vswitchd: ovs|00764|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:46
2018-09-25T11:54:40+02:00 vmsvr01 ovs-vswitchd: ovs|00855|memory|INFO|peak resident set size grew 50% in last 495554.7 seconds, from 1363556 kB to 2045468 kB
2018-09-25T11:54:40+02:00 vmsvr01 ovs-vswitchd: ovs|00856|memory|INFO|handlers:5 ports:16 revalidators:3 rules:5 udpif keys:53
2018-10-04T08:31:40+02:00 vmsvr01 ovs-vswitchd: ovs|00888|memory|INFO|peak resident set size grew 50% in last 765420.9 seconds, from 2045468 kB to 3068204 kB
2018-10-04T08:31:40+02:00 vmsvr01 ovs-vswitchd: ovs|00889|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:42
2018-10-16T14:45:35+02:00 vmsvr01 ovs-vswitchd: ovs|00911|memory|INFO|peak resident set size grew 50% in last 1059234.5 seconds, from 3068204 kB to 4602308 kB
2018-10-16T14:45:35+02:00 vmsvr01 ovs-vswitchd: ovs|00912|memory|INFO|handlers:5 ports:14 revalidators:3 rules:5 udpif keys:27
2018-11-04T06:27:02+01:00 vmsvr01 ovs-vswitchd: ovs|01015|memory|INFO|peak resident set size grew 50% in last 1615287.6 seconds, from 4602308 kB to 6903596 kB
2018-11-04T06:27:02+01:00 vmsvr01 ovs-vswitchd: ovs|01016|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:1
2018-12-01T19:28:14+01:00 vmsvr01 ovs-vswitchd: ovs|01092|memory|INFO|peak resident set size grew 50% in last 2379671.3 seconds, from 6903596 kB to 10355396 kB
2018-12-01T19:28:14+01:00 vmsvr01 ovs-vswitchd: ovs|01093|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:18
2019-01-12T13:12:25+01:00 vmsvr01 ovs-vswitchd: ovs|01234|memory|INFO|peak resident set size grew 50% in last 3606251.1 seconds, from 10355396 kB to 15533228 kB
2019-01-12T13:12:25+01:00 vmsvr01 ovs-vswitchd: ovs|01235|memory|INFO|handlers:5 ports:13 revalidators:3 rules:5 udpif keys:46
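For what it’s worth, the growth rate implied by those log lines can be extracted with a small awk helper (a sketch that just parses the "peak resident set size grew" reports in the format above; the log path in the usage comment is an example):

```shell
# growth_rate: read ovs-vswitchd memory log lines on stdin and print the
# average growth rate in kB/s for each "peak resident set size grew" report.
growth_rate() {
    awk -F'grew [0-9]+% in last | seconds, from | kB to | kB' \
        '/peak resident set size grew/ { printf "%.1f kB/s\n", ($4 - $3) / $2 }'
}

# e.g.: growth_rate < /var/log/messages
```

On my host the rate per report is roughly constant at 1-2 kB/s, which matches the near-linear growth in the Munin graphs mentioned below.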
Also I’m attaching two screenshots from Munin monitoring ovs-vswitchd RSS size and meminfo over around five days, showing that memory usage grows almost linearly over time.
I can provide a memory dump of the process, collected using gcore while it was consuming 500 MB of RAM, if that would help.
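For reference, this is roughly how I take the dump with gcore (which ships with gdb); the output path is just an example, and the helper is only a sanity check against the kB figures in the logs:

```shell
#!/bin/sh
# Sketch of collecting a core dump of the running daemon with gcore.
pid=$(cat /var/run/openvswitch/ovs-vswitchd.pid 2>/dev/null)

# Convert a kB figure (as reported in the memory logs) to MB, to sanity-check
# that the dump file size roughly matches the reported resident set size.
kb_to_mb() {
    echo $(( $1 / 1024 ))
}

# gcore -o /tmp/ovs-vswitchd-core "$pid"        # needs root; example path
# ls -l "/tmp/ovs-vswitchd-core.$pid"           # expect roughly kb_to_mb of VmRSS
```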
Please let me know if you need any more details.
(from redmine: issue id 10138, created on 2019-03-19)