| Summary: | Texture Memory Leak | | |
|---|---|---|---|
| Product: | Mesa | Reporter: | Sander Jansen <s.jansen> |
| Component: | Drivers/DRI/i965 | Assignee: | Eric Anholt <eric> |
| Status: | RESOLVED FIXED | QA Contact: | |
| Severity: | normal | | |
| Priority: | medium | Keywords: | NEEDINFO |
| Version: | 7.6 | | |
| Hardware: | Other | | |
| OS: | All | | |
| Whiteboard: | | | |
| i915 platform: | | i915 features: | |
Attachments:
- Trace: Creating & Using & Destroying 5 textures
- Trace: Quitting program from debug_part_1
- Code leaking texture memory
- streaming-texture-leak fails with texture size of 2048 (not OOM related)
Description
Sander Jansen
2009-11-11 19:20:20 UTC
Can you provide a minimal testcase that we can integrate into glean to show off the problem?

(In reply to comment #1)
> Can you provide a minimal testcase that we can integrate into glean to show off
> the problem?

I think so. It should be pretty trivial (creating/using/destroying textures in a loop).

http://sourceforge.net/projects/glean/files doesn't seem to provide any file. Where can I find the latest glean?

Created attachment 31272 [details]
Trace: Creating & Using & Destroying 5 textures
Here you can see my program:
1) creating texture
2) using it
3) destroying it.
I'm doing this 5 times. I've added "[GMM] glGenTextures" and "[GMM] glDestroyTextures" to the trace to indicate where I call those functions.

Observed behaviour: my program's memory usage keeps increasing.

See the next attachment for what happens when I quit my program and destroy the context.
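For context, a minimal sketch of the create/use/destroy cycle being traced here might look like the following (plain C against the GL API; the texture size, loop count, and the draw_textured_quad() helper are illustrative placeholders, not taken from the attached program):

```c
#include <GL/gl.h>

/* Hypothetical helper: draws one quad sampling the currently bound texture. */
extern void draw_textured_quad(void);

static void create_use_destroy(const void *pixels, int w, int h)
{
    GLuint tex;

    glGenTextures(1, &tex);                /* "[GMM] glGenTextures" marker */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);

    draw_textured_quad();                  /* use the texture */

    glDeleteTextures(1, &tex);             /* "[GMM] glDestroyTextures" marker */
}

/* Repeating this a handful of times should leave memory usage flat. */
void run_cycles(const void *pixels, int w, int h, int n)
{
    for (int i = 0; i < n; i++)
        create_use_destroy(pixels, w, h);
}
```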
Created attachment 31273 [details]
Trace: Quitting program from debug_part_1
Here I quit the program from debug_part_1. The OpenGL context gets destroyed. Valgrind doesn't report any memory lost. I don't know the Intel driver code very well, but some of the "bo_unreference final" calls seem rather late. For example, a bo gets created when I create a texture:
```
[GMM] glGenTextures
intelNewTextureObject
intelNewTextureImage
intelTexImage target GL_TEXTURE_2D level 0 500x500x1 border 0
guess_and_alloc_mipmap_tree
intel_miptree_create_internal target GL_TEXTURE_2D format GL_RGB level 0..0 <-- 0x11c2a00
intel_miptree_set_level_info level 0 size: 500,500,1 offset 0,0 (0x0)
brw_miptree_layout: 512x500x4 - sz 0xfa000
bo_create: buf 33 (region) 1024000b
```
But it only gets unreferenced when the context gets destroyed:
```
bo_unreference final: 38 (SS_SURF_BIND)
bo_unreference final: 34 (SS_SURF_BIND)
bo_unreference final: 33 (region)
```
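Since valgrind only flags memory that is still unreachable at exit, and the trace shows the buffer objects are eventually released when the context is destroyed, a leak like this is easier to spot by sampling the process's memory footprint between iterations. A minimal Linux-only sketch (not part of the original report; VmRSS is only a rough proxy, since driver buffer objects are not necessarily all charged to the process, but a steady climb across create/destroy cycles is a usable signal):

```c
#include <stdio.h>
#include <string.h>

/* Returns the process resident set size in kB, or -1 on failure.
 * Linux-only: parses the VmRSS line from /proc/self/status. */
static long current_rss_kb(void)
{
    FILE *f = fopen("/proc/self/status", "r");
    char line[256];
    long rss = -1;

    if (!f)
        return -1;
    while (fgets(line, sizeof(line), f)) {
        if (strncmp(line, "VmRSS:", 6) == 0) {
            sscanf(line + 6, "%ld", &rss);
            break;
        }
    }
    fclose(f);
    return rss;
}
```

Calling current_rss_kb() before and after each create/use/destroy iteration and printing the delta makes the growth visible long before the OOM killer gets involved.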
Very sorry for saying glean -- that was a mistake. piglit is the testsuite that we use, and I've made a testcase (streaming_texture_leak) there before to blow up with particular texturing leaks, by looping on a create/destroy cycle that leaked until the system OOMed. http://people.freedesktop.org/~nh/piglit/

(still NEEDINFO -- the traces aren't helpful without the code)

Created attachment 31955 [details]
Code leaking texture memory.

GMImageView::updateTexture(FXImage * image) is called whenever the texture gets refreshed; onPaint does the repainting. You can also see the current code in the source repository: http://code.google.com/p/gogglesmm/source/browse/trunk/src/GMImageView.cpp

I tried the test case. I see memory consumption going up and down between 200 MB and 2.0 GB (so obviously at some point it does clear some memory). However, since my machine contains 4 GB of RAM, the test always passes because the OOM killer is never needed. Setting TEXTURE_SIZE to 2048 will make the test case "successfully" fail.

I'd just like to note that I see the same problem on my desktop PC with an Intel G45 graphics chip.

How many textures are involved? What sizes?

(In reply to comment #8)
> How many textures are involved? What sizes?

I'm not sure I understand what you're asking. My viewer displays cover art and only uses 1 texture to display it, so 1 texture is involved. When changing the display the texture gets deleted [ideally] and a new texture is created and used to display the new image. As I said before, even when re-using the texture, the memory increases over time (albeit at a much slower pace). The texture is typically about 500x500 pixels.

Good news! On Arch Linux with the following software versions installed:

- kernel26 2.6.34-1
- libdrm 2.4.20-3
- mesa 7.8.1-3
- intel-dri 7.8.1-3
- xf86-video-intel 2.11.0-2

running the piglit streaming-texture-leak test with texture sizes of 1024 and 2048, memory consumption is very stable (and barely noticeable at 10 MB / 22 MB) and the OOM killer is never needed. So it looks like the leaking texture memory is fixed. The _only_ problem I encountered is that with a texture size of 2048 the test itself still fails when it tries to read back the pixel value. With a texture size of 1024 the test passes. I'll attach the summary for the failed test.

Created attachment 36062 [details]
streaming-texture-leak fails with texture size of 2048 (not OOM related)
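For readers without piglit at hand, the shape of a streaming-texture-leak style test is roughly as follows. This is a sketch of the idea described above, not the actual piglit source; TEXTURE_SIZE, ITERATIONS, draw_textured_quad(), and check_pixel() are placeholders:

```c
#include <GL/gl.h>
#include <stdbool.h>

#define TEXTURE_SIZE 2048   /* placeholder; the report tries 1024 and 2048 */
#define ITERATIONS   5000   /* enough to OOM a machine if textures leak */

extern void draw_textured_quad(void);   /* hypothetical draw helper */
extern bool check_pixel(void);          /* hypothetical readback check */

bool run_streaming_texture_leak(void)
{
    static GLubyte pixels[TEXTURE_SIZE * TEXTURE_SIZE * 4];

    for (int i = 0; i < ITERATIONS; i++) {
        GLuint tex;

        /* Each iteration uploads a full-size texture, draws with it,
         * then deletes it. If the driver never releases the backing
         * storage, memory use grows by roughly 16 MB per iteration
         * at 2048x2048 RGBA. */
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEXTURE_SIZE, TEXTURE_SIZE,
                     0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        draw_textured_quad();
        glDeleteTextures(1, &tex);
    }

    /* The real test also reads back a pixel to verify rendering;
     * that readback is what still fails at 2048 in the report. */
    return check_pixel();
}
```

If texture storage is never released on glDeleteTextures, each 2048x2048 RGBA iteration pins roughly 16 MB, which is why such a loop is expected to drive a machine into the OOM killer when the leak is present and to stay flat once it is fixed.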
OK, it sounds like the original problem has been fixed (I think it was the deleting of cached objects referencing dead regions; if not, then at least the state batching of the binding table killed it off).