Created attachment 22432 [details]
glxinfo output

xf86-video-intel-2.6.1 (actually all versions from 2.5.0 to git have this problem) is slow when using EXA+DRI1 with a 2.6.28 kernel (with 2.6.27 everything is OK). glxinfo says that I'm using direct rendering, but 3D performance is terrible (~9 fps in quake3).

Created attachment 22433 [details]
Xorg.0.log

Created attachment 22434 [details]
xorg.conf
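For reference, here is a quick way to cross-check from a running system what the attached logs show - a minimal sketch, assuming the default log location /var/log/Xorg.0.log and that glxinfo/glxgears (e.g. from a mesa-utils package) are installed:

glxinfo | grep -i "direct rendering"           # should say "Yes" even when performance is bad
glxgears                                       # very rough fps number for comparing kernels/drivers
grep -i "memory manager" /var/log/Xorg.0.log   # GEM vs. classic, as discussed below
grep -i "tiling" /var/log/Xorg.0.log           # the "Failed to set tiling" errors quoted below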
The glxinfo output says DRI is in effect and GEM is used (since you are using a 2.6.28 kernel). Why do you think it's using classic mode (not GEM)?

Eric, the log says:
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on back buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on depth buffer: rejected by kernel

Well, I thought that by default the driver should use EXA, DRI1 and classic mode. And according to Xorg.0.log it is using DRI1 and EXA (OK, I'm not sure about classic mode).

(In reply to comment #4)
> Well, I thought that by default the driver should use EXA, DRI1 and classic mode.

Yes, by default the driver should use EXA and DRI1. But it's GEM, not classic mode, since you are using all the new bits including a 2.6.28 kernel. "DRI memory manager" in the X log also proves it's in GEM mode. Btw, how many fps do you get with a 2.6.27 kernel?

Up to 80 fps.

Any news on this? The bug is reproducible with the 2008q4 package, which is declared "recommended to ordinary users/OSVs". Tested on three laptops with a GMA 950:
- Lenovo ThinkPad T60 (Arch Linux)
- Lenovo 3000 N100 (Gentoo ~x86 with the x11 overlay)
- Asus (don't know the model, but I can ask its owner if you really need this info; Gentoo ~x86 with the x11 overlay)

I am also interested in a fix for this. OpenGL has always been a lot slower compared to the Windows Intel drivers, but with the recent GEM refactoring it's almost unusable :-/

Please attach dmesg.

Created attachment 23311 [details]
Dmesg output on 2.6.27 kernel with EXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
Created attachment 23312 [details]
Xorg.0.log on 2.6.27 kernel with EXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
Created attachment 23313 [details]
Dmesg output on 2.6.29-rc6 kernel with EXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
Created attachment 23314 [details]
Xorg.0.log on 2.6.29-rc6 kernel with EXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
Created attachment 23315 [details]
Dmesg output on 2.6.29-rc6 kernel with UXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
Created attachment 23316 [details]
Xorg.0.log on 2.6.29-rc6 kernel with UXA (1.5.99.903 xserver, 2.4.5 libdrm, 2.6.2 intel, 7.3 mesa)
OK, here are a few attachments for your perusal.
Please note that the dmesg output is reused in the UXA test (i.e. the results from the EXA run appear first). In hindsight I probably should have edited them...
(I reused the dmesg from the 2.6.27 kernel test too btw - sorry!)

I should clarify a bit here. I can currently use a 2.6.27 kernel, libdrm 2.4.4 and intel 2.6.1 and get reasonable 3D performance. It's not the best it's ever been (previous combinations have worked better) but it works pretty well. If I keep the same kernel and upgrade to libdrm 2.4.5 + intel 2.6.2, the performance drops considerably. This performance drop is in some way due to the following patches:

libdrm:
  intel: libdrm support for fence management in execbuf

intel:
  1001-Remove-logical-context-setup.patch
  1002-Fix-compile-failure-after-45f45c73469f1bd46a1b6fb206.patch
  1003-Make-i830_allocate_memory-take-tiling-parameters.patch
  1004-Support-tiled-back-depth-on-915-class-hardware-with.patch
  1005-Resize-framebuffer-on-screen-size-change-requires-U.patch

(I can look up the exact commits; I'm just going on an svn diff from our package repository where these patches were backported to our libdrm/intel packages. Merging just these patches was enough to cause the slowdown on my system when I tested.)

Even with the older libdrm and intel drivers, I had issues using any kernel > 2.6.27. With the new release, things are generally worse! I'm currently running outdated packages that I have diligently kept so I can actually use my machine while I provide this feedback. Hopefully a fix is forthcoming. :)

It's probably worth mentioning also what my Mandriva colleague Ander said to me recently: the kernel is not able to enable tiled rendering for 945 chipsets when the system uses dual-channel memory in interleaved mode. It seems this is the most common case. :(

(In reply to comment #17)
> It's probably worth mentioning also what my Mandriva colleague Ander said to me
> recently: the kernel is not able to enable tiled rendering for 945 chipsets
> when the system uses dual-channel memory in interleaved mode. It seems this is
> the most common case. :(

Uh-oh, that's my case :( Eric, do you still need my dmesg?

Is this bug getting fixed soon? It took me three weekends to find out why every new distribution is unusably slow on my laptop with a 945GM (I tried several: Ubuntu Intrepid & Jaunty, openSUSE 11, Fedora 10 and Arch) until I saw that this error is present in all the Xorg logs. I have 2x1 GB of memory in dual channel, so the cause is known; the question is whether I should post dmesgs, Xorg.logs or run any tests to help get this bug fixed. I also think that bugs http://bugs.freedesktop.org/show_bug.cgi?id=18586 and http://bugs.freedesktop.org/show_bug.cgi?id=16835 might be connected to this one, since the error messages are similar and all concern memory problems.

Please attach intel_reg_dumper output from master.

Colin, if you're using 2.6.27 (non-GEM), you have a completely different problem.

(In reply to comment #20)
> Please attach intel_reg_dumper output from master.
>
> Colin, if you're using 2.6.27 (non-GEM), you have a completely different
> problem.

Where can I get intel_reg_dumper? Btw, the problem disappeared with a 2.6.29-rc kernel (I'm using 2.6.29-rc7).

Created attachment 23867 [details]
Xorg.0.log with EXA on 2.6.28 with the driver from git-20090315
I think he is referring to the recent changes in the git tree - see the register dumping changes. I hope I assumed correctly: I compiled the driver from git today on Arch Linux (kernel 2.6.28 and libdrm 2.4.5) and attached the Xorg.0.log with the ModeDebug and FallbackDebug options enabled.
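For anyone else who needs to produce the same kind of log, here is a minimal sketch of the relevant xorg.conf options - the Identifier string is only a placeholder, the AccelMethod line is needed only if you want to force EXA or UXA explicitly, and the Option lines should be merged into your existing Device section rather than added as a second section:

Section "Device"
    Identifier "Intel GMA"                 # placeholder - keep your existing Identifier
    Driver     "intel"
    Option     "AccelMethod"   "EXA"       # or "UXA"
    Option     "ModeDebug"     "true"
    Option     "FallbackDebug" "true"
EndSection

After restarting X, the extra mode-setting detail should show up in Xorg.0.log alongside the tiling messages already quoted in this bug.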
(In reply to comment #20)
> Colin, if you're using 2.6.27 (non-GEM), you have a completely different
> problem.

I've tried all kernels up to the latest 2.6.29-rc, as you can see from the numerous attachments I added as per your original request. I only mention .27 because it is the only kernel that gives me halfway decent performance (albeit not with the recent libdrm + intel versions, which have killed performance on these older kernels too).

(In reply to comment #23)
> (In reply to comment #20)
> > Colin, if you're using 2.6.27 (non-GEM), you have a completely different
> > problem.
>
> I've tried all kernels up to the latest 2.6.29-rc, as you can see from the
> numerous attachments I added as per your original request. I only mention .27
> because it is the only kernel that gives me halfway decent performance (albeit
> not with the recent libdrm + intel versions, which have killed performance on
> these older kernels too).

Tiling doesn't work for you for some reason; here's part of your Xorg.0.log:
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on back buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on depth buffer: rejected by kernel
That's the cause of the low performance. Btw, tiling works fine for me with EXA, but doesn't work with UXA. Try EXA on the latest kernels ;)

(In reply to comment #24)
> That's the cause of the low performance. Btw, tiling works fine for me with EXA,
> but doesn't work with UXA. Try EXA on the latest kernels ;)

:) I was literally just trying UXA and then EXA on the latest 2.6.99.902 driver and kernel 2.6.29-rc8 right now! The same problems persist with both UXA and EXA - tiling is not available with either, with the same errors (though I've not rebooted, just restarted the X server - I will try rebooting just in case the hardware was put into some invalid state when UXA was attempted). There are differences between UXA and EXA though:

[root@jimmy intel]# grep tiling Xorg.log-2.6.29-rc8-UXA.txt
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
[root@jimmy intel]# grep tiling Xorg.log-2.6.29-rc8-EXA.txt
(EE) intel(0): Failed to set tiling on front buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on back buffer: rejected by kernel
(EE) intel(0): Failed to set tiling on depth buffer: rejected by kernel

I can post the full log if you want and am happy to test anything and everything. We've not got long before the Mandriva release and solving this issue is quite critical :s

Actually, a slight update. It appears that using the latest kernel (2.6.29-rc8) with the older intel (2.6.1) and libdrm (2.4.4) drivers I get pretty good 3D performance. I do however still get:
(EE) intel(0): Failed to set tiling on front buffer: Invalid argument
which is slightly different from the previous errors. I'm not sure if this is helpful, and I think I remember reading that this setup has some memory-leak issues, but time will tell on that front.

If you aren't the submitter of this bug report, get your own bug report. These tiling problems are a pain to figure out, as it's a combination of a bunch of different potential chipset problems with a bunch of slightly buggy software, and often issues in the reports themselves with wrong logs getting attached. By having everyone pile on one bug report, any hope of discerning what's going on for any one person is lost.
And to everyone opening a new bug report: you need either ModeDebug enabled in your config file or the output of intel_reg_dumper.

Vasily, could I get you to attach your dmesg and Xorg.0.log with ModeDebug enabled?

(In reply to comment #27)
> If you aren't the submitter of this bug report, get your own bug report.

Sorry if you consider my input on this bug to be counter-productive. Your request is a deviation from standard bug-reporting practice, which tries to cut down on unnecessary duplicates, and I've got about 10 or 15 people experiencing problems that look to be covered by this bug, which I've so far been managing in the Mandriva bugzilla and mailing lists to avoid spamming upstream developers with multiple bug reports. Do you really want me to get them all to report a new bug here? Feel free to reply off-bug if that's more appropriate.

Created attachment 23936 [details]
dmesg output + Xorg.0.log for working and not working cases
Sorry, false alarm.
It's still not working with vanilla 2.6.29-rc7 - the driver refuses to enable tiling.
But for some reason it works with EXA and a small patch on the 2.6.29-rc7 kernel (look for it in the attached archive) - maybe I'm just lucky and it doesn't crash for me :). With UXA + this patch it just stops working after a while (the driver stops updating the screen).
Dmesg and Xorg.0.log with ModeDebug enabled are in the attached archive.
Created attachment 24000 [details]
Xorg.0.log with 2.6.29-rc8 and xf86-video-intel from git
xf86-video-intel git commit is 6deb26ae7bd796e88a5dd90df5f6c35fbc44e798
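For anyone who wants to reproduce that exact driver build, a rough sketch follows - the git URL is the freedesktop.org location of the time and may have moved since, and the --prefix and the reg_dumper path are assumptions from memory rather than something confirmed in this bug:

git clone git://anongit.freedesktop.org/xorg/driver/xf86-video-intel
cd xf86-video-intel
git checkout 6deb26ae7bd796e88a5dd90df5f6c35fbc44e798   # the commit mentioned above
./autogen.sh --prefix=/usr     # requires the usual xorg development headers and autotools
make
make install                   # as root; this replaces the distribution driver, so keep a backup
# If memory serves, the intel_reg_dumper tool Eric asked for is built under
# src/reg_dumper/ in trees of this era and must be run as root.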
Created attachment 24001 [details]
dmesg with 2.6.29-rc8 and xf86-video-intel from git
I think this bug can be closed as a duplicate of bug 16835.

OK, I've looked at the log and confirmed that it's the same. Thanks. I've also finally got a machine with the problem, so I should have some progress here soon.

*** This bug has been marked as a duplicate of bug 16835 ***

(In reply to comment #33)
> I've also finally got a machine with the problem, so I should have some
> progress here soon.

\o/ (/me is hoping his problem is the same but will be patient!). Here's hoping it can make it into the 2.7 release, but that's perhaps a little optimistic :)