Software: lgfxgears (all Linux versions).

On computers with an Intel X3000-series GPU, the following error message appears:

engine386.lnx: brw_draw_upload.c:412: brw_prepare_vertices: Assertion `input->offset < input->bo->size' failed.

On computers with other Intel graphics cards: segmentation fault OR white screen.

Other issues: glCopyTexImage2D may be broken?

----------
https://sourceforge.net/projects/lgfxgears/
This is an assertion to catch bugs in GL users before they crash the GPU. Have you checked the VBO setup in the application?
(In reply to comment #1)
> This is an assertion to catch bugs in GL users before they crash the GPU.
> Have you checked the VBO setup in the application?

Eric, thank you for your reply. This mini-benchmark is based on a large, passive/monolithic game engine that I have been developing for four years. With this engine I have built several applications and games since 2006, mostly for Windows. Over the years a number of serious bugs were found in the code, and I fixed them. It is of course possible that the bug is in my code, but I don't think so: under Windows, 3dfx, Matrox, PowerVR, ATI, SiS, S3, 3Dlabs, Intergraph, NVIDIA, and other graphics cards (Intel too) work with this engine without problems. Under Linux, however, only NVIDIA and ATI cards have been tested so far.

In this mini-benchmark the geometry is rendered with a VBO (if available), with immediate mode (glBegin) used for the particles. The VBO geometry is floating-point triangles, submitted as GL_TRIANGLES. The VBO handling has not changed in two years, and there is nothing special in the setup: I generate with glGenBuffersARB(1, ...), then bind with glBindBufferARB(GL_ARRAY_BUFFER_ARB, ...), then upload the data with glBufferDataARB(GL_ARRAY_BUFFER_ARB, ..., ..., GL_STREAM_DRAW_ARB). In the render loop I use multitexturing (one pipeline gets glTexCoord1f, the second glTexCoord2f); strides are always 0; finally there is a glDrawArrays call. If no VBO is available, rendering happens without it; if there is no vertex array support at all, the engine falls back to the classic immediate-mode pipeline.

I can't test the engine on an Intel GPU under Linux myself (I don't have one), so I can't check what is causing the problem. Thanks for your help!
-Geri
Download URL changed: http://legend.uw.hu/lgfxgears.zip. Version changed to 1.13. The bug still exists.
The archive doesn't even extract on Linux. Try printing out the offset (which should be the pointer you passed in for your vertex array, though of course it won't be an actual pointer) and the size of the buffer object (possibly the page-aligned size of a buffer object you created) that are being compared in that assertion.
Just tried the latest lgfxgears 1.13 on a Matrox G550 under Ubuntu 10.04 and it segfaults. I will send the full log; hope this helps.
Created attachment 36627 [details] text output when launching lgfxgears
(In reply to comment #6)
> Created an attachment (id=36627) [details]
> text output when launching lgfxgears

Thanks for your bug report. It seems the Matrox and Intel drivers contain the same code that is causing this bug.
An out-of-date binary test case; no wonder it crashes. Strangely enough, it works under valgrind, and I can't reproduce the assertion failure.