currentMode->imageHeight = (info->FbMapSize / currentMode->bytesPerScanline);
On my Radeon IGP chipset with 64MB of VideoRam, this value is calculated as
65536 (0x010000). Although here the variable is a 32 bit integer, later in the
dga library this value is truncated to a 16 bit integer, causing the driver to
report the imageHeight as zero. I added a single line beneath the current line
144 to clamp the value to the maximum a 16 bit integer can hold:
if (currentMode->imageHeight > 0xFFFF) currentMode->imageHeight = 0xFFFF;
This fixes the problem for my Radeon. I suspect the other ATI drivers may
suffer from the same problem, but I do not have the hardware to test.
Can I ask what issues you see without this line added?
(In reply to comment #1)
> Can I ask what issues you see without this line added?
The problem is that the imageHeight is set to zero, so when an application tries
to detect a usable DGA mode, these modes are skipped (assuming the application
checks the available widths and heights). I also submitted the same bug report
to the XFree86 devs, and I believe they fixed it in a more general manner by
clamping the coordinates to 16 bit values in "Xext/xf86dga2.c".
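A minimal sketch of what such a general clamp might look like; the helper name is hypothetical and this is not the actual xf86dga2.c patch:

```c
#include <stdint.h>

/* Hypothetical helper: clamp a 32-bit dimension to the 16-bit range
 * used by the DGA wire protocol, so oversized framebuffer dimensions
 * saturate at 0xFFFF instead of wrapping to 0 when reported to clients. */
static uint16_t ClampToCARD16(uint32_t value)
{
    return (value > 0xFFFF) ? 0xFFFF : (uint16_t)value;
}
```

Applying this to every width/height field before it is written into the reply fixes the problem for all drivers at once, rather than patching each driver's mode setup separately.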
Has this bug been fixed?
Created attachment 4819 [details]
Fixed 6.9.0 Source
I have modified "xc/programs/Xserver/Xext/xf86dga2.c" (which I have
taken from xorg 6.9.0) and fixed this problem.
Created attachment 4820 [details]
diff of the fixed source and the original 6.9.0 source
Sorry about the phenomenal bug spam, guys. Adding xorg-team@ to the QA contact so bugs don't get lost in future.
dga == dead dead dead.