Operationally, at least, I seem to have resolved my problem with
Xorg 6.9.0 and the Mach32 ATI video card.
I added the following lines to Section "Device" in xorg.conf.new:
Option "tv_out" "false"
Option "tv_standard" "None"
I suspect only the latter is necessary, but I haven't experimented to confirm.
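For context, here is a minimal sketch of how those options sit in the generated file. The Identifier and Driver lines are assumptions based on typical "Xorg -configure" output, not copied from my actual file:

```
Section "Device"
        Identifier  "Card0"
        Driver      "ati"
        Option      "tv_out"      "false"
        Option      "tv_standard" "None"
EndSection
```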
The reason this works appears to be a combination of bugs in the
ATI driver and in Xorg's general code. There are many
implementations of strncasecmp in different packages, despite
what seems to be a standard version available in libc. Xorg's is
xf86strncasecmp, and it is written so that it goes ahead and
dereferences a null pointer if one is passed to it - hence the
segmentation fault crash.
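The failure mode is easy to sketch in plain C. The wrapper below is hypothetical - it is not the actual xf86strncasecmp source - but it shows the kind of null check whose absence produces the SIGSEGV:

```c
#include <stddef.h>
#include <strings.h>

/* Hypothetical guard - illustrative only, not the Xorg implementation.
 * strncasecmp (and the unguarded xf86strncasecmp) dereferences both
 * arguments, so a NULL string must be rejected before the call. */
static int guarded_strncasecmp(const char *s1, const char *s2, size_t n)
{
    if (s1 == NULL || s2 == NULL)
        return (s1 == s2) ? 0 : (s1 == NULL ? -1 : 1);
    return strncasecmp(s1, s2, n);
}
```

With a guard like this, a missing TvStd string would simply compare as unequal to every candidate standard instead of crashing the server.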
The null pointer comes from the string TvStd not being
initialized when ATIProcessOptions runs with the xorg.conf.new
file generated by "Xorg -configure". While I haven't traced how
the variable gets defined, changing the option above seems to
have that effect, since the server no longer crashes.
I've been trying to bring up X Windows on an old PC with an ATI
Mach32 video card running FreeBSD 6.1.
First, I was able to run the card with an ATI driver under
FreeBSD 5.3, which was using Xorg 6.7.0, I believe.
Second, I can bring up test screens by mucking with the
configuration file and choosing other drivers - notably "vga".
Third, I posted a query to the FreeBSD-Questions list but
haven't gotten really helpful responses ... so far.
Fourth, the problem appears to be consistent. I first run
Xorg -configure
to get the configuration file and then do as it says at the end
of its output and run
X -config xorg.conf.new
which fails with a segmentation fault at the same place every time;
from the Xorg.0.log file:
Fatal server error:
Caught signal 11. Server aborting
By running X via gdb I was able to get this backtrace data
Program received signal SIGSEGV, Segmentation fault.
0x0809854d in xf86strncasecmp ()
#0 0x0809854d in xf86strncasecmp ()
#1 0x286a7cb9 in ATIProcessOptions () from
#2 0x286b8748 in ATIPreInit () from /usr/X11R6/lib/modules/drivers/atimisc_drv.so
#3 0x0806e2f1 in InitOutput ()
#4 0x080c3d84 in main ()
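For anyone reproducing this, the backtrace came from a gdb session along these lines (the path to the Xorg binary is an assumption for this FreeBSD/Xorg 6.9.0 install):

```
$ gdb /usr/X11R6/bin/Xorg
(gdb) run -config xorg.conf.new
Program received signal SIGSEGV, Segmentation fault.
0x0809854d in xf86strncasecmp ()
(gdb) bt
```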
As near as I've been able to tell, this is the result of TvStd
in ATIProcessOptions being passed to xf86strncasecmp as a null
pointer. Of course, it probably doesn't make much sense for
xf86strncasecmp to be dereferencing a null pointer in the first place.
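The caller-side hazard can be sketched as follows. The names below (lookup_option, the option table) are illustrative assumptions, not the ATI driver's actual code; the point is that an option lookup can legitimately return NULL, so the result must be checked before any strncasecmp-style comparison:

```c
#include <stddef.h>
#include <strings.h>

/* Toy option table - illustrative only, not the driver's real structures. */
struct opt {
    const char *name;
    const char *value;   /* may be NULL when the option has no string */
};

/* Return the option's value, or NULL if absent - mirroring how a
 * missing/uninitialized TvStd string could reach the comparison. */
static const char *lookup_option(const struct opt *opts, const char *name)
{
    for (; opts->name != NULL; opts++)
        if (strcasecmp(opts->name, name) == 0)
            return opts->value;
    return NULL;
}

/* Safe use: check for NULL before comparing. */
static int is_pal(const struct opt *opts)
{
    const char *std = lookup_option(opts, "tv_standard");
    return std != NULL && strncasecmp(std, "PAL", 3) == 0;
}
```

The crash described above corresponds to skipping the NULL check in is_pal and calling the comparison unconditionally.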
Sorry about the phenomenal bug spam, guys. Adding xorg-team@ to the QA contact so bugs don't get lost in future.
Can you attach the xorg.conf that was generated for you? Also, does this issue still show up in a newer xserver?
This is definitely a server bug, since the tv standard and tv out code hasn't changed since it was originally checked in. In trying to chase it down, I can't find anywhere that this would have gone wrong.
I don't see myself fixing this without the xorg.conf available. I get the feeling that this has been fixed already and I just don't know where it might have been. Well, I don't have the xorg.conf to test, so I can't be sure.
If this is still an issue for you with a newer xserver, feel free to open this bug back up. It'd help a lot if you attached the generated xorg.conf, since I can't see how option collection would get a null string. For now, I'm going to say this was fixed since I can't see how it would happen again.