I have tested the new binutils package on the test program at the top of this ticket. I no longer see the "Inconsistency detected..." message, but I do receive a segfault.
Here is what I did to upgrade:
#> cat /etc/apt/sources.list.d/trusty-proposed.list
deb http://archive.ubuntu.com/ubuntu/ trusty-proposed restricted main multiverse universe
#> apt-get update
#> apt-get install binutils=2.24-5ubuntu13
#> apt-cache policy binutils
binutils:
Installed: 2.24-5ubuntu13
Candidate: 2.24-5ubuntu13
Version table:
*** 2.24-5ubuntu13 0
100 /var/lib/dpkg/status
Here is what I did to compile the test program:
$> g++ test.cpp -lGL
$> ./a.out
Segmentation fault (core dumped)
Explicitly linking against NVIDIA's libGL.so seems to get rid of the segfault:
$> g++ test.cpp -L /usr/lib/nvidia-331 -lGL
$> ./a.out
Here is a backtrace from gdb.
Program received signal SIGSEGV, Segmentation fault.
0x0000000000000000 in ?? ()
(gdb) bt
#0  0x0000000000000000 in ?? ()
#1  0x00007ffff406c291 in init () at dlerror.c:177
#2  0x00007ffff406c6d7 in _dlerror_run (operate=operate@entry=0x7ffff406c130 <dlsym_doit>,
    args=args@entry=0x7fffffffe550) at dlerror.c:129
#3  0x00007ffff406c198 in __dlsym (handle=<optimized out>, name=<optimized out>) at dlsym.c:70
#4  0x00007ffff7b4eb3e in ?? () from /usr/lib/nvidia-331/libGL.so.1
#5  0x00007ffff7b32db4 in ?? () from /usr/lib/nvidia-331/libGL.so.1
#6  0x00007ffff7dea0fd in call_init (l=0x7ffff7ff94c0, argc=argc@entry=1,
    argv=argv@entry=0x7fffffffe698, env=env@entry=0x7fffffffe6a8) at dl-init.c:64
#7  0x00007ffff7dea223 in call_init (env=<optimized out>, argv=<optimized out>,
    argc=<optimized out>, l=<optimized out>) at dl-init.c:36
#8  _dl_init (main_map=0x7ffff7ffe1c8, argc=1, argv=0x7fffffffe698, env=0x7fffffffe6a8)
    at dl-init.c:126
#9  0x00007ffff7ddb30a in _dl_start_user () from /lib64/ld-linux-x86-64.so.2
#10 0x0000000000000001 in ?? ()
#11 0x00007fffffffe968 in ?? ()
#12 0x0000000000000000 in ?? ()
I am using nvidia-331=331.113-0ubuntu0.0.4, but nvidia-331-updates=331.113-0ubuntu0.0.4 seems to show the same behavior. If I can provide more information, let me know. I'll update the tag too (if I can find where to do it).