nVidia and current Kernels
My home workplace is slowly and steadily mutating into a never-ending story. I do not remember blogging every aspect of it, but after three graphics cards, an even older mainboard and two DVB-S cards, my home workplace PC currently does what I expect it to do: run Debian unstable, drive two 20 inch DVI TFT monitors with 1600x1200 pixels each, and receive DVB-S transmissions. I do not think that these are exaggerated expectations, but it took over three months to find a combination of hardware which actually does what I want.
The hardest part was finding an AGP graphics card which can drive two DVI monitors with 1600x1200 pixels each. After failing with two different Matrox cards (the G550 is not able to do 1600x1200 pixels if the monitors are connected via DVI), I finally settled on a used GeForce FX 5200. In the beginning, the binary nVidia module didn't hurt as much as I expected. Unfortunately, this rapidly changed with the 2.6.27 Linux kernel.
The Linux kernel changed its internal interfaces with 2.6.27, which stopped the nvidia-kernel-source 173.14.09 from sid from compiling. Contrary to what is said in #500285, the driver doesn't compile on my system even when I clean out /usr/src/modules/nvidia-kernel before unpacking the driver .tar.gz.
I already thought that my problem was solved when I noticed nvidia-kernel-source 177.80 in Experimental. The module compiles just fine against kernel 2.6.27, but...
Nov 21 17:30:53 weave kernel: NVRM: The NVIDIA GeForce FX 5200 GPU installed in this system is
Nov 21 17:30:53 weave kernel: NVRM: supported through the NVIDIA 173.14.xx Legacy drivers. Please
Nov 21 17:30:53 weave kernel: NVRM: visit http://www.nvidia.com/object/unix.html for more
Nov 21 17:30:53 weave kernel: NVRM: information. The 177.80 NVIDIA driver will ignore
Nov 21 17:30:53 weave kernel: NVRM: this GPU. Continuing probe...
Nov 21 17:30:53 weave kernel: NVRM: No NVIDIA graphics adapter found!

Don't closed source, binary-only drivers just suck? I mean, granted, the FX 5200 is a five year old design, but it does its basic office job just fine. So I can currently either stick with a legacy kernel of the 2.6.26 series to be able to run the legacy nVidia driver, or shell out money for a new graphics card plus a new mainboard, a new CPU and new memory (since current graphics cards aren't available for AGP any more). Just fine. Sucks.
If I get around to buying a new box, which graphics card manufacturer is to be trusted these days? nVidia is out of the question after my current experiences, Intel doesn't make graphics cards as far as I know, and ATI/AMD? I hear that they have recently seen the light, but are their open source drivers mature enough that ATI/AMD products can already drive a high-resolution dual-DVI setup? I currently do not care about 3D performance; I just want to push around windows on a flat KDE desktop and do some serious work. All I want is screen real estate and a picture of decent quality. I'd appreciate any hints and comments, and surely hope that there will be some kind of upstream support for the 2.6.26 kernels until I have refitted my home workplace hardware.
Michael Croes on :
"Intel doesn’t make graphics cards as far as I know" Well, you don't need a graphics card, all you need is a graphics chip to drive some DVI connectors... If I were in your position, I would go for something with onboard 4500-series Intel graphics, I think. Just add an ADD2 card (a PCIe card with DVI connectors for supported hardware, like Intel's chips) and you're fine. I myself have a GeForce 8800 GTS/512, that's still a very decent card and will be supported for the next couple of years, by which time I will have moved on to my next computer, I think... Another option for you would be to use nouveau. I don't know how good nouveau's support for your 5200-series card is, but I'm impressed by the functionality of their driver on my GeForce 6600 card.
If you want more information on anything I said here and your comment system doesn't notify commenters on updates, send me an E-mail and I'll answer whatever more you want to know...
Marc 'Zugschlus' Haber on :
That would mean a new mainboard, a new CPU and new memory, since the current board doesn't have a graphics adapter. And I'm really reluctant to buy something with on-board graphics - I am an old PC geek who appreciates the possibility to swap out components.
Additionally, the "notify on new comments" function had to be switched off since blogs have been in the line of legal fire for not being opt-in.
fatal on :
"That would mean a new mainboard, a new CPU and new memory, ..."
That's pretty much the situation anyway... No one makes AGP cards anymore, or at least no one supports those legacy cards.
"I’m really reluctant to buy something with on-board graphics.." Get over it! There's one company whose graphics cards "just work" in Linux. They are called Intel. Is a non-working card better than working onboard graphics? You choose! Also, onboard doesn't mean there's no slot to put in a replacement graphics card should you one day need it.
(Friendly advice from someone who gave up on nvidia many years ago and now has both Intel and ATI graphics.)
Ken Bloom on :
If you don't care about 3D performance, perhaps the open source nv driver will do the trick. If not, then maybe the experimental open source Nouveau driver will also do the trick. AFAIK, neither requires special kernel support.
Ken Bloom on :
Whoops. It says plain as day in nouveau's description:
Users wishing to install these drivers should first use module-assistant to build the drm kernel modules (via ``module-assistant auto-install drm'').
But nv doesn't require that, and neither nouveau nor drm-modules are proprietary.
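For reference, the module-assistant route described in the package description can be sketched roughly as follows. This is a hedged sketch assuming a Debian system of that era with the drm module source packaged for module-assistant; it needs root and is not something that can be verified outside such a system:

```shell
# Sketch: build and install the drm kernel modules for the running
# kernel, as the nouveau package description suggests. Run as root.
apt-get install module-assistant

# Fetch kernel headers and build dependencies for the running kernel.
module-assistant prepare

# Build the drm modules against the running kernel and install the
# resulting .deb in one step.
module-assistant auto-install drm
```

After that, the nouveau (or nv) X driver can be selected in xorg.conf as usual.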
elkin on :
you could try to compile the beta drivers for the 5 series:
Anonymous on :
As another commenter mentioned, Intel does do graphics, they just don't do standalone graphics cards (yet). Just buy an Intel chipset with integrated graphics. Intel, unlike both ATI and nVidia, has excellent Linux and X support. Every single Intel graphics chipset will work with Linux and X on or before the day it goes on sale. Not only that, but most of the innovations in the X Window System, such as BIOS-free modesetting, kernel memory management, and kernel modesetting, tend to work with Intel graphics hardware first, simply because Intel hired many of the top X Window System developers.
Anonymous on :
I like how you blame nVidia rather than the APIs that are constantly broken, as if nVidia has a say in how quickly things change in FOSS. Can you name another hardware manufacturer that provides support for 10-year-old hardware that is for all intents and purposes obsolete?
It should be pointed out that the GeForce 7xxx and Quadro FX 4xxx series can still be found in AGP configurations, so you aren't forced to upgrade, yet.
Alex on :
Based on a friend's experience, if you can find an ATI card that works with the free driver, you can get good performance with KDE 4 (and presumably KDE 3 as well).
Florian Lohoff on :
Solved the very same problem with a Radeon 9600 AGP - there are some with dual DVI and 256 MByte of memory, passively cooled. I think the manufacturer was HIS. This setup is supported by the native X.org drivers, even in etch ...
Chris on :
"Don’t closed source binary-only drivers just suck?"
No, just the implementation in Linux that changes all the time.
Seriously, why do Linux APIs change with every minor version?
It smells like bad software engineering to me.
Anonymous on :
Linux's only "API", the system call layer it exposes to user space, never changes incompatibly. Ever. Linux still supports every single system call it ever had, with no incompatible changes to interfaces. Thus, any program written for a given version of Linux will continue to work with all future versions of Linux.
However, Linux specifically does not claim to provide a stable in-kernel API. As a result, Linux can change to provide new interfaces (functions, data structures, etc), migrate other bits of the kernel to the new interfaces, and then remove the old interfaces. Every part of the Linux kernel remains subject to redesign and improvement without regard to the limitations imposed by unchangeable legacy interfaces. As a result, Linux can achieve an unrivaled development rate, adapt quickly, remain efficient, and avoid unnecessary complexity.
Keeping old kernel functions and kernel data structures around for the sake of binary-only software, and allowing them to hold back the rest of the system, strikes me as "bad software engineering". Furthermore, keeping bits and pieces of kernel code out of the kernel source tree, and thus preventing the kernel developers from updating them together with the kernel when they make changes, also strikes me as "bad software engineering".
Robert on :
Smells like great software engineering to me.
Design choices aren't restricted to having to comply with ancient crufty interfaces.
You're forgetting that this is why Linux works so well for so much hardware. In-tree drivers are what give you the stability and wide-ranging support. You're just complaining about graphics cards whose specifications are kept secret.
chithanh on :
The Radeon cards up to the X1900 are fully supported by open source drivers (2D, 3D, and video acceleration). I got a second-hand X1600Pro AGP dual-DVI for cheap, and it works great with two 1280x1024 monitors.
Xake on :
So you blame Linux for something Windows does just the same? Why do you think it takes about five months after a new version of Windows before its graphics support becomes a bit less broken?

Internal APIs in the kernel are made to support currently existing hardware as well as possible, and sometimes new hardware comes around, or new ways of handling the hardware that work just so much better. And when an old internal API is no longer used internally, why "support" the old API any longer only because an outside company, which could have fixed its drivers long ago, does not want to spend the money doing it? Why don't in-tree kernel modules suck when it comes to API changes, when nvidia has the same resources to know what is going on, i.e. lkml, RCs and so on?
eremit on :
I'm really sorry to say that, but I wouldn't decide for an ATI/AMD card. From my experience it's really a pain in the ass to get them properly set up for dual screen. So if you don't want some enormous headache, you should stick with the NVIDIA cards, especially as NVIDIA released a new driver a few days ago. Maybe you should switch from the nvidia-kernel stuff from the repository to the original driver. You can simply install this via envy-ng for example (I think it's in Debian unstable too), and the best part is: you won't bypass the package system with it, as envy-ng builds .deb packages in the end.
nucco on :
I do enjoy my Intel 965 chipset, save for the fact that I lose DRI whenever I plug in a second monitor (which pushes the texture size (I can't remember what they call it) above 2048 pixels on the x-axis). That is moderately painful, since I cannot use compiz then. No, I don't use compiz for the eye-candy; I use it so I can switch workspaces by dragging windows across, and I can use the more intuitive window picker ("scale"), as well as the exposé plugin. Those are real productivity boosters. If only metacity would gain that kind of functionality.
kju on :
As reported in IRC, the NVIDIA card in question works with 2.6.27 using the beta driver source 173.14.15 which is linked here: ftp://download.nvidia.com/XFree86/Linux-x86/173.14.15
Get the binary ftp://download.nvidia.com/XFree86/Linux-x86/173.14.15/NVIDIA-Linux-x86-173.14.15-pkg0.run and run it with option -x to extract the source. Then replace the content of /usr/src/modules/nvidia/nv with the /usr/src/nv directory from the unpacked source, modify debian/changelog to reflect version 173.14.15, and run make-kpkg kernel_image. Works.
kju on :
Should be make-kpkg modules_image of course
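The recipe above, including the modules_image correction, can be condensed into a sketch like this. It assumes the Debian nvidia-kernel-source layout under /usr/src/modules/nvidia and a prepared kernel tree for make-kpkg; the extraction directory name and the dch invocation are my assumptions, and the whole thing needs root and network access, so treat it as an outline rather than a tested script:

```shell
# Sketch of the procedure described in the comment above. Run as root.

# Fetch the beta driver package and extract its source (-x extracts
# without installing). The .run file unpacks into a directory named
# after itself (assumption: NVIDIA-Linux-x86-173.14.15-pkg0/).
wget ftp://download.nvidia.com/XFree86/Linux-x86/173.14.15/NVIDIA-Linux-x86-173.14.15-pkg0.run
sh NVIDIA-Linux-x86-173.14.15-pkg0.run -x

# Replace the packaged kernel module source with the beta one.
rm -rf /usr/src/modules/nvidia/nv
cp -a NVIDIA-Linux-x86-173.14.15-pkg0/usr/src/nv /usr/src/modules/nvidia/nv

# Record the new version in debian/changelog (dch is one way to do it;
# editing the file by hand works just as well).
cd /usr/src/modules/nvidia
dch -v 173.14.15 "Switch to NVIDIA beta driver source 173.14.15"

# Build the module package against the configured kernel tree.
cd /usr/src/linux
make-kpkg modules_image
```

The resulting nvidia-kernel .deb can then be installed with dpkg -i as with any module-assistant/make-kpkg build.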
Marc 'Zugschlus' Haber on :
This is what I actually tried, but I did it the Debian way. A dedicated article will follow soon.