Nvidia Linux Drivers

Nvidia recently released an update (1.0-8174) to their high-performance Linux video card drivers. Driver releases are usually done to support new hardware; in this case, SLI series video cards and others such as the GeForce 6100, GeForce 6150 and GeForce 7800 GTX 512. Beyond the driver itself, there were some significant changes:

  • There is finally an HTML readme; the previous text-based file was very hard to navigate. (Note, however, that these HTML files do not display in Mozilla or Firefox due to improper MIME types!)
  • An nvidia-xconfig application is now included. The installer will properly configure your xorg.conf file, so there is no need to do it manually.

From my installation it appears the installer has become more sophisticated, which is welcome. Many Linux newbies have found installing the Nvidia driver frustrating.

My FC4 Installation Guide has been updated to reflect this new release.

EDIT (Dec 23, 2005): There was an incremental update, 1.0-8178. If everything worked with 8174, no upgrade is required.

Pocket Linux Server

About two years ago I purchased a Linux-based PDA: the Sharp Zaurus SL-5500. The PDA was intended to be used with Windows and (later) Linux. The initial Windows driver set the device up as a USB network device, while the latest driver sets it up as a normal USB PDA. I found that with the older driver I can assign an IP address to the device and configure it as a mini server. Although I no longer use it as a PDA, I have set up an Apache web server, a MySQL database and a PHP interpreter on it, all managed through its SSH server. I sometimes plug it into various machines and do some web development or toy around with it.

I only recently heard of the Black Dog pocket Linux server. When plugged into ANY computer it will create a basic X server and run a few basic Linux applications from the device within the host OS (Windows, Linux, etc.). It works similarly to the Zaurus’ networking over USB, so it can access anything the host computer can access. Even better, it has an intelligent resume feature which can preserve your working desktop and resume it in place later on a different machine.

It ships with a 400MHz processor, 64MB RAM and 256MB of flash-based storage in the base model. My Zaurus, by comparison, has a 206MHz processor, 32MB RAM and 64MB of storage. Better yet, it includes a biometric scanner, measures only about 3.5 by 2 inches, and comes with Apache, SSH and several other programs ready to run. For about $250 you can get the 512MB model, and by plugging in an SD card you can increase the storage easily.

Looks like I just found my Christmas present for myself.

Using Alternate Compilers

Users of Fedora Core 4, SuSE 10.0 and other distributions shipping GCC v4 may have found that some open source software does not compile properly. Using an alternate compiler may resolve the problem; FC4, for example, also ships with GCC v3.2. There are many ways to specify an alternate compiler during the build process; some are shown below.

Environment Variables
Most software supports the CC and CXX environment variables. First assign them, then run configure or make. Example:

# export CC=gcc32
# export CXX=g++32
# ./configure

A good example can be found on the Red Hat mailing list.
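As a small variant of the above (my own addition, relying on standard POSIX shell behavior rather than anything from the guide), the variables can also be set for a single command by prefixing the invocation, which leaves the surrounding shell environment untouched:

```shell
# One-shot form: assignments placed before a command apply only to
# that command, e.g.
#
#   CC=gcc32 CXX=g++32 ./configure
#
# Demonstrated here with a stand-in that prints what configure would see:
CC=gcc32 CXX=g++32 sh -c 'echo "CC=$CC CXX=$CXX"'
```

This is handy for one-off builds where you do not want gcc32 to remain the default compiler for the rest of your session.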

Configure Support
If the software uses the standard GNU automake and configure system, there is a chance it supports other compilers through an option to the configure script. First run ./configure --help to see if it mentions anything. The following example is from MPlayer:

# ./configure --help
# ./configure --cc=gcc32

Makefile Support
Sometimes the software may come with just a Makefile. Open the Makefile and look for variables that specify the compiler. You can either edit those variables or override them at compile time. For example:

(in Makefile)

CC = gcc32
CXX = g++32

Then, using the Makefile, you can override those variables on the command line:

# make CC=gcc32 CXX=g++32

Although the above examples are not exhaustive, they should provide some ideas for using alternate compilers in Fedora Core and other Linux distributions.

I am also providing some notes from my FC4 Guide on GCC and the GCC Compatibility Libraries.

Evaluating New Linux Distributions

For Linux and open source in general, choice has always been abundant. However, in the Linux server market and to a degree in the Linux “desktop” market, only a few major distributions have attracted most of the attention. In my (future) spare time, I plan on evaluating new Linux distributions to see how well they compare as either a Linux server (a preliminary examination) or, more critically, a Linux desktop.

I want to develop a common test/evaluation plan so I can make fair assessments of each distribution's comparative value. Additionally, I do not have the luxury of testing any system thoroughly over an extended period of time. The following are some ideas I have thought out.

  • Installation
    • Partitioning, Dual Boot, Networking
  • Software Selection
    • Environments, Office, Multimedia, Development, Server
  • Software Support
    • Security and General Updates, Software Repositories, Package Management
  • Basic Hardware Support
    • Motherboard, Networking, Video, Sound, Power Management (ACPI), DVD/DVD-R
  • Peripheral Hardware Support
    • Digital Camera, Printers, Scanners, Media Card Readers, PDA’s, USB Devices
  • Community Support
    • Mailing Lists, Websites, Forums, Newsgroups
  • General Usability and Stability
  • Default Behavior and Configuration

I think I have covered the most important issues. Hopefully I will be able to perform all of the above for every distribution. I would appreciate suggestions and comments on these points.

CentOS 4.1 Quick Examination

When Red Hat stopped supporting their commercially available Linux distribution, they moved to an enterprise server product (Red Hat Enterprise Linux, or RHEL) and left everyone else with a community effort (Fedora Core). Given the cost of RHEL, its source packages have been recompiled and redistributed; the resulting CentOS is a free, binary-compatible distribution of RHEL without the proprietary Red Hat-only software. I have seen some virtual private servers using CentOS as the virtualized operating system.

I installed the DVD-based copy of CentOS 4.1 on my main desktop a few weeks ago just to experiment. For the most part, it installs and is set up very similarly to Fedora Core 3(?). Upon initial inspection it also seems to come with a similar list of software. The major caveat is that it is geared for server use: it does not include the latest versions of most software, favoring widely supported and stable applications instead (i.e. PHP4 as opposed to PHP5, etc.). I could tell that the desktop applications were a bit out of date, but that makes sense considering its intended purpose.

After having used Fedora Core 4 for the past four months, I was shocked at the performance of CentOS. CentOS seemed significantly faster than FC4. In terms of user interface, web browsing, launching applications, and even running servers and daemons, the difference was very noticeable. I do not know if it has to do with GCC4 being used in FC4, but I am still surprised to realize this now. In the end, I decided I could not use it for desktop purposes, since much of the software was older and there was not as much community support in general.

I played around configuring different installs of Apache, MySQL and PHP. I tried recompiling source RPMs (SRC.RPM) from FC4, but it became tricky to have multiple versions running. I will try again from source files instead.
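A sketch of the source-based approach I plan to try (the version number and /opt prefix are my own illustrative choices, not from this post): confining a second PHP build to its own prefix keeps it from colliding with the distribution's packaged version. The build commands are shown as comments since they depend on the source tree being present.

```shell
# Hypothetical sketch: build a second PHP under a dedicated prefix so
# it can coexist with the packaged PHP4. Paths and versions are examples.
PREFIX=/opt/php5
#   tar xzf php-5.x.y.tar.gz && cd php-5.x.y
#   ./configure --prefix="$PREFIX"
#   make && make install
# The alternate interpreter then lives outside the packaged paths:
echo "alternate binary: $PREFIX/bin/php"
```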

My opinion so far is that it is a VERY stable and usable server distribution. If the software it included was newer I would probably be using it as my main Linux desktop. Anyways, more later as I learn more.

Pitfalls to Installing Everything

The purpose of this article is to explain the potential problems with installing every package included in a given Linux distribution. For the most part this is a bad practice, and it is not conducive to becoming proficient in Linux for either a seasoned professional or a newcomer (i.e. a “newbie”). It is my hope that this will help educate people on the subject.

There are some abundance arguments that are commonly used and overstated; specifically, that disk space, memory and bandwidth are all “cheap”. Technically, none of these is always true. In fact, in developing countries these are almost always false.

There are some minimal advantages to installing everything: there will not be any dependency issues among the packages included in the distribution, and all software will be immediately available to try and test. Other advantages are possible, but these are the most relevant.

The problems I see are as follows:

  • Most software will never be used and is redundant. Many of these applications are designed for experienced users who know how to install them even when they are not part of the default install. For example, most newbies do not use ‘vi’ or ‘emacs’, and most devel packages are only used for compilation.
  • Every piece of software, used or not, must be maintained if it can be accessed by multiple users, whether remotely or locally. Typical examples are security updates, or bugs that you would not normally encounter with default settings.
  • Updates take longer and consume more resources. Every time a system-wide update is done (e.g. yum update), it must download updates for every single package on the system. Even if you do not pay for your bandwidth, there is some cost to the provider, and that bandwidth could serve someone else who needs it more.
  • (For newcomers) You really do not learn anything. It is beneficial at times to understand how software dependencies work and to learn how to install software when it is needed. Needs change and are not the same for everyone.
  • There is a greater immediate drain on local resources. Most distributions package enough software to run as a server, a desktop, or both, and it does not make sense to run multiple server applications on a desktop machine. Furthermore, distributions ship some packages with the expectation that not all of them will run at the same time; the person installing is assumed to know what they are doing. Many services and daemons also perform redundant tasks; multiple FTP servers, for example, are not typically required or recommended.
  • Although rare, some distributions may include conflicting versions of packages with the intention that the user select only one. This is typical of a distribution providing a new, less popular version alongside a widely used one. An example I have seen in the past is (SuSE?) shipping both Apache 1.x and Apache 2.x.
  • There are hardware-specific options that should not be on every machine and that require extra steps to update. In the case of Fedora Core, some kernel packages (which only a small population require) are not updated at the same frequency as more common packages. This has led to some confusion and difficulty.
  • An additional note for Fedora Core users: Fedora Core has always been a “bleeding edge” distribution, which basically means it will typically ship with the absolute latest (sometimes not adequately tested) software versions. There will also always be some included software that may not make it into the next version or update.

Given these points, it is still entirely up to end users to decide what software they should install and use. However, it is very unlikely that anyone could use every single included application. It is better to choose less rather than more and install as needed. Above all, it is best to understand why something is needed, rather than to assume blindly that more unknown software is beneficial.

Quake 4 for Linux

Linux gaming is, for the most part, non-existent. There are a handful of games, but for all the bells and whistles the gaming industry puts forth, not much of it makes it to Linux. Activision is a good exception; I have followed their gaming engines since Quake II in the late 90s. With the release of Quake III, a fully native Linux version became available: a full-featured, commercially released FPS (first person shooter) that really showed off the potential of Linux gaming. The engines are designed such that ports to other platforms (Linux, MacOS, Nintendo, etc.) are much easier, which is in their best financial interest: licensing! Many games, Return to Castle Wolfenstein, Elite Force and Doom 3 to name a few, have been released this way.

Quake 4 was released only two days ago (Oct 18) and the Linux installer is already available. The best thing about this deployment method is that if you buy the Windows version, the Linux installer is available for download and uses the data files from your Quake 4 Windows CDs. Basically, two for the price of one. Id Software and Activision really get a thumbs up for open-minded design.

Now if only the rest of the gaming industry could follow suit. But with an incredibly small Linux gaming market, relatively high development costs and minimal returns, I seriously doubt that Linux gaming will improve much in the near future.

Canon S500 in Fedora Core

I had written a really simple camera mini-guide describing how I use my Canon S500 digital camera in Fedora Core 3. The other night I decided to update it and make sure everything still works in Fedora Core 4, and it did.

Guides like these, to me, are almost unnecessary. I would tell someone: just make sure you have Gnome, gPhoto and USB support set up, and your camera will “automagically” work. But that advice does not seem very tangible. In the Windows world, people are reassured by the fact that their hardware or peripheral comes with an installation CD; that seldom happens in the Linux world. People who are unsure whether hardware works correctly with Linux need some valid proof of Linux support, and this always seems to come from the community rather than the manufacturer. Personally, I research Linux compatibility for all of my hardware prior to purchasing. Guides like the one above, I hope, make someone feel more certain about their purchase.