Thursday, August 5, 2010

Kerberos + Open Directory + OSX Snow Leopard

I just finished tracking down one of those terrible bugs that are hard to find because the cause is just so seemingly disconnected.

I recently installed an Xserve with Snow Leopard (OSX 10.6) and Open Directory. Everything worked great, except that periodically kadmind would go into an infinite loop and never come back to this world.

So, I downloaded the latest MIT Kerberos source package, compiled it, and installed it to /opt/kerberos on my system. After verifying that it worked, I renamed the OSX-provided kadmind, copied in the new one, and started it up. Everything worked great, until....

Users on my Linux hosts complained that they could not change their passwords, and received the following errors:

root@clipper:/etc# kpasswd tbriggs
Password for tbriggs@CS.SHIP.EDU:
Enter new password:
Enter it again:
Authentication error: Failed reading application request

After reissuing keys, rewriting configuration files, and triple-checking NTP configurations (to make sure clock skew wasn't the problem), I finally broke down (my resolve and my emotions) and broke out GDB. Having compiled kadmind from MIT source, I was able to use GDB to trace through the code. I found that it was failing in lib/krb5/krb/rd_req_dec.c, specifically when it calls krb5int_authdata_verify. I traced into that, and found that it was failing whilst running authorization plugins.

So, here is the tricksy part. I used OSX's dtruss to capture file activity and found, to my dismay, that there were two sets of authorization plugins: one from /System/Library/KerberosPlugins/KerberosAuthDataPlugins and one from /opt/kerberos/lib. Among them was, if you can guess, the plugin for the OSX PasswordServer. I disabled it (OK, I swore a little bit and removed it). Now I can change passwords without issue.
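For posterity, here is roughly how the hunt went at the shell. The exact plugin paths below are reconstructed from memory, so treat them as approximations, and moving the directory aside is the reversible version of what I actually did:

```shell
# Watch which files kadmind opens while a kpasswd request comes in
# (dtruss needs root; -t open limits the trace to open() calls)
sudo dtruss -t open -p "$(pgrep kadmind)" 2>&1 | grep -i plugin

# The trace showed authdata plugins being loaded from two places:
#   /System/Library/KerberosPlugins/KerberosAuthDataPlugins   (stock OSX)
#   /opt/kerberos/lib                                         (my MIT build)

# Move the stock authdata plugins aside instead of deleting them outright
sudo mv /System/Library/KerberosPlugins/KerberosAuthDataPlugins \
        /System/Library/KerberosPlugins/KerberosAuthDataPlugins.disabled
```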

So, three days of serious kerberos debugging, and it was because of a residual plugin that we don't even need!

Sunday, August 3, 2008

Eclipse Ganymede - Executable Jars

Eclipse Ganymede (3.4) has some cool new features. One that I just discovered today is really cool and saved me a bunch of time: Export to Executable Jars.

Here is how it works: make a Java program with a runnable class. Build up your build path and get a clean compile. Execute your program locally with the normal "Run as..." tools. This isn't anything new. The new thing is the ability to export to a single, executable jar file.

Here is how:

  • Right click on project
  • Choose "Export"
  • Choose "Runnable JAR file"
  • Follow the prompts to pick the runner and the destination

When the export is finished, you can execute this single jar as: java -jar [jarfile].jar
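If you want to see what the wizard is doing under the hood, the plain JDK jar tool can produce roughly the same thing. The class name, output directory, and jar name here are made-up placeholders, and note the wizard also repackages your dependency jars, which this one-liner doesn't:

```shell
# "cfe" = create, to a file, with an entry point written into the manifest
jar cfe myapp.jar com.example.Main -C bin .

# The result runs anywhere a JRE is installed:
java -jar myapp.jar
```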

Use Case:
Here is how I use this: I am developing a database app that needs to suck in data from a MySQL database stored on a remote system. I can VPN in over my DSL line, and it works great for testing. I usually add a "limit 100" or some reasonable number to large select statements so I can debug quickly. Then, when I'm ready to run the program "for real" on the large dataset (this one contains over 100,000 entries with large XML fields), I don't really want to run it over my DSL line, for many reasons:
  • Tie up my Powerbook and keep it from other cool things
  • Saturate my DSL line
  • Tie up records in my database waiting for data buffers to flush in between locks
In the past, I manually concocted a UNIX tar file with all of the dependent jars, classes, etc. and used SSH's "scp" feature to move it onto the UNIX host. It wasn't too bad - but it was a pain when there were a lot of jars. One project I have uses over 50 different JAR files! Debugging was painful.

Now, I can use this feature to dump out one JAR file and execute it directly on the remote system!

FEATURE REQUEST: Use this to target a remote host, and use RSYNC to put the JAR file on the remote host and SSH to execute it.... that would be killer!
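Until Eclipse grows that feature, a tiny wrapper script gets most of the way there. The host name, remote path, and jar name below are placeholders for whatever your setup uses:

```shell
#!/bin/sh
# Push the exported runnable jar to a remote host, then run it there.
REMOTE=user@unixhost.example.edu
JAR=myapp.jar

rsync -avz "$JAR" "$REMOTE:deploy/$JAR" \
  && ssh "$REMOTE" "java -jar deploy/$JAR"
```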

Sunday, February 17, 2008

Mac OSX Tricks

I've learned some cool Mac OSX tricks along the way that I'm sure to forget before too long. I'll try to add them here and maybe build up a nice list of cool tricks and necessary fixes.
  • I like the old green-screen terminal look, so I set Terminal's preferences to default to Homebrew
  • JAVA_HOME wasn't set: added

    export JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Home

    to /etc/bashrc. After a new login (or ". /etc/bashrc"), echo $JAVA_HOME works. I found this necessary for some ant scripts.

Sunday, September 30, 2007

CSE Research Idea

I'd like to see a research project that collects various on-line student programming answer services and evaluates them for 1) accuracy (what grade do they get), 2) ways of identifying them, and 3) what it would cost to preemptively gather results from them to add to a plagiarism database.

Wednesday, May 9, 2007

Grrrrr.... It is the little things

Another example of how Linux is not ready for the desktop prime time. I use Gnome. I have used Gnome for a long time. I use Gnome under Solaris too. Gnome isn't the best, it's not the worst, and I don't expect much from Gnome -- except that stupid things like my desktop icons should actually be recognized as desktop icons, not ".desktop" files!

The Fix: update-mime-database /usr/share/mime

And, someone please, please, please tell me why I cannot edit menus under Gnome for Linux? I used to be able to do this with Gnome under Solaris!?! If I want to add a new launcher or reorganize the menus, there is no obvious way to do this. Come on people, Windows 3.1 could do this! Hell, I think I'd rather find a copy of GeoWorks and start using that instead of this nonsense.

Video card troubles (c. 1994)

I started using Linux back in the kernel 0.88 PL1 days (it was amazing on my 386DX 40!). In those days I had an ATI VGA Wonder XL video board in my system. The ATI VGA Wonder XL worked either with the XFree ATI driver or with the XFree VESA driver; except mine. My VGA Wonder had a different clock chip that made it incompatible with XFree. I remember reading through endless guides posted on usenet news groups describing how to patch XFree and how to manually compute the monitor mode lines to make XFree work with this card. The hours I wasted! Of course, because it was all open source, I could patch the source to XFree, and because there was documentation I could compute the mode lines. Go open source.

In the late nineties, I was working as a systems administrator. I installed Linux on some Zenith workstations. They had (almost) generic video cards. XFree failed to work on them without some more configuration file tweaking. To make it worse, the application we used required OSF/Motif libraries, which were not free (even as in beer) in those days. The commercial libraries required specific versions of XFree, which were unstable with the video cards. The best we could reliably do was to put them in VESA mode and live with 800x600x8 displays.

In that same era I picked up a Dell Latitude laptop with a NeoMagic video card. That was the one and only laptop which I had no driver issues with. There was a NeoMagic video driver already there, and it happened to work with Linux.

Ever since then, I've struggled with video card issues. My newer Dell laptops have all had great video support under Windows, but terrible support under Linux. Even when I've been lucky to get some video support, I stopped expecting it to properly handle external connections to projectors.

Fast forward to 2005 and Excalibur, my Dell Precision 670 workstation. This system is the synthesis of 11 years of hateful experiences with video cards. The nvidia Quadro NVS 280 card which shipped with the machine lived in the PCI Express slot but was advertised as a "2D accelerated card!" For a long time the nvidia drivers didn't recognize it, and when they did, the OpenGL drivers hated it. When they finally did start to work, performance was terrible.

I replaced the NVS card with an nvidia 6600 GT PCIE card. It was reasonably fast, but had the annoying habit of causing the system to freeze intermittently when doing OpenGL work. The freeze in question was a hard freeze - one had to power off to regain control of the system. The card worked 100% flawlessly under Windows. Performance between Linux and Windows was noticeably different. Whereas the card produced expected performance under Windows, it was about 50% slower under Linux. Yes, I made sure I wasn't using Mesa drivers. I also made sure I removed the X screen saver, because it would randomly make the system lock up.

So, eventually, I replaced the 6600GT PCIE card with an ATI X300 card. This card was labelled specifically as supported by Linux. In fact, it has been the most stable of all of the cards I've used. It worked (generally) out of the box. Of course, this was with software rendering using mesa. I just spent the past 1 1/2 hours getting the fglrx modules installed and working. There is still a little bit of a bug with the AIGLX extensions from the livna package. I'm running a plain vanilla FC6 system (I haven't even tried to compile a custom kernel).

Shouldn't this stuff just work? Why is it that Linux, after 12 years, has yet to find a stable video driver framework? If Linux is going to be successful as a desktop replacement, this needs to be fixed and fixed now. Why? Because Microsoft & Apple have it right. My dad can buy a PC, plug it in, and it just works. I'm probably going to switch to Apple, and I expect that it will just work.

Open Source Just Doesn't Work
Now, you may argue that Linux does work, it's the video card driver vendors. Those evil companies that don't want their cards to work with Linux. They keep their drivers closed source so no one can see their code. Let's get those evildoers and smoke 'em out. Closed source drivers are irrelevant. I've spent the last 12 years trying to make Linux work. For a long time I was an evangelist for the cause. But one cannot make a convincing argument for a technology that is stuck chasing the tail lights of its competition. Windows is totally closed source, and yet things work. MacOS is closed source, and yet things work. Solaris is closed source, and yet things work.

Linux will probably continue to gain ground in the server room (although frankly I prefer Solaris). But I do not see any significant market penetration until I can hand my Dad a Linux DVD and tell him go put this in your drive, and all will be well.

Monday, May 7, 2007

Making Excalibur Work

I swear I spend 90% of my time trying to solve the same problems over and over again. I think I should be awarded the world record for the greatest number of Linux installs on a single system. The system in question is Excalibur, a Dell Precision 670N (The N says I didn't buy it with Windows). The most recent round of troubles happened as a result of downloading a huge chunk of file system data which caused the I/O performance to degrade. Who says Linux EXT3 is immune from fragmentation issues?

Anyway, I'm trying out XFS with LVM2 using a 3-Ware 8006 Escalade. I had one aborted install where XFS forgot that it was supposed to be a file system. I saw some interesting kernel "oops" messages that I've never seen before too. A good time was had by all.

The current configuration (mostly for my own posterity's sake) is the basic Dell 670 w/ 2GB of RAM, two 300GB SATA HDDs, and one 74GB SATA HDD. The two 300s are on the Escalade controller. Mirroring on that device was problematic (slow mirror performance and no dirty region logging), so I'm using striping and rolling the dice that I'll get to replace the machine before one of the drives drops dead.

I've reconfigured from the basic "one big file system" approach back to a more reasonable:
  • / = 16G
  • /var = 13G
  • /opt = 20G
  • /work = 200G
  • swap = 20G
I should add that when I restored some data, I flooded "/var" - why does MySQL default there anyway? One good thing with this setup:

# lvm
> lvextend -L+8G /dev/VolGroup00/var

Followed up by:
# xfs_growfs /var

Did the trick quite nicely. So, OK - I'm liking lvm and xfs better and better.
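A sketch of the whole grow sequence, with a free-space check added up front (same volume group and mount point as above; a nice property of XFS is that xfs_growfs operates on the mounted file system, so no unmount is needed):

```shell
# How much free space does the volume group have left?
vgdisplay VolGroup00 | grep Free

# Grow the logical volume, then grow XFS to fill it -- online, while mounted
lvextend -L+8G /dev/VolGroup00/var
xfs_growfs /var
```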

I found some good tips at: