Monday, November 12, 2007

A cynic's view on the Sun-Network Appliance lawsuit

I've been thinking about the Sun-NetApp lawsuit, an interesting case that highlights what is wrong with the Patent Office in the US. I don't think all software patents are bad; genuine innovation arguably should be rewarded with a temporary monopoly on a piece of technology.

However, this lawsuit shouldn't have happened, because at least one of the patents in question probably should not have been granted in the first place. David Hitz, founder of Network Appliance, contends on his blog that Sun's ZFS violates patents NetApp holds on its Write Anywhere File Layout (WAFL). If that's truly the case, Sun should certainly be held liable, and should stop publishing ZFS as open source code. You can't give away what isn't yours.

The larger issue, in my opinion, is that significant claims of the WAFL patent should never have been granted. There is plenty of prior art for using a "tree of block pointers" to maintain logical consistency in data storage. Relational database management systems (Oracle, Microsoft SQL Server, IBM DB2, Sybase, PostgreSQL, etc.) have used the same technique for decades to keep indexes and relational data transaction-consistent.

WAFL may indeed be the first implementation in which the "tree of blocks" points to files in a general-purpose file system, whereas in an RDBMS the tree of blocks (a tree of index pages) points to arbitrary row data. But what's the difference? Not much; bits are just bits. The application to a file system seems obvious to me (someone reasonably skilled in the art), which suggests the patent should be challenged. Heck, not-so-innovative Microsoft was kicking the same ideas around back in the early 1990s with the object file system planned for its "Cairo" project.

In fact, I personally designed the database for a document management system that used a "tree of blocks" to store and organize arbitrary file data in Microsoft SQL Server back in the late 1990s. The system had point-in-time recovery and lookup, based on valid-time timestamps in the block pointers (the folder and file tables and their indexes). I suppose you could call this a "snapshot" capability. The database took care of transaction logging, checkpointing, and referential integrity, all of which appear to be additional claims in the WAFL patent.
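To make the technique concrete, here is a minimal sketch of a copy-on-write "tree of blocks" with free point-in-time snapshots. It's in Python, purely illustrative, and every name in it is my own invention, not NetApp's code or my old SQL Server design:

# Blocks are never modified in place. An update copies the path from
# leaf to root, so every saved root pointer remains a self-consistent,
# point-in-time view of the whole tree.
class Block:
    def __init__(self, children=None, data=None):
        self.children = dict(children or {})  # name -> child Block
        self.data = data                      # leaf payload

def update(root, path, data):
    # Returns a NEW root with data written at path; root is untouched.
    if not path:
        return Block(data=data)
    new_children = dict(root.children)
    child = root.children.get(path[0], Block())
    new_children[path[0]] = update(child, path[1:], data)
    return Block(children=new_children)

def lookup(root, path):
    for name in path:
        root = root.children[name]
    return root.data

v1 = update(Block(), ["home", "bob", "file.txt"], "draft")
v2 = update(v1, ["home", "bob", "file.txt"], "final")
print(lookup(v1, ["home", "bob", "file.txt"]))  # "draft": the old snapshot
print(lookup(v2, ["home", "bob", "file.txt"]))  # "final": the current view

Swap the in-memory dictionaries for index pages and row data and you have the RDBMS version of the same trick; point the leaves at file blocks and you have something that looks an awful lot like WAFL or ZFS.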

I have passable programming skills, but I am no algorithm-design guru, and I had certainly never read the WAFL patents before implementing this "file system on a database". The basic ideas were widely known and frequently used in the database arena. I am not an intellectual property lawyer, but experience leads me to believe there isn't much innovation in the "always-consistent tree of blocks" described in the WAFL patent. The devil may be in the details, I suppose - I have only reviewed the patent abstract at this point.

Still, in my opinion, the U.S. Patent Office, as currently constituted, is incapable of identifying true innovation. It grants far too many patents on obvious or derivative technology, especially in the software arena. If the office can't get it right, even with the enormous resources at its disposal, it should probably not grant any software patents at all.

As a side note, this in-house document management system was never widely used, and the project was considered a failure. I believe that was largely the fault of a cumbersome legacy ASP-based web front-end, though, not of any deficiency in the storage engine.

Monday, April 23, 2007

time.windows.com fixed

Well, it appears that time.windows.com is now fixed, after a few weeks of serving up invalid time. Presumably, the clocks on millions of Windows machines worldwide are now slowly drifting back into synchronization with the rest of humanity.

I find it ridiculous that such a problem could go unnoticed and unfixed by Microsoft for so long, and that it apparently took a Microsoft employee reading about it on a programmer's blog for the issue to be tracked down and corrected. The server is reporting sane offsets again:
U:\>w32tm /monitor /computers:time.windows.com,us.pool.ntp.org
time.windows.com [207.46.130.100]:
    NTP: +0.0541156s offset from local clock
    RefID: time-nw.nist.gov [131.107.1.10]
us.pool.ntp.org [66.91.129.70]:
    NTP: +0.0293621s offset from local clock
    RefID: bigben.ucsd.edu [132.239.1.6]

Friday, April 13, 2007

time.windows.com is broken... is your clock off too?

It seems that time.windows.com is broken, reporting unsynchronized time off by two minutes or more. Why is this a big deal? Time.windows.com is the default Network Time Protocol server used by the Windows Time Service in Windows XP, 2003, and Vista systems. So there are literally millions of systems out there without an accurate source of internet time.
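Want to check how far off your own clock is, without trusting the broken default? NTP is a simple protocol, and you can query a server by hand. Here is a minimal SNTP client sketch in Python, purely illustrative (it assumes you have a Python interpreter handy, and uses the pool server from the fix described below):

import socket
import struct
import time

SERVER = "us.pool.ntp.org"
NTP_EPOCH_OFFSET = 2208988800  # seconds from 1900-01-01 (NTP epoch) to 1970-01-01 (Unix epoch)

# First byte 0x1b = leap indicator 0, version 3, mode 3 (client);
# the rest of the 48-byte request is zero.
request = b'\x1b' + 47 * b'\x00'

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.settimeout(5)
s.sendto(request, (SERVER, 123))
reply, _ = s.recvfrom(48)
s.close()

# The server's transmit timestamp is bytes 40-47: 32-bit seconds, 32-bit fraction.
seconds, fraction = struct.unpack("!II", reply[40:48])
server_time = seconds - NTP_EPOCH_OFFSET + fraction / 2.0**32
print("server: %s" % time.ctime(server_time))
print("local : %s" % time.ctime())
print("offset: %+.3f seconds" % (server_time - time.time()))

If your machine has been syncing against the broken server, a two-minute offset will show up immediately in that last line.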

I have personally reported the issue to Microsoft, and Akamai as well (they seem to host the actual servers). But there has been no response from either for several days. Reports on the internet indicate that time.windows.com has been broken for at least a week!

Fortunately, it is easy to switch to a different time server. If your computer is part of a Windows domain at your workplace, it will get time from your domain controller by default, so you don't need to do anything. If your system is a domain controller, or is stand-alone, you should run these commands:
C:\>w32tm /config /manualpeerlist:"us.pool.ntp.org,0x8" /syncfromflags:MANUAL /update
C:\>w32tm /resync /rediscover


Note that these commands do not work on Windows 2000. There, the equivalent command is:
C:\>net time /SETSNTP:us.pool.ntp.org

What is us.pool.ntp.org? It is the United States address for the global NTP Pool Project. If you are not in the United States, substitute your own two-letter country code for the "us" portion. If you're in London, for example, use uk.pool.ntp.org.

The ",0x8" after the time server name tells Windows to use a client-mode association with the time server. This isn't strictly necessary, but the proper way to configure an NTP client talking to an NTP server. If you don't use it, Windows checks the time exactly once per hour, rather than adjusting its time-checking interval automatically based on clock performance and network conditions.

One final note: do not use the popular "stratum-1" time servers to synchronize your client. These systems are typically run by national standards laboratories (time-a.nist.gov, for example). They are badly overloaded, and Windows cannot use the extra accuracy they provide anyway (the Windows Time Service is only accurate to about 16 ms). The NTP Pool Project was started for the specific purpose of taking load off the Internet's stratum-1 timekeepers.

Sunday, April 01, 2007

Linux plunge not working out so well

So I don't think Linux is ready for laptops. Well, my laptop anyway.

The first problem I encountered was with screen resolution. I didn't have an option for the "native" 1280x800 widescreen resolution of my Dell 700m. After digging through the Ubuntu support forums, I discovered that I had to install a small utility called 915resolution. Tracking this down was a minor pain, as Google turned up several contradictory sets of instructions, but running this command:
sudo apt-get install 915resolution
and restarting seemed to fix things.

My next problem was with WiFi. Ubuntu's network management applet didn't show any available wireless networks, despite the fact that I know there are dozens near my home. Reboot into Windows, do some more browsing, and discover some diagnostic tests to run. Boot back into Ubuntu. It appears that the command-line tools can see nearby wireless networks, but Ubuntu's GUI is broken and doesn't list them. I could deal with having to run a few commands to connect, but...

It also seems that Ubuntu does not support Wi-Fi Protected Access 2 (WPA2) wireless access points out of the box. This is a much bigger deal, as my home network is all WPA2, and I will not "downgrade" to any version of WEP, which is woefully insecure.

So I boot back into Windows and do a few more hours of research. There are WPA tools for Debian-derived Linux systems, and some folks have gotten them to work. But I was unwilling to go down this road, as the documented procedures were pages long and involved running a command-line utility to generate a password hash for each new network I wanted to use. Not exactly practical for someone who needs to work on the go.
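For the curious, those procedures mostly revolve around hand-editing a wpa_supplicant configuration file. As best I can tell, the end result looks something like this (the SSID and passphrase here are placeholders, and the psk line is what the wpa_passphrase hashing utility spits out):

# /etc/wpa_supplicant.conf - one network block per wireless network
network={
    ssid="MyHomeNetwork"
    proto=RSN                    # RSN = WPA2
    key_mgmt=WPA-PSK
    psk="my-secret-passphrase"   # or the 64-hex-digit hash from wpa_passphrase
}

Maintaining one of these blocks for every coffee shop and client site I visit is exactly the kind of busywork I was trying to avoid.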

Finally, I looked into getting support for my Sprint Mobile Broadband card, which provides about 1 Mbps download speed just about anywhere. This little device is my lifeline for work. From what I read on the net (again scattered over dozens of contradictory sites), there is almost no Linux driver support at all for these mobile broadband cards. To get something working, I would have to modify an available driver for some other device and compile it against my kernel.

So I gave up. I have a family, and a job, and I just wanted to get some work done. Right now, at least, Ubuntu doesn't have enough mobile device support for my needs.

And I know all the Linux fanboys out there will call me an 1d10t n00b, and blame the hardware manufacturers for not releasing good open-source drivers. But you know what? I don't care. Ubuntu failed me. Going mobile with Windows XP is light-years easier by comparison, and I'm not going to switch to something that requires so much manual configuration each time I want to work on the road.

Maybe I'll try a MacBook instead.

Friday, March 23, 2007

Taking the Linux plunge

So I've got a decade or so of Windows-based network administration experience. However, my formal computer science education was rooted in Unix back in the early 1990s. We used Sun workstations exclusively back then, and almost all programming was in C or Scheme. I even wrote a simple C compiler.

But my first job was at a NetWare 3.x shop, which we transitioned to Windows NT 3.51, and I've been managing large, multi-site, mostly-Windows networks ever since. (I've done a lot of DBA and security work too.) So I'm not a neophyte when it comes to IT, but I don't have much Linux experience, as none of the jobs or projects I've worked on used Linux.

I finally decided to install Linux and actually try to use the thing regularly. I've done "toy installs" of various Linux and BSD flavors over the years, mostly in virtual machines. But this was my first real go at using Linux day to day.

I picked Ubuntu 6.10 as my distribution, based mostly on its reputation as the simplest Linux to get working.

But first, I had to partition the hard disk on my Dell 700m so I could dual-boot. No problem; I've used PartitionMagic before. But QTParted from the bootable Knoppix CD can also shrink an existing NTFS partition, so I used that, and it worked like a champ: 40 GB for my existing Windows XP Pro installation, and 20 GB of empty space for Linux.

Not exactly "my mom can do it" easy, but not the sort of thing that's necessary unless you want to dual-boot.

After downloading the Ubuntu 6.10 installer and burning it to CD, I fired it up in my road-weary Dell. A nice GUI installer comes up and asks a few sensible questions. Then it asks where I want to install. But the default option is to erase the entire disk and devote it to Ubuntu!

Now, not even Microsoft has the gall to make "erase everything else" the default in the Windows installer. But choosing the "use largest free space" option was easy enough, so I'll forgive this. And I'm sure there would have been plenty of warnings to keep me from killing my Windows partition had I chosen the default.

Click through a few more sensible screens, wait 20 minutes or so for files to copy, and then reboot. Ubuntu comes up in all its earthy (that is, very, very brown) glory.

All in all, a simple install. Simpler than Windows XP, in fact, and about the same as Vista.

But now the real fun begins, as I try to use this thing regularly.