BSD stands for "Berkeley Software Distribution". When Bell Labs invented UNIX, a consent decree barred AT&T from getting commercially involved with computers. That was the main reason they agreed to the 1984 break-up - they thought they would make more money competing with IBM than they would lose from giving up their regulated monopoly. Suckers... Anyhow, they licensed UNIX to universities for next to nothing, and UC Berkeley ported it to DEC's VAX and released its own distribution. As an OS, it has over 20 years of history behind it, and is very mature and stable. Yahoo runs on FreeBSD, for instance.
As BSD is not as mainstream as Linux, it has not benefited from all the work companies like Mandrake, Red Hat or SuSE have put into making Linux (relatively) easy for neophytes to install and use. It is best for people who already have a fair level of UNIX experience, at least basic programming skills, and who are not afraid of compiling software packages from source code when pre-compiled binaries are not available. Usually this just means unpacking the source and typing "./configure; make; make install", but sometimes it gets more involved, and you may need to know the C programming language to find out what is going wrong. There are also many commercial software packages like VMware that are available for Linux but not for BSD, and that will not run under FreeBSD's Linux compatibility mode. On the other hand, Apple's Mac OS X (soon available on Intel) is built on BSD underpinnings and has access to commercial software unavailable on Linux, like Adobe Photoshop or Microsoft Office.
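To make that concrete, a typical from-source build goes something like this (a sketch only - the package name here is hypothetical, and the exact steps vary from package to package, so read the included README or INSTALL file first):

    tar xzf somepackage-1.0.tar.gz   # unpack the source tarball
    cd somepackage-1.0
    ./configure                      # probe the system, generate Makefiles
    make                             # compile
    su root -c 'make install'        # install, usually under /usr/local

When this fails, it is usually the configure step complaining about a missing library, or a compile error that takes some C knowledge to diagnose.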
In terms of basic usage, BSD is a UNIX variant, just like Linux. Some of the administrative commands are different, but the systems share perhaps 95% of the same tools. Most programs that compile on Linux will also work on BSD, unless they rely on Linux-specific extensions (the Linux-only epoll event interface is one example; the BSDs offer kqueue instead). It's the engine under the hood that is different - arguably the BSDs are superior (FreeBSD for performance, OpenBSD for security) - and their development is managed in a way that is less free-wheeling and anarchic than in the Linux community. The more conservative approach also means they are less likely to make questionable trade-offs between security and performance.
I burned out on Linux in 1993 after having to reinstall the entire operating system and its dependencies twice in one week, because the developers were changing fundamental system APIs without forethought or planning, and without heeding (or even acknowledging) the need for backwards compatibility or stability - stability as in not making changes for change's sake, or at least thinking them through first, not as in "does not crash every 5 minutes".
Actually, my experience with recent versions of Linux is that there is so much feature bloat and too-many-cooks-spoil-the-broth syndrome that it is less reliable today with 2.6 than it was in the early days of 0.9. At my company, I let my employees have their choice of OS on their desktops, but I often see the Linux users struggling because they need one version of the kernel to support their video card and a different version for their sound card. Configuration management has always been problematic on Linux due to the lack of tight control. The good thing is that a lot of oddball hardware is supported; the bad thing is that no two Linux machines on this planet have quite the same version of Linux installed.
The PC partition scheme allows up to 4 primary partitions per hard drive, so you can install up to 4 operating systems side by side if you reserve disk space for them ahead of time; trying out multiple OSes is not hard. All you have to do is use a boot manager to choose which OS to boot into when you start your PC.
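As an illustration, here is a sketch of a menu for GRUB (one of several boot managers; its menu.lst usually lives under /boot/grub) - the partition numbers here are hypothetical and depend entirely on your own layout:

    default 0
    timeout 10

    title Solaris
    rootnoverify (hd0,0)
    chainloader +1

    title FreeBSD
    rootnoverify (hd0,1)
    chainloader +1

    title Linux
    rootnoverify (hd0,2)
    chainloader +1

Each entry simply hands control to the boot loader in its own partition (chainloading), which is the least intrusive way to let each OS manage itself.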
My OS of choice on Intel is Sun's Solaris. It is derived from AT&T's original UNIX System V Release 4, with some BSD heritage as well. It is remarkably stable (one of my company's Solaris web servers has been running continuously for over 1050 days), and as an OS designed for mission-critical databases and the like, it delivers very high performance under load - not just performance that looks good in light workstation use and falls off under stress, as Linux's does (the BSDs are closer to Solaris in terms of robustness). With recent features like DTrace, the amount of visibility you have into the system when tracking down performance problems or the root cause of bugs is simply phenomenal.
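For instance, this standard DTrace one-liner (run as root on Solaris 10 or later) counts system calls by process name across the whole machine, live, on a production box, with negligible overhead; press Ctrl-C to stop it and print the tally:

    # dtrace -n 'syscall:::entry { @[execname] = count(); }'

No recompiling, no restarting services - the instrumentation is enabled and removed on the fly.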
My ideal setup, and indeed the one I have at home, is a Solaris/x86 server and a Macintosh desktop.
Finally, spending a couple thousand dollars on a PC seems like a lot. Building one yourself does not make as much sense as it did ten years ago, because prices on complete systems have fallen and you do not get the volume discounts on components that computer manufacturers do when you buy retail. I wouldn't spend more than $1500 on a PC, and I would actually get one ready-made, then customize it by swapping out a couple of parts like the video card or upgrading the memory, rather than building something from scratch (I know how to do it, it just doesn't make economic sense any more). Sun's new Ultra 20 workstation is very nice, as are Dell's Precision Workstation lines and Shuttle's XPC line of compact shoebox-sized machines.