*gloats* I got a new computer!
Oct 7, 2001 at 1:35 AM Post #31 of 69
I wouldn't bother with RAID unless redundancy and speed are extremely important.

What sorts of things do you primarily do with your computer? Programming, games...? It seems like you use 3D Studio Max...

Anyway, the HD is usually the bottleneck (i.e., the slowest component) of the computer, so a faster HD usually makes a notable difference. SCSI is expensive, but reliable (5-yr warranty). Seek times are also fast, and sustained rates are a little faster than those of the newer IDE HDs.

A good place to buy HDs is HyperMicro... I bought my IBM GXP 60 from them... a very good business, and their packaging of the hard drive is first class.
 
Oct 7, 2001 at 1:40 AM Post #32 of 69
Yeah, good link, dhwilkin... SR has a nice FAQ with everything you need to know about HDs, RAID, ATA vs. SCSI, etc.

Quote:

As for ATA/66 vs. ATA/100: while ATA/66, I noticed, was a big improvement over ATA/33, I have heard that ATA/100 is a much smaller improvement over ATA/66, mainly because IDE has been maxed out.


Yeah, AFAIK, that is correct... since most newer HDs have sustained rates in the ~40MB/s range, the drive outruns ATA/33, so the difference between 33 and 66 is obviously there; but the drive can't even saturate ATA/66, so ATA/100 adds little.
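
To spell out that bottleneck logic, here's a quick back-of-the-envelope sketch in Python; the 40MB/s sustained rate is just the rough ballpark figure from above, not a benchmark of any particular drive.

Code:

# Which limits sustained reads: the drive or the interface?
# The drive figure is a rough assumption, not a measurement.
drive_sustained = 40  # MB/s, typical newer IDE drive (assumed)
interfaces = {"ATA/33": 33, "ATA/66": 66, "ATA/100": 100}

for name, bus in interfaces.items():
    effective = min(drive_sustained, bus)  # the slower of drive and bus wins
    limit = "bus-limited" if bus < drive_sustained else "drive-limited"
    print(f"{name}: effective ~{effective} MB/s ({limit})")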
 
Oct 7, 2001 at 1:42 AM Post #33 of 69
Quote:

Originally posted by MacDEF
...RAID Level 0...does indeed significantly increase I/O by spreading the data and I/O load across multiple drives.


Umm... not really, not compared to comparable hardware. It may increase throughput rates, but it does so by adding hardware, hence the expense. RAID was designed mainly for redundancy; RAID level 0 was done for people who wanted to be able to say they had RAID, but didn't want to pay the overhead for it. :D
 
Oct 7, 2001 at 4:08 AM Post #35 of 69
Quote:

Quote:

...RAID Level 0...does indeed significantly increase I/O by spreading the data and I/O load across multiple drives.


Umm... not really, not compared to comparable hardware. It may increase throughput rates, but it does so by adding hardware, hence the expense. RAID was designed mainly for redundancy; RAID level 0 was done for people who wanted to be able to say they had RAID, but didn't want to pay the overhead for it.


RAID Level 0's *only* reason for existence is increased I/O. In fact, with multiple drives and channels, and multiple controllers (one drive per controller), it offers HUGE speed increases. It's not a "true" RAID, in that it's not fault-tolerant, but to say that it doesn't offer increased I/O isn't really accurate.
 
Oct 7, 2001 at 5:00 AM Post #36 of 69
Oops, I just noticed this matter has already drawn lots of comments... ahh, I'll let it stay anyway, since I already wasted my keyboard typing it. =)

Quote:

Originally posted by Xander
RAID can be utilized with either IDE or SCSI interfaces, and would only allow him to use multiple drives as mirrors of one drive, or multiple drives as one physical drive. It wouldn't give him any more speed, or bandwidth so to speak.



Actually, RAID level 0 (usually called "data striping") does offer some performance advantages over other RAID levels and normal hard drive setups.
Technically it isn't true RAID, since it gives you no redundancy, but it is still called RAID 0, and it gives you a very nice performance boost on large blocks of data I/O; there's just no parity check for fault tolerance. A rough sketch of the idea is below.
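
Here's a toy Python illustration of what striping does; the chunk size and drive count are made-up numbers for illustration, not what any particular controller actually uses.

Code:

# Toy model of RAID 0 striping: consecutive chunks of a file land on
# alternating drives, so large reads/writes hit all drives in parallel.
CHUNK_KB = 64   # stripe chunk size (arbitrary for this example)
DRIVES = 2      # number of striped drives (arbitrary)

def stripe_map(total_kb):
    """Return (offset, drive) for each chunk of a file."""
    return [(off, (off // CHUNK_KB) % DRIVES)
            for off in range(0, total_kb, CHUNK_KB)]

# A 512KB transfer touches both drives evenly, so sequential throughput
# scales toward 2x a single drive (until the bus or CPU gets in the way).
for off, drive in stripe_map(512):
    print(f"chunk at {off:3d} KB -> drive {drive}")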

I/O cards supporting RAID 0, like some Promise cards, are fairly cheap.
 
Oct 7, 2001 at 11:53 AM Post #37 of 69
MacDEF: Problem is, most onboard RAID controllers (Promise...) found on Athlon boards are actually software rather than hardware solutions; most of the work is done by the driver. That means that I/O speed is load-dependent. We once tried it on a test system with some video grabbing and on-the-fly encoding: this took so many resources and so much processing power that the throughput of the RAID system (striping used, to double performance) actually dropped below the throughput of the system using the integrated IDE controller of the chipset southbridge. So ai0tron should at least make sure to use Win 2000 or Linux (or any other multiprocessor-capable OS; I don't want to start OS wars here... :) ) to make sure that both processors are supported right from the OS level, in order to ensure there's enough processor time for the I/O handling, if he wanted to go for IDE RAID. The little model below shows the effect.
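
To make the load-dependence concrete, here's a toy model in Python; the per-MB CPU costs and throughput ceilings are invented numbers for illustration, not measurements from our test system.

Code:

# Toy model: software RAID spends CPU time per MB moved. When another
# job (e.g. video encoding) eats the CPU, striped throughput can fall
# below a plain single-drive setup. All numbers here are invented.
COST_SINGLE = 0.004     # CPU-seconds per MB, plain IDE controller (assumed)
COST_SOFT_RAID = 0.012  # CPU-seconds per MB, driver-based RAID 0 (assumed)

def throughput(disk_limit_mbs, cpu_cost_per_mb, cpu_free):
    """Effective MB/s: the lesser of what the disks and the CPU allow."""
    return min(disk_limit_mbs, cpu_free / cpu_cost_per_mb)

for cpu_free in (1.0, 0.2):  # idle system vs. encoder hogging 80% of the CPU
    single = throughput(40, COST_SINGLE, cpu_free)    # one drive, ~40 MB/s cap
    raid0 = throughput(80, COST_SOFT_RAID, cpu_free)  # two striped drives
    print(f"CPU free {cpu_free:.0%}: single drive ~{single:.0f} MB/s, "
          f"soft RAID 0 ~{raid0:.0f} MB/s")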

And to all SCSI supporters: well, yes, SCSI is still the better solution, at least as long as there is no disconnect/reconnect support for IDE. ;) But, hey, look at the HD prices: the price premium for SCSI drives over IDE drives is too much nowadays. Price per gigabyte with SCSI drives just sucks compared to IDE drives. So the only applications I'd still use SCSI drives for are servers, especially database servers, as well as video workstations, where you need minimum access times or minimum impact on system resources. But for typical systems SCSI is just too expensive (performance gain to price premium = loooow! => bad idea! :D).
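
To put rough numbers on the price-per-gigabyte point, a quick sketch; both prices here are hypothetical placeholders, not quotes from any vendor.

Code:

# Price-per-gigabyte comparison; both entries are hypothetical
# placeholder prices, not quotes from any vendor.
drives = {
    "IDE 60GB (example)": {"price_usd": 150, "gb": 60},
    "SCSI 36GB (example)": {"price_usd": 400, "gb": 36},
}
for name, d in drives.items():
    print(f"{name}: ${d['price_usd'] / d['gb']:.2f} per GB")
# ~$2.50/GB vs. ~$11/GB with these made-up figures; the ratio,
# not the exact dollars, is the point.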

Greetings from Munich!

Manfred / lini
 
Oct 8, 2001 at 12:01 AM Post #38 of 69
Quote:

MacDEF: Problem is, most onboard RAID controllers (Promise...) found on Athlon boards are actually software rather than hardware solutions


Very true -- you need hardware controllers.

And lini is also correct that while SCSI is (usually) superior to IDE, its cost/performance ratio is usually too high for home use.
 
Oct 17, 2001 at 4:03 AM Post #40 of 69
(OT) The (original) GeForce3 is being phased out in favor of two new "titanium" flavors, the GeForce3 Ti200 and the GeForce3 Ti500, the latter obviously faster than even the original GeForce3. But at $350, about the same price as what the original GeForce3 last sold for at retail, I'd recommend the Ti500 version ONLY if you are configuring a new computer system and you absolutely, positively must have the latest and greatest consumer-level graphics subsystem. And if you already have the original GeForce3, the Ti500 isn't compelling enough to offset whatever you might have paid for the original.

On the other hand, the GeForce3 Ti200 is slightly slower than the $350 original GeForce3, but the Ti200 costs just $200! The slightly slower speed is more than offset by the significantly lower price. That alone makes it a VERY worthwhile upgrade for a fairly recent system equipped with a TNT2 or slower graphics card (yep, those sub-$1000 1.5GHz Pentium 4 systems from Compaq/HP with 256MB or 512MB of PC133 SDRAM cut corners even further by including a 32MB TNT2 M64, or worse, a 16MB TNT2 Vanta, graphics card). If you want the most use out of that sub-$1000 1.5GHz P4-based system, then spend the extra $200 (retail price, but it may be lower online) for the GeForce3 Ti200 card with 64MB of DDR memory (the only difference between the original GeForce3 and the GeForce3 Ti200 is the slightly slower chip speed of the latter).
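
To make the value argument explicit, a little sketch; the prices are the retail figures above, but the relative-performance numbers are my own rough assumptions (original GeForce3 = 1.00), not benchmark results.

Code:

# Rough value comparison of the GeForce3 line. Prices are the retail
# figures mentioned above; the relative-performance numbers are
# assumptions for illustration, not benchmark results.
cards = {
    "GeForce3 Ti500": {"price_usd": 350, "rel_perf": 1.10},
    "GeForce3 (original)": {"price_usd": 350, "rel_perf": 1.00},
    "GeForce3 Ti200": {"price_usd": 200, "rel_perf": 0.90},
}
for name, c in cards.items():
    value = c["rel_perf"] / c["price_usd"] * 100
    print(f"{name}: {value:.2f} performance per $100")
# With these assumed numbers the Ti200 comes out around 0.45 per $100
# vs. ~0.29-0.31 for the $350 cards: slightly slower, much better value.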

Speaking of the third "titanium" member of the GeForce family, the GeForce2 Ti, it sucks for the price. If you buy that particular card, you're stuck in no-man's land; the GeForce2 Ti barely outperforms the GeForce2 Pro - and is slower than the GeForce2 Ultra. And the pricing - $150 at retail - falls in between the two older chips, too.
 
Oct 17, 2001 at 4:17 AM Post #41 of 69
Nvidia lost this round. The Radeon 8500 is every bit as fast as the Ti500, has vastly better image quality, a more complete feature set, and costs $50 less.
 
Oct 17, 2001 at 5:03 AM Post #42 of 69
Quote:

Originally posted by SumB
Nvidia lost this round. The Radeon 8500 is every bit as fast as the Ti500, has vastly better image quality, a more complete feature set, and costs $50 less.


That would be true, if it weren't for the horrible drivers that are currently the latest released version available for the Radeon 8500... :(


And another thing: though the original Radeon 64MB DDR (which I am currently running in my 700MHz PIII/BX-chipset WinMe PC) is discontinued (its successor is the slightly faster Radeon 7500), the 64MB SDR version of that board has been renamed the Radeon 7200!
 
Oct 17, 2001 at 5:24 AM Post #43 of 69
The 8500 may technically be able to outperform the GeForce3, but it has yet to do so because of the driver problem eagle_driver mentioned. I am glad to see ATI doing something interesting for a change, though. I have heard, however, that NVIDIA already has technology much faster than the GeForce3 running in the XBOX, so I would assume it's only a small step to making that technology available on a PC. From what I know, it looks like NVIDIA is playing down to ATI's level for the time being, and if there is a serious challenge made to NVIDIA's crown, they will most likely drop another bomb on ATI with technology they have had waiting around for months... I would also like to see Matrox get its act together, if that's even remotely possible... I have a feeling Matrox is going to end up making nothing but video capture cards and TV tuners.

I have read some bad reviews and benchmarks concerning the 7500.
 
Oct 17, 2001 at 5:42 AM Post #44 of 69
You're right, ai0tron. Why saddle that potentially superior Radeon 8500 graphics subsystem with craptacular drivers that don't support some of its advanced features?

And those somewhat negative reviews of the 7500? That's because the drivers available for the 7500 are almost as crappy as those available for the 8500 - and much less mature than the latest drivers available for the original Radeon (aka Radeon 7200)/Radeon VE (aka Radeon 7000).
 
Oct 17, 2001 at 6:27 AM Post #45 of 69
Quote:

ai0tron said...

I have heard however that NVIDIA already has technology much faster than the GeForce 3 running in the XBOX. I would assume then that it's only a small step to making that technology available on a PC.


The X-Box technology is somewhat faster, yes, but it's not quite as easy to transfer all that performance to the PC. Remember, the X-Box is a dedicated gaming machine, which means it does not have to be expandable and it knows exactly what hardware it has to work with. For instance, one limiting factor on the PC is bus speed: you just won't get the guaranteed bandwidth from a PC that you would from an X-Box. Also, the PC has other applications taking up system resources, and drivers will have to be refined. A video card in a console will practically always perform better than the same video card in a PC.

And I haven't read any reviews of the new ATI cards yet; I'll have to do that tomorrow... err, later today, ugh. Though I agree, from first-hand viewing, that ATI is pretty notorious for taking their sweet time releasing fast, stable drivers.
 
