Drive testing methodology

Brian

As we're about to scale up the reviews engine again, I want to throw out there a starting point for a new review methodology. Thoughts on what should be included in reviews?

Battery tests (notebook drives only):

-Running movie from hard drive until battery reaches certain %

-Idle until battery reaches certain %

-General usage until battery reaches certain %

Benchmarks:

-PCMark05

-HDTune Pro (full diagnostics/analysis)

-Crystal Disk Mark

-Boot time

-Time it takes to copy one large folder on the drive

Others:

-Temperature measurement w/ HDTune

-Subjective noise measurement @ idle/load

-Warranty comparison to other brands

-Brief review of included software (if applicable)

As we're about to scale up the reviews engine again, I want to throw out there a starting point for a new review methodology. Thoughts on what should be included in reviews?

Battery tests (notebook drives only):

-Running movie from hard drive until battery reaches certain %

-Idle until battery reaches certain %

-General usage until battery reaches certain %

This would be good, but I think it's too tricky and untenable. It relies on the notebook brand and components, the wear level of the battery, and so on. What may work better is taking the notebook out of the equation entirely (since SATA is standard between platforms) and popping that bad boy in a desktop unit. In fact, what you could theoretically do is pop it into a BlacX or similar unit and run that unit's power supply through a Kill-a-Watt wattage meter. Since the BlacX uses eSATA, the difference in throughput should be negligible at worst.

Benchmarks:

-PCMark05

-HDTune Pro (full diagnostics/analysis)

-Crystal Disk Mark

-Boot time

-Time it takes to copy one large folder on the drive

I waffle on Crystal Disk Mark, but would like to suggest adding a game to the mix to measure loading times. GTA4 has godawful loading times (even on my RAID), for example, so it might be a good measure.

Others:

-Temperature measurement w/ HDTune

-Subjective noise measurement @ idle/load

-Warranty comparison to other brands

-Brief review of included software (if applicable)

I like all of this.

Some of these really stress the need for a standard test bed. My desktop, for example, takes a year to boot despite being on Win 7, simply because it has to go through Intel's RAID BIOS and then a second SATA BIOS for my eSATA card. I know I'm pushing this really hard, but I also think it can be done very cheaply: a corporate Intel board that has an IGP and RAID on the southbridge, a cheap Intel CPU, two or four GB of RAM, a small cheap HD (just in case), a DVD-ROM drive, and a case. I'd include something like the BlacX or a similar product to make testing easier and more efficient.

EDIT: Check that, the external docks apparently still incur some overhead. Perhaps a hot swap bay instead?


Another suggestion.... don't overlook the server jockeys. Not everyone is a gamer or home PC enthusiast. Reviews on 15k RPM SAS disks, and other disks more often found in servers, are verrrry welcome. Perhaps more tests on RAID controllers? And, any info related to long-term reliability on any disk, server or home PC, is golden.

Others:

-Temperature measurement w/ HDTune

-Subjective noise measurement @ idle/load

-Warranty comparison to other brands

-Brief review of included software (if applicable)

All good stuff here - but I would categorically AVOID "subjective noise measurements". Get a microphone or sound meter to use for your measuring. Then if several people do reviews, the numbers will be consistent. The ideal case here would be to construct a holder for the sound meter or microphone and attach it to whatever you use to hold the drive under test.

You can also check out www.silentpcreview.com to see what that guy did with a homemade anechoic chamber and a seriously sensitive microphone. Probably overkill for here - but I'm sure you can pick up some ideas.

Other possibilities to check:

- How long to format the drive?

- Some way to measure vibration?

- Ditto on the load times for some games and/or software packages.

- Maybe measure how long it takes to install Windows or some other big-footprint software package.


Doing proper sound measurements is very tricky and requires an anechoic chamber for reliable testing. Either stick with subjective opinion, outsource the noise level testing to someone who can do it properly, or be prepared to invest in the proper tools and chamber to do it in. SPCR does fairly good drive reviews these days, with proper sound measurements; you could consider teaming up with them on drive reviews? :)

I'm also interested in another aspect of drive testing: RAID COMPATIBILITY. This one is very important for us server nuts, because not all hard drives play nice with RAID controllers or Linux software RAID.

Doing proper sound measurements is very tricky, and requires an anechoic chamber to do reliable testing
Well, if you want a sufficiently low ambient noise floor, then yes... if not, then as long as your testing environment, methodology, and equipment are all consistent, you can get adequate results-- so long as your results are only compared with your other results.

(which is exactly how most other sites operate)

So that works for me.

Battery tests (notebook drives only):
I wouldn't bother; just do power measurements like Anandtech, Techreport, SilentPCReview, and Xbitlabs-- or *gasp* like SR used to do. IIRC Xbitlabs or SPCR are a little more detailed than the others.

Benchmarks... the server loads that Xbitlabs, Anandtech, Techreport, etc. run with IOMeter are all very useful. IOMeter does have limits; IIRC Anandtech has done quite a bit of work on realistic benchmarks here. SR was a pioneer with useful IOMeter results and server-oriented stuff, and I would try to keep that-- because, as other reviews note, for general desktop use the differences between drives are usually tiny!
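To make the server-load idea concrete, here's a minimal sketch of a threaded random-read tester swept across a few queue depths. It's only an illustration, not a stand-in for IOMeter: the test file name and parameters are made up, and plain Python reads go through the OS cache, which real tools bypass with unbuffered/direct I/O.

```python
import os, random, threading, time

def random_read_worker(path, file_size, block_size, duration, counter, lock):
    # Each worker opens its own handle and issues random-offset reads
    # until the time budget runs out.
    with open(path, "rb", buffering=0) as f:
        deadline = time.perf_counter() + duration
        done = 0
        while time.perf_counter() < deadline:
            f.seek(random.randrange(0, file_size - block_size))
            f.read(block_size)
            done += 1
    with lock:
        counter[0] += done

def random_read_iops(path, block_size=4096, queue_depth=4, duration=30.0):
    """Approximate 4K random-read IOPS at a given 'queue depth' (thread count)."""
    file_size = os.path.getsize(path)
    counter, lock = [0], threading.Lock()
    threads = [threading.Thread(target=random_read_worker,
                                args=(path, file_size, block_size, duration, counter, lock))
               for _ in range(queue_depth)]
    start = time.perf_counter()
    for t in threads: t.start()
    for t in threads: t.join()
    return counter[0] / (time.perf_counter() - start)

if __name__ == "__main__":
    # "testfile.bin" is a hypothetical large, pre-created file on the drive under test.
    for qd in (1, 4, 16, 64):
        print(f"QD{qd}: {random_read_iops('testfile.bin', queue_depth=qd):.0f} IOPS")
```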

Game level load times are also useful, as are utility benchmarks (7-zip, LAME MT, CD burning, etc. etc.).

Some of these really stress the need for a standard test bed.
ABSOLUTELY AGREED. SPCR, Techreport, Xbitlabs, Anandtech, etc. all have standard testbeds. Techreport's finally updating theirs, but it's coming along slowly. Making benchmarks on the same testbed is absolutely essential for useful reviews and a useful comparison database.

And the database of reviews needs to proceed at a decent pace-- it must be kept updated to remain useful, and updated often enough that it stays current. That may be a momentous task, but it is an absolutely essential one.

Reviews on 15k RPM SAS disks, and other disks more often found in servers, are verrrry welcome. Perhaps more tests on RAID controllers?
Given the shrinking market with SAS, I would love to see reviews but I think to make them useful the benchmark suite must include IOMeter and other suitable items.

RAID controller testing should be separate. If you mean testing disks for compatibility with RAID controllers... I wouldn't bother. That is a huge task. It's not a simple "plug 4 disks in, build the RAID6, and use it for a week in the main file server with 20 users"... it's much, much more difficult than that to truly establish useful reliability. I would advise skipping it.

And, any info related to long-term reliability on any disk, server or home PC, is golden.
Not sure how realistic this is. If the manufacturer publishes annualized failure rate (AFR) in the drive info sheet, please share it... but aside from that there's no good way to test it for a review site like this.
- Some way to measure vibration?
Definitely useful-- SPCR does this, but only subjectively. Then again subjective measurements may be all that's realistically doable... I am familiar with vibration testing and even my big customers outsource that, which tells me building such setups is probably extremely expensive or of limited value, or both. This would be intriguing, at least. As SPCR found with the latest Velociraptor as well, some drives may be more noisy depending on how they are mounted, which could further complicate things.

For things like external disks, you could probably use a separate testbed (again, keep it standardized), but honestly, since external disks only have a few families of controllers (USB, FireWire, combo ones, etc.), I am not sure how useful this is, especially given the already-formidable size of doing just internal drives (SSDs and mechanical HDDs across SATA, SAS, and PCI-e interfaces)... since the goal of the site is to test hard disks and not USB-to-SATA controller performance? ;) (I mean, it'd be nice to see it added, but man, this is a big job...).

RAID controller testing should be separate. If you mean testing disks for compatibility with RAID controllers... I wouldn't bother. That is a huge task. It's not a simple "plug 4 disks in, build the RAID6, and use it for a week in the main file server with 20 users"... it's much, much more difficult than that to truly establish useful reliability. I would advise skipping it.

It sounds like you have some good experience with hardware RAID controllers; personally, I've only done SOME research but not a whole lot. Are the main ones still going to be on PCI-X, or would a standard testbed with a free PCI Express 2.0 x16 slot do the job?

I've been mulling over the kinds of hardware we'd want in the testbeds - certainly I'd like to see dedicated controllers compared to Intel's mobo implementation, if only for my own edification - so knowing whether they're gonna need a PCI-X slot is important.

Also, as an aside, Cooler Master's CM-690 II is a little pricey but MAN, the drive caddy on the top of the case would be a MAJOR lifesaver. They've added a lot of features (including a converter to change drive bays to 2.5") that make it an absolutely perfect case for our needs:

http://www.newegg.com/Product/Product.aspx...N82E16811119216

And, any info related to long-term reliability on any disk, server or home PC, is golden.
Not sure how realistic this is. If the manufacturer publishes annualized failure rate (AFR) in the drive info sheet, please share it... but aside from that there's no good way to test it for a review site like this.
I'd have to agree. Due to the volumes of drives shipped, it's (unfortunately) easy to get a cluster of bad drives which have nothing to do with the overall reliability of the line. What if that one box got dropped between wholesaler and retailer? As a consumer, I don't actually care what the estimated failure rate from the mfr is; I just care that *I* got the one drive in a jillion that failed.
- Some way to measure vibration?
Definitely useful-- SPCR does this, but only subjectively. Then again subjective measurements may be all that's realistically doable... I am familiar with vibration testing and even my big customers outsource that, which tells me building such setups is probably extremely expensive or of limited value, or both. This would be intriguing, at least. As SPCR found with the latest Velociraptor as well, some drives may be more noisy depending on how they are mounted, which could further complicate things.

Yes, how a drive is mounted plays a very big role in how much vibration/noise gets measured. I've seen the same drive sound quiet in one chassis and sound horrible in a different one. Oh, and it also depends on where it was mounted (the specific bay), whether there were other drives around, whether grommets were used, etc., etc.

The expense comes from the setup and the expertise to do the setup and interpret the results. The actual accelerometers aren't all that pricey (a few hundred bucks or so). You could suspend the drive in a free state and measure absolute drive vibration, but that wouldn't necessarily correlate to what will happen once it's mounted, hence the tricky part of vibration testing.



Pretty much all RAID controllers today are going to be PCI-e. As Eugene did previously, though, you'll need to make sure your motherboard's secondary PCI-e x16 slots support RAID cards-- not all do. Since you won't be changing the RAID card very often (if ever-- as mentioned I would consider a different testbed or at least a second set of hardware for RAID card testing should SR get into that) it really shouldn't be that big of an issue.

Also, as you are doubtless aware, some drives perform very differently on different controllers... this is doubly true for RAID controllers.

The CM 690 II is nice, but for practical purposes we usually just leave a pile of parts on an ESD-safe workstation for testing. If there is a backplane in that chassis (I can't tell from the pics), then due to repeated use of the 3.5" and 2.5" ports you would need to make sure you can get replacements from CM, as you will wear those connectors out and see a decrease in reliability. Also, since I doubt those backplanes fit SAS disks, you may just want to omit it entirely... plus cables are a lot cheaper to replace than backplanes.


"Battery tests (notebook drives only):

-Running movie from hard drive until battery reaches certain %

-Idle until battery reaches certain %

-General usage until battery reaches certain %"

Like someone said, batteries degrade. Measure power consumption from both the 12V and 5V lines instead (a Kill-A-Watt at the wall works for a rough figure, but per-rail readings are better). I'm pretty sure there's detailed information on what kind of equipment Eugene used; as far as I remember, he had some home-engineered solution with a regular multimeter, or something. Anyway, it's very cheap to build.
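For what it's worth, once you have per-rail current readings (however you take them), the wattage arithmetic is trivial. A tiny sketch; the rail currents below are invented, not measurements:

```python
RAIL_VOLTAGES = {"5V": 5.0, "12V": 12.0}  # 2.5" SATA drives typically use only the 5V rail

def drive_power_w(rail_currents_a):
    """Total drive power in watts from per-rail current readings (amps)."""
    return sum(RAIL_VOLTAGES[rail] * amps for rail, amps in rail_currents_a.items())

# Invented example readings for a 3.5" drive:
print(f"idle ~ {drive_power_w({'5V': 0.55, '12V': 0.35}):.2f} W")
print(f"seek ~ {drive_power_w({'5V': 0.70, '12V': 0.60}):.2f} W")
```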

"Benchmarks:

-PCMark05

-HDTune Pro (full diagnostics/analysis)

-Crystal Disk Mark

-Boot time

-Time it takes to copy one large folder on the drive"

About boot time (and other real-life tests like gaming):

remember to do it consistently - always do an image-to-HDD restore instead of a re-install. It is necessary that the HDD's contents be exactly the same on all drives tested, including the same amount of fragmentation. (If the imaging software defragments - i.e., doesn't do a sector-by-sector image - then the fragmentation will be reset to zero, but at least we know it's zero on all drives in the comparison.)

You could try copying a large folder from X:\FOLDER1 to X:\FOLDER2. That way (depending on the OS) it will write in very small chunks and be severely limited. Also try two simultaneous reads from the same source drive: X:\FOLDER1 to Y:\WHATEVER and X:\FOLDER2 to Z:\DUNNOWHERE.
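A rough sketch of how those timed copies could be scripted (the paths are just the placeholders from above; remember to clear the OS file cache, or reboot, between runs so caching doesn't skew the numbers):

```python
import shutil, threading, time

def timed_copy(src, dst):
    """Copy a directory tree and return elapsed seconds (dst must not exist yet)."""
    start = time.perf_counter()
    shutil.copytree(src, dst)
    return time.perf_counter() - start

# Test 1: same-drive copy, X:\FOLDER1 -> X:\FOLDER2.
print("same-drive copy:", round(timed_copy(r"X:\FOLDER1", r"X:\FOLDER2"), 1), "s")

# Test 2: two simultaneous reads from the same source drive to two other drives.
results = {}
def worker(name, src, dst):
    results[name] = timed_copy(src, dst)

t1 = threading.Thread(target=worker, args=("copy1", r"X:\FOLDER1", r"Y:\WHATEVER"))
t2 = threading.Thread(target=worker, args=("copy2", r"X:\FOLDER2", r"Z:\DUNNOWHERE"))
start = time.perf_counter()
for t in (t1, t2): t.start()
for t in (t1, t2): t.join()
print("both copies finished in", round(time.perf_counter() - start, 1), "s", results)
```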

Obviously these tests are highly OS-dependent, and before running real-life tests you'd need to restore the OS from the image and not apply any updates prior to testing.

"Others::

-Temperature measurement w/ HDTune

-Subjective noise measurement @ idle/load

-Warranty comparison to other brands

-Brief review of included software (if applicable)"

HDTune temperature is worthless, as is any other software-based "measurement", since it's just relayed SMART data as reported by the HDD itself. It relies on the accuracy of the HDD's integrated temperature sensor, and there's a lot of variance between models and manufacturers.

Subjective noise measurement would be good. Objective measurement is actually almost impossible without an anechoic chamber. SilentPCReview does subjective and objective measurements plus sound recordings (downloadable MP3). They also take noise spectrums and such. That's the place for hardcore silencers anyway, so you probably wouldn't be able to challenge their expertise. (Note: objective measurement has to be done at a distance over 100 mm - not at 3 mm - because of near-field effects that will distort the measurement. A measurement distance over 10 cm will, however, require absolute ambient silence in the room.)

Warranty is worth mentioning. No need to make a big number of it or rant about it for paragraphs, like some Seagate fanboys/reviewers have done in the past (until Seagate dropped its 5-year warranty to 3 years, LOL!)

A brief review of the bundle is appropriate. Most customers are there for the hardware, so it's fine as long as the focus isn't directed too much at the software.

Kittle:

"Other possibilities to check:

- How long to format the drive?

- Some way to measure vibration?

- Ditto on the load times for some games and/or software packages.

- Maybe measure how long it takes to install Windows or some other big-footprint software package."

The format-time check is worthless. "Full format" is basically a verify pass, and if they run an HDTune full read or verify pass, the time it takes to finish should be the same as a "full format". It might be worth mentioning that the time HDTune takes to run corresponds to a "full format". "Full format" should always be used with irony, as a quick format is the same thing minus the verify scan - and you can run a verify scan later with "CHKDSK X: /R". Therefore only idiots run the "full format" routine nowadays...

Vibration measurement is best done subjectively (the hold-it-in-your-hand method). There's high variance between drive samples: variance in vibration is great, whereas drives belonging to the same model tend to sound more or less the same acoustically. You'd need several samples to have something meaningful.

The install-software idea is good... as long as it's unattended, so that measurements are accurate and don't depend on user reaction. Even though system boot-up tests should be conducted on an OS freshly restored from an image, OS installation could be a separate test (the resulting OS installation should be instantly discarded and not used in any other tests - that's my point). Anyway, a good suggestion.
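If someone wants to script the timing, a bare-bones sketch is below; the installer path and the silent switch are placeholders, since the actual unattended flag depends entirely on the package being installed:

```python
import subprocess, time

def timed_unattended_install(cmd):
    """Run an installer non-interactively and return elapsed seconds.
    `cmd` is something like [r"D:\installers\setup.exe", "/quiet"]; the
    silent switch shown here is only a placeholder and varies per installer."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)  # blocks until the installer process exits
    return time.perf_counter() - start

# Hypothetical usage; discard the resulting installation afterwards, as noted above.
elapsed = timed_unattended_install([r"D:\installers\setup.exe", "/quiet"])
print(f"install took {elapsed:.1f} s")
```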

Kittle:

"All good stuff here - but I would categoricly AVOID "subjective noise measurements" Get a microphone or sound meter to use for your measuring. then if several people do reviews, the numbers will be consistant. The ideal case here would be to construct a holder for the sound meter or microphone and attach it to whatever you use to hold the drive under test.

You can also check out www.silentpcreview.com to see what that guy did with a homemade anechoic chamber and a seriously sensitive microphone. Probably overkill for here - but I'm sure you can pick up some ideas."

You refer to SPCR, but ironically SPCR reviewers and users give much more weight to the subjective than the objective measurements - even though their objective noise measurement method is by far the best used by reviewers and... I'd say the best in the industry. Exaggeration? Not really. Best doesn't mean most expensive; best is the one giving the most meaningful measurements. Even HDD manufacturers measure from a few millimeters' distance, making the measurement worthless (because the measurement is done only at the center of the top cover, not at every point around the drive). When you move 10...20...30 cm away, the effect of where the microphone / sound-pressure-level (SPL) meter is placed is reduced dramatically.

If new SR reviewers want to noise-test properly, and don't mind spending money on a very sensitive microphone, much time on building a home-made anechoic chamber (basically sacrificing a whole room to testing computer hardware), time on analyzing noise spectrums, and ad-hoc additional tests when needed to locate and eliminate specific noises (like when they placed a metal plate in proximity of a whining Seagate to make it quiet), etc., then you could compete. I don't think we need SR and SPCR competing, but if you do, you'd better copy their methods to the letter, because they've done this for a long time and received a lot of input from their forums.

You refer to SPCR, but ironically SPCR reviewers and users give much more weight to the subjective than the objective measurements - even though their objective noise measurement method is by far the best used by reviewers and... I'd say the best in the industry. Exaggeration? Not really. Best doesn't mean most expensive; best is the one giving the most meaningful measurements. Even HDD manufacturers measure from a few millimeters' distance, making the measurement worthless (because the measurement is done only at the center of the top cover, not at every point around the drive). When you move 10...20...30 cm away, the effect of where the microphone / sound-pressure-level (SPL) meter is placed is reduced dramatically.

Um, I don't believe that's how acoustics are measured at HDD makers. At least, that's not how I've seen it measured. If it's a single point - which I *believe* is for sound pressure, not sound power measurement - then that's done in one spot. But usually, HDD specs are for sound power, which uses 10 mics placed a specific distance away (and I'm pretty sure it's more than just a few mm). There are also other types of measurements done, but that's my understanding of it.

Still, the specs will only tell you how a drive measures on its own. They won't tell you what it would sound like mounted in a specific chassis in a specific manner. While the big OEMs have testing done with drives in their chassis(es?), I don't believe that'd be practical for SR to try and replicate.


Yeah, I meant sound pressure, not power. And it may be that they publicize only sound power and not pressure. Most review sites use sound pressure measurements. The old SR measurement was ridiculously close to the HDD surface, as are measurements taken by [pick any e-reviewer (except SPCR)]. Sound power measurement might be better (i.e., it measures all acoustic energy from the source within a time interval), but it's still just a single number that doesn't tell much. From the raw data (noise spectrum) you can make several dB readings: unweighted, dB(A), etc., but no human ear is exactly "A-weighted", even though A-weighting is designed to somewhat simulate the typical ear. Also, computer cases may amplify or attenuate certain frequencies more than others, and the A-weighted measurement would need to be done _after_ the drive is installed in a system... which by definition is system-specific. Every computer case is different.
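For context, turning an unweighted spectrum into a single dB(A) figure is just a matter of applying the standard IEC 61672 A-weighting curve to each band and summing the energies. A minimal sketch; the octave-band levels below are invented purely for illustration:

```python
import math

def a_weight_db(f_hz):
    """IEC 61672 A-weighting correction (dB) at frequency f_hz; ~0 dB at 1 kHz."""
    f2 = f_hz ** 2
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.00

def overall_dba(band_levels):
    """Sum per-band unweighted SPLs (dB) into a single dB(A) figure."""
    return 10.0 * math.log10(
        sum(10.0 ** ((spl + a_weight_db(f)) / 10.0) for f, spl in band_levels)
    )

# Invented octave-band levels (Hz, dB SPL) for illustration only.
bands = [(125, 28.0), (250, 26.0), (500, 24.0), (1000, 22.0),
         (2000, 20.0), (4000, 18.0), (8000, 15.0)]
print(f"overall ~ {overall_dba(bands):.1f} dB(A)")
```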

So: too many variables, and too much difficulty for the meaningfulness of the results obtained. Just do subjective evaluation of noise and vibration. Subjective evaluation can be good enough if it's based on a side-by-side comparison to a known reference drive... several references, actually. A known noisy one (a flagship model like the 5-platter 7200rpm Hitachi 7K2000 or a 4-platter flagship from WD or Seagate), a relatively quiet one (a 1-platter 7200rpm drive, or a 4-platter 5400rpm drive), and a very quiet one (a 1-platter 5400rpm drive). The "very quiet" reference might also be a 2.5" 5400rpm HDD, but it should be remembered that there's a big difference in noise output between 2.5" HDDs. Some 2.5" HDDs are noisier than 3.5" 5400rpm HDDs.

Making a good audio comparison requires that you keep the reference drives always usable. That also means you should handle them with care and store them in anti-static bags (offline) to minimize the likelihood of your reference dying. (Unless one has several samples of the reference drive AND has verified that all samples sound the same in side-by-side comparison.)

And when measuring HDD noise, it's better to use an external power brick or a fanless PSU to power the drive (though for an ATX power supply not connected to a motherboard, you need to short two pins on the motherboard plug - typically the green PS_ON wire to a black ground wire - to make the PSU believe the power switch has been pressed), while keeping the HDD outside of the testbed computer case (and the testbed powered down to drop the ambient noise level). For performance testing, use the PSU inside the case; it doesn't matter how much noise that PSU produces since the audio recording would already be done...


Someone already mentioned collaborating with SPCR, and I suggested this to Eugene here & Mike over at SPCR a few years ago: the most cost-effective way to get good sound measurements would be to send the drives to Vancouver after SR testing, so Mike can do acoustic testing - or vice versa (SR gets to do performance testing on drives SPCR receives for review).

SPCR used to mention Storage Review for readers to find performance numbers when they did reviews of hard drive acoustics, heat and power. It seems like the best of both worlds - more samples for both sites, both sites get more complete information for their databases, and if you allow SPCR to publish SR's performance data and SPCR allows SR to publish SPCR's acoustic data (and add it to the performance database), then both sites get to stick to their strengths.

If that sounds like a good idea, get in touch with Mike Chin at Silentpcreview.com - his email address is published on that site, so I won't repeat it here - and see if you can come to an agreement.

Potential downsides? The drives get shipped twice, doubling the chance of damage that may affect acoustic characteristics. Might be best to do acoustic testing first, then ship to SR for performance testing. Free review samples are often kept by the reviewing site, so that could be a bone of contention - maybe the site that keeps the drive pays the postage? Or get into the habit of returning the drive to the site that obtained it in the first place.

This wouldn't be needed for SSDs or other silent devices. So SR would be on its own obtaining samples of those.


As for the rest, measure power consumption on 3.3V, 5V & 12V - peak at startup, average at idle and seek/load. Check out previous methodology articles at SR for details of how to do this with a multimeter etc. Battery life is not necessary.

Benchmarks - I'll echo the above comments for repeatability. Best would be to capture disk accesses from real world usage and play them back to each drive - SR used to do this, and Anandtech has recently started doing something similar. I think that kind of real world testing is critical to set SR apart from the "anyone who can run free benchmarks" crowd. Boot time, game level load time, time to complete intensive multitasking office/productivity work, and server benchmarks (random I/O) for various queue depths for every drive.
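As a concrete (if crude) illustration of the replay idea, a sketch like the one below can re-issue a captured sequence of accesses against a large test file. The CSV trace format and file names are invented, and a purpose-built tool would use unbuffered/direct I/O and preserve inter-request timing, which this does not:

```python
import csv, os, time

def replay_trace(trace_csv, target_path):
    """Replay a captured access trace against a pre-created test file on the
    drive under test and return elapsed seconds. Each CSV row is assumed to be
    offset,size,op (op is 'R' or 'W') -- an invented format for illustration."""
    start = time.perf_counter()
    with open(target_path, "r+b", buffering=0) as target, open(trace_csv, newline="") as f:
        for offset, size, op in csv.reader(f):
            target.seek(int(offset))
            if op == "R":
                target.read(int(size))
            else:
                target.write(os.urandom(int(size)))
    return time.perf_counter() - start

# Hypothetical usage: replay a captured desktop workload against X:\testfile.bin.
print("replay took", round(replay_trace("desktop_trace.csv", r"X:\testfile.bin"), 1), "s")
```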

Retail packs and software - mention what the retail pack includes, comment on the value (both financial and in usefulness) of the package, detail any differences in warranty between retail and OEM drives, but I wouldn't go so far as to make or break a review on warranty - unless it's only 1 year or something, short enough to kick up a fuss. Most will care about performance and acoustics, any info about reliability they can get, and price before considering warranty.

Price is a tricky aspect to incorporate - you need a level playing field, such as comparing against other products that are around the same price in the same price comparison engine. You could use MSRP, but that can vary widely from street price, and you put older drives at a disadvantage because they may have dropped in price since their MSRP was set. And of course it's only useful to compare value with products that are still available.

If the SPCR collaboration comes to naught, both subjective and reasonably objective noise and vibration measurements would be essential - you could score subjectively on noise and vibration and then include that in the performance database for a quick comparison. Maybe allow people to compare only the drives that scored 8 or better for noise, for example.


SPCR collaboration for sharing samples and cross-advertising each other's corresponding reviews: seconded. SPCR's reviews have often contained a suggestion to check SR for a performance review of the product under test. Only lately (when SR became totally inactive) did they start adding some performance benchmarks (though they're still very limited). They also do wattage measurements, but SR's method is far more comprehensive, with peak measurements and separate 5V and 12V readings.

Wattage measurements are of paramount importance: all electric power gets turned into heat, and thus power consumption measurement is the most accurate method of estimating typical HDD temperatures. Temperature probe measurements (or, even worse, SMART temperature monitoring) are prone to hot and cold spots, so measurements cannot be compared between competing products. Power consumption has a roughly linear relation to temperature delta (HDD temperature above ambient). Cooling is mostly done by convection (= airflow) and conduction (= from the HDD to the computer case) and not by radiation; radiated power grows with the fourth power of temperature rather than linearly, and by the time radiation became the dominant heat-loss method, the HDD would already be dead from extreme overheating by quite a fair margin. As a side product you get peak amperages (or peak wattages, depending on how you want to view the same thing) on each voltage supply line, which is quite useful for people building big HDD arrays that have to boot off a regular PSU (especially if staggered spin-up is not supported).
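To make the power-to-temperature point concrete: if you treat the drive and its mounting as a fixed thermal resistance (a value you would have to calibrate per testbed; the number below is invented), the expected rise above ambient falls straight out of the measured wattage:

```python
def estimated_temp_rise_c(power_w, thermal_resistance_c_per_w):
    """Rough steady-state temperature delta above ambient, assuming convection
    and conduction dominate so that delta-T scales roughly linearly with power."""
    return power_w * thermal_resistance_c_per_w

# Invented figures: a drive dissipating 6 W in a mounting with an effective
# thermal resistance of ~3 C/W would sit roughly 18 C above ambient.
print(estimated_temp_rise_c(6.0, 3.0), "C above ambient")
```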

So I'm definitely against any temperature measurement, as wattage measurement is a far more useful tool for approximating HDD temperatures in computer cases that differ from the one used in the review. And a battery life test is, as pointed out, not good, since it adds external variables not related to the object being reviewed. You can't really notice the battery-life difference on a laptop between a 1.0 watt HDD and a 1.5 watt HDD, since a 0.5 watt difference can be masked by the margin of error. The problem is similar to what many review sites do: they measure whole-system power consumption after swapping in the component to be reviewed. Put a WD GreenPower, an SSD, or a 5-platter 7200rpm Hitachi into a 100+ or even 200+ watt gaming system, and I don't think a measurement at the PSU's mains intake is accurate enough to give proper results. StorageReview's method of measuring the reviewed product's power consumption from the PSU output wires is far more accurate, as there's no need to subtract base system consumption and no added margin of error related to the base system's consumption. This problem, plus battery-life deterioration over time, makes battery-life measurement for HDD reviews nonsensical. (Note: battery-life measurement is extremely relevant when reviewing a (complete) laptop system as a whole. But that's not what StorageReview is about...)


Regarding the sound measurements of the drives...

I wasn't suggesting that SR use the same methods SPCR does. Mike has quite a sophisticated setup, and his focus is not the same as this site's. What I meant was to take a look at what he has (and it sounds like most here already have) and see what ideas (if any) you can get from it.

My suggestion in a nutshell is this:

- get a decent/good quality sound level meter

- include 3 measurements with each of your reviews: Ambient dB, drive idle dB, drive seek dB.

Ambient being what the meter says with the drive not powered up.

Will this match exactly what users have in their system? NO. And it's not supposed to - it's just a two-minute test to give us an idea of how noisy the drive is.

IMO, most of the time ambient and idle will be identical, and seek will be a tiny bit more, if anything. But for any drive that really IS noisy, the numbers won't lie.


Kittle - that won't be a problem. We've already planned to do that test from a fixed distance each time. We're also working on another method that may mitigate some of the ambient noise issue.


Closing this topic; please direct new comments here, where we've updated the methodology approach -
