Sivar

Seeking suggestions from readers for name of new SR awards


The problem with performance awards is that they become obsolete almost instantly.

I should have said: obsolete within the product's lifetime.

-----------------
A couple of notes:

1) These elevated awards are being considered because of complaints that the only real one being used, that ubiquitous Safe Buy, doesn't do enough to distinguish between drives.

Hmm, one would think the text of the review would be necessary to do that. Perhaps a one-size-fits-all standardization of drive ratings doesn't do justice to the complexity of disk drive properties. People often pine for rule-of-thumb simplicity in situations that are not well characterized by it.

I agree, reparations. What I think is lacking is a better Conclusion/Exec Summary for current drive reviews, one that actually gives an expert assessment of the drive instead of simply presenting some graphs and data, which is what the reviews do now. I don't think a Top Performer Award or a Quietest Drive award would do the drives justice. It should be a general award linked to a vastly improved Conclusion/Exec Summary page that is both concise and easy to scan (bulleted lists), along the lines of the mockup below (a rough sketch of the structure follows it):

Pros:

1.

2.

3.

Cons:

1.

2.

3.

Who Should Buy It? (recommended application)

Ratings

Desktop Performance: A-

Server Performance: C+

Ease of Integration: B+

User Rating: 72%

Read User Opinions (links to a poll about this drive with user comments accessible)
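
To make the shape of this concrete, here is a minimal sketch of how such a summary block could be represented; every field name and value below is hypothetical, not an actual SR schema:

```python
from dataclasses import dataclass

# Hypothetical structure for the proposed Conclusion/Exec Summary block;
# field names and values are illustrative, not an actual SR schema.
@dataclass
class ReviewSummary:
    pros: list[str]          # bulleted strengths
    cons: list[str]          # bulleted weaknesses
    recommended_for: str     # the "Who Should Buy It?" blurb
    desktop_grade: str       # e.g. "A-"
    server_grade: str        # e.g. "C+"
    integration_grade: str   # e.g. "B+"
    user_rating_pct: float   # from the reader poll, e.g. 72.0
    opinions_url: str        # link to the poll and user comments

summary = ReviewSummary(
    pros=["strong desktop scores", "quiet idle", "8 MB buffer"],
    cons=["mediocre server scores", "runs warm", "premium price"],
    recommended_for="Performance desktops and workstations",
    desktop_grade="A-",
    server_grade="C+",
    integration_grade="B+",
    user_rating_pct=72.0,
    opinions_url="https://example.com/poll/drive-x",  # placeholder
)
```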

-----------------

Drive of the Year Award is somewhat ambiguous and subjective, but I agree with Eugene -- it should go to a drive that has done something significant for hard drive technology. The first 8 MB buffer drive would be a good example. The 2002 award might go to the Cheetah 15k.3 because it completely removed the integration barriers of heat and noise that previous-gen 15k units were plagued with. The 2003 award is a no-brainer -- the WD Raptor.

-----------------
I would like to see: 

1. Editor's Choice Award 

2. Reader's Choice Award

I've never found a readers' choice award to be a good idea. Often what you end up with is mob mentality or a flavor-of-the-month popularity contest (in the bad sense) rather than the most deserving candidate winning. For example, IBM (Hitachi) is still trying to dig out of the hole that the 75GXP created three years ago, despite having released multiple top-notch drives since. Currently, if I were going to pick the best-performing ATA drive, I would take the 180GXP by a nose over the WD2000JB, but in a poll I guarantee that no IBM drive would get even a sniff of the board, due to the popular and unjust view that IBM drives are still terribly unreliable.

The other problem with reader awards is that it is a very, very rare occasion when a reader has actually used all the drives in question. I can't tell you how often I've seen posts that take the following form:

"I just upgraded from a (fill in 3 year old 5400RPM drive) to a (fill in average performing previous generation 7200RPM drive) and the difference is night and day. This drive is the greatest!!"

Is that information of any real use to anyone, and would you want to base your buying decision on this person's vote?

Those who actually own the drive and enter it into the drive reliability database should be given twice the weight of voters who don't actually own the drive.

Again, this stipulation wouldn't mean anything. My owning drive X gives me no knowledge of how drive Y performs. A drive should not be penalized just because someone doesn't own it.

1) These elevated awards are being considered because of complaints that the only real one being used, that ubiquitous Safe Buy, doesn't do enough to distinguish between drives.

I thought that was the purpose of the leaderboard. Each leaderboard category should be an award: instead of sticking the "Safe Buy" award on every drive, if a drive is worthy of making it onto the leaderboard, it gets the corresponding award for the category it tops. This also eliminates any time-based problems, as it will always be up to date. As soon as a new drive ascends to the throne and is available for purchase, it moves onto the leaderboard.
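
A rough sketch of that mechanism, with hypothetical category names and example drives (this is not SR's actual leaderboard code):

```python
# Each leaderboard category doubles as an award: the award simply names
# whichever shipping drive currently tops the category, so it can never
# go stale. Categories and drives are hypothetical examples.
leaderboard = {
    "Fastest ATA drive": None,
    "Fastest SCSI drive": None,
    "Quietest drive": None,
}

def crown(category, drive, available):
    """A new drive takes the award only once it is actually purchasable."""
    if available:
        leaderboard[category] = drive

def current_awards():
    """The awards page is just a live view of the leaderboard."""
    return {cat: drive for cat, drive in leaderboard.items() if drive is not None}

crown("Fastest ATA drive", "WD2000JB", available=True)
print(current_awards())  # {'Fastest ATA drive': 'WD2000JB'}
```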

2) The DotY would not necessarily be the fastest unit that we've reviewed as of Nov 30, Dec 31, or whatever. Rather, it would go to the drive that we felt made the biggest stride in improvements within the calendar year. For example, we were first considering the award late 2001. Though the WD1200JB was the king of the hill by Dec 31st, the award would have gone to the WD1000JB since it was the drive that advanced IDE buffer sizes to 8 megs.

Good idea in theory, terrible in practice. This would only work with a well-informed audience that reads everything, including the description of the award, which, unfortunately, the average reader stopping by this site just to view the awards page would not do. If something is given an ambiguous award title like DOTY, it is automatically assumed to be going to the best-performing drive, which opens SR up to a ton of criticism if that's not what the criteria for DOTY were.

The 2002 award might go to the Cheetah 15k.3 because it completely removed the integration barriers of heat and noise that previous-gen 15k units were plagued with. The 2003 award is a no-brainer -- the WD Raptor.

8MB buffers were not invented by the WD JB; SCSI drives had buffers that large for years, and even 16MB drives existed. So to say it was something really innovative... well, it wasn't. I really don't view slapping an extra 6MB of cache on a drive as terribly innovative or as overcoming some engineering obstacle.

The 15k.3 introduces another issue with technology-based DOTY awards. Yes, it was the best-performing drive to date, but it was simply a continuation of the technology used by the X15-36LP; there really wasn't anything innovative about it. There was no drive in 2002 deserving of any sort of innovation award, as there was nothing beyond generational improvements the whole year, and I don't think any drive in 2001 deserved the award either. I would go back to 2000 and the release of the X15 for the last drive that really deserved an innovation award for truly breaking down barriers.

I do agree that the Raptor is deserving of this year's innovation award so far (it's only March): it is the first consumer-oriented 10k drive and an SATA drive, creating the first real reason besides thin cables for people to begin migrating to SATA. Still, an annual award that is given to a deserving product only once every three years doesn't seem very appropriate.

Who Should Buy It? (recommended application) 

Ratings 

Desktop Performance: A- 

Server Performance: C+ 

Ease of Integration: B+ 

User Rating: 72% 

Read User Opinions (links to a poll about this drive with user comments accessible)

I agree that the conclusions should be more detailed and more opinionated, to give readers a better indication of what SR thinks about each drive, but I wouldn't go as far as grading, because the baseline for the scale would change over time. The Barracuda V might be graded an A+ for usability today; what happens when the next drive comes along and halves the noise levels and decreases the heat? Is it an A++ now? Then the next generation improves on that, and you see where this is going. Grading the products eliminates the ability to compare drives over time.

-----------------

Good thoughts as usual, KG. I know I can count on you to provide good counterpoints in our debates.

-----------------

I agree that the conclusions should be more detailed and more opinionated, to give readers a better indication of what SR thinks about each drive, but I wouldn't go as far as grading, because the baseline for the scale would change over time. The Barracuda V might be graded an A+ for usability today; what happens when the next drive comes along and halves the noise levels and decreases the heat? Is it an A++ now? Then the next generation improves on that, and you see where this is going. Grading the products eliminates the ability to compare drives over time.

It may be possible to include a performance rating that is tied into the database, similar to the reliability rating. A drive could be marked with a percentage for desktop and for server performance: its total score across all the desktop/server benches, expressed as a percentage of the sum of the highest scores across all drives.

In case that's not clear, here's what I mean. A reference number would be obtained by adding up the highest test scores posted on each of the desktop-related benches. Then all the desktop-related bench scores of the drive in question are added up, and the result is expressed as a percentage of the reference number.

The same is done with the server benches.
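
As a sketch of that calculation, using made-up benchmark names and scores (higher meaning better) rather than real SR results:

```python
# Reference figure: the best score recorded on each desktop bench by any
# drive in the database. All numbers here are hypothetical examples.
best_desktop_scores = {"Office": 520, "HighEnd": 610, "Gaming": 480}

# The drive under review, on the same benches.
drive_desktop_scores = {"Office": 470, "HighEnd": 540, "Gaming": 455}

def percentage_rating(drive_scores, best_scores):
    """The drive's total as a percentage of the sum of per-bench bests."""
    reference = sum(best_scores.values())
    total = sum(drive_scores[bench] for bench in best_scores)
    return 100.0 * total / reference

rating = percentage_rating(drive_desktop_scores, best_desktop_scores)
print(f"Desktop rating: {rating:.1f}%")  # Desktop rating: 91.0%
# Rerunning this whenever a new review raises a per-bench best keeps the
# rating current, as described below.
```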

In the drive review, if this is all linked to the database like the reliability section of the review, the score would be automatically adjusted over time, so a current (and therefore relevant) rating would always be shown. For historical interest, the rating the drive achieved when it was released should also be listed in the review.

These ratings could also be used by the leaderboard and thus be updated with every review rather than once every, erm... bit longer :wink:

-----------------

How about a "Deathstar" award to honor IBM's hideous 75GXP and any unreliable drives that come thereafter? :twisted:

Editor's Choice is good; however, the current Safe Buy is not bad either. A "Performance Award" would be good to distinguish performance-oriented drives from "Safe Buys", maybe in addition to the current Safe Buy.

Eugene, do you know something we don't, when there is apparently a "15,000RPM IDE" category?

-----------------

I like e_dawg's suggestion to improve the classification and the ease of comparing test results.

I think most people here have specific preferences regarding the criteria and need good tools for comparison rather than awards that veil the details.

On the other hand, I can understand SR introducing such awards for publicity reasons.

:roll:

-----------------

What about a grading scale based on what else is currently out there in its class? For example, a -5 to +5 scale where 0 is par. A -5 would mean that the drive was excessively bad in that area (i.e., in the noise category, a 70 dB whine and 80 dB clicks), and a +5 would mean that it just moved the curve.

15k drives and 5400RPM drives would obviously be separate classes.
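
One possible reading of that scale, sketched with hypothetical numbers: take par (0) as the class average and one step per class standard deviation, clamped to [-5, +5]. The step size is my own assumption, not something specified above:

```python
from statistics import mean, stdev

def par_grade(value, class_values, lower_is_better=False):
    """Grade a metric from -5 to +5 relative to its class; 0 is par.

    One grade step per class standard deviation is a simplifying
    assumption. For metrics such as noise, where lower is better,
    the sign is flipped.
    """
    avg, sd = mean(class_values), stdev(class_values)
    steps = (value - avg) / sd if sd else 0.0
    if lower_is_better:
        steps = -steps
    return max(-5, min(5, round(steps)))

# Idle noise (dB) for hypothetical drives in one class (7200RPM ATA).
class_noise = [40.0, 42.5, 44.0, 45.5, 47.0]
print(par_grade(41.0, class_noise, lower_is_better=True))  # 1 (quieter than par)
```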
