Kevin OBrien

Corsair Performance Pro SSD (256GB) Review Discussion

I agree. Another problem with including too many Sandforce drives is that each one takes up two chart slots (one for compressible data and one for random data).

I suggest the following SSDs be included for comparison with the SSD under test in the future:

OCZ Vertex 3

Corsair Performance Pro

Crucial m4

Samsung 830

Those are all close competitors at the very top of the consumer performance spectrum, so they make good benchmarks against which to compare new SSDs. It is also a good selection of manufacturers, controllers (1 Sandforce, 2 Marvell, 1 Samsung), and flash (IMFT, Toshiba toggle, Samsung).

That would be a wonderful line-up to see thoroughly tested (though I would like to see an Intel 320 included too), particularly with a steady-state comparison. My interest is high performance for low-end servers (i.e. pricing dictates MLC drives), so that would certainly be valuable for me, and I'm sure for others as well.

That would be a wonderful line-up to see thoroughly tested (though I would like to see an Intel 320 included too), particularly with a steady-state comparison. My interest is high performance for low-end servers (i.e. pricing dictates MLC drives), so that would certainly be valuable for me, and I'm sure for others as well.

Yes, the Intel 320 would be a good benchmark for the lower end of the performance spectrum. Once the Intel 520 comes out, it could replace the Vertex 3 in the lineup (assuming it has a Sandforce controller).

I agree. Another problem with including too many Sandforce drives is that each one takes up two chart slots (one for compressible data and one for random data).

I suggest the following SSDs be included for comparison with the SSD under test in the future:

OCZ Vertex 3

Corsair Performance Pro

Crucial m4

Samsung 830

Those are all close competitors at the very top of the consumer performance spectrum, so they make good benchmarks against which to compare new SSDs. It is also a good selection of manufacturers, controllers (1 Sandforce, 2 Marvell, 1 Samsung), and flash (IMFT, Toshiba toggle, Samsung).

Some of the manufacturers you listed don't pay enough to have their drives included. You are 110% correct: they should all be included for a complete, unbiased review.

As for SSD speed, I think for the most part it's hogwash at this point in time! At these prices we want reliable SSDs, along with discussions and testing to determine long-term reliability!!!

We all know who the guilty parties are.

Edited by Ricky_005

Some of the manufacturers you listed don't pay enough to have their drives included. You are 110% correct: they should all be included for a complete, unbiased review.

I understand your frustration, but you are making rather large accusations about this site and its editors. In my opinion this has no place here unless you have some kind of "proof"; otherwise it's simply unnecessary defamation. I think they are doing a great job and responding to both criticism and suggestions from here.

As for SSD speed, I think for the most part it's hogwash at this point in time! At these prices we want reliable SSDs, along with discussions and testing to determine long-term reliability!!!

Well, I think there are clear speed differences with different workloads, but I do think any sort of long-term testing would be interesting. You'll need to be more specific about what you want to see here, though. If it's simply endurance or durability, you can write to the drives 24/7 for days -- though I'm not sure how useful that would be with a sample size of only a few drives, not to mention that most manufacturers provide TBW ratings now (I think?). If you are talking about long-term performance testing, I guess the SR Steady State IOMeter tests address this to some extent, though I think SR could improve here by disclosing their methodology in more detail, and I would love to see before and after HD Tach plots like this for the Samsung 830 on AnandTech.
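
To give a concrete idea of the endurance angle, here is a rough Python sketch of what I mean (the target path, region size, and logging interval are made-up placeholders, and it will genuinely wear out whatever drive you point it at):

    import os
    import time

    # Hypothetical scratch file on the SSD under test -- illustration only.
    TARGET = "/mnt/ssd_under_test/endurance.bin"
    BLOCK_SIZE = 1024 * 1024             # 1 MiB per write
    REGION_SIZE = 8 * 1024 ** 3          # recycle an 8 GiB region
    LOG_INTERVAL = 60                    # seconds between log lines

    def endurance_loop():
        total_bytes = 0
        last_log = time.time()
        with open(TARGET, "wb", buffering=0) as f:
            while True:                  # run until interrupted
                f.seek(0)
                written = 0
                while written < REGION_SIZE:
                    f.write(os.urandom(BLOCK_SIZE))   # incompressible data
                    written += BLOCK_SIZE
                os.fsync(f.fileno())                  # flush OS buffers to the device
                total_bytes += written
                if time.time() - last_log >= LOG_INTERVAL:
                    print("%.3f TB written so far" % (total_bytes / 1e12))
                    last_log = time.time()

    if __name__ == "__main__":
        endurance_loop()

Even then, with only one or two samples per model, the result says more about that particular unit than about the product line, which is part of my hesitation.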

I would love to see before and after HD Tach plots like this for the Samsung 830 on AnandTech.

The problem with Anand's HDTach tests is that HD Tach writes highly compressible data. That led Anand to make (actually, he continues to make) erroneous claims that Sandforce SSDs are much better than some others at sustained write performance. In reality, the Sandforce SSDs are not especially good at sustained writes (as seen from SR's SS test with IOMeter and random data).
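
You can see the effect yourself with a quick sketch like this (the target path is hypothetical; point it at a scratch file, not anything you care about). It times the same amount of writing once with zero-filled buffers, which compress the way HD Tach's data does, and once with random buffers:

    import itertools
    import os
    import time

    # Hypothetical scratch file on the drive being compared -- illustration only.
    TARGET = "/mnt/ssd_under_test/compress_test.bin"
    BLOCK_SIZE = 1024 * 1024         # 1 MiB per write
    TOTAL_BYTES = 2 * 1024 ** 3      # 2 GiB per pass

    def timed_write(make_block):
        """Write TOTAL_BYTES using blocks from make_block() and return MB/s."""
        with open(TARGET, "wb", buffering=0) as f:
            start = time.time()
            written = 0
            while written < TOTAL_BYTES:
                f.write(make_block())
                written += BLOCK_SIZE
            os.fsync(f.fileno())         # include the flush in the timing
            elapsed = time.time() - start
        return (written / (1024 * 1024)) / elapsed

    zero_block = b"\x00" * BLOCK_SIZE                          # highly compressible
    rand_blocks = [os.urandom(BLOCK_SIZE) for _ in range(64)]  # pre-generated, keeps CPU out of the timing
    rand_pool = itertools.cycle(rand_blocks)

    print("compressible  : %.1f MB/s" % timed_write(lambda: zero_block))
    print("incompressible: %.1f MB/s" % timed_write(lambda: next(rand_pool)))

On a Sandforce drive the first number comes out well ahead; on the Marvell and Samsung controllers the two stay close, which is exactly why a compressible-data-only test overstates Sandforce sustained write performance.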

Some of the manufacturers you listed don't pay enough to have their drives included. You are 110% correct: they should all be included for a complete, unbiased review.

As for SSD speed, I think for the most part it's hogwash at this point in time! At these prices we want reliable SSDs, along with discussions and testing to determine long-term reliability!!!

We all know who the guilty parties are.

We make charts based on a number of variables, none of which is ad dollars. Crucial and Corsair are not even advertisers on our site. We'll try to get the 830 in more charts for you.

Some of the manufacturers you listed don't pay enough to have their drives included. You are 110% correct: they should all be included for a complete, unbiased review.

As for SSD speed, I think for the most part it's hogwash at this point in time! At these prices we want reliable SSDs, along with discussions and testing to determine long-term reliability!!!

We all know who the guilty parties are.

Until we convert over to a dynamic chart setup, we always need to make a decision about which drives get included in benchmarks and which don't. It's not a massive SSD conspiracy; it's a physical limit on how large our charts can be before they become too difficult to read. For most charts this means a cutoff of eight unique drives (four when both random and repeating data are shown). In cases where we are reviewing a drive from a manufacturer whose other models we have already reviewed, we include those for comparison along with similar drives. All of our current-generation numbers align, so nothing prevents you from looking at one drive you like and then another to see how it performed in our results. No company has control over how we compare drives or which models are included.

I understand your frustration, but you are making rather large accusations about this site and its editors. In my opinion this has no place here unless you have some kind of "proof"; otherwise it's simply unnecessary defamation. I think they are doing a great job and responding to both criticism and suggestions from here.

Well, I think there are clear speed differences with different workloads, but I do think any sort of long-term testing would be interesting. You'll need to be more specific about what you want to see here, though. If it's simply endurance or durability, you can write to the drives 24/7 for days -- though I'm not sure how useful that would be with a sample size of only a few drives, not to mention that most manufacturers provide TBW ratings now (I think?). If you are talking about long-term performance testing, I guess the SR Steady State IOMeter tests address this to some extent, though I think SR could improve here by disclosing their methodology in more detail, and I would love to see before and after HD Tach plots like this for the Samsung 830 on AnandTech.

It is hard to please every reader, which is why we have a discussion forum for bringing up talking points after a review is published. Adding more data, giving more comparisons, etc. are all possible through this channel.

A guide to our testing methods, equipment, and workflow is in the works, although right now getting content up takes higher priority. We are seeing more of these items completed, and we have a new test bed launching soon which should leave some time to get this all out. As for general before/after performance plots, the results will vary greatly depending on the workload you throw at the drive. After running a 4K steady-state benchmark, most drives put up terrible numbers until they normalize again; whether through GC or a secure erase, they work their way back to a resting state, which can take time. In a normal installation, the type of data written to the drive will make a huge difference in what that resting state looks like.
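
To put a rough shape on that, here is a hedged sketch of the kind of before/during/after sequence described above (this is not our actual test harness; the path, region size, and timings are placeholders). It measures sequential writes in a fresh state, hammers the drive with 4K random writes, measures again immediately, then gives the controller idle time for GC and measures once more:

    import os
    import random
    import time

    # Hypothetical scratch file on the drive under test -- placeholders throughout.
    TARGET = "/mnt/ssd_under_test/precondition.bin"
    REGION = 4 * 1024 ** 3           # 4 GiB region to work over
    SEQ_BLOCK = 1024 * 1024          # 1 MiB sequential writes
    RAND_BLOCK = 4 * 1024            # 4K random writes
    IDLE_MINUTES = 30                # idle time for garbage collection

    def seq_write_speed(seconds=30):
        """Rough sequential write speed (MB/s) over a fixed interval."""
        block = os.urandom(SEQ_BLOCK)
        written = 0
        with open(TARGET, "r+b", buffering=0) as f:
            end = time.time() + seconds
            while time.time() < end:
                if f.tell() + SEQ_BLOCK > REGION:
                    f.seek(0)                 # wrap around the region
                f.write(block)
                written += SEQ_BLOCK
            os.fsync(f.fileno())
        return written / (1024 * 1024) / seconds

    def precondition_4k(minutes=60):
        """Hammer the region with 4K random writes to push the drive off its fresh state."""
        with open(TARGET, "r+b", buffering=0) as f:
            end = time.time() + minutes * 60
            while time.time() < end:
                f.seek(random.randrange(REGION // RAND_BLOCK) * RAND_BLOCK)
                f.write(os.urandom(RAND_BLOCK))
            os.fsync(f.fileno())

    # Create the scratch region once so the opens above have something to write into.
    with open(TARGET, "wb") as f:
        f.truncate(REGION)

    print("fresh state    : %.1f MB/s" % seq_write_speed())
    precondition_4k()
    print("right after 4K : %.1f MB/s" % seq_write_speed())
    time.sleep(IDLE_MINUTES * 60)            # let garbage collection catch up
    print("after idle     : %.1f MB/s" % seq_write_speed())

How far the last number recovers toward the first, and how quickly, depends on the drive's GC behavior and on the data the preconditioning wrote, which is the point above.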

The problem with Anand's HDTach tests is that HD Tach writes highly compressible data. That led Anand to make (actually, he continues to make) erroneous claims that Sandforce SSDs are much better than some others at sustained write performance. In reality, the Sandforce SSDs are not especially good at sustained writes (as seen from SR's SS test with IOMeter and random data).

Getting more to this point on before/after charts: in one of our next enterprise reviews we look at steady state under multiple workloads. Each of those areas actually needs a conditioning period of a few hours in the new workload before results are consistent. It really does come down to the exact data you throw at the drive. If you happen to buy a drive just to run benchmarks, I guess it would be a decent metric to look at, but it would never really apply to most readers.

If SR is looking for guidelines on how to do steady-state tests and pre-conditioning on SSDs, SNIA has already released a couple of excellent documents for testing either consumer or enterprise SSDs:

http://www.snia.org/tech_activities/standards/curr_standards/pts

I would like to see a prominent site like SR be the first SSD review site to follow these industry-standard, best-practice guidelines, instead of the arbitrary, hit-or-miss tests that most review sites tend towards now.
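
The steady-state check in the PTS is simple enough to script. Here is a rough Python sketch of my reading of it (treat the window size and thresholds as my paraphrase; the actual spec is the authority): over a rolling window of five test rounds, the drive counts as being at steady state once the spread of results stays within 20% of the window average and the excursion of a least-squares best-fit line stays within 10% of it.

    def is_steady_state(rounds, window=5, excursion_tol=0.20, slope_tol=0.10):
        """Rough steady-state check in the spirit of the SNIA SSS PTS
        (a paraphrase of the criteria -- consult the spec before relying on it)."""
        if len(rounds) < window:
            return False
        y = rounds[-window:]
        n = len(y)
        avg = sum(y) / n
        # Data excursion across the window must stay within excursion_tol of the average.
        if max(y) - min(y) > excursion_tol * avg:
            return False
        # Least-squares slope of result vs. round number.
        x = list(range(n))
        x_mean = sum(x) / n
        slope = (sum((xi - x_mean) * (yi - avg) for xi, yi in zip(x, y))
                 / sum((xi - x_mean) ** 2 for xi in x))
        # Excursion of the best-fit line over the window must stay within slope_tol of the average.
        return abs(slope * (n - 1)) <= slope_tol * avg

    # Made-up 4K random write IOPS from successive test rounds.
    history = [41000, 33500, 29800, 28100, 27600, 27400, 27300, 27350, 27320]
    print(is_steady_state(history))     # True once the tail flattens out

Until a check like that passes, the preconditioning phase continues; measurements are only reported from the window where it holds.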

If SR is looking for guidelines on how to do steady-state tests and pre-conditioning on SSDs, SNIA has already released a couple of excellent documents for testing either consumer or enterprise SSDs:

http://www.snia.org/...r_standards/pts

I would like to see a prominent site like SR be the first SSD review site to follow these industry-standard, best-practice guidelines, instead of the arbitrary, hit-or-miss tests that most review sites tend towards now.

It's funny you mention that; we have been following SNIA, attending SNIA workshops this year, and working towards doing exactly this in our enterprise reviews. ;)

It will be a filter-down effect, where we hope to bring more enterprise-quality tests down to the consumer hardware reviews.
