Everything posted by johnw42

  1. Plextor PX-M3P SSD Announced Discussion

    When you test it, please be sure to run at least the 4KiB random write steady-state test. Any additional steady-state tests you can run would also be interesting, since Plextor touts that aspect of performance. Also, I'd like to see the following SSDs in the comparison charts: the Samsung 830, the Intel 520, a SandForce drive with toggle NAND flash (e.g., the OCZ Vertex 3 Max IOPS), the Crucial m4, and the Plextor M2P or Corsair Performance Pro.
  2. Plextor PX-M3P SSD Announced Discussion

    The M3P is now available for purchase in the US at Newegg. When will we see a StorageReview review of the M3P?
  3. That should be SATA 6Gb/s, i.e., 6 gigabits per second, not 6GB/s, which would be 6 gigabytes per second.
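To make the difference concrete, here is a quick conversion sketch (assuming SATA's 8b/10b line encoding, which spends 10 wire bits per data byte; the variable names are mine):

```python
# SATA 6Gb/s moves 6 gigabits per second on the wire.
# With 8b/10b encoding, 10 wire bits carry one data byte,
# so the theoretical payload ceiling is 600 MB/s, nowhere near 6 GB/s.
line_rate_gbps = 6            # SATA III line rate, gigabits per second
wire_bits_per_byte = 10       # 8b/10b encoding overhead
max_payload_mb_s = line_rate_gbps * 1000 / wire_bits_per_byte
print(max_payload_mb_s)       # 600.0
```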
  4. Yes, thank you for the clarification.
  5. That is simply not true. I personally know of two sites that use Intel consumer SSDs and/or Crucial consumer SSDs in heavy workload environments, and by word of mouth, I have heard of many others. In both cases that I am familiar with, they considered enterprise SSDs, but rejected them because it did not make sense to pay eight times as much when similar performance could be obtained by using multiple consumer SSDs in RAID.
  6. You may want to look at 'fio', which is basically a scripting language for performing IO tests. I have used the Linux version, but a web search can turn up Windows binaries if you need them. http://freshmeat.net/projects/fio/ Here is someone who wrote a shell script to do some rudimentary automation with 'fio' for SNIA SSS tests: http://storagetuning.wordpress.com/2011/11/07/sssi-performance-test-specification/ Of course, that is just a beginning. Much more sophisticated tests could be done with 'fio'.
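Since 'fio' jobs are just INI-style files, a minimal 4KiB random-write steady-state run might look like the following sketch (the device name, runtime, and other parameter choices here are illustrative assumptions of mine, not values from the SNIA spec):

```ini
; hypothetical 4KiB random-write steady-state job for fio
; CAUTION: writing to a raw device destroys its data
[global]
ioengine=libaio
; use ioengine=windowsaio on Windows
direct=1
filename=/dev/sdX

[ss-randwrite-4k]
rw=randwrite
bs=4k
iodepth=32
numjobs=1
time_based=1
runtime=3600
```

Run it with `fio jobfile.fio` and watch the reported write bandwidth settle toward steady state.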
  7. That is rather vague. Do you mean to say that you will not include less expensive SSDs for comparison in the reviews because you believe it would offend the manufacturers of the SSDs? I suppose that is the problem with relying on manufacturers to provide free review samples rather than purchasing the products to be tested.
  8. Which still does not tell us what preconditioning was done. It is not difficult to describe in a few sentences. For example: WIPC (workload independent pre-conditioning) was done with 4KiB random writes QD=32 0/100 R/W mix until reaching SS. Then WDPC (workload dependent pre-conditioning) was done with 2MiB sequential writes QD=4 0/100 R/W mix for 5 rounds until reaching steady state. For a good example of the type of tests and charts that should be included in SSD reviews, both consumer and enterprise, check the "MLC-A Full report" and "SLC-A Full report" links at the bottom of the table on this SNIA page: http://www.snia.org/forums/sssi/pts
  9. I do like the focus on steady-state performance. It is always good to know what the performance will be under near worst-case conditions. I do have a few suggestions:

     1) Please document the pre-conditioning procedures used to ensure the tests were conducted under steady state. I assume you did something like continuous 4KiB random writes at QD=32 while monitoring the write speed and waiting for it to stabilize. That procedure (or whatever SR used) should be mentioned in the reviews. Ideally, a pre-conditioning graph of write speed vs. time should be included, so that readers can see at a glance that the SSDs did indeed reach steady state before the testing. Such a graph is also useful for seeing how initial speeds compare to steady-state speeds.

     2) Please include one or two of the best consumer drives in your tests as a comparison, say the Corsair Performance Pro or one of the new Plextor models (M2P or M3S). This would be helpful for people who have enterprise-like heavy workloads but who opt to use consumer SSDs instead of enterprise SSDs. I think a lot of people will use consumer SSDs with enterprise-like heavy workloads, because enterprise SSDs often cost four to eight times what a consumer SSD costs. For such people, seeing how much they are giving up with the consumer SSDs vs. the enterprise SSDs will be helpful in deciding whether to pay the extra money for enterprise SSDs.

     3) I hope at least some of the new test procedures SR used in this review will be applied to every future consumer SSD review. Perhaps the consumer SSD reviews could report the usual (not steady state) data, but then also include SS test results for the 2MiB sequential, 2MiB random, and 4KiB random tests. That way SR readers could see how the consumer SSDs perform on those three sets of tests, both out of the box and at steady state.
I have not seen any other SSD review site show that sort of data, and I think many consumer SSD readers would be surprised by the differences between OOB and SS performance, with some consumer SSDs having a much larger performance degradation than others. It would be good to see this test applied to the SSDs I mentioned in a previous comment: OCZ Vertex 3, Crucial m4, Samsung 830, Corsair Performance Pro. Also the Intel 320, and the Intel 520 when it is released. I think those are the most common consumer SSDs that people might choose to use for an enterprise-like heavy workload. By the way, I notice that many of your graphs still have the axis label as "MB" instead of "MB / s".
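One way to make the "did it reach steady state?" judgment mechanical rather than visual is the SNIA PTS criterion, sketched here from my reading of the spec (over the last five rounds, data excursion within 20% of the window average and best-fit slope excursion within 10% of it; the function name and exact windowing are my own):

```python
def reached_steady_state(throughputs, window=5):
    """Rough sketch of the SNIA PTS steady-state check on a list of
    per-round throughput measurements (e.g., MB/s per round)."""
    if len(throughputs) < window:
        return False
    w = throughputs[-window:]
    avg = sum(w) / window
    # Data excursion: every round within 20% of the window average.
    if max(abs(y - avg) for y in w) > 0.20 * avg:
        return False
    # Slope excursion: least-squares slope across the window,
    # scaled to the window width, within 10% of the average.
    n = window
    x_mean = (n - 1) / 2
    slope = sum((x - x_mean) * (y - avg) for x, y in enumerate(w))
    slope /= sum((x - x_mean) ** 2 for x in range(n))
    return abs(slope * (n - 1)) <= 0.10 * avg

# A trace that has flattened out passes; one still falling does not.
print(reached_steady_state([220, 95, 88, 90, 87, 89, 91]))  # True
print(reached_steady_state([220, 180, 150, 120, 100]))      # False
```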
  10. If SR is looking for guidelines on how to do SS tests and pre-conditioning on SSDs, SNIA has already released a couple excellent documents for testing either consumer or enterprise SSDs: http://www.snia.org/tech_activities/standards/curr_standards/pts I would like to see a prominent review site like SR to be the first SSD review site to follow these industry-standard, best-practice guidelines, instead of arbitrary, hit-or-miss tests that most review sites tend towards now.
  11. The problem with Anand's HD Tach tests is that HD Tach writes highly compressible data. That led Anand to make (actually, he continues to make) erroneous claims that SandForce SSDs are much better than some others at sustained write performance. In reality, the SandForce SSDs are not especially good at sustained writes (as seen from SR's SS test with IOMeter and random data).
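The compressibility gap is easy to demonstrate with an ordinary compressor (zlib here merely stands in for whatever SandForce's proprietary in-line compression actually does; the sizes are illustrative):

```python
import os
import zlib

# A repeating pattern, like the data many benchmarks write,
# compresses to almost nothing; random data barely compresses at all.
pattern = b"\x00\xff" * (1 << 19)   # 1 MiB of a two-byte pattern
random_data = os.urandom(1 << 20)   # 1 MiB of random bytes

print(len(zlib.compress(pattern)))      # a few KiB
print(len(zlib.compress(random_data)))  # about 1 MiB
```

A controller that compresses in-line sees a far lighter flash write load in the first case, which is exactly why compressible-data benchmarks flatter SandForce drives.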
  12. Yes, the Intel 320 would be a good benchmark for the lower-end of the performance spectrum. Once the Intel 520 comes out, it could replace the Vertex 3 in the lineup (assuming it has a Sandforce controller).
  13. I agree. Another problem with too many SandForce drives is that they take up two slots (compressible and random data). I suggest the following SSDs be included for comparison with the SSD under test in the future: the OCZ Vertex 3, Corsair Performance Pro, Crucial m4, and Samsung 830. Those are all close competitors at the very top of the consumer performance spectrum, so they make good benchmarks against which to compare new SSDs. And it is a good selection of manufacturers, controllers (1 SandForce, 2 Marvell, 1 Samsung), and flash (IMFT, Toshiba toggle, Samsung).
  14. I'm coming to this discussion a little late, but aren't those two different SSDs? Certainly the label has one as being the Vertex 3, and the other the Vertex 3 Max IOPS, and those ARE two different SSDs. So either your label is wrong, or the poster is comparing two different models and expecting to get the same results...which is obviously a poor expectation. The Max IOPS should be faster (and it is, in SR's tests). Isn't that "the main difference"? Another poster (mike2h) already mentioned this, but no one else seems to have noticed so I thought I'd mention it again.
  15. Patriot Pyro SE Review (240GB) Discussion

    Interesting. So the Pyro SE at 17.8 MB/s is a little better than the Force 3 at 14 MB/s. But the Performance Pro is still the fastest of the consumer-branded SSDs at the SS write test, with 29 MB/s. By the way, I noticed a typo on all of your SS test graphs: the label for the write speed is "MB" instead of "MB / s".
  16. Corsair Force Series 3 Review Discussion

    Why does this review not show up in the list when I click on "STORAGE REVIEWS" from the main SR website?
  17. Patriot Pyro SE Review (240GB) Discussion

    It would be nice to see the steady state performance on the Pyro SE. I don't think SR has posted a review that shows the SS performance for a Sandforce-controlled SSD with synchronous flash. It would be interesting to see if it performs much better than the Corsair Force 3 (with asynchronous flash) on the SS test.
  18. Corsair Force Series 3 Review Discussion

    So QD=32 on the SS tests for all SSDs. Good to know, thanks.
  19. Corsair Force Series 3 Review Discussion

    But what is the actual queue depth used in IOMeter? Are you saying it is different for each SSD? If so, that is not good methodology. I suggest using the same value for all SSDs. If you want to saturate the throughput vs. QD relation for all SSDs, QD=16 would be sure to do it. Probably QD=8 would be sufficient, but QD=16 definitely would be.
  20. Corsair Force Series 3 Review Discussion

    The SS data is interesting. The F3 held up fairly well. Thanks for adding the Performance Pro. It looks very similar to the Plextor M2P on the SS test. Can you give the parameters of the SS test here (and in future articles)? I am guessing the queue depth (QD) must be at least 4, judging by the burst write speeds you give, since almost all SSDs seem to test at around 60 - 80 MB/s on your standard 4KiB random write test (which I assume has QD=1). Also, very relevant to the Corsair Force 3 is what type of data IOMeter is writing. Did you set it up for full random or pseudo-random? Full random would be best.
  21. OCZ Octane SSD Review (128GB) Discussion

    Thank you for including the steady-state 4KiB write results in this review. I hope SR will include this test in the future for ALL SSDs, since it is useful to know for both consumer and enterprise models. Also, it would be nice if you would add the results for the SS test to the recent Corsair Performance Pro and Plextor M3S reviews.
  22. I second h4lf. I thought SR said that you were going to be performing the steady-state write test on all SSDs in the future. I certainly want to see it for every SSD. It is a lot more interesting than your fake "real-world" tests that do not write real-world data (but rather easily compressible repeating patterns).
  23. Plextor PX-M3S SSD Review Discussion

    No doubt it is a good thing for reviewers to work with manufacturers to fix problems with their products. However, a reviewer's first responsibility should be to those who read the reviews. When a product does not function as it should, the reviewer should mention it in the review. It is not necessary to understand exactly why a product malfunctioned in order to explain the behavior that was observed during testing. I think most people would agree that the best way to handle such a situation is to explain (in the review) exactly what issue was observed, and (if applicable) mention that the reviewers are working with the manufacturer to explain and find a solution to the issue. Without such information, people reading the review could buy the product and have a bad experience that could have been avoided if the reviewer had been forthcoming in the review.
  24. Plextor PX-M3S SSD Review Discussion

    Thanks for explaining. From what you said, I do not see how you can say that it does not really count as a product failure. Every SSD should have the ability to be secure erased and continue to function with full capacity afterwards. If the HPA is erased in the process, the SSD should have the ability to restore it, either through firmware or external software. A secure erase should not change the SSD firmware, and it should not be required to flash the firmware after a secure erase. An SSD that cannot usefully be secure erased is a faulty product. Alternatively, an SSD with a firmware issue that prevents it from being usefully secure erased is a buggy product.
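For reference, the usual Linux route to an ATA secure erase is hdparm; the device name and password below are placeholders, and the commands are destructive, so treat this only as a sketch:

```shell
# WARNING: erases all data on the drive. /dev/sdX is a placeholder.
# 1. Check that the drive's security state reports "not frozen":
hdparm -I /dev/sdX
# 2. Set a temporary user password to enable the security feature set:
hdparm --user-master u --security-set-pass p /dev/sdX
# 3. Issue the ATA SECURITY ERASE UNIT command:
hdparm --user-master u --security-erase p /dev/sdX
```

On a drive with working firmware, this procedure should complete and leave the SSD usable at full capacity; a drive that cannot come back from it is, as argued above, faulty.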