Sparky

Member
  • Content Count: 71

Everything posted by Sparky

  1. Good morning everyone. I've just taken delivery of some new Dell-EMC Unity 450F arrays (in part due to the comprehensive review here: https://www.storagereview.com/dell_emc_unity_450f_allflash_storage_review), and am looking to configure networking to a pair of Cisco Nexus 9000 series switches, configured with a vPC peer link. So far, this is what I've got configured:

     Unity port A0 - Cisco Nexus 1 port 1, VLAN 100
     Unity port A1 - Cisco Nexus 2 port 1, VLAN 110
     Unity port B0 - Cisco Nexus 1 port 2, VLAN 100
     Unity port B1 - Cisco Nexus 2 port 2, VLAN 110

     This is as per the attached diagram. All ports are 10GbE optical SFP+ (we have SFP+ modules in both the onboard CNA ports and a 4-port I/O module), and each port on the Nexus is configured as an iSCSI interface (4 in total). At least for the time being, all connectivity will be via block/iSCSI. A few follow-up questions on the above:

     Is this actually a valid configuration?
     Should I be looking to have the separate VLANs across both switches (i.e., A0 to Nexus 1 via VLAN 100, and B0 to Nexus 2 via VLAN 100)?
     Is it accurate that LACP and FSN cannot be configured for block iSCSI access?
     Would it be wise to utilize both the onboard CNA ports and I/O module ports for each controller (say, one of each) to provide extra redundancy?

     Thanks in advance for any assistance with this.
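
     (A quick illustration of how the path count works out for the layout above. This is only a sketch: the subnet addressing, VLAN-to-subnet mapping and names below are invented, and it assumes one host NIC per iSCSI VLAN with MPIO rather than LACP providing the redundancy.)

     ```python
     # Illustrative only: counts usable iSCSI paths for the posted layout.
     # Addresses and names are invented; a path exists when an initiator NIC
     # and a Unity SP port sit on the same subnet/VLAN.
     import ipaddress

     def usable_paths(host_nics, sp_ports):
         paths = []
         for nic, nic_ip in host_nics.items():
             for port, port_ip in sp_ports.items():
                 if ipaddress.ip_interface(nic_ip).network == ipaddress.ip_interface(port_ip).network:
                     paths.append((nic, port))
         return paths

     # Posted layout: VLAN 100 only on Nexus 1, VLAN 110 only on Nexus 2.
     host_nics = {"host-nic1 (VLAN100, Nexus1)": "10.0.100.11/24",
                  "host-nic2 (VLAN110, Nexus2)": "10.0.110.11/24"}
     sp_ports = {"A0 (VLAN100, Nexus1)": "10.0.100.1/24",
                 "A1 (VLAN110, Nexus2)": "10.0.110.1/24",
                 "B0 (VLAN100, Nexus1)": "10.0.100.2/24",
                 "B1 (VLAN110, Nexus2)": "10.0.110.2/24"}

     for path in usable_paths(host_nics, sp_ports):
         print(path)
     # 4 paths: each host NIC reaches one port on SP A and one on SP B, so losing
     # either switch (or either SP) still leaves a live path to surviving ports.
     ```

     Repeating the count for the alternative layout in the question (each VLAN present on both switches) also gives four paths; the more important checks are that each SP stays reachable after a switch failure and that each host NIC carries only one of the two iSCSI subnets.
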
  2. Hello everyone! My organization is currently in the process of investigating options for replacing our current primary storage platform. Details of the current setup:

     2 * Dell-EMC VNX5200 appliances in our production DC
     A 3rd VNX5200 in our DR DC (located in a separate country from the prod DC)
     RecoverPoint appliances for a modest amount of block-level replication between appliances
     90-95% of storage presented to the VMware vSphere environment, with the rest presented to physical Oracle DB servers
     All storage currently presented via FC/block

     Some of the requirements:

     Increase in performance over the existing platform
     Able to handle a wide range of workloads and IO requirements
     Move to iSCSI
     Replication at the appliance level (i.e., migrate away from the RecoverPoint appliances)
     Better file support – it would be nice to present more of our file shares natively at the appliance level (not a must, but nice-to-have)
     Better ability to leverage storage snapshots (i.e., Veeam Storage Snapshot backups)

     We are currently looking at two vendors/models: the Dell-EMC Unity range (most likely a hybrid Unity 400), and the NetApp FAS range (most likely the FAS2650 range). Does anyone have any thoughts or suggestions on the above? Are you going through a similar process yourself? Are you using either vendor in your current environment? Would you have any other alternative suggestions? Thanks in advance for any help or advice on the above.
  3. Those drives both look perfect, thanks for the recommendation, and I'll keep an eye out for the WD review! Edit: the review is up! http://www.storagereview.com/wd_my_book_pro_review
  4. Hello everyone! I'm looking for recommendations for a good quality external drive. USB 3 is a must, as is RAID 1. Thunderbolt, FireWire or any other interfaces are a bonus. I don't seem to be able to find a good selection of suitable devices out there, and would really appreciate some recommendations. Thanks in advance for any help with this.
  5. Thanks for the replies, and apologies for the delay in getting back to you to clarify my requirements:

     Hands-on isn't necessarily a terrible thing, but on balance a pre-packaged option would probably be best.
     Cost isn't a primary consideration. These will be to replace drives that have recently caused problems and even failed, losing data - so quality and reliability are important.
     Capacity wise, something around the 2-4TB mark should suffice. I was thinking 2 * 5.25 inch bays in a simple RAID 1 setup would cover this adequately.
     Good performance would be a plus, as there will be some video editing involved. Moving into the realms of caching drives might be a little overkill, but is worth considering.
  6. Good morning everyone! I currently have a few TB of data sitting on my home desktop with little in the way of backups, redundancy or availability. I am looking at purchasing a NAS to fix all of the above. The data is a mixture of Office documents, pictures, music, videos, and other random file data. Some questions on NAS and drive selection:

     From a brief look around the various reviews and manufacturer sites, it seems the DS215+ would be a good choice of NAS. Would anyone have any other recommendations (maybe the DS713+ or something from QNAP)?
     Does SHR (Synology Hybrid RAID) allow the creation of volumes on a NAS with different RAID protection levels? Some of the data on the NAS will need drive redundancy, however other data is not so critical and I am more than happy to store it with no protection.
     With regards to drives, I am looking at WD Reds (4-5TB); would anyone consider WD Red Pros worthwhile for the above setup?
     My wireless router only has a single gigabit port, plus multiple 100 Mb ports. Would connecting the NAS to two 100 Mb ports have much effect on performance?

     Thanks in advance for any assistance with these questions!
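
     (Some rough numbers behind the last question, as a sketch only - real SMB/NFS throughput will be lower than these raw line rates, and the exact behaviour depends on the router.)

     ```python
     # Back-of-envelope line rates, ignoring protocol overhead.
     def mb_per_s(megabits_per_s):
         return megabits_per_s / 8.0

     print("100 Mb port:   ~%.1f MB/s" % mb_per_s(100))    # ~12.5 MB/s
     print("Gigabit port:  ~%.1f MB/s" % mb_per_s(1000))   # ~125 MB/s

     # Two 100 Mb ports only add up if both the NAS and the router support link
     # aggregation, and even then a single file copy normally rides one link,
     # so a single transfer would still be capped around 12.5 MB/s - roughly a
     # tenth of what one gigabit port could give a modern NAS and drives.
     ```
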
  7. Good afternoon everyone! We run a number of mission critical databases on a DL460 G6 blade with P4400 EVA storage. This setup is mirrored across two sites using an application called CA ARCserve Replication. This whole setup is starting to show its age and we are looking at replacement options. We have our eyes on some HP DL560 G8s (in a similar mirrored setup to the above) and have got specs for two E5-4610 processors and 128 GB of memory per server. However, we are a little unsure on storage options. The server itself comes with 5 bays and the option of a 25-drive external enclosure. There is a raft of drive options available, from SSDs, FATA, 7.2/10/15K SAS and so on. For the on-board storage we need to cater for the following:

     OS (Windows 2008 R2) + SQL Server (SQL Server 2005 SP4): maximum of 100 GB
     TempDB: around 50 GB
     Logs: 50 GB
     Pagefile: 128-192 GB
     ARCserve spool drive: 0.5 – 1 TB

     We are currently looking at the following:

     2 * 200 GB 6G SSD in RAID 1 for OS plus either TempDB or logs
     2 * 200 GB 6G SSD in RAID 1 for pagefile and, if there is room, either TempDB or logs
     3 * 10K 600 GB SAS in RAID 5 (using the external bay) for the spool drive and either TempDB or logs (if still required from the SSDs)

     Some questions regarding this setup:

     With 128 GB of memory, these servers shouldn’t need to do much paging. However, I believe at the very least we need a 128 GB pagefile for crash dumps, and I know that some applications (such as Exchange) still need to page, even on servers with large amounts of memory. Do we still need to follow the 1.5 ratio of pagefile to physical memory (i.e., is 128 GB enough, or will we need 192 GB)? Does the pagefile need to go on SSD? Does having the pagefile on RAID 6 cause issues?
     If for whatever reason we are only able to fit one of TempDB or logs onto the SSDs, which would benefit more from the extra performance, and which would be just as happy on 10K RAID 5 SAS?
     Does anyone know enough about ARCserve Replication to know what sort of performance requirements we would need from the spool drives? Would they benefit from SSD, or would 10K RAID 6 SAS be enough?
     Is there anything else we have missed or could do differently?

     Thanks in advance for any help with this!
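
     (On the pagefile question, a rough sizing sketch. The figures below assume the old 1.5x guideline plus the rule of thumb that a complete memory dump needs a pagefile of at least RAM size on the boot volume; kernel-only dumps need far less. Treat them as illustrative, not vendor guidance.)

     ```python
     # Rough pagefile sizing for a 128 GB server. Assumptions are spelled out
     # in the comments; treat the output as a starting point only.
     ram_gb = 128

     legacy_1_5x = 1.5 * ram_gb      # the old 1.5x-RAM guideline: 192 GB
     complete_dump = ram_gb + 1      # complete memory dump: ~RAM plus a little headroom
     kernel_dump_guess = 32          # kernel-only dump: far smaller (varies by system)

     print(f"1.5x guideline:         {legacy_1_5x:.0f} GB")
     print(f"Complete-dump minimum:  ~{complete_dump} GB on the boot volume")
     print(f"Kernel-dump ballpark:   ~{kernel_dump_guess} GB (assumption)")
     ```

     If complete crash dumps aren't genuinely needed, switching to kernel dumps usually lets the pagefile sit well below RAM size, which changes the SSD-versus-SAS question considerably.
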
  8. Thanks for the replies! To clarify a few points: Our current setup entails purchasing the extra drive bay. If we do this, short term we won't be using it for any other drives. Longer term, if we notice good gains with the performance of this server, we could look at moving some other databases to it. Budget wise, we have a healthy amount to play with - although obviously we won't want to throw money at the problem, and if we can get the same results with cheaper hardware, that would be preferable. Datacentre wise, we have ample room in our cabinets, although again, we obviously wouldn't want to be putting things in for the sake of it.
  9. Hey there everyone! I recently purchased a Dell XPS 14 Ultrabook, which I am absolutely delighted with. It comes fitted with a 32 GB Samsung PM830 mSATA drive and a 500 GB Hitachi 5K500 SATA drive. Everything also seems to be installed on the 5K500, which seemed a little bit odd. I thought it might be a good idea to upgrade the mSATA drive, and looking at the reviews the Plextor M5M seemed like a good option, so I ordered one. However, a couple of mistaken orders later and I now have a 128 GB Plextor M5M and a 256 GB Plextor M5S, and I am unsure what to do with them all! They are also both unopened, so I have the option to send either or both back if needed. Storage space on the laptop (and in turn the 5K500) would be nice, but ultimately speed and power consumption matter far more. So, a few questions:

     - The M5M seems like one of the best mSATA drives on the market right now; am I right in thinking I am not going to notice much of a performance hit from using it over a "regular" 2.5" SATA SSD?
     - How much of a performance gain and power consumption hit will I get from using two drives (mSATA SSD and either a 2.5" SSD or HDD)?
     - The Samsung PM830 seems like a half decent drive, albeit small in capacity. Is there anything creative I could use it for - say using it as an OS partition and putting Program Files and user data on whatever drive I put in the SATA bay - or should I just sell it and stick with either the M5M, or even a single SATA SSD solution?
     - The M5S reviews don't seem to be great. If I am going to use a 2.5" SSD (if I don't find a use for it with the laptop, I can always slap it in my desktop), would I be better off with a better model - or will it be hard to notice any difference?

     Thanks in advance for any help with this!
  10. This is on Windows, yes (XP SP3 to be specific). It also assumes that when a user clicks on an attachment, they select Open as opposed to "Save As" (which would correctly specify their default save location).
  11. I have a bit of a problem regarding editing and saving of files opened from emails. When a file, such as an Office document, is opened from an email attachment, the file is stored in a Temporary Internet Files directory. If the user makes changes to the document and saves those changes (as opposed to using Save As), the document remains in the Temporary Internet Files directory. Now, I know in part this can be solved by educating users – but it's still a pretty silly "bug". Is there any way of resolving it? Thanks in advance for any help or advice!
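
      (If the mail client involved is Outlook - the post doesn't say - the folder those attachments land in is controlled by the OutlookSecureTempFolder registry value, and one common workaround is to point that value at a visible folder so "saved" edits are at least easy to find. A small sketch to check where it currently points; the Office version numbers are probed rather than assumed.)

      ```python
      # Reads where Outlook extracts opened attachments for the current user.
      # Assumes Outlook is the mail client; read-only, changes nothing.
      import winreg

      OFFICE_VERSIONS = ["11.0", "12.0", "14.0", "15.0", "16.0"]  # Outlook 2003 .. 2016+

      def outlook_secure_temp_folder():
          for ver in OFFICE_VERSIONS:
              key_path = r"Software\Microsoft\Office\%s\Outlook\Security" % ver
              try:
                  with winreg.OpenKey(winreg.HKEY_CURRENT_USER, key_path) as key:
                      folder, _ = winreg.QueryValueEx(key, "OutlookSecureTempFolder")
                      return ver, folder
              except OSError:
                  continue
          return None, None

      ver, folder = outlook_secure_temp_folder()
      if folder:
          print("Office %s opens attachments from: %s" % (ver, folder))
      else:
          print("OutlookSecureTempFolder not set for this user (or Outlook not installed)")
      ```
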
  12. I’ve recently purchased and built a new PC that I am having some serious and strange problems with. It's an Asus P5E3 with a Q6600, 4 GB of DDR3 memory and numerous hard drives. I used a fresh installation of Windows for the machine. A day or so after the fresh installation, I started experiencing problems with one of the drives on the machine (a 1 TB SATA Samsung). I was getting the following errors:

      - An error was detected on device \Device\Harddisk8\D during a paging operation. (warning, source: disk, code 51)
      - The system failed to flush data to the transaction log. Corruption may occur. (warning, source: ftdisk, code 57)
      - {Delayed Write Failed} Windows was unable to save all the data for the file . The data has been lost. This error may be caused by a failure of your computer hardware or network connection. Please try to save this file elsewhere. (warning, source: ntfs, code 50)

      This happened many, many times – and was presumed to be the fault of the hard drive itself. I struggled to get data off the drive and backed up what I could to other locations on my computer. Just when I had started to give up hope, I tried the drive in an external USB enclosure. Lo and behold, it worked fine – and I was able to salvage all the data from it. Everything was fine and dandy for a while. Then, yesterday evening a 400 GB ATA Maxtor drive started to experience the same issues. Numerous code 51s, 57s and 50s. As before, the drive seemed to function normally in an external enclosure. Last night was the last straw. One of my new 1.5 TB SATA drives displayed the same issues, with the event viewer flooded by another deluge of warnings and errors. I also noticed these errors (they might very well have happened during the previous faults too):

      - The driver detected a controller error on \Device\Harddisk2\D (error, source: disk, code 11)
      - A parity error was detected on \Device\Ide\IdePort1 (error, source: atapi, code 5)

      Thanks in advance to anyone that can shed any light on these issues.
  13. Sparky

    Some very strange goings on…

    I really don’t think it's hard drive related. The problem started with drives that had been used extensively in previous machines without fault. Now it's giving me a BSOD when I try the Windows installation routine without any hard drives attached at all. I get no SMART errors on any of the drives during start-up. Not sure if there is any way I can give you more detailed reports from the BIOS; I certainly cannot do it from Windows right now (having problems even getting into safe mode).
  14. Sparky

    Some very strange goings on…

    I've tried a new power supply, as well as a new GFX card. Neither helped the problem - which has now escalated. I was working on the machine earlier and started getting the same errors and problems. It BSOD'ed and restarted the machine. About 3 seconds after seeing the Windows startup logo, I get the same BSOD. This happens with any number or combination of hard drives installed. I've tried reinstalling Windows, but that throws up another BSOD at the point where it tries to start the installation process. It even does that with no hard drives installed (just the optical drive). Is my motherboard broken? Does the world just hate me? Any advice is much appreciated here - I'm about ready to throw this PC out of the window.
  15. Sparky

    Some very strange goings on…

    The PSU is unchanged from my previous setup. It's a good quality unit, and one I have not had problems with before. Nonetheless, I have tried different power connectors - along with different SATA leads - and it appears that the problem persists and is independent of cables.
  16. Hi there everyone. I could use some advice on backup solutions for our work server. The server currently runs 3 * 73 GB SCSI drives in RAID 5, with one 20 GB or so partition for Windows and a second partition for use as a general filestore. We use a DAT24 tape drive to back up the files from the server (SQL database, pictures, presentations, and general office documents) on a nightly basis, keeping backups off site and rotating through the tapes on a fortnightly basis. This solution has worked very well for us, with the tapes stored off site and multiple backups giving us the ability to restore older files - but we have reached a point where the 24 GB tapes cannot hold all the information from the server. Now, we have considered splitting the backups over multiple tapes, or backing up certain pieces of data on different evenings, but I think it would be a much better idea to just purchase a new backup solution. Size wise, I guess we would be looking for something at least 50 GB in capacity. It would be nice to get something large enough that we would require an upgrade to the server before we needed to upgrade the backup device (i.e., something around the 100 GB mark). Are tape drives still a good solution? If so, what type/make/format should we be looking for? If not, what are the alternatives? Thanks in advance for any help with this matter.
  17. Hi there everyone. A friend of mine is having a problem logging into his Windows XP PC. I am under the impression that the problem started after a power cut stopped the PC for an hour or so. Around 5 seconds after trying to log in, Windows automatically logs you out of the machine. I have tried logging in with various accounts (including the administrative account), starting the machine in safe mode, and last known good configuration. Now, other than just a format and reinstall (which I really CBA to do), any ideas as to how I can get this machine working? Thanks in advance for any help with this.
  18. Sparky

    Printing problems

    Hi there everyone. I have a slight problem here at work with one of our printers. The unit in question is an HP 3550N that is connected to our network via a JetDirect box. All of the workstations in the department are set up to use the printer via its IP address. Over the last few weeks, we have found that any document sent to print from one of our laptops comes out in multiple copies - and usually does not stop until someone resets the printer. This does not seem to happen with any of our desktops. I have tried re-installing the printer on the laptops, but to no avail. Any suggestions on solutions to the problem would be greatly appreciated.
  19. Hi there. This is basically a repost of a problem I had in this thread here. Basically, here at work everyone has a shared area on our server for storing their files and work. The folders are all in the format "$username$ public". Our users have full rights over all the files and documents in these folders, while the folders themselves are set up so that the users cannot delete or rename them. This is all well and good. When someone tries to delete one of the public folders, an error message is generated telling the user that they don't have permission to do so. BUT! All of the files inside the folder itself are deleted in the process. I hope that makes sense? Now, I know this is logical, as the users do actually have rights over these files. Is there any way to set up the folders so that, as before, users cannot rename, create or delete public folders, but when they actually try to delete a public folder, the files inside it remain intact? Thanks in advance for any help.
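
      (The behaviour comes from NTFS treating "delete this folder" and "delete the things inside it" as separate rights: the ACL on the folder object blocks deleting or renaming the folder itself, but the users' rights over the contents let Explorer delete every file during the attempt. A sketch of that split using icacls on a current Windows box - Windows 2000 itself only has cacls - with the share path and "Staff" group as placeholders.)

      ```python
      # Sketch of the permission split with icacls (modern Windows), run as admin.
      # Path and group name are placeholders, not the real environment.
      import subprocess

      FOLDER = r"D:\Shares\Public\Fred Bloggs public"

      def icacls(*args):
          subprocess.run(["icacls", FOLDER, *args], check=True)

      icacls("/inheritance:r")                 # make this folder's ACL explicit

      # The folder object itself: read/list only, so it cannot be deleted or renamed.
      icacls("/grant", "Staff:(RX)")

      # The contents (inherit-only, so it never applies to the folder object):
      # Modify, which includes Delete - exactly why the files vanish when a
      # folder delete is attempted and then "fails".
      icacls("/grant", "Staff:(OI)(CI)(IO)M")

      # The only ACL-level way to keep the contents intact is to remove Delete
      # from that inherited entry as well (e.g. read/write/create instead of
      # Modify), at the cost of users no longer being able to delete or replace
      # files in each other's public folders.
      ```
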
  20. Sparky

    Permissions of shared folders

    Hi there Trinary, thanks very much for the reply. I don't mind users deleting their own files; if someone goes into a shared folder and deletes everything, then that's OK. I just want to prevent people from doing so at the level of the shared folder itself. We are quite a small company; only 30 users have accounts in the shared area. Most people know most other people quite well, and sharing of work is quite commonplace. All of the users also have a private folder - "$username$ private" - that only they and the admin have access to. That said, I have been thinking about the kind of idea you mentioned above, maybe allowing users to create new files in someone's area (so if you work on one of their files, you could just save it as a new, renamed file) or something along those lines. Either way, I really want to be able to stop users from deleting all their work at the shared folder level, while still giving them full access to everything inside the folder itself. Is this possible in any way? Or am I asking for too much?
  21. Created txt files for each of the connection settings. Created two batch files to load the settings in a minimized window. Placed shortcuts to both batch files in a little toolbar on the desktop. Seems to work great, thanks for the advice!
  22. Hi there everyone. Our work network is spread across two sites, both on different subnets. We have a laptop user who not only requires network access at both sites, but at home too. Three different network configurations on the one laptop. I was under the impression that I could just create a different hardware profile for each site. When the user logged on, he would select a hardware profile corresponding to the site he was working at, and in that hardware profile the particular IP and gateway addresses could be specified. This doesn’t seem to work. The hardware profile can differentiate between whether a connection is disabled or not, but any change you make to a specific connection, from changing addresses to renaming the connection itself, is universal across all hardware profiles. Is there a way around this? Or are hardware profiles not the answer to my problem? Thanks in advance for any help!
  23. Thanks for the replies everyone. Unfortunately, for many reasons, DHCP is not an option. I have tried adding both IP and gateway addresses to the network settings, but this was unsurprisingly ineffective. Pedrito - this sounds like an interesting idea. Once I have the different network settings saved, how would I go about implementing them? Could this be done on startup, or would I have to, say, create a toolbar with shortcuts allowing the user to load the different settings? Thanks again for the help so far.
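
      (Assuming Pedrito's suggestion is the usual netsh trick: dump each working configuration once with "netsh -c interface dump > site-a.txt", then reapply it on demand with "netsh -f site-a.txt". A plain batch file per site does the job; the sketch below is just the same idea wrapped up, with made-up file names, and it needs to run as a user allowed to change IP settings.)

      ```python
      # Reapplies a previously saved netsh configuration script. The profile
      # names and file locations below are placeholders.
      import subprocess
      import sys

      PROFILES = {
          "site-a": r"C:\netconfig\site-a.txt",  # created with: netsh -c interface dump > site-a.txt
          "site-b": r"C:\netconfig\site-b.txt",
          "home":   r"C:\netconfig\home.txt",
      }

      def apply_profile(name):
          subprocess.run(["netsh", "-f", PROFILES[name]], check=True)
          print("Applied network profile:", name)

      if __name__ == "__main__":
          apply_profile(sys.argv[1] if len(sys.argv) > 1 else "site-a")
      ```
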
  24. Sparky

    Setup of shared folders

    I would be the first to say that I might be being just a tad fussy here. It’s certainly not a problem if I cannot do this. I also understand that at some point, no amount of foresight can protect a user from their own stupidity. If Fred Bloggs decides he actually wants to go into his own private folder, highlight everything and delete it - that’s fine, I don’t mind that. All I want to do is prevent those files from being deleted when Fred attempts to delete the private folder itself. This, as discussed, is something Fred will not be able to do, but in attempting and failing to do so, the actual contents of the folder will all be deleted. I obviously realise that this is what should happen, as again, although Fred doesn’t have delete rights to the folder itself, he can do anything he likes to the contents of the folder. I was under the impression that, with the plethora of options available in the permissions section, this might be possible.

    There are quite a few reasons why I would like to achieve this. First of all, if someone (for whatever reason) tried to delete someone else’s public folder, they would be greeted by the usual message saying that they were not able to do so. The user would probably take this message to mean that their actions had no effect on the folder, and would probably think no more about it. When in actual fact, the entire contents of the folder would have been deleted and would have to be restored from the backup tapes. Even more so, if a user selected all 30 or so public folders and, again for whatever reason, attempted to delete them, we would lose the contents of all 30 folders. The argument of only being able to protect users so much from their own actions comes into play again here, but this seems like far too easy a way to delete tens of gigabytes of users' data. Thanks again for all the help with this.
  25. Hello there everyone, I could use some help with setting up some space on the server at my work for members of staff to save their work on. I thought that instead of posting the question "how do I go about doing it?" I would have a go myself, and look for some pointers and guidance.

      Currently, members of staff store all their work in a single folder that is shared across the network, with no structure as to how and where people save their files, let alone security and permission settings. The work stored on the server is divided up into work that is reasonably private and should only be viewable by its owner, and some more work that, although not particularly sensitive, does not need to be shared with other people. We also have some work that needs to be shared with particular groups of staff, as well as work done by individuals that needs to be shared with everyone. We have a single server, running Windows 2000 Server, being used as our Active Directory store, an SQL server and network storage. We have about 40 staff accounts in Active Directory, about 25 of which require space on the server to save their files.

      I started by creating a global security group "Server Storage", to which I add the members of staff who require space on the server to store files. I created a folder on a non-boot partition on the server named "Server Storage" for all the shared folders and work to be stored under. I set this folder to be shared, with no user limit and no comment. I removed the "Everyone" group from the permissions, added the "Server Storage" group and gave them full access. In this "Server Storage" folder, I created three folders: Public, Private and Shared. On all of these folders, I first disabled the option "allow inherited permissions from parent to propagate to this object". I then took away the permissions for "Everyone", added the "Server Storage" group and gave them read-only permissions. I granted full permissions to the Administrator (or else I wouldn't have been able to do much more!).

      In the "Private" folder, I created a folder for each member of staff in the "Server Storage" group. I selected all of the folders, again disabled inherited permissions, removed the "Everyone" group and granted full permissions to the Administrator. I then proceeded with each of these folders in turn to grant the individual user full access to their own folder. One folder per member of staff, each member of staff only able to access their own folder, and the Administrator with access to everything.

      In the "Public" folder, I did pretty much the same thing, creating folders for each member of staff, selecting them all to disable inherited permissions and remove the "Everyone" group. I then just added the "Server Storage" group and granted them full permissions to all the folders. One folder per member of staff, each member of staff having access to all the folders.

      Finally, I have created a handful of folders in the "Shared" directory, to which I have granted access to various selections (and sometimes all) of the members of staff - some full access, some read-only access.

      The general idea is that each member of staff has a folder to save work which is private, a folder to save work which is public and viewable by any member of staff, and an area in which various work can be viewed and shared with other colleagues. Users can only create and manipulate files and folders in the areas they have access to – so members of staff cannot create new files or folders directly under the Server Storage, Public, Private and Shared folders; space for a new member of staff or a new shared folder has to be created by an administrator. The folders in both Public and Private are named after their owner, using a full name, e.g. "Fred Bloggs" or "John Doe". I plan to get people to access these folders by mapping the Public, Private and Shared directories as network drives.

      I hope that all makes some sense! I could really use some ideas, comments or criticisms on anything I might have done wrong, or ways I could make this better. For example, is it a problem that each member of staff has two folders in different areas that have the same name? Should I change the format of one of these folders? They currently both look like this:

      John Doe on \\Server.\Server Storage\Private
      John Doe on \\Server.\Server Storage\Public

      Would it be an idea to change these names? Should I be using the member of staff's actual network account name? Are the actual names of the root directories and security groups suitable? Any advice, feedback or hints with this would be really greatly appreciated!
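
      (The per-user folder and permission work described above is repetitive enough to be worth scripting. Below is only an outline of the idea using icacls on a current Windows server - Windows 2000 itself ships cacls rather than icacls - and the domain, group name, root path and staff list are all placeholders.)

      ```python
      # Outline of the per-user folder layout described above, scripted with
      # icacls on a modern Windows server. All names and paths are placeholders.
      import os
      import subprocess

      ROOT = r"D:\Server Storage"
      GROUP = r"DOMAIN\Server Storage"
      ADMIN = r"DOMAIN\Administrator"
      STAFF = ["Fred Bloggs", "John Doe"]   # display names; real account names will differ

      def icacls(path, *args):
          subprocess.run(["icacls", path, *args], check=True)

      for area in ("Private", "Public", "Shared"):
          top = os.path.join(ROOT, area)
          os.makedirs(top, exist_ok=True)
          icacls(top, "/inheritance:r")                # stop inheriting from the root
          icacls(top, "/grant", ADMIN + ":(OI)(CI)F")  # admin keeps full control
          icacls(top, "/grant", GROUP + ":(RX)")       # group can browse, not change, the top level

      for person in STAFF:
          private = os.path.join(ROOT, "Private", person)
          public = os.path.join(ROOT, "Public", person)
          for folder in (private, public):
              os.makedirs(folder, exist_ok=True)
              icacls(folder, "/inheritance:r")
              icacls(folder, "/grant", ADMIN + ":(OI)(CI)F")
          # owner-only access to the private folder; the whole group to the public one
          icacls(private, "/grant", "DOMAIN\\%s:(OI)(CI)F" % person)
          icacls(public, "/grant", GROUP + ":(OI)(CI)M")
      ```
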