jehh

Switching to Windows from Linux - A Linux user goes back


This is a very good article written by someone who has obviously spent their fair share of time with Linux.

Having run Linux in various flavors myself, and every version of Windows since 2.0/286, I can say it is fair and balanced.

He points out (correctly) that Windows is a great desktop OS and that Linux is a great server OS.

To quote one example:

The good: I recently borrowed a digital camera from a mate at work, to take photos of my case mod. Imagine how happy I was when I plugged it into my nearest USB port, and it was automatically configured (as a SCSI device) and mounted! SuSE even added it to my /etc/fstab file so that it always automounted when plugged in. I was very impressed.

The bad: Along came my new IDE CDRW drive. At AU$99, I couldn't pass up the purchase. Plugging it in gave me no joy. I was very disappointed that a device so common couldn't be detected and automatically configured under a modern operating system. The instructions on the SuSE support site said to add lines to lilo.conf and reboot. While this is a perfectly acceptable way to get hardware working for a geek familiar with *NIX, I believe that a home user shouldn't have to do more than plug it in. It's an IDE device, it's not that complicated!

This is a perfect example of one basic design flaw in Linux. Nothing they do, short of a massive redesign of the kernal, will ever really solve the hardware driver issue.

Things like IDE CD-RW drives should simply plug in and work, no fuss, no muss...

Before Windows 2000/XP came out, one of the big points of Linux was stability. As quoted here:

I can't comment on the Windows using community yet. I've not yet had a problem that a simple point and click couldn't fix. However, I will say that my original concern with Windows '95 has been addressed in Windows XP. The stability is finally there.

Stability was the really big thorn in Windows' side, and for all intents and purposes, it is a non-issue in 2000/XP. I haven't had a system crash in so long, I'm starting to forget what they look like. :P

I'm interested to hear what you guys think of this, both good and bad.

Jason

I'm interested to hear what you guys think of this, both good and bad.

It sounds to me, Jason, like you are just trying to get Sivar and me into a fight. :D

All I will add at this time is the email I sent the poster a short while ago:

"Flames will be automatically sent to the Windows equivalent of /dev/null, once I find where that actually is."

It's \Device\Null in the NT kernel namespace.

There is a symbolic link named NUL in the Win32 namespace that points to \Device\Null.

For example, dir > nul sends the output of dir to \Device\Null.

The reason for the two-tiered null is that NT/XP was designed from the beginning to support multiple types of programs. POSIX, OS/2, DOS, and Win32 programs were all designed to run in their own subsystems, on top of a single kernel. This is simply a manifestation of that structure.
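
For the curious, here is the same idea from code rather than the command line: a minimal C sketch (the NUL alias is reachable from any directory, just like /dev/null on a *nix box):

/* Minimal sketch: reaching \Device\Null through its Win32 alias.
 * fopen("NUL", "w") works from any directory; no path is needed. */
#include <stdio.h>

int main(void)
{
    FILE *null_dev = fopen("NUL", "w");  /* Win32 alias for \Device\Null */
    if (null_dev == NULL) {
        perror("fopen");
        return 1;
    }
    fputs("flames go here\n", null_dev); /* discarded by the kernel */
    fclose(null_dev);
    return 0;
}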

It sounds to me, Jason, like you are just trying to get Sivar and me into a fight. :D

:)

Not at all; I wanted to hear some honest, non-flaming replies to the topic.

Basically, my take is this:

Linux has more options for control; the user is given the ability to change and alter just about anything. It is very stable overall, and it provides a very quick, very secure OS.

Windows has far fewer options for control, which is a bad thing for tech junkies and a good thing for everyone else. Windows is now stable, so that isn't an issue anymore. Computers all come with Windows installed, which is both good and bad. Just about any program someone buys from a retail store these days will install without any problems: just put in the CD, click Install, and it works.

Linux is a much better server OS: much more secure and much faster on cheaper hardware.

Windows is a better desktop OS for the 95% of the computer users who neither know nor care what a "kernel", a "driver", or a "command line" actually is.

For those users, Windows makes far more sense, which is why Linux will never really challenge Windows in the desktop space. Linux is written for and by tech junkies; Windows is written for the lowest common denominator, the average Joe.

Linux is technically superior, but as Microsoft has proven over and over, technical superiority is not the name of this game. Just compare DOS and CP/M back in the day and ask yourself why DOS won out...

Jason


I agree with most of the article, but I do disagree slightly about needing a complete redesign of the kernel for hardware support. Where the author is incorrect: yes, many power Linux users do compile their own monolithic custom kernels; however, there is no reason why modular kernels, either default or custom-compiled, couldn't be used.

For instance, in Red Hat there is no need to compile a new kernel when adding hardware unless you want to. Red Hat will load the proper kernel modules needed, and that's all there is to it. Need something that's not in the stock Linux or Red Hat kernels? Compile just the module, run depmod -a, and you're all set. True, it's not quite as simple as running an executable and rebooting, but it's hardly the hell that the author claims.

Where the author does hit the mark though is on device support. While mainstream hardware is pretty simple to get going in modern distros, some older hardware and non-mainstream/generic devices can be a real pain.

Windows does do a better job with old hardware. Case in point: an old non-PnP SB16 card I set up a few weeks ago. The config jumpers were not silkscreened well, so I had no idea what the current settings for the card were. Popping it in a Windows machine and running the Add Hardware wizard got it running after a long detection process. I was then able to use Device Manager to find the info I needed to use the card on a Slackware machine. A few parameters had to be passed to the sound modules when they were loaded, so I whipped up a simple little script to start/stop/restart sound. Obviously not terribly difficult, but still nowhere near as easy as Windows.

The other part of Linux hardware: very few hardware vendors write Linux drivers, though more are appearing every day. Most device drivers are written by end users, so if your card isn't terribly popular, you'll probably have a harder time finding drivers. I can't see this changing unless Linux gains enough market share to require vendors to support it, so this aspect of Linux hardware support will probably continue for a while.

In the end I agree: Linux for servers, Windows for desktops. However, I might consider a Linux desktop for older hardware, or as a low-cost replacement for Windows. I was using a Slackware system for a week or so while on vacation, and the experience was actually rather enjoyable.

-Chris

I agree with most of the article, but I do disagree slightly about needing a complete redesign of the kernel for hardware support. Where the author is incorrect: yes, many power Linux users do compile their own monolithic custom kernels; however, there is no reason why modular kernels, either default or custom-compiled, couldn't be used.

The problem with drivers is not really the need to recompile the entire kernal; it is that drivers are not always compatible across minor kernal versions.

What works in one place doesn't work everywhere; that unlimited user power causes these kinds of problems. Modules do work just fine in most cases, so long as the driver was compiled for that version of the kernal.

If it wasn't, you need the source to do it right, and not all drivers come with source.
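
To make that concrete, here is what even a trivial 2.4-style module looks like (just a sketch; the name and messages are made up). The point is that it is compiled against one specific kernel's headers, which is exactly why a binary module built for one 2.4 kernel may refuse to load on another:

/* Sketch of a minimal 2.4-era Linux module. It is built against a
   specific kernel's headers; that header dependency is the whole point. */
#define MODULE
#include <linux/module.h>
#include <linux/kernel.h>

int init_module(void)          /* runs at insmod time */
{
    printk(KERN_INFO "hello: loaded\n");
    return 0;
}

void cleanup_module(void)      /* runs at rmmod time */
{
    printk(KERN_INFO "hello: unloaded\n");
}

MODULE_LICENSE("GPL");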

It comes down to this: the 2000/XP version of nVidia's drivers works on all 2000/XP boxes. The 2.4 kernal version of those same drivers does not work on all versions of the 2.4 kernal, and doesn't work on all distros...

Which ones DO they work on? Trial and error, or posting and asking, are the only ways to know for sure. That is simply not acceptable for a desktop OS.

Compile just the module, run depmod -a, and you're all set.

Compile the whatsit? Huh?

Sure, I know what you mean, and tech junkies do as well, but no one outside of the tech world is ever going to bother with any of that.

And that is the primary reason why Linux is not a serious threat to Windows: it is unlikely to ever get that much easier. To make it that easy would remove most of the power.

Case in point: I recently bought a new scanner. I plugged it into the USB port, and Windows XP detected it and asked for the CD. I inserted the CD, it found the driver, installed it, and boom, it worked... I opened up PhotoPaint and scanned in an image. It took all of 3 or 4 minutes, and I didn't have to know anything at all. My mom could have done it... and that, of course, is the point...

Windows does do a better job with old hardware. Case in point: an old non-PnP SB16 card I set up a few weeks ago. The config jumpers were not silkscreened well, so I had no idea what the current settings for the card were. Popping it in a Windows machine and running the Add Hardware wizard got it running after a long detection process. I was then able to use Device Manager to find the info I needed to use the card on a Slackware machine. A few parameters had to be passed to the sound modules when they were loaded, so I whipped up a simple little script to start/stop/restart sound. Obviously not terribly difficult, but still nowhere near as easy as Windows.

Microsoft puts a ton of effort into making older hardware work well.

You can stick Windows XP on almost any machine built from 1995 to 2001 and have a 98%+ chance of having it work out of the box without needing anything else.

In the end I agree: Linux for servers, Windows for desktops. However, I might consider a Linux desktop for older hardware, or as a low-cost replacement for Windows. I was using a Slackware system for a week or so while on vacation, and the experience was actually rather enjoyable.

I've used Linux off and on over the years, starting way back in 1994. :P

Linux does have its good points. I enjoy having that power when I want it; sometimes I find Windows a bit restrictive.

That being the case, when I simply want to get work done, Windows works just fine.

If I were setting up a net server, Linux would go on it, but the desktops would have Windows XP on them. :P

Jason


I will let other posters discuss the usability features of each operating system, but I must disagree with Jason’s assertion that Linux is technically superior. Quite the opposite is true.

While NT and Linux are roughly the same age, the NT kernel is far more sophisticated than the Linux kernel. Linus chose to implement a simple, traditional, monolithic kernel. After all, the project was ambitious enough as it was. I suspect he made the right choice at the time.

That said, Linux has paid the price, as drivers and vast amounts of kernel code have required rewrites to support multiple processors and threads (in an elegant way). Further, early versions of the Linux code base were tightly tied to the x86 architecture. Even in its initial release, NT was portable, had fine-grained kernel locks, and supported SMP operation.

As someone who has known what a “kernel”, a “driver”, and a “command line” are since long before many of the Linux kernel hackers, I choose Windows. That said, all forms of *nix hold a certain charm, because they were written not only by, but for, people like me.

Jason does make one valuable point: DOS defeated CP/M-86 because it was on time and because it was cheap. It did enough, well enough, that CP/M-86 just wasn't worth the incremental investment. Microsoft has long known that if they sat still for too long, they would be “cloned”, just like the BIOS manufacturers. The treadmill seems to be slowing. If it ever stops, Linux (et al.) will do enough, well enough, and Windows won't be worth the incremental investment.

I will let other posters discuss the usability features of each operating system, but I must disagree with Jason’s assertion that Linux is technically superior.  Quite the opposite is true.

Allow me to clear up my comment in that department...

Linux is more stable when run in a clean configuration, it is more secure than Windows is, and it has some Unix functionality in it that Windows wishes it had...

The differences today are far smaller than they used to be. Linux was by far superior to Windows 9x, but as you've pointed out, NT/2000/XP have many of the same features.

The lack of security in Windows is the biggest concern right now, but they now have their eye on that ball, so it should be fixed soon enough.

While NT and Linux are roughly the same age, the NT kernel is far more sophisticated than the Linux kernel.  Linus chose to implement a simple, traditional, monolithic kernel.  After all, the project was ambitious enough as it was.  I suspect he made the right choice at the time.

Good point; I hadn't thought of it that way...

That said, Linux has paid the price, as drivers and vast amounts of kernel code have required rewrites to support multiple processors and threads (in an elegant way). Further, early versions of the Linux code base were tightly tied to the x86 architecture. Even in its initial release, NT was portable, had fine-grained kernel locks, and supported SMP operation.

That is probably why Microsoft can make Windows run on almost anything with very few code changes.

Microsoft gets lots of flak for the bloat of Windows, but I have a feeling part of that bloat is the modular nature of the OS.

Jason does make one valuable point: DOS defeated CP/M-86 because it was on time and because it was cheap. It did enough, well enough, that CP/M-86 just wasn't worth the incremental investment.

Yeah, that is what I meant... :P

Bill Gates was quoted back in the '80s as saying, "we don't have to get it perfect, just good enough to ship; we can solve the minor problems in the next release".

He said this in an internal memo that was leaked years ago. It was meant for his own programmers; he was trying to tell them that getting their programs perfect was not the point, getting them good enough to sell was. He knew (correctly) that people will take something that is 95% finished today over something that is 100% finished a year from now.

Look at Windows and OS/2. OS/2 was superior to Win 9x (NT/2000/XP are simply updated versions of OS/2), and it was very powerful for its day...

But IBM didn't know how to manage it; they messed the whole thing up by trying to get it perfect... Windows 3.1 sucked, and Windows 95 was not much better, but they were cheap, out on the market, ran existing software without much of a hiccup, etc.

Windows 3.0 was not ready for prime time, but if you think about what was out back then, all the other DOS shells (which is all Windows was, pre-NT/2000/XP) out at the time were a big concern. Microsoft wanted something, anything, out there...

Windows 3.1 was really a patch for Windows 3.0 that they somehow managed to sell... Ditto Windows 98: nothing but a patch for 95.

Jason

Compile the whatsit? Huh?

Perhaps I wasn't clear on my point, which was: assuming you stick with the stock Red Hat kernel, you shouldn't really have to do this. kudzu will (hopefully) detect the hardware and do the rest. That's it. Does it always work? No. But it does a good part of the time, and when it does, it essentially requires little to no user intervention.

Remember, let's compare apples to apples here. You're not going to upgrade the Windows XP kernel, so there's no need to assume you would automatically be upgrading the Linux kernel either. That's what the distros are for: integrate the new stuff and make sure it all works easily out of the box.

With driver support from hardware manufacturers, device support and simplicity of setup COULD improve substantially.

-Chris

Compile the whatsit? Huh?

Perhaps I wasn't clear on my point

No, you were... I know what you meant, but most people will not...

The point is, having to even know that exists, much less do it, is a massive flaw in Linux. It is unacceptable for a desktop OS to even have that as an option for the end user, much less a requirement.

The thing should install without any input whatsoever; anything more is pointless complexity. A driver is a driver: the hardware is either there or it isn't. While options are nice, that is just absurd...

This is one of my pet peeves with Linux, and why I don't use it anymore. It is just way too much hassle to deal with for a desktop OS. Options are nice, but there is a limit to that... :P

Jason <--- thinks Linux is too complex for its own good

The point is, having to even know that exists, much less do it, is a massive flaw in Linux. It is unacceptable for a desktop OS to even have that as an option for the end user, much less a requirement.

Jason, pardon my bluntness, but did you even read the post?

What I was saying was that most of the time you don't need to do any of this. If you stick to the Red Hat kernel and have hardware well supported by Linux, the whole process is automatic. Hell, lean your bear on the Enter key after kudzu finds the device, and it's done.

The problem is the availability of drivers for linux in the first place, which vendor acceptance would certainly help.

-Chris

Jason, pardon my bluntness, but did you even read the post?

:P No biggie. To answer your question, yes, I did...

What I was saying was that most of the time you don't need to do any of this.

What I'm saying is you should NEVER need to do any of it... That it is ever required at all, for ANY reason, is a serious flaw in Linux.

If you stick to the Red Hat kernel and have hardware well supported by Linux, the whole process is automatic.

Sure, but why would I want to limit myself this way?

I guess it just comes down to this basic point:

For the desktop, Linux doesn't really offer anything Windows doesn't, but adds in a lot of complexity that almost no one needs...

The problem is the availability of drivers for linux in the first place, which vendor acceptance would certainly help.

Yeah, but there is no company to help this process along, so it will be poor for a long time to come.

One of the reasons Microsoft got support early on is that they simply paid for it. They paid to have systems and hardware developed for DOS/Windows. In the '80s, they provided cash payments to companies to write DOS versions of their applications; this built up a base of software that helped sell additional copies of DOS, and later on, Windows.

Jason


Dude, get off the FUD bus. Who are you helping with these unfounded claims? I have heard the same arguments rehashed for so long that it is rather boring. [Yuyo takes a long yawn and wonders whether he really wants to waste his time responding to this crude nonsense.]

To help you place what I am about to write in context, let me tell you that I have been using Linux for the better part of three years. I don't consider myself a techie and have nothing but good things to say about it. My last Linux installation took 21 minutes. My last Windows XP installation on a coworker's laptop took close to an hour.

I am going to divide my response in two sections: hardware and software.

Hardware

Anybody who wants a stable, hassle-free Linux box can now get a Microtel PC from Walmart. You can also get a box from Dell or HP, if those are more to your liking. If you build your own, all you have to do is make sure that your hardware is supported before you buy it. This is as simple as going to the hardware database of whichever distribution you are using and running a search:

http://www.mandrakelinux.com/en/hardware.php3

http://hardware.redhat.com/hcl/genpage2.cg...gi?pagename=hcl

Real freaking difficult, that is, jehh. And you claim to have used Linux for years. OK, sure. Most hardware, by the way, works fine across distributions. If you buy the right hardware, you plug it in and it works. I can attest to this because I periodically like to see what's out there and try new distros. Have you REALLY used any flavor of Linux? If you did not know what you were doing, and it doesn't sound like you did, maybe you should have asked somebody to set it up for you or tried one of the more user-friendly distros (Mandrake 8.2, SuSE 8.0 or Lycoris). You might have been a Windows power user for years, but that does not mean that you understand in the least the file hierarchy, philosophy and device management of Linux. Asking for help or having a new OS installed is nothing to be ashamed of. After all, most non-technical users have the OS preloaded on their computers.

Hardware Part II: The Question of Driver Support

The only reason that Microsoft gets manufacturers to write drivers is that it is the most widely used OS in the world. Had Microsoft gotten there legally, most of us would have no qualms about Microsoft's monopoly in the OS market space. You obviously are not concerned with questions of fairness and would have us forget that Microsoft has spread FUD such as you are doing now, crushed its competitors with illegal tactics, pressured OEMs and hardware makers in general to keep them from supporting other operating systems, and left us with nothing but a security nightmare that calls itself Windows XP in its latest incarnation. If you want to argue the merits of the legal case against Microsoft, I will be more than happy to do so. But before doing so, please make sure you read Judge Jackson's findings of fact and the overwhelming amount of evidence available (and before you start the Microsoft spin machine about how they are all about "INNOVATION", keep in mind that most of the people who read SR are educated and informed people).

You yourself mention that up to now Microsoft has failed to deliver stable, reliable or secure products, all at the expense of the consumer.

Jehh says:

“Linux is more stable when run in a clean configuration, it is more secure than Windows is, and it has some Unix functionality in it that Windows wishes it had... The differences today are far smaller than they used to be. Linux was by far superior to Windows 9x, but as you've pointed out, NT/2000/XP have many of the same features. The lack of security in Windows is the biggest concern right now, but they now have their eye on that ball, so it should be fixed soon enough.”

I respond: if there wasn't a marginal improvement across OS versions, not even Microsoft's heavily funded marketing department would get people to upgrade. I suppose the plethora of patches that have plagued Windows XP, IIS and IE in the last two years alone, along with the countless hours of downtime and lost productivity, are nothing to make you reflect a bit.

Security is an architectural problem that cannot be dealt with through patches. It has to be done from the ground up. Microsoft is too late in the game to start anew, although I would applaud them if they did.

One last point that you have been making all along and that shoemakc already clarified for you:

“Remember, let's compare apples to apples here. You're not going to upgrade the Windows XP kernel, so there's no need to assume you would automatically be upgrading the Linux kernel either. That's what the distros are for: integrate the new stuff and make sure it all works easily out of the box.”

Software

Tell me which applications you use daily whose functionality you found missing in Linux. I challenge you to name them, and I will be happy to provide you with an equally or more powerful counterpart that is ready for prime time now. Here's a brief summary that I will be happy to expand on:

http://umsis.miami.edu/~gporcelq/Applicati.../Linux_Apps.htm

If you had taken a bit of time to look for application availability, you would have found that all the applications are there. Most of them are an apt-get or urpmi away. If we were talking about the state of Linux on the desktop 5 years ago, then I would have had to agree with most of your claims.

To finish, I'll point you to the names of a few of the most recent and ongoing threads at SR:

· Corrupt Installation File In WinXP

· WINXP CRASHED

· Terrible SCSI performance in Windows XP: SCSI works perfectly and has worked out of the box for years in Linux.

If you really want to understand why Microsoft's software is not trusted in matters of national security, you may want to read the following:

http://www.pimientolinux.com/peru2ms/villa...ueva_to_ms.html

Slabodkin, G. (1998). Software Glitches Leave Navy Smart Ship Dead in the Water. Government Computer News. GCN.com. http://www.gcn.com/archives/gcn/1998/july13/cov2.htm

PS: Much of the nonsense in Jehh's posts is summarized here:

J says: This is a perfect example of one basic design flaw in Linux. Nothing they do, short of a massive redesign of the kernal, will ever really solve the hardware driver issue.

Yuyo says: It’s kernel, my friend. You get it wrong two-thirds of the time. More to the point, the modularity of the Linux kernel is one of its greatest strengths. Do you understand that the same basic kernel is able to run a digital watch and an enterprise server?

J says: Things like IDE CD-RW drives should simply plug in and work, no fuss, no muss...

Yuyo says: They do.


By the way, I will not bother posting on this thread again unless you provide a reasoned and compelling argument.

Cheers,

Yuyo

Security is an architectural problem that cannot be dealt with through patches. It has to be done from the ground up. Microsoft is too late in the game to start anew, although I would applaud them if they did.

I will leave the advocacy alone, but the weakness of your technical background is showing through here.

Security is an architectural problem. It is also a problem that can be dealt with by issuing patches.

The base NT security model is far more sophisticated than the base *nix security model. I have discussed this issue in considerable depth before. There has never been a version of NT without ACLs. There is no NT object (resource) which cannot be protected with an ACL. If you are unfamiliar with ACLs, consider how difficult it is in a base *nix install for a normal user to permit some arbitrary group of users access while denying another arbitrary group of users.
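
To make the contrast concrete, here is a rough C sketch using the Win32 ACL APIs. The group names and the path are hypothetical, error handling is abbreviated, and you link with advapi32:

/* Rough sketch: grant one group access to a file while explicitly
   denying another, in a single DACL. Group names and path are made up. */
#include <windows.h>
#include <aclapi.h>
#include <stdio.h>

int main(void)
{
    EXPLICIT_ACCESS ea[2];
    PACL dacl = NULL;
    TCHAR path[] = TEXT("C:\\reports\\q2.txt");

    ZeroMemory(ea, sizeof(ea));

    /* Allow the (hypothetical) Engineering group read access. */
    BuildExplicitAccessWithName(&ea[0], TEXT("Engineering"),
                                GENERIC_READ, GRANT_ACCESS, NO_INHERITANCE);

    /* Explicitly deny the (hypothetical) Contractors group. */
    BuildExplicitAccessWithName(&ea[1], TEXT("Contractors"),
                                GENERIC_READ, DENY_ACCESS, NO_INHERITANCE);

    if (SetEntriesInAcl(2, ea, NULL, &dacl) != ERROR_SUCCESS) {
        fprintf(stderr, "SetEntriesInAcl failed\n");
        return 1;
    }

    /* Attach the DACL; SetEntriesInAcl orders deny entries first. */
    if (SetNamedSecurityInfo(path, SE_FILE_OBJECT, DACL_SECURITY_INFORMATION,
                             NULL, NULL, dacl, NULL) != ERROR_SUCCESS) {
        fprintf(stderr, "SetNamedSecurityInfo failed\n");
        LocalFree(dacl);
        return 1;
    }

    LocalFree(dacl);
    return 0;
}

With nothing but the owner/group/other bits, the best a normal user can do is juggle group memberships, which usually requires the administrator.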

There are plenty of other things which NT has always done, like zeroing pages before they are used to satisfy allocations. For similar reasons, disk blocks are zeroed when allocated to a new file. NT has also always had a comprehensive auditing system, which allows for any type of activity on specified objects to be logged.

While these features were baked into NT from the beginning, none of them are present in Bach-style Unix. Very sophisticated add-on security packages have appeared more recently in commercial Unix implementations, but they have been slow to appear in the various free *nixes.

The suggestion that Unix offers architecturally superior security is particularly laughable in light of its history. Programs like ftp and telnet, until recently the backbone of Unix communication, use plaintext passwords. Microsoft has favored, and at times taken flak for, its use of (non-standard) challenge-response mechanisms, far more secure than plain text.

The curious thing, though, is that no amount of design will save you from a poor implementation. For example, by far the most destructive Internet security breach was Robert Morris' sendmail exploit. This was not the result of a poor architecture; it was just poor coding.

Fortunately, guys like Theo de Raadt recognized this. A careful audit of the (often trusted) BSD code base in early 1997 yielded thousands of security issues that needed to be addressed. This process continues today, and security advisories are regularly issued for OpenBSD and all versions of Unix.

Like Unix, NT has suffered from a fair number of security weaknesses caused by poor coding. Frankly, this was probably the result of a culture (not exclusive to Microsoft) that favored features over security.

Microsoft deserves the beating they have taken, but let’s not have the pot calling the kettle black.

Security is an architectural problem. It is also a problem that can be dealt with by issuing patches.

Let me start by saying that I do not claim any technical background other than what I have learned through judicious reading and by having been a resourceful computer user for the past six years of my life. Most SR users would do well to be more modest about their knowledge and skills (here I purposely EXCLUDE some of the old timers whose expertise shines through their responses such as Frank Russo, B Nathan, yourself, and many others). I came from the Windows world into Linux three years ago. Prior to that I had some limited BASIC programming, DOS/word processing, and some experience with dBase III when such a thing was used for building small databases. Why do I say this? Because the thrust of this discussion was about Linux not being ready for the desktop. Well, I am living proof that it is, as are all the members of my family.

If I seem enthusiastic about Linux, it is because I “have” to be. While my colleagues have run around patching systems, keeping virus definitions updated, and crying over nights of lost work, my research has never been impeded by a software issue, even when I have tinkered extensively in Linux. This means a lot to me. So, yes, I am no programmer and have, for the time being, no desire to be one. I know enough to have been able to get a summer job “administering” (please note the quotation marks) both NT and *nix networks part-time while finishing graduate school.

I take your point about ACLs. XFS and ext2/3 now implement ACLs that function very much like they do in NT, and this has been the case for quite a while. Whether ACLs have actually prevented major security problems would also be worthy of a separate debate. Logging system processes and access down to the file level has been possible for as long as I have used Linux, so I have to disagree with you on this one. Finally, I don't know of any admin worth their salt, or even any average user, who uses telnet. SSH has been available for a long time and provides secure access between Linux clients and servers. It can also be used to tunnel other unprotected protocols. Both of these things are fairly easy to implement. So here, too, I have to disagree.
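
For what it's worth, here is a rough C sketch of the Linux side, assuming an ACL-enabled kernel/filesystem and the POSIX.1e draft ACL library (libacl, link with -lacl); the file and group names are made up:

/* Sketch, assuming ACL support in the kernel/filesystem and libacl.
   Grants a named group read access through a POSIX.1e draft ACL. */
#include <sys/types.h>
#include <sys/acl.h>
#include <stdio.h>

int main(void)
{
    /* Short text form: owner, owning group, one named group, mask, other. */
    acl_t acl = acl_from_text("u::rw-,g::r--,g:engineering:r--,m::r--,o::---");
    if (acl == (acl_t)NULL) {
        perror("acl_from_text");
        return 1;
    }

    if (acl_set_file("report.txt", ACL_TYPE_ACCESS, acl) == -1) {
        perror("acl_set_file");
        acl_free(acl);
        return 1;
    }

    acl_free(acl);
    return 0;
}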

Rather than debate the merits of the Windows and Linux architectures, which is likely to be an endless and not very productive debate, I will focus on the actual, measurable outcomes of the NT security model. Those outcomes, as far back as my memory goes, are nothing but an endless assortment of security patches that often open more holes than they manage to plug. Every NT admin that I have known relays tales of frustration with Microsoft. I vividly remember, in my first year of graduate school, how two Microsoft Exchange patches not only failed to address a security hole but actually brought down the server, stopping all communication for almost 500 people. It took thousands of dollars and almost two weeks for Microsoft to come up with a solution. I also remember that my work's Windows NT file server needed to be rebooted periodically, while we had no downtime on the Red Hat-based RaQ server used at the University of Newcastle while I was there.

Cas, how many viruses, besides those created in academic circles as proofs of concept, can you name that spread by taking advantage of security flaws in the Linux kernel or even Linux apps? Since we are looking at measurable outcomes, the financial losses in productivity caused by Microsoft's products are nothing short of staggering. This is a matter of public record.

As has often been said, security is a process. It just so happens to be a process that is, in my experience, far more open and manageable in the Linux world. I remember being able to monitor 10 different systems from the comfort of my home by using webmin and running an SSH session to specific servers. Of course, whoever manages security in a network must establish policies that are effective, understandable, auditable, enforceable and nonintrusive. And my LIMITED experience tells me that net admins are often too busy solving the everyday crises to be able to attend properly to such policies. With such policies in place, even NT can be reasonably secure. Having said this, I am not heartened by Jim Allchin's statements in the remedy-hearing depositions, where he openly acknowledged the existence of huge undisclosed holes in the current XP code that could bring entire networks of Windows computers down. When obscurity ABOUT EXISTING FLAWS is the only way to keep an OS secure, something is seriously wrong. The user/company is kept from being able to address a problem, or from paying someone to do so, because a software vendor is too concerned about the damage to its reputation and the attendant devaluation of its stock. The security flaws of Linux or any of its services (bind, apache, sendmail) are usually fixed within a few hours of being disclosed. This is a matter of record.

Gartner, not exactly a Linux enthusiast, recently published a report advising most companies to move away from IIS and look for alternatives. This is also a matter of record.

Nobody in the Linux world has to worry about huge security flaws being found if the workings of the OS are disclosed, because the workings of the OS ARE disclosed and documented. I don't expect you, Cas, to come to my side of the fence, but I would hope that you can recognize that, right now, the security challenge is in Microsoft's court much more than it is in any Linux distributor's.

Take care,

Yuyo

PS: In case I have not made it sufficiently clear, I do not expect code to be bug-free, but I do expect security flaws to be quickly fixed once discovered. Here's where I think Linux shines.

While NT and Linux are roughly the same age, the NT kernel is far more sophisticated than the Linux kernel.  Linus chose to implement a simple, traditional, monolithic kernel.  After all, the project was ambitious enough as it was.  I suspect he made the right choice at the time.

No, I think it was a deeply wrong choice. No matter how much better a monolithic kernel might be at the start, a modular kernel will eventually overtake it, because it is much easier to make changes and introduce new concepts.

Leo


If you think that “my side of the fence” is anti-Unix, you are wrong. I switched from a mixed Unix/Windows environment to a pure NT environment almost a decade ago. I still respect Unix's contributions to the field, and continue to make regular use of a number of NetBSD systems.

That said, I would be interested to know how you would apply an ACL to a semaphore in a base Linux install. How would you audit access to that semaphore? What about shared memory?
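
On the NT side it is a single parameter: every named kernel object accepts a security descriptor at creation. A rough C sketch follows, with a hypothetical object name; note that the NULL DACL used here for brevity actually grants everyone access, so a real program would first build a restrictive DACL (with SetEntriesInAcl, for example) and plug it in the same way:

/* Rough sketch: an NT semaphore is a securable object like any other.
   The NULL DACL below leaves the object wide open; it is used only to
   keep the sketch short. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SECURITY_DESCRIPTOR sd;
    SECURITY_ATTRIBUTES sa;
    HANDLE sem;

    InitializeSecurityDescriptor(&sd, SECURITY_DESCRIPTOR_REVISION);
    SetSecurityDescriptorDacl(&sd, TRUE, NULL, FALSE); /* NULL DACL: wide open */

    sa.nLength = sizeof(sa);
    sa.lpSecurityDescriptor = &sd;
    sa.bInheritHandle = FALSE;

    /* The same SECURITY_ATTRIBUTES slot exists on CreateFileMapping,
       CreateMutex, CreateEvent, and so on. */
    sem = CreateSemaphore(&sa, 1, 1, TEXT("HypotheticalSemaphore"));
    if (sem == NULL) {
        fprintf(stderr, "CreateSemaphore failed: %lu\n",
                (unsigned long)GetLastError());
        return 1;
    }

    CloseHandle(sem);
    return 0;
}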

As I have posted elsewhere in this forum, and indeed intimated earlier in this thread, just about any system will reach parity with its competitor in a steady state. Virtual memory, file systems, SMP support, security, and so forth have all improved for many *nix systems since NT was released.

Reading your post, it's clear that you have missed my point; that *nix went from bad to better proves it. Despite Unix's weaker architectural base, Unix improved through the conscientious application of thousands of patches. Similar improvements will be seen in NT over time, just as they will continue for *nix.

No matter how much better a monolithic kernel might be at the start, a modular kernel will eventually overtake it...

We could fill a dozen threads with this one (I assume you have read the Torvalds v. Tanenbaum debate?).

Linus wanted a kernel that he (and others) could hack on, and the ability to run GNU programs. He created what he set out to build. No doubt, if Linus knew that he was the chief architect of tomorrow's general-purpose operating system, he would have done things a bit differently. He might even have started with an NT-like, four-inch-thick specification. Given Mach's influence at the time, I suspect we would have ended up with an operating system looking more like OS X, and less like AT&T Unix.

We should probably table this discussion for another time.

If you think that “my side of the fence” is anti-Unix, you are wrong. I switched from a mixed Unix/Windows environment to a pure NT environment almost a decade ago. I still respect Unix's contributions to the field, and continue to make regular use of a number of NetBSD systems.

That said, I would be interested to know how you would apply an ACL to a semaphore in a base Linux install. How would you audit access to that semaphore? What about shared memory?

As I have posted elsewhere in this forum, and indeed intimated earlier in this thread, just about any system will reach parity with its competitor in a steady state. Virtual memory, file systems, SMP support, security, and so forth have all improved for many *nix systems since NT was released.

Reading your post, it's clear that you have missed my point; that *nix went from bad to better proves it. Despite Unix's weaker architectural base, Unix improved through the conscientious application of thousands of patches. Similar improvements will be seen in NT over time, just as they will continue for *nix.

All I have to say is: "fair enough". I actually hear you loud and clear and respect your view, even if we may not agree on many of the nuances. The nuances, both in terms of the technical and the social implications of Linux vs. NT, are important, and we will discuss them at length when I have a bit more time. The fact that I have not discussed Unix, but rather Linux, should give you a hint as to where this discussion would go. Let's declare a truce so that I can get some work done. End of debate as far as I am concerned.

One last thing: remember that this was about whether Linux was ready for the desktop. This was raised by Jehh sharing the lessons learned from his SUPPOSED "return" from the Linux world. I have gone to great lengths here and elsewhere to show that the applications are there. Anybody can look at the list of applications I posted or take up the suggested challenge. The bottom line is that the average user the media always refers to would see all his needs fulfilled by a Linux desktop.

If this weren't so, why would the government of Extremadura, Spain, risk alienating public opinion by releasing a Linux distribution to be used in every public school, government administration and university? The distribution has also been a hit with local businesses and users.

See here: http://www.linex.org


Like I said, I'll leave the general advocacy to somebody else.


It would appear that once again cas has wowed me :-)

Just to comment on something Jason said: Linux driver support is becoming more widespread among vendors. Heavyweights like Intel, 3Com, NVIDIA and Adaptec all release drivers (albeit binary-only) for most of their products. Obviously the whole process would go faster with a single entity "greasing the wheels" a bit, but things are progressing. If enough users demand Linux support, the vendors will reply in kind.

-Chris


This is a great thread! I have heard some really good discussion, BUT...

Jason.

You seem to be trying to make the point that Win2K is flawless. All I can do is laugh. Just 3 days ago I was installing a printer. Just a simple printer, nothing big... right??? It took 4 HOURS. I installed the driver package and rebooted. The machine came up and I turned on the printer. Look... it's in the printer list... yeah!!! Try to print... Device not attached?!? OK, that's weird. Reboot... detecting new hardware. Hmmmmmm... what... it's already installed. OK, it detected that it didn't know what it was.

Long story short, I ended up having to uninstall the driver, let Windows detect it, install their driver (which didn't work), and THEN install the supplied driver on top of that. Wow, it works now. After 4 HOURS.

It was an HP printer!!!!!!!!! Like the most common printer brand in the world.

Anyone out there who can say they haven't had ANY problem with their Windows NT/2K/XP is either not reconfiguring their system much or has been VERY lucky.

As for stability... my laptop goes down every 3 days or less; it has never made it more than 3 days without a reboot. I had an NT box last 2 months once; never has a 2K box lasted that long. I have 3 Linux boxes with over 300 days of uptime, though, and 2 BSD boxes over 250.

As to "security", its what I do for a living. Cas, you seam to say that ACLs solve windows security problems if used correctly. They dont. Not even close. I also have yet to find somethign that ACLs do that you cant do in most base versions of *nix.

One of the major problems with the NT command processor is its size and complexity. It has too many holes in it, and there is no way to do a code audit; it's just too BIG. In my opinion, Linux is headed that way as a default, though you can always build a custom Linux kernel. You can't, at least not legally, with NT. There are so many security holes found in NT every year it makes me laugh. Recently there was a huge announcement, and I'm sure a good number of you heard about it, that OpenSSH was found to have a hole. So now OpenBSD has had ONE hole in its default install in 4 YEARS. How many has Windows had? At least 1000, and that is a conservative number, and that only counts remote root/administrator exploits.

Basically, due to its design, NT is inherently less secure. Yuyo has a valid point: security-wise, almost any *nix is better than NT. This is of course a general statement, as I have locked down NT boxes to the point where I would feel secure with them. People in the tech industry don't really talk about the security of their web servers, though. You can easily put a box behind a firewall and allow only port 80; then you only have to worry about your web server and not other things on the box, and anyone can lock down IIS if they work hard enough at it. It's just generally not worth the effort. The real problems are the default settings, unpatched boxes, untrained admins, bad security policy... I could go on. Most servers on the net are not heavily optimized for security! When you don't spend that much time on security, MS boxes show through as shoddy and full of holes. Same with workstations. A 2K box attached to a cable modem is an enticing target. A Red Hat box is not.

Not to say that *nix is fault-free, but there is a HUGE difference in the scale of the security problem. This is definitely where MS does NOT shine.

James Ashton

Cas, you seem to say that ACLs solve Windows security problems if used correctly.

No, this is not at all what I am saying. I am saying that the presence of comprehensive ACL support in all versions of NT is evidence of a sophisticated design. While you may not have been able to find anything that can be done with ACLs that cannot be done without them, an entire industry and many national governments have. This is why ACL packages are available for most *nix variants today.

…How many has Windows had? At least 1000, and that is a conservative number.

Again, if you were using the BSD code base in early '96 (and many were), your system had thousands of exploits yet to be discovered. This, despite widespread availability of the source code for many, many years.

Your argument that “NT is inherently less secure” is obviously flawed. When it is found to be such, NT is insecure for the same reasons as *nix: sloppy coding.

Guest russofris

Looking over this thread, I kinda have to laugh.

Trying to have an intelligent conversation on Win32 vs. *nix is like having sex with your best friend and trying to stay friends... It simply won't happen.

In reality, Linux does have its uses. When I configure a box that will undergo minimal changes (in both its hardware and its role), I use Linux. Web servers are single-function machines, and Linux is a great foundation for them.

When a computer will be changing its hardware and role weekly (daily?), Windows is a better candidate because it is easier to change on the fly.

*nix is the backbone of the Internet, of businesses, and of most major networks. It allows us to use our Win32 boxes with minimal effort. The two platforms form a symbiotic circle, and thrive off each other's existence.

I hate posting to LvW threads,

Thank you for your time,

Frank Russo

here I purposely EXCLUDE some of the old timers whose expertise shines through their responses such as Frank Russo, B Nathan, yourself, and many others

Shucks...

It's not the expertise, it's just that I find that using my real name makes me less apt to be an ass on boards that I post to.

Jason.

You seem to be trying to make the point that Win2K is flawless. All I can do is laugh. Just 3 days ago I was installing a printer. Just a simple printer, nothing big... right??? It took 4 HOURS. I installed the driver package and rebooted. The machine came up and I turned on the printer. Look... it's in the printer list... yeah!!! Try to print... Device not attached?!? OK, that's weird. Reboot... detecting new hardware. Hmmmmmm... what... it's already installed. OK, it detected that it didn't know what it was.

Long story short, I ended up having to uninstall the driver, let Windows detect it, install their driver (which didn't work), and THEN install the supplied driver on top of that. Wow, it works now. After 4 HOURS.

It was an HP printer!!!!!!!!! Like the most common printer brand in the world.

Anyone out there who can say they haven't had ANY problem with their Windows NT/2K/XP is either not reconfiguring their system much or has been VERY lucky.

As for stability... my laptop goes down every 3 days or less; it has never made it more than 3 days without a reboot. I had an NT box last 2 months once; never has a 2K box lasted that long. I have 3 Linux boxes with over 300 days of uptime, though, and 2 BSD boxes over 250.

I had my main Win2K box up for 5 days downloading movies. After that, my WinXP Inspiron 3800 laptop was up for a week downloading stuff. Neither of these systems crashed; I shut them down because I was done with them.

I am going for the big one now: I will see how many days my secondary Win2K box runs 24/7 before crashing. This box plays MP3s:

Pentium II 300 @ 233 (passive cooling)

128MB PC133 SDRAM

Deskstar 75GXP 45GB

SoundBlaster 16 ISA

SiS 5598/6326 AGP

Unknown mobo w/ i440TX chipset and Award Modular BIOS

32x CD-ROM

As for Linux, I tried Red Hat 7 once and it wouldn't install. :(

