
Why Windows crashes, and Linux doesn't

Started by Mephisto, December 14, 2003, 09:02 PM


Mephisto

My friend and I came up with an idea: perhaps the reason Windows crashes more than Linux is that it runs off one file, Explorer.exe.  So when Explorer gets mad and has some complications, your whole computer does too, and that results in a crash.  Because Linux runs off of several files, when one has complications, your whole system doesn't end up crashing the way Windows would.  This could also be why Linux gets better uptimes: it doesn't rely on just one file like Windows does.  Windows may run off of one file because Microsoft wants to keep its source closed, and managing one file is easier than managing multiple.  Maybe Linux runs faster because it has multiple files handling and processing data.  My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure Linux even runs off of multiple files, but if it's true, perhaps this is why?  Dunno... what do you guys think?

Skywing

I think this is a good candidate for the Fun Forum.

Grok

Quote from: Mephisto on December 14, 2003, 09:02 PM
My friend and I came up with an idea: perhaps the reason Windows crashes more than Linux is that it runs off one file, Explorer.exe.  So when Explorer gets mad and has some complications, your whole computer does too, and that results in a crash.  Because Linux runs off of several files, when one has complications, your whole system doesn't end up crashing the way Windows would.  This could also be why Linux gets better uptimes: it doesn't rely on just one file like Windows does.  Windows may run off of one file because Microsoft wants to keep its source closed, and managing one file is easier than managing multiple.  Maybe Linux runs faster because it has multiple files handling and processing data.  My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure Linux even runs off of multiple files, but if it's true, perhaps this is why?  Dunno... what do you guys think?

Am quoting it so he can't edit it away.  My reply:

O.M.G.

Stealth

Have a look at your C:\Windows\System\ or \System32\ folder, then come back here and tell us that Windows runs from a single file.
- Stealth
Author of StealthBot

Kp

... that alone would have no effect on stability.  In both cases, the OS has to operate in privileged mode, which among other things means it can pretty seriously ruin its own data structures, and there's nobody there to stop it.  Stability has nothing to do with the number of unique files running in privileged mode, and a great deal to do with whether, and to what extent, you use a "good" code base: one which has been thoroughly debugged, does not fail under stress, etc.  As another point, whether something is open source seems to have an effect on whether the code is good, but it is by no means a deciding factor.
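
To make that concrete, here's a minimal user-mode sketch in C (hypothetical, for illustration only): a user process that makes a wild write gets stopped cold by the kernel, while privileged code making the same mistake has no supervisor above it.

Code:
#include <signal.h>
#include <unistd.h>

static void on_segv(int sig)
{
    (void)sig;
    /* async-signal-safe report, then exit */
    static const char msg[] = "caught SIGSEGV: the kernel stopped the bad write\n";
    write(STDOUT_FILENO, msg, sizeof msg - 1);
    _exit(0);
}

int main(void)
{
    signal(SIGSEGV, on_segv);

    volatile int *p = 0;   /* wild pointer */
    *p = 42;               /* user mode: the MMU traps this and the kernel steps in */
                           /* kernel mode: the same store silently ruins OS state   */
    return 0;
}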
[19:20:23] (BotNet) <[vL]Kp> Any idiot can make a bot with CSB, and many do!

Denial

Wow, we need a kiddy section!  This post is no place for the fun forum.
Actus non facit reum nisi mens sit rea ("the act does not make one guilty unless the mind is guilty")

Adron

Well, I wouldn't say the basic idea is so wrong. His statement is incorrect because linux has a much more monolithic kernel than Windows - if anything, linux can run off fewer files when you compile everything into your kernel.

Other than that, if an OS is made up of independent parts whose failures don't crash the kernel (MS seems to make more things critical than linux does), then fewer of the failures will crash it.

Then on the other hand, having fewer and smaller files makes an OS less likely to have bugs, with the same number of people working on it...
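
To illustrate, a hypothetical fragment of a Linux kernel .config (the option names are real, the selection is invented): "y" compiles a driver into the single kernel image, "m" builds it as a separate module file loaded at runtime.

Code:
# Hypothetical .config fragment.
CONFIG_EXT3_FS=y        # built into the one kernel image
CONFIG_USB_STORAGE=m    # built as a separate module file, loaded on demand

Either way the driver ends up running in privileged mode once it's loaded; compiling everything in changes the file count, not the privilege.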

Grok

Quote from: Adron on December 16, 2003, 02:30 AM
Well, I wouldn't say the basic idea is so wrong. His statement is incorrect because linux has a much more monolithic kernel than Windows - if anything, linux can run off fewer files when you compile everything into your kernel.

Other than that, if an OS is made up of independent parts whose failures don't crash the kernel (MS seems to make more things critical than linux does), then fewer of the failures will crash it.

Then on the other hand, having fewer and smaller files makes an OS less likely to have bugs, with the same number of people working on it...


Not exactly a complete statement.  Studies by IBM have shown many different reasons for bugs being introduced into software.  Having fewer files does reduce the number of bugs, but having smaller files does not.  With respect to procedure size, the evidence suggests that large procedures with hundreds of lines have fewer errors than many small procedures with dozens of lines.  This is counterintuitive to how we think and program, but it is supported by their studies.  I prefer to write small functions for discrete manipulations and build larger blocks from those.  For me this works well, and I feel I'm doing a better job.  Perhaps for programmers in general, writing large procedures lets them keep more focus on the algorithm, with more details in front of them.  If so, that just suggests they're sloppy function writers, not good, focused programmers.

Edit:  source material from "Code Complete"
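
As a hypothetical sketch of that style in C (every name here is invented for illustration): small functions for the discrete manipulations, with the larger block composed from them.

Code:
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* One discrete manipulation per small function. */
static void trim_newline(char *s)
{
    size_t n = strlen(s);
    if (n > 0 && s[n - 1] == '\n')
        s[n - 1] = '\0';
}

static void lowercase(char *s)
{
    for (; *s; s++)
        *s = (char)tolower((unsigned char)*s);
}

/* The larger block is built from the discrete pieces above. */
static void normalize(char *s)
{
    trim_newline(s);
    lowercase(s);
}

int main(void)
{
    char line[] = "Hello World\n";
    normalize(line);
    puts(line);    /* prints "hello world" */
    return 0;
}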

effect

LIES LIES ALL LIES

you don't have a friend  ;D
Quote from: Mangix on March 22, 2005, 03:03 AM
i am an expert Stealthbot VBScript. Recognize Bitch.

Adron

Since this thread was reawakened...

Quote from: Grok on December 17, 2003, 11:31 AM
Having fewer files does reduce the number of bugs, but having smaller files does not.  With respect to procedure size, the evidence suggests that large procedures with hundreds of lines have fewer errors than many small procedures with dozens of lines.

Are these the sizes they refer to? Small is a few dozen lines and large is many hundreds? I was thinking more about file sizes than procedure sizes - having 10-50k versus 400-1000k source files. With small source files you can have either a large number of small procedures or a few big ones. With large source files, you can have either a huge number of small procedures or a large number of big ones.

If the project is divided into reusable pieces by source file, having the pieces be more limited and easily tested should be likely to reduce bugs. At least that's my experience from the huge nbbot.cpp.... :P
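
A hypothetical sketch of such a piece, split out by source file (the file names and checksum routine are invented): the piece is limited to one job and small enough to test in isolation.

Code:
/* checksum.h -- the reusable piece's small public interface */
unsigned checksum(const unsigned char *buf, unsigned len);

/* checksum.c -- the piece itself, limited to one job */
unsigned checksum(const unsigned char *buf, unsigned len)
{
    unsigned sum = 0;
    while (len--)
        sum += *buf++;
    return sum;
}

/* checksum_test.c -- a test that exercises the piece by itself */
#include <assert.h>
#include "checksum.h"

int main(void)
{
    unsigned char data[3] = { 1, 2, 3 };
    assert(checksum(data, 3) == 6);
    return 0;
}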

iago

Adron's absolutely right.  I was looking through piles and piles of .jsp and .java files yesterday, and the fact that there were tons of different files, each doing a different thing, saved a lot of trouble.
This'll make an interesting test for broken AV:
Quote
X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*