Windows 7 on its way :|

It's not just as simple as "I don't support that".

If that were the case, almost every bug you can find for any OS could be rolled under the "I don't support that" clause. I mean, if they didn't test it, it's obviously because they don't support using it that way, right?

Or are you finally willing to admit that there's no way on God(s)' Green Earth(TM) that any programming project with anything more than a bottom-rung-basic level of complexity is going to be free of bugs?
 
D: Is hated by all the self-proclaimed "computer geeks" of the world because you *gasp* charge money for it and

Just wondering, what does that have to do with the stability of Windows?
 
Just wondering, what does that have to do with the stability of Windows?

If you have ten million people who make it their hobby / enjoyment / paid job to continually research and develop malicious attacks on an application or OS, then you're going to have far more problems to deal with. That's what it has to do with Windows' stability.

Do you see anywhere near the same population of people doing the same for MacOS? When "everyone hates you", your life will become harder -- and not always for any good reason.

MacOS is in a great place right now, because there really aren't that many people focused on finding attacks for the Mac platform. But as the OS matures and gets a bigger foothold, people will begin targeting the Apple platform too, and problems will start to emerge just as they have on Windows.
 
MacOS is in a great place right now, because there really aren't that many people focused on finding attacks for the Mac platform. But as the OS matures and gets a bigger foothold, people will begin targeting the Apple platform too, and problems will start to emerge just as they have on Windows.

Sorry, but I think this is as tired an old mantra as "Linux/OSX/*nix is better by design because it's not Windows". There are some very large and high-profile targets running Linux now; how much damage could one do with a couple of thousand CPUs and a 10Gig connection to the Internet backbone?

To turn the discussion round though, how much of this apparent security issue is down to a conflict of interest between the concept of user control over a desktop PC and the fundamental tenets of security (one of which being -- don't let the user do anything!)?
 
Sorry, but I think this is as tired an old mantra as "Linux/OSX/*nix is better by design because it's not Windows". There are some very large and high-profile targets running Linux now; how much damage could one do with a couple of thousand CPUs and a 10Gig connection to the Internet backbone?
Yes, but the BIG Linux targets are also the ones running fully-customized kernels and hardened processes. And actually, a truly hardened Windows server can be just as secure, given the same attention to detail that goes into a similar Linux distro.

Those systems aren't the ones we should be thinking about; think more about the John and Jane Doe users out there who buy a PC, take it home, and never do anything else (maintenance-wise) with it. Those are the PCs at risk, and right now, the overwhelming majority of PCs that fall into that category are Windows-based. That is a valid reason why Windows is a bigger target than any flavor of Linux.

The flipside is still there: those same users are moving to Mac (and other alternative OSes) because they're supposedly "more secure" or "more stable" than Windows. The problem is, as the installed base of alternative OSes grows, so does the big red target painted on the side of their collective PCs. When they hit a certain critical mass, they too will see the onslaught. And maybe that isn't necessarily a bad thing...

To turn the discussion round though, how much of this apparent security issue is down to a conflict of interest between the concept of user control over a desktop PC, and the fundamental tennets of security (one of which being -- don't let the user do anything!).
Absolutely agreed, and Microsoft hasn't done themselves any favors, either in how their previous OSes worked or in the "corrections" they've made in Vista. Linux has always had the concept of elevated user privileges, root authority, et al. Technically, Windows NT-based OSes had this too, but anything on the desktop (before Vista) defaulted ALL users to an admin role.

So everyone became accustomed to their PC allowing them to do whatever they want -- along with whatever spyware, adware and malware was running under their credentials (with or without their knowledge). They've taken an extreme amount of negative press for those decisions, as well they should, and User Account Control was their "fix". Not a great fix IMO, at least in its default state -- I prefer the enterprise method, which actually asks for a user ID and password to continue, versus just an "Allow or Deny" message.
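As an aside, here's a minimal sketch (Python, purely for illustration) of how a program can check whether it's running elevated on Windows. The shell32 IsUserAnAdmin() call is a real Windows API; the script around it is just an example, not how UAC itself works:

Code:
# Minimal sketch: is the current process running with admin rights?
# Uses the real shell32.IsUserAnAdmin() API via ctypes; the rest is illustrative.
import ctypes

def is_admin():
    try:
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except AttributeError:
        return False  # ctypes.windll only exists on Windows

if is_admin():
    print("Running as admin -- the pre-Vista desktop default.")
else:
    print("Running as a standard user -- the UAC ideal.")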

So, exactly to your point, where does the balance truly lie? And what consequences (OS stability, user friendliness, et al) does that balance have?

I think MS has done a lot of good things with how Vista is structured compared to XP, but a lot of people are only going to see the negative. That's likely just who those people are, and they're welcome to their opinion. And back to the original topic, I think Microsoft at least has their head pointed in a better direction now, so I expect future OS releases from them will only continue to get better.
 
If you have ten million people who make it their hobby / enjoyment / paid job to continually research and develop malicious attacks on an application or OS, then you're going to have far more problems to deal with. That's what it has to do with Windows' stability.

Do you see anywhere near the same population of people doing the same for MacOS? When "everyone hates you", your life will become harder -- and not always for any good reason.

MacOS is in a great place right now, because there really aren't that many people focused on finding attacks for the Mac platform. But as the OS matures and gets a bigger foothold, people will begin targeting the Apple platform too, and problems will start to emerge just as they have on Windows.

Minor issues maybe, but most major issues are found on all platforms within a few weeks of release, it seems. And then patched. It seems there's practically a checklist of common exploits to use against newly developed software.
If an OS has severe bugs/exploits that are simply unknown, then it's not safe and secure at all.
Linux is generally more secure and stable out of the box than Windows as well (before any patches), though that's often attributed, even by MS developers, to the legacy support Windows needs to maintain. From what I've read about Vista, big things have been done in its underlying code to fix these problems, yet that seems contradicted by the exploits and such, many carrying over from XP, that still exist.
 
Can someone tell me what implications a Windows subscription model would have for everyone?

I can't speak for everyone, but I wouldn't be using Windows on subscription (nor do I think most home users would). It might be attractive to businesses since many of them lease equipment anyway.
 
Office 2007 does a wide range of things that OpenOffice cannot do -- how about configuration by Group Policy? How about integrated updates via our already-in-place Windows Update Services system?

OpenOffice is good stuff for the home user who doesn't want to spend the money, no argument. But for a true office environment, there are things in MS Office that OpenOffice cannot replace.

No arguments there - Office 2007 is the way for corporations to go; however, a home user who would actually like to have a legal box can use OpenOffice and need little else. There are some document formats that I do actually use that Office 2007 does not natively support - the Aportis (Palm DOC) format, to start with.
 
Minor issues maybe, but most major issues are found on all platforms within a few weeks of release, it seems. And then patched. It seems there's practically a checklist of common exploits to use against newly developed software.
If an OS has severe bugs/exploits that are simply unknown, then it's not safe and secure at all.
Linux is generally more secure and stable out of the box than Windows as well (before any patches), though that's often attributed, even by MS developers, to the legacy support Windows needs to maintain. From what I've read about Vista, big things have been done in its underlying code to fix these problems, yet that seems contradicted by the exploits and such, many carrying over from XP, that still exist.
Generally agreed on all fronts, but a "checklist of common exploits" only goes so far. I still think there are discoveries to be made, and I can only assume the best way to find them is to leverage even half of the enthusiasm aimed at Microsoft in another OS's direction...

I can't speak for everyone, but I wouldn't be using Windows on subscription (nor do I think most home users would). It might be attractive to businesses since many of them lease equipment anyway.
Yeah, definitely agree with this. Subscription doesn't make any sense for a home user; businesses would be a different story.

No arguments there - Office 2007 is the way for corporations to go; however, a home user who would actually like to have a legal box can use OpenOffice and need little else. There are some document formats that I do actually use that Office 2007 does not natively support - the Aportis (Palm DOC) format, to start with.
:)
 
Going back to Davros' definition of unfinished code, how about the unfinished processors you're buying too? Got quite a chuckle out of Linus Torvalds' take on Core 2 Duo errata in his quote here:
Linus said:
And yes, we've occasionally hit real hardware bugs. Not
just in CPU's either. You'd think something like a "simple"
ethernet chip wouldn't be buggy. Think again.

Bugs happen.
Humorous, in the context that even the hardware you're running on (reliably, more than likely) contains bugs of its own that are A: known, B: documented and C: never to be fixed -- and these are things that OS manufacturers are expected to work around.

So tell me again why any OS creator should be expected to create "perfect code"?

Link: http://www.realworldtech.com/forums/index.cfm?action=detail&id=80579&threadid=80534&roomid=2
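To make the "OSes work around hardware bugs" point concrete, here's a hedged sketch (Python, with invented vendor/model keys and workaround names -- not any real kernel's errata list) of the kind of quirk table an OS might consult at boot:

Code:
# Hypothetical errata table: the model numbers and workaround names are
# invented for illustration; real kernels keep similar (real) tables.
KNOWN_ERRATA = {
    ("GenuineIntel", 0x0F): ["flush_tlb_twice"],        # hypothetical
    ("AuthenticAMD", 0x10): ["disable_deep_c_states"],  # hypothetical
}

def workarounds_for(vendor, model):
    """Return the software workarounds to apply for a known-buggy part."""
    return KNOWN_ERRATA.get((vendor, model), [])

print(workarounds_for("GenuineIntel", 0x0F))  # ['flush_tlb_twice']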
 
It's not just as simple as "I don't support that".

If that were the case, almost every bug you can find for any OS could be rolled under the "I don't support that" clause. I mean, if they didn't test it, it's obviously because they don't support using it that way, right?

Or are you finally willing to admit that there's no way on God(s)' Green Earth(TM) that any programming project with anything more than a bottom-rung-basic level of complexity is going to be free of bugs?

Do you think it's wrong to have a target system or minimum system requirements?
The example that springs to mind is Deus Ex 2. DX8 games up until that point did work on DX7 cards (albeit without the shaders), but Deus Ex 2 was the first game to absolutely require a DX8 card or it wouldn't run (I remember people getting caught out by this). Do you think that was wrong of them?
I don't.

Oh, and it's nice to know opinion has changed from "it's not possible to create a program without bugs" to "it's not possible to create a program with anything other than a bottom-rung-basic level of complexity without bugs". It's a start in the right direction :D

You think that a program becomes more difficult to debug the more complex it is, up until a point where, if you add one more line of code, it goes from difficult to impossible. I don't, and I don't think you believe it either.
Here's what I believe, and I think you do too: the more complex a program, the more difficult it is to debug, and that's it. There is no point where it suddenly becomes impossible; the difficulty just increases.
 
The more complex the program, the more modular you'll make it, with small, simple modules which are easy to test and debug. But that would require correct programming philosophy, which is not really there at MS (not because of lack of good programmers, but because of past sins and all the compatibility junk they have to drag along).
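As a hedged illustration of that philosophy (Python, with invented names -- nobody's real code): keep each module small and pure, and testing it over its whole interesting input space becomes trivial:

Code:
# Sketch of the "small, simple modules" idea: one tiny pure function
# plus an exhaustive-ish test.
def clamp(value, low, high):
    """Confine value to the inclusive range [low, high]."""
    return max(low, min(value, high))

# A module this small can be checked across every case that matters.
for v in range(-10, 11):
    assert -5 <= clamp(v, -5, 5) <= 5
assert clamp(7, -5, 5) == 5 and clamp(-7, -5, 5) == -5
print("clamp: all checks passed")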
 
The more complex the program, the more modular you'll make it, with small, simple modules which are easy to test and debug. But that would require correct programming philosophy, which is not really there at MS (not because of lack of good programmers, but because of past sins and all the compatibility junk they have to drag along).

I don't disagree with what you said about their past. However, if you watch any of the (many) video interviews with Microsoft development staff regarding Vista, you'll hear that they indeed spent a considerable amount of time going back to "modular" code.

Which is (reportedly) why they will be able to start churning out OSes in reasonable timeframes again.
 
Oh, and it's nice to know opinion has changed from "it's not possible to create a program without bugs" to "it's not possible to create a program with anything other than a bottom-rung-basic level of complexity without bugs". It's a start in the right direction :D
No, I can write a program with no bugs right now:

Code:
Echo Hello world

There, a "program" that has no bugs -- if you want to call it that. Here's another "program" that has no bugs either:

Code:
wscript.echo "Hello, the date is " & Date()

Wow, lookie there. Another program that has no bugs -- again, if you want to call it that.

You think that a program becomes more difficult to debug the more complex it is, up until a point where, if you add one more line of code, it goes from difficult to impossible.
Can you point me to where I said that little part about "one more line and it goes from difficult to impossible"? I'd like to know where I let that slip, thanks. The prior half of that sentence is correct...

I don't, and I don't think you believe it either. Here's what I believe, and I think you do too: the more complex a program, the more difficult it is to debug, and that's it. There is no point where it suddenly becomes impossible; the difficulty just increases.

So what is an "impossible" level of difficulty? If you need to debug 10 lines of code, is that impossible? If you need to debug 1000 lines of code, is that impossible? If you need to debug 10,000,000 lines of code, is that impossible? If you need to debug 1,000,000,000,000 lines of code, is that impossible?

No. Nothing I just wrote is impossible within the realm of technicalities. But what if you have 50,000,000 lines of code, 4,830 different video cards, 97 different processors, 740 different motherboards, 21,872 different memory sticks, 9,470 different network cards and 372 different audio cards to test with your fifty million lines of code? And every combination thereof?

Let's just do a bit of quick math on processors, video cards and motherboards with my little goofy hypothetical numbers... Even if we only had to test every combination of 1,000 video cards with 20 processors and 50 motherboards, that's 1,000,000 different scenarios to test your fifty million lines of code against.

Is that impossible? What if we add 100 network cards? What if we add sound cards? What if we're fighting CPU errata? At what point does it truly become impossible?

Technically, with infinite time and infinite resources, it isn't. But we don't live in that world.
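To put numbers on that, here's a quick back-of-the-envelope script (Python) using the goofy hypothetical counts above -- they're made-up figures from this post, not market data:

Code:
# Back-of-the-envelope math for the hypothetical hardware matrix above.
from math import prod

hardware = {
    "video cards": 1_000,
    "processors": 20,
    "motherboards": 50,
}
print(f"{prod(hardware.values()):,} configurations")  # 1,000,000

# Add the extra device classes from the post and watch it explode:
hardware["network cards"] = 100
hardware["sound cards"] = 372
print(f"{prod(hardware.values()):,} configurations")  # 37,200,000,000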
 
Going back to the original topic: according to a rather senior, technical Microsoft manager (the nephew of a friend), Vista development didn't work out (that's why they restarted it twice), and it seems impossible to continue building on it, so the next version will be what Vista was supposed to be -- really rewritten from scratch, but this time with much better and stricter development rules and goals.
 
Going back to the original topic: according to a rather senior, technical Microsoft manager (the nephew of a friend), Vista development didn't work out (that's why they restarted it twice), and it seems impossible to continue building on it, so the next version will be what Vista was supposed to be -- really rewritten from scratch, but this time with much better and stricter development rules and goals.

I already like the sound of this :D
 
Btw, almost all programs I write are bug-free as far as I can determine. Then again, almost all of them are less than 10,000 lines of code. And written in Delphi.

The bigger a software project, the lower your chances of success. Exponentially. Ever-changing specifications, interfacing with third parties and too many arbitrary decisions (made by people who aren't technical) are the major culprits.
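A hedged way to see that "exponentially": if every line independently has even a tiny chance of holding a bug, the odds of a whole program being bug-free decay exponentially with size. The per-line defect rate below is an invented toy figure, not a measurement:

Code:
# Toy model: P(program is bug-free) = (1 - p)^lines, with an invented p.
P_BUG_PER_LINE = 1e-4  # assumption for illustration only

for lines in (100, 1_000, 10_000, 1_000_000, 50_000_000):
    p_clean = (1 - P_BUG_PER_LINE) ** lines
    print(f"{lines:>10,} lines: P(no bugs) ~ {p_clean:.3g}")
# ~0.99 at 100 lines, ~0.37 at 10,000, effectively zero at fifty million.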

While I can use any Microsoft software legally through my employer, I use OpenOffice myself. And I would run Linux if I could (games and developing MS Windows apps keep me on Windows).
 
"I just wrote is impossible within the realm of technicalities"

Now we're getting somewhere!

We've gone from:
"impossible to have a program without bugs"
to:
"impossible within the realm of technicalities to have a program over a certain level of complexity without bugs".

You see, I can nearly agree with that ;)

I just looked at MS's earnings, and in the 4th quarter of 2006 they made $11 billion.

Here's what they should do with Windows 7:
when they think it's finished, they should spend that $11 billion on removing bugs.
I think you'll agree with me on this: if that doesn't get rid of all of them, it will get rid of nearly all of them.
 
"I just wrote is impossible within the realm of technicalities"

Now we're getting somewhere!

We've gone from:
"impossible to have a program without bugs"
to:
"impossible within the realm of technicalities to have a program over a certain level of complexity without bugs".
I want you to quote me EXACTLY where I said it is impossible to have a program without bugs. You're putting words into my mouth and creating your own little straw-man arguments out of them.

Until you can get your head together and use any sort of actual logic to back your claims, I'm done with you.
 
OK Alburquerque, I apologise; it turns out you didn't say that (must have got carried away :D).

But it does bring up another question: "If you believe it's possible to have bug-free programs, why do you defend companies for not creating them?"
 