Apple, Google, Microsoft, and the future of computing

Frank

Do I like Google? Not really. It's big, bloated, mostly unsuccessful (though the things in which they did succeed are massive), and less and less consumer-friendly.

So, how about Microsoft? Hell, no! They are very arrogant and have lost touch with everyone else (their customers). They only have two successful products (Windows and Office), and see those becoming irrelevant. But that's all they know. And they copy exactly the wrong "improvements".

And how about Apple? Definitely not! They want total control. Which mostly worked while Steve was at the wheel. Because he also delivered products everyone wanted. That strict control made it happen. But now, only that demand for total control is left.


Essentially, Microsoft (or rather, Windows) is becoming irrelevant, and Linux and the likes (BSD/OSX/iOS) have become the default OS. Finally, some kind of a standard has been reached.

So, the "winner", in a technical sense, would be Linus Torvalds. And all the other groups who helped make it a success, like the OpenGL and Apache people. But mostly: open source. And thus, Richard Stallman by extension (although he would be the first to tell us all that it should be called "free software", not open source!).

That gives us an interesting list of companies: Samsung, IBM, Sony, Oracle (reluctantly) and, yes, Google. They all use software developed by many thousands of engineers, mostly in their free time, just because it needed fixing or could be improved.


Essentially, anarchy does work. Or perhaps it's engineers not bothered by management who make things work. And the smart companies pick it up, incorporate it into their products and even advertise it.

If you still have Microsoft or Apple shares, I would strongly suggest you sell them ASAP. And with Google as the new king of the hill, the same will happen to them in about five years. But of the three giants, they still have their stuff together and made it happen.

Not that they would have succeeded without first Asus (eeePC) and then Steve (iPhone and iPad) showing consumers they don't really need Windows or Office. And we gamers (especially here at B3D) are still waiting for Steam to come out with a Linux box that either runs DirectX applications, or to get all those developers to produce Linux versions of our precious games.

But in the meantime, things are more standardized, consolidated and open than they've ever been in the computer space. We do have a winner!

And expect this standard to last. Parts of the code you write might still be around more than a century from now, if incorporated.

Perhaps quantum computing or something equally bizarre (like nano-rod computing) might require a change, but seeing that 90+% of the fastest and most experimental computers run Linux, as well as most of the smallest, I wouldn't hold my breath. It's pretty adaptable.


The only thing missing seems to be a good distributed model. (Amdahl is still right.) We see this everywhere, even on the desktop. Then again, Unix and the old Linuxes did have a pretty good one; it mostly fell into disarray (except on the huge clusters).
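To put a number on that Amdahl remark: the serial fraction of a program caps the total speedup, no matter how many cores you throw at it. A quick sketch (the 5% serial fraction is just an illustrative guess):

```python
# Amdahl's law: the serial fraction of a program caps the speedup,
# no matter how many processors you add.
def amdahl_speedup(serial_fraction, n_processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# With 5% serial work, even 800 cores give less than a 20x speedup:
print(round(amdahl_speedup(0.05, 800), 1))  # prints 19.5
```

Which is why a good distributed model matters more than raw core counts: the serial, synchronizing part has to shrink first.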

But if something shakes things up, it'll probably be that: running whatever program distributed over many nodes, which probably use different processors, with unknown latency.

:D
 
Essentially, Microsoft (or rather, Windows) is becoming irrelevant, and Linux and the likes (BSD/OSX/iOS) have become the default OS. Finally, some kind of a standard has been reached.

So, the "winner", in a technical sense, would be Linus Torvalds.

Wouldn't that make Apple the "winner" since they have the total share of the default OS from a statistical standpoint?
 
Wouldn't that make Apple the "winner" since they have the total share of the default OS from a statistical standpoint?

That depends on how you count.

It's hard to get good statistics: some say 60% of all smartphones sold are iPhones, while others say 75% are Android. But they all agree that Android grows faster.

But if you count all devices that have a processor, we see that the vast majority has no operating system whatsoever (tiny microcontrollers). Of the devices that do have one, by far the vast majority uses a Unix derivative, most likely a custom Linux. Most of those are invisible; they're inside random electronics.

Of the more visible ones, which are only a tiny part, we have Windows, iOS and Android in various proportions, depending on where you look or how you count.
 
To explain it a bit more:

Electronic devices without a processor are vanishingly rare nowadays, simply because it's much easier and cheaper to take a microcontroller, add some I/O like resistors and FETs, and program the functions you require.

But as soon as you want to communicate with another piece of electronics or with a user, it rapidly becomes very complex and time-consuming to program all that, especially if you want to incorporate complex standards like USB, Bluetooth or Ethernet. At that point, a far more powerful processor running a complex OS, like (mostly) Linux, becomes the better option.

We see that in the pricing of the components: a big, complex and powerful microcontroller that can run a Linux kernel is only marginally more expensive than a far simpler one that cannot, simply because the demand is much higher. Only a really tiny one is much cheaper.

You can buy a tiny microcontroller for ~$0.20, a decent one for ~$1, and a "huge" one that can run (and is delivered with) Linux for ~$2.


EDIT: after counting, I have at least 57 processors in devices in my home, from the coffee machine, drill, alarm, dimmer and such, up to the router, stereo, television and laptops/computers. At least 7 of them run Linux (including my server ;-) ).
 
Anyway, to get back to the original point: I have two laptops and a computer that run Windows, and at least 7 devices (among which one server) that run Linux. But, when I want something computed, only the processor in the computer/laptop is used.

In some cases, like when running a game, the processor in the GPU is used as well. And while that amounts to 8 cores for the CPU and 800 cores for the GPU, for general processing only a single CPU core is used.

Now, I know the other 7 CPU cores can be used. But, for starters, most applications use threading and shared memory to spread the load. Which means that most of the time the threads are waiting for one another, be it to access memory or to synchronize results.

A much better way is to spawn jobs, like you would do on a Cell. But that turned out to be quite hard for most programmers, not least because you have to write your own schedulers and managers.
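A rough sketch of that job model, using Python's stock thread pool as the ready-made scheduler (the chunked sum is just a stand-in workload):

```python
from concurrent.futures import ThreadPoolExecutor

def job(chunk):
    # Each job is self-contained: it gets its input, computes, and returns
    # a result. No shared state, so no locks and no waiting on other jobs.
    return sum(x * x for x in chunk)

def run_jobs(data, n_workers=4, chunk_size=1000):
    # Split the work into many independent jobs...
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # ...and let the executor play scheduler, handing jobs to idle workers.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(job, chunks))
```

The point is that each job owns its data, so workers never block on each other; the pool does the scheduling you would otherwise have to write yourself on a Cell.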


Then again, there are millions of processors I could potentially communicate with, which are idle for most of the time. Some around the house, others further away.

Using that immense spare capacity is what "cloud computing" should be all about. But instead, it's about renting virtual machines that run on dedicated processors, which are idle most of the time as well.


When people say "The Cloud" or "The Internet", they think of a huge number of computers standing ready to process their requests.

Which is true, in a sense. Many routers will process their messages, for example. But that's not what they think about.

Like, how do you offload computing to the cloud? As it is: don't bother, unless you already have a server park. Because the only examples I can come up with are MMOGs and Diablo 3.


A better analogue would be SETI@home or Folding@home. But those are really inefficient, and only run on compatible hardware. So the minimum requirement would be a low-level and (somewhat) fine-grained VM. As in: you can upload many simple jobs.

The latency isn't all that interesting, as long as it's less than the total time to completion.
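A back-of-the-envelope model of why (all numbers hypothetical): as long as enough independent jobs are queued up, every node stays busy, so the round-trip latency is paid roughly once, not once per job.

```python
def completion_time(n_jobs, job_seconds, latency_seconds, n_nodes):
    # With the pipeline kept full, latency only delays the first results;
    # after that, compute time dominates the total time to completion.
    compute = n_jobs * job_seconds / n_nodes
    return latency_seconds + compute

# 10,000 one-second jobs on 100 nodes: 2 s of latency is noise
# next to 100 s of compute.
print(completion_time(10_000, 1.0, 2.0, 100))  # prints 102.0
```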
 
I wouldn't say Windows is becoming irrelevant, just less dominant in its market and eventually maybe not dominant at all. The path to irrelevancy is very long and unclear.
 
I'm not sure, Frank. I currently have 4 Xbox 360s, 3 desktops running Windows 8, an HTPC running Windows 8, two Surface Pros running Windows 8 and two phones running Android.

If the Xbox One works as a media center, I will transition the 4 Xbox 360s to Windows 8-type machines, and my two Android phones will most likely be Windows 8 phones by the time of my next upgrade.

If anything, I think the other OSes are becoming irrelevant again after having a few years in the sun.
 

Is this thread about

a) the future of system software?

OR

b) personal likes/dislikes of ecosystem vendors?

OR

c) the pros and cons of distributed computing?

OR

d) something else?
 
Adobe just moved all their products to the cloud, which is subscription-based. I think Apple, Google and Microsoft will likely follow: you'll have to pay a subscription for your Windows or Mac OS, or be bombarded with ads.

I really don't like where the future of computing is going. But the alternative is Linux. I adore the idea, but in over a decade Linux has failed to gain traction in the desktop world. Where is the support from hardware and software vendors? The community can only get you so far.

But I'm moving to Linux whether I like it or not. Windows 8.1 is not an option. The new Mac Pro is really strange, and I picture my floor cluttered with extra external devices to trip over. As for Adobe, I don't like paying a monthly fee for my software. I like to upgrade when they offer new features that I need. I'm not going to give you my money just because you went cloud.

At this point Adobe was my sole reason for remaining with Windows for a few of my workstations, but they went cloud. WTF. My next workstation build will be Linux, with maybe a Windows XP VM for legacy stuff. I'm really not looking forward to my next build, that's for sure. It's a pain to get things working under Linux, but I'd rather suffer through that than put up with this BS from Adobe or Microsoft.

From what I need for my workstations, the future of 'desktop' computing looks really crappy, IMO.
 
Is this thread about

a) the future of system software?

OR

b) personal likes/dislikes of ecosystem vendors?

OR

c) the pros and cons of distributed computing?

OR

d) something else?
Mostly A, but in an abstract way; that's where C comes in. B was mostly an introduction, and it isn't all that personal, that was just to keep it short. If you want the long version, I can do that. ;-)
 
That depends on how you count.

It's hard to get good statistics: some say 60% of all smartphones sold are iPhones, while others say 75% are Android. But they all agree that Android grows faster.

But if you count all devices that have a processor, we see that the vast majority has no operating system whatsoever (tiny microcontrollers). Of the devices that do have one, by far the vast majority uses a Unix derivative, most likely a custom Linux. Most of those are invisible; they're inside random electronics.

Of the more visible ones, which are only a tiny part, we have Windows, iOS and Android in various proportions, depending on where you look or how you count.

This is kinda silly since you just pointed out that the invisible devices are the ones running the OS that no one wants. That makes sense, but is hardly something worthy of accolades.
 
Adobe just moved all their products to the cloud, which is subscription-based. I think Apple, Google and Microsoft will likely follow: you'll have to pay a subscription for your Windows or Mac OS, or be bombarded with ads.

I really don't like where the future of computing is going. But the alternative is Linux. I adore the idea, but in over a decade Linux has failed to gain traction in the desktop world. Where is the support from hardware and software vendors? The community can only get you so far.

But I'm moving to Linux whether I like it or not. Windows 8.1 is not an option. The new Mac Pro is really strange, and I picture my floor cluttered with extra external devices to trip over. As for Adobe, I don't like paying a monthly fee for my software. I like to upgrade when they offer new features that I need. I'm not going to give you my money just because you went cloud.

At this point Adobe was my sole reason for remaining with Windows for a few of my workstations, but they went cloud. WTF. My next workstation build will be Linux, with maybe a Windows XP VM for legacy stuff. I'm really not looking forward to my next build, that's for sure. It's a pain to get things working under Linux, but I'd rather suffer through that than put up with this BS from Adobe or Microsoft.

From what I need for my workstations, the future of 'desktop' computing looks really crappy, IMO.
Yes, I agree. I really think Windows 8 is the wrong direction, and the subscription model is what they all want.

But if you're not running it on your desktop, it probably runs on a Linux machine (except Microsoft Azure, I expect). Your phone and tablet probably use it as well (iOS/Android). And to get that data back and forth, it passes through many routers that also run Linux. So you're using it anyway.

That's why it has become the default OS. The desktop market might be the most visible one, but it's only a tiny part of all those boxes. And it's probably the only one where you might have to pay for the OS as well.
 
This is kinda silly since you just pointed out that the invisible devices are the ones running the OS that no one wants. That makes sense, but is hardly something worthy of accolades.
Well, as it's by far the most used one, many people do seem to want it.

And unless you only look at desktops, it's the most commonly used one on user devices (smartphones, tablets and desktops) as well. iOS uses a BSD kernel, and Android simply uses Linux.

The PS4 also uses BSD, which is very much like Linux but with a less restrictive license.
 
Sorry, but Linux is still a mess if you want anything low-latency; I wouldn't trust it even with streaming audio without glitches.
No matter how many devices run Linux (and many embedded devices like your coffee machine, or time- and safety-critical things like controllers for cars/planes, simply don't), for the desktop it's the desktop that counts.
And there you have MS and Apple, both having control over every part of the OS to make sure user interaction isn't blocked by unrelated services. Getting an OS running is easy; keeping a flock of independent parts behaving nicely all the time isn't!

I work nearly 100% under Linux in my job and the tools are great, the server stuff is great; the desktop isn't!
If it works somehow, it's good enough, and there's no incentive to improve. For example, take a window, enlarge it, shift it to the left, enlarge it further, and keep going; after a few times the window gets garbled and might even crash your X session. This bug has been there for ages and illustrates the issue nicely: no one has thought about what should happen at all, or about which component is responsible for keeping everything within operational limits.

In a sense the GPL totally devalues the product itself and shifts the focus to selling services. No one will invest big amounts of research and time in the desktop if the rest of the world can just rip it off (and I don't just mean the code, but the harder part: finding smart solutions in the first place).
 
My 2c

Love Linux for scheduling work across lots of cores and writing simple C programs (valgrind always brings me back).

While terminal access is great, I hate Linux as a desktop OS. Installing simple updates can easily become a never-ending web of dependencies, drivers are painful, and when things go wrong they go very, very wrong. For a couple of years the desktop experience in Ubuntu was approaching acceptable, but Unity more or less broke that: they managed to change it in a way that leaves it still too hard to use or fix for general users, while making it clumsy for those familiar with Linux.

Linux is a useful tool, and the fact that it's free makes it ideal to put everywhere. It's fair to say that the code running on embedded cores and the code governing HPC clusters aren't the same. More importantly, the code doing the heavy lifting usually isn't open source.

Windows is more or less still the default operating system if you want to do something not related to coding. iOS and Android are more or less the default systems everyone spends their non-work time using.

Anyway, to get back to the original point: I have two laptops and a computer that run Windows, and at least 7 devices (among which one server) that run Linux.
BTW, if that server is the one running your webpage, it's currently broken...
 
Does your iPhone/iPad/Android phone have a desktop? Does it work as expected?

Yes, the Ubuntu desktop experiments of the last years were about as bad as Windows 8. And yes, Windows 8 does work as expected, as long as you know what to expect and where to find it. Just like iOS and Android. And the initial eeePC Linux desktop was a revelation for many people.

Yes, you might want to select your preferred Linux desktop, customize and tweak it until it works nicely, if you're a producer. That's the difference between free and paid for.

Really, I totally don't understand the singular focus on some random free Linux desktop, free as in you didn't have to pay for it, versus that same desktop, customized, tweaked and tuned, where you do have to pay for it.

If you took the time and effort, you could have that free desktop perform just as well. Or, you could simply buy a product that has all that preinstalled.


I mean, Google and Apple did do something; they didn't just pick a random free copy off the shelf without looking and sell it at a big markup, just like that.
 
Generation after generation of phones and tablets practically proclaim their "smoothness" over their predecessors; I got a 4-core tablet and it still stutters now and then.
If you want to make Linux an OS that's as painless and performant as iOS/Windows (low latency, useful process and thread priorities), you have a big task ahead. If you manage to do it, it will have little in common with the upstream Linux kernel, and you won't get your patches into mainline anytime this decade (I doubt iOS has many similarities to BSD by now). The money involved cares about high throughput only, which is pretty much sitting on the same slider as low latency, but at the opposite end.

And then you can start thinking about putting together a nice desktop environment from all the parts.
 
They only have two successful products (Windows and Office)
Arguably the 360 is a huge success. It's not the runaway leader that Windows is in the desktop OS market, but it doesn't have to be to count as successful... :)

Essentially, Microsoft (or rather, Windows) is becoming irrelevant, and Linux and the likes (BSD/OSX/iOS) have become the default OS.
Don't really see that, especially in the case of Linux itself; certainly not right now, and not for a bunch of years either. Linux-like derivatives have met success in some niches (iOS; OSX not so much, because Macs are expensive and OSX is incompatible with Windows, which is the actual default OS). iOS, and Android too for that matter, can't really take over the desktop market yet because they lack comfortable input methods (i.e., a hardware keyboard, and a mouse too, some people would say) and large, comfortable screens.

In some years' time, though, who can say. Few if any could realistically have predicted the explosive success of touch devices these past couple of years, so the future is indeed rather murky, I would say...
 
Google is using the Linux kernel in Android, so it wins.
Isn't Microsoft using some network stack from Linux? So it wins too.

Everyone wins and all are happy.
 