Why limit console functionality?

The Dreamcast didn't have a hard drive and Sega didn't have an OS team, and I think they did fairly well.

They had a rather functional web browser (especially version 3.0), could send email, and I believe in Japan there was even a printer attachment and a word processor, or at least one was planned.

I say either make simple stand-alone programs, as on the Dreamcast (which may have been running Windows CE for the web browser, and which might be a good model for what a console OS should be like), or adopt Linux, modify it to suit your console, and then port Linux software.
I don't think there would have to be any problems with having an OS that does everything a PC can, just as long as it doesn't have to do it all at once. A fully featured multitasking OS probably just isn't necessary for most people; at most, maybe a word processor opened concurrently with a web browser, but I don't think there would be any need for a desktop. For the most part, I'd say applications could be done without needing a fully fledged OS. (Or, if an OS does degrade performance, be like the Dreamcast and have an OS for applications and a simpler OS for games.)

BTW, as long as the OS can't download and execute files (so software can only be added from a disc), I'd say it's pretty safe.

Do you really think that if an OS was easy to do, Microsoft would own 90% of the market? Every Tom, Dick, and Harry would be making an OS product.

Back in the day, they did.
For that matter, there's about a zillion different versions of some type of 'nix out there.

BTW, the neat and tidy worry-free closed system will disintegrate when people try to connect a printer to their console... and a digital camera... and a camcorder... and a scanner... and a webcam, etc. Who will write drivers for all of that? Make sure that everything works fine together? Who is going to provide updates for the OS, as well as anti-virus and anti-spyware tools? Because once the user base increases, those threats will undoubtedly come along. Finally, who is going to pay for this? The manufacturer, who is already losing money on hardware, or the consumer, who can already get a PC for the same amount?

Well, just as not just anybody can create a game or an accessory for a console, there can be licensed applications and hardware that will be made to spec and tested to be compatible. If it doesn't have the proper license and format, it won't even run on the system.

The OS thing seems workable to me: you don't make an OS to run anything that could possibly exist; you define what you want it to do from the start, make sure it does those things simply, and don't worry about expanding on it.
A console OS would only need a web browser, word processor, chat client, and maybe a simple picture-editing program to do well, and the Dreamcast managed a web browser, email client, and chat client (and a word processor was at least planned).
The Dreamcast could also play DivX, but that wasn't an official program, and it wasn't integrated into one program the way the web browser, email client, and chat client were.
An OS can exist and be very useful without being free and open.

There is a reason why MS is on version 6 of its browser and why there are thousands of patches for it.

Yet I know people who still have Internet Explorer 4.0 or Netscape and do just fine navigating the web and going to all their favorite sites.
 
They had a rather functional web browser (especially version 3.0), could send email, and I believe in Japan there was even a printer attachment and a word processor, or at least one was planned.

And the DC failed. The web browser was nice, and I did use it on the DC here.

Back in the day, they did.
For that matter, there's about a zillion different versions of some type of 'nix out there.
And how many of those back in the day had GUIs and operated on anything as complex as current PCs or a Cell setup?

Yet I know people who still have Internet Explorer 4.0 or Netscape and do just fine navigating the web and going to all their favorite sites.
And hopefully those sites never update. Not to mention all the holes and security risks they are leaving themselves open to.
 
So the topic creator wants to turn consoles into PCs. I don't see a problem with hooking up consoles to a monitor; poor man's HDTV right there. But if I wanted PC functions on a console, I would buy a PC. Consoles = gaming machine first, all the other stuff second.
 
Kill_Jade said:
So the topic creator wants to turn consoles into PCs. I don't see a problem with hooking up consoles to a monitor; poor man's HDTV right there. But if I wanted PC functions on a console, I would buy a PC. Consoles = gaming machine first, all the other stuff second.

Eh, I don't think so.

Eventually I think you'll see consoles start replacing PCs. People who really need to do important things with their PCs will have workstations; people who just want to simply and easily check their email will use consoles. And if anything, I think PCs could use quite a bit of dumbing down and simplifying for the average user. Many things could be automated or eliminated, as one default choice would be chosen most of the time anyhow, or the user wouldn't know what the option is for. I think a vastly streamlined OS and software could go over quite well; maybe use TiVo as an example. I've never used a TiVo, but I've heard it automatically makes selections for people based on what they watch, and that could be quite a good thing to add to a web browser.
 
Fox5 said:
Kill_Jade said:
So the topic creator wants to turn consoles into PCs. I don't see a problem with hooking up consoles to a monitor; poor man's HDTV right there. But if I wanted PC functions on a console, I would buy a PC. Consoles = gaming machine first, all the other stuff second.

Eh, I don't think so.

Eventually I think you'll see consoles start replacing PCs. People who really need to do important things with their PCs will have workstations; people who just want to simply and easily check their email will use consoles. And if anything, I think PCs could use quite a bit of dumbing down and simplifying for the average user. Many things could be automated or eliminated, as one default choice would be chosen most of the time anyhow, or the user wouldn't know what the option is for. I think a vastly streamlined OS and software could go over quite well; maybe use TiVo as an example. I've never used a TiVo, but I've heard it automatically makes selections for people based on what they watch, and that could be quite a good thing to add to a web browser.

I'm not saying consoles shouldn't have basic PC functions, but gaming should always be the main priority. Why would it be any easier to check email on a console than a PC? Why divide functions? Why do important functions on the PC, then switch over to a console to check email, if it can all be done on the PC in one go?

My whole point is that adding all the PC functions basically turns consoles into PCs. Make it streamlined and simple and I can see it working, like you said. I wouldn't mind checking my email through an online service like Live, though :)
 
jvd said:
Back in the day, they did.
For that matter, there's about a zillion different versions of some type of 'nix out there.
And how many of those back in the day had GUIs and operated on anything as complex as current PCs or a Cell setup?
MS has an advantage in time, but it took them years to catch up with the competition in many ways. I'd love to know how AmigaDOS and Workbench would have developed if the company had been properly managed. In the late 80s I was multitasking 30 applications in 1 meg, including extremely processor-intensive tasks, without any crashes or glitches.
A LOT of development goes into Windows, but it's a badly designed piece of software. Start again from scratch and things are a lot easier, plus you can design the system around a new structure that's very hack-resistant. That's my proposition. You seem to think every OS will have the same weaknesses and that it's only the widespread use of Windows that attracts hackers. I say a different OS structure would eliminate 99% of potential threats and problems by simply not having them.

Anyway, I can't see why people are seemingly getting irate over this. It's just an idea. It probably won't happen, but that doesn't stop people discussing whether or not it is a good idea. Some here think it's a good idea. Some say it'll never work. Each to their own.
 
Shifty Geezer said:
In the late 80s I was multitasking 30 applications in 1 meg, including extremely processor-intensive tasks, without any crashes or glitches.

So was I, but I don't view the Amiga through rose-coloured glasses.

No memory protection meant that whenever I made a mistake in my code, poof went the machine... "Guru Meditation". I saw those a lot, far more than I ever saw trap screens after I switched to OS/2, or bluescreens after I dumped OS/2 and switched to NT4.

These days, my XP/2003 systems run for months, until I ask them to reboot (for whatever reason). I can't recall the last time I encountered a bluescreen that wasn't a hardware or driver problem or my own damn fault.

All I do is exercise a little caution and some common sense, and so far I've never had any spyware (other than cookies), no viruses, and I don't reinstall Windows every six months either. (Unlike some people.)

I just have my router, and the built-in firewall. I run a virus scanner, but it's never had to clean any files.

A LOT of development goes into Windows, but it's a badly designed piece of software.

You'd be surprised. Over the years, I've learned a lot about the insides of the Windows NT-derived OSes, and the kernel is, quite frankly, really well designed.
 
Well, that is surprising to hear. Given a lot of what I've read from other programmers, including Gates's claim that Windows is a monolithic design and that he doesn't care for software engineering principles, and given the crazy way installing one piece of software can randomly screw up another piece of software, I find it hard to believe anyone sat down and designed the OS to work this way!

Regarding AmigaDOS, it wasn't perfect by any stretch, and I view modern OSes as much improved. BUT it was 20 years ago. It took MS some 10 years to produce something with similar functionality. If Amiga had been properly developed, its OS would be a much more advanced system than what it started with in '85. And that's only one example. There was also OS/2 and RiscOS. I don't know about others, but AFAIK all of these (certainly Warp 4) were better OSes than Windows, but MS won through business strategies, not the best product, just as the PC platform did. We haven't got the advancement of the best system. We have the advancement of the worst-but-most-popular system :p
 
Shifty Geezer said:
Well, that is surprising to hear. Given a lot of what I've read from other programmers, including Gates's claim that Windows is a monolithic design and that he doesn't care for software engineering principles, and given the crazy way installing one piece of software can randomly screw up another piece of software, I find it hard to believe anyone sat down and designed the OS to work this way!

re: Installing one piece of software screwing up another piece of software.

This is the fault of the software you're installing. Windows is broken up into things called DLLs.

DLLs are a very practical and useful idea: You take a piece of code that is commonly useful, and you put it in a shared module so everyone can use it. All major OSes have similar ideas - AmigaOS had .library files.

The problem is on Windows, lazy or incompetent developers will replace shared DLLs willy-nilly, without checking for the version first. (Even Microsoft itself is occasionally guilty of this.)

Since the PC market is open, and anyone can write software for Windows and put it up for download or sell it, there's a lot of badly written software out there that will break things.
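
To make that concrete, here's a minimal sketch (not any real installer's code; "shared.dll" and the paths are placeholders) of the version check a well-behaved installer should do with the standard Win32 version APIs before overwriting a shared DLL:

#include <windows.h>
#include <cstdio>
#include <cstdlib>
#pragma comment(lib, "version.lib")

// Read a module's 64-bit file version, or 0 if it carries no version info.
static ULONGLONG DllVersion(const char* path)
{
    DWORD ignored = 0;
    DWORD size = GetFileVersionInfoSizeA(path, &ignored);
    if (size == 0) return 0;

    void* data = malloc(size);
    ULONGLONG version = 0;
    VS_FIXEDFILEINFO* info = NULL;
    UINT len = 0;
    if (data && GetFileVersionInfoA(path, 0, size, data) &&
        VerQueryValueA(data, "\\", (void**)&info, &len) && info)
    {
        version = ((ULONGLONG)info->dwFileVersionMS << 32) | info->dwFileVersionLS;
    }
    free(data);
    return version;
}

int main()
{
    // "shared.dll" stands in for whatever module the installer ships.
    ULONGLONG installed = DllVersion("C:\\Windows\\System32\\shared.dll");
    ULONGLONG shipping  = DllVersion("shared.dll");

    if (shipping > installed)
        printf("Ours is newer - safe to copy over the installed DLL.\n");
    else
        printf("Installed DLL is the same or newer - leave it alone.\n");
    return 0;
}

The lazy installers I'm talking about skip exactly this comparison and just copy their (often older) DLL over whatever is already there.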

re: Windows Monolithic design.

Everyone's misquoted Gates on this.

IE itself is composed of DLL modules, just like everything else in the system. That's how people use it inside their applications. That means any developer can load and use pieces of IE to do things like download files using HTTP or FTP, render HTML, execute scripts, open SSL connections, and so on.

Lots of developers do this.

Gates is saying that if you remove IE, you will break all sorts of things. Because all sorts of people depend on IE being there, because it's a system service. The help system uses IE. Winamp uses IE. Realplayer uses IE. AOL uses IE. Etc etc etc.

Think of IE as basically like DirectX for the Internet. It's a software layer that developers can use in their own apps for interfacing with the Internet, without worrying about all the details of how HTTP works, or how FTP works, or how HTML works or whatever.
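
For example, here's a minimal sketch of that reuse (the URL and file path are placeholders): urlmon.dll, which ships as part of IE, exposes URLDownloadToFile, so any application can fetch a file over HTTP without writing its own networking code.

#include <windows.h>
#include <urlmon.h>
#include <cstdio>
#pragma comment(lib, "urlmon.lib")

int main()
{
    // URLDownloadToFile lives in urlmon.dll, a component of IE. Calling it
    // pulls in IE's HTTP machinery - no networking code of our own needed.
    HRESULT hr = URLDownloadToFileA(
        NULL,                           // no controlling IUnknown
        "http://example.com/file.txt",  // placeholder URL
        "C:\\Temp\\file.txt",           // placeholder destination
        0,                              // reserved, must be zero
        NULL);                          // no progress callback

    printf(SUCCEEDED(hr) ? "Downloaded.\n" : "Download failed.\n");
    return 0;
}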

The IE that you think of when you use a browser is just a default shell around these features. Other shells exist: Maxthon and AOL for example.

Almost every major feature in Windows is designed like this: a layer of APIs available to developers so they can incorporate it into their own applications.

WMP is designed like this too. What you see as WMP is really just a shell that exposes features implemented by DirectShow and DirectX, which any other program can use (and frequently do).
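
As a sketch of what I mean (the media path is a placeholder, and error checking is omitted for brevity), here's the classic way any program plays a file through DirectShow, the same layer WMP's shell sits on:

#include <windows.h>
#include <dshow.h>
#pragma comment(lib, "strmiids.lib")
#pragma comment(lib, "ole32.lib")

int main()
{
    CoInitialize(NULL);

    // Ask DirectShow for a filter graph; it picks codecs and renderers itself.
    IGraphBuilder* graph = NULL;
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void**)&graph);

    IMediaControl* control = NULL;
    graph->QueryInterface(IID_IMediaControl, (void**)&control);

    graph->RenderFile(L"C:\\Media\\clip.avi", NULL); // placeholder path
    control->Run();                                  // start playback

    Sleep(10000); // let it play for ten seconds, then tear down

    control->Release();
    graph->Release();
    CoUninitialize();
    return 0;
}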

That's called Software Engineering, and yes, MS has of course screwed things up here and there, but by and large, I think they do decent work.

If Amiga had been properly developed, its OS would be a much more advanced system than what it started with in '85. And that's only one example.

Actually, there were a lot of fundamental problems with the way Amiga OS worked that prevented it from ever getting real memory protection. A lot of these problems were due to basic tradeoffs made when it was designed to make it faster (like the way the message passing worked in Intuition).

I'm not going to argue that the world wouldn't have been a better place if Amigas had taken over, but I will have to say it is far from certain that it would be.

There was also OS/2 and RiscOS. I don't know about others, but AFAIK all of these (certainly Warp 4) were better OSes than Windows, but MS won through business strategies, not the best product, just as the PC platform did.

I ran and developed on OS/2 for many years (all the way up until OS/2 4.0 Beta), and there were a lot of flaws in that system that IBM simply never got around to fixing properly, which really frustrated me, especially as I saw NT getting better and better.

One core problem with OS/2 was called the "synchronous system input queue". Basically, in every modern GUI operating system, when the user sends input, the system creates something called a "message" and sends it to the applications to process. So clicking a mouse, for example, tells the OS to send a "CLICK" event to the applications.

The fatal flaw in the OS/2 system was that the OS would WAIT for each application to respond to the "CLICK" before going on to the next message.

This meant that if a program hung and stopped responding to messages (for whatever reason), the entire desktop would freeze and stop responding.

Win3.1 had the same issue (and many far worse than that), but while IBM diddled around and never really looked at it, MS went ahead and fixed it in NT.
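
To picture the difference, here's a bare-bones Win32-style sketch (the class and window names are made up) of the asynchronous model NT uses: each thread pulls messages off its own queue, so a handler that hangs only stalls its own window, instead of the whole desktop the way OS/2's single synchronous queue did.

#include <windows.h>

LRESULT CALLBACK DemoWndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_LBUTTONDOWN:
        // If this handler blocked forever, only THIS thread's queue backs up
        // on NT; under OS/2, the system-wide queue backed up instead.
        MessageBeep(MB_OK);
        return 0;
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int nShow)
{
    WNDCLASSA wc = {0};
    wc.lpfnWndProc   = DemoWndProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = "DemoClass";
    RegisterClassA(&wc);

    HWND hwnd = CreateWindowA("DemoClass", "Demo", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 320, 240,
                              NULL, NULL, hInst, NULL);
    ShowWindow(hwnd, nShow);

    // The classic pump: pull messages off this thread's own queue and dispatch.
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return 0;
}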

There were many other irritating problems in the OS/2 kernel that IBM never got around to fixing either. The upper layers of OS/2 were 32-bit, but core pieces like the file system (HPFS) never made it to 32-bit until well after NT4 came out and just decimated OS/2. (Well, there was a 32-bit version of HPFS, but I think you could only get it if you bought an OS/2 server edition.)

We haven't got the advancement of the best system. We have the advancement of the worst-but-most-popular system :p

I completely disagree.

With NT/2000/XP/2003, we have the advancement of probably one of the better designed systems (at least at the kernel level), which also happens to be the most popular today.
 
Sorry guys, don't wanna ruin the party, but... Console forum, console thread...

I do agree that XP has never given me a problem that wasn't caused by something else, either hardware or programs (funnily enough, Firefox has given me lots of problems, all fixed now).
 
Kill_Jade said:
So the topic creator wants to turn consoles into PCs. I don't see a problem with hooking up consoles to a monitor; poor man's HDTV right there. But if I wanted PC functions on a console, I would buy a PC. Consoles = gaming machine first, all the other stuff second.

No, I don't want to turn consoles into PCs. I don't want consoles to be upgradable or to let anyone put just any program on there. I just want the machine to be used to its full potential without sacrificing the core functions of the console.
 
May I ask why people want to segregate their electronics into two different platforms? If you have one box with a CPU, RAM, and storage that does games, and another box with a CPU, RAM, and storage that surfs the net, writes letters, prints and stuff... and one of those boxes is far cheaper and more powerful than the other...

If you had a choice between a 3-core, 3.5 GHz monster and an expensive, less powerful machine, why would you not want to exploit the hardware in the cheaper, better machine?

Personally I have a PC upstairs and a console downstairs, like many, I imagine. I wouldn't have any qualms about replacing my PC with another console if it offered the same functionality and more power at a lower price than a PC upgrade.

Regardless of whether it is or is not possible, and regardless of OS and software issues, is there actually a reason why the same basic design (CPU, RAM, etc.) should be confined to doing one task only? Would you also want one computer for pictures, one for the internet, one for word processing? Limiting the hardware to just one application seems like a terrible waste to me. I don't understand why some would be averse to using cheaper, more powerful hardware :?
 
SG, I see consoles and PCs as just fundamentally different, though they technically use the same parts to do different functions. Some members in the thread mentioned using the DC for browsing the web, chatting, and sending email. While the DC did some internet-related applications fine, I never saw it replacing the PC, simply because it didn't do multitasking well. On the PC I can run IE (yes, I said it), Winamp, Trillian, and PSP all at the same time and accomplish several things. If a console were able to do that, would it really be a console at that point? My other main reason for not wanting a console to do a PC's job is that you can't upgrade a console, per se. Say Adobe is running sluggishly on a console, or I want to beef up overall performance: I'm stuck with exactly what I have, whereas on a PC a new CPU, RAM, or video card can fix those issues. Like I said before, once a console does that with upgrading, it's not a console. Not sure my point was made clear, but the reason why consoles have limited functionality in comparison to a PC is due to their fundamental principles of usage and design. A console is not designed for change, whereas a lot of things in the PC world demand it.

Bringing this little debate back to a gaming perspective, I can offer my own little view on why consoles have limited functionality with controllers. A few months before Halo 2 came out, SmartJoy announced the SmartJoy FRAG adapter. Within a matter of days, most of the Xbox forums were littered with whiny kids complaining about how a player using a KB&M would unlevel the playing field. FYI, I don't believe using a KB&M is unfair; it's like comparing a steering wheel to a controller for a racing game. In the case of console games that would benefit from this method of control, like an FPS or RTS, does this Pandora's box really need to be opened? Sony could have encouraged more developers to support it in shooter games, but outside of UT I can't think of many games on the PS2 that allow people to use their KB&M. My experiences of KB&M vs. controller have led me to believe the big three console manufacturers would rather keep the masses happy than a smaller PC minority who already have their own place. The Xbox 360 is another indication that they don't wish to use KB&M in games: they talk about how the USB ports can be used, but make no mention of ever using them for something as obvious as a KB&M.

BTW, whoever said the PC monitor is a poor man's HDTV needs to learn a thing or two. PC monitors have been ahead of TVs for years when it comes to visual quality. Sure, monitors aren't big, but I'd take a 20" monitor with a TV tuner over a TV of the same size any day.
 
The reason why is because right now, consoles have no problems running what they need to run: games.

If you open up the floodgates, the console will be bogged down and opened up to viruses, spyware, worms, and hacking attempts.

Which will take away its main strength against PCs: the fact that it's a stable device that runs games with no hassles.
 
Exactly. People eager to escape the "evils" of the PC by going to consoles will soon be beset by many of the problems that plague the PC world.
 
You don't need to boot into the OS to play games, though. You have the console boot into games mode with a game inserted, or boot off the HD without a game inserted to load your word processor.

Enabling PC-like functionality doesn't need to touch the game capabilities of a console. All the APIs etc. are hard-coded into the machine, so you couldn't mess it up anyway.
 
You can mess up a lot of things. But here's an example:

Say I'm doing a spreadsheet, then decide I want to play a game. Do I have to save my work, then shut the thing down and load up in game mode?

That won't go over well.

Also, if the OS is hard-coded, it's going to be hard to update it and make changes to run newer software.
 
But I want to listen to my MP3 files while I am copying the Excel spreadsheet, along with a picture I just found on the web, into a presentation, all while talking on IM.
 
Geeforcer said:
But I want to listen to my MP3 files while I am copying the Excel spreadsheet, along with a picture I just found on the web, into a presentation, all while talking on IM.

And why wouldn't you be able to do that?
 