News & Rumors: Xbox One (codename Durango)

Status
Not open for further replies.
I'm talking here about the local speech recognition. Cloud recognition, like the Bing searches, is a lot more robust and surprisingly accurate, even with accents that are quite thick.
I also presume that online recognition involves a lot of learning, so it should improve for all accents with use. That's one area where always-on makes sense and would be very beneficial to the platform.
 
Right, then by that logic, they should divert some of the BOM to Kinect 2, even if it means not allocating as much of the BOM to CPU, GPU, and RAM.

If Kinect v2 is included in every box as we expect it to be, then yes, it probably is. But there's really no point in building it if it's not in every box, as no developer will take it seriously.

It's really important to remember that the ultimate goal of everything Xbox is not gaming, but effectively having a "PC" in your living room. Gaming was a Trojan horse, because gamers will buy more expensive hardware than your average consumer (it's the only way to explain millions of people buying a PS3 at $499 and $599).

If Kinect v2 edges Microsoft just a bit closer to dominance in the living room, they will certainly sacrifice some BOM that could go towards CPU, GPU, RAM to get there because it's where people spend a huge chunk of their time getting their entertainment.
 

The sticking point is that smart devices such as the Galaxy S4 already offer Kinect-like features and near-Xbox 360 graphics, and can be hooked up to a TV.

Hell, Samsung TVs themselves will offer much of the same.
I'm not suggesting Samsung gimmicks will offer complete experience parity with Kinect 2, but it sure does take the uniqueness out of it.
 

Sticking a camera on a TV isn't hard. Interpreting that data under a variety of scenarios is. Not to make this about race, but as an African American I can instantly tell whether someone bothered to do proper testing when the camera can identify me in a "darker than ideal" scenario, or whether they just tested the product on themselves. It sucks sometimes. I often see it in bad camera software that doesn't expose for my face correctly, so all you see are my eyes. :cry:

Anyway, off my soapbox, it's all about the implementation. As of today, no one really questions the voice part of Kinect. It works pretty well. The skeletal tracking just needs to get to the same level.
 
The voice part might just work for you as an American, but as a Brit I find it pretty poor. I guess it just shows that each manufacturer tends to focus all testing on its local market. Almost incredible in today's global economy, but it still seems true.
 
We had tens of thousands of example sets from UK users. We also had pretty poor participation in voice collection from the UK, so yes, while the sample set was large, it was not nearly as large as the US set. You can blame your lazy-ass co-gamers who didn't complete voice collection during the betas :).

Seriously though, you're absolutely correct, we do a bad job on regional accents with the on-machine reco, and the UK has a much greater range of regional accents than the US. It's bad enough that BBC America sometimes subtitles shows that are theoretically in English.
 
Do you have feedback on Kinect's success with thick UK regional accents? Is it okay with Glaswegian English (that most English folk can't comprehend!) or thick southwest accents? Or immigrant English (Indian, Polish accents on English)?
Speaking as a Geordie (north-east England), Kinect has trouble recognizing my accent perhaps 40% of the time or more. I gave up using voice commands with it.
 
They should test Kinect 2 voice recognition with the movie Lock, Stock and Two Smoking Barrels. If it can understand that, then it can understand anything.
 
Does any voice recognition system besides Dragon do learning individualized to specific users?

And I hate how Kinect picks up things said while watching Netflix or YouTube as commands. I pretty much always use a PS3 for those things now.
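For what it's worth, a common mitigation for that kind of false activation is gating commands behind a wake word, so background dialogue from a movie is ignored. A minimal illustrative sketch (not how Kinect actually works internally; the command list and wake word here are assumptions for the example):

```python
# Only treat speech as a command when it begins with the wake word.
WAKE_WORD = "xbox"
COMMANDS = {"pause", "play", "stop"}

def interpret(utterance: str):
    """Return a command only if the utterance starts with the wake word."""
    words = utterance.lower().split()
    if len(words) >= 2 and words[0] == WAKE_WORD and words[1] in COMMANDS:
        return words[1]
    return None  # anything else (e.g. movie dialogue) is ignored

print(interpret("xbox pause"))        # pause
print(interpret("let's pause here"))  # None
```

The trade-off is that every command costs an extra word, but accidental triggers from ambient speech drop dramatically.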
 
Do you have feedback on Kinect's success with thick UK regional accents? Is it okay with Glaswegian English (which most English folk can't comprehend!) or thick southwest accents? Or immigrant English (Indian or Polish accents on English)? Or speech impediments? You'll have the same in the US, of course, and differences between other countries' variations on the same language. Are there statistics on robustness? Voice recognition as an optional extra like Kinect won't adversely impact the experience, but if your platform depends on it and a significant percentage of potential users can't be understood, that's going to present a negative image to the public and deter adoption.

Accent recognition could make for some interesting situations; I hear the My Fair Lady "rain in Spain" level is a chore! I could see some useful applications in helping people learn languages in a more natural sort of way. It could add to the immersion of a game, for sure.
 
new rumors... (some are really silly)

http://translate.google.com/translate?sl=it&tl=en&js=n&prev=_t&hl=it&ie=UTF-8&eotf=1&u=http%3A%2F%2Fwww.4news.it%2F14499-xbox-next-svelati-logo-simboli-e-hashtag-per-latteso-evento-di-maggio-2013.php

[1st Update]

The symbols that appear in the slides are actually coded phrases. It turns out an esoteric programming language was used to encode each message. The translation of each slide is:

1. Storm Clouds

2. More than now

3. Deep Computing

4. Petaflops> Teraflops

While the first three make some sense (storm clouds, more than now, deep computing), the fourth and last phrase looks astonishing.

Talk of petaflops rather than teraflops for the next-gen consoles sounds very strange. A console, even a next-generation one, cannot reach a petaFLOPS, a measure indicating one quadrillion (10^15) floating-point operations performed per second, normally seen only in supercomputers.

A petaFLOPS equals 1,000 teraFLOPS. We may therefore assume this is an elaborate joke by some users. Or will it turn out to be real? Comments are open.
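As a sanity check on those units (a quick illustrative snippet; the console figure below is an assumed ballpark, not a confirmed spec):

```python
# FLOPS unit check: 1 petaFLOPS = 1,000 teraFLOPS = 10**15 ops per second.
PETA = 10**15
TERA = 10**12

petaflops_in_teraflops = PETA // TERA
print(petaflops_in_teraflops)  # 1000

# A hypothetical next-gen console GPU in the low-teraFLOPS range would be
# nowhere near a petaFLOPS on its own.
console_tflops = 1.2  # assumed figure, for illustration only
shortfall = (PETA / TERA) / console_tflops
print(round(shortfall))  # 833
```

So the "Petaflops > Teraflops" slide is, at minimum, three orders of magnitude beyond any plausible local console hardware.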


[2nd update]

According to some rumors, Microsoft is readying an innovative Xbox LIVE architecture that would bring cloud computing to the next-generation console, combined with AMD Fusion hardware for gaming.

In practice, the console would act as a client while a server infrastructure remotely sends pre-computed data down to it, supposedly bringing the combined system up to 1 petaFLOPS!

The system is similar to what already happens with some professional tools such as Octane Render, which can provide CGI-quality renders for users of packages such as 3D Studio Max, Maya, etc.

This is not pure cloud gaming as with OnLive, where all the computational load was carried by the servers, but something in between. That is to say, the Xbox NEXT would have its own computing power, further augmented via dedicated servers.
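The hybrid split the rumor describes can be sketched with purely illustrative code (none of these function names correspond to a real API; this is just the client/server division of labor as described):

```python
# Sketch of the rumored hybrid model: latency-sensitive work stays local,
# latency-tolerant work is computed remotely and merged in when available.

def local_compute(frame):
    # Input handling and rendering must stay on the console.
    return {"frame": frame, "local": f"rendered-{frame}"}

def remote_compute(frame):
    # Latency-tolerant work (e.g. lighting, AI planning) could in principle
    # be precomputed on servers and streamed down.
    return {"frame": frame, "remote": f"lighting-{frame}"}

def merge(local, remote):
    # Combine both results; fall back to local-only output if the server
    # result is missing (offline, lag) or stale (wrong frame).
    out = dict(local)
    if remote and remote["frame"] == local["frame"]:
        out.update(remote)
    return out

print(merge(local_compute(1), remote_compute(1)))
# {'frame': 1, 'local': 'rendered-1', 'remote': 'lighting-1'}
```

The key design constraint such a scheme faces is that anything on the critical path of a rendered frame must tolerate round-trip network latency, which is why only latency-tolerant work is a plausible candidate for offloading.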
 
new rumors... (some are really silly)
No shit...

The translation of each slide is:
If genuine at all, I wouldn't attach any importance to those slides, really. If MS wanted to impart an important message, it would be written in plain text, not hidden in obscure ways. So I'm thinking it's just fake. All fake.

4. Petaflops> Teraflops
Stating the obvious, really. Petaflops ARE greater than teraflops. That doesn't mean Durango would offer petaflop performance, because IT CAN'T. Not by itself (which would be ludicrous; you need an entire datacenter's worth of computing hardware, power, networking, cooling, etc. to even get close), and not through some kind of ridiculous cloud-computing setup either, of course. That would be ludicrously expensive, and just totally impossible.

Building, running, and maintaining even a single petaflop's worth of computing costs millions of dollars. Trying to do so for a console destined to sell tens of millions of units would bankrupt Microsoft in roughly 15 minutes.

...So, NO. I'm sorry.

This is impossible.

So, again: FAKE. Or else MS is just trolling people like you, and at least I personally consider them above that sort of thing. So the possibility that remains is fake.
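The cost argument above can be sanity-checked with a back-of-envelope calculation. Every figure below is a deliberately hypothetical assumption (the post only says "millions of dollars" per petaflop and "tens of millions" of units); the point is the order of magnitude, not the exact price:

```python
# Back-of-envelope cost of dedicating 1 PFLOPS of servers to each console.
PFLOPS_PER_CONSOLE = 1.0          # the rumored target
CONSOLES_SOLD = 50_000_000        # "tens of millions of units" (assumed)
COST_PER_PFLOPS_USD = 5_000_000   # "millions of dollars" per PFLOPS (assumed)

total = PFLOPS_PER_CONSOLE * CONSOLES_SOLD * COST_PER_PFLOPS_USD
print(f"${total:,.0f}")  # $250,000,000,000,000
```

Even granting that users would share servers and only a fraction would be online at peak, the number would have to shrink by several orders of magnitude before it stopped being absurd, which is the poster's point.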
 
So that's why the voice reco works better on Bing with my US account than with my SG account.

Hmm, what voice recognition is used on an SG account? It's just automatically disabled when I'm online, but works fine when I'm offline.

But does every Kinect game have its own voice reco? Because I can use voice commands in Kinect games while on an SG account, offline or online, although the recognition quality is better on the dash than in games.

BTW, the calibration procedure for Xbox 360 Kinect voice commands is VERY important. I wonder how Microsoft will handle that on the next Xbox. Adaptive learning?
 
new rumors... (some are really silly)

Love it, love it, love it, love it to death, it's effin' hilarious to be honest.

Storm clouds are approaching.

These are the codes where it says this, but I can't decipher them tbh.
[four attached images of the coded slides]
 