The X360 wi-fi USB adapter is not that bad of a deal...

It's still $60+ more than you need to spend.

A $40 wireless bridge does the same thing; it just plugs into your ethernet port instead of the USB port.
 
Part of the reason for the inclusion of the "a" standard is that it's the only "supported" wireless transport for MCE.
 
Powderkeg said:
It's still $60+ more than you need to spend.

A $40 wireless bridge does the same thing; it just plugs into your ethernet port instead of the USB port.

I know.

Just pointing out that the comparisons people are making to $15 USB adapters aren't valid, since those are wireless-G only, not A. Compared to similar hardware it's a fairly reasonable price, certainly not the 500% markup it seems at first.

Rockster - I didn't realize that. It's pretty lame we're forced to pay that much, especially since most people won't use A, but I guess that's what the wireless bridge option is for.

Q: Does this mean that if I intend to use MCE with my x360 wirelessly, I'll have to use a wireless-A router? That sux....
 
'A' is, I believe, the most 'secure' of the current Wi-Fi standards, which is of course why Microsoft is all over it. It's popular (relatively) in office environments and operates on the 5 GHz band. Its range totally sucks compared to b and g, by the way; it's more or less a failed Wi-Fi standard. But if it's the 'official' Wi-Fi of Windows Media Center, that would explain a lot of the reasoning behind this otherwise bizarre move.
 
Ya, if that's the case I'm just gonna screw the whole wireless thing. Too much money right now, and I really want it to work with MCE, so I'll just stick to the trusty cable. It's faster anyway...
 
People have been saying this forever. Nobody wants to listen, though. Anyway, I would rather use an ethernet bridge and not tie up my USB port in the back. I'd rather have that free for the camera that's coming out. :p
 
xbdestroya said:
'A' is, I believe, the most 'secure' of the current Wi-Fi standards, which is of course why Microsoft is all over it. It's popular (relatively) in office environments and operates on the 5 GHz band. Its range totally sucks compared to b and g, by the way; it's more or less a failed Wi-Fi standard. But if it's the 'official' Wi-Fi of Windows Media Center, that would explain a lot of the reasoning behind this otherwise bizarre move.
Really? I thought A was the only one capable of hi-def streaming? But maybe I've mixed it up with G?

.Sis
 
Hardknock said:
People have been saying this forever. Nobody wants to listen, though. Anyway, I would rather use an ethernet bridge and not tie up my USB port in the back. I'd rather have that free for the camera that's coming out. :p

Well you would have 6 ports on the PS3 if you wait [/fanb0y]

No seriously, that price is still too high to me.
 
mckmas8808 said:
Well you would have 6 ports on the PS3 if you wait [/fanb0y]

No seriously, that price is still too high to me.
Ok then. What is a fair price? 80 bucks? 60? Should it be priced exactly at the MSRP of its PC counterpart?

.Sis
 
xbdestroya said:
'A' is, I believe, the most 'secure' of the current Wi-Fi standards, which is of course why Microsoft is all over it. It's popular (relatively) in office environments and operates on the 5 GHz band. Its range totally sucks compared to b and g, by the way; it's more or less a failed Wi-Fi standard. But if it's the 'official' Wi-Fi of Windows Media Center, that would explain a lot of the reasoning behind this otherwise bizarre move.

Since 802.11a operates in the 5 GHz band, it is less susceptible to RFI/EMI from household wireless products like 2.4 GHz cordless phones, mobile phones, microwaves, and neighbouring wireless networks compared to 802.11b or 802.11g. All of these can impede throughput in a b/g network, which makes consistent streaming of high-bitrate media almost impossible.

Typical throughput for both 802.11a and 802.11g is around 25 Mbps, but only 'a' can offer that reliably. For example, the mere presence of an 802.11b network can drop an 802.11g network to 'b' speeds. Microwaves are some of the worst offenders, though: if you are using a high-numbered channel, say 9, they can drop the throughput of an 802.11b network by 60% from a distance of 20 feet!
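
To put rough numbers on that, here's a back-of-envelope sketch. The ~25 Mbps typical throughput and the ~60% microwave drop are the figures quoted above; the 'b' real-world rate and the HD stream bitrate are illustrative assumptions:

```python
# Back-of-envelope effective throughput under interference.
TYPICAL_MBPS = {
    "802.11b": 5,    # ~5 Mb/s real-world out of 11 Mb/s nominal (assumption)
    "802.11g": 25,   # ~25 Mb/s real-world, per the figure above
    "802.11a": 25,   # ~25 Mb/s real-world, per the figure above
}

def effective_mbps(standard, interference_loss=0.0):
    """Throughput after a fractional interference loss (0.0 to 1.0)."""
    return TYPICAL_MBPS[standard] * (1.0 - interference_loss)

HD_STREAM_MBPS = 8  # assumed bitrate for an HD media stream

# A 'b' network next to a running microwave (the ~60% drop above):
print(effective_mbps("802.11b", 0.6))  # 2.0 Mb/s -- hopeless for HD
# A 'g' network knocked down to 'b' speeds by a neighbouring 'b' device:
print(effective_mbps("802.11b"))       # 5.0 Mb/s -- still short of 8
# An 'a' network on the relatively empty 5 GHz band:
print(effective_mbps("802.11a"))       # 25.0 Mb/s -- plenty of headroom
```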
 
Sis said:
Really? I thought A was the only one capable of hi-def streaming? But maybe I've mixed it up with G?

.Sis

Well, both A and G have the same data throughput capabilities... so both! ;)
 
Mmmkay said:
Since 802.11a operates in the 5 GHz band, it is less susceptible to RFI/EMI from household wireless products like 2.4 GHz cordless phones, mobile phones, microwaves, and neighbouring wireless networks compared to 802.11b or 802.11g. All of these can impede throughput in a b/g network, which makes consistent streaming of high-bitrate media almost impossible.

Typical throughput for both 802.11a and 802.11g is around 25 Mbps, but only 'a' can offer that reliably. For example, the mere presence of an 802.11b network can drop an 802.11g network to 'b' speeds. Microwaves are some of the worst offenders, though: if you are using a high-numbered channel, say 9, they can drop the throughput of an 802.11b network by 60% from a distance of 20 feet!

That's certainly the hype - but I have to say, those considerations aside, I'd rather go with a much less expensive b or g setup. A's range is also total ass, so that's something else to consider.

I mean, there are benefits to it - I'm not saying there aren't. But there are drawbacks as well.
 
$100 for a wireless USB adapter sounds high. They'll surely make a profit from it, and if that's where MS thinks it has to charge high prices, there's little we can do about it. But I'd rather spend $100 on a wireless adapter than an extra $10 on every new game I purchase.
 
Support for "a" protocol is good as an option but most people already have "b/g" routers for them it does not makes sense. For people who will buy new Wifi routers are going to buy "g" mostly because "b/g" are more widely accepted than "a"
 
Crystalcubes pointed out the main concern: how many people have 'a' networks/routers? On top of the $100 XB360 WiFi adapter, you'll need a WiFi 'a' router/bridge device. The only a+g combo router at eBuyer is £72 (~$130), so that's roughly $230 all told for WiFi.

Sounds like an extremely expensive solution, making WiFi on the XB360 unlikely for most.
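
Spelling that total out (the exchange rate is just the rough conversion implied by the ~$130 figure above):

```python
# Rough all-in cost of wireless-A on the XB360, per the figures above.
adapter_usd = 100   # the $100 wireless adapter
router_gbp = 72     # cheapest a+g combo router at eBuyer
gbp_to_usd = 1.8    # rough rate implied by the ~$130 quote (assumption)

print(adapter_usd + router_gbp * gbp_to_usd)  # ~230 dollars for WiFi
```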
 
Let's not go overboard here bashing MS; we don't know for sure that it isn't possible to stream media without "a" compatibility. Let's wait and see before we condemn them, alrighty? :D

Even if b and g are less reliable, you can still make up for any intermittent transmission dropouts by buffering more on the receiver, and the x360 has half a gigabyte to fill up. Shouldn't be a real problem.
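
For a sense of scale, a quick sketch of that buffering argument (the buffer slice and stream bitrate are illustrative assumptions, not anything known about the console's actual allocation):

```python
# Seconds of playback a receive buffer can ride out during a dropout.
def buffered_seconds(buffer_mb, stream_mbps):
    """Playback time held in a buffer of buffer_mb megabytes."""
    return (buffer_mb * 8) / stream_mbps  # megabytes -> megabits

# Assumed values: a 64 MB slice of the 512 MB of RAM set aside for
# buffering, and an 8 Mb/s HD stream (both purely for illustration).
print(buffered_seconds(64, 8))  # 64.0 seconds of cushion
```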

And great first post from Mmmkay, thanks dude!
 
11 Mb/s is more than enough for most people's DSL/cable connections, and really enough for any one single person - 11 Mb/s being roughly equivalent to the amount of analog data coming in through our senses every second.

MS's choice of standard seems like a typical wooden-headed "big corporate decision": stick to a standard, even when that standard is not well suited to a particular use.
 
I would expect our eyes alone to produce a considerably higher data flow than that, considering the pretty high resolution and color precision. :) Besides, we have a couple million nerve sensors in our skin that measure touch, pressure, temperature, pain, etc...

And talking about analog data in megabits/sec is somewhat of an oxymoron anyway... :LOL:
 