NVidia's Dispute over Nehalem Licensing - SLI not involved

so what is "typical netbook usage" when you're in the game to "redefine netbook usage"?
 
But I don't believe MobileMark stresses anything the ION is marketed for, namely video acceleration and 3D acceleration. And as far as I can see the only video test in MobileMark is DVD playback, which I "thought" current netbooks could handle.

If all you are going to use is applications like those in MobileMark, why pay 33% more for the ION chip and accept marginally higher power usage?

If you plan to use the features Ion is marketed for, then I expect battery life to go down a bit further. But then again, considering that's not something the 945GSE is capable of, that itself represents a possible niche market for it. At least until Poulsbo shows up.

Regards,
SB
 
I'm not sure you can consider that FUD as it wasn't meant for public disclosure or marketing. Likewise many of the claims are referenced back to reliable sites (Xbit labs, Apple, and NVIDIA for example).

The biggest questionable claim I see in there is a rather vague and nebulous "purpose built design & validation." Likewise there's no way to quantify whether there is a large market for "gaming" on a nettop/netbook. But I personally doubt it. HD video on the other hand might have more traction (which is why Intel is including it in Poulsbo).

And that's coming from someone who doesn't even particularly "like" Intel, although I do respect them.

Regards,
SB
 
NVidia's death by powerpoint round 2

http://www.fudzilla.com/forum/index.php?topic=499.0

The thing I don't get is: who do both companies think they're kidding? They're pretending these slides are aimed at the industry when really they're marketing to us, the influential geeks, via the technology press's advertising $. JHH crying like a spoilt child about how consumers of netbooks are being denied 1080p video and Crysis gameplay by Intel's unfair bundling/pricing of Atom is aimed solely at us.

OEMs/ODMs don't fall for this crap, do they?

Jawed
 
it's a PC, its usage is only defined by its user, no matter the market segmentation the vendor has planned.

If I'm to spend a few hundred on a netbook I would expect at least some capability, and the choice is between two IGPs anyway: an old, crumbling Intel one, or an nvidia one with nvidia drivers that would run games more reliably and with better quality. I'd be someone who only owns a netbook, not a bigger laptop nor - gasp - a smartphone.
so if it allows me some quick gaming or LAN gaming alongside taking notes and doing geeky stuff, the better. either on its own, or connected to a CRT monitor w/ or w/o a full keyboard, or using a gamepad, whatever.

on the hardware side it's better at gaming than an iPod or Sony PSP..
and it can run thousands of games and apps without the need for hacks, 'cos it's just a PC.
 
in one of those slides published by fudzilla
http://img7.imageshack.us/img7/4360/nvidiaslide10ol5.jpg

will we see products with a VIA CPU and an nvidia chipset, or is that a remnant of the so-called "threat to go with VIA"?
a 45nm single core VIA nano would be a good netbook CPU, it would make better sense than Atom for gaming, while a dual core nano w/ nvidia IGP would serve well in a laptop or entry desktop.

that 2nd generation ION reference hints at a GT2xx IGP, which I'd expect to be a 40nm chip with one 24SP or 32SP multiprocessor, and DDR3 memory controller. Notice how all the intel brands quoted refer to FSB based processors.
 
Dual-core 45/40nm VIA Nano is expected in 4Q09, same time as Ion2. Hopefully both will be able to hit the 2010 Back-to-School cycle. Presumably the single-core version for netbooks will initially just be the same chip with one of the two cores disabled; if both cores are functional, the lower-power one could be used for ultra-low-power SKUs. I'd expect a die size around 50mm², but who knows. Either way, the dual-core Nano would certainly get rid of the "Ion is CPU-limited in most games anyway" argument.

FWIW, this has been my expectation (100% speculative) for Ion2 for a long time, I wrote this ages ago:
40nm/4Q09
Socket 775+VIA
4xROPs/8xTMUs/32xSPs
64-bit DDR3
20xPCIe Gen2
8xSATA 6G
14xUSB2
3xEHCI
1xGigE
1xPS/2
i.e. in a way it's very low-end (64-bit memory bus, no PATA or PCI, etc.) and in another it's more high-end than today's solutions (8xSATA 6G, 3xEHCI, 32 SPs, etc.) in a weird kind of 'sweetspot' way.

While 64-bit DDR3 might not seem like much, remember DDR3-1600 should be pretty cheap in that timeframe. That's the equivalent of 128-bit DDR2-800... And Ion1-based netbooks won't use more than 64-bit anyway, so it makes sense to design the chip around that. And it does allow for lower die size...
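The DDR3-1600 vs DDR2-800 equivalence is simple to verify; here's a minimal sketch of the peak-bandwidth arithmetic (function name is mine, just for illustration):

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate.
# A quick sanity check of the "64-bit DDR3-1600 == 128-bit DDR2-800" claim above.

def peak_bandwidth_gbps(bus_width_bits: int, transfers_mts: int) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate (MT/s)."""
    return bus_width_bits / 8 * transfers_mts / 1000

ddr3_narrow = peak_bandwidth_gbps(64, 1600)   # speculated Ion2: 64-bit DDR3-1600
ddr2_wide   = peak_bandwidth_gbps(128, 800)   # today's 128-bit DDR2-800

print(ddr3_narrow, ddr2_wide)  # both 12.8 GB/s
```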
 
that 2nd generation ION reference hints at a GT2xx IGP, which I'd expect to be a 40nm chip with one 24SP or 32SP multiprocessor, and DDR3 memory controller. Notice how all the intel brands quoted refer to FSB based processors.

I don't think we'll see any GT2xx derivative from nvidia before.. oh, say, summer 2015. Even the GTX280M is a mobile G92b, and the design of GT200 itself just does not seem to lend itself to being cut down, obviously.

I have no idea what the power usage of the "G105M", nV's newest low-end discrete mobile part, is, but it's probably not good, as there are no official power figures anywhere on their site - quite unlike GT200.
 
I don't think we'll see any GT2xx derivative from nvidia before.. oh, say, summer 2015. Even the GTX280M is a mobile G92b, and the design of GT200 itself just does not seem to lend itself to being cut down, obviously..
If you knew the truth, you'd be laughing pretty hard right now. (And yes, NV's reuse of G92 in 500 billion different sauces is more than a little ridiculous!)
 
isn't nvidia about to launch a line of 40nm GT2xx? :p
That's only a name anyway, they're G80 derivatives (which is not necessarily a bad thing)
 
I really don't understand why NVidia didn't make more of a push with VIA. The only thing I can think of is that OEMs/ODMs said "Atom or nothing" so NVidia had to prioritise.

It's also kinda interesting that Ion 2 will be ready for the end of the year, but since mobile stuff seems to have ~9 month lead/design I guess this will result in Ion2 stuff appearing about a year later than Ion stuff.

As to the question of what use is a netbook:

http://www.xbitlabs.com/misc/poll_display/160.html

clearly the enthusiast audience is a bit divided on this one. I actually sympathise with the view that being able to play games on one (when connected to a full size monitor, say) or decode/output 1080p video could be useful. So I see nothing wrong with Ion as a platform - if you want to spend extra on such a netbook, you should have the choice.

For my own use I do think 1080p is overkill for video - I dislike sitting close enough to a screen in order to be able to perceive 1080p as more detailed than 720p. I like being that close for gaming, but not for watching films. I'm also seriously atypical because I have no interest in any kind of mobile computing.

Jawed
 
I really don't understand why NVidia didn't make more of a push with VIA. The only thing I can think of is that OEMs/ODMs said "Atom or nothing" so NVidia had to prioritise.
MCP7A does not support Nano's bus interface AFAIK. MCP7C nearly certainly did, but it got canned... ;)

It's also kinda interesting that Ion 2 will be ready for the end of the year, but since mobile stuff seems to have ~9 month lead/design I guess this will result in Ion2 stuff appearing about a year later than Ion stuff.
Yup. Intel's '4Q09' date for Pineview will have a similar lead time though.

For my own use I do think 1080p is overkill for video - I dislike sitting close enough to a screen in order to be able to perceive 1080p as more detailed than 720p.
You mean a notebook screen, or anything 1080p in general including HDTVs? :oops: :|

Either way, given what you've just said, I'll be curious to see what you think of ARM netbooks with 720p decode (ala NVIDIA Tegra, Freescale i.MX515, Qualcomm Snapdragon, etc.) - personally I'll certainly grab one if I can get a 7" netbook at $99 as they claim (which is definitely feasible, it's not just marketing BS) but I'd be more interested in the market once the Cortex-A9 is mainstream and Google Docs is supported in Opera Mobile...
 
You mean a notebook screen, or anything 1080p in general including HDTVs? :oops: :|
Yep, I dislike sitting within 3 feet/1m of a 24" monitor and within 6 feet of a 48" screen.

http://www.engadgethd.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/

Though I am continually re-evaluating. I dislike close viewing less than I once did... I had a friend who chose to sit in the front row at the cinema. I joined her once to see a film and was utterly horrified to discover all the holes in the screen - she wasn't short-sighted, so I don't think that was it :LOL: It's similar with plasma screens: if you sit too close, all you see is little squares of light with black borders. It's nasty. LCDs are only marginally better. It seems to me a lot of people mistake this aliasing for "sharpness and detail". ARGH.

Either way, given what you've just said, I'll be curious to see what you think of ARM netbooks with 720p decode (ala NVIDIA Tegra, Freescale i.MX515, Qualcomm Snapdragon, etc.) - personally I'll certainly grab one if I can get a 7" netbook at $99 as they claim (which is definitely feasible, it's not just marketing BS) but I'd be more interested in the market once the Cortex-A9 is mainstream and Google Docs is supported in Opera Mobile...
That's a question of "do I want to fight/use something that isn't Windows" versus the wonderful portability.

I was an ardent Psion 3/3c/3mx user back in the day:

http://www.itreviews.co.uk/hardware/h41.htm

Now that's battery life (and it was x86) :p Back then it did everything I wanted and the bit of data interchange I did with my PC worked fine (i.e. Word, Excel and DBF database files). The increased performance of the 3mx was actually worth it in my view. And I had no trouble touch typing on it...

These newest gadgets are certainly amazing - if I wasn't such a born-again-luddite when it comes to gadgets I'd want one (my mobile phone doesn't even have a calculator function ;)).

Jawed
 
Yep, I dislike sitting within 3 feet/1m of a 24" monitor and within 6 feet of a 48" screen.

http://www.engadgethd.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/
So first, let me say I didn't really believe that chart at all initially. So I looked into it and figured out what principle it is based on; and I must say, while it is not fundamentally flawed, there are several points that make it seem very imprecise to me.

First of all, it assumes 20/20 vision. But a quick Google tells me reality is quite different:
http://www.isd.mel.nist.gov/US&R_Robot_Standards/Visual_Acuity_Standards_1.pdf said:
It turns out that 20/20 is not perfect human vision. Indeed, it is near the average for adults in their 60’s as vision degrades. Good vision in young adults with no visual impairment is generally between 20/16 and 20/12, much better than 20/20. 20/20 vision has come to be interpreted as the limit of “normal” vision with which an individual can cope well enough in school or industry and hence does not require correction. Vision beyond 20/20 is generally improved with corrective lenses.

Secondly, what is being measured in no way tells you everything you could possibly want to know about human vision:
http://en.wikipedia.org/wiki/Visual_acuity said:
20/20 is the visual acuity needed to discriminate two points separated by 1 arc minute—about 1/16 of an inch at 20 feet. This is because a 20/20 letter, E for example, has three limbs and two spaces in between them, giving 5 different detailed areas. The ability to resolve this therefore requires 1/5 of the letter's total arc, which in this case would be 1 minute.

Being able to discern points is not all there is to it when it comes to perceived image quality. My guess is that this is most likely to be determined primarily by horizontal cells - and even in the fovea, things aren't quite as simple as a 1:1 mapping. For example:
http://webvision.med.utah.edu/OPL2.html#horizontal said:
The clusters of terminals contact cones in the same manner as the HI cell terminals and because of their bigger field size contact more cone pedicles (9-12 in foveal retina, 20-25 in peripheral retina).

Finally, consider that video compression will never be perfect, and if a decoder only supports 720p then chances are it doesn't support anywhere near as high a bitrate as most 1080p decoders (Tegra does support 14Mbps in 720p and 20Mbps in 1080p though, FWIW). So in practice, considering only whether you can see every single individual pixel doesn't seem like the right approach to me, unless you played a 1440p 100Mbps clip on a 720p screen or something like that... :)
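Incidentally, those Tegra caps mean the 720p stream actually gets a bigger per-pixel bit budget than the 1080p one. A quick sketch of the arithmetic (the 30fps frame rate is my assumption, not from any spec):

```python
# Per-pixel bit budget: bitrate divided by pixels delivered per second.
# Assumes 30 fps streams - an illustrative assumption, not a Tegra spec.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float = 30.0) -> float:
    return bitrate_mbps * 1e6 / (width * height * fps)

bpp_720p  = bits_per_pixel(14, 1280, 720)    # Tegra's 720p cap: ~0.51 bits/pixel
bpp_1080p = bits_per_pixel(20, 1920, 1080)   # Tegra's 1080p cap: ~0.32 bits/pixel
print(bpp_720p, bpp_1080p)
```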

I'm not saying there are massive benefits to 1080p beyond the screen distances in that chart. But saying you can't even detect 'some of the benefits' when further away than that is completely insane in my mind...

It seems to me a lot of people mistake this aliasing for "sharpness and detail". ARGH.
Hah - people ftw? ;) The myths about image quality can be depressing at times...

I was an ardent Psion 3/3c/3mx user back in the day:
Wow, that's pretty cool!

These newest gadgets are certainly amazing - if I wasn't such a born-again-luddite when it comes to gadgets I'd want one (my mobile phone doesn't even have a calculator function ;)).
Hehe, I can't blame you. I don't think I ever used a 3G phone (i.e. actually used the controls etc.) before I went to MWC last week; I really love my iPod Touch though. In fact, given how subpar most of these phones are, I think I'll stick to that for some more time :p (LG's Arena was nice though, it's funny that the most impressive phone at the show wasn't even a smartphone...)

WRT Netbooks, for my uses I don't think it'd be a big deal that it's not Windows as long as web browsing speed is great, Flash is supported, and Google Docs works. I used a 12" ultraportable for a pretty long time, and honestly I barely ever used it for anything I couldn't do on ARM. Of course, I'm sure for some people it's the exact opposite but then again I truly wonder how many of those ever used ultraportables or netbooks rather than much bigger laptops.
 
So first, let me say I didn't really believe that chart at all initially.
The chart happens to match my experience. My eyesight is certainly nothing special and more than 40 years old - maybe everyone at Engadget has similarly dodgy eyes?

According to this interactive Snellen chart:

http://www.smbs.buffalo.edu/oph/ped/IVAC/IVAC.html

my vision is about 20/20 - it's a bit of a fudge here because I can only get about 9 feet away from the screen and 20/20 is as low as it'll go for my desktop resolution (1600x1200 - the calibration line measures 2.6cm). My eyes are definitely a bit tired today, but I doubt I'd do notably better as it's borderline for me anyway.

This:
SnellenE.png
is an E from a Snellen chart. On my monitor its 5 pixels are 1.25mm high (1200 pixels on my monitor are 300mm high). With 20/20 vision I should be able to read it at 87cm but it's more like 60cm - I think CRT fuzz is having an effect and I wonder if visual acuity changes at closer distances. I'll have to try it on the 24" 1920x1200 monitor at some point, where 20/20 vision means it's distinguishable at about 96cm...
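The ~87cm figure follows from the standard Snellen criterion: at the threshold distance, the whole 5-stroke letter subtends 5 arc minutes. A quick sketch of that arithmetic, using the 1.25mm letter height worked out above (function name is mine):

```python
import math

# 20/20 threshold: the whole Snellen letter (5 strokes) subtends 5 arc minutes,
# i.e. 1 arc minute per stroke.

def snellen_distance_mm(letter_height_mm: float) -> float:
    five_arcmin = math.radians(5 / 60)              # 5 arc minutes in radians
    return letter_height_mm / math.tan(five_arcmin)

# 5 pixels at 0.25 mm/pixel (1200 px over 300 mm) = 1.25 mm tall E
print(round(snellen_distance_mm(1.25) / 10))        # ~86 cm, close to the ~87cm above
```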

I'll be interested to hear how other people get on with that E or the interactive chart page.

I'm not saying there are massive benefits to 1080p beyond the screen distances in that chart. But saying you can't even detect 'some of the benefits' when further away than that is completely insane in my mind...
This

http://broadcastengineering.com/hdtv-displays/

recommends at least 4x picture height, which is 4 feet/1.2m for 1080p on a 24" 1920x1200 monitor. I suppose if a netbook had a 1920-wide display of about 9" diagonal that would imply a 1.5foot/45cm viewing distance (about the distance from my eyes to my lap, if I tilt my head to look down, as it happens). That's about 245 pixels per inch, very much in photographic print territory :p

A Kindle 2 has 167 pixels per inch, for comparison's sake.
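Those pixel-density figures are easy to check from resolution and diagonal alone; a minimal sketch, assuming a 16:9 1920x1080 panel for the hypothetical 9" netbook and the Kindle 2's 6" 800x600 screen:

```python
import math

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 9)))   # ~245 ppi, hypothetical 9" 1080p netbook panel
print(round(ppi(800, 600, 6)))     # ~167 ppi, Kindle 2's 6" 800x600 screen
```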

Hah - people ftw? ;) The myths about image quality can be depressing at times...
This crud is obviously less of a problem with 1920x1080 than 1280x720 panels.

Wow, that's pretty cool!
It's more than a little ironic how popular this kind of form factor has become. The Psion gear of course dates from "before the web" in any meaningful sense. It's also the predecessor of the Symbian OS.

Jawed
 
Yes and No.

The most basic version of Windows 7 aimed at extremely low performance/cost parts (most notably netbooks and OLPC, but also old computers and cheap laptops) only allows 3 applications at a time.

Makers of the more respectably performing netbooks, however, will likely opt for one of the more complete versions, as their systems perform fast enough not to need these limitations. OEMs basically just have more options now - I don't believe there's any contractual requirement from Microsoft forcing all netbook OEMs to use the basic version of Windows 7 or anything of the sort; it's just a version designed for low-end netbooks.
 
But it does give you a pretty good idea of what Pineview is aimed at... And this is how they plan to fight the ARM Netbook & VIA/Ion offensive? These guys are fun... :)
WRT DOS, nothing prevents DOSBox from being ported to ARM, but sadly there's no recent version of it.
 