Next-Gen iPhone & iPhone Nano Speculation

How would such a feature know when not to zoom as you approach the screen to touch it for other functions? I guess a slight delay may work, but I'm not sure if the calibration could be set fine enough for a feature like that to not constantly annoy someone by zooming when not desired.

That's pretty trivial, I think. It would only be active while web browsing, i.e. in the browser. You don't need zoom when you're navigating the icons of the main GUI "desktop". Double tapping to zoom in when web browsing is a PITA given the zoom is preset and not variable. Pinching to variable zoom in/out is a pain too and requires two hands...one to hold the phone and one to pinch. A proximity sensor for fingers would allow zooming and holding of the phone with one hand.
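Something like this could just map hover distance onto a zoom factor. A minimal sketch of the idea in Swift, assuming a hypothetical hover-distance reading (no public iOS API exposes finger distance, so the type and its inputs here are made up for illustration):

```swift
// Hypothetical proximity-zoom mapping -- not a real iOS API.
struct ProximityZoom {
    let maxDistance: Double  // mm at which zooming would begin (assumed)
    let maxZoom: Double      // zoom factor reached just before touch-down

    // Linearly interpolate: finger far away -> 1.0x, at the screen -> maxZoom.
    func zoomFactor(atDistance d: Double) -> Double {
        let clamped = min(max(d, 0), maxDistance)
        let t = 1.0 - clamped / maxDistance
        return 1.0 + t * (maxZoom - 1.0)
    }
}

let zoom = ProximityZoom(maxDistance: 20, maxZoom: 3)
print(zoom.zoomFactor(atDistance: 20))  // 1.0 -- finger far away
print(zoom.zoomFactor(atDistance: 10))  // 2.0 -- halfway in
print(zoom.zoomFactor(atDistance: 0))   // 3.0 -- about to touch
```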
 
You would only be able to implement that if your phone is able to discern your intent. By the time our phones are capable enough of doing that, they won't even resemble the kinds of smartphones we have now. Not to mention that being able to discern your intent would make a manual zoom gesture like that obsolete.

If your phone can't discern your intention, even just between wanting to tap the screen and wanting to zoom in, then it won't work. With Samsung's eye tracking it didn't matter if it didn't work; you'd still be able to use the phone the way you're used to. But if this zooming by proximity doesn't work as intended, people would become so frustrated with the feature they'd throw their phone against the wall if it couldn't be turned off. I mean, just imagine trying to use the browser if it zoomed in half the time you just wanted to tap the screen for whatever reason. It'd be madness.
 
So any GPU related speculation/expectations for iOS 8 tomorrow?

If they DO decide to implement and push PVRTC2 it would be a feather in IMG's cap so to speak. It would also confirm Apple being happy to be entrenched in IMG hardware for the immediate future.

If next-gen hardware has ASTC support, I don't think there will be any mention of it at WWDC. Apple generally don't give anything away in that regard. As I recall, there was no mention of GLES 3.0 support this time last year.
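For what it's worth, the runtime check an app would do is just a string lookup in the GLES extension list. A sketch (the extension names below are the real GLES identifiers for PVRTC2 and ASTC; whether any given device/OS advertises them is exactly the speculation here):

```swift
import OpenGLES

// Requires a current EAGLContext, otherwise glGetString returns nil.
func supportsExtension(_ name: String) -> Bool {
    guard let raw = glGetString(GLenum(GL_EXTENSIONS)) else { return false }
    return String(cString: raw).split(separator: " ").map(String.init).contains(name)
}

let hasPVRTC2 = supportsExtension("GL_IMG_texture_compression_pvrtc2")
let hasASTC   = supportsExtension("GL_KHR_texture_compression_astc_ldr")
print("PVRTC2: \(hasPVRTC2), ASTC: \(hasASTC)")
```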
 
Double tapping to zoom in when web browsing is a PITA given the zoom is preset and not variable.
Never felt that way myself; double tap works quite well IME. I can't imagine waving my thumb at the screen would be much, if any, more precise (also, a thumb is not long enough to reach all parts of even a comparatively small iPhone screen). How good can a capacitive screen's ability to detect stuff at a distance be anyway, what range are we talking about? I know I've accidentally tapped stuff without actually touching the screen, but that was at a very short distance, maybe 1mm from the screen.

Pinching to variable zoom in/out is a pain too and requires two hands...one to hold the phone and one to pinch.
OMG wut? What are you doing with your other hand, driving your car maybe? (If you're touching yourself I don't want to know! :p)
 
One thing Apple might need to address is how apps are installed. If I'm not mistaken, right now apps are downloaded compressed and need to be uncompressed on install. That means an app needs 2x its size in free space to install. As apps get bigger, it becomes problematic on the most common 8-16 GB devices that 1-2 GB apps need 2-4 GB of free space to install. If Apple could have apps run in place while compressed, that would reduce install requirements and expand the usable storage of devices.
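The arithmetic, spelled out as a trivial helper (the 2x factor is this post's assumption about installer behaviour, not anything documented):

```swift
// Back-of-the-envelope for the 2x-space problem described above.
func freeSpaceNeededGB(forCompressedAppGB size: Double) -> Double {
    size * 2  // compressed download + uncompressed copy coexist during install
}

print(freeSpaceNeededGB(forCompressedAppGB: 2))  // 4.0 GB free on a 16 GB device
```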
 
Apparently it's open season for proprietary 3D APIs... even Apple is introducing one :p
 
New Graphics API, Metal, and a new programming language, Swift. Lots of other new APIs. Seems like a big update under the hood of iOS8.
 
New Graphics API, Metal, and a new programming language, Swift. Lots of other new APIs. Seems like a big update under the hood of iOS8.

Yeah, I'm very interested in the new app extension functionality (also the system-wide third-party keyboards... I'd really like to use my favorite IME for typing Chinese on my iPhone).

The new language itself looks interesting, but one major upside of using Objective-C is the ability to integrate C code directly. Of course, if Apple has managed to make code written in Objective-C, C, and Swift link seamlessly (which looks like the case), then it's not a big concern. Developers are likely to use it for UI code at first (if ever), though.
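The C side of that interop already works; a quick sketch calling a libc function straight from Swift:

```swift
import Darwin  // Apple's C standard library, imported directly into Swift

// Swift calling plain C (here libc's sqrt) with no wrapper code;
// Objective-C classes bridge the same way via the generated headers.
let root = sqrt(2.0)  // a C function, called like any Swift function
print(root)           // 1.4142135623730951
```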
 
I wonder if they've already patented something like this?
Someone would have patented it, since it's so obvious from the first time someone used a touchscreen: man, look at all these smudges on the screen, how do I get rid of them?

* "proximity zooming" ie the closer your finger to the screen the higher the zoom factor
No not going to work well for obvious reasons,
YES pinch to zoom
YES also swipe to scroll

Personally I don't need to zoom much on my phone, though perhaps that's because it's 5" 1080x1920, which is a bit larger than 'the perfect size' of 3.5", or the slightly bigger 4".

Excellent, Apple have finally seen the light and are (practically) dropping ObjC; it can't die soon enough. Pity they didn't go for C#. Hopefully this Swift lang is statically typed (I've been doing quite a bit of JS recently :devilish:).
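It is statically typed, with inference; a quick illustration:

```swift
// Swift types are checked at compile time, but usually inferred:
var n = 42            // n is inferred as Int
// n = "hello"        // compile-time error, not a JS-style runtime surprise
let greeting: String = "explicitly typed when you want it"
print(n, greeting)
```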
 
I wonder what the ultimate plan for Metal is? Is it sufficiently abstracted to support AMD, nVidia, and Intel GPUs so it can eventually be brought over to OS X, and if so, I wonder if they'll open it up for standardization with Khronos like they did with OpenCL?
 
I wonder what the ultimate plan for Metal is? Is it sufficiently abstracted to support AMD, nVidia, and Intel GPUs so it can eventually be brought over to OS X, and if so, I wonder if they'll open it up for standardization with Khronos like they did with OpenCL?

The interface is in Objective-C. Stupid.
 
Resolution for the new phone is probably going to increase by 50%. As far as I know (might be wrong), app developers are still exposed to the original 320*480 res on 3-4S and 320*568 on the 5+. The system will then take the source art and render it at the 2x upscaled actual resolution, 640*960 and 640*1136.

So the logical next step is to render at 3x res, which would give you 960*1704: good enough for web and video, still a high enough PPI, and not as demanding or pointless as a 4x jump would be (which would double the 5's resolution in each dimension). The UI would look the same, work the same way, app devs could target all supported platforms, and so on.
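The point-to-pixel arithmetic, spelled out (just an illustrative struct; devs lay out against the point grid and the system renders at scale times that):

```swift
// Points vs. rendered pixels for a given scale factor.
struct ScreenSpec {
    let pointsW: Int, pointsH: Int, scale: Int
    var pixels: (w: Int, h: Int) { (pointsW * scale, pointsH * scale) }
}

let at2x = ScreenSpec(pointsW: 320, pointsH: 568, scale: 2)
let at3x = ScreenSpec(pointsW: 320, pointsH: 568, scale: 3)
print(at2x.pixels)  // (w: 640, h: 1136) -- the iPhone 5 today
print(at3x.pixels)  // (w: 960, h: 1704) -- the speculated 3x step
```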
 
Is it sufficiently abstracted to support AMD, nVidia, and Intel GPUs so it can eventually be brought over to OS X
Since it targets A7 (and up, one would expect... :)), it's probably an iOS-specific PowerVR-only feature. ...I guess. It may be that it's abstracted, and Apple is just arbitrarily drawing the line at A7. Or perhaps prior iPhone GPUs lacked enough hardware features that it wasn't worth the bother targeting them as well.

So, complete guesswork! Take with truckloads of salt... ;)
 
app developers are still exposed to the original 320*480 res on 3-4S and 320*568 on the 5+. The system will then take the source art and render it at the 2x upscaled actual resolution, 640*960 and 640*1136.
Mate, I've developed on iOS; you can supply 640*1136 (or 2048x1536 iPad) etc. sized images, you don't have to rely on pixel doubling. IIRC the Apple docs recommend not to.
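E.g. the standard asset-naming convention lets UIKit pick the right variant automatically (the asset name here is hypothetical):

```swift
import UIKit

// UIImage(named:) loads "icon.png" on 1x screens and "icon@2x.png" on
// Retina automatically, so you ship real high-res art rather than
// relying on pixel doubling.
let icon = UIImage(named: "icon")
```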
 
If the gap in graphics performance between Apple and the competition has narrowed on the hardware side, Apple still continues to push its lead in having the platform make better use of the GPU. The robustness added to SpriteKit and SceneKit, their interoperability with each other, and the powerful profiling capabilities of Xcode should raise the quality of the average casual game or graphics app. And GPU compute, accelerated with little abstraction at that, is finally available now via Metal.
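A hedged sketch of what that low-abstraction compute path looks like: a minimal Metal compute dispatch in Swift (the kernel, data, and collapsed try!/! error handling are made up for illustration, not anyone's production code):

```swift
import Metal

// A trivial kernel that doubles every float in a buffer.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void doubleValues(device float *data [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0f;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "doubleValues")!)

var values: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride,
                               options: [])!

// Encode one threadgroup covering the whole (tiny) buffer and run it.
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: values.count,
                                                            height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()

// Read the doubled values back from the shared buffer.
let result = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
print((0..<values.count).map { result[$0] })  // [2.0, 4.0, 6.0, 8.0]
```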
 
Supposedly Metal was received well by big developers but not smaller ones.

Doesn't seem like 99 cent or freemium games would bother.
 