A laptop I purchased in 2011 (with an i5-2500H or something like that) had a Sandy Bridge GPU and I was soooo hyped, especially about the Miracast feature, since I could use my Miracast-capable TV with my laptop without cables. It turns out that my particular Intel HD 3000 Graphics GPU wasn't Miracast compatible (iirc, some HD 3000s were, and the HD 4000 definitely was).

Intel GPUs have always been capable of playing some oldies, though. I remember playing on the i810 and being fairly satisfied with what it could do. Sandy Bridge was when things started to get interesting.
I suppose I'm most interested in how they behave with SteamVR. NVIDIA has a lot of VR functionality: some games support DLSS, and they have VRSS foveated rendering too, which might become very important. AMD has much less going for it. I don't know if Intel has anything for VR.
Your point makes sense. It was after Sandy Bridge that I could complete The Witcher 1 on my laptop, and I loved the feeling of having a computer I could carry around and still play good games on.
Also, Diablo 3 was playable on the HD 3000. I purchased it day one, and while it rarely hit 720p at 60fps even at the lowest possible details, the feeling of having your humble laptop running an AAA game was kinda special. The less you have, the more you come to recognize the true worth of things. :smile2:
Did you have the Intel 740? I didn't, tbh; it was only when I started buying laptops that I began to use Intel GPUs. Did you really have the Intel 740 from 1998? Isn't that the only dedicated GPU they had made before?
The first laptop I ever got (2005) had some GMA-something GPU, which was REALLY bad for gaming, but well, it worked. I had several laptops with integrated GMA solutions from Intel until 2011, but I didn't complain; that was the period of my life (2005 'til 2015) when I played consoles the most, 'cos my laptops couldn't keep up, and I prefer laptops to desktop computers.