> Microcenters exist only where there's a large enough population to support them. During the crypto craze they couldn't keep any stock, everything sold immediately locally.

I can confirm.
> For pick up only?

Yea, always pick up on the limited hot deals. That's how they get you in to buy other stuff lol
> A lot of Microcenter stuff is pickup only, especially high value equipment. They don't ship a whole lot of gear.

Too expensive. I bet they would sell really well at $800-$1k
I just checked our local Microcenter and there's like 40 units of 4080 GPUs available... lol no one wants these things.
> Microcenters exist only where there's a large enough population to support them. During the crypto craze they couldn't keep any stock, everything sold immediately locally.

This is not true. There are plenty of cities all around the country with more than enough population to support a Microcenter.
> This is not true. There are plenty of cities all around the country with more than enough population to support a Microcenter.

You're looking at it the wrong way. What Malo said does not conflict with what you said. He did not say they exist at every large city. He said they only exist where the size of the city supports it.
> You're looking at it the wrong way. What Malo said does not conflict with what you said. He did not say they exist at every large city. He said they only exist where the size of the city supports it.

Fair enough. I took it as an implication that Microcenters exist in all the worthwhile places.
Transitive property was not claimed.
A implies B.
B does not imply A.
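The implication point above can be checked mechanically. This is a minimal truth-table sketch (not from the thread, just an illustration of "A implies B does not mean B implies A"):

```python
from itertools import product

def implies(p, q):
    # Material implication: "p implies q" is false only when p is true and q is false.
    return (not p) or q

# Enumerate every truth assignment for A and B and compare both directions.
rows = [(a, b, implies(a, b), implies(b, a)) for a, b in product([False, True], repeat=2)]
for a, b, ab, ba in rows:
    print(f"A={a!s:5} B={b!s:5}  A->B={ab!s:5}  B->A={ba}")

# One assignment satisfies "A implies B" but not "B implies A",
# so the two statements are not equivalent.
counterexample = [(a, b) for a, b, ab, ba in rows if ab and not ba]
print(counterexample)  # [(False, True)]
```

Here A=False, B=True is the counterexample: the forward implication holds vacuously, the converse fails.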
Are we sure the $1200 price for the 4080 is bad, btw? I mean, if a 7900 XTX at $999 is behind in RT and without DLSS 2/3, you can argue $200 is the price to pay for that performance?
It's more like the price for the 7900 XT and XTX is just as bad...
Sure, if you only care about ray tracing performance.
> Raster is very good too...

Sure, but you are paying more to potentially get less rasterisation performance. You could make the same argument against AMD generally for ray tracing, so it does depend on consumer preference. However, last generation Nvidia were able to equal AMD in rasterisation while offering much better ray tracing performance, so in that sense it is a regression.
> so in that sense it is a regression.

Last gen, AMD didn't have a ray tracing competitor, but had a semi rasterization competitor to the 3090 (the 6900 XT was faster at 1440p, but slower at 2160p). This gen, AMD doesn't have a rasterization competitor (to the 4090), nor a ray tracing competitor (to the 4080). The regression is on the AMD side this time, as they are back to the days of Vega vs Pascal (Vega 64 vs 1080 Ti).
> Last gen, AMD didn't have a ray tracing competitor, but had a semi rasterization competitor to the 3090 (the 6900XT was faster at 1440p, but slower at 2160p), this gen, AMD doesn't have a rasterization competitor (to the 4090), nor a ray tracing competitor (to the 4080), the regression is on the AMD side this time, as they are back to the days of Vega vs Pascal. (Vega 64 vs 1080Ti).

Yes, that's right, but we're talking about the value proposition of the 4080, not the 4090.
> Last gen, AMD didn't have a ray tracing competitor, but had a semi rasterization competitor to the 3090 (the 6900XT was faster at 1440p, but slower at 2160p), this gen, AMD doesn't have a rasterization competitor (to the 4090), nor a ray tracing competitor (to the 4080), the regression is on the AMD side this time, as they are back to the days of Vega vs Pascal. (Vega 64 vs 1080Ti).

I don't think anyone is going to be silly enough to compare a halo product to something like that.
> Its more like the price for the 7900XT and XTX is as bad...

A supposed analysis of why these GPUs are relatively expensive was posted by Coreteks. I don't know how accurate it is, but it sounds plausible to a certain degree.
> Are we sure that the 1200$ price for the 4080 is bad btw ? I mean, if a 7900xtx at 999 is behind in RT and without dlss2/3, you can argue 200$ is the price to pay to have theses performances ?

From the 3080 to the 4080 it's 1.5x the raster performance (a bit more with RT), with the price jumping 1.7x two years after the 3080. The 4090 has the excuse of being the absolute fastest you can buy right now, and it makes sense for some professional/CUDA applications too (e.g. video editing/processing/upscaling, ML stuff), so even if it's bad for the "average gamer" it makes sense for some others. The 4080 seems to exist to make people spend more on the 4090, given that the 4090 is about 1.33x faster for 1.33x the price; the 4080 really only makes sense for people who want the fastest Nvidia card but can't quite afford the 4090. I'm not sure I've ever recommended saving up and buying the most expensive option before, but it's true this gen; otherwise get a 3080, or a 3090 if you need 24GB VRAM. As for the 7900 XTX, more and more games have upscaling, so native RT perf is less of an issue even with a big architectural RT gap. There are also Nvidia features people might want, so again: go 4090, or 3080/3090 for better value, and ignore the 4080 and 7900 XT.
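The perf-per-dollar reasoning in the comment above can be sketched with rough numbers. The performance multipliers are the comment's own figures (3080 = 1.0, 4080 ≈ 1.5x, 4090 ≈ 1.33x the 4080); the launch MSRPs are assumptions I've filled in, so treat this as a back-of-envelope sketch:

```python
# Rough value comparison: performance normalized to the RTX 3080 = 1.0.
# Prices are launch MSRPs (assumed, not from the thread).
cards = {
    "RTX 3080": {"price": 699,  "perf": 1.0},
    "RTX 4080": {"price": 1199, "perf": 1.5},         # ~1.5x the 3080 in raster
    "RTX 4090": {"price": 1599, "perf": 1.5 * 1.33},  # ~1.33x the 4080
}

for name, c in cards.items():
    value = c["perf"] / c["price"] * 1000  # normalized performance per $1000
    print(f"{name}: {value:.2f} perf/$1000")
```

On these numbers the 4080 and 4090 land at essentially the same perf per dollar, both well below the 3080, which is the comment's point: the 4080 doesn't undercut the 4090 on value, it just has a lower absolute price.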
> A supposed analysis on why these GPUs are relatively expensive is posted by Coreteks, I don't know how accurate it is, but it sounds plausible to a certain degree.

I know it's not fair, but if I'm not mistaken they thought big RDNA2 would be 2080 Ti level performance, and championed an "RT co-processor" theory for Ampere because of the fan on both sides of the cooler. So thank you for the video, but I'll pass.
> A supposed analysis on why these GPUs are relatively expensive is posted by Coreteks, I don't know how accurate it is, but it sounds plausible to a certain degree.

He doesn't understand that AMD is not using an interposer on RDNA 3, so his "costs" are a bit skewed.
> He doesn't understand that AMD is not using an interposer on RDNA 3,

Are they using an organic substrate then? Or something similar to Fury X/Vega 64?
> Are they using an organic substrate then? Or something similar to FuryX/Vega 64?

I referenced the tech here: