dGPU vs APU spin-off

  • Thread starter Deleted member 13524
The single biggest hurdle for APUs is the lack of cheap, flexible, high-bandwidth memory. In order to have a high-end APU, you need a high-end memory system. That means either GDDR5/5X/6, Wide I/O 2, HBM/HBM2, or HMC.
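As a rough back-of-the-envelope sketch of why the memory system matters, peak theoretical bandwidth is just bus width times transfer rate. The configurations below are nominal, illustrative figures for these memory types, not any specific product:

```python
def peak_bandwidth_gbps(bus_width_bits, transfer_rate_mtps):
    # Peak bandwidth (GB/s) = bus width in bits * transfers per second / 8 bits per byte
    return bus_width_bits * transfer_rate_mtps * 1e6 / 8 / 1e9

# Nominal, illustrative configurations (assumed figures, not measured products):
configs = {
    "Dual-channel DDR4-2400 (desktop APU today)": (128, 2400),
    "256-bit GDDR5 at 7 GT/s (mid-range dGPU)":   (256, 7000),
    "One HBM stack at 1 GT/s (1024-bit)":         (1024, 1000),
}
for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbps(width, rate):.1f} GB/s")
```

Even the GDDR5 row dwarfs what a dual-channel DDR socket can feed an APU, which is exactly the gap described above.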
HBM might be great for an APU, but then again, further down the line HBM will be even more powerful on dGPUs: faster clocks, wider buses, etc. Come to think of it, there is nothing stopping people from putting GDDR5X on APUs now; after all, it is cheaper than HBM and competitive as well. But nobody has done it.

This is even without mentioning any new memory technology that might appear down the line and make HBM obsolete. Then we will have a repeat of the current situation: APUs playing catch-up with dGPUs all over again.
 
Saying that IGPs will kill dGPUs due to performance is like saying that a 25W chip will have similar performance to a 250W chip of the same generation.

The 25W chip doesn't need similar performance to the 250W chip; it merely needs enough performance that the TAM for the 250W chip won't cover its costs. It is basically the same argument for IGP vs GPU: every generation, IGPs suck up more of the TAM, and every generation the GPUs require higher and higher margins to be viable.

As an example, my last personal laptop was a Sony Vaio with both an IGP and a GPU. It needed the GPU to have any chance at gaming. These days, I wouldn't get a laptop with a GPU, since IGPs have progressed to the point that they are more than fast enough for any gaming I would do on a laptop. The same thing happened with my sidebox: I was choosing between configurations with and without a GPU, and ended up going without, because the performance difference was small enough that the GPU-based system just didn't provide enough advantage to offset its drawbacks.

And this is how GPUs die: slowly eaten away until the only viable profit point is 1K GPUs.
 
And this is how GPUs die: slowly eaten away until the only viable profit point is 1K GPUs.

Yup, and we're seeing that right now with Nvidia rapidly escalating prices for discrete gaming GPUs. It wouldn't surprise me if the 1080 Ti launches at 899-999 USD, considering that the Titan X moved from 1000 USD to 1200 USD.

The GTX 970 launched at 329 USD, while the GTX 1070 has yet to dip below 399 USD, with the vast majority of cards well over 400 USD. The GTX 980 was 549 USD versus the GTX 1080 at 619 USD and up.

HBM might be great for an APU, but then again, further down the line HBM will be even more powerful on dGPUs: faster clocks, wider buses, etc. Come to think of it, there is nothing stopping people from putting GDDR5X on APUs now; after all, it is cheaper than HBM and competitive as well. But nobody has done it.

There is; it's called the PS4, albeit with GDDR5 and not GDDR5X. And we don't know what Project Scorpio is going to end up using. That last will be interesting to see as Microsoft moves the Xbox more and more into being just another Windows box with a different GUI.

As for the consumer PC market, it can be done, but no OEM appears to want to go that route as the vast majority of their sales are systems with integrated GPUs (hence why Intel has such a dominant share of total GPU shipments). And for consumers that want more GPU power it's easier just to take their existing system boards and throw on a discrete GPU.

Discrete GPU sales are shrinking despite small growth in enthusiast and performance level discrete GPUs.

Basically, each year as integrated GPUs get more and more capable, fewer and fewer people feel the need to buy a discrete GPU.

Hence why we see Nvidia increasing the price of their discrete GPUs by such a large degree. It not only offsets the diminishing sales of discrete GPUs but boosts their GPU margins quite significantly. It's entirely possible that within a few years their margins will be higher than Intel's while they continue to sell fewer GPUs each year.

Regards,
SB
 
There is; it's called the PS4, albeit with GDDR5 and not GDDR5X. And we don't know what Project Scorpio is going to end up using. That last will be interesting to see as Microsoft moves the Xbox more and more into being just another Windows box with a different GUI.
The PS4 is not a desktop APU. Consoles are not desktop APUs; they are machines tailored specifically for gaming, with large amounts of high-speed memory, custom point-to-point connections, and a different memory hierarchy. Desktop APUs are completely different at the moment: most have poor system/video memory capacity for cost reasons, and they are coupled with low-performing CPUs to keep power consumption adequate, because nobody in their right mind would couple two power-hungry dies in the same package. Not to mention the far lower bandwidth (which will remain true even with HBM).

And for consumers that want more GPU power it's easier just to take their existing system boards and throw on a discrete GPU.
Exactly.

Basically, each year as integrated GPUs get more and more capable, fewer and fewer people feel the need to buy a discrete GPU.
Except they aren't becoming capable at all: they can't run contemporary games adequately at good resolutions, frame rates, and detail levels. With the crazy race toward denser resolutions, VR, and ever-increasing graphical detail and realism, cheap integrated solutions will remain a no-go for the majority of graphics seekers, and there are a lot of them nowadays.

People DO want graphics and fidelity; that's the only reason so many switched to the PS4 and XBO at the start of the generation, and the reason so many preferred the PS4 over the XBO. It's also the reason PC gaming is booming at the moment, with most developers under the sun porting console exclusives to PC and bragging about their higher resolutions and frame rates. Heck, that's why we are seeing an upgrade path for consoles for the first time in years, with Sony and Microsoft confident there is a BIG market for more graphically capable hardware.
Hence why we see Nvidia increasing the price of their discrete GPUs by such a large degree.
No, that's lack of competition.
 
nobody in their right mind would couple two power-hungry dies in the same package.
Why not? It's far easier to cool and more compact if you do it properly. The real catch is that you need to build a case around the concept, so the form factor changes. With all the heat-producing components co-planar and centralized, it's easy to slap a giant block of copper on top with a ridiculously large fan. Take Project Quantum, for example: an Intel CPU and a pair of Fijis (17 TFLOPs) with a 180mm fan atop it. That wouldn't be nearly as easy to cool as a giant APU, but it's a fair approximation of the capability.

Except they aren't becoming capable at all: they can't run contemporary games adequately at good resolutions, frame rates, and detail levels. With the crazy race toward denser resolutions, VR, and ever-increasing graphical detail and realism, cheap integrated solutions will remain a no-go for the majority of graphics seekers, and there are a lot of them nowadays.
How many 300W+ APUs have you seen tested? Of course a 35W APU won't compare to a 100W CPU + 200W GPU. It's only thanks to stacked memory that we may start to see SoCs with ample bandwidth; before, it would have been impractical to wire both DDR and GDDR to the socket, for the same reason interposers are required to make HBM work. Case in point: the HPC Zen part, which allegedly will have a full Vega on it, although focused on the server market initially. I wouldn't expect moving that Vega to its own discrete card to improve the situation any.
 
How many 300w+ APUs have you seen tested?
How many have YOU tested? I suggest that before people start talking about super mega APUs or APUs with HBM, they wait and see if the implementation actually takes off, or at least have the courtesy to compare them to actual dGPUs with HBM before making any assumptions.

Why not? It's far easier to cool and more compact if you do it properly.
So it's easier to cool a Core i5-class CPU + a GTX 980-class GPU stuck together on a single die than to cool them separately?
 
How many have YOU tested?
Since there aren't currently any available, none. We're only now getting to the point where it's possible.

So it's easier to cool a Core i5-class CPU + a GTX 980-class GPU stuck together on a single die than to cool them separately?
Yes. The big difference being the changes to case design and likely not having to bother with internal airflow. Effectively you have an open-air case, not to mention the ability to cool many other components with the same apparatus.
 
Except they aren't becoming capable at all: they can't run contemporary games adequately at good resolutions, frame rates, and detail levels. With the crazy race toward denser resolutions, VR, and ever-increasing graphical detail and realism, cheap integrated solutions will remain a no-go for the majority of graphics seekers, and there are a lot of them nowadays.

That's only true if you ignore the existence of both the XBO and PS4 as well as ignore everyone who doesn't feel the need for a beefy dGPU in order to have a good gaming experience. The most played non-casual game on the planet (League of Legends) does just fine on modern integrated graphics for a large chunk of its player base, for example.

Both the XBO and PS4 are basically just PC APUs with a different OS. If either were hackable, it wouldn't be too surprising to see it hacked to run Linux or possibly even Windows. Heck, the XBO is basically just running a variant of Windows. And it is getting closer and closer to being just a PC with a different UI now that it is open to UWP programs. If Microsoft were so inclined, they could turn the XBO into a Windows PC with relatively minimal effort and sell it as a Windows PC.

It doesn't matter that you like to discount those; the reality is that they are APUs that are more than capable of playing games at 1080p with graphics that most people find quite good. Relatively soon that will potentially be ramped up to 2160p games. And in 2017, we'll see the release of an APU with the power of a 980/980 Ti.

The market for enthusiast level discrete GPUs is too small to make a dedicated APU for the PC ecosystem. So it's highly unlikely we'll ever see a 300+ watt PC APU featuring an enthusiast level GPU paired with an enthusiast level CPU on one package.

The question is: is the market large enough to support an APU with an RX 480/GTX 1060-class GPU? That will determine whether something like that is introduced into the general PC ecosystem or stays limited to the console ecosystem, though the console ecosystem will soon have an APU that surpasses both of those.

Regards,
SB
 
The market for enthusiast level discrete GPUs is too small to make a dedicated APU for the PC ecosystem. So it's highly unlikely we'll ever see a 300+ watt PC APU featuring an enthusiast level GPU paired with an enthusiast level CPU on one package.
Small, maybe, but just how much work would it really involve if the chips were reused? A custom interposer would seemingly be the only new component required. In the server market, where the GPU has full access to system memory, I could definitely see it being worthwhile for some tasks. Selling it into the consumer market, while expensive, could be a bonus. It would seem a more elegant option than the P100 with NVLink on IBM platforms.

The question is, is the market large enough to support an APU with a Rx 480/GTX 1060 class GPU? That will determine whether something like that is introduced into the general PC ecosystem, or if it'll be limited to the console ecosystem.
Alongside a push toward SFF cases, I could definitely see it. Packed into a system on par with a Chromebox, it could be a fun little system.
 
It doesn't matter that you like to discount those; the reality is that they are APUs that are more than capable of playing games at 1080p with graphics that most people find quite good. Relatively soon that will potentially be ramped up to 2160p games. And in 2017, we'll see the release of an APU with the power of a 980/980 Ti.
No, they are not. They are not desktop APUs: their architecture and design are different, they are not replaceable or upgradeable like desktop parts, and they don't have desktop-style memory configurations.

That's only true if you ignore the existence of both the XBO and PS4 as well as ignore everyone who doesn't feel the need for a beefy dGPU in order to have a good gaming experience.
I don't ignore them; these people exist and in large numbers, just like the people who feel the need to buy dGPUs and enjoy their games at decent frame rates and resolutions. You know, the people that make PC gaming prosper: the people that buy AAA games that require beefy hardware, the people that buy Battlefield, COD, GTA, Far Cry, Assassin's Creed; the streamers, YouTubers, and competitive players; the people that fund expensive projects like Star Citizen; the people that have 1440p and 4K monitors; the people that want VR; the people that hunt for immersive graphical realism. These people will ensure dGPUs stay. They are there, if you don't ignore them as well.

And relatively soon that will potentially be ramped up to 2160p games. And in 2017, we'll see the release of an APU with the power of a 980/980 Ti.
By then the 980 Ti will have been relegated to the mid-range (heck, with the Pascal Titan and the 1070 it's already mid-range)! A year later, dGPUs will probably have surpassed that mark by a massive margin.

The market for enthusiast level discrete GPUs is too small to make a dedicated APU for the PC ecosystem. So it's highly unlikely we'll ever see a 300+ watt PC APU featuring an enthusiast level GPU paired with an enthusiast level CPU on one package.
Exactly, hence why dGPUs are here to stay. Their market may be small in the sense that, compared to the cheap iGPUs most people buy to watch movies, browse, and do some word processing, it is small. But measured against the gaming crowd (which is growing in numbers), it is not small at all. And that distinction is why dGPUs will continue to be needed to serve that crowd.

Yes. The big difference being the changes to case design and likely not having to bother with internal airflow. Effectively you have an open-air case, not to mention the ability to cool many other components with the same apparatus.
I'm sorry, but I don't see that happening at all, for a simple reason: that kind of cooling solution would be expensive and complicated. Redesigning cases would also be a cumbersome effort.
 
No, they are not. They are not desktop APUs: their architecture and design are different, they are not replaceable or upgradeable like desktop parts, and they don't have desktop-style memory configurations.

Wow, you are wobbling all over the place. Like to move the goalposts much? Just recently you asked why there weren't any APUs using anything other than DDR3. So I pointed out an APU using something other than DDR3. And now you are discounting these APUs because one of them doesn't use DDR3? BTW, the XBO uses DDR3, which is the same memory technology as desktop CPUs.

And their architecture is different? That's quite laughable. They use desktop/laptop CPU cores and desktop/laptop GPU cores, just like AMD's APUs. There are some vendor-specific customizations, but nothing that wouldn't also work in the PC space. Which is proven by the XBO running on the Windows kernel, as I've stated multiple times now. It also runs Windows programs through UWP, the same ones you can run on your desktop.

The XBO at this point is basically just a Windows PC with a walled garden, similar to what Windows RT was, except without the limitations of an ARM CPU. I.e., if Microsoft wanted, the XBO could run standard Win32 applications and not just UWP applications.

Exactly, hence why dGPUs are here to stay. Their market may be small in the sense that, compared to the cheap iGPUs most people buy to watch movies, browse, and do some word processing, it is small. But measured against the gaming crowd (which is growing in numbers), it is not small at all. And that distinction is why dGPUs will continue to be needed to serve that crowd.

No one is arguing that dGPUs are going to disappear. But dGPU sales are falling every year because, for most people, iGPUs are more than good enough, even for gaming. MOBAs (the single largest non-casual gaming genre on PC) and more casual games (match-3 type games, the largest genre overall) don't need dGPUs. And for more demanding games, consoles offer an alternative that is just as good at everything except enthusiast-level GPUs. And they are using PC-class x86 APUs, unlike past consoles; even the original Xbox used a discrete CPU and a discrete GPU.

In other words, the market for dGPUs is going to continue to shrink as iGPU becomes more and more capable. The result will be a smaller dGPU market with far higher prices in order to sustain it.

Regards,
SB
 
There are so many ways this could go. There are certainly plausible scenarios where dGPUs are killed off entirely in the relatively near term, and at least as plausible scenarios where they retain a sizable chunk of the market. Or things could stabilize somewhere in the middle, where dGPUs become niche devices that are more supported by R&D for HPC and mobile than the other way around.

It's possible for sockets to open up and start officially supporting much higher power configurations. HBM would remove the cost overhead of supporting a large pool of shared high-speed memory on the motherboard. 3D-stacked GPU dies could be used to avoid large-die yield issues (I assume we'll still count this as an APU?). If Intel offered something like this at 300W they could, assuming they had a competitive GPU uarch, overtake the dGPU high end already. And if they don't have a competitive uarch, they could potentially license one from Nvidia, even allowing Nvidia to design a GPU die on their process separately. This isn't too far-fetched a scenario; while Intel hasn't given high-end GPUs much thought in a long time, they do seem to be looking for as many ways as possible to increase their fab utilization and enter or expand into new markets as PC sales decline. AMD could offer such a high-power APU as well, but they probably wouldn't be able to take down the dGPU market by themselves.

There's also the possibility that the difference between 50W and 200W GPUs becomes too small to really drive a market. With every console generation we've seen (very) roughly similar order of magnitude shifts in execution resources, bandwidth etc but the increase in subjective appeal (at least from my point of view?) hasn't been proportional. That is, the leap from PS1 to PS2 was far more impressive to me than the leap from PS3 to PS4. The consoles are also somewhat of a limiting baseline factor for a lot of titles and give APUs time to catch up until the next big generational shift. Although consoles are moving towards an incremental release strategy, games will still tend to target a nominal platform that's a few years old. Better hardware can have some more effects and be rendered at higher resolutions, but this has diminishing returns, especially if the display resolution becomes limiting.

On the other hand, Intel could remain conservative in its efforts to bring high bandwidth GPU memory options to the desktop, and that alone could keep the dGPU market chugging at some capacity. And VR or other new GPU venues could drive demands to even much higher power dGPU solutions than we're currently seeing. We could hit a big wall manufacturing-wise that keeps the perception gap from closing as much as it could. Who knows.
 
The PS4 is not a desktop APU. Consoles are not desktop APUs; they are machines tailored specifically for gaming, with large amounts of high-speed memory, custom point-to-point connections, and a different memory hierarchy. Desktop APUs are completely different at the moment: most have poor system/video memory capacity for cost reasons, and they are coupled with low-performing CPUs to keep power consumption adequate, because nobody in their right mind would couple two power-hungry dies in the same package. Not to mention the far lower bandwidth (which will remain true even with HBM).

HBM provides more benefit for IGP than GPU. IGPs are significantly more bandwidth constrained than GPUs are. Even moderate HBM solutions for GPUs provide more bandwidth than they need. A single HBM stack would increase the performance of an IGP solution significantly.
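To put rough numbers on that claim (all figures below are nominal assumptions for this sketch, not measured specs), compare bytes of memory bandwidth available per FLOP:

```python
# Bytes of bandwidth per FLOP, using nominal, illustrative numbers:
# (bandwidth in GB/s, compute in GFLOPs)
systems = {
    "Desktop APU, dual-channel DDR3-2133": (34.1, 900),
    "GTX 980-class dGPU":                  (224.0, 4600),
    "Same APU with one HBM stack":         (128.0, 900),
}
ratios = {name: bw / gflops for name, (bw, gflops) in systems.items()}
for name, r in ratios.items():
    print(f"{name}: {r:.3f} B/FLOP")
```

On these nominal figures, the IGP starts out far more bandwidth-starved than the dGPU, and a single HBM stack more than closes the gap, which is the point being made.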


Except they don't become capable at all, they can't run contemporary games adequately with good resolution, fps and details, and with the crazy race for denser resolutions, VR, and the hunt for ever increasing graphics details and realism, cheap integrated solution will remain a no go for for the majority of graphics seekers, and they are a lot of people nowadays.

The vast majority of games these days can be played on IGPs. The volume of GPUs also keeps shrinking, which runs counter to your "they are a lot of people nowadays". And VR? VR is dead, it just doesn't know it yet. It is a fad, like mocap cameras and motion-sensitive controllers are/were. 'Member when those were all the rage and going to take over the world?
 
Both the XBO and PS4 are basically just PC APUs with a different OS. If either were hackable, it wouldn't be too surprising to see it hacked to run Linux or possibly even Windows. Heck, the XBO is basically just running a variant of Windows. And it is getting closer and closer to being just a PC with a different UI now that it is open to UWP programs. If Microsoft were so inclined, they could turn the XBO into a Windows PC with relatively minimal effort and sell it as a Windows PC.

There is nothing "basically" about it. The XBO IS running Windows. With the next major update, IIRC, the XBO will be running Windows 10. This was explicitly part of the development work that went into Windows 10: enabling the XBO and PC Windows to run the exact same updates and code base. The only difference is that the XBO gets the XBO UI stack and PC Windows gets the PC UI stack plus additional drivers.

Basically, the only thing likely preventing either AMD or MS from releasing an XBO PC is licensing.
 
And now you are discounting these APUs because one of them doesn't use DDR3? BTW - the XBO uses DDR3 which is the same memory technology as desktop CPUs.
Again, consoles are not desktop APUs. I was very specific in my phrasing: desktop APUs have yet to use any memory other than DDR3. Don't try to force a contradiction that exists only in your mind.

And their architecture is different? That's quite laughable. They are using desktop/laptop CPU cores and desktop/laptop GPU cores
You are dancing around the fire here. They are not upgradeable/replaceable like desktop APUs; they don't have two different memory pools like desktop APUs; they have custom hardware and point-to-point connections/mainboards unlike desktop APUs; and they have much wider system buses and more bandwidth than desktop APUs. Hence they are not desktop APUs; they are another category altogether, no matter how hard you try to morph them into one.


the market for dGPUs is going to continue to shrink as iGPU becomes more and more capable
I guess time will tell.


HBM provides more benefit for IGP than GPU.
It is also extremely beneficial to large GPUs with massive numbers of cores; it's relative. Nonetheless, I would love to see that claim vindicated. Cramming massive bandwidth into some cut-down APU cores will only get you so far. Your claim will be highly implementation-specific; you would need the right amounts of CPU and GPU power for it to work. We'll see how it goes.


The vast majority of games these days can be played on IGPs.
Nope. Not with a decent experience at all. A lot of people upgrade their GPUs just to play FIFA and PES, because iGPUs suck at those games!

And VR? VR is dead, it just doesn't know it yet
And 4K is dead too? And any other visual technology that requires massive processing power is going to be dead before it knows it too, right? Basically, we will be living under a rock, enjoying what little visual computing power we have with those tiny APUs. What a bright future!
 
I'll take advantage of this thread to bring up a pet issue. I'm disappointed that the RX 460 and presumably the RX 450 don't have VGA output. But anyhow, AM4 motherboards have VGA, so that's what I will likely consider in an undefined future.
How come a cheap-ass motherboard can have a converter chip and a connector, but a semi-expensive graphics card won't? (Taking the view that an RX 460 4GB is not cheap for everyone.)

Anyway, a fast GPU is often not that useful on its own. Even a Core 2 Quad is too slow for games, so to play overengineered games, an APU that's fast on the CPU side along with lots of RAM will be helpful.
 
I've bought a cheap HDMI-to-VGA converter (about $5). It works great. The reason I bought it is that I use mirrored displays on my PC (mirrored to my TV), and using VGA + DVI/HDMI introduces tearing on one of the displays (basically display no. 2). By using DVI plus HDMI-to-VGA, the tearing is gone. Of course, I wouldn't have to go through this much trouble if my PC monitor supported 1080p 4:4:4 through HDMI. It does support it through VGA.
 
Even a Core 2 Quad is too slow for games
No it's not; name a game that it's too slow for.
The only game I've had a problem with is No Man's Sky, and that's because version 1.0 required SSE 4.1; that requirement was patched out.
Mafia 3 also requires SSE 4.1, but if they removed the requirement I bet it would run fine.
 
Yup, and we're seeing that right now with Nvidia rapidly escalating the price for discrete gaming GPUs. It wouldn't surprise me if the 1080 Ti launches at 899-999 USD considering that Titan X moved from 1000 USD to 1200 USD.

GTX 970 at 329 USD while the GTX 1070 has yet to dip below 399 USD with the vast majority of cards being well over 400 USD. GTX 980 at 549 USD versus GTX 1080 at 619 USD and up.
......
Regards,
SB
I think it is a bit early to tell whether the trend of price increases over the previous generation is down to APUs killing off the lower-end GPUs.
The main reason is that Maxwell 2 was still somewhat balanced out by the previous year's 2xx Hawaii GPUs, which had really good performance, albeit at much higher temperatures and power draw.
This time around there is nothing to balance out the 1070 and higher cards from Nvidia; no performance competition means higher prices, and that needs to be factored in, along with the annoying FE pricing generally sitting higher than the cheapest custom models.

And I can see discrete GPUs still having a place even in a much lower price/performance bracket, as they are reaching the point where one can have great performance at much lower power draw/TDP, making them even more feasible for small/compact builds and even laptops.
The trend did seem to be that laptop dGPUs were becoming primarily enthusiast-class, high-margin parts, but it will be interesting to see whether the smaller Polaris/Pascal chips change this decline.

IMO there is still a complementary role for processor + discrete GPU for now, even in the mainstream, but logically this should decline as APUs increase further in performance (although this then needs to be balanced against future games and certain software becoming ever more demanding).
Cheers
 
Rumored APU for socket AM4 carrying a 4-core Zen, a 16CU Vega iGPU and HBM2:

http://www.bitsandchips.it/52-english-news/7622-rumor-two-versions-of-raven-ridge-under-development

Wow, I had completely forgotten this thread until I saw this news, but I'm glad it took a more positive spin in general.

I'd just like to point out a few things (won't do selective quoting because it would take years to respond to everything, sorry):

1 - My original suggestion was that dGPUs would be gone from the consumer market within 10 years. 10 years is really long in tech: 10 years ago we were walking around with very different mobile phones, and most gamers were willing to play Wii Sports.
So don't worry guys, you'll still have dGPUs for a looong time. And it's simply a thought/opinion, don't fret over it.



2 - Of course the PS4Bone's SoCs apply to the equation, because they're SoCs and they're even using x64 CPUs. Unlike previous generations of consoles, these are basically PCs, and proof of that is how some people got the PS4 to run x64 Linux on it.
The only reasons AMD hasn't put a socketable SoC with that kind of performance in the PC market yet are:
a) Sufficient memory bandwidth wasn't attainable with the PC's DDR3 or even DDR4. Many factors point to Kaveri having been meant to get GDDR5L, but that was cancelled at the last minute (the chip was actually produced with two memory controllers but uses only one). The XBone uses some of the fastest DDR3 on a rather expensive 256-bit bus, and it still needs ESRAM to compensate.
This can now be solved by using HBM in the same interposer as the APU, and all forms of very wide and stacked memory will keep evolving.
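The bandwidth gap being described is easy to check arithmetically; the figures below are the nominal specs for both consoles, plugged into the same bus-width-times-transfer-rate formula:

```python
def bw_gbps(bus_bits, mtps):
    # bits per transfer * transfers per second -> bytes per second -> GB/s
    return bus_bits * mtps * 1e6 / 8 / 1e9

xbo_ddr3 = bw_gbps(256, 2133)   # Xbox One main memory: 256-bit DDR3-2133
ps4_gddr5 = bw_gbps(256, 5500)  # PS4: 256-bit GDDR5 at 5.5 GT/s
print(f"XBO DDR3 : {xbo_ddr3:.1f} GB/s")   # the gap is why XBO adds embedded RAM
print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s")  # one unified high-speed pool
```

Same bus width, but the GDDR5 pool delivers roughly 2.5x the bandwidth, which is the shortfall the embedded RAM has to compensate for.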

b) A socket that supports APUs with local memory (HBM) would have to come up, and it wouldn't make much sense to introduce a new socket between AM3+/FM3 and AM4 considering Zen is right at the door.



3 - Claiming the consoles have "weak CPUs" so their SoCs don't apply to the PC equation: I don't think this is valid, because consoles are made with balance in mind. If the 1.6GHz Jaguars really were such a terrible bottleneck in most instances, then both Sony and Microsoft would have gone with smaller, less power-hungry GPUs and e.g. 3GHz Bulldozers. Or, if they didn't like any of those alternatives, they could have gone with a PowerPC or even paid top dollar for a Haswell-based Pentium.
We should take into consideration that consoles were designed by a large group of very smart people that tried to achieve the most balanced system possible within power, price and performance constraints.
If 8x 1.6GHz Jaguars were paired with 1.3-1.8 TFLOPs general-compute-capable GPUs running games on low-overhead, closer-to-metal APIs, then perhaps that's not far from what we'd be using if our PC games had been developed for low-overhead, closer-to-metal APIs. And PC games will start to be made for DX12 and/or Vulkan in the future.
 