Old 01-Dec-2012, 23:34   #3726
wsippel
Member
 
Join Date: Nov 2006
Posts: 229
Default

Quote:
Originally Posted by BRiT View Post
There's nothing more to it. Too many people drank too much Nintendo Kool-Aid. Nintendo themselves have not mentioned GPGPU as being the savior of the WiiU. It's the NDFers that have been hyping it as some WiiU savior. Once again, GPGPU is no magic bullet. It is no savior.
And once again: Nintendo did mention it (I even posted when and where, so just look it up if you don't believe me). Not as a "savior", because there's no indication that the system needs one to begin with, and there definitely wasn't any such indication at the time they mentioned it.
wsippel is offline   Reply With Quote
Old 02-Dec-2012, 00:23   #3727
Ruskie
Senior Member
 
Join Date: Mar 2010
Posts: 1,291
Default

Wii U was made for quick PS360 ports until Nintendo starts dropping the 1st party games that everybody has actually been waiting for.

There is no hidden "GPGPU" function that developers have yet to discover and use to propel Wii U performance to new heights; it's simple. The Wii U CPU was made to be backwards compatible with the Wii and cheap to develop first; performance came second.

Every 3rd party game that you could consider demanding (BO2, Arkham City, ME3) has performance issues in the more CPU-intensive scenes on Wii U. ACIII has them too, only Digital Foundry hasn't made a comparison yet. The game is capped at 30fps but drops to the mid-20s in even the most basic scenes (the 360 version runs those at an average of 35fps), so I wouldn't expect much more from the Wii U on the 3rd party front. Can't even imagine how badly GTAV would chug on it...
Ruskie is offline   Reply With Quote
Old 02-Dec-2012, 00:44   #3728
Laa-Yosh
member
 
Join Date: Feb 2002
Posts: 8,157
Default

Quote:
Originally Posted by wsippel View Post
Except the whole GPGPU stuff isn't based on rumors, it's a feature highlighted by Nintendo itself, both in public presentations and in the documentation for developers.
It still doesn't change the fact that most of us here would consider the stuff GPGPU is good for to be superficial features - it can't really help you run gameplay-related code or critical rendering-engine work.


Think of it as having the ability to add another 100K particles into your scenes that can't have any effect on gameplay or anything happening - they're basically just dressing. You can design a game from the ground up to have visuals that rely on this feature and produce some nice results - but you can't take an existing game's code and just make some critical elements of it use GPGPU instead of the CPU.
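
To make the "dressing" point a bit more concrete, here's a minimal sketch (written as CUDA purely for illustration - the Wuu would use AMD compute shaders, and every name here is hypothetical) of the kind of self-contained, gameplay-independent work that streams nicely through a GPU:

Code:
// Hypothetical, purely cosmetic particle update: every particle is read,
// advanced and written back with no dependence on game state, so the work
// is embarrassingly parallel and streams well through a GPU.
__global__ void updateParticles(float3* pos, float3* vel, int count, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    float3 v = vel[i];
    v.y -= 9.81f * dt;          // simple gravity, nothing gameplay-critical
    pos[i].x += v.x * dt;
    pos[i].y += v.y * dt;
    pos[i].z += v.z * dt;
    vel[i] = v;
}

// Launched once per frame, e.g.:
//   updateParticles<<<(count + 255) / 256, 256>>>(d_pos, d_vel, count, dt);

Nothing produced here ever has to be read back by the CPU or fed into gameplay logic - which is exactly why it's easy, and exactly why it doesn't rescue CPU-bound code.
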
__________________
My opinions do not represent that of my employer blah blah etc.
Laa-Yosh is offline   Reply With Quote
Old 02-Dec-2012, 00:50   #3729
Laa-Yosh
member
 
Join Date: Feb 2002
Posts: 8,157
Default

Sebbi, your insight into game programming and your willingness to share your knowledge are both exceptional. Just wanted to say that I'm really grateful for your participation in this forum.
__________________
My opinions do not represent that of my employer blah blah etc.
Laa-Yosh is offline   Reply With Quote
Old 02-Dec-2012, 05:57   #3730
Kb-Smoker
Member
 
Join Date: Aug 2005
Posts: 614
Default

Quote:
Originally Posted by BRiT View Post
No, you don't understand the reality of the situation. I suggest you reread the posts of other developers and well-informed posters such as Sebbi, Erp, and Shifty Geezer. GPGPU is no magic bullet. Please stop with the silly dreams; yours will just get crushed, as it will not amount to anything that vastly improves the performance of the WiiU.

[EDIT: Function, I truly hope you were being tongue-in-cheek sarcastic. If so, it was missed the first time I read your post.]
Of course he is joking.
Kb-Smoker is offline   Reply With Quote
Old 02-Dec-2012, 06:59   #3731
TheD
Member
 
Join Date: Nov 2008
Posts: 214
Default

Quote:
Originally Posted by Kb-Smoker View Post
Of course he is joking.
The scary thing is that I have seen people state that and not be joking.
TheD is offline   Reply With Quote
Old 02-Dec-2012, 07:39   #3732
Exophase
Senior Member
 
Join Date: Mar 2010
Location: Cleveland, OH
Posts: 1,884
Default

Quote:
Originally Posted by wsippel View Post
And once again: Nintendo did mention it (I even posted when and where, so just look it up if you don't believe me). Not as a "savior", because there's no indication that the system needs one to begin with, and there definitely wasn't any such indication at the time they mentioned it.
Nintendo's claims that it supports GPGPU don't mean anything. You could probably find some useful non-graphical task that could be performed on the Xbox 360's GPU too, which isn't to say it'd be a good use of resources. I don't know what you're really expecting here... nVidia and AMD's latest GPUs are far ahead of R700 in terms of programmability/GPGPU features. There isn't some special feature that AMD could have added to Wii U's GPU that would have suddenly put it ahead. So regardless of whether or not it's useful and practical for some real-world code that isn't running in the standard rendering pipeline (and I'm sure it is), if physics engine developers feel that the latest crop of PC GPUs are poor fits for their core functionality then I guarantee you the same will hold for Wii U.

Wii U's GPU is going to largely remain a streaming processor. If tasks need heavy caching to work well then they're going to bomb on GPGPU. Having slow main RAM only makes the situation worse, particularly when you don't want to spare any eDRAM for non-GPU tasks (which you normally wouldn't, I'd think, especially if you're texturing as much as you can from there).
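
As a rough illustration of the kind of task I mean (again CUDA-flavoured and entirely hypothetical, not from any real engine), dependent pointer chasing is the classic worst case:

Code:
// Hypothetical worst case: walking linked structures (think AI/script object
// graphs). Each step depends on the previous load, accesses are uncoalesced,
// and without a big cache the wide GPU mostly sits waiting on memory.
struct Node { int value; int next; };   // 'next' is an index, -1 terminates

__global__ void walkChains(const Node* nodes, const int* heads,
                           int* sums, int numChains)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numChains) return;

    int sum = 0;
    for (int n = heads[i]; n != -1; n = nodes[n].next)
        sum += nodes[n].value;          // latency-bound, no data reuse to exploit
    sums[i] = sum;
}

On a fat-cache CPU that's merely mediocre; on a streaming GPU sitting in front of slow main RAM, it's dreadful.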

Quote:
Originally Posted by TheD View Post
The scary thing is that I have seen people state that and not be joking.
function is satirizing these people. He was exaggerating and trying pretty hard to look silly on purpose. If there are people who would phrase it the way he did and actually mean it then there's something pretty wrong with them.
Exophase is online now   Reply With Quote
Old 02-Dec-2012, 10:14   #3733
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 29,048
Default

Quote:
Originally Posted by wsippel View Post
Anyway, the problem I see here is that, as stated several times, Nintendo themselves highlight the feature. Why should they? They usually simply don't talk about tech; this was a very rare exception. Do you think they fell for some AMD snake oil?
Not at all. Wuu's GPU is definitely 100% capable of GPGPU. Only we don't know to what degree, and how that helps in games. Xenos and RSX are also capable of GPGPU work - the origins of GPGPU were in using graphics functions on non-graphics data by formatting the data in a way that matched the graphics functions. It's a very clever, innovative solution to extracting performance from limited hardware. AMD and nVidia and friends (MS, Khronos Group) have taken these ideas and worked towards improving the flexibility of the GPUs to enable a broader range of workloads to be performed, but GPGPU itself isn't a feature that is or isn't present in a GPU. Like DirectX - a GPU isn't DirectX or not; a GPU has a degree of hardware support for various features in DirectX. A GPU isn't a GPGPU or not; it'll support a number of features that aid GPGPU work. We have no details on what Wuu's GPGPU level is, and can only guess from the likely GPU architecture.

Quote:
I tend to believe that hardware manufacturers like AMD or Nvidia usually have to follow PC paradigms, even more so considering neither of the two is the market leader. It makes no sense putting too much time and money into a feature nobody will use, especially not if it requires costly and rarely used additions or compatibility-breaking changes to the hardware. In the embedded space, there's no reason to hold back.
GPGPU is finding its way into supercomputers. It's on all the GPU roadmaps. It's a feature that has the complete investment of the GPU IHVs as a necessary component to remain competitive, even if devs aren't using it particularly well in PCs yet.

Quote:
And I'm afraid a few LinkedIn profiles got changed/erased after one or two got too much attention, so you probably won't find the sources for my claims anymore. But Nintendo has (or had) people working on that stuff. 3rd party middleware optimizations, I mean - I never found a concrete mention of GPGPU either.
They can also buy in people with experience, such as from AMD or nVidia. However, it's a field still in its early days and there's no way Nintendo can be trusted to bring broad expertise that helps other devs refactor their game engines - there's no reason for them to have this superior expertise unless they've heavily invested in research for years prior to Wuu's release. Furthermore, a GPU cannot (yet) replace a CPU in terms of the types of code it can handle or the types of jobs it can do. I'm no expert on game code, so I could entertain the notion of things like physics being executed on Wuu's GPU, but the likes of ERP are telling us otherwise. Nintendo's commentary about GPGPU is thus pretty meaningless. Yes, they did highlight it, which is rare for Nintendo, but then they were facing an uphill struggle dealing with the game media, and throwing them a bone of optimism seems a likely PR move.

If you just follow all the positive PR surrounding Wii and Wii U - all the talk of amazing secret features, special abilities, and Nintendo expertise - every single claim was bunk. Is there really reason to think that this time Nintendo has a technological feature that'll make all the difference, especially in light of the evidence to the contrary? The GPGPU capabilities of Wuu will have some support roles, I'm sure, but it's highly unlikely that the GPU will be doing a lot of the heavy lifting for game code.
__________________
Shifty Geezer
...

Tolerance for internet moronism is exhausted. Anyone talking about people's attitudes in the Console fora, rather than games and technology, will feel my wrath. Read the FAQ to remind yourself how to behave and avoid unsightly incidents.
Shifty Geezer is offline   Reply With Quote
Old 02-Dec-2012, 11:22   #3734
Squilliam
Beyond3d isn't defined yet
 
Join Date: Jan 2008
Location: New Zealand
Posts: 3,146
Default

Would it be safe to say that GPGPU is at least an order of magnitude more difficult than regular programming? Even compared to the Cell, GPGPU is still quite a bit harder if you want to do anything useful/important/critical for the game code, isn't it?
__________________
It all makes sense now: Gay marriage legalized on the same day as marijuana makes perfect biblical sense.
Leviticus 20:13 "A man who lays with another man should be stoned". Our interpretation has been wrong all these years!
Squilliam is offline   Reply With Quote
Old 02-Dec-2012, 12:50   #3735
Gipsel
Senior Member
 
Join Date: Jan 2010
Location: Hamburg, Germany
Posts: 1,447
Default

Quote:
Originally Posted by wsippel View Post
And once again: Nintendo did mention it (I even posted when and where, so just look it up if you don't believe me). Not as a "savior", because there's no indication that the system needs one to begin with, and there definitely wasn't any such indication at the time they mentioned it.
Quote:
Originally Posted by Shifty Geezer View Post
We have no details on what Wuu's GPGPU level is, and can only guess from the likely GPU architecture.
Nintendo did mention compute shader support (I guess wsippel refers to that). But at the most basic level this just means the GPU can execute shader programs outside of the graphics pipeline (so you don't need to set up the complete pipeline with pass-through vertex/geometry shaders and render a quad of the desired size with your shader disguised as a pixel shader to an offscreen target). That basic level got introduced in the R700 generation (plus the LDS) and basically comes for free.

Last edited by Gipsel; 02-Dec-2012 at 13:07.
Gipsel is offline   Reply With Quote
Old 02-Dec-2012, 13:28   #3736
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

I think the 360 has been able to do that since 2005 thanks to MEMEXPORT. Sebbbi talked about using it in Trials Evolution - iirc he said the 360 was beyond even DX 10.1 in that regard. If Nintendo allowed a similar feature through their WiiU API then they would have no reason not to list compute shaders too.

It wouldn't have to mean that Nintendo had rearchitected AMD's graphics chip to be a GPGPU monster though, or that they had solved all the issues around GPU physics, or that it was the reason they chose such a weak CPU, etc., as wsippel seems to be implying.
function is offline   Reply With Quote
Old 02-Dec-2012, 18:04   #3737
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

So, WiiU edram bandwidth.

Looking at this (http://www.eurogamer.net/articles/di...wii-u-face-off):

Quote:
Originally Posted by Digital Foundry
What's interesting about the read-out overall is that similar events can stress all three engines, but it's the extent to which frame-rates are impacted that varies dramatically. The initial scene doesn't look too promising for Wii U: indeed, we see three distinct performance bands - Xbox 360 at the top, PS3 in the middle and the new Nintendo console right at the bottom. It's clear that plenty of characters and full-screen transparencies are particular Achilles Heels for the Wii U, a state of affairs that persists in further clips later on. However, beyond that we see a fairly close match for the PlayStation 3 version in other scenarios and occasionally it even pulls ahead of the Sony platform.
... and looking at what's missing in the trees in Darksiders 2 (which seem to use alpha textures and would fill the screen up close), it seems like the WiiU might have some issues. The GPU clock is 550 MHz and it probably has 8 ROPs, so triangle setup and raw fillrate shouldn't be the issue, but bandwidth might be.

So what kind of bandwidth would we be looking at? I started thinking about the PS3, where RSX had its 256-bit memory bus split into two buses (sort of), with a 128-bit bus connected to GDDR3 and the other half bent around and pointing at the CPU with that FlexIO thing.

So then I thought (perhaps a little naively) "the WiiU has a 64-bit bus going to main memory, what if it had another 64-bit bus (the "other channel") pointing at the edram?" The simplest way might be to run both channels at the same speed and simply address the different banks of memory sequentially. So I wanted to compare data transfer rates, and it seems that the 40nm edram process from NEC can scale up to 800 MHz:

http://www.simmtester.com/PAGE/news/...3424&num=10720

... which is the possible (likely?) clock of the DDR3 the WiiU uses. Would this be possible? Can edram be accessed using DDR data rates/protocols/whatever? Could ~13 GB/s of video memory bandwidth be in the right kind of area for what we're seeing? Seems very low, but, yunno .... Nintendo, and that.
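
To spell out where that ~13 GB/s would come from (purely under my assumption above of one 64-bit channel running at DDR3-1600-class data rates - nothing confirmed):

64 bits x 1600 MT/s = 8 bytes x 1.6 GT/s = 12.8 GB/s per channel

... so two such channels (one to the DDR3, one to the edram) would only add up to ~25.6 GB/s aggregate, i.e. the same ballpark as a desktop APU's dual-channel DDR3-1600.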

MSAA, transparencies and lots of z-tests (no fancy CPU for culling on the WiiU) are the kind of things that should eat up available frame buffer bandwidth. How common are these things on WiiU and what kind of performance do we see where they occur?

Last edited by function; 02-Dec-2012 at 20:06. Reason: Did not mean "DDR data pins". I know that on-chip you wouldn't use a "pin". Don't know why I put that.
function is offline   Reply With Quote
Old 02-Dec-2012, 18:59   #3738
Clockwork
Registered
 
Join Date: Feb 2004
Location: Wisconsin
Posts: 76
Default

Quote:
Originally Posted by function View Post
So then I thought (perhaps a little naively) "the WiiU has a 64-bit bus going to main memory, what if it had another 64-bit bus (the "other channel") pointing at the edram?" The simplest way might be to run both channels at the same speed and simply address the different banks of memory sequentially.

... which is the possible (likely?) clock of the DDR3 the WiiU uses. Would this be possible? Can edram be accessed using DDR data pins? Could ~13 GB/s of video memory bandwidth be in the right kind of area for what we're seeing? Seems very low, but, yunno .... Nintendo
No. Just no. No.
Clockwork is offline   Reply With Quote
Old 02-Dec-2012, 19:14   #3739
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

No to what? Can't access edram using something like a DDR3 bus?

Or no to the bandwidth? Because as bad as that sounds - and it's just a far-out bit of pondering, I'm not pushing it as Truth - an A10 5800K with the same aggregate bandwidth absolutely kicks the WiiU's face off.

Hell, Llano / A8 will kick its face off with less.

And when the PS3 is beating you at transparencies ....

Edit: here you go, performance scaling with bandwidth on Trinity. At DDR3 1600 (and lower) you have something vastly beyond the WiiU, with probably 4X the CPU (more?) and a much beefier GPU.
http://www.tomshardware.com/reviews/...0k,3224-5.html

And the Batman: Arkham City bit is somewhat topical:
http://www.tomshardware.com/reviews/...k,3224-15.html

Last edited by function; 02-Dec-2012 at 19:49.
function is offline   Reply With Quote
Old 02-Dec-2012, 20:04   #3740
Grall
Invisible Member
 
Join Date: Apr 2002
Location: La-la land
Posts: 6,290
Default

Quote:
Originally Posted by function View Post
No to what? Can't access edram using something like a DDR3 bus?
You probably COULD, but it wouldn't make sense as DDR3 and such are designed as off-chip interfaces, to tolerate use with memory board add-in slots (DIMM, SODIMM etc) and so on.

To have eDRAM and NOT have massive on-chip bandwidth would be completely illogical, as the whole - in fact ONLY - point of putting DRAM straight on the chip is to provide large amounts of bandwidth.

To have a (presumably large) chunk of eDRAM with piddly bandwidth would not be a help but rather a hindrance, as instead of a big, expensive, fast pool of memory you'd have a big, expensive, SLOW one. That cost could have been sunk into something else that would have provided a better return on investment.
__________________
"If I were a science teacher and a student said the Universe is 6000 years old, I would mark that answer as wrong (why? Because it is)."
-Phil Plait
Grall is offline   Reply With Quote
Old 02-Dec-2012, 20:14   #3741
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

Quote:
Originally Posted by Grall View Post
You probably COULD, but it wouldn't make sense as DDR3 and such are designed as off-chip interfaces, to tolerate use with memory board add-in slots (DIMM, SODIMM etc) and so on.

To have eDRAM and NOT have massive on-chip bandwidth would be completely illogical, as the whole - in fact ONLY - point to put DRAM straight on the chip is to provide large amounts of bandwidth.
Actually I disagree - I think another, more likely, reason to use edram is to reduce costs. MS did this with the 360 - they clearly wanted to avoid a 256-bit bus and 8 memory chips for the life of the console - but they did so in a way that also gave them performance advantages.

The WiiU is a low-performance device. It performs worse - significantly worse - than AMD SoCs with a 128-bit bus and DDR3-1600.

I don't think the WiiU's edram is about performance - nothing about the WiiU at all says "performance" - I think it's about cost. Cost to manufacture, and cost to research and develop. Nintendo don't want to be left paying more for DDR3 than they have to in 5 years, when the price has quadrupled.

Quote:
To have a (presumably large) chunk of eDRAM with piddly bandwidth would not be a help, but rather a hindrance as instead of a big, expensive fast pool of memory you'd have a big, expensive SLOW memory. That cost could have been sunk into something else that would have provided a better return for investment.
The cost argument says go with edram and a half size bus, IMO.
function is offline   Reply With Quote
Old 02-Dec-2012, 20:22   #3742
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 29,048
Default

Quote:
Originally Posted by function View Post
Actually I disagree - I think another, more likely reason is to use edram is to reduce costs.
Well, it's a cost/performance balance. But if Nintendo are getting only 25 GB/s total BW, why didn't they use a conventional solution? PS3 is available with GDDR3+XDR for well under 200. A single pool of 128 bit GDDR3 would have sufficed using commodity parts.
__________________
Shifty Geezer
...

Tolerance for internet moronism is exhausted. Anyone talking about people's attitudes in the Console fora, rather than games and technology, will feel my wrath. Read the FAQ to remind yourself how to behave and avoid unsightly incidents.
Shifty Geezer is offline   Reply With Quote
Old 02-Dec-2012, 20:30   #3743
Clockwork
Registered
 
Join Date: Feb 2004
Location: Wisconsin
Posts: 76
Default

Function, the situation you have laid out provides no benefit to even implementing a solution that includes eDRAM.

This has been outlined above and why I said no to your original post.
Clockwork is offline   Reply With Quote
Old 02-Dec-2012, 20:49   #3744
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

Quote:
Originally Posted by Shifty Geezer View Post
Well, it's a cost/performance balance. But if Nintendo are getting only 25 GB/s total BW, why didn't they use a conventional solution? PS3 is available with GDDR3+XDR for well under 200. A single pool of 128 bit GDDR3 would have sufficed using commodity parts.
A single pool of GDDR3 is likely to cost a lot more than the DDR3. Five years from now DDR3 will cost several times what it does now (if it follows DDR2 and DDR1 pricing as their volumes dropped), and GDDR3 is likely to cost even more than that. GDDR3 would be a bad choice to put in a console launching now.

And even using DDR3 on a 128-bit bus would force you to use 8 memory chips for the life of the machine (with the x16 parts consoles use, 128 bits means eight devices, and a clamshell arrangement would only give you a 64-bit bus), and would cause additional costs and complications for that little motherboard. It would probably rule out any possibility of a 28nm shrink too, should you ever want that.

On the other hand, 30 mm^2 of silicon on an old process like 40nm is going to be pretty affordable now and only get cheaper and cheaper over the years.
function is offline   Reply With Quote
Old 02-Dec-2012, 20:52   #3745
almighty
Naughty Boy!
 
Join Date: Dec 2006
Posts: 2,469
Default

Quote:
Originally Posted by Shifty Geezer View Post
Well, it's a cost/performance balance. But if Nintendo are getting only 25 GB/s total BW, why didn't they use a conventional solution? PS3 is available with GDDR3+XDR for well under 200. A single pool of 128 bit GDDR3 would have sufficed using commodity parts.
It's even worse when you can find sub-35 graphics cards for PC that have 2GB of GDDR3 and a 128-bit bus...

What were Nintendo thinking....
almighty is offline   Reply With Quote
Old 02-Dec-2012, 21:08   #3746
Rootax
Member
 
Join Date: Jan 2006
Location: France
Posts: 203
Default

Quote:
Originally Posted by almighty View Post
It's even worse when you can find sub-35 graphics cards for PC that have 2GB of GDDR3 and a 128-bit bus...

What were Nintendo thinking....
They must have a reason; they're not killing BW just for fun.
__________________
- I'm french. Sorry if you don't understand what i say -
Rootax is offline   Reply With Quote
Old 02-Dec-2012, 21:16   #3747
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

Quote:
Originally Posted by almighty View Post
It's even worse when you can find sub 35 graphics cards on PC that has 2Gb GDDR3 and an 128bit bus...

What were Nintendo thinking....
They were probably thinking they'd like to be able to sell a WiiU Mini for $99 in five years' time, and make a profit on it.

PC graphics card manufacturers ditch a memory type when it gets slow or expensive; console vendors are stuck with it - like Sony with their XDR, and GDDR3 for that matter.

I'm trying to find more info on wafer costs, but so far even on 28nm a small amount of edram looks good compared to buying obsolete memory and soldering it to your motherboard.
function is offline   Reply With Quote
Old 02-Dec-2012, 21:24   #3748
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 29,048
Default

You have a fair argument there. Really disappointing if true. Wuu may not even get a BW advantage out of an eDRAM design that's more flexible than XB360's. The only areas where it'll compete with the older, cheaper consoles are:

1) more RAM
2) more modern GPU architecture
__________________
Shifty Geezer
...

Tolerance for internet moronism is exhausted. Anyone talking about people's attitudes in the Console fora, rather than games and technology, will feel my wrath. Read the FAQ to remind yourself how to behave and avoid unsightly incidents.
Shifty Geezer is offline   Reply With Quote
Old 02-Dec-2012, 21:52   #3749
function
Senior Member
 
Join Date: Mar 2003
Posts: 2,732
Default

Quote:
Originally Posted by Shifty Geezer View Post
You have a fair argument there. Really disappointing if true. Wuu may not even get a BW advantage out of an eDRAM design that's more flexible than XB360's. The only areas where it'll compete with the older, cheaper consoles are:

1) more RAM
2) more modern GPU architecture
I could be completely wrong of course. But given the performance level of the WiiU and the areas where it seems to struggle on the GPU side (admittedly this is early days) I can't help drawing comparisons to much faster SoCs with their "puny" 128-bit DDR3 memory buses.

There's something I half-remember reading about AMD Phenom memory controllers - ganged vs unganged memory. I think you could set the MC to access DIMMs independently, by each 64-bit channel. Slower for some things, faster for others. Putting the edram on the end of one channel (with lower latencies) and the DDR3 on the other might be a quick and dirty way of getting your APU-level bandwidth but without the same long-term exposure to costs. And like the CPU - which has surprised everyone with its lack of performance and evolution - it might save a lot on R&D time and money.

I've no proof though, beyond the apparent contradiction of on-GPU edram and the possible sub-APU/PS3-level performance in rendering bandwidth-constrained bits of games.
function is offline   Reply With Quote
Old 02-Dec-2012, 22:01   #3750
Dr Evil
Anas platyrhynchos
 
Join Date: Jul 2004
Location: Finland
Posts: 4,692
Default

Regardless of whether this turns out to be true, that was good, well-thought-out speculation. It makes sense given what we have seen so far.
Dr Evil is offline   Reply With Quote
