The data doesn't stay in the eSRAM. It has to eventually go into the DDR3 pool and that is where the bottleneck can occur.
That simply isn't true. The eSRAM works similarly to the eDRAM of the X360, except eSRAM is faster and has a few different features. The eSRAM is connected to the GPU, to increase bandwidth for the GPU. The GPU communicates with RAM, CPU, and the eSRAM. The eSRAM mainly communicates with the GPU. Making it somehow pass through the normal RAM completely removes its purpose. Might as well not be there.
Also, if Microsoft chose to solve the heat issue by down-clocking the GPU because of bad yields (caused by the eSRAM), this would lower the eSRAM's bandwidth to the point that it counteracts adding it in the first place.
Not really. It'll lower its effectiveness, definitely, but it's still better to have it than not. If its bandwidth is reduced from 100GB/s to 50GB/s (random numbers), that's still an additional 50GB/s on top of the DDR3 bandwidth.
Also, the eSRAM is only 32MB in size, which is questionable for holding HD frames (especially 1080p) with anything intensive going on.
Again, it works similarly to the eDRAM of the X360. It's not there for size, it's there for bandwidth. It mainly works as a frame buffer, freeing up the bandwidth of the main RAM for other things. A 32-bit 1080p frame is typically 8.3MB. With 32MB you can fit three 32-bit 1080p frames in there. It's almost four actually, only around 1MB short, depending on whether that 32MB is binary or decimal megabytes. There's more than enough room for a stencil buffer, color buffer and depth buffer at the same time. And sometimes these are even reduced to 24-bit (particularly the stencil buffer), allowing for more freedom inside the eSRAM. A 24-bit 1080p frame is 6.2MB.
There's a reason why 10MB still helped on the X360. Although it wasn't large enough for 'free' 4x MSAA, it still added flexibility. A 32-bit 720p frame is typically 3.7MB. When developers hit the 10MB limit, they resorted to tiling to keep it useful. The same thing will probably happen with the XBO and its eSRAM.
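To make the arithmetic above easy to check, here's a minimal back-of-envelope sketch (my own illustration, not anything official from MS) that computes uncompressed frame sizes and how many fit in a 32MB eSRAM or a 10MB eDRAM:

```cpp
#include <cstdio>

// Back-of-envelope frame buffer sizes, using decimal megabytes (1 MB = 1,000,000 bytes).
// Same numbers as quoted above; the real console memory layout is of course more involved.
int main() {
    struct Frame { const char* name; int w, h, bitsPerPixel; };
    const Frame frames[] = {
        {"1080p 32-bit", 1920, 1080, 32},
        {"1080p 24-bit", 1920, 1080, 24},
        {"720p  32-bit", 1280,  720, 32},
    };
    const double esramMB = 32.0;  // XBO eSRAM
    const double edramMB = 10.0;  // X360 eDRAM

    for (const Frame& f : frames) {
        double mb = (double)f.w * f.h * (f.bitsPerPixel / 8) / 1e6;
        printf("%s: %.1f MB -> %.1f fit in 32MB eSRAM, %.1f fit in 10MB eDRAM\n",
               f.name, mb, esramMB / mb, edramMB / mb);
    }
    // Prints roughly 8.3 MB, 6.2 MB and 3.7 MB respectively, matching the figures above.
    return 0;
}
```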
The latency disparity between GDDR5 and DDR3 RAM is small
Not really. GDDR5 latency is actually extremely high compared to DDR3. Basically, if the DDR3 latency is 20ns, it's 200ns for GDDR5. Doesn't seem like much, but your CPU thinks it's a lot.
and as far as graphical rendering is concerned, bandwidth > latency. Latency is more important for general-purpose computing, which is why the "PC" market still uses DDR3 to run the Wintel (Windows/Intel) environment. The PS4 is a gaming console, not a general-purpose PC.
I said the same thing, but I also said that even though there aren't that many, there are still games that are more CPU intensive than GPU intensive. Let me give you a few links so you can verify it yourself. Note that these compare either different CPU speeds or different CPUs altogether, not latency. However, added latency is equivalent to reducing the clock speed, since latency stalls CPUs like crazy, reducing the number of clock cycles that are actually doing something. In a few of these examples you can literally get double the framerate with double the CPU clock. Ten times the latency can literally halve your framerate and make a game unplayable, despite the great GPU. There's a rough stall-cycle calculation sketched after the links below.
First, what the average game looks like when the CPU doesn't matter (look at the framerate):
Battlefield 3
And now what it looks like when it does:
Skyrim
Borderlands
SimCity
Crysis 3
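As a rough illustration of the "latency behaves like a lower clock" point (my own toy numbers, not measurements of either console), here's a sketch of how miss latency eats into effective CPU throughput:

```cpp
#include <cstdio>

// Toy model: effective instructions-per-cycle when a fraction of instructions
// miss the cache and stall for the full memory latency. The numbers are made up
// purely to illustrate the argument, not measured on any real hardware.
int main() {
    const double baseIPC  = 1.0;   // assume 1 instruction/cycle when never stalled
    const double missRate = 0.02;  // assume 2% of instructions go out to main memory
    const double cpuGHz   = 1.6;

    const double latencies[] = {20.0, 200.0};  // ns, the DDR3 vs GDDR5 figures quoted above
    for (double latencyNs : latencies) {
        double stallCycles    = latencyNs * cpuGHz;               // cycles lost per miss
        double cyclesPerInstr = 1.0 / baseIPC + missRate * stallCycles;
        double effectiveIPC   = 1.0 / cyclesPerInstr;
        printf("latency %5.0f ns -> ~%.2f effective IPC (vs %.2f ideal)\n",
               latencyNs, effectiveIPC, baseIPC);
    }
    // With these toy numbers the 200ns case ends up several times slower,
    // which is the sense in which high latency 'looks like' a slower clock.
    return 0;
}
```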
Video cards with both types of RAM have been tested. GDDR5 always outperforms DDR3 on the same video card for PC games. Why do you think all high-end desktop cards use GDDR5 and not DDR3? Only low-end cards use DDR3.
You should check this page of the thread out.
Two identical video cards put to the test with one having GDDR5 and the other having DDR3.
http://freestepdodge.com/threads/xbox-one-revealed.2850/page-2
Resident Evil 5 was able to achieve double the frame rate on GDDR5 over its DDR3 counterpart, even though both use the same video card.
Nothing I don't already know.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130913
We are talking about graphical power, right?
These are gaming consoles, not PCs.
Read above for why the CPU might still be important for some games. Yes, most games will benefit from GDDR5; a select few might not. Is that so hard to understand? I also mentioned why DDR3 makes sense for the XBO, since MS is pushing multitasking and three OSes running at the same time and all that... Not great for graphics maybe, but it might be good for the overall experience, IF that's your thing. Aside from that, remember the little freezes the PS4 was having with multiple games at E3? It might very well be that the CPU was stalling and causing them, since the CPU needs to feed the GPU. No feed = no picture change. Better coding will probably fix it though... I hope.
Actually, the CPU is becoming less relevant, since most of the traditional CPU tasks used in gaming can now be done on the GPU, which is far more efficient than running them on the CPU.
Most? No... some. GPUs are great at parallel processing; CPUs still beat them at serial processing. Parallel work is stuff like physics, serial work is stuff like AI (nowadays anyway) and post processing. Read the first answer to the question on
this page.
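A small sketch of the distinction (illustrative only, with made-up workloads, not real game code): the physics-style loop below has no dependency between iterations and parallelizes trivially, while the AI-style loop carries state from one step to the next and has to run serially:

```cpp
#include <vector>
#include <cstdio>

// Illustrative only: a parallel-friendly workload vs. a serial one.
struct Particle { float x, v; };

// Physics-style update: every particle is independent, so each iteration
// could run on its own GPU thread (embarrassingly parallel).
void stepPhysics(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) {          // no iteration depends on another
        p.v += -9.8f * dt;
        p.x += p.v * dt;
    }
}

// AI-style update: each decision depends on the state produced by the
// previous one, forming a dependency chain that resists parallelization.
int stepAI(int state, int steps) {
    for (int i = 0; i < steps; ++i) {
        state = (state * 31 + i) % 7;  // stand-in for "next decision from current state"
    }
    return state;
}

int main() {
    std::vector<Particle> ps(4, {0.0f, 0.0f});
    stepPhysics(ps, 0.016f);
    printf("particle x after one step: %f\n", ps[0].x);
    printf("AI state after 10 steps:   %d\n", stepAI(1, 10));
    return 0;
}
```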
Please see all of my posts on this page that are in response to Raansu's question about the CPU's low clock-speed.
http://freestepdodge.com/threads/xbox-one-revealed.2850/page-7
Again, nothing I don't already know.
Oh, it will show in sandbox-world games like GTA. That genre is always starving for more RAM.
No disagreement there.
I don't give my 'opinion' on something, if I don't know what I'm talking about. Why is it that people try so hard to avoid the possibility that the Xbox One might actually have an edge in some circumstances, even if it might be rare? We all know the PS4 has the advantage in general. We gain nothing by repeating that a thousand times. The interesting things are discovered when all the differences are explored, not just the ones that make the PS4 seem better.