That simply isn't true. The eSRAM works much like the eDRAM of the X360, except that eSRAM is faster and has a few different features. The eSRAM is connected to the GPU to increase bandwidth for the GPU. The GPU communicates with the RAM, the CPU, and the eSRAM; the eSRAM mainly communicates with the GPU. Routing its traffic through the normal RAM would completely defeat its purpose. It might as well not be there.
Not really. It'll lower its effectiveness, definitely, but it's still better to have it than not. If its bandwidth is reduced from 100GB/s to 50GB/s (random numbers), that's still 50 additional GB/s on top of the DDR3 bandwidth.
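To make that arithmetic concrete, here's a quick Python sketch. The bandwidth figures are placeholders in the same spirit as the random numbers above, not real hardware specs:

```python
# Illustrating the point above: even a degraded eSRAM still adds bandwidth
# on top of the DDR3 pool. All numbers are placeholders, not real specs.

def total_bandwidth(ddr3_gbps, esram_gbps):
    """Peak combined bandwidth when both memory pools are used at once."""
    return ddr3_gbps + esram_gbps

DDR3 = 68          # GB/s, illustrative main-memory bandwidth
ESRAM_FULL = 100   # GB/s, the hypothetical 'full' eSRAM figure above
ESRAM_HALVED = 50  # GB/s, the hypothetical degraded figure

print(total_bandwidth(DDR3, ESRAM_FULL))    # 168
print(total_bandwidth(DDR3, ESRAM_HALVED))  # 118 -- still well above DDR3 alone
```

The point is just that halving the eSRAM's effectiveness shrinks the gain; it doesn't erase it.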
Again, it works much like the eDRAM of the X360. It's not there for size, it's there for bandwidth. It mainly works as a frame buffer, freeing up the bandwidth of the main RAM for other things. A 32-bit 1080p frame is typically 8.3MB, so with 32MB you can fit three 32-bit 1080p frames in there. It's actually almost four, missing around 1MB, depending on whether that 32MB is counted in decimal or binary megabytes. That's more than enough to hold a stencil buffer, color buffer, and depth buffer at the same time. And sometimes these are even reduced to 24-bit (particularly the stencil buffer), allowing for more freedom inside the eSRAM; a 24-bit 1080p frame is 6.2MB.
There's a reason why 10MB still helped on the X360. Although it wasn't large enough for 'free' MSAA x4, it still added flexibility. A 32-bit 720p frame is typically 3.7MB. When developers hit the 10MB limit, they resorted to tiling to keep it useful. The same thing will probably happen with the XBO and its eSRAM.
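You can check the buffer math from the last two paragraphs with a few lines of Python. The only assumption here is whether "32MB" means decimal megabytes or binary mebibytes, which is exactly the ambiguity mentioned above:

```python
# Checking the frame-buffer figures quoted above.

def frame_bytes(width, height, bits_per_pixel):
    """Size of one uncompressed frame in bytes."""
    return width * height * bits_per_pixel // 8

print(frame_bytes(1920, 1080, 32))  # 8294400 bytes ~= 8.3 MB
print(frame_bytes(1920, 1080, 24))  # 6220800 bytes ~= 6.2 MB
print(frame_bytes(1280, 720, 32))   # 3686400 bytes ~= 3.7 MB

# How many full 32-bit 1080p frames fit in the eSRAM depends on how
# the "32MB" is counted:
print(32_000_000 // frame_bytes(1920, 1080, 32))        # 3 (decimal MB)
print(32 * 1024 * 1024 // frame_bytes(1920, 1080, 32))  # 4 (binary MiB)
```

So "three, almost four" is exactly right: decimal megabytes give you three frames, binary mebibytes squeeze in a fourth.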
Not really. GDDR5 latency is actually extremely high compared to DDR3. Basically, if DDR3 latency is 20ns, it's 200ns for GDDR5. That doesn't seem like much, but your CPU thinks it's a lot.
I said the same thing, but I also said that even though they're not that common, there are still games that are more CPU-intensive than GPU-intensive. Let me give you a few links so you can verify it yourself. Note that these compare either different CPU speeds or different CPUs altogether, not latency directly. However, added latency is equivalent to reducing the clock speed, since latency stalls CPUs like crazy, reducing the number of clock cycles that actually do useful work. In a few of these examples you can literally get double the framerate with double the CPU clock. Ten times the latency can cut your framerate in half and make a game unplayable, despite the great GPU.
First, what the average game looks like when the CPU doesn't matter (look at the framerate):
Battlefield 3
And now what it looks like when it does:
Skyrim
Borderlands
SimCity
Crysis 3
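The "latency acts like a lower clock" argument above can be sketched with a toy model. Everything in it is made up for illustration (the 1.6GHz clock, the 2% miss rate, the 20ns vs 200ns latencies echo the hypothetical numbers earlier), so treat it as the shape of the effect, not a benchmark:

```python
# Toy model: a fraction of instructions miss cache and stall the CPU for
# the full memory latency. Illustrative numbers only, not measured specs.

def effective_ips(clock_hz, miss_rate, latency_ns):
    """Effective instructions/second given stalls on memory misses."""
    cycle_ns = 1e9 / clock_hz
    avg_ns_per_instruction = cycle_ns + miss_rate * latency_ns
    return 1e9 / avg_ns_per_instruction

# Hypothetical 1.6GHz CPU where 2% of instructions go out to main memory:
ddr3_like = effective_ips(1.6e9, 0.02, 20)    # 20ns latency
gddr5_like = effective_ips(1.6e9, 0.02, 200)  # 200ns latency (10x, as above)

print(gddr5_like / ddr3_like)  # roughly 0.22 -- like cutting the clock ~4.5x
```

Under those assumptions, ten times the memory latency behaves like a CPU running at well under half its clock, which is why latency-sensitive (i.e. CPU-bound) games would feel it.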
Nothing I don't already know.
Read above for why the CPU might still be important for some games. Yes, most games will benefit from GDDR5; a select few might not. Is that so hard to understand? I also mentioned why DDR3 makes sense for the XBO, since MS is pushing multitasking and three OSes running at the same time and so on... Not great for graphics maybe, but it might be good for the overall experience, IF that's your thing. Aside from that, remember the little freezes the PS4 was having with multiple games at E3? It might very well be that the CPU was stalling and causing them, since the CPU needs to feed the GPU. No feed = no picture change. Better coding will probably fix it though... I hope.
Most? No... some. GPUs are great at parallel processing; CPUs still outdo them at serial processing. Parallel work is stuff like physics. Serial work is stuff like AI (nowadays anyway) and post-processing. Read the first answer to the question on this page.
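To illustrate that distinction with made-up stand-in computations (not real game code): per-particle physics is independent work a GPU can spread across thousands of threads, while a chain of decisions where each step needs the previous result has to run one step at a time no matter how much hardware you throw at it:

```python
# Parallel-friendly: each particle's update depends only on its own data,
# so a GPU could run one thread per particle. (Stand-in example.)
def update_particles(positions, velocities, dt):
    return [p + v * dt for p, v in zip(positions, velocities)]

# Serial: each step depends on the previous result, so no amount of
# parallel hardware speeds up the chain itself. (Stand-in example.)
def decision_chain(state, steps):
    for _ in range(steps):
        state = (state * 3 + 1) % 97  # placeholder for a dependent decision
    return state

print(update_particles([0.0, 10.0], [1.0, -2.0], 0.5))  # [0.5, 9.0]
print(decision_chain(1, 3))  # 40: each step had to wait for the last
```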
Again, nothing I don't already know.
No disagreement there.
I don't give my 'opinion' on something if I don't know what I'm talking about. Why is it that people try so hard to avoid the possibility that the Xbox One might actually have an edge in some circumstances, even if it might be rare? We all know the PS4 has the advantage in general. We gain nothing by repeating that a thousand times. The interesting things are discovered when all the differences are explored, not just the ones that make the PS4 seem better.
Is that including AA? Any links? (I'd like to read up more on it.) Generally, from what I have read, there seems to be a mystery over the actual effectiveness and use of the 32MB eSRAM in the Xbox One compared to the situation in the 360, or whether it's even intended to function the same way.
Why is it that people try so hard to avoid the possibility that the Xbox One might actually have an edge in some circumstances, even if it might be rare?
When talking specifically about games, what other really significant, noticeable advantages can it have beyond helping bandwidth?