
Thread: Crytek: Crysis 2 to push Xbox 360, PS3 capabilities to the limit

  1. #11
    Join Date
    Aug 2002
    Posts
    6,327
    Rep Power
    0

    Default

    Dual octa-core with a 256 MB Nvidia 7800, or a single tri-core with a 512 MB 4890?

    Which would you pick?

    Eventually, when shaders are moved back to the CPU and PROPERLY optimised (DX12+ and Windows 9), the extra cores will help, but until then the best system should be obvious.
    Let Them Hate, So Long As They Fear.
    You do not know whereof you speak, and your words are empty things.
    Listen and gain Wisdom.

    http://twitter.com/nestersan

  2. #12
    Join Date
    Oct 2006
    Posts
    9,074
    Rep Power
    27

    Default

    4890? What kind of exaggeration is that? And it depends on the game.

    I will have to read up on the 360 GPU, but I don't think it's that much better: just some special components that give it AA, and that eDRAM, which is too small for some things.
    This posting is provided "AS IS" with no warranties, and confers no rights.
    You assume all risk for your use. © 2006 Azix Solutions.
    All rights reserved.

    Dropbox: http://db.tt/8qVS35lo

  3. #13
    Join Date
    Aug 2002
    Posts
    6,327
    Rep Power
    0

    Default

    The ATI GPU is a unified-shader DX9/DX10 hybrid with tessellation.
    Let Them Hate, So Long As They Fear.
    You do not know whereof you speak, and your words are empty things.
    Listen and gain Wisdom.

    http://twitter.com/nestersan

  4. #14
    Join Date
    Oct 2006
    Posts
    9,074
    Rep Power
    27

    Default

    That can't do HDR. lolz...
    This posting is provided "AS IS" with no warranties, and confers no rights.
    You assume all risk for your use. © 2006 Azix Solutions.
    All rights reserved.

    Dropbox: http://db.tt/8qVS35lo

  5. #15
    Join Date
    Aug 2002
    Posts
    6,327
    Rep Power
    0

    Default

    Quote Originally Posted by semitop View Post
    That can't do HDR. lolz...

    Devs will have to use a custom solution for HDR, such as NAO32, which is not the standard way to do HDR. Even with NAO32, the RSX really doesn't have the bandwidth to do AA and HDR (or the like) at the same time, and it shows in titles like VF5, which shipped on both platforms: VF5 on the 360 has a higher level of AA than the PS3 version. See also The Darkness, which has HDR and AA on the 360 but only HDR on the PS3 version.
    I do not know how much the Cell can be used to make up for this, but long ago, in a thread much like this one, a programmer pointed out that whatever trick you can use on the PS3 to assist with HDR and AA on the CPU, you can also use on the 360 to equal or greater effect. In the end, either the RSX was a poor choice or the 360 GPU is just that good.
    PS1 graphics in late 1994 and all through 1995 were ahead of anything the most expensive PCs with the best graphics cards could produce.

    PS2 graphics rendering performance in 2000 was ahead of the highest-end Nvidia cards of the time, the GeForce 2 GTS and GeForce 2 Ultra.

    PS3? lol. It comes out in late 2006 with a GPU that can be considered upper-mid-range by mid-2005 standards, and low-end by the time the PS3 is released. WTF.
    The Xbox 360 does AA and HDR almost as a native hardware effect. That is why the eDRAM is there: it takes care of both those things (and much more) without even touching the main die. Current GPUs, sans the very recent XT cards, aren't very adept at doing both, and even those cards until recently took a nice performance hit (still not as "free" as the 360's). Also, the FP10 blending solution used by Xenos is AWESOME for the 360 and is currently producing the best HDR results with the least amount of visual inconsistencies (for now).

    NVIDIA CAN'T do AA+HDR at the same time unless the game engine/driver is specially coded to use 1.5x1.5 supersampling, but that has a bigger performance hit and produces blurry textures.

    NAO32 is a format that gained some fame in the dev community when ex-Ninja Theory programmer Marco Salvi shared some details on the technique over on the Beyond3D forums. Used in the game Heavenly Sword, it allowed multi-sampling to be used in conjunction with HDR on a platform (PS3) whose GPU didn't support multi-sampling of floating-point surfaces (the RSX is heavily based on Nvidia's G70). In this technique, color is stored in the LogLuv format using a standard R8G8B8A8 surface. Two components are used to store X and Y at 8-bit precision, and the other two are used to store the log of luminance at 16-bit precision. Having 16 bits for luminance allows a wide dynamic range to be stored in this format, and storing the log of the luminance allows for linear filtering in multi-sampling or texture sampling. Since he first explained it, other games such as Naughty Dog's Uncharted have also used it, and it has likely been used in many other PS3 games as well.

    My actual shader implementation was helped along quite a bit by Christer Ericson's blog post, which described how to derive optimized shader code for encoding RGB into the LogLuv format. Using his code as a starting point, I came up with the following HLSL code for encoding and decoding:

    // M matrix, for encoding RGB into a modified CIE XYZ space
    const static float3x3 M = float3x3(
        0.2209, 0.3390, 0.4184,
        0.1138, 0.6780, 0.7319,
        0.0102, 0.1130, 0.2969);

    // Inverse M matrix, for decoding back to RGB
    const static float3x3 InverseM = float3x3(
        6.0013, -2.700, -1.7995,
        -1.332, 3.1029, -5.7720,
        0.3007, -1.088, 5.6268);

    float4 LogLuvEncode(in float3 vRGB)
    {
        float4 vResult;
        float3 Xp_Y_XYZp = mul(vRGB, M);
        // Clamp away zeros so the log2 below is defined
        Xp_Y_XYZp = max(Xp_Y_XYZp, float3(1e-6, 1e-6, 1e-6));
        // Chromaticity goes in the first two 8-bit channels
        vResult.xy = Xp_Y_XYZp.xy / Xp_Y_XYZp.z;
        // Log luminance, biased into range and split across the other two channels
        float Le = 2 * log2(Xp_Y_XYZp.y) + 127;
        vResult.w = frac(Le);
        vResult.z = (Le - (floor(vResult.w * 255.0f)) / 255.0f) / 255.0f;
        return vResult;
    }

    float3 LogLuvDecode(in float4 vLogLuv)
    {
        // Reassemble the 16-bit log luminance and undo the bias
        float Le = vLogLuv.z * 255 + vLogLuv.w;
        float3 Xp_Y_XYZp;
        Xp_Y_XYZp.y = exp2((Le - 127) / 2);
        // Recover X and Z from the stored chromaticity
        Xp_Y_XYZp.z = Xp_Y_XYZp.y / vLogLuv.y;
        Xp_Y_XYZp.x = vLogLuv.x * Xp_Y_XYZp.z;
        float3 vRGB = mul(Xp_Y_XYZp, InverseM);
        return max(vRGB, 0);
    }


    Once I had this implemented and worked through a few small glitches, results were much improved in the 360 version. Performance was much, much better; I could multi-sample again, and the results looked great. So once again things didn't exactly work out in an ideal way, but I'm pleased with the results.
    So, he took a PS3 HDR technique used in commercial games (Heavenly Sword and Uncharted), used it on the 360, and got better results. ROTFL... LOLERS

    You aren't tired of the UFO crash yet?? Uncharted is the PS3's BEST game, yet the techniques used work better on the 360...
    FAILTALITY...


    -The 360 has 512MB of unified memory, which means it's shared by both the CPU and the GPU. You don't have all of that available to you, since some of it is taken up by the console's "OS" and some will also be taken up by the .NET Compact Framework. You'll also be working from the managed heap, rather than directly working with native memory.

    -You can only execute pure managed code on the 360. You can't, for example, P/Invoke into a non-managed DLL.

    -As far as GPU shaders go, you're pretty unrestricted. You can use SM3.0 HLSL, or you can also write portions of your shaders in the GPU's native microcode. This is really very nice...it lets you do things like un-normalized texture addressing, full texturing capabilities in the vertex shader, or directly fetching an element from a vertex stream. The microcode set is referred to as xvs_3_0 and xps_3_0.

    -The 360's GPU is different from your average PC GPU in that it has an eDRAM framebuffer. The eDRAM is 10MB in size, and has tremendous bandwidth (256GB/s). What this means is that writing out to the framebuffer, or reading it back for blending, is very, very quick. Multi-sampling is also very quick, since again you don't have the bandwidth problem. In fact MSAA would be "free" if it weren't for tiled rendering... the downside of eDRAM is that if your render target + z-buffer is too big to fit in eDRAM, you have to render to it in tiles. This means you render one portion of the target, then another. This isn't so bad, except for the fact that any geometry on the edge of two tiles has to be drawn twice. If you're not doing scenes with hugely complex geometry you probably won't even notice tiling (it happens automatically). To figure out whether you're going to tile, count the number of bytes per pixel and then multiply by the resolution. So for example if you're rendering to the Color format, which is 4 bytes per pixel, and you're using Depth24Stencil8, which is also 4 bytes per pixel, you have 8 bytes per pixel total. When multi-sampling, you multiply this amount by the number of samples (so 4xMSAA would be 32 bytes per pixel). 1280 x 720 with 4xMSAA would be ~28MB, so you'd need 3 tiles.

    This is what the DF (Digital Foundry) guy said might happen with FF13.
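
    As a quick check on that tile arithmetic, here is a minimal sketch (Java; the class and method names are mine, not from the post; the 10 MB eDRAM size and per-pixel byte counts are the figures quoted above):

    // Estimate how many eDRAM tiles a render target needs.
    // Assumes the 10 MB eDRAM size quoted above; names are illustrative.
    public class EdramTiles {
        static final long EDRAM_BYTES = 10L * 1024 * 1024;

        static int tilesNeeded(int width, int height,
                               int colorBytesPerPixel, int depthBytesPerPixel,
                               int msaaSamples) {
            // Each sample carries its own color and depth/stencil data.
            long bytesPerPixel = (long) (colorBytesPerPixel + depthBytesPerPixel) * msaaSamples;
            long totalBytes = bytesPerPixel * width * height;
            // Round up: any remainder still needs a whole tile.
            return (int) ((totalBytes + EDRAM_BYTES - 1) / EDRAM_BYTES);
        }

        public static void main(String[] args) {
            // Color (4 bytes) + Depth24Stencil8 (4 bytes) at 1280x720 with 4xMSAA:
            // 8 bytes * 4 samples * 1280 * 720 is roughly 28 MB, hence 3 tiles.
            System.out.println(tilesNeeded(1280, 720, 4, 4, 4)); // prints 3
        }
    }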

    -Be prepared to get CPU-bound really quick if you're doing anything non-trivial. DrawPrimitive calls are extremely expensive on the 360...I've seen my framerate go from about 70 to 30 just from going from 24 DP calls to 34. Instancing is a must if you need to draw a lot of meshes...there's a good sample on the CC website. By the same token if you're doing any really fancy logic on the CPU that's not graphics-related, you'll probably need to run it on another thread on a different core since your main thread can get bogged down pretty quick.
    Need to be skilled at multi-thread programming (just like PS3)
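
    To illustrate why instancing matters here, a minimal sketch of the CPU-side difference (the Gpu interface below is hypothetical, standing in for XNA's GraphicsDevice, which is not reproduced here; the point is only the draw-call count):

    // Hypothetical interface -- not XNA's actual API.
    interface Gpu {
        void drawMesh(Object mesh, float[] transform);              // one DrawPrimitive call
        void drawMeshInstanced(Object mesh, float[][] transforms);  // one call, many instances
    }

    class CrowdRenderer {
        // Naive: one expensive draw call per object -- CPU-bound fast on the 360.
        static void drawNaive(Gpu gpu, Object mesh, float[][] transforms) {
            for (float[] t : transforms) {
                gpu.drawMesh(mesh, t); // N objects -> N draw calls
            }
        }

        // Instanced: submit all transforms together and issue a single call.
        static void drawInstanced(Gpu gpu, Object mesh, float[][] transforms) {
            gpu.drawMeshInstanced(mesh, transforms); // N objects -> 1 draw call
        }
    }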

    -Watch out for performance pitfalls with the .NET Compact Framework. Things like Garbage Collection compaction and virtual function calls are much more expensive than they are on the PC. I suggest reading this blog for tips. Just remember to keep your live object count as low as possible, and you should be okay.
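
    One common way to keep the live object count down is pooling: allocate everything up front and reuse it, so the collector has almost nothing to trace or compact each frame. A minimal sketch (illustrative only, not from the original post):

    import java.util.ArrayDeque;

    // Reuse objects instead of allocating per frame, keeping GC churn low.
    class ParticlePool {
        static final class Particle { float x, y, vx, vy; }

        private final ArrayDeque<Particle> free = new ArrayDeque<>();

        ParticlePool(int capacity) {
            // Pay the allocation cost once, at load time.
            for (int i = 0; i < capacity; i++) free.push(new Particle());
        }

        Particle acquire() {
            // Fall back to allocation only if the pool runs dry.
            return free.isEmpty() ? new Particle() : free.pop();
        }

        void release(Particle p) {
            free.push(p); // recycled, never garbage-collected
        }
    }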

    -Floating-point performance is not so great on the 360 CPU. Most of the fp power is in the vector units, but you have no access to those through XNA.

    -Avoid the surface formats that are larger than 32bpp, mainly HalfVector4 and Vector2. Their performance is generally pretty terrible, and I've run into all kinds of driver bugs with them. This means you can't do HDR in a straightforward way, but there are other options. There's an entry on my XNA blog where I talk about how I got around it.

    -Watch your texture sampling bandwidth. Framebuffer access may be quick, but reads from textures are limited by the 22.1GB/s read bandwidth. This may be quite a bit less than what you're used to, if you're prototyping on a higher-end card like an 8800. This can be especially painful in scenarios where you want to take multiple samples per pixel, like PCF for shadow maps or SSAO.
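
    To get a feel for that limit, a rough back-of-the-envelope estimate (only the 22.1 GB/s figure comes from the post; the kernel size and texel format are assumed for illustration):

    // Rough texture-read bandwidth estimate for one multi-sample pass.
    public class BandwidthCheck {
        public static void main(String[] args) {
            double busGBps = 22.1;          // 360 texture read bandwidth (from the post)
            int width = 1280, height = 720; // render resolution
            int samplesPerPixel = 16;       // e.g. a 4x4 PCF shadow kernel (assumed)
            int bytesPerSample = 4;         // 32bpp texel (assumed)
            int fps = 60;

            double gbPerFrame = (double) width * height * samplesPerPixel * bytesPerSample / 1e9;
            double gbPerSecond = gbPerFrame * fps;
            // Around 3.5 GB/s for this single pass -- a big slice of the
            // 22.1 GB/s budget before any other texturing in the frame.
            System.out.printf("%.2f GB/s of %.1f GB/s%n", gbPerSecond, busGBps);
        }
    }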

    -Prototyping and developing on the PC is a good idea since you have access to PIX, but make sure you test pretty often on the 360. You may need to optimize for quite a few scenarios if you need to keep the framerate up.
    Last edited by Nestersan; Jun 6, 2009 at 01:02 PM.
    Let Them Hate, So Long As They Fear.
    You do not know whereof you speak, and your words are empty things.
    Listen and gain Wisdom.

    http://twitter.com/nestersan

  6. #16
    Join Date
    Oct 2006
    Posts
    9,074
    Rep Power
    27

    Default

    eDRAM is only 10MB, and the Digital Foundry folks already said that's not enough even for 2xAA at 720p with HDR.

    Be prepared to get CPU-bound really quick if you're doing anything non-trivial. DrawPrimitive calls are extremely expensive on the 360...I've seen my framerate go from about 70 to 30 just from going from 24 DP calls to 34. Instancing is a must if you need to draw a lot of meshes...there's a good sample on the CC website. By the same token if you're doing any really fancy logic on the CPU that's not graphics-related, you'll probably need to run it on another thread on a different core since your main thread can get bogged down pretty quick.
    Need to be skilled at multi-thread programming (just like PS3)? FAIL. lol, this doesn't sound anything like the PS3's multithreading. This is saying the 360 CPU is RUBBISH.

    Post links, man.

    Watch out for performance pitfalls with the .NET Compact Framework. Things like Garbage Collection compaction and virtual function calls are much more expensive than they are on the PC. I suggest reading this blog for tips. Just remember to keep your live object count as low as possible, and you should be okay.
    FAILOSAUROUS...

    Floating-point performance is not so great on the 360 CPU. Most of the fp power is in the vector units, but you have no access to those through XNA.
    FAILaTioN

    Which is more important to the realization of a game world: AA or proper HDR? I haven't really noticed AA problems in the games I play, but the thing I wish Sony would do right is scaling. Supporting only HD resolutions can't work when I am using a 1680x1050 monitor to play.
    Last edited by semitop; Jun 6, 2009 at 01:21 PM.
    This posting is provided "AS IS" with no warranties, and confers no rights.
    You assume all risk for your use. © 2006 Azix Solutions.
    All rights reserved.

    Dropbox: http://db.tt/8qVS35lo

  7. #17
    Join Date
    Aug 2002
    Posts
    6,327
    Rep Power
    0

    Default

    Scaling has nothing to do with Sony. It has to do with the fact that the only resolutions that are HD (as far as the broadcasting commission, and by extension movies, consoles, and TV, are concerned) are 1280x720 and 1920x1080; they have no real responsibility to support any other resolutions than those.
    It is your fault for getting a non-native HD device...

    How are those things fails, considering he is comparing it to a PC and not to a PS3?

    So despite having two times the FP power, the PS3 has yet to demonstrate anything that cannot be done on the 360?
    Call out GOW and KZ2 all you like; those games are EXCLUSIVES with huge budgets. KZ2 took four years to get to where it is.
    Four years was almost the lifespan of the PS1...

    M$ has no dev teams who make games like that. They have a very ****ty set of internal devs. The only true comparison possible is multi-platform games, and the 360 has been winning so far...

    Metal Gear will be the decider, 'cause that was an exclusive and everyone screamed it was not possible, so time will tell.

    Those are problems specific to the Xbox and its OS. The Sony devs have just as many weird issues, like crazy bandwidth limitations, and the fact that the SPEs need synchronisation or you waste endless cycles doing nothing.

    Last edited by Nestersan; Jun 6, 2009 at 02:21 PM.
    Let Them Hate, So Long As They Fear.
    You do not know whereof you speak, and your words are empty things.
    Listen and gain Wisdom.

    http://twitter.com/nestersan

  8. #18
    Join Date
    Oct 2006
    Posts
    9,074
    Rep Power
    27

    Default

    Quote Originally Posted by Nestersan View Post
    Scaling has nothing to do with Sony. It has to do with the fact that the only resolutions that are HD (as far as the broadcasting commission, and by extension movies, consoles, and TV, are concerned) are 1280x720 and 1920x1080; they have no real responsibility to support any other resolutions than those.
    It is your fault for getting a non-native HD device...

    How are those things fails, considering he is comparing it to a PC and not to a PS3?

    So despite having two times the FP power, the PS3 has yet to demonstrate anything that cannot be done on the 360?
    Call out GOW and KZ2 all you like; those games are EXCLUSIVES with huge budgets. KZ2 took four years to get to where it is.
    Four years was almost the lifespan of the PS1...

    M$ has no dev teams who make games like that. They have a very ****ty set of internal devs. The only true comparison possible is multi-platform games, and the 360 has been winning so far...

    Metal Gear will be the decider, 'cause that was an exclusive and everyone screamed it was not possible, so time will tell.

    Those are problems specific to the Xbox and its OS. The Sony devs have just as many weird issues, like crazy bandwidth limitations, and the fact that the SPEs need synchronisation or you waste endless cycles doing nothing.


    Hmmm? It is their responsibility to consider that their customers might not be playing on a TV, or even on an HD screen. The Xbox supports 1680x1050, and I would hope the PS3 will get some firmware update to help. There is no reason to ONLY support HD resolutions.

    "So far" is right. I have been playing the Fight Night Round 4 demo on PS3, and even on my lame monitor it looks decently sharp. The graphics are great, though I would like to see more character shadows (on themselves and on the other character when they punch, etc.). Soon enough the only thing left to differentiate them will be exclusives, and the PS3 pwns there. Four years is a bad development cycle for AAA games? Well, look at Uncharted, which didn't have that much time between the PS3's release and the release of the two games.
    Last edited by semitop; Jun 6, 2009 at 02:20 PM.
    This posting is provided "AS IS" with no warranties, and confers no rights.
    You assume all risk for your use. © 2006 Azix Solutions.
    All rights reserved.

    Dropbox: http://db.tt/8qVS35lo

  9. #19
    Join Date
    Jan 2005
    Posts
    875
    Rep Power
    0

    Default

    Quote Originally Posted by Lanza View Post
    haha, you're going to keep going until the fanboys shoot you in your foot, you know



    What makes you think that?? Almost every gaming website that compares the two says the same thing.
    Video Game Technician in Jamaica 3968609
    Ultra Aluminus Black Case/ASUS Rampage Formula 2 /Intel Core 2 Quad Q6600 oc 3.6/ZALMAN CNPS9700/8GB G-Skill ripjaw DDR3 1600/EVGA GeForce GTX super oc 470 edition /SAMSUNG Black 32" Widescreen hdtv/etc/

  10. #20
    Join Date
    Jan 2005
    Posts
    875
    Rep Power
    0

    Default

    Quote Originally Posted by pezz View Post
    and here I thought that, being a technician, you would be one of the intelligent ones; shame on you, man

    Hey, I'm not hating or anything, but the fact is the Sony PS3 is a powerhouse compared to the 360. If Sony had made the PS3 easy to develop for like the 360, trust me, it would be a mighty battle. Even ATI, who made the 360 GPU, came out and said, not in so many words, that the PS3 GPU is far ahead of its time. But because the PS3 has to share 256 MB of XDR memory, it's hard. You need to read the spec list and see how the memory is shared in the PS3, because the PS3 OS uses 96 MB of main memory and doesn't release it back to the system. The bottom line is there is so much bad memory management, and the devkits were hard to use. Developers are just finding out how to code their titles better: look at the E3 '09 PS3 games, look at the backgrounds and how they're starting to show damage more than the 360's, because devs are finding out how to push the Cell. The PS3 can show it over its 5-to-10-year life.

    This is what EA said at E3; it's the same thing I'm trying to say:

    He then goes on to talk about the PS3’s potential, and whether more can be squeezed out of Xbox 360.

    "Sony has a lot of good games this year. If you go to their [E3] booth, there's a very consistent, high quality product line-up and that will help them. I do think that we'll see developers inside the organisation getting to understand the PS3 better and I think that we're getting more power out of PS3 right now... I think that we've maxed out the 360 but we haven't maxed out the PS3."
    Last edited by **ScarFace**; Jun 6, 2009 at 03:45 PM.
    Video Game Technician in Jamaica 3968609
    Ultra Aluminus Black Case/ASUS Rampage Formula 2 /Intel Core 2 Quad Q6600 oc 3.6/ZALMAN CNPS9700/8GB G-Skill ripjaw DDR3 1600/EVGA GeForce GTX super oc 470 edition /SAMSUNG Black 32" Widescreen hdtv/etc/
