Nobody really cares what we do in our labs, so we spend most weekends like this
http://gamrconnect.vgchartz.com/thread.php?id=145884

Beyond that, further information is sketchy and unreliable. DaE reckons that the current devkits were dispatched to studios in February, and feature Intel CPUs and a graphics card that carries the NVIDIA brand - but he doesn't identify either part more specifically. He also claims that the Durango kit features more than 8GB of memory (other sources have suggested 12GB), and that it is 64-bit in nature - at this point it's worth bearing in mind that dev hardware typically features double the RAM of retail kit in order to accommodate debugging tools and other systems. DaE also says that Microsoft is targeting an eight-core CPU for the final retail hardware - if true, this must surely be based around Atom architecture to fit inside the thermal envelope. The hardware configuration seems difficult to believe as it is so divorced from the technological make-up of the current Xbox 360, and we could find no corroborative sources to establish the Intel/NVIDIA hook-up, let alone the eight-core CPU.
However, another source, speaking in the wake of the Durango developer meet-up that took place in London just before GDC this year, corroborates that the system is 64-bit in nature, adding that current DirectX 11 engines developed on PC can be ported to 64-bit and that they will run with no problem on Microsoft's new console. The platform holder achieved immense success by basing the Xbox 360 workflow closely on existing PC development tools and the DirectX API and it looks as though it aims to continue that recipe for success with its next-gen offering.
In an effort to prove the authenticity of his devkit leak, DaE also leaked a screenshot of Microsoft's Visual Studio coding tool, apparently set-up for Durango. None of the developers we spoke to disputed what they were seeing. "As an aside it's got my favourite MS return code in there: ERROR_SUCCESS," chuckled one source.
The presence on this screen of the "immintrin" element strongly suggests that the Durango coding environment is built around x86 CPU architecture, supporting the AVX (advanced vector extensions) instruction set that was added in last year's Sandy Bridge revision. However, AVX is now supported on some of the most recent AMD processors too.
Of course, this anonymous-looking PC box is not the machine Microsoft will be selling at retail next year - it's an alpha kit, assembled from existing parts to best emulate the hardware configuration of the console, while the actual silicon will be in the closing phases of development as we speak. In a world where games typically take two years or more to develop, these units are lashed together to give game-makers a headstart in the absence of final hardware - Xbox 360 alpha kits were little more than PowerMacs with ATI graphics cards installed, and were still being used to demo code as late as E3 2005, mere months before the retail launch.
We are made of the same stuff of dreams
I am really not sure why the sudden focus on "64-bit design"... the PS2 was (in parts) a 128-bit design... and there's no reason to even use double precision at all for games... at least not from what I can see. We're not even fully "single precision" in games yet (single = 32-bit floats, double = 64-bit; a lot of stuff uses half = 16-bit, for speed purposes).
The only reason 64-bit is actually NEEDED is the 4GB RAM barrier (a 32-bit pointer can only address a memory range of 4GB, though even here there are workarounds like PAE) - famously, or rather infamously, 32-bit Windows XP couldn't use more than ~3.x GB of RAM, because the GPU's VRAM is mapped into the same address space and thus gets subtracted from that number.
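To put rough numbers on that 4GB barrier, here's a quick sketch (the 768MB card is just a hypothetical example for the VRAM carve-out, not a claim about any specific system):

```python
# A 32-bit pointer can distinguish exactly 2**32 byte addresses.
addressable_32bit = 2**32                      # bytes
print(addressable_32bit / 2**30)               # 4.0 GiB total address space

# If the GPU's VRAM is mapped into that same 32-bit space, it eats
# into what's left for system RAM (hypothetical 768MB card):
vram = 768 * 2**20                             # bytes
usable_system_ram = addressable_32bit - vram
print(usable_system_ram / 2**30)               # 3.25 GiB left - the "3.x GB" effect
```

Which is exactly why 32-bit XP boxes with a big graphics card reported oddly specific amounts like 3.25GB.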
And guess what... 8 cores (if Atom) will suck ass. The PS3 had "7" cores - different ones, sure, but also a high number. Rumor has it that Sony might use Jaguar cores for the PS4... Jaguar is rumored to be a vectorized successor to Brazos or something. VERY low power, but with the added vector units it might be capable enough to run games efficiently. And by low power, I mean something below 10-15 watts for the whole CPU with 2 cores. Assuming the PS4 design uses it, an eight-core system wouldn't run hot at all. Though the general-purpose performance of these CPUs remains to be seen... I can't see them doing any heavy lifting at all.
Kept you waiting
I too laughed at the "64 bits" focus. The video game industry started working with 64-bit parts back in 1996. Since then we've had various hybrid machines of 16, 32, 64 and 128 bits. PCs have had graphics cards with 512-bit buses.
Assuming the final consumer hardware has 4 GB of RAM, I wonder how much is dedicated to the OS and Kinect 2 leaving how much for CPU/GPU (separate or pooled?).
There were rumors that the 8GB machine sets aside 3GB for the OS... that's so insane^^ My current PC has 6GB of RAM, and without gaming or heavy lifting like compiling, the OS never takes more than 1.5GB (incl. Chrome tabs etc.)... and that's a general-purpose OS with which I can do anything (unlike whatever console OS MS will release).
512MB should be PLENTY, considering my Raspberry Pi can easily run OpenElec with a 50:50 split between OS and GPU (i.e. the FULL OS runs on 128MB of RAM).
They will probably allocate 512MB or something as reserved memory so they don't repeat the mistake Sony made in allocating too small an amount, which kept things like cross-game chat from being patched in. If they shoot on the higher side, they can always free up that space to developers later on, or even use it for new things like multi-user Kinect video cross-game chat or something.
It's a toss-up between size and bandwidth. If the RAM is fast, it can be bigger, as the amount used "per frame" can be sizeable. But if it's slow, it doesn't need to be big, as you can't make use of the extra RAM anyway. Sebbbi (a graphics programmer on Trials...) over at Beyond3D had a good post explaining this in fairly plain language.
I think that, at least for the near term, 4GB should be plenty. I'd rather have 4GB of blazing-fast memory than a slower 8GB. Most current games barely use more than 2GB... most don't (it's also a compiler thing, because going above 2GB can get problematic, especially if the libraries you bought don't support it). Only a handful use more than that, or even CAN. Crysis 1 (not sure about the second one) was one of them: in the final level, with everything set to max, it could exhaust the 4GB address space and crash the 32-bit version that way. Skyrim was not compiled with the "LARGEADDRESSAWARE" flag enabled, artificially limiting it to 2GB of address space, which caused a lot of problems, as the game could easily get bigger than that on higher settings.
Back in the PS3's early days, a lot of games already used or needed more than 512MB overall. Heck, there were even consumer GPUs with more than that amount (the 8800 GTX had 768MB). You probably won't find a GPU with more than 4GB of RAM any time soon... at least not in a normal PC. CAD or compute cards maybe... but that's a different class altogether.
Could that 3GB be for dev kits specifically? My laptop has 16GB of RAM, it's barely using 4GB total most of the time.
And you shouldn't turn it off unless you have a fast SSD. But what I was suggesting is that I doubt your system is even 'using' the 4GB on average that you see listed. Given that a console doesn't need a pre-fetching cache system, that would be a lot of RAM going unused. And with streaming textures, tessellation, megatextures, etc. becoming more widely used, it's not as necessary to bump up console RAM like crazy.
Even so... in a "desktop system", anything beyond 4GB is ridiculous. 2GB would be enough if Windows used a more intelligent caching system. Now, for a "workstation", that's very different, obviously. I recently ran out of RAM (in a Linux shell, mind you) compiling and linking XBMC for ARM... that happened because I enabled LTO (link-time optimization), which is very memory-intensive. Not really something a normal person would do.
Best Buy selling people an office PC with far more RAM than anything they actually do requires (or laptops without a GPU) is just insane. Here, the minimum is 4GB for a netbook and something like 8GB for a desktop or normal laptop. RAM is cheap and can thus easily be doubled to increase the perceived value of a product (a lot of low-end GPUs have more RAM than their very high-end brothers for this very reason).
So... back on topic now. How do you say it? It's not the size of the boat, it's the engine. 4GB of ultra-fast RAM or 8GB of the same? I'd say go for the bigger one... but does it make sense? If the bandwidth isn't high enough, then there's no real reason to increase the size (provided your game doesn't have massive "jumps" in its active data set...).
Let me give you an example. The PS3 has about 22.4GB/s of bandwidth between the GPU and VRAM. Broken down to VERY simple terms, the rendering pipeline reads "some data" (be it textures, meshes or whatever), processes it (rotation, translation, texturing, filtering) and writes it to the framebuffer (the final image). There aren't a lot of cases where it's just reading data, or just writing data, meaning we always go both ways. Now, 22.4GB/s (at 100% efficiency no less, which can never actually be achieved) breaks down to roughly 747MB per frame at 30Hz, or 373MB per frame at 60Hz. Again, this is at 100% efficiency. And this doesn't involve ANY post-processing (which can be very bandwidth-intensive) or similar. Just what the GPU can read/write to the RAM. Now, I am not sure if it can do both simultaneously (i.e. full-speed writes and full-speed reads at the same time, or whether it shares the bus). Because if not, that's HALF the amount, again. The PS3 does have an additional I/O option with XDR, so that alleviates this issue a bit.
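The per-frame breakdown above is simple division; a minimal sketch (using the post's 22.4GB/s figure and decimal units, ignoring all real-world efficiency losses):

```python
# Back-of-envelope per-frame bandwidth budget at 100% (unachievable) efficiency.
def per_frame_budget_mb(bandwidth_gb_s: float, fps: int) -> float:
    """How many MB can cross the bus in a single frame (decimal units)."""
    return bandwidth_gb_s * 1000 / fps

print(round(per_frame_budget_mb(22.4, 30)))  # ~747 MB per frame at 30Hz
print(round(per_frame_budget_mb(22.4, 60)))  # ~373 MB per frame at 60Hz
```

And remember that the budget covers reads AND writes combined, so the data you can actually touch per frame is smaller still.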
Now think about it... why keep stuff in memory that you can't use in a given frame? The only (?) reason would be caching. But we have an HDD and a Blu-ray drive to read data from all the time. And we don't swap out all the content in RAM in any given frame, either. Games like GTA can have a full "swap" of textures within a frame or two (just look behind you). But that's just a "small" part. And since the game is completely arbitrary as to what the player can do, there's no way to anticipate and cache a whole lot of data, either. And what good would it do if they had more RAM and the same bandwidth? Probably not a whole lot. The PC version of GTA4 has higher-resolution textures, but PCs also have MUCH faster VRAM and MUCH more RAM and MUCH faster GPUs.
So... doubling your RAM, in essence, also requires you to double the bandwidth to make use of said RAM. There IS stuff you can do with the additional RAM (and no additional bandwidth), no question, but your game probably won't look better because of it.
Well, for me, I need more than 4 - 8GB is a minimum for sound engineering; a lot of real-time plug-ins are resource hogs, especially RTAS drum plug-ins like BFD or Superior. You can bounce down to audio tracks, but messing with latency and the playback engine... urgh (but yeah, it's a workstation as @Segitz mentioned, not a casual PC). Anyway, way off topic.
I'd personally just like to see both the next Xbox and PlayStation have somewhere between 6-8GB of RAM, if only to impress anyone with a current PC like in the old days... but after this gen's small memory footprint, I'm not too hopeful about how far MSony will push their system specs. I don't think the jump in hardware is going to be that great; however, the cost of parts must have gone down over the past 7(?) years... Maybe they'll pull something out of the bag and surprise me.
Even a slow SSD is far faster than a mechanical hard drive. There are some exceptions, such as a system without TRIM support, but Apple added that with OS X 10.6.8.
Ah... the new m4 series by... whoever (Crucial or something), in its very low-end variants, is actually slower than most HDDs. But they're also freaking cheap.