So yeah... if you want the Steam version of Last Round, you've probably already heard about the specs, and you've probably also heard that the new stages Crimson and Danger Zone, as well as the Soft Engine, will be missing from the game. We're going to document this information on the front page because it is certainly important news for anyone interested in this PC port.
There is also a rumor floating around that the game will be delayed again on PC. That's not good... more information should arrive from Team NINJA next week.
Thanks to community member @Jyakotu for reporting the following PC Specs:
Minimum:
OS: Windows Vista / 7 / 8 / 8.1 (32-bit / 64-bit)
CPU: Core i7-870 or better
Memory: 2 GB or more
Hard disk: 10.0 GB or more of free space
Display: capable of 1280 × 720 resolution
Video Card: 1 GB or more of VRAM, DirectX 9.0c or later
Sound Card: DirectX 9.0c compatible or later
Recommended:
OS: Windows Vista / 7 / 8 / 8.1 (32-bit / 64-bit)
CPU: Core i7-2600 or better
Memory: 4 GB or more
Hard disk: 10.0 GB or more of free space
Display: capable of 1920 × 1080 or higher, True Color
Video Card: 1.5 GB or more of VRAM, DirectX 9.0c or later
Sound Card: DirectX 9.0c compatible or later
This is going to sound immensely hypocritical at first, I'm fully aware: right now I have a GTX 970 and an i5-4690K @ 4.5 GHz. I like tech; I dig having new parts. I built this thing with the help of a friend earlier this year, and we did the best we could without going into SLI or burning an immense amount of money on a GTX 980. I don't regret it even with the new findings on the 970, which surfaced a couple of weeks after the build was done; the 970 definitely gets the job done as far as I'm concerned.
The thing is that I don't upgrade often, if at all. My previous build (done in 2007) had an 8800 GT, which I replaced with a 9800 GT (a horrible trade by all accounts) a couple of years later ONLY because it failed due to a power surge. The 9800 GT was my graphics card until late last year. It served me well and played most of the games I wanted. EDIT: It pays to say that I've also played extensively on a laptop with integrated graphics. It never bothered me; I just had to be selective about which games to play. I also played Metal Gear Rising: Revengeance on a Mac Mini with an Intel HD 4000. It ran at 580p or whatever, all effects turned off, but it ran smoothly and it was a BLAST to play. Super fun.
Here is a very particular example of my whole mindset: I remember playing Skyrim when it came out. All low, 30 fps average. The 9800 (the previous card had already failed by the time I got my hands on this particular game) was definitely straining there. It ran smoothly for me, no stutter, but everything was turned down or off. I thought the game was absolute trash. Combat was crap, exploration was OK but everything looked the same. The game was ridiculously grey. I simply went and played something else. If I had thought Skyrim was fun back then, I would have kept playing, because the computer experience for me was fine. Now I've installed Skyrim again, have literally 237 mods running (plus a ton of texture replacements, downsampled 4K stuff, etc.), and the game runs at about 85 fps if I leave it uncapped (which I don't, because that does nothing for me). Is the game more interesting now because it has all the bells and whistles? Surely. But nothing stopped me from simply playing something else when that was outside of my reach, which is exactly what I did. Back then I think I was playing a lot of Splinter Cell and BioShock. Nothing was running great, but I never stopped having fun. Right now, all the bells and whistles that I bought won't turn Super Metroid into a better game (because that's basically impossible anyway).
I heavily disagree with what you said earlier about there being, basically, a correct way to PC game; and if you think the industry is dying, it is because too many people think like you, which is nothing short of a shame. A lot of games are poorly optimized and you have to brute-force your way through, and specs matter when you get down to it, but the PC catalogue (especially if you include emulation) is just SO HUGE. There is something for absolutely everyone, no matter your specs or genre disposition. It is sad to see people spread this absolute nonsense that playing at full HD and up is NECESSARY for a good PC experience. Baldur's Gate, one of the best and most expansive RPGs of all time, disagrees.
EDIT: Just wanted to say that I get where you are coming from, I think. There are a lot of people out there who love graphics. And yeah, buying a PC because you love graphics and then playing on low on an integrated graphics card makes no sense. Personally, Skyrim only got fun for me because I turned it into the most beautiful game I have ever seen (and because I managed to merge a gazillion mods together in a way that makes the combat feel responsive and rewarding). But there is also a lot of love for gameplay out there. Some people want both all the time; some people value one over the other. I happen to value graphics a lot, but I will always place gameplay first. As long as the game is working and playable, I'm happy. I assume I'm not the only one; I'm not that special.
To me, all this comes down to the fact that, for development teams, optimization is simply not a priority on systems with upgradable hardware. It's an industry standard to assume the customer will be willing to upgrade.
It's not about being 'correct' - if that's the vibe you're getting from me, you've gotten what I'm saying completely wrong. It's about being fully featured. It's about pushing boundaries. It's about advancing AI. It's about having the absolute best netcode possible. Fundamentally, it's about experiencing a title in its entirety, rather than a cut-down, bastardized compromise. And the more people who run mediocre specs and consider that acceptable, the more limited future titles become (see the patch process for DA:I, which has degraded the title to make allowances for mid-range players who whined that their machines couldn't cope).
Do you understand where I'm coming from? Owning a PC is not about aiming for parity with the console release. And if there isn't a considerable disparity between the PC market and the console market, then there's no reason for lazy console manufacturers to up their game. Hence: the PC market is dying. Sure, there are titles and genres currently (and I stress, currently) less favourably represented on current-gen consoles. This is not going to remain the case for much longer.
I like the options, personally. I think PC would be a lot more viable if hardware manufacturers weren't so greedy. If they can obey the HAL in Windows, there's no reason a more universal, standard HAL cannot be followed instead. Should such a HAL exist, we could play games without the overhead of an OS and everything that comes with it. So many wasted cycles in the round robin, and so many alternative OSes pushed aside because of this.
...It bothers me that you don't seem to realize how absolutely insane this statement is. I'm a musician; I deal with this sort of mentality a lot. There are a lot of guitar players out there who limit their knowledge and practice to playing as fast as they humanly can, and they think that is good music. Who am I to say that playing fast is bad music (it is not), but it is definitely a very limited way of thinking. Some people don't want to compete, most don't really want to even argue, most are in it just to have some fun. If people "settling for less" is ruining your perception of anything so heavily, there is something absolutely wrong with your expectations of... I don't know, man. Life.
Heaven knows, I'd be on all your cases for even considering this game. When I picked up DoA at all, I was already compromising my expectations of a video game. I could go on for hours about how we've all settled for less with this game. And the state of software today is a whole new can of worms, one that's been festering since before I was even born.
Is it? That's not my experience. I wouldn't play Skyrim unmodded again, because I don't think the core experience is actually any good; but I have no issue going back to earlier PC titles or ports. Even with the machine I have now, I still sometimes play on the Mac Mini: Civilization, some LEGO games, the two or three indie games that I actually think are worth anything. I see no problem. I have not been spoiled for life because I dropped some money on a nice card. My favorite game of all time is still Super Metroid. And I simply can't honestly believe people haven't done better because (let's say) 4 out of 5 people who play on PC couldn't give a lesser fuck about specs if they tried. That doesn't even make any sense.
And look at Minecraft and, more importantly but less well known, Cave Story. These two ugly games have had major impacts on both PC and console gaming, and completely changed the economics of the industry.
And it is not about accepting parity. I'm skipping DOA5LR on PC because it is a shitty port. I have skipped many other games because of a lack of features or outright problems (bugs, glitches). Why get a PC? Because consoles (with the PS3 and 360, really; I just realized it late) have ceased to be relevant for me. The whole point of console gaming was the convenience: the plug and play, the cohesive experience. Now everything needs to be installed and patched, DLC galore, paid online. I had a fat PS3 that couldn't run Gran Turismo 6. I went online and found that the game only really works correctly on the newer super slim model, because of power consumption... thingies. I don't know if I believe that, but it was still true in my particular case. Why would I drop money on this? Consoles have done absolutely all they can to turn into shitty PCs, so I just got a decent PC. Why buy a console if none of the convenience is there? The only reason left is the exclusives, but with more and more titles coming to PC in one capacity or another, the list of actual exclusives is rather short. Personally, there is nothing all that appealing on the MS or Sony side. I like Nintendo games, but I absolutely hate their hardware.
I was a major Gran Turismo fan up until 4, but 5 was absolute garbage (I bought 6 in the hope that it would have improved and couldn't even play it, so there is that). I bought my PS3 basically to play Metal Gear Solid 4. I don't outright regret it, but I have no desire to buy another console after that experience, especially as I see no sign of the business models improving.
So it's safe to say that I have at least you to play with?
I think Brazil is a very peculiar case because the MSRP of the PS4 here is so insane (even though almost no one actually charges it). Personally, I find it very hard to be interested in the new consoles, and it has nothing to do with price, as I said above.
Not really, actually. I've been on and off PC. What I favor most about it is emulation and backwards compatibility; I really like to replay old classics. But there are certain PC titles I have never really touched, like System Shock 2 or Half-Life. A point off my street cred, I assume. I remember getting really into Baldur's Gate around the turn of the millennium, when a lot of people in my high-school class were playing it at the same time. One guy started playing and told everyone it was awesome, then we all got it, and "the next day" when we realized the five of us had completely different stories to tell about the same game, we were completely blown away.
The greatest thing about the PC scene is that you don't have to void your warranty to make your own content. And I know the big boys in the industry hate that, since we can still make money without their names, and we can take their customers, too.
You all need to upgrade your PC to Windows Doge so that the hidden stages and extra features may be revealed. Such speed, much process, yes. Wow Dogetel.
Completely irrelevant analogy. Technology is not a static industry, nor should it become one. Progression isn't subjective - it's tangible. You're either pushing the die and the software, or you're not.
If people thought like you, we'd all still be playing Pong.
Progression is subjective. Should chips be pushed for speed, power efficiency (which is a bigger deal for computer technology than you think), or capacity? The same can be said of software. Speed optimization normally comes down to small things. For example, "xor eax, eax" is 2 bytes and "and eax, 0" is 3 bytes, while "mov eax, 0" is both slower and 5 bytes. "xor rax, rax" (3 bytes) is a 64-bit instruction, but in 64-bit mode, 32-bit results are automatically zero-extended to 64 bits, so the 2-byte form suffices. Take a guess at the most common equivalent of this instruction you're going to see in TN's code:
mov rax, 0 (7 bytes)
Is that a common instruction? You'd better believe it. Add 1 byte for the ret, and you have the compiler's translation of the "return 0;" statement from C-family languages. It's fair to say it's going to happen a lot. Not that it eats up much on its own, but that's merely returning 0. How many functions are going to return other things? And that's just return statements. What do compilers usually do at the beginning of functions? (Here's a hint: they pull all the variables into registers, then stick them back almost immediately as if they had actually changed. This eats up space and cycles for every variable you see.) I tell people about these issues, including people who are smart enough to know what to do about it. You know the answer? "RAM is cheap, so they'll be willing to upgrade. And no one notices those speed drops anyway, because it's the 90% of code that runs 10% of the time." (The percentage is a reference to the optimization rule that only the 10% of the code that runs 90% of the time should be optimized.) Which is a lie on the latter part, because functions are always being called.
Another good one is when they align each function to its own 512-byte page, regardless of the size of the function (even if it's only about 15 bytes). Why not even try to crunch those down instead of wasting all that cache? (This is why Intel was winning in benchmarks against AMD: AMD worked on instruction speed, while Intel caught on that the biggest gains could be had from preventing the cache misses caused by these awful compilers. [You'll find that hardware innovation usually ends up being an attempt to make the machine run bad coding practices better {and then those who know the good rules get screwed over when their shortcuts don't translate well into microcode, since the good instructions are rarely used}.])
And don't get me started on the hell that is dynamic linking on Windows (I cry a little on the inside when people release a 1 MB program that does little to nothing because they're statically linking libraries that are included with Windows). Or the fact that Windows doesn't really have a sharable library directory for non-MS DLLs that are common enough to be shared between programs. Why do some Steam games ship their own steam_api.dll instead of Steam having one that they all share? zlib1.dll should be pretty standard by now. And games are still installing DirectX redist DLLs into private directories? And why do full installer files still exist instead of quick remove programs?
I know I'm nit-picking, but these things actually add up in the long run. And to talk about wasting technology, I'm pretty sure it works the same way for console games. These kinds of practices are where companies say that console X can't handle something, only to be shut up by another game that actually pushes the limits. The 3DS is a great example. I know some effort went into the graphics code for Revelations (every so often you can catch a glimpse of an empty frame buffer being swapped in due to a bug [usually during area loads]). And Tekken, while not pushing boundaries, doesn't drop framerate going from 2D to 3D (I'm told it's 60 FPS, but without something to compare it to I can't tell). Dimensions? Yeah... And Dimensions isn't even that heavy on the graphics.
EDIT: And if your game runs well as 32-bit, why are you shipping 64-bit code? Worse yet, why include libraries for both when the 32-bit versions also run on 64-bit platforms? Just use the 32-bit build, since its instructions inherently take less space (SSE optimizations I can understand, but we don't need an extra 10 libs when the SSE code probably isn't going to be in them).
Any libraries, really. There should be a system like the one Linux (and presumably Mac) uses. Though programs on those systems are equally guilty: if you're using a library, try to share it with another program, especially if you release multiple programs. Sharing code is kind of the point of libraries. Dynamic linking was invented so that frequently used code could be shared without being baked into every single program.
You can't forget Halo: The Master Chief Collection, which must have had the worst launch in history. The game is still fucked up now, four months and somewhere around 20 GB of patch data later.
Once patches reach a certain size in GB, they should just say, "We're pulling all copies and pressing new ones with the patch already applied, so you don't need to dedicate an inordinate amount of hard drive space to make the game work like it should have on day one. Anyone who already bought the disc can send theirs in for a new copy." Assassin's Creed Unity (a 17 GB patch to fix things, IIRC) and Borderlands: The Handsome Collection (12 GB to get The Pre-Sequel to work, 3 GB to get Borderlands 2 to work; why not two discs?) also come to mind.