Wondering how much lowering the resolution further would help the other tests, I dialed them all down to 1680×1050 and reran the tests at the same settings. HAWX 2 was still the only one to get above 30 fps (61 fps), though Batman: Arkham City and Lost Planet 2 came extremely close (29 fps and 29.1 fps, respectively). Things improved considerably at 1440×900, with only Heaven and Metro 2033 still holding out — which they also did at even the lowest resolution I tested, 1280×800.
But I wasn’t satisfied with the video quality at any of these resolutions; lots of things looked just a little too blobby for my tastes. So on every game that passed my first test, I upped the details just a tad to see whether it could hack a slightly more demanding challenge. HAWX 2 again flew at all four resolutions, Lost Planet 2 was fine up through 1440×900, and that was all. Nudging the settings up still further, HAWX 2 hit 33 fps and Lost Planet 2 made it to 35.2 fps at 1280×800, but neither could surpass 30 fps even at 1440×900, so that’s where I stopped.
Even with the minor detail improvements I implemented toward the end of the testing, you still have to forsake a fair amount in terms of video quality just to get these games to play — and then generally at frame rates most legitimate gamers would barely consider good. Don’t get me wrong: As far as I’m concerned, Ivy Bridge marks a significant leap forward for Intel, and definitely suggests some good things to come on the graphics side. But I can’t quite get myself to the point where I feel comfortable pretending that this is a major victory.
The main thing Intel has proven — or, if you prefer, proven again — with Ivy Bridge is that, if you want to play 3D games that both perform well and look good, even at lower (I would argue too low) resolutions, you absolutely need a discrete video card. There’s no way around this. It may not be news to most (okay, any) system builders, but the average consumer who buys an Ivy Bridge system and is attracted by the possibility of not having to shell out another three to five wallet-size portraits of Andrew Jackson just to get the newest titles to play is going to be disappointed.
Ultimately, the conclusion to draw here is exactly the one Loyd did: Intel has functionally obliterated any compelling reason to buy a $60 video card. I’ve been a bit skeptical of the need for them for years, to be honest, as I’ve generally questioned whether the gains you might see in tasks like video transcoding were worth the money if robust gaming remained elusive. But if Intel and AMD chips can handle all the basic stuff themselves and do it well — which they now do — the GPU guys will either need to devise a much better argument for the entry-level models, drop the prices across the board to edge the $100 cards (which I’ve found in most cases to be worth the money) closer to or even onto the lowest pricing tier, or give up on trying to court that segment at all. And once Ivy Bridge systems are out in force, AMD and Nvidia will need to make that decision sooner rather than later.
So Intel definitely deserves to be congratulated on a solid release that will change our outlook on processing and integrated graphics for at least the next year or so. Just don’t assume that Ivy Bridge’s strides translate into a revolutionary rethink of the graphics market for everyone. Those for whom gaming is, at most, a sometime thing will unquestionably notice some benefits. But everyone else should stick with at least a $100 standalone card (and preferably at least a $200 one if they can afford it) to ensure the games’ performance and appearance make them worth playing.