Have you tried Adaptive AA on a 5850? Some textures are partially rendered (e.g. the stone paving on Whiterun roads) or weird (e.g. the roof of the Temple of Kynareth). I'm going to try 12.10 to see whether the problem is fixed.
Note that I have seen a blog post from a Japanese user experiencing the same shadow issue mentioned in the article here, but only when the shadow map resolution is 4096.
I don't think HD6000 or even 5000 series would benefit since the issue appears to have stemmed from some shader inefficiencies specifically related to GCN architecture. VLIW-4/5 have been well optimized by AMD as HD5800 traces back to September of 2009. Unfortunately if you want more performance for BF3, you'll need to upgrade to NV/AMD or next gen. You can probably play BF3 well without MSAA on HD6900 series.
Recall when the 7000 series amd was released, this site proclaimed the virtues of amd "getting rid of the 69xx" series in the channels, praising a job well done. Instead of hearing amd competed with itself, like we hear when nVidia releases any card, we heard how wonderfully amd did getting rid of that "old" 69xx series. So now, after they got "rid of it", they can forget it. That's amd's driver policy. :-) Ain't it grand!
Ryan, while discussing image quality in Skyrim, you wrote:
AMD has told us that they haven’t seen this issue in-house (it would admittedly be hard to miss [emphasis added]) so this may be some esoteric issue;
Did you mean to write "hard to see"? "Hard to miss" seems to be the opposite of the intent of your sentence, which appears to me to be trying to say that the rendering error you saw on the 7970 was obscure and difficult to spot.
I believe Ryan's wording of the matter translates to: If AMD have not seen the issue in-house, then the rather obvious rendering error Ryan noted is likely attributable to some other hardware or software discrepancy in his test machine.
But in all seriousness, hopefully this prompts Nvidia to "step up their game" and counter with more game promos. Maybe a bit optimistic, but both Borderlands 2 and Assassin's Creed 3 for all their GTX cards would be amazing.
Agree. Too bad these companies won't reward people who've recently purchased their products as well as new buyers, but I can understand why they aren't.
Good post chizow. More competitive AMD offerings from price/performance and value perspectives should prompt NV to offer more value through price cuts, its own driver improvements or future game bundles. Win-win for all gamers.
nVidia already forced the price-scalping, greedy, corporate-pig-monster, failing-bottom-line amd to cut their lying scalper prices 4 times, and they still don't have the sales.
Thank nVidia, and don't beg for more. nVidia lowers prices, amd can close the doors.
I think this is great. This is the kind of scrappy competition they should have been doing for years now. Problem is, I think it's more of a reaction to all the bad press AMD's been getting (and the uncertainty we all feel about their overall future) than competition against nVidia.
And I don't think games plus honed drivers really address the bigger issue: I wonder if AMD will even be around making video drivers two years from now.
Unless something changes they have design wins for all 3 of the upcoming consoles for graphics, and in some cases CPUs as well. Hopefully that means they'll still be in business one way or another. But I hear you: all the bad press about AMD, huge earnings misses, mass layoffs... it worries me about their future. Even if they get bought, nvidia didn't keep making 3dfx drivers once they bought them... so we just hope AMD gets their house in order fast.
I just want the hundreds that attacked me when I was blowing the whistle for the past number of years to say " Thank you Cerise, you were right and we lied and attacked you in our fanboy ignorance and blind drooling amd backside kissing".
They couldn't offer this level of competition since ATI buyout. 2900 series development and targets suffered due to the merger transition (this was admitted publicly). The whole point of the small die strategy with HD4000-6000 series was to save costs on developing large die and achieve higher yields, hoping to gain market share through larger volume sales by targeting price/performance. That didn't work either. With HD7000 series they finally tried something totally different, the first mover advantage strategy, by launching first and dictating high prices while they had a small window of more advanced technology. Then promptly reverted to the old price/performance strategy once NV launched its cards at lower prices.
The execution was poor though, as up to Cats 12.6 the drivers weren't great and NV won the key BF3 performance. At the same time I don't think NV went for the performance crown this generation, as Kepler gen 1 feels like an IVB money-making part from them. I think the GTX 780, if based on GK110, could easily retake the performance crown. Will NV do it, or continue offering just enough performance to stay near the top but not necessarily the best, like they had with the 8800GTX/GTX280/285/480/580 series?
Imagine how much more competitive AMD's GPU division could have been if its CPU division wasn't losing $ hand over fist every year.
They couldn't offer this level of competition since the billions-overpaid ATI buyout.
2900 series development and targets sucked initially, but the merger transition was the public excuse for the failure.
The whole point of the small die strategy with HD4000-6000 series was to go cheap and achieve higher fanboy yields, hoping to gain a hatred advantage through whining about nVidia big dies and power usage, targeting idiots that screamed housefire. That didn't work either, but the fanboys pretended it did, and they took their Verdetrol pills to keep the illusion going.
With HD7000 series they tried to COPY nVidia, sick of being hammered to a long behind second place by the GTX580.
The first bowel movement came at a high price with no drivers. While they had a small window, more advanced technology from nVidia was at the ready. Then AMD promptly reverted their overblown corporate rape artist prices downward, again, and again, and again, and again, now again. Price/performance strategy was a JOKE once NV launched its much superior cards.
The execution was poor though, as up to Cats 12.6 and beyond the drivers sucked as usual, with so many errors and problems no one dared track them all.
The great NV won the key BF3 performance and is still winning it, despite the fanboy characterization in the article above.
At the same time I don't think, as the RussianSensation, I've liked nVidia at all, and that won't change ever; that's why I hate the feeling I get when deep down inside I know nVidia is making big money from Kepler. It burns me up inside, I'm a raging fuming amd fanboy.
I think GTX780, if based on GK110, could easily retake the performance crown from itself, since nVidia holds it as well right now, something I cannot admit, my rage is so complete on the dark side.
Will NV do it or continue offering just enough performance to stay near the top but not necessarily the best, like they had with the 8800GTX/GTX280/285/480/580 series? (Did I just say that? I have cut down all those cards as losers for years from nVidia, what is happening to me?!)
And finally, this is why AMD is the very best and at the top of its game and the world: Imagine how much more competitive AMD's GPU division could have been if its CPU division wasn't losing $ hand over fist every year.
I could never admit they were losing money; just last week I ranted and raved about their massive gains from the card sales in that market share release from Jon Peddie. I figure if I'm in a deep depression once a year and I actually let the truth slip out in my personal despair, my amd handlers will understand....
Do you have a real purpose? Nvidia obviously loves your useless blabbing, and I wouldn't be surprised if you're in on it with them. You even know this, and yet continue your "nVidia is god" blabbering.
How can anyone say this is anything other than a GOOD thing? The people in charge, the ones with their fingers on the big red and big green "Release!" buttons, care about profits and market performance, which come primarily from BUSINESS-SECTOR sales. I swear, the employees of these companies must be laughing their behinds off at the baseless vitriol spewed all over the internet by the ignorant "fanboys" (and fangirls; no sexism here!), while wondering why in the world anyone would get so worked up about a company that they personally have ABSOLUTELY NOTHING TO DO WITH! I don't care if you support AMD or if you back nVidia or if you are a die-hard Matrox loyalist or what, the result is the same: you are spending your (likely limited) brain cells defending something that, in reality, is nothing more than a concept, a group of people who collaborate to make themselves rich!
The GPU market is no different than the CPU market, although I must say I have seen the fanboyism die down much more in the latter, especially over the past year or two, while the former just gets worse! It is truly mind-boggling. We have three companies, really: Intel, AMD, nVidia. AMD competes against both, although to be fair, they do a pretty darn good job when you look at it that way: one company competing against two in two different sectors, each of whom has significantly (as in a LOT) more money/resources available, yet the one company is still remaining competitive to a degree, and certainly isn't bankrupt! I tip my hat to AMD for that, as they are the bravest of the three in many ways (the first and in my mind only REAL "FX" processors, which introduced 64-bit CPUs to mainstream enthusiasts while at the same time outperforming Intel's 55% higher-clocked P4 Emergency Edition; their risky acquisition of ATi; their original implementation of the HyperTransport link, doing away with much potential bottlenecking; making dual-channel memory available and worthwhile; and of course, they have always been the ones to take the giant risk of designing a from-the-ground-up-new architecture and have, historically, always been about 3-4 years ahead of the curve). However, sometimes the risks don't pay off, and while we may look back in 5 years and see BD/PD as the beginning of a new era, or say the same of the A8-xxxx/A10-xxxx APUs, the fact is that RIGHT NOW they are nowhere near the levels of performance offered by the premium Intel chips. They are doing quite well with their GPUs, although I fear that their entire GCN architecture was laid out too soon, and that they don't have the ace in the sleeve they might need should nVidia decide to go all-out with the 7xx series of GPUs.
And perhaps that is what nVidia has been counting on: they have GK100, which is a terrible design for gaming but could be "stuffed" onto a gaming platform card just as with Fermi should the need arise, but unless AMD has some giant surprise in store, they can continue to make money out of an architecture that has already been paid for.
Anyway, I really, really, really don't think that the 7xx series, or rather the 780, will utilize the GK100 core as we know it. I have used the Tesla K20 briefly, and have handled it, and there is NO WAY that chip can be made to fit into a "reasonable" power envelope, never mind the immense amount of cooling required! AMD made some very smart moves this round, while I do think nVidia was either unable to produce the card they REALLY wanted or perhaps just lazy after the unrelenting beat-down they gave AMD in the 5xx vs 6xxx series "battle". AMD handled things much better than they did with the 6970, which could never compete with the 580 even when it had a frame-buffer size advantage; they learned from the mistakes they made, and (unless I state otherwise, I will be referring only to the x970/x950 and x80/x70 flagship cards from each maker) they have become absolutely competitive with Team Green. Some examples of their forward progress:
- AMD gave their cards a 384-bit GDDR5 bus and 3GB of "the good stuff", while nVidia oddly took a step backward and settled for a 256-bit memory bus
- AMD was first to market by a not-insignificant amount of time, which put pressure on their competition, who were (in all likelihood) having yield issues to begin with
- AMD has taken a very "holistic" approach to product development and integration, with their APUs being actually very advanced compared to anything else similar (oh wait... there is nothing else similar)
- The idea of Hybrid CrossFire-X is ingenious, and while the implementation may need some work, it may become a savior to all those trying to build a "respectable" gaming machine on a budget, especially if they are able to make a more well-rounded package with the high end offering on-die GPUs equivalent to the ~7870/7950. Then a single $600 purchase (idk, but a hex-core 4.5GHz CPU with 12MB L3 and, say, an 8870/8950 GPU with its own on-package frame buffer would probably be not-cheap) would cover both CPU and GPU with only a motherboard and RAM left to purchase; of course, when the individual decides that he/she needs more horsepower, a $250-350 PCIe x16 8950 3GB (4GB? 6GB?) would actually be a SECOND GPU despite there being just one card
- The theoretical implementation of Hybrid CF-X given above would allow for significant improvements in PCIe bandwidth use for those wanting the absolute largest amount of GPU power they can get...
- Or it could free up a huge number of said lanes for other devices: 12-18 SATA/SAS 6Gbps ports, 20+ native USB 3.0 ports, extreme-quality onboard audio supporting 13.4ch surround, a native x16 PCIe hardware RAID chip allowing RAID 0/1/5/6/10/50/60 from the board with zero CPU overhead and massive throughput, integrated solid-state NAND flash and/or low-capacity (~32-64GB), low-cost ($20/ea) micro-SSDs that can be teamed by an on-board quad-core/hex-core SSD controller so that the more you add the faster it gets (similar to RAID but based on the way an SSD works) while capacity rises as well, significant reductions in overhead for LAN or other external connections, or any few of a thousand other things....
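The 384-bit vs 256-bit point above can be made concrete with a quick calculation: peak GDDR5 bandwidth is just the bus width in bytes times the effective data rate. A minimal sketch, using the stock launch figures (HD 7970: 5.5 Gbps effective; GTX 680: 6.0 Gbps effective):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bytes) * effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# HD 7970: 384-bit bus at 5.5 Gbps effective GDDR5
print(peak_bandwidth_gb_s(384, 5.5))  # 264.0 GB/s
# GTX 680: 256-bit bus at 6.0 Gbps effective GDDR5
print(peak_bandwidth_gb_s(256, 6.0))  # 192.0 GB/s
```

So the wider bus buys the 7970 roughly 37% more raw memory bandwidth, part of why it holds up at high resolutions.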
I see the next true "revolution" in GPU tech coming when ALL cards are multi-GPU cards communicating over a practically bandwidth-free pathway with one another, and able to utilize the entirety of the onboard memory in a shared manner. The only other way I can really see BIG improvements coming, with 2560x1440/1600 becoming increasingly common (not to mention 5760x1080/1200 or larger multi-display setups), is to overhaul the way CrossFire-X (and SLI) is performed; specifically, to allow the VRAM to "stack" so that 3-way CFX/SLI with 3x 6GB cards doesn't equal 6GB of buffer but rather 18GB. As we head toward displays beyond 3 megapixels, a "stacking" frame buffer COULD become vastly advantageous. Or perhaps the way to do this is to prevent GPUs from accessing the HDD directly, and instead stick a large number of moderate-size SLC NAND modules and a controller on the GPU's PCB itself, allowing textures to be loaded first into a 16-way SLC NAND "texture pool" via system memory; the GPU then simply sends a small request signal to the NAND controller for what it needs, at which point it is sent to the GDDR5/GDDR6 frame buffer after processing, via an extremely short and thus extremely high-bandwidth parallel AND serial pathway. Remove the weakest link, allow the GPU and CPU to communicate much more freely yet with much reduced overhead, and wonderful things may happen.
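The "stacking" idea above boils down to a simple accounting difference; here is a tiny sketch contrasting today's mirrored AFR buffers with the proposed pooled scheme (the function name is illustrative, not any real API):

```python
def usable_vram_gb(num_gpus: int, gb_per_card: int, pooled: bool = False) -> int:
    """With mirrored AFR, each GPU keeps a full copy of every texture,
    so usable VRAM never exceeds one card's. A pooled ('stacked') buffer
    would instead sum capacity across cards."""
    return num_gpus * gb_per_card if pooled else gb_per_card

print(usable_vram_gb(3, 6))               # 6  (today's CFX/SLI)
print(usable_vram_gb(3, 6, pooled=True))  # 18 (the proposed stacking)
```

The catch, of course, is that pooling requires a low-latency, high-bandwidth link between the GPUs, which is exactly the "practically bandwidth-free pathway" wished for above.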
HOWEVER, I am no hardware/software engineer, so maybe I am just a crazy person. Well, okay, I AM a crazy person, but even a blind squirrel finds a nut ;)
For the record, I am running a 3930K and GTX 670 FTW in my newest, biggest build. My first self-built PC, which I still have and which is being completely "polished" and brought back to full strength, consists of an AMD FX-51 (even rarer due to it being one of the first 1,000 chips ever produced, period) and an ATI X800XT-PE. Both systems are/were absolute top-of-the-line for their time, and both overclock abnormally well (easily hit 5.1GHz on the 3930K and 1400 core/7800 mem on the 670, both using custom water; the FX-51 was 24/7/365 stable for 4 years at 2.45GHz with a circa-2003 air cooler, and the X800XT-PE was at 604/1292 with the stock cooler and the simple addition of a PCI slot blower). I buy whatever gives me the best price:performance to price:longevity total ratio, and this time I felt Intel and Nvidia simply couldn't be beaten. I know that there is not a single TRADITIONALLY COOLED system made of AMD parts, with equivalent numbers of components (i.e. 1x/ea CPU/GPU), that can beat mine in anything, and I am happy, yet I am also sad. I am sad because WE NEED "neck and neck" competition, or prices will be controlled not by the market but by the manufacturer, and innovation WILL stagnate.
Let's all of us get a case of Red-Green colorblindness for a while and do what we can to help ensure that ALL OF US win in the end, what do you say?
I read about the first paragraph, then I scrolled down, and saw the gigantic, obviously totally hypocritical bs. You cared so much you nearly wrote a freaking book. Well, read it yourself: you're more obsessed than anyone else here, obviously. What a doof.
Does anyone else consider that NEXT is not only used by AMD, as in AMD NEXT, but evidently is also in the name of Microsoft's XBOX NEXT? Apparently XBOX 720 is out and XBOX NEXT is in. Is it because AMD NEXT is INSIDE?
65 Comments
Vini - Monday, October 22, 2012 - link
BF3 was very playable even before on the 7000s on high detail, but this really is a good psychological victory for eradicating the deficit as far as the average buyer is concerned. Well done to AMD's driver dept!
CeriseCogburn - Monday, October 22, 2012 - link
It's behind in the chart they show, it only works well on certain maps, and is otherwise "not there" then. So, behind in the test they show, only 10% or less on some maps, thus even further behind, but according to the article "it has eradicated its performance deficit or even went ahead".
ROFL
Further, only a fool would believe there's no eye candy drop, as Ryan for well over a year claimed there was none with the infamous perfect-circle, no-visible-game-advantage driver he fell in love with... until he finally had to admit it. So he cannot see any eye candy problems with amd until after they release their next card series; then suddenly he will finally tell you about all the amd faults he had been hiding for the past year.
That's how fan boys work, unconsciously.
So what we have here is what AMD is over 1 FULL YEAR late on.
That's what we have, release drivers finally here after being 1 full year late.
Even so, a full year of recommending amd cards above all others hasn't changed a single bit. LOL
This site is so much fun.
Remon - Monday, October 22, 2012 - link
Wait, if these should have been AMD's release drivers, and the 7970, although released 2 months before the 680, beats the 680, which had 7 months of driver upgrades, are you actually trying to say that S.I. is kicking GK1xx's butt?
CeriseCogburn - Monday, October 22, 2012 - link
Nice try epic fail - his own frikkin chart shows the amd 7970 behind... That's why I find it so strange the wording surrounding the amd failure, as if it won.
So I guess we have to go to gigahertz edition, which was already proven slower anyway. About 1200 core is where the 7970 ties the 680, but then again OC the 680.
So I just find it a bit hard to take, that's all. It didn't do it. It only did great on some maps... it's shader hacks, but "no detectable" difference.. LOL heard that for a year before the added 10% IQ cheat by amd, and the wish here "that nVidia doesn't do the same thing" - which they never did.
Whatever.... it's just so sad. Might as well become a politician.
StevoLincolnite - Tuesday, October 23, 2012 - link
nVidia have used driver cheats in the past; stating otherwise is plain false. It was big in the days of the GeForce FX, as nVidia had woeful Shader Model 2.0 performance on those cards.
Also, just because they haven't been caught doesn't mean they still don't do it; these companies are incredibly competitive, and you would be surprised what they will do to get a small edge and get gamers to hand over their wallets. Defending either is simply ludicrous. They won't send you flowers and cake for purchasing their card; they care about money, plain and simple, like all good businesses do.
CeriseCogburn - Thursday, October 25, 2012 - link
Didn't state otherwise; you lied about it by removing context and going prior to my statement, then you said nVidia does it even if they don't get caught. LOL
Another fanboy.
Another one, big surprise.
lambchowder - Saturday, October 27, 2012 - link
You're the fanboy: lashing out at the author with personal attacks, very obvious threat-driven posturing, shorthand laughter meant to cover up some concern or insecurity (the oldest trick in the book), restating arguments and providing no rebuttal, saying "epic fail." That is an epic fail.
Spunjji - Monday, October 22, 2012 - link
This comments section would be so much more bearable without you in it.
formulav8 - Wednesday, October 24, 2012 - link
This is the first time I've noticed him and can already see what you mean. Pretty sad.
junky77 - Monday, October 22, 2012 - link
Will the 12.11 release include a mobile driver update too?
Ryan Smith - Monday, October 22, 2012 - link
12.11 will be for all AMD products, desktop and mobile.JarredWalton - Monday, October 22, 2012 - link
I'll add a few short comments on a second Pipeline piece in a bit....
xdeadzx - Monday, October 22, 2012 - link
Being that we now have much greater performance on AMD cards, are you going to be updating the chart on your graphics card benchmarks? I love the tool a lot, and would like current, relevant data when comparing cards quickly. Also, I'm really glad AMD stepped up the Frostbite 2 game... Literally can't wait for the sales to start on the bundles, I'm looking forward to picking up a 7950. :D
Ryan Smith - Monday, October 22, 2012 - link
Bench will be updated. Though it may take some time to get all GCN cards updated since this overlaps with the insanity that is the Windows 8 launch week.
StevoLincolnite - Tuesday, October 23, 2012 - link
So are they updated yet? :P
CeriseCogburn - Monday, October 22, 2012 - link
Of course, but they will use the old nVidia scores on old systems; there's no chance in hades the amd won't be updated. Those meals they get wined and dined on and that yearly vacation are much too cool.
Spunjji - Monday, October 22, 2012 - link
Blah blah blah whine gripe whine blah.
Articuno - Monday, October 22, 2012 - link
Because I bought a 670 FTW mostly to play BF3 and by now it's too late to get a refund. At least I still have PhysX.
CeriseCogburn - Monday, October 22, 2012 - link
Well, trade it over; you had a long time kicking the amd backside, now you can do so again, right? LOL
I feel bad for you too, having the fastest card that doesn't crash all the time and reduce eye candy on the screen.
Kodongo - Monday, October 22, 2012 - link
I was wondering why you have shown the marquee cards from both the 7700 and 7800 lineups whilst not showing the flagship AMD card, the 7970 GHz Edition. On other sites (TechPowerUp) which did show this card, the new drivers are enough to propel it above the 680.
Great synopsis, if a little incomplete due to the GE omission.
Ryan Smith - Monday, October 22, 2012 - link
We only had enough time to run 3 cards. So I picked one of each GPU, going with the 7970 (versus the GE) so that I could trace the performance of the 7970/GCN back to its launch.
Kodongo - Monday, October 22, 2012 - link
Fair enough. I just wanted to ask why it wasn't there as these drivers elevate the 7970GE clear of the 680 now. Thanks for answering and keep up the good work!
Meaker10 - Monday, October 22, 2012 - link
WHERE THE HELL IS THE ENDURO HOTFIX DRIVER WE WERE PROMISED...
HOODY - Monday, October 22, 2012 - link
LOL I'm with U homes!!! I read the article and see nothing mentioned again about the 76xxM cards, ie mobile, ie switchable, ie "enduro".
And never any mention that AMD "provides" the intel driver part, check your device manager properties for the intel graphics part.
I'll still hold out hope, just till their "final" comes out.
JarredWalton - Monday, October 22, 2012 - link
This is it, sort of... more on that in a bit.
CeriseCogburn - Monday, October 22, 2012 - link
There ya go bro, this is it "sort of" from the site master tech article man. LOL - amd = sort of
No kidding? No kidding, it's serial.
Tegeril - Tuesday, October 23, 2012 - link
Ugh get over yourself.
Spunjji - Tuesday, October 23, 2012 - link
+1
CeriseCogburn - Thursday, October 25, 2012 - link
When the amd fanboys don't like the message, attack the messenger.CeriseCogburn - Thursday, October 25, 2012 - link
It's not about me losers, but when the message cannot be attacked, the amd fanboy attacks the messenger.
k2_8191 - Monday, October 22, 2012 - link
Looks like they are reluctant to make Skyrim on RADEON stable...Issues including missing or weird textures on Adaptive AA and poor CFX performance(CPU utilization is under 100% out of 200% using 2 GPUs) was reported in forums long time ago, but they're not fixed until now.
Also, there were strange problems such as misrendered book text and missing HUD.
And they've missed the 7970 shadow issue... did they test the new Catalyst before releasing it?
Tired of them, I'm not going to buy a next-generation RADEON for my upgrade unless they improve the situation...
k2_8191 - Monday, October 22, 2012 - link
Typo: CPU utilization -> GPU utilization. Oh, I want to edit my post!
RussianSensation - Monday, October 22, 2012 - link
You shouldn't expect 100% GPU utilization in CF in any game, since CF and SLI have inherent bottlenecks. Also, Skyrim itself is very CPU limited without heavy use of mods. If you use mods, you end up with way faster performance on the AMD side, by around 20% with older drivers: http://www.computerbase.de/artikel/grafikkarten/20...
The missing HUD has long been fixed. NV had texture filtering issues in BF3 with GTX650Ti launch as noted by Tom's Hardware. They also took months to fix Shogun 2 performance. Both companies have driver issues and continue to work on them. Have you tried using SSAA in Skyrim on your 7970s?
k2_8191 - Monday, October 22, 2012 - link
Thanks for the information. Mmm, could a more powerful CPU help improve GPU utilization?
Oh, I didn't know about the issues on GeForce other than Skyrim...
Perhaps I should reconsider my decision for those games.
I don't have a 7970 due to budget limitations, so I can't try. Sorry for that...
CeriseCogburn - Monday, October 22, 2012 - link
LOL - watch it, the RS is a raging fanboy of amd. Notice the words "on launch" and "by Tom's", which has a gaggle of radeon fans who downvote to a minus 20 every comment that isn't super dooper amd love on every single videocard article.
It's beyond pathetic.
nVidia is superior and amd is cutting 15% more of their workforce; expect their drivers to suck even worse starting Q4.
What AMD did here was hammer the CRAP out of their driver team to come up with something before they get fired. Kind of like training your replacements.
quasi_accurate - Monday, October 22, 2012 - link
That's odd... I haven't had any issues on a 5870 and a 7970 in Skyrim.
k2_8191 - Monday, October 22, 2012 - link
Have you tried Adaptive AA on 5850? Some textures are partially rendered (e.g. stone paving on Whiterun roads) or weird (e.g. roof of Temple of Kynareth).
I'm going to try 12.10 to see whether the problem is fixed.
Note that I have seen a blog post by a Japanese user experiencing the same shadow issue mentioned in the article here, but only when the shadow map resolution is 4096.
k2_8191 - Monday, October 22, 2012 - link
Oh, another typo: 5850 -> 5870. I tried 12.10 just now, but it didn't help :(
Here are the screenshots below.
http://i1249.photobucket.com/albums/hh515/k2_8191/...
http://i1249.photobucket.com/albums/hh515/k2_8191/...
This is reproducible only when Adaptive AA is turned on, so I set AA mode to MSAA.
I'll report the issue to AMD again...
Deaks2 - Monday, October 22, 2012 - link
Considering the major boost in BF3 performance, has the new driver also fixed the 6xxx cards' performance deficiency in Ultra mode?
Ryan Smith - Monday, October 22, 2012 - link
We haven't tested any 6000 series cards. Other sites have and have found no change in performance.
Deaks2 - Monday, October 22, 2012 - link
Boourns, you're right: http://www.techpowerup.com/reviews/AMD/Catalyst_12...
RussianSensation - Monday, October 22, 2012 - link
I don't think the HD6000 or even 5000 series would benefit, since the issue appears to have stemmed from shader inefficiencies specific to the GCN architecture. VLIW-4/5 have been well optimized by AMD, as the HD5800 traces back to September 2009. Unfortunately, if you want more performance in BF3, you'll need to upgrade to current or next-gen NV/AMD. You can probably play BF3 well without MSAA on the HD6900 series.
CeriseCogburn - Friday, October 26, 2012 - link
Recall when the 7000 series amd was released, this site proclaimed the virtues of amd "getting rid of the 69xx" series in the channels, praising a job well done. Instead of hearing amd competed with itself, like we hear when nVidia releases any card, we heard how amd did so wonderful getting rid of that "old" series, the 69xx.
So now, after they got "rid of it", they can forget it.
That's amd's driver policy.
:-)
Ain't it grand !
ddarko - Monday, October 22, 2012 - link
Ryan, while discussing image quality in Skyrim, you wrote:
Did you mean to write "hard to see"? "Hard to miss" seems to be the opposite of the intent of your sentence, which appears to me to be trying to say that the rendering error you saw on the 7970 was obscure and difficult to spot.
arthur449 - Monday, October 22, 2012 - link
I believe Ryan's wording of the matter translates to: if AMD have not seen the issue in-house, then the rather obvious rendering error Ryan noted is likely attributable to some other hardware or software discrepancy in his test machine.
CeriseCogburn - Monday, October 22, 2012 - link
Which of course is what AMD is famous for.
chizow - Monday, October 22, 2012 - link
Looks like a nice parting gift from AMD. :) But in all seriousness, hopefully this prompts Nvidia to step up their game and counter with more game promos. It may be a bit optimistic, but both Borderlands 2 and Assassin's Creed 3 for all their GTX cards would be amazing.
HisDivineOrder - Monday, October 22, 2012 - link
Agree. Too bad these companies won't reward people who've recently purchased their products as well as new buyers, but I can understand why they aren't.
CeriseCogburn - Monday, October 22, 2012 - link
It's called not going bankrupt like amd is.
I guess your imagination is like, not working at all.
RussianSensation - Monday, October 22, 2012 - link
Good post chizow. More competitive AMD offerings from price/performance and value perspectives should prompt NV to offer more value through price cuts, its own driver improvements or future game bundles. Win-win for all gamers.
CeriseCogburn - Tuesday, October 23, 2012 - link
nVidia already forced the price-scalping, greedy, corporate-pig, failing-bottom-line amd to cut their lying scalper prices 4 times, and they still don't have the sales.
Thank nVidia, and don't beg for more. If nVidia lowers prices any further, amd can close the doors.
HisDivineOrder - Monday, October 22, 2012 - link
I think this is great. This is the kind of scrappy competition they should have been doing for years now. Problem is, I think it's more of a reaction to all the bad press AMD's been getting (and the uncertainty we all feel about their overall future) than competition against nVidia. And I don't think games plus honed drivers really address the bigger issue: I wonder if AMD will even be around making video drivers two years from now.
andrewaggb - Monday, October 22, 2012 - link
Unless something changes they have design wins for all 3 of the upcoming consoles for graphics, and in some cases CPUs as well. Hopefully that means they'll still be in business one way or another. But I hear you, all the bad press about AMD, huge earnings misses, mass layoffs... it worries me about their future. Even if they get bought, nvidia didn't keep making 3dfx drivers once they bought them... so we just have to hope AMD gets their house in order fast.
CeriseCogburn - Monday, October 22, 2012 - link
I just want the hundreds that attacked me when I was blowing the whistle for the past number of years to say " Thank you Cerise, you were right and we lied and attacked you in our fanboy ignorance and blind drooling amd backside kissing".
Not like that will happen, not even once.
Spunjji - Tuesday, October 23, 2012 - link
Nope, it won't! :D So kindly FOAD.
CeriseCogburn - Thursday, October 25, 2012 - link
Another great amd fanboy, kind as usual.
RussianSensation - Monday, October 22, 2012 - link
They couldn't offer this level of competition since the ATI buyout. 2900 series development and targets suffered due to the merger transition (this was admitted publicly). The whole point of the small die strategy with the HD4000-6000 series was to save costs on developing a large die and achieve higher yields, hoping to gain market share through larger volume sales by targeting price/performance. That didn't work either. With the HD7000 series they finally tried something totally different, a first-mover advantage strategy: launching first and dictating high prices while they had a small window of more advanced technology. Then they promptly reverted to the old price/performance strategy once NV launched its cards at lower prices. The execution was poor, though, as up to Cats 12.6 the drivers weren't great and NV won the key BF3 performance battle. At the same time I don't think NV went for the performance crown this generation, as Kepler gen 1 feels like an IVB money-making part from them. I think the GTX780, if based on GK110, could easily retake the performance crown. Will NV do it, or continue offering just enough performance to stay near the top but not necessarily the best, like they had with the 8800GTX/GTX280/285/480/580 series?
Imagine how much more competitive AMD's GPU division could have been if its CPU division wasn't losing $ hand over fist every year.
CeriseCogburn - Monday, October 22, 2012 - link
They couldn't offer this level of competition since the billions-overpaid ATI buyout. 2900 series development and targets sucked initially, but the merger transition was the public excuse for the failure.
The whole point of the small die strategy with HD4000-6000 series was to go cheap and achieve higher fanboy yields, hoping to gain a hatred advantage through whining about nVidia big dies and power usage, targeting idiots that screamed housefire. That didn't work either, but the fanboys pretended it did, and they took their Verdetrol pills to keep the illusion going.
With HD7000 series they tried to COPY nVidia, sick of being hammered to a long behind second place by the GTX580.
The first bowel movement came at a high price with no drivers.
While they had a small window, more advanced technology from nVidia was at the ready.
Then AMD promptly reverted their overblown corporate rape artist prices downward, again, and again, and again, and again, now again.
Price/performance strategy was a JOKE once NV launched its much superior cards.
The execution was poor though, as up to Cats 12.6 and beyond, the drivers sucked as usual, with so many errors and problems no one dared track them all.
The great NV won the key BF3 performance and is still winning it, despite the fanboy characterization in the article above.
At the same time, as the Russian Sensation I don't think I've liked nVidia at all, and that won't change ever; that's why I hate the feeling I get when deep down inside I know nVidia is making big money from Kepler.
It burns me up inside, I'm a raging fuming amd fanboy.
I think GTX780, if based on GK110, could easily retake the performance crown from itself, since nVidia holds it as well right now, something I cannot admit, my rage is so complete on the dark side.
Will NV do it, or continue offering just enough performance to stay near the top but not necessarily the best, like they had with the 8800GTX/GTX280/285/480/580 series? (Did I just say that? I have cut down all those cards as losers for years from nVidia, what is happening to me?!)
And finally, this is why AMD is the very best and at the top of its game and the world: Imagine how much more competitive AMD's GPU division could have been if its CPU division wasn't losing $ hand over fist every year.
I could never admit they were losing money; just last week I ranted and raved about their massive gains from the card sales figures released by Jon Peddie.
I figure if I'm in a deep depression once a year and I actually let the truth slip out in my personal despair my amd handlers will understand....
ROFL - there we are, all fixed.
Zink - Monday, October 22, 2012 - link
It's someone's job at AMD to read these comments, and you're just hurting their head.
formulav8 - Wednesday, October 24, 2012 - link
Do you have a real purpose? Nvidia obviously loves your useless blabbing, and I wouldn't be surprised if you're in on it with them. You even know this, and yet you continue your nvidia-is-god blabbering.
CeriseCogburn - Friday, October 26, 2012 - link
No, just correcting the amd fanboy lies; it's a big job but someone should do it.
nleksan - Thursday, October 25, 2012 - link
How can anyone say this is anything other than a GOOD thing? The people in charge, the ones with their fingers on the big red and big green "Release!" buttons, care about profits and market performance, which come primarily from BUSINESS-SECTOR sales. I swear, the employees of these companies must be laughing their behinds off at the baseless vitriol spewed all over the internet by the ignorant "fanboys" (and fangirls; no sexism here!), while wondering why in the world anyone would get so worked up about a company that they personally have ABSOLUTELY NOTHING TO DO WITH! I don't care if you support AMD or back nVidia or are a die-hard Matrox loyalist or what, the result is the same: you are spending your (likely limited) brain cells defending something that, in reality, is nothing more than a concept, a group of people who collaborate to make themselves rich!
The GPU market is no different than the CPU market, although I must say that I have seen the fanboyism die down much more in the latter, especially over the past year or two, while the former just gets worse! It is truly mind-boggling.
We have three companies, really: Intel, AMD, nVidia. AMD competes against both, although to be fair, they do a pretty darn good job when you look at it that way: one company competing against two companies in two different sectors, each of whom has significantly (as in a LOT) more money/resources available, yet the one company is still remaining competitive to a degree, and certainly isn't bankrupt! I tip my hat to AMD for that, as they are the bravest of the three in many ways (the first and in my mind only REAL "FX" processors which introduced 64bit CPU's to the mainstream enthusiasts while at the same time outperforming Intel's 55% higher-clocked P4 Emergency Edition; their risky acquisition of ATi; their original implementation of the HyperTransport link, doing away with much potential bottlenecking; making dual-channel memory available and worthwhile; and of course, they have always been the ones to take the giant risk of designing a from-the-ground-up-new architecture and have, historically, always been about 3-4 years ahead of the curve). However, sometimes the risks don't pay off, and while we may look back in 5 years and see BD/PD as the beginning of a new era, or do the same for the A8-xxxx/A10-xxxxx APU's, the fact is that RIGHT NOW they are nowhere near the levels of performance offered by the premium Intel chips.
They are doing quite well with their GPU's, although I fear that their entire GCN architecture was laid out too soon, and that they don't have the Ace in the Sleeve that they might need should nVidia decide to go all-out with the 7xx series of GPU's. And perhaps that is what nVidia has been counting on: they have GK100, which is a terrible design for gaming but could be "stuffed" onto a gaming platform card just as with Fermi should the need arise, but unless AMD has some giant surprise in store, they can continue to make money out of an architecture that has already been paid for.
Anyway, I really, really, really don't think that the 7xx series, or rather the 780, will utilize the GK100 core as we know it. I have used the Tesla K20 briefly, and have handled it, and there is NO WAY that chip can be made to fit into a "reasonable" power envelope, nevermind the immense amount of cooling required!
AMD made some very smart moves this round, while I do think that nVidia was either unable to produce the card they REALLY wanted or perhaps just lazy after the unrelenting beat-down they gave AMD in the 5xx series vs 6xxx series "battle".
They handled things much better than they did with the 6970, which could never compete with the 580 even when it had a frame-buffer size advantage; they learned from the mistakes they made, and (unless I state otherwise, I will be referring to only x970/x950 and x80/x70, the flagship cards from each make) they have become absolutely competitive with Team Green.
Some examples of their forward progress:
- AMD gave their cards a 384-bit GDDR5 Bus and 3GB of "the good stuff", while nVidia oddly took a step backward and settled for a 256-bit memory bus
- AMD was first-to-market by a not-insignificant amount of time, which put pressure on their competition, who were (in all likelihood) having some yield issues to begin with
- AMD has taken a very "holistic" approach to product development and integration, with their APU's being actually very advanced compared to anything else similar (oh wait... there is nothing else similar)
- The idea of Hybrid Crossfire-X is ingenious, and while the implementation may need some work, it may become a savior to all those trying to build a "respectable" gaming machine on a budget especially if they are able to make a more well-rounded package with the high-end offering on-die GPU's equivalent to the ~7870/7950, as then a single $600 purchase (idk, but a hex-core 4.5Ghz CPU with 12MB L3 and a, say, 8870/8950 GPU with its own on-package frame-buffer would probably be not-cheap) would cover both CPU and GPU with only a motherboard and RAM left to purchase; of course, when the individual decides that he/she needs more horsepower, $250-350 for a PCIe x16 8950 3GB(4GB? 6GB?) would actually be a SECOND GPU despite there being just one card
- The theoretical implementation of Hybrid CF-X given above would allow for either significant improvements in PCIe bandwidth use for GPU's for those wanting the absolute largest amount of power they can get...
- Or it could free up a huge number of said-lanes for other devices such as 12-18x SATA/SAS 6Gbps Ports, 20+ USB3.0 native ports, Extreme-Quality Onboard Audio supporting 13.4ch surround, native x16 PCIe RAID Hardware Chip allowing for RAID0/1/5/6/10/50/60 from the board with zero CPU overhead and massive throughput, Integrated Solid State NAND Flash and/or micro-SSD's that are of low capacity (~32-64GB) and low cost ($20/ea) but can be Teamed by an on-board quad-core/hex-core SSD Controller so that the more you add the faster it gets (similar to RAID but based on the way an SSD works) while capacity rises as well, Significant reductions in overhead for LAN or other similar external connections, or any few of a thousand other things....
I see the next true "revolution" in GPU tech coming when ALL cards are multi-GPU cards communicating over a practically bandwidth-free pathway with one another, and being able to utilize the entirety of the onboard memory in a shared manner. The only other way I can really see BIG improvements coming, and with 2560x1440/1600 becoming increasingly common not to mention 5760x1080/1200 (or larger) multi-display setups, is to overhaul the way in which Crossfire-X (and SLI) is performed; specifically, to allow the VRAM to "stack" so that having 3-way CFX/SLI with 3x 6GB cards doesn't equal 6GB of buffer but rather 18GB. As we head towards displays beyond 3-Megapixels, a "stacking" frame buffer COULD become vastly advantageous.
Or, perhaps the way to do this is to prevent GPU's from accessing the HDD directly, and instead stick a large number of moderate-size SLC NAND modules and a controller on the GPU's PCB itself, allowing for textures to be loaded first into a 16-way SLC NAND "texture pool" via the system memory, and then allow the GPU to simply send a small Request Signal to the NAND controller regarding what it needs, at which point it is sent to the GDDR5/GDDR6 frame buffer after processing via an extremely short and thus extremely high-bandwidth parallel AND serial pathway. Remove the weakest link, allow the GPU and CPU to communicate much more freely yet with much reduced overhead, and wonderful things may happen.
HOWEVER, I am no hardware/software engineer, so maybe I am just a crazy person. Well, okay, I AM a crazy person, but even a blind squirrel finds a nut ;)
For the record, I am running a 3930K and GTX670 FTW in my newest, biggest build. My first self-built PC, which I still have and which is being completely "polished" and brought back to full strength, consists of an AMD FX-51 (even rarer due to it being one of the first 1,000 chips ever produced, period) and an ATI X800XT-PE. Both systems are/were absolute top-of-the-line for their time, and both overclock abnormally well (easily hit 5.1ghz on 3930K and 1400core/7800mem on 670, both using custom water; the FX-51 was 24/7/365 stable for 4yrs at 2.45Ghz with a circa-2003 air cooler and the X800XT-PE was at 604/1292 with the stock cooler and the simple addition of a PCI slot blower).
I buy whatever gives me the best price:performance to price:longevity total ratio, and this time I felt Intel and Nvidia simply couldn't be beaten. I know that there is not a single TRADITIONALLY COOLED system made of AMD parts, with equivalent numbers of components (i.e. 1x/ea CPU/GPU) that can beat mine in anything and I am happy, yet I am also sad.
I am sad because WE NEED for a "neck and neck" competition, or prices will be controlled not by market but by manufacturer, and innovation WILL stagnate.
Let's all of us get a case of Red-Green colorblindness for a while and do what we can to help ensure that ALL OF US win in the end, what do you say?
CeriseCogburn - Friday, October 26, 2012 - link
I read about the first paragraph, then I scrolled down, and saw the gigantic, obviously totally hypocritical bs. You cared so much you nearly wrote a freaking book. Well, read it yourself; you're more obsessed than anyone else here, obviously.
What a doof.
Gastec - Tuesday, November 13, 2012 - link
I normally don't post but I had to come online just for you. You are a troll and you will be banned.
akamateau - Saturday, October 27, 2012 - link
Does anyone else consider that NEXT is not only used by AMD, as in AMD NEXT, but evidently is also in the name of Microsoft's XBOX NEXT? Apparently XBOX 720 is out and XBOX NEXT is in. Is it because AMD NEXT is INSIDE?