Comments on AIMeD Corporation: All Your Fabs Are Belong To Us - TSMC (Roborat, Ph.D)

[2007-11-20 06:57]

Anonymous poster said:
<I>I fully intend to post the 3.0GHz data on his site when it comes out in H2'08 and compare it retroactively to a 2.93GHz Kentsfield to better understand his concept of a "stone cold killer".</I>

http://enthusiast.hardocp.com/article.html?art=MTQyMiw2LCxoZW50aHVzaWFzdA==

There is an overclocked 3GHz Phenom and a 3GHz Kentsfield there, which is close enough to 2.93GHz.

Stone cold killer my ass! ROFL

— Anonymous

[2007-11-19 22:12]

My post copied for posterity from Scientia's blog.

<B>Ho Ho</B>
<I>If not then I'm afraid only savior AMD can hope is Bulldozer</I>

To be blunt, AMD have almost certainly known since before this past July that K10 would simply be a stopgap to Bulldozer. During the July Analyst Day conference, AMD essentially downplayed K10 and talked up <I>Bulldozer</I> as the real deal, probably for the sole purpose of winning over gullible investors like the Abu Dhabi government so that AMD can survive until late 2009, when Bulldozer is slated to launch.

It hardly matters anymore if 65-nm K10 can hit 3.0 GHz by Q2'08 with decent thermals.
Its IPC-to-die-size ratio is so far behind Yorkfield's, and Yorkfield's pricing is so aggressive, that AMD will not be able to make enough money on K10 to stop the bleeding even if they can get faster clocks out by Q2. At this point we're looking at 2.4 GHz and <I>hopefully</I> 2.6 GHz by late Q1. I doubt that 2.8 GHz will emerge until Q2 at the earliest.

Shanghai in late 2008 would bring the die down to competitive sizing, but the big question is whether AMD can bring the leakage down to reasonable levels to clock them competitively.

Effectively, AMD are screwed for 2008 with their current form of operations. They must either raise a lot more cash or start selling assets. The Abu Dhabi windfall buys them a quarter or two of time to consolidate and execute on asset lite.

— Axel

[2007-11-19 21:51]

Link for statement of the post just above:
http://www.legitreviews.com/article/597/13/

— Anonymous

[2007-11-19 21:51]

Ouuuuuchhhh!

"We were able to overclock the 2.6GHz Phenom 9900 to 3.06GHz and even at 3GHz it didn't pose a threat to really any of the Intel processors, so the only way AMD is going to be able to compete with Intel is by lowering prices once again and pitching the whole platform and not just the processor."
Even if you get a magical bin split and get to 3 GHz, it still does not affect the overall competitive landscape.

— Anonymous

[2007-11-19 21:23]

"So out of curiosity, I wonder if Rahul Sood of HP/VoodooPC knowingly lied when a couple months ago he spouted the nonsense that Phenom 3.0 GHz would "kick the living crap" out of any CPU then on the market. Or did he unknowingly regurgitate the lies told him by someone at AMD?"

I challenged him on that (I must credit him for having the integrity to publish and respond to the comments). He stood by the comments, said he doesn't speculate on performance (i.e. it was measured), and that he continues to stand by the comment.

He did get a little squirrelly, as he says his comment was "kick the living crap" out of CURRENT processors at the time (which I think was the 2.93 Kentsfield?).

When I pointed out that that may have been a bit of a copout, as there was no reason to believe a 3GHz Phenom was due out soon when he made that comment, he said he had an inside source which led him to believe it would be coming soon. I also don't think he is allowed to publish the data (either he saw the demo or did the demo with an agreement not to publish any data).

I fully intend to post the 3.0GHz data on his site when it comes out in H2'08 and compare it retroactively to a 2.93GHz Kentsfield to better understand his concept of a "stone cold killer".

— Anonymous

[2007-11-19 20:25]

<B>tonus</B>
<I>"I don't care if your CPU has better IPC, if it's not the fastest at a given price point, I'm not buying it!"</I>

but ...
how can you ignore things that are just better by design? I mean, K10 is true quad, it has multiple cache levels for optimal performance. Also split memory controllers and an IMC with uber-advanced HT. Sure, even with all those things making it one of the best CPUs in the world, its final performance is still lacking, but so what? The CPU is just beautiful!

:P

In other news, AMD says its <A HREF="http://www.extremetech.com/article2/0,1697,2218304,00.asp" REL="nofollow">9900 at 2.6GHz will be released late Q1 next year at 140W.</A>
Anyone want to guess when a 3GHz quad with normal thermals will be available? Would it be before next Christmas?

— Ho Ho

[2007-11-19 20:05]

<B>jumpingjack:</B>

<I>Unfortunately, he does think he is an authority on the subject, and he states with such conviction that he convinces the ranks of AMDzone that he is some sort of God. So yes, many believe his antics.</I>

It is very likely that a large part of his audience is even less informed on the subject than he is. Being in that category myself, I couldn't tell you if he knows his stuff or is just blowing hot air. I read these sites with interest but am also aware that while I can grasp some of the technical discussion, I lack the experience and schooling to know for sure what is right and what is mistaken.

I think a lot of visitors are like that. If they are heavily pro-AMD, they believe what he is saying because it's what they want to be true. If they are heavily pro-Intel, they hope that he is mistaken because that is what they want to believe. But in the end, the results are what will matter. It's interesting to read all this stuff, but I prefer not to invest myself personally in it.
My next CPU or GPU purchase won't be based on RDR or MCM or whatever acronyms we can throw on the pile. My purchases will be based on performance on applications that matter to me. I don't care if your CPU has better IPC, if it's not the fastest at a given price point, I'm not buying it!

— Tonus

[2007-11-19 14:51]

So out of curiosity, I wonder if Rahul Sood of HP/VoodooPC <I>knowingly</I> lied when a couple months ago he spouted the nonsense that Phenom 3.0 GHz would "kick the living crap" out of any CPU then on the market. Or did he <I>unknowingly</I> regurgitate the lies told him by someone at AMD? Either way, I'm now crossing Sood off my list of credible sources. Good riddance!

— Axel

[2007-11-19 09:12]

<I><B>Thanks, AMD.</B>
November 19, 2007 – 6:56 am
Thank you for not just letting me down, but letting me down in the most unimaginative way possible. It’s like you didn’t even try. First, you told me your quad core was going to be WAY faster than any of Intel’s offerings, and that turned out to be a bust. And when that didn’t work, you conjured up some sort of proprietary-esque “platform” nonsense. It’s like you’re purposefully trying to hurt my feelings.

I had high hopes for you, AMD, but I don’t see a point in continuing this relationship. I told people to wait for you. I told people things would shape up. And what do you do? You drop a shitty quad-core CPU with shitty clock speeds, shitty benchmarks, and tell me that it’ll work great with your shitty new video card and your shitty new chipsets. WHY did you have to go and buy ATI? What was the point, really? Nobody cares about ATI anymore.
The people who still care about ATI and AMD are the same people who flush money down the toilet for fun.

I don’t care if your new processor is cheap. Cheap doesn’t compensate for suck. Go fuck yourselves, AMD. And don’t bother calling me. We’re through.</I>

<A HREF="http://jonathanhatch.com/thanks-amd" REL="nofollow"><STRONG>Thanks, AMD</STRONG></A>

Ha ha ha ha ha

— Anonymous

[2007-11-19 08:40]

"I read it sometimes for a laugh and one thread had Sci reasoning that the triple core was delayed because yields on the quad core were so good. Then there were comments like 'good point, most people would have missed that', etc. Absolutely hilarious."

I recall seeing that too (though I don't recall if it was Sci) and nearly blew the soda I was drinking through my nose, I laughed so hard.

I couldn't find it, but I found this Scientia "nugget" (his blog, Oct '07):

"So, I'm thinking Barcelona is at least 10% of production and that isn't bad for initial launch. I'm thinking that AMD is ramping normally but that the demand is high enough that there are still shortages. The descriptions that I've seen suggest that AMD is ramping aggressively."

I'M THINKING his 'thinking' is more HOPING and WISHING than actual thinking!

And this one for the next time he says he made no claims about 3.0GHz chips (this was a comment in his '2007: The second half' blog):

"Also, AMD will almost certainly have 3.0Ghz quads by Q1 08"

Of course this could have been a typo and meant to be Q1'09? I'm sure if confronted (and in the event he doesn't just censor the post), he'd say that 'almost certainly' doesn't mean "definitely" or some other ridiculous backtrack.
He had other comments to this effect (and 2.8s in Q4, with an outside chance of 3.0); however, I can't find those.

— Anonymous

[2007-11-19 08:19]

Ho Ho --

Read AnandTech's review, page 3:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3153&p=3

Bravo to Anand for standing his ground.

Firingsquad went to the Tahoe dog and pony show, but they labeled it as such:
http://www.firingsquad.com/hardware/amd_phenom_preview/

— Anonymous

[2007-11-19 08:13]

Has anyone else wondered why AMD didn't give journalists retail CPUs? Perhaps if it had, there wouldn't have been enough left to send to retailers.

— Ho Ho

[2007-11-19 07:43]

THG - "Our engineering sample was very promising indeed. We were able to overclock the CPU by 25%, resulting in a 15% performance increase in 3DMark."

Promising? A 25% overclock? And how about that scaling - if this scaling holds true for the actual retail chips, there's no way this will be competitive with Penryn (putting aside the fact that Penryn will also likely clock higher). It really looks like AMD will have to price these lower to compensate for lower performance for the foreseeable future.

I'm not a big 3DMark benchmark person - is this benefit somewhat representative of what you would expect to see in other applications?

I'm curious as to why Tom views this result as promising.
Also, I wonder if anything should be read into AMD not allowing folks to play with Vcore - with a power measurement, this probably could have given folks an idea of how good/bad the leakage is on these things. Or perhaps AMD just assumed the journalists wouldn't know what they were doing and didn't want them to fry the chips - still, it would've been nice to see any Vcore change.

I guess we'll have to wait until these become available tomorrow for a review site to purchase one and do an overclocking review (he says, laughing).

— Anonymous

[2007-11-19 07:23]

"I also find it amusing that it appears as though the triple core has slipped from Q1 to Q2... I mean, slipping a defective product? How sad is that, they can't even get a part with a defective core out the door! Perhaps they can just keep downbinning these into dual cores and then they won't even need to bother with 'native' dual cores."

NOW is the time I really wish amdzone's archive was still in place. I read it sometimes for a laugh, and one thread had Sci reasoning that the triple core was delayed because yields on the quad core were so good. Then there were comments like "good point, most people would have missed that", etc.
Absolutely hilarious.

— Anonymous

[2007-11-19 07:14]

"so wait, all the review sites have the Phenom 9700 on display in their benchmarking but just at the last moment AMD decides to delay it till Q1 2008?"

Well, AMD is in the habit of publishing benchmark data on processors they are not launching at that speed, aren't they?

Anand did an interesting side-note editorial:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3153&p=3

AMD is in rough times, there is no doubt about that.

Anand's review is somewhat interesting: he was able to clock one of the 2.4 GHz Phenoms to 2.6 GHz (basically AMD's FX processor, when and if it actually launches), and it still cannot beat the Q6600 :) :)

Catch that: "AMD's fastest Quad is Fragged by Intel's Slowest Quad" - a headline you will not see on Sharikaboob's blog.

— Anonymous

[2007-11-19 07:09]

"so wait, all the review sites have the Phenom 9700 on display in their benchmarking but just at the last moment AMD decides to delay it till Q1 2008?"

Come on, you're such a cynic - the INQ article says the error only occurs when all cores are loaded and certain circumstances are run. Clearly this would have taken months to find out, and it was mere COINCIDENCE that they found it right before launch!

You don't think they pulled it this late knowing that the sites would make a note of the lack of availability but in all likelihood keep the benchmarks in the reviews anyway?

It is also "obvious" that this L3 issue is the ONLY reason why the clocks are so far below expectations.
Well, it's either that or AMD is 'focusing' on the low-end products customers are apparently demanding.

— Anonymous

[2007-11-19 07:00]

Intel should send out their lower-clocked Yorkfield and Wolfdale CPUs to reviewers just to fill up their benchmark pages.

I mean, sure, you won't be able to buy them till Q1 2008, but why let that interfere with good benchmarking? Just ask AMD. =)

— Anonymous

[2007-11-19 06:56]

Hexus about sums it up:

"But what we've also seen is that AMD cannot match the clock-speed of Intel's slowest quad-core processor and, worse still, can't match Core 2 Quad's performance on a clock-for-clock basis either."

"Bottom line: the new Phenom quad-core processor and 7-series chipset pack in some potent technology. Trouble is, Intel got there first. You need to be better than the competition if coming from behind: AMD's new launches aren't quite that."

— Anonymous

[2007-11-19 06:56]

so wait, all the review sites have the Phenom 9700 on display in their benchmarking but just at the last moment AMD decides to delay it till Q1 2008?

wow, is that bait and switch or what?
So technically, the reviews all have a CPU on display that is vaporware, even though their articles are written with the 9700 in mind.

Although the Phenom 9700 still loses to the Q6600 in performance, it at least matches it in clockspeed, an important frame of reference for those "clock vs clock" people.

But in reality, AMD has caught up to Intel's _slowest_ consumer quad core in neither clockspeed nor IPC.

— Anonymous

[2007-11-19 06:53]

The reviews look about where things were expected, given the recent lowering of expectations. Clearly the "40% better" has turned into "well, when you consider the price, it's kind of competitive..."

What's scary, though, is the power numbers, if the THG review is to be believed. The idle was horrendous compared to AMD's K8 - the K10 was more than double the highest-bin X2 (and at a MUCH lower clock). It also only seemed competitive with, or slightly better than, Intel's 65nm quad (I was expecting better performance here).

The Hexus review was similar - even with a total system power measurement (which hits Intel harder because of the northbridge), the Q6600 was better at idle and under load. This doesn't even begin to factor in what is already known about the 45nm power consumption numbers.

I also find it amusing that it appears as though the triple core has slipped from Q1 to Q2... I mean, slipping a defective product? How sad is that, they can't even get a part with a defective core out the door!
Perhaps they can just keep downbinning these into dual cores, and then they won't even need to bother with "native" dual cores.

As JJ astutely pointed out, it is rather obvious why this is a "platform" launch and not a "CPU" launch - AMD is clearly trying to take some attention away from the CPU performance.

— Anonymous

[2007-11-19 06:18]

<I>Meyer will inherit a trainwreck of massive proportions from Ruiz.</I>

Indeed. Dirk Meyer is in for one hell of a job. The only thing going for him is that Ruiz has already done the damage. If he can save AMD, he will be heralded as a great CEO. If he fails and AMD goes under, Ruiz will get the blame.

<I>Confirmed to be slower than Kentsfield per clock, as the illicit previews all year have shown.</I>

That's just pathetic. And these reviews aren't even counting the fact that Intel goes to 3GHz AND that Yorkfield is coming for the masses in January. Yorkfield offers a 5% IPC gain without SSE4, and that can go much higher when SSE4 is used.

— Unknown

[2007-11-19 06:08]

"Also, AMD was running 3.0GHz demos a while ago - did they just find this out now or did they bury the issue and are now just raising it as a convenient excuse to buy time as they fix other problems?"

I agree, the possibility also crossed my mind.

"Charlie (unknowingly?) to spin things."

Nah, the article was written by Theo Valich. I agree with the rest of your supposition, however.
Besides, in Charlie’s defense, I think he’s done quoting AMD’s spin/pabulum without reservation; unfortunately, he learned it the hard way.

“None of these are execution/engineering issues.”

Exactly the point; this is 100% on target.

Let’s take this a bit further, shall we? If they hadn’t spent $5.4B on a graphics company, perhaps they could have spent a little money on some midnight oil.

Then, why bother? It has never been said, or leaked, how many of AMD’s engineers knew (the IBM tunneling read above) that quantum effects would be a limiting factor in Barcelona’s viability. Obviously, the handwriting was on the wall (IBM’s), literally.

Personally, I put these engineers in the GURU league of chip genius. If you knew it, I’m sure they knew it. Perhaps management rammed it down their throats anyway!

Then again, I’m a union electrician; if I think something ain’t gonna fly, I’ll tell ’em to shove it, and I have. Perhaps highly disciplined corporate engineers don’t have that luxury, the poor bastards.

But dumping on the guys out in the field? Nothing infuriates me more. This is what I believe these lying sons of bitches are doing: they have found a very convenient scapegoat, the guys who’ve been busting their asses for over a year trying to make this pig fly.

SPARKS

— Anonymous

[2007-11-19 06:06]

<B>Giant</B>

<I>Hexus PHENOM review</I>

Confirmed to be slower than Kentsfield per clock, as the illicit previews all year have shown.
Even worse, the 2.4 GHz Phenom 9700 has apparently been delayed to <A HREF="http://www.fudzilla.com/index.php?option=com_content&task=view&id=4259&Itemid=1" REL="nofollow">late Q1'08</A>.

Here are a couple of other reviews confirming this cold hard truth:
<A HREF="http://www.hothardware.com/Articles/AMD_Spider_Platform__Phenom_790FX_RV670/" REL="nofollow">Hot Hardware</A>
<A HREF="http://www.tomshardware.com/2007/11/19/the_spider_weaves_its_web/" REL="nofollow">Tom's Hardware</A>

Meyer will inherit a trainwreck of massive proportions from Ruiz.

— Axel

[2007-11-19 05:59]

Holy PR, Batman:

http://biz.yahoo.com/bw/071119/20071118005069.html?.v=1

"In a new initiative to measure real-world processor power consumption, AMD surveyed consumer and commercial users to understand precise usage patterns. AMD measured power consumption for these usage patterns and has found that AMD Phenom processors with Cool’n’Quiet 2.0 technology rated at 95W TDP can consume an average power of 32W for consumers and 29W for commercial users."

They neglected to mention the part of the initiative involving drugging these users so they would pass out and the "usage" pattern would consist of things idling! (Actually, if you read the notes, it assumes 39-44% idle time.)

Seriously, why not do a similar comparison study with K8 users to see what those numbers would look like? Then they could truly show off the 'new' benefit of the new platform. Why not? Perhaps the power levels have more to do with the measurement technique than the actual product?!? Just a thought. How about doing it with an Intel CPU?
Fantastic scientific discipline - let's do a study and provide no baseline or reference point from which to draw a reasonable conclusion!

"AMD Phenom processors 9600 (2.3GHz) and 9500 (2.2GHz) are now available for $283 and $251"

A slightly-more-than-10% cost difference for 100MHz, which is less than a 5% speed delta... why even bother with two bins at this point? Heck, might as well sprinkle some triple cores in there (they've probably got tons of those lying around).

— Anonymous
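The bin-gap arithmetic in that last comment is easy to verify. Here is a quick sketch (the prices and clock speeds are the ones quoted from AMD's press release above; the helper function is just for illustration):

```python
def pct_delta(new: float, base: float) -> float:
    """Percentage increase of `new` over `base`."""
    return (new - base) / base * 100.0

# Phenom 9600: 2.3 GHz at $283; Phenom 9500: 2.2 GHz at $251
price_premium = pct_delta(283, 251)  # extra money for the faster bin
clock_gain = pct_delta(2.3, 2.2)     # extra clock you get for it

print(f"price premium: {price_premium:.1f}%")  # ~12.7%
print(f"clock gain:    {clock_gain:.1f}%")     # ~4.5%
```

So the faster bin costs roughly 12.7% more for about a 4.5% clock bump, consistent with the commenter's ">10% cost for <5% speed" complaint.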