Nvidia declares the CPU dead
Mon Apr 28, 2008 2:21 pm
Nvidian boy Roy basically declares that the CPU is dead, and Nvidia's chips do all the real work in a PC.
To back up his claims, he quotes almost an entire article from TGDaily, penned by our old mucker Theo Valich, and found here.
We say "almost", because Roy omits the bits he doesn't like and highlightsthe bits he does.
Just in case you're wondering, here's a bit he missed off:
"In case you wonder, no, Nvidia's CEO did not deliver explanations on theresults of Windows hardware survey which blamed nv4_displ.dll driver for almosta third of BSODs in Windows Vista (Google search will reveal around 613.000results for a "Nvidia BSOD" search)."
The following letter is reproduced exactly as we got it with only formatting changes made. Links were removed during HTMLising, but can be found in the original article which, ironically, wasn't linked in the email. Almost like they didn't want you to read the full version for some reason. µ
--------------------------------------------------------------------------------------------------------------
From: Roy Taylor [mailto:RTaylor@nvidia.com]
Sent: 10 April 2008 23:36
Subject: The best job in the world.
Guys I have the best job in the world. Official. I cant tell you how much fun we're having here right now.
I don't know how much this will mean to you all but for those that don't know, a war has just started that will likely be written about for years and which will affect everyone who owns a PC. Everyone.
Basically the CPU is dead. Yes, that processor you see advertised everywhere from Intel. Its run out of steam. The fact is that it no longer makes anything run faster. You don't need a fast one anymore. This is why AMD is in trouble and its why Intel are panicking. They are panicking so much that they have started attacking us. This is because you do still [need] one chip to get faster and faster – the GPU. That GeForce chip. Yes honestly. No I am not making this up. You are my friends and so I am not selling you. This shit is just interesting as hell.
Today your PC plays video (its our chip that makes that work), you play games (its our chip that makes that work), you rip movies (yup our chip again) – you get the picture?
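For what it's worth, Roy's "our chip makes that work" pitch is really a claim about data-parallel workloads: they keep scaling on the GPU after the CPU flattens out. As a purely illustrative sketch (PyTorch stands in here as a convenient modern tool, not anything from this 2008 thread), the snippet below times the same large matrix multiply on the CPU and, if a CUDA-capable GeForce is present, on the GPU:

Code:
# Illustration only: time one large matrix multiply on the CPU and, if a CUDA
# device is present, on the GPU. PyTorch is used purely as a convenient modern
# stand-in; it is not something anyone in this thread was using in 2008.
import time
import torch

def time_matmul(device, n=4096):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()          # finish setup before starting the clock
    start = time.perf_counter()
    c = a @ b
    if device == "cuda":
        torch.cuda.synchronize()          # wait for the GPU kernel to complete
    return time.perf_counter() - start

print("CPU matmul: %.3f s" % time_matmul("cpu"))
if torch.cuda.is_available():
    print("GPU matmul: %.3f s" % time_matmul("cuda"))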
Today we hit back at Intel. This is what the press are saying, I thought you'd be interested…
-------------------------------------------------------------------------------------------------------------
The visual computing clash: Nvidia CEO opens a can of whoop-ass for Intel
Business and Law
By Theo Valich
Thursday, April 10, 2008 17:12
--------------------------------------------------------------------------------------------------------------
Santa Clara (CA) – Nvidia and Intel are on a crash course: With Nvidia moving its GPUs into potential CPU territory and Intel tuning CPUs to take over GPU territory, you have a classic scenario for a confrontation between two industry giants that have the same goal – to shape the era of visual computing. Nvidia's chief executive officer Jen-Hsun Huang today lashed out at recent Intel announcements and claims that indicated how the company wants to build up its graphics front line. Huang chose strong, emotional words to strike back, calling Intel's second discrete visual computing offering "Laughabee".
Huang, known for his great passion for the company he founded, apparently has been hit on the wrong nerve. He opened Nvidia's financial analyst day by explaining that "Nvidia is a Visual Computing company, not a semiconductor corporation" and that his goal is nothing else but "to make GPUs better and deliver great experience". But the opening lines quickly shifted into another dimension when he compared Intel's performance roadmap from IDF Spring 2008 and Nvidia's current products.
"Intel is false. They have crossed the line, they're saying false things."
"They say", Huang stated, "Nvidia is going to be dead. Their graphics are good, but we'll put graphics into the CPU and there is no place for them to stick it." He went on to compare Intel's current Core 2 platform with the next-gen processors and said that it would be "nothing else but putting more transistors [on it] instead of thinking of a solution."
"People don't buy Nvidia products because they have to, because they'reallowed to. They buy our stuff because they want to. They're overwhelmed by thevalue and the benefit we bring," Huang noted.
"This team [Nvidia] is like a Ferrari team. We know how to bring visualtechnology to life. We bring 20-30-40x the performance advantage and 27x theprice/performance ratio". Even if Intel was able to deliver a 10-foldperformance increase, the company would still not be able to reach catch up withNvidia and AMD in the discrete space, Huang said.
Jen-Hsun also commented on an article by Jon Peddie showing the last ten years of the graphics market, recently published on TG Daily, stating that Nvidia went through a lot of competitors and sees Intel as just one of them.
Intel's Larrabee was called "Laughabee". Much of the performance provided by this card will in fact depend on quality drivers for the DirectX and OpenGL APIs. Huang openly doubted that Intel can deliver workable drivers, judging by their current state of incompatibility. Bear in mind that Intel's integrated graphics parts don't yield great results in Microsoft DCT tests, and most of the issues are waived by WHQL Labs due to the lack of hardware support. Then again, you should not consider Intel's integrated graphics to be garbage because of waivers on the DCT test (Nvidia had the same issues with the GeForce FX and 6/7 series of products).
Over the past few weeks, numerous Intel representatives were talking about Intel's visual computing ideas – starting with Paul Otellini's presentation at the firm's analyst day, Pat Gelsinger's pre-IDF briefing and more aggressive information that was coming out of IDF, ranging from the integration of graphics into the Nehalem CPU to the company's first discrete graphics card, for which the company is creating lots of hype.
We were willing to give Intel the benefit of the doubt on future parts, but the fact of the matter is that their current integrated graphics systems will probably end up costing Microsoft billions of dollars, leaving an integrated PC platform that is believed to be slowly pushing the mainstream PC market into the console market. Given the number of issues that Intel integrated graphics faces today, including the criticism coming from industry gurus such as Tim Sweeney and John Carmack, you could expect Nvidia to take aim at Intel today.
While it certainly looks as though Intel and Nvidia are heading into a confrontation, it appears that some information may also have got out of hand. For example, we were contacted by Intel about a recent article in which an Intel engineer stated that people "probably" won't need graphics cards in the future anymore. In a statement sent to us by email, the company said:
"Intel is not predicting the end of the discrete graphics business.Moore's Law has allowed Intel to innovate and integrate. As a result, we expectthat we and others will integrate graphics and visual computing capabilitiesdirectly into our CPUs in the future much like floating point coprocessors andother multimedia functions have in the past. However, we don't expect that thisintegration will eliminate the market for higher-end discrete graphics cards andthe value they provide."
-----
Roy Taylor
VP Content Business Development (CBD) Relations
NVIDIA Corp. Cell +1 408 XXX XXXX
This email message is for the sole use of the intended recipient(s) and may contain confidential information. Any unauthorized review, use, disclosure or distribution is prohibited. If you are not the intended recipient, please contact the sender by reply email and destroy all copies of the original message.
-----------------------------------------------------------------------------------
Note
TGDaily article reprinted here with permission. Edits by Nvidia without permission.

Contributed by Editorial Team, Executive Management Team

Tue Apr 29, 2008 7:43 am
I didn't read all of this but I find this article somewhat agreeable. First of all, the GPU does not do all the work (unless Roy was just exaggerating), not even most of the work. It just takes the most stress. Is going past a 2.8GHz dual core worth your money? Certainly not. I have a 2.4GHz Socket 939 chip and it still doesn't reach 100% during gaming. I feel processor companies are in trouble, but not as much as this article says they are. Intel is huge; they can afford to lose some money if they stop these stupid projects and advertisements. AMD can't afford to lose money, in fact they're getting pretty far behind, but they made one smart decision - they merged with ATI. This way they still get money from the graphics department. So far AMD and ATI have made a significant difference working together, but Nvidia is still the better choice.
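If anyone wants to sanity-check the "CPU never hits 100% while gaming" point on their own machine, here is a rough Python sketch (assuming the third-party psutil package is installed) that samples per-core load for a minute while a game is running:

Code:
# Rough per-core CPU load sampler (assumes the third-party psutil package is
# installed). Leave it running in the background while you play and see whether
# any single core ever pegs at 100%.
import psutil

samples = []
for _ in range(60):  # one sample per second for a minute
    samples.append(psutil.cpu_percent(interval=1.0, percpu=True))

peak_per_core = [max(core) for core in zip(*samples)]
print("Peak load per core:", ", ".join("%.0f%%" % p for p in peak_per_core))
print("Busiest core peaked at %.0f%%" % max(peak_per_core))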

So again, I don't believe Intel and AMD have anything to panic about; they just have to expect a lower income from now on (except AMD/ATI, if their video cards become faster than Nvidia's).

Contributed by schmidtbag, iVirtua Leading Contributor
