
Intel takes another jab at Nvidia during its research day

Justin Rattner almost made it through his speech at Intel’s Research Day without controversy. But the chief technology officer of Intel decided to take questions. Then, as he was about to get off the stage, someone asked him what the world’s biggest microprocessor maker thought about graphics processors.

If you haven’t noticed lately, Intel and Nvidia are in a war of words about the usefulness of graphics chips. Intel would love to see them disappear so that it can do all of the important tasks inside your computer. Nvidia’s very existence depends on proving Intel wrong. They are, of course, both too shortsighted to see that they need each other badly to thrive in the personal computer industry.

The question set off a sequence I’ve seen before. Intel makes central processing units, or CPUs, and it prefers that people pay more money for its CPUs than for the stand-alone graphics chips, or GPUs, from Nvidia or Advanced Micro Devices’ graphics division. Rattner chose to needle Nvidia, the die-hard GPU company based in Santa Clara, yet again. He said Nvidia’s core graphics technique, dubbed rasterization (where a computer draws a scene by projecting its triangles onto the screen and layering the shaded results on top of one another), would become obsolete.
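
For readers who want a concrete picture, here is a minimal sketch of what rasterization means in code. It is purely illustrative and assumes nothing about Intel’s or Nvidia’s actual pipelines: each triangle is projected onto a grid of pixels, and a depth buffer decides which surface’s color ends up on top.

```python
# Illustrative-only rasterizer sketch (not Intel's or Nvidia's code): triangles
# are layered into the image pixel by pixel, with a depth buffer deciding which
# surface wins where they overlap.

WIDTH, HEIGHT = 16, 8

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: positive when point (px, py) lies on the inside of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(triangles):
    depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
    color = [["."] * WIDTH for _ in range(HEIGHT)]
    for (x0, y0), (x1, y1), (x2, y2), z, ch in triangles:
        for y in range(HEIGHT):
            for x in range(WIDTH):
                # Point-in-triangle test using three edge functions.
                w0 = edge(x1, y1, x2, y2, x, y)
                w1 = edge(x2, y2, x0, y0, x, y)
                w2 = edge(x0, y0, x1, y1, x, y)
                if w0 >= 0 and w1 >= 0 and w2 >= 0 and z < depth[y][x]:
                    depth[y][x] = z        # nearer surface wins this pixel
                    color[y][x] = ch
    return "\n".join("".join(row) for row in color)

# Two overlapping triangles at different depths; the nearer one ("B") covers part of "A".
print(rasterize([
    ((1, 1), (14, 1), (1, 7), 2.0, "A"),
    ((6, 2), (15, 6), (6, 6), 1.0, "B"),
]))
```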

This war of words is getting kind of old. I wrote about it last year after I went to Intel’s Research Day event. Last year, Rattner and one of his top graphics guys, Daniel Pohl, showed off how you could do graphics on a CPU using a technique known as ray tracing. Ray tracing essentially shoots rays from the viewer’s eye through each pixel of the screen and draws only the objects those rays hit. It doesn’t waste time drawing things that are hidden and, according to Intel, it is best done on the CPU.
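
To make the idea concrete, here is a tiny, illustrative ray tracer, a sketch of the general technique rather than Intel’s or Pohl’s renderer. It fires one ray through each character-sized “pixel,” tests it against a couple of spheres, and shades only the nearest hit, so occluded surfaces cost nothing to draw.

```python
# Minimal ray tracing sketch (illustrative only): one ray per pixel, and only
# the nearest object hit by that ray gets drawn.

import math

WIDTH, HEIGHT = 40, 20

# Scene: spheres given as (center, radius, character used as "color").
SPHERES = [
    ((0.0, 0.0, 5.0), 1.5, "#"),
    ((1.2, 0.8, 3.5), 0.8, "o"),   # closer sphere partly occludes the big one
]

def hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                        # ray misses this sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

def render():
    rows = []
    for j in range(HEIGHT):
        row = []
        for i in range(WIDTH):
            # Ray from the eye at the origin through this pixel on a virtual screen.
            x = (i - WIDTH / 2) / (WIDTH / 2)
            y = (HEIGHT / 2 - j) / (HEIGHT / 2)
            direction = (x, y, 1.0)
            nearest, pixel = float("inf"), "."
            for center, radius, ch in SPHERES:
                t = hit_sphere((0.0, 0.0, 0.0), direction, center, radius)
                if t is not None and t < nearest:   # keep only the closest hit
                    nearest, pixel = t, ch
            row.append(pixel)
        rows.append("".join(row))
    return "\n".join(rows)

print(render())
```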

Nvidia, of course, begs to differ. It believes that rasterization has a lot of mileage left in it and that ray tracing could be useful in a limited number of graphics applications. Ray tracing, it believes, can be done perfectly easily on a GPU. Nvidia bought a small ray tracing company recently, Rattner said, and he noted later in an interview that Nvidia was trying to hire Intel’s ray tracing experts.

Rattner said, “Their actions speak louder than their words.”

But Nvidia said that it bought the company not to shift to ray tracing but to create chips where hybrid techniques are used. Nvidia realizes it just has to keep rasterization moving forward, always staying one step ahead of Intel.

The problem for Intel is that ray tracing takes a huge amount of processing power, and Intel’s CPUs aren’t up to the task yet. Last year, Pohl showed off the vintage shooting game “Quake III” running on a multi-core CPU at reasonably good speeds. Pohl (pictured here) was back again this year with a monster setup packing four chips and 16 cores, running last fall’s game, “Enemy Territory: Quake Wars.” It ran at 16 frames per second, not the usual 60 frames per second that is possible on a good graphics chip.

Pohl says that programmers will like ray tracing better because they can accomplish many complex tasks with simple, one-line routines. Next year, Pohl promises better results, thanks to the inexorable progress of chip technology. Rattner notes that the way to find out whether rasterization or ray tracing is more efficient will be to measure the cost and energy it takes to generate each pixel on the screen.
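
To show what Rattner’s yardstick would look like in practice, here is a back-of-the-envelope sketch of energy per pixel. The wattage and resolution below are hypothetical placeholders, not measurements from either company’s hardware; only the 16 and 60 frames-per-second figures echo the numbers above.

```python
# Back-of-the-envelope sketch of Rattner's metric: energy per rendered pixel.
# All hardware numbers below are hypothetical placeholders for illustration.

def joules_per_pixel(power_watts, frames_per_second, width, height):
    """Energy per rendered pixel = power / (pixels rendered per second)."""
    pixels_per_second = frames_per_second * width * height
    return power_watts / pixels_per_second

# Hypothetical comparison: a many-core CPU tracing rays at 16 fps versus a GPU
# rasterizing at 60 fps, both at 1024x768 and both assumed to draw 150 watts.
cpu_ray_tracing = joules_per_pixel(150, 16, 1024, 768)
gpu_rasterizing = joules_per_pixel(150, 60, 1024, 768)
print(f"CPU ray tracing : {cpu_ray_tracing:.2e} J/pixel")
print(f"GPU rasterizing : {gpu_rasterizing:.2e} J/pixel")
```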

Note: CNET blogger Peter Glaskowsky wrote a response to this post on his own blog.

Are you yawning with me? Make no mistake. This is the greatest competitive battle now in progress in the PC chip industry. It’s just that I’ve written about it for a couple of years now. Hubert Nguyen of Ubergizmo said he was getting bored by the rhetoric. There is a lot of bluster, but very little substance. It’s like a Phoney War is under way, and the shooting hasn’t yet started.

The next battle in this war will come in August at the Siggraph graphics conference in Los Angeles. There, Intel will describe its Larrabee processor, a stand-alone graphics chip that will be capable of both rasterization and ray tracing.

Jon Peddie, the longtime graphics analyst who heads Jon Peddie Research, will be staging a luncheon on ray tracing versus rasterization during Siggraph on Aug. 13. John Casey, an Intel PR guy, laughed when he saw Peddie and me standing in front of Pohl’s demo, as if he could predict what we were talking about.

I told Rattner that I was almost certain I’d be talking about this with him again a year from now. Like Nguyen, I’m a little fed up with both sides at this point. The battle is going to play out over a couple of years. But at some point, somebody is going to come out as King of the Hill. When the shooting war starts, it won’t be boring. Get on with it.
