
First Dual-Core Pentium 4 a Rush Job, Intel Says


Intel's first dual-core chip was a hastily concocted design that was rushed out the door in hopes of beating rival Advanced Micro Devices (AMD) to the punch, an Intel engineer told attendees at the Hot Chips conference today.

Following the company's realization that its single-core processors had hit a wall, Intel engineers plunged headlong into designing the Smithfield dual-core chip in 2004, but they faced numerous challenges in getting that chip to market, according to Jonathan Douglas, a principal engineer in Intel's Digital Enterprise Group, which makes chips for office desktops and servers.

"We faced many challenges from taking a design team focused on making the highest-performing processors possible to one focused on multicore designs," Douglas said in a presentation on Intel's Pentium D 800 series desktop chips and the forthcoming Paxville server chip, both of which are based on the Smithfield core.

Same Old Bus

Intel was unable to design a new memory bus in time for the dual-core chip, so it kept the same bus structure that older Pentium 4 chips used, Douglas said at the conference at Stanford University. This bus could support two separate single-core processors, but it was far less efficient than either the dual independent buses that will appear on the Paxville processors or the integrated memory controller used on AMD's chips. The memory bus, or front-side bus, connects an Intel processor to main memory.

All of Intel's testing tools and processes had been designed for single-core chips, Douglas said. As a result, the company had to quickly devise a new testing methodology for dual-core chips that could measure the connections between both cores.
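Intel's internal validation tools aren't public, but the basic starting point of any multicore-aware test harness, knowing how many logical processors map onto how many physical cores, can be illustrated in userspace. The following is a hedged sketch that parses the text format of Linux's /proc/cpuinfo; the parsing logic is an illustration, not Intel's methodology.

```python
# Illustrative only: count logical processors and distinct physical cores
# from /proc/cpuinfo text. A dual-core chip like the Pentium D shows up as
# two logical CPUs sharing one "physical id" but with different "core id"s.
import os

def core_topology(cpuinfo_text):
    """Return (logical_cpus, physical_cores) parsed from /proc/cpuinfo text."""
    logical = 0
    cores = set()
    phys_id = None
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "processor":
            logical += 1          # each "processor" stanza is one logical CPU
            phys_id = None
        elif key == "physical id":
            phys_id = value
        elif key == "core id":
            cores.add((phys_id, value))
    # Older kernels/CPUs may omit core ids; fall back to the logical count.
    return logical, len(cores) if cores else logical

if __name__ == "__main__" and os.path.exists("/proc/cpuinfo"):
    with open("/proc/cpuinfo") as f:
        print(core_topology(f.read()))
```

On a dual-core Pentium D the function would report two logical CPUs and two physical cores, whereas a Hyper-Threaded single-core Pentium 4 would report two logical CPUs on one core.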

In addition, engineers had to design a new package for the Pentium D chips that could accommodate both cores. "We're putting two cores in one package; it's like trying to fit into the pair of pants you saved from college," Douglas said.

Another Design Preferred

Intel would have preferred a design that put two separate pieces of silicon in a single package, like the approach planned for a future desktop chip called Presler, but its packaging team simply didn't have time to get that in place for Smithfield, Douglas said.

The company's Pentium D processors consist of two Pentium 4 cores placed closely together on a single silicon die. The design creates some problems, since dual-core processors must have some logic that coordinates the actions of both cores, and those transistors must go somewhere in an already small package, Douglas said. This complication led to signaling problems that needed to be overcome, he said.

Intel also had to design special thermal diodes into the chip to closely monitor the heat emitted by the combination of two fast processor cores, Douglas said.
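On-die thermal diodes of this kind are the ancestors of the per-core temperature sensors that modern Linux kernels expose through the coretemp driver. As a hedged illustration, the sketch below reads those readings from the sysfs hwmon interface; the /sys/class/hwmon layout is an assumption about a modern Linux system, not Intel's own monitoring interface.

```python
# Illustrative only: enumerate per-sensor temperature readings from the
# Linux hwmon sysfs interface. tempN_input files hold integer millidegrees
# Celsius; a matching tempN_label file (e.g. "Core 0") may name the sensor.
import glob
import os

def millideg_to_celsius(raw):
    """hwmon tempN_input files report integer millidegrees Celsius."""
    return int(raw) / 1000.0

def read_core_temps(hwmon_root="/sys/class/hwmon"):
    """Return {sensor_label: degrees_C} for every readable temp input."""
    temps = {}
    for path in glob.glob(os.path.join(hwmon_root, "hwmon*", "temp*_input")):
        label_path = path.replace("_input", "_label")
        try:
            with open(path) as f:
                value = millideg_to_celsius(f.read().strip())
            if os.path.exists(label_path):
                with open(label_path) as f:
                    label = f.read().strip()
            else:
                label = path  # fall back to the raw path if unlabeled
            temps[label] = value
        except OSError:
            continue  # some sensors are unreadable without privileges
    return temps
```

On a machine without hwmon sensors (or with an empty root) the function simply returns an empty dictionary rather than failing.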

Ultimately, Intel completed the Smithfield processor core in nine months, Douglas said. By Intel's standards, that is an extremely short development time for a major processor design, said Kevin Krewell, editor in chief of The Microprocessor Report in San Jose, California.

"Most designs take years," Krewell said. "But it was very important for them to get back in the game and have a road map."

Timeline

Intel began to put together the Smithfield project around the time it publicly announced (in May 2004) plans to cancel two future single-core designs and concentrate on multicore chips. The company realized that wringing more clock speed out of its single-core designs would require a significant engineering effort to deal with the excessive heat given off by such chips.

At the time, AMD had already started work on a dual-core version of its Opteron server processor, which it subsequently demonstrated in September of that year. AMD unveiled its dual-core Opteron chip in April, a few days after Intel launched Smithfield. AMD has since released dual-core desktop chips.

One reason for Intel's aggressive schedule for developing Smithfield was the company's need to respond to AMD's actions, Douglas said, without mentioning AMD by name. "We needed a competitive response. We were behind," he said.

Despite the rush, Smithfield was good enough to get Intel into the dual-core era, Krewell said. "It's not an optimal solution, but it's a viable solution. It works, and it works reasonably well," he said.

Intel took a little more time designing the server version of Smithfield, known as Paxville, Douglas said. For instance, the company addressed the bus inefficiencies by designing Paxville to use dual-independent front-side buses. Also, the more sophisticated package was available in time for Paxville, reducing the chip's power consumption, he said.

Paxville will be released ahead of schedule later this year in separate versions for two-way servers and for servers with four or more processors. Though Intel had originally expected to release the chip in 2006, it announced Monday that it will get Paxville out the door in the second half of this year. Another dual-core server processor, code-named Dempsey, will be released in the first quarter of 2006.

Future multicore designs will present additional challenges, Douglas said. Point-to-point buses and integrated memory controllers have been prominent features of other multicore designs, such as Opteron and the Cell processor. These designs help improve performance, but they require a larger number of pins to deliver power to the processor, and that can hurt yields, he said.

By Tom Krazit
IDG News Service
