First Dual-Core Pentium 4 a Rush Job, Intel Says

Intel's first dual-core chip was a hastily concocted design that was rushed out the door in hopes of beating rival Advanced Micro Devices (AMD) to the punch, an Intel engineer told attendees at the Hot Chips conference today.

Following the company's realization that its single-core processors had hit a wall, Intel engineers plunged headlong into designing the Smithfield dual-core chip in 2004, but they faced numerous challenges in getting that chip to market, according to Jonathan Douglas, a principal engineer in Intel's Digital Enterprise Group, which makes chips for office desktops and servers.

"We faced many challenges from taking a design team focused on making the highest-performing processors possible to one focused on multicore designs," Douglas said in a presentation on Intel's Pentium D 800 series desktop chips and the forthcoming Paxville server chip, both of which are based on the Smithfield core.

Same Old Bus

Intel was unable to design a new memory bus in time for the dual-core chip, so it kept the same bus structure that older Pentium 4 chips used, Douglas said at the conference at Stanford University. This bus could support two separate single-core processors, but it was far less efficient than either the dual-independent buses that will appear on the Paxville processors or the integrated memory controller used on AMD's chips. On Intel's chips, the memory bus, or front-side bus, connects the processor to memory.

All of Intel's testing tools and processes had been designed for single-core chips, Douglas said. As a result, the company had to quickly devise a new testing methodology for dual-core chips that could measure the connections between both cores.

In addition, engineers had to design a new package for the Pentium D chips that could accommodate both cores. "We're putting two cores in one package; it's like trying to fit into the pair of pants you saved from college," Douglas said.

Another Design Preferred

Intel would have preferred a design that put two separate pieces of silicon in a single package, like the approach that will be used for a future desktop chip called Presler, but its packaging team simply didn't have time to get that in place for Smithfield, Douglas said.

The company's Pentium D processors consist of two Pentium 4 cores placed closely together on a single silicon die. The design creates some problems, since dual-core processors must have some logic that coordinates the actions of both cores, and those transistors must go somewhere in an already small package, Douglas said. This complication led to signaling problems that needed to be overcome, he said.

Intel also had to design special thermal diodes into the chip to closely monitor the heat emitted by the combination of two fast processor cores, Douglas said.

Ultimately, Intel completed the Smithfield processor core in nine months, Douglas said. By Intel's standards, that is an extremely short development time for a major processor design, said Kevin Krewell, editor in chief of The Microprocessor Report in San Jose, California.

"Most designs take years," Krewell said. "But it was very important for them to get back in the game and have a road map."

Intel began to put together the Smithfield project around the time it publicly announced (in May 2004) plans to cancel two future single-core designs and concentrate on multicore chips. The company realized that wringing more clock speed out of its single-core designs would require a significant engineering effort to deal with the excessive heat given off by such chips.

At the time, AMD had already started work on a dual-core version of its Opteron server processor, which it subsequently demonstrated in September of that year. AMD unveiled its dual-core Opteron chip in April, a few days after Intel launched Smithfield. AMD has since released dual-core desktop chips.

One reason for Intel's aggressive schedule for developing Smithfield was the company's need to respond to AMD's actions, Douglas said, without mentioning AMD by name. "We needed a competitive response. We were behind," he said.

Despite the rush, Smithfield was good enough to get Intel into the dual-core era, Krewell said. "It's not an optimal solution, but it's a viable solution. It works, and it works reasonably well," he said.

Intel took a little more time designing the server version of Smithfield, known as Paxville, Douglas said. For instance, the company addressed the bus inefficiencies by designing Paxville to use dual-independent front-side buses. Also, the more sophisticated package was available in time for Paxville, reducing the chip's power consumption, he said.

Paxville will be released ahead of schedule later this year in separate versions for two-way servers and for servers with four or more processors. Though Intel had originally expected to release the chip in 2006, it announced Monday that it will get Paxville out the door in the second half of this year. Another dual-core server processor, code-named Dempsey, will be released in the first quarter of 2006.

Future multicore designs will present additional challenges, Douglas said. Point-to-point buses and integrated memory controllers have been prominent features of other multicore designs, such as Opteron and the Cell processor. These designs help improve performance, but they require a larger number of pins to deliver electricity into the processor, and that can hurt yields, he said.

By Tom Krazit
IDG News Service

More in Tux Machines

lkml: remove eight obsolete architectures

In the end, it seems that while the eight architectures are extremely different, they all suffered the same fate: There was one company in charge of an SoC line, a CPU microarchitecture and a software ecosystem, which was more costly than licensing newer off-the-shelf CPU cores from a third party (typically ARM, MIPS, or RISC-V). It seems that all the SoC product lines are still around, but have not used the custom CPU architectures for several years at this point.

If you hitch a ride with a scorpion… (Coverity)

I haven’t seen a blog post or notice about this, but according to the Twitters, Coverity has stopped supporting online scanning for open source projects. Is anybody shocked by this? Anybody? [...] Not sure what the story is with Coverity, but it probably has something to do with 1) they haven’t been able to monetize the service the way they hoped, or 2) they’ve been able to monetize the service and don’t fancy spending the money anymore or 3) they’ve pivoted entirely and just aren’t doing the scanning thing. Not sure which, don’t really care — the end result is the same. Open source projects that have come to depend on this now have to scramble to replace the service. [...] I’m not going to go all RMS, but the only way to prevent this is to have open tools and services. And pay for them.

Easily Fund Open Source Projects With These Platforms

Financial support is one of the many ways to help the Linux and open source community. This is why you see a "Donate" option on the websites of most open source projects. While big corporations have the necessary funding and resources, most open source projects are developed by individuals in their spare time. That still requires effort and time, and it often involves some overhead costs too. Monetary support surely helps drive project development. If you would like to support open source projects financially, let me show you some platforms dedicated to open source and/or Linux.

KDE: Kdenlive, Kubuntu, Elisa, KDE Connect

  • Kdenlive Café #27 and #28 – You can’t miss it
    Timeline refactoring, new Pro features, packages for fast and easy install, Windows version and a bunch of other activities are happening in the Kdenlive world NOW!
  • Kubuntu 17.10 Guide for Newbie Part 9
    This is the ninth and final article of the series. It gives you more documentation to help you use Kubuntu 17.10. The resources are online links to manuals and ebooks covering Kubuntu basics, command-line usage, software installation instructions, and how to operate LibreOffice and KDE Plasma.
  • KDE's Elisa Music Player Preparing For Its v0.1 Release
    We have been tracking the development of Elisa, one of several KDE music players, since development started about one year ago. Following the recent alpha releases, the Elisa v0.1 stable release is on the way, and its developers plan to have it out around the middle of April.
  • KDE Connect Keeps Getting Better For Interacting With Your Desktop From Android
    KDE Connect is the exciting project that allows you to leverage your KDE desktop from Android tablets/smartphones for features like sending/receiving SMS messages from your desktop, toggling music, sharing files, and much more. KDE Connect continues to get even better.
  • First blog & KDE Connect media control improvements
    I started working on KDE Connect last November. My first big features were released yesterday in KDE Connect 1.8 for Android, so cause for celebration and a blog post! My first big feature is media notifications. KDE Connect has, since its inception, allowed you to remotely control your music and videos. Now you can also do this with a notification, like all Android music apps do! So next time a bad song comes up, you don't need to switch to the KDE Connect app. Just click next on the notification without closing your current app. And just in case you don't like notifications popping up, there's an option to disable it.