First Dual-Core Pentium 4 a Rush Job, Intel Says

Filed under: Hardware

Intel's first dual-core chip was a hastily concocted design that was rushed out the door in hopes of beating rival Advanced Micro Devices (AMD) to the punch, an Intel engineer told attendees at the Hot Chips conference today.

Following the company's realization that its single-core processors had hit a wall, Intel engineers plunged headlong into designing the Smithfield dual-core chip in 2004, but they faced numerous challenges in getting that chip to market, according to Jonathan Douglas, a principal engineer in Intel's Digital Enterprise Group, which makes chips for office desktops and servers.

"We faced many challenges from taking a design team focused on making the highest-performing processors possible to one focused on multicore designs," Douglas said in a presentation on Intel's Pentium D 800 series desktop chips and the forthcoming Paxville server chip, both of which are based on the Smithfield core.

Same Old Bus

Intel was unable to design a new memory bus in time for the dual-core chip, so it kept the same bus structure that older Pentium 4 chips used, Douglas said at the conference at Stanford University. That front-side bus, which on Intel's chips connects the processor to memory, could support two separate single-core processors, but it was far less efficient than either the dual-independent buses that will appear on the Paxville processors or the integrated memory controller used on AMD's chips.
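To see why a single shared bus hurts, consider a deliberately simplified back-of-the-envelope model. The cycle and request counts below are illustrative assumptions, not Smithfield's real timings; the point is only that traffic serialized over one shared bus takes roughly twice as long as the same traffic spread over two independent buses.

```python
# Toy model: two cores issuing memory requests over a shared front-side bus
# versus two independent buses. All numbers are assumptions for illustration.

BUS_CYCLES_PER_REQUEST = 4    # assumed cost of one memory transaction
REQUESTS_PER_CORE = 1000      # assumed per-core memory traffic

def shared_bus_cycles(cores: int) -> int:
    # Every request from every core must take its turn on the one bus.
    return cores * REQUESTS_PER_CORE * BUS_CYCLES_PER_REQUEST

def independent_bus_cycles(cores: int) -> int:
    # Each core has its own bus, so the cores' traffic overlaps in time.
    return REQUESTS_PER_CORE * BUS_CYCLES_PER_REQUEST

print("shared bus:       ", shared_bus_cycles(2), "cycles")       # 8000
print("independent buses:", independent_bus_cycles(2), "cycles")  # 4000
```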

All of Intel's testing tools and processes had been designed for single-core chips, Douglas said. As a result, the company had to quickly devise a new testing methodology for dual-core chips that could measure the connections between both cores.
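The article doesn't describe Intel's validation tooling, but a rough idea of what "measuring the connections between both cores" means can be sketched in user space. The sketch below is a hypothetical, Linux-only stress test that pins two processes to different cores and bounces messages between them, so every round trip crosses the core-to-core path; none of the names here come from Intel.

```python
# Hypothetical sketch (Linux-only): bounce messages between two pinned
# processes so every round trip exercises the path between core 0 and core 1.
import os
from multiprocessing import Pipe, Process

ROUNDS = 10_000  # assumed number of cross-core round trips

def pinned_echo(conn, cpu: int) -> None:
    os.sched_setaffinity(0, {cpu})   # pin this worker to one core
    for _ in range(ROUNDS):
        conn.send(conn.recv())       # echo each message straight back

if __name__ == "__main__":
    parent, child = Pipe()
    worker = Process(target=pinned_echo, args=(child, 1))
    worker.start()
    os.sched_setaffinity(0, {0})     # keep the driving process on core 0
    for i in range(ROUNDS):
        parent.send(i)
        assert parent.recv() == i    # each reply crossed from core 1 back to core 0
    worker.join()
    print(f"{ROUNDS} cross-core round trips completed")
```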

In addition, engineers had to design a new package for the Pentium D chips that could accommodate both cores. "We're putting two cores in one package; it's like trying to fit into the pair of pants you saved from college," Douglas said.

Another Design Preferred

Intel would have preferred a design that put two separate pieces of silicon in a single package, like the approach that will be used for a future desktop chip called Presler, but its packaging team simply didn't have time to get that in place for Smithfield, Douglas said.

The company's Pentium D processors consist of two Pentium 4 cores placed closely together on a single silicon die. The design creates some problems, since dual-core processors must have some logic that coordinates the actions of both cores, and those transistors must go somewhere in an already small package, Douglas said. This complication led to signaling problems that needed to be overcome, he said.
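Smithfield's actual coordination logic isn't public, but the flavor of the problem can be shown with a toy round-robin bus arbiter: when both cores want the shared bus in the same cycle, exactly one is granted it, and priority alternates so neither core starves. Everything below is an assumption for illustration, not Intel's design.

```python
# Toy round-robin arbiter (illustrative, not Intel's design): decide which
# core gets the shared bus each cycle when both may request it at once.
def arbitrate(requests, last_winner=1):
    """requests: list of (core0_wants, core1_wants) pairs, one per cycle."""
    grants = []
    for core0, core1 in requests:
        if core0 and core1:
            winner = 1 - last_winner   # contention: alternate the grant
        elif core0:
            winner = 0
        elif core1:
            winner = 1
        else:
            winner = None              # bus idle this cycle
        if winner is not None:
            last_winner = winner
        grants.append(winner)
    return grants

print(arbitrate([(True, True), (True, True), (False, True), (True, True)]))
# -> [0, 1, 1, 0]: contended cycles alternate between the cores
```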

Intel also had to design special thermal diodes into the chip to closely monitor the heat emitted by the combination of two fast processor cores, Douglas said.
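The article doesn't say how that diode data reaches software, but on a modern Linux system per-core temperatures from sensors like these are typically read through the coretemp hwmon driver's sysfs files. The sketch below assumes that interface; it is a convention of the Linux kernel, not an Intel-documented API.

```python
# Read per-core temperatures via Linux's coretemp hwmon sysfs interface.
# Paths follow the hwmon convention; raw values are millidegrees Celsius.
import glob

def read_core_temps() -> dict:
    temps = {}
    for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            with open(f"{hwmon}/name") as f:
                if f.read().strip() != "coretemp":
                    continue
        except OSError:
            continue
        for label_path in glob.glob(f"{hwmon}/temp*_label"):
            with open(label_path) as f:
                label = f.read().strip()            # e.g. "Core 0"
            with open(label_path.replace("_label", "_input")) as f:
                temps[label] = int(f.read()) / 1000.0
    return temps

for core, celsius in sorted(read_core_temps().items()):
    print(f"{core}: {celsius:.1f} C")
```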

Ultimately, Intel completed the Smithfield processor core in nine months, Douglas said. By Intel's standards, that is an extremely short development time for a major processor design, said Kevin Krewell, editor in chief of The Microprocessor Report in San Jose, California.

"Most designs take years," Krewell said. "But it was very important for them to get back in the game and have a road map."

Timeline

Intel began to put together the Smithfield project around the time it publicly announced (in May 2004) plans to cancel two future single-core designs and concentrate on multicore chips. The company realized that wringing more clock speed out of its single-core designs would require a significant engineering effort to deal with the excessive heat given off by such chips.

At the time, AMD had already started work on a dual-core version of its Opteron server processor, which it subsequently demonstrated in September of that year. AMD unveiled its dual-core Opteron chip in April, a few days after Intel launched Smithfield. AMD has since released dual-core desktop chips.

One reason for Intel's aggressive schedule for developing Smithfield was the company's need to respond to AMD's actions, Douglas said, without mentioning AMD by name. "We needed a competitive response. We were behind," he said.

Despite the rush, Smithfield was good enough to get Intel into the dual-core era, Krewell said. "It's not an optimal solution, but it's a viable solution. It works, and it works reasonably well," he said.

Intel took a little more time designing the server version of Smithfield, known as Paxville, Douglas said. For instance, the company addressed the bus inefficiencies by designing Paxville to use dual-independent front-side buses. Also, the more sophisticated package was available in time for Paxville, reducing the chip's power consumption, he said.

Paxville will be released ahead of schedule later this year in separate versions for two-way servers and for servers with four or more processors. Though Intel had originally expected to release the chip in 2006, it announced Monday that it will get Paxville out the door in the second half of this year. Another dual-core server processor, code-named Dempsey, will be released in the first quarter of 2006.

Future multicore designs will present additional challenges, Douglas said. Point-to-point buses and integrated memory controllers have been prominent features of other multicore designs, such as Opteron and the Cell processor. These designs help improve performance, but they require a larger number of pins to deliver electricity into the processor, and that can hurt yields, he said.

By Tom Krazit
IDG News Service
