25 thoughts on “The End Of The AMD/Intel War?”

  1. INVENTORS – DO NOT TRUST INTEL

    I invented a CPU cooler – 3 times better than the best – better than water. Intel have major CPU cooling problems – “Intel’s microprocessors were generating so much heat that they were melting” (iht.com) – I try to talk to them – they send my communications to my competitor & will not talk to me.

    Winners of a major ‘Corporate Social Responsibility’ award.

    Huh!!!!
    When did RICO get repealed?

    Be advised
    1) I am prepared to kill to protect my IP (Intel HAVE NOT stolen it AFAIK – so you can’t Sean Dix me) and
    2) I am prepared to die to get TRUE patent reform.

    IPROAG – The Intellectual Property Rightful Owners Action Group.
    The One Dollar Patent.

  2. Without AMD we would not have 64-bit X86. Intel was trying to push Merced/IA-64/Itanium into the market. Athlon and Opteron were fantastic products at the time.

    There were an awful lot of weird business decisions at AMD after Hector Ruiz was made CEO following the retirement of Jerry Sanders: purchasing ATI at an outrageous price just before the stock market collapsed, lagging research, and selling their manufacturing plants to Abu Dhabi investors and spinning them off as GlobalFoundries. Ruiz is also involved in an insider trading scandal (the Galleon case), and some people suspect the AMD–ATI merger involved shady dealings connected to the Galleon Group as well. Hector Ruiz left no love behind when he left Motorola Semiconductor, either.

    Dirk Meyer (the Athlon’s chief designer) seems to have been doing a decent job since Ruiz left, and AMD actually has a credible hardware roadmap now. But the damage has been done.

  3. If AMD is to pass, then isn’t this the natural course of events in a free market? Creative destruction. There is plenty of opportunity for new upstarts to bring new ideas and innovation to the marketplace. For instance, the ARM family is slowly coming up from below, entering the netbook and perhaps eventually the laptop market.

    The biggest problem right now is that the dominant operating system, Windows, is intimately tied to the x86 architecture. There is really no technical reason for this (early versions of Windows NT ran on non-x86 hardware), but for all practical purposes, as far as the consumer is concerned, it is only available for x86 devices. If an operating system more friendly to different chip architectures could take hold it would spur tremendous opportunities for innovation and competition on the hardware side of things.

  4. “If an operating system more friendly to different chip architectures could take hold it would spur tremendous opportunities for innovation and competition on the hardware side of things.”

    I believe you’ll find it’s called Linux. I have it running on two Intel x86 CPUs, one AMD x86, an ARM and a couple of MIPS (I believe, from what I can figure out of the hardware). The router runs it too, though I can’t tell whether that’s ARM or MIPS without opening up the box.

    But that rather highlights the problem: Linux is open source, so it can easily run on many different CPUs – the software is just recompiled and any hardware-dependent bugs fixed. Most Windows software is closed source, so if you buy SuperWidgetPro x86 for $500 you’re stuck with an x86 chip; you can’t buy an ARM machine next time and transfer your software over (see the sketch below).

    It’s the closed-source software that keeps people tied to Windows on x86, not the fact that desktop Windows won’t run on ARM… so if Intel were the only people making x86 chips, well, that would be a good time to own Intel stock.
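
    A minimal sketch of what “just recompile it” means in practice (the file name, the architecture check, and the arm-linux-gnueabihf-gcc cross-compiler name are only illustrative – any cross toolchain would do): the identical C source builds natively for x86 or for ARM, which is exactly the move a closed-source binary can never make.

        /* hello.c – the same source builds for x86 or ARM unchanged;
           only the compiler invocation differs. */
        #include <stdio.h>

        int main(void)
        {
        #if defined(__x86_64__) || defined(__i386__)
            const char *arch = "x86";
        #elif defined(__arm__) || defined(__aarch64__)
            const char *arch = "ARM";
        #else
            const char *arch = "some other architecture";
        #endif
            printf("Hello from %s\n", arch);
            return 0;
        }

        /* Native build:     gcc -o hello hello.c                     */
        /* ARM cross-build:  arm-linux-gnueabihf-gcc -o hello hello.c */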

  5. “If AMD is to pass, then isn’t this the natural course of events in a free market? ”

    Yes, but a perfectly free market isn’t the goal either. You end up with companies that are “too big to fail” in that case. Intel is surely one of them at the moment.

    IMNSHO, all corporate taxes should be at least somewhat pegged to questions like “How many competitors do you have?” and “What market share do you have?”

    Monopoly -> highest tax bracket.
    Cartel/oligopoly -> pretty darn close to monopoly taxes.
    10,000 competitors -> no taxes.

    This gives companies at least a slight interest in not completely crushing the other guys. You don’t have to -help-, but you wouldn’t be quite so interested in going out of your way to use near-monopolistic power to squeeze the last breath out of them. A rough sketch of the idea is below.
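
    Purely to make the brackets concrete (every threshold and rate here is invented for illustration – nothing above specifies real numbers), the idea boils down to a function from market concentration to a tax rate:

        #include <stdio.h>

        /* Toy sketch of the proposal: tax rate pegged to market
           concentration.  All brackets and rates are made up. */
        static double tax_rate(double market_share, int competitors)
        {
            if (market_share > 0.90 || competitors < 1)   /* monopoly           */
                return 0.45;
            if (market_share > 0.40 || competitors < 5)   /* cartel / oligopoly */
                return 0.40;
            if (competitors >= 10000)                     /* wide-open market   */
                return 0.00;
            return 0.20;                                  /* everything else    */
        }

        int main(void)
        {
            printf("Monopoly:     %2.0f%%\n", 100 * tax_rate(0.95, 1));
            printf("Oligopoly:    %2.0f%%\n", 100 * tax_rate(0.45, 3));
            printf("10,000 firms: %2.0f%%\n", 100 * tax_rate(0.0001, 10000));
            return 0;
        }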

  6. I remember how AMD trumpeted that they were going to create the first quad-core chip. When Intel beat them to it, AMD whined (IMO) that the Intel chip was basically two conjoined dual-cores and not a “true” quad-core.

    True quad-core or not, it beat the quad Opteron to market by months or years. I haven’t bought an AMD chip since.

    (Too bad that DEC wasn’t nimble enough with the Alpha to challenge Intel. IIRC there was talk of Samsung buying the Alpha franchise, but Samsung was in financial trouble when the opportunity arose.)

  7. Yes, but a perfectly free market isn’t the goal either. You end up with companies that are “too big to fail” in that case.

    There’s no such thing as “too big to fail” in a perfectly free market.

  8. My current and two previous processors are AMD, but AMD can’t compete with Intel’s current architecture unless they create something as powerful and as low-heat as the dual cores.

    I can’t speak technically, but there’s no way AMD can compete with Intel without producing way too much heat.

    I too hope AMD gets around to doing something brilliant and is willing to take the time and effort to let others realize just how brilliant it is.

    But Intel owns the processor market now; I will not buy/assemble another system with an AMD processor.

    There are other markets, but it is the processors that build the brand.

  9. gs: It was true. Intel did not make a true quad-core processor before AMD; Nehalem (Core i7) was Intel’s first single-die quad core. If putting two dual-core dies in the same multi-chip module counts as one processor, then the IBM POWER5, with four dies in a multi-chip module, was an octa-core processor. In practical terms, however, people got four cores in their desktop machine, and that was what mattered. Intel’s manufacturing prowess made the solution cheap enough.

    AMD’s problems went way beyond that, however. Their per-core performance has sucked ever since Intel launched the Core processor, and the gap has only increased since then. Barcelona (the first AMD quad core) was plagued with hardware bugs on release. My current computer has an Intel processor, while my previous computer was AMD.

    The current CEO of AMD, Dirk Meyer, was one of the designers of the Alpha 21064 and 21264 processors. The Athlon originally even used the DEC EV6 bus. People used to joke that the Athlon was a cheap x86-compatible Alpha processor.

  10. Thanks for your response, Godzilla. I’d be interested in your thoughts about why Alpha failed. The conventional wisdom, that it was due to DEC’s abysmal marketing and sales skills, rings true to me. By the time I finally saw a single persuasive Alpha commercial, the handwriting was already on the wall.

    (In 1996 I upgraded from my 486 to an Alpha/NT. It was like upgrading from a motor scooter to a Harley.)

  11. Hmm, not sure if AMD buying ATI was all that bad a move. Radeon GPUs are sitting at the top of the performance charts right now, giving Nvidia some serious fits. I have an older Radeon 3870 and was able to overclock it an additional 250 MHz on the stock cooler.

    As far as processors go, the AMD processor is still sitting pretty on the performance-vs-price charts – bang for the buck, in other words. The Phenom II Black Edition processors sit right alongside many of the Core 2s. Processors nowadays generally outpace the needs of the average end user. You don’t need the latest, greatest, most expensive processor to run the newest games like you used to. And if all you do is browse the internet and read email, well, I see P3 processors in older systems that still fill the role adequately. So bang-for-the-buck metrics count for a lot nowadays, and in that light it still really is a matter of personal taste as to which manufacturer you want to go with. If you’re someone who likes to thump that big dick processor out on the table to impress your friends and gloat that you dropped $900 on just a processor, then yeah, Intel is the way to go. But at the end of the day it really only gives you the upper hand in maybe 20% of potential situations. Otherwise, older, slower processors will perform quite well the remaining 80% of the time.

    So really it is a question of who is poised to produce processors in the most efficient manner possible. I think this is where AMD has shown time and again that they can fill the value-minded niche and go right on letting Intel expend the money and energy breaking new ground. In other words, I’m not quite convinced that AMD is going anywhere any time soon.

  12. Josh, don’t forget plenty of people use computers for work, and in areas where sheer scalar speed matters. Simulation, rendering, math computation: these are all areas where serious computing power is a Good Thing, and we are certainly not limited by bus speed or memory bandwidth.

    I loved the Alpha when it came out. An Alpha workstation could hold its own with a Cray 90. Sweet. I don’t know why it crapped out. For that matter, it’s a mystery why DEC itself — which was the hardware geek’s haven, in the old days, the creme de la creme of chip engineering — crapped out. Hard not to conclude that IBM’s philosophy, which is that the company is run by salesmen, not engineers, may be the only long-term strategy. Sad but true.

  13. While I agree that salesmen are often good CEO material (e.g. Richard Branson), they provide no guarantee of success. Informix went down the tubes when the original technically oriented founder left and was replaced by a salesman. Some think Microsoft is going through the same process. Google is considered a technically oriented company.

    IBM has had a good record of investing in new areas and divesting unprofitable lines of business. They had enough foresight to enter the PC market, and enough foresight to leave it, for example. They sold their hard-disk division even though they had helped pioneer GMR head technology years before. It helps that they are a consulting business; most of their profit doesn’t come from hardware, which has increasingly had its margins squashed.

  14. I hope that it can survive, or that someone else takes up the cause.

    It looks like nVidia may be making a go of it with GPUs… Guess we’ll see.

  15. Well, nothing is a guarantee of success, God. But my impression is that, all other things being equal, if you’re run by salesmen clever enough to climb to the top of a pile of geeks, then you’re probably in good shape.

    IBM is a wonder to me. They started off making adding machines, then went to typewriters, then electric typewriters, then big iron, then PCs, then back to big iron and some weird combination of consulting and services. I don’t know how they do it, but they’re one of the most agile technology-intensive companies ever. What they make their money on in 2009 bears little resemblance to what they made their money on in 1990 or 1920.

    I wouldn’t actually buy stock in Google. I think they only make money now because of their lock on the search market, and I haven’t been convinced that they’re innovative enough in their basic money-making operation, which is search, to continue to hold a dominant position. They seem to spend a little too much effort coming up with cool new apps that generate a lot of buzz among technophiles — but for which no one actually pays money.

    This is actually my point. My impression is that at IBM there are some practically-minded boring people near the top who, when they get some breathless demonstration by the bitheads of a Cool New Tech, annoyingly say Gosh, yes, that’s all wonderful, Dr. Cleverbean, we look forward to your Nobel acceptance speech, but will this make IBM money? Will people fork over their hard-earned cash to have it? On the other hand, I get the feeling the top folks at Google say Zowie! Let’s fly you out to give a TED talk! We’ll roll it out by Christmas, and…er…figure out how to monetize it later.

    Now, myself, I’m a bithead. I love clever ideas. I’m sad when they don’t come to market because the bulk of humanity doesn’t need them for surfing online pr0n or e-mailing Aunt Tilly about the upcoming annual Thanksgiving family miniature golf tournament. Alas, Babylon. But, on the other hand, I have a deep appreciation for the winner-picking cool judgment of a talented sales-driven manager, and I’d want to work for a company run by one. I may be disappointed that my own cool new ideas are shelved, but I’d know my job is as secure as it can be.

  16. Well, evil Bill, the real thing “keeping people on x86” is that there isn’t any competition with the architecture.

    ARM is, for general personal computing, pretty much a joke – and if you really want low TDP, you can get an Atom, a VIA Nano, or one of the low-power Athlons.

    PowerPC can’t compete. SPARC can’t compete. MIPS can’t compete.

    That one can run Linux on a notional toaster is nice, but irrelevant to pretty much everyone – and there’s no reason the toaster can’t be running a tiny embedded x86 chip just as easily as an ARM.

    Carl: Which Alphas, though? I had a Multia in the late 90s, and it was slower than a Pentium 90 for anything but FP math – and I can’t imagine it even being in the same category as a contemporary Cray. The 21164AA was not exactly a powerhouse, even for its day.

    Josh: I see AMD’s big problem as being that it has nothing that really competes with the i5/i7. You don’t need to spend $900 to get a great CPU from Intel – the base i7-920 is about $280, and AMD has nothing at that level of computing power at any price.

    AMD can compete on value, for now, but that won’t last. Their roadmap suggests nothing competitive until 2011… hopefully they’ll do alright. Competition will do everyone good, and I’m no Intel partisan.

  17. GS said, “….(In 1996 I upgraded from my 486 to an Alpha/NT. It was like upgrading from a motor scooter to a Harley.)…”

    I still use a 486SX at 16 MHz almost every day. It never crashes. (Portege 3400ct).

  18. This article misunderstands the history. The reason for AMD’s rise, plain and simple, wasn’t just that the Athlon was a great design (it was), but that Intel screwed up. Intel decided that they’d get away from that pesky x86 competition by moving everybody to a new, superior “Itanium” architecture. Only nobody wanted to move, and delivering a new architecture was harder than they thought, and in general the whole thing was a hilarious point-for-point reproduction of all of IBM’s errors with OS/2, Microchannel, and the PC market in general – right down to Intel purposely making x86 designs that were unsuitable for anything much larger than a desktop.

    AMD was able to occupy that vacuum with the Athlon/Opteron, despite their structural disadvantages. When Intel finally grasped their errors, they had to ditch the Pentium 4-based designs entirely and adapt their Pentium M mobile designs for server use (!) – these became the “Core” 5100 series. With the 5500/Nehalem, they’ve finally fixed the bus/memory issues as well (essentially adopting the same sort of design as the Opteron). I had expected AMD to be pretty well hosed at this point, but they’re hanging in there pretty well with the 6-core Opterons. I’m not sure anything will withstand the Nehalem-EX when that 8-core monster comes out next spring, though… as others have noted, the challenge may be some kind of niche non-x86 coming from the low end (e.g. Tilera’s 100-core processor).

  19. Sigi, I had a DEC AlphaStation, so it was the 21064 chip, I believe. This was in 1995. I don’t know squat about the Multia, but… it’s important to note that programming and compiling for the Alpha was quite different from a normal CISC chip. I was writing Fortran code, but even so, my experience was that you had to use the DEC Fortran compiler to get the speed. The GNU compiler, for example, produced sluggish code. I expect it still does, but people don’t care enough any more to spend the big bucks required to pay for a superb compiler. (The DEC compiler was $1000 per license in 1995, so maybe $1500 now. Imagine that!)

    But in any event, FP math is the sine qua non of scientific computing. What else is there? I used to write weird custom code to avoid making library calls for sines and cosines.
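
    Not the actual code from back then, obviously, but the flavor of the trick is easy to sketch (the function name and the number of terms are just illustrative): a short polynomial approximation trades the library call for a few multiplies, assuming the argument has already been reduced to a small range.

        #include <stdio.h>

        /* Sketch of the old trick: approximate sin(x) with the first few
           Taylor terms instead of calling the math library.  Assumes x has
           already been reduced to roughly [-pi/2, pi/2]; good to several
           decimal places, which was often plenty. */
        static double fast_sin(double x)
        {
            double x2 = x * x;
            /* Horner form of x - x^3/3! + x^5/5! - x^7/7! */
            return x * (1.0 - x2 / 6.0 * (1.0 - x2 / 20.0 * (1.0 - x2 / 42.0)));
        }

        int main(void)
        {
            printf("fast_sin(0.5) = %.9f\n", fast_sin(0.5)); /* sin(0.5) = 0.479425539 */
            return 0;
        }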

    But that’s nothing. Early in my career I wrote machine code for a 6502 so that a miserable Apple IIe could control the stepper motors in an automated microscope stage I was using in a laser-annealing experiment.

    Ah, yes. The days when men were men, and programmers knew the twenty most used op codes of the chip by memory. Today we have to have garbage collection so that we don’t forget to deallocate memory and thereby shoot ourselves in the foot. Kids these days!

  20. Intel gets multi-core and the integrated memory controller as the big ones. AMD gets x86 and SSE, which are the big ones. If they hadn’t sorted this out, Intel wouldn’t be able to make dual/quad-core CPUs anymore, or the i5/i7 CPUs either… so we’d all go back to P3s. And AMD would have to come up with their own CPU architecture. So yeah, they really needed to figure this out. Glad to see that they did.

  21. One of the biggest fallacies around is that we wouldn’t have x86-64 without AMD. In 1996, if you’d assembled 100 teams of engineers familiar with the then-current state of x86 and told them each to come up with a design for a 64-bit version, nearly all of the documents produced would be interchangeable. Unless you were pursuing some wild Sony-esque concept, the path to producing a 64-bit extension of the Pentium was pretty straightforward.

    Alpha had a lot of potential but by that era DEC was a shambles. The company had trouble deciding what business it was in from one day to the next. Remember, this was the company that entered the PC business with a machine that couldn’t format a standard floppy. You had to buy overpriced pre-formatted floppies from DEC.

    Godzilla, the Intel Core 2 Quad product only failed to be a ‘true’ quad core if you were a sucker for silly AMD PR claims. It put four Core 2 cores in a single socket, and that is the sole definition that matters in the real world. AMD is starting to play this same game again with their Bulldozer product, introducing new definitions that only serve to obfuscate what the product actually does.

    Evil Twin, that rationale is nonsense. Having a pile of source code for the typical modern app is not going to let the average user move around between hardware architectures. Any decent software makes too much use of the hardware’s features to be simply recompiled elsewhere and expected to work. Consider Java: it was going to eliminate the need to know what kind of machine you had, because a Java app would run anywhere a VM was available. It didn’t work out that way, because early on the performance sucked unless you made calls to the local system, which ruined portability. Later, performance was much better, but the ante for apps had been raised considerably and run-anywhere stuff tended to be lacking.
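
    To make that concrete – a hypothetical snippet, not taken from any real app – the moment code reaches for x86 SSE intrinsics, handing someone the source stops being enough; it won’t even compile on ARM without rewriting those paths.

        /* Hypothetical illustration: source that leans on x86 SSE
           intrinsics builds fine with an x86 compiler but fails outright
           on an ARM toolchain, so "just recompile it" is not enough. */
        #include <xmmintrin.h>   /* x86-only header: SSE intrinsics */

        void add4(const float *a, const float *b, float *out)
        {
            /* Add four floats at once in one SSE register. */
            __m128 va = _mm_loadu_ps(a);
            __m128 vb = _mm_loadu_ps(b);
            _mm_storeu_ps(out, _mm_add_ps(va, vb));
        }

        /* On ARM the #include alone fails: there is no xmmintrin.h, no
           __m128, and no _mm_* functions.  Porting means rewriting such
           paths by hand (e.g. with NEON intrinsics). */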

Comments are closed.