Rumor: AMD's Low Cost K8 with Integrated Graphics in 2008?

Apparently, AMD has been talking about doing a very integrated, very low cost K8 derivative for 2008.  The CPU would feature an on-die memory controller like the current Athlon 64. However, it would also feature an on-die graphics core and I/O controller, effectively removing the need for any chipset on the motherboard. 

If you'll remember a few years back, Intel had a similar chip planned, code-named Timna.  Timna was supposed to integrate a graphics core and memory controller onto a single chip to drive total system costs down considerably, but Intel pulled the plug on the project at the last minute and shifted resources to what eventually became Centrino. 

AMD is definitely in a good position to piece together such a highly integrated CPU, given that they have already integrated the memory controller on-die with much success.  We do wonder where the graphics core would come from, as AMD would either have to design one from scratch or license the technology from another company.  Given that this type of CPU would be targeted at very low cost markets, it would almost have to be an in-house job.  Granted, this is a very early rumor that may not pan out at all, so take all of this with a grain of salt. 

In the more near-term future, AMD will be transitioning to an on-die DDR2 memory controller by the middle of next year with their new M2 Socket and Socket-F (for desktops and servers respectively).  The initial design guides for boards based on these new sockets have been given to motherboard manufacturers, but the first samples won't be ready until the end of this year. 

Finally, the last piece about AMD here involves Turion.  Either AMD isn't very serious about Turion right now, or manufacturers aren't too impressed with it, because we hardly heard any mention of the new mobile CPU from any of the notebook vendors at Computex.  Many product roadmaps going through to the end of this year were completely devoid of any mention of a Turion based notebook.

45 Comments

  • spinportal - Tuesday, June 14, 2005 - link

    It's not ironic or a surprise to see ATI or NVIDIA pushing chipsets without an integrated graphics solution, since it would cannibalize their wonderful Turbo PCIe cards! When was the last time Intel's i/g tech or VIA's S3 tech on an add-on board could compare or compete with any ATI or NVIDIA offerings? It's basic hubris - you want 3D? You buy our cards at additional cost. No free lunch for you! Reply
  • redhatlinux - Tuesday, June 14, 2005 - link

    Oooops that's FAB Reply
  • redhatlinux - Tuesday, June 14, 2005 - link

    Great article, couldn't expect anything less from the boss. Back in the day AMD produced their own chipsets, but as so well put, $ talks. AMD MUST focus their R&D $ on the best possible Return on Investment, it's that simple. BTW I have a buddy, BRIAN, who worked at the Austin FAM plant over 4 years ago. These so-called 'new cores' were in R&D back then. SOI and 69nm gates as well. Brian still uses a Tyan mobo with 2 MP's. Still a pretty smokin rig.

    Eric
    Reply
  • Nayr - Tuesday, June 14, 2005 - link

    Thanks #33 for pointing that out.

    +1 Nerd power.

    =P
    Reply
  • Viditor - Tuesday, June 14, 2005 - link

    "This is, of course, why DDR2 is becoming popular for mobile computing where thermal dissipation is more important than performance"

    True...both heat and power consumption are lower with DDR2, which will make it an excellent choice for mobile.
    Both AMD and Intel will be going DDR2 at the start of 2006...
    Reply
  • 2cpuminimum - Tuesday, June 14, 2005 - link

    What seemed odd was "Being able to run at much higher frequencies than DDR1 is the major advantage that DDR2 offers." when the greatest advantage supposedly held by DDR2 is lower heat production due to a slower core speed. Higher frequency isn't really much of an advantage when that frequency isn't high enough to compensate for higher latency. This is, of course, why DDR2 is becoming popular for mobile computing where thermal dissipation is more important than performance. Reply
  • Viditor - Tuesday, June 14, 2005 - link

    Well let's see...Porkster is trying to use a stress test that wasn't designed to benchmark multiple apps as a rationale for a supposed Intel superiority in multitasking...sigh.

    1. Has anyone done any tests that were designed for this? Well gee whiz I guess they have...
    http://tinyurl.com/chck7
    http://tinyurl.com/akueq
    http://tinyurl.com/7agle

    The results were that the X2 was vastly superior in 2 threads with heavy workloads, and that with 4 threads of heavy workload the P4EE 840 pulled equal (not better) because HT gives it superior load balancing. Of course in single threads the X2 was again vastly superior (in fact the 840EE proved slower than some other P4 chips...)

    2. What about the actual purpose of Tom's test...which platform handles stress better?

    Well, on the face of it the X2 was the hands-down winner, no contest!
    The Intel system kept crashing (5 times IIRC), then they restarted after switching from the nForce platform to the pure Intel system. After that the Intel platform had to be rebooted 3 times...
    The AMD platform just kept running the whole time!

    That said, Tom's test doesn't show anything worthwhile...

    1. The test methods are extremely flawed. To show stability of a platform, using 1 or 2 systems isn't scientific...it's just sensationalist.
    2. Many mistakes were made both in the performance and the design of the test..

    As to porkster's (dubbed by many forums as the greatest Troll who ever lived!) assertion of AMD being driven by the "teenager" market, I must say that I'm glad to see so many major corporations hiring teenagers to head up their IT departments! :-)
    Reply
  • 4lpha0ne - Tuesday, June 14, 2005 - link

    @porkster:
    I'm sure you'd also call Pentium Ds lemons, because they are also only able to run 2 threads at once. Everything else is a matter of priority settings (like low DivX encoding priority) and hyperthreading, which doesn't distinguish between low and high priority tasks.

    Reply
  • 4lpha0ne - Tuesday, June 14, 2005 - link

    BTW, AMD already has a graphics core (in Geode). And I read that part of the National Semiconductor team they took over (50 people or so) was already working on a 3D core.

    So this would make sense.
    Reply
  • porkster - Tuesday, June 14, 2005 - link

    If you see the poor multitasking performance of the AMD X2 then you can expect the market share to drop big time, but are AMD users smart enough to avoid bad chips like the X2?

    AMD is riding the teenager market with a theme of join the club or feel left out. It's peer-group pressure into buying poor hardware/old tech.

    Just check out THG's review of the AMD X2 and you won't want one of those lemons.

    http://www.tomshardware.com/stresstest/load.html
    Reply
