NVIDIA Demos nForce4 with SLI for Intel platforms

Despite rumors that it would be called nForce5, NVIDIA showcased its nForce4 chipset for Intel platforms at IDF.

The chipset is up and running.

We didn't find out much about the chipset at NVIDIA's keynote, but we later got a full demo of the chipset and some shots of vendor boards.

We can't say too much about the chipset, but given that it's still being called an nForce4, you can guess most of the features it will support.



Comments

  • mickyb - Wednesday, March 02, 2005 - link

    Doesn't Intel's GOAT (ahem, IOAT) sound a lot like what nVidia was doing with their I/O chips? Intel should put "Not to scale" or "No real data was found" on every single one of those extrapolation graphs. I find it kind of funny how multi-core is the panacea for all performance problems. How is this any different from multi-CPU SMP? It isn't, except for compressing the CPUs into a smaller space. SMP has its problems as well, and adding CPUs does not produce an exponential curve like Intel is implying.

    I am interested in this FBDIMM and will need to do some checking around on that one. It looks interesting. RAMBUS is still at it. We'll see how things shape up.
  • glennpratt - Wednesday, March 02, 2005 - link

    That changes his point very little. And YOU probably won't be buying crap out of pocket...
  • Questar - Wednesday, March 02, 2005 - link

    When will you guys realize how small the gaming market is?

    I'll buy more corporate systems this year than every gamer on this site will buy in the next five years.

  • Pete84 - Wednesday, March 02, 2005 - link

    ^^ Not just games, but every app too . . .
  • ZobarStyl - Wednesday, March 02, 2005 - link

    I love the graph on page 4, where multicore just jumps ahead by leaps and bounds, with "Performance" showing exponential growth. I'm sorry, but the last time we saw something like this it was the NetBurst graph taking us into "10 GHz Space", and lo and behold, well, you know the story. I'm so tired of Intel just putting a band-aid on a bad idea for a chip (not a bad chip, mind you, just one designed by marketing people, not engineers). Multicore without onboard memory controllers, tacking an extra meg of slower cache onto Prescott... why are we not seeing samples of a new chip that aims to correct the problems of NetBurst, rather than just adding more and more to Prescott as if that's really going to change anything? Until games get really multi-core oriented, this last generation of single-core products is going to be the best thing out there, probably until late '06.
  • raskren - Wednesday, March 02, 2005 - link


    Guess what, none of these are anywhere near store shelves so CTFD (calm the F down).

    The Nforce4 board finally adds some appeal to the latest Pentium 4s. I'd say that i875/865 were the last two exciting chipsets. 9xx has fallen short on innovation.
  • Beenthere - Wednesday, March 02, 2005 - link

    The SpinMeisters from Intel are really blowing smoke up the azzes of journalists, as usual. Only the gullible would believe the nonsense these folks peddle when they can't even deliver a P4 without a fire extinguisher. No one with a clue would touch any of Intel's current or short-term products. Maybe by '07 Intel will have something worth considering, but that remains to be seen.
  • xsilver - Wednesday, March 02, 2005 - link

    The idea of VT is a good one, I think -- it may be possible to run a small office on only one multicore, multithreaded system? (Spreadsheets and email aren't exactly taxing.)

    And Intel's idea of split dual cores may open up the possibility of selling CPUs scaled by core count rather than clock speed?
    E.g. the Extreme Edition will have 8 cores, regular chips will have 4-7 cores, Celerons will have 1-3 cores... according to how the CPUs are binned?
  • bersl2 - Wednesday, March 02, 2005 - link

    Microsoft’s Jim Allchin came on stage and echoed Gelsinger’s statements with the simple line “it’s time.”

    Duuuuuuuu... no, really?

    "And Wintel said, 'Let there be light!' But they were too slow to realize that the light had been on for quite some time."
