Quad Cores: Intel Kentsfield Preview

by Anand Lal Shimpi on 9/28/2006 7:00 PM EST

34 Comments


  • Mclendo06 - Monday, October 02, 2006 - link

    Will this chip have an unlocked multiplier like other "Extreme Edition" processors, or will it be extreme in core count only?
  • Ealdric - Saturday, September 30, 2006 - link

    These 4-core chips seem to be out ridiculously soon after the C-2-D. They could (should) have just gone straight to quad. Seems like the vendors will have a hard time keeping up.
  • feraltoad - Saturday, September 30, 2006 - link

    You guys see any pricing for the "lower cost" Kentsfields?

    In the past I thought, quad-core? What game even uses dual-core? But with Crysis saying it will use multiple cores if they are there, I can't imagine a better way to drive amazing games with crazy physics/AI/environments than by using available cores, especially since Ageia is flagging and the 360 and PS3 are themselves multicore. Plus the video encoding would rawk.
  • JNo - Friday, September 29, 2006 - link

    I know time based percentages can be a little confusing at first but you guys need to sort out your maths...

    [given an identical task...] "As an example, it is not accurate to state that a score of 40 seconds vs. 80 seconds is twice as fast, but rather that the 40 second score takes half as long or the 80 second score is twice as slow."

    Half as long IS twice as fast!! The 40 sec cpu can do the same task twice in the time that it takes the other cpu to do the task i.e. it can work twice as hard ergo it is twice as fast. Twice as fast is 100% faster. Now to be twice as slow it would have to do the task in 160 secs obviously (not 120 secs).

    If it did the same task in 20 secs (vs 80 secs), where are we? Well it is 4x as fast! It can do the same task 4 times when the other cpu can only do it once. It is 300% faster (NOT 400%).

    Speed difference (as a multiple) is Old time/New time, but percentage difference is (Old time/New time - 1) x 100%.

    This works the other way round too of course i.e. 80 sec vs 40 sec is 100% slower and 80 vs 20 is 300% slower.
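    The arithmetic above can be sanity-checked in a few lines of Python (a quick sketch; the function names are just for illustration):

```python
# Speedup from completion times of the same task:
#   multiple       = old_time / new_time
#   percent faster = (old_time / new_time - 1) * 100
def speed_multiple(old_time, new_time):
    return old_time / new_time

def percent_faster(old_time, new_time):
    return (old_time / new_time - 1) * 100

# 40 s vs 80 s: twice as fast, i.e. 100% faster.
print(speed_multiple(80, 40))   # 2.0
print(percent_faster(80, 40))   # 100.0

# 20 s vs 80 s: 4x as fast, i.e. 300% faster (not 400%).
print(speed_multiple(80, 20))   # 4.0
print(percent_faster(80, 20))   # 300.0
```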
  • JarredWalton - Friday, September 29, 2006 - link

    I updated the text after one reader pointed out the error. Technically, it *is* correct now. 80 seconds is twice as long as 40 seconds; 40 seconds is half as long as 80 seconds. Some like to say "twice as fast" but that is slightly wrong. In situations where higher scores are worse, you have to change the syntax to remain grammatically correct.

    Time is a duration measurement, not a speed measurement. Would you disagree that 80 seconds is twice as long as 40 seconds? Or that 20 seconds is one fourth as long as 80 seconds? "Fast" is the wrong term to use for such a comparison, other than to say that 40 seconds is faster than 80 seconds. You could talk about rate of travel and say one guy is moving at 40 MPH and that's twice as fast as 80 MPH. Call it a symantic difference of opinion, but I don't like "fast" as a way of describing time.
  • JNo - Wednesday, October 04, 2006 - link

    I understand that time is a duration measurement, not a speed measurement. That's why you have to think of it in terms of the amount of time taken to complete *a given task*, which is a defined amount of work, which will then allow you to make the speed comparison. If a car travels the same distance in half the time another car does (eg 40 secs instead of 80 secs), it is twice as fast. PERIOD.

    "Some like to say 'twice as fast' but that is slightly wrong." - No it isn't! It is indisputably correct!

    [quote from following comment] "What is 10% faster than 100 seconds? I suppose you can say it's 90.91 seconds if you want." - I don't just want to - it is! You cannot say otherwise without being incorrect!

    If you want to make this easier, use the INVERSE of time to define speed (which is scientifically correct - think of the equation distance = speed x time). Now, speed = distance (or workload - eg calculating pi to 2million dp or encoding a video file) divided by the time taken to do said task. Just make the workload = 1 for the sake of argument (as the answer will still be proportionally correct) and you have speed equalling the inverse of time.

    This way, 1/40 (faster cpu) is twice as fast as 1/80 (slower cpu), and a cpu that does the task in 20 secs is 4 times as fast as the 80 sec cpu (1/20 is 4x 1/80). Use a calculator if you don't believe me. What if the faster one does the task in 73 secs? Its speed is 1/73 compared to 1/80, which is 9.59% faster (1/73 divided by 1/80).
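    The inverse-of-time framing checks out numerically as well; here is a quick Python sketch (workload normalized to 1, function name for illustration only):

```python
# Treat speed as workload/time with workload = 1, so speed = 1/time.
# The ratio of two speeds then equals old_time / new_time.
def speed(time_taken):
    return 1 / time_taken

# 1/40 is twice 1/80; 1/20 is four times 1/80.
print(speed(40) / speed(80))    # 2.0
print(speed(20) / speed(80))    # 4.0

# 73 s vs 80 s: (1/73) / (1/80) = 80/73, i.e. about 9.59% faster.
print(round((speed(73) / speed(80) - 1) * 100, 2))
```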

    It is not a matter of semantics. It is fact. I am certainly not saying this just to be annoying, cos most of the stuff you guys understand (cpu architecture etc.) and evaluate goes way over my head. But this is quite straightforward - I just want to try to help you get it right in future.
  • vhx500 - Friday, September 29, 2006 - link

    .. it's "semantic", not symantic.

    *Ducks
  • yacoub - Friday, September 29, 2006 - link

    I think his issue is regarding % speed improvement.

    If something scores 100 points on something, what's a 10% improvement over that? 110 points. What's 100% improvement over 100 points? 200 points.

    Thus something that scores 200 points is not 200% faster than something that scores 100 points, it is 100% faster.

    There was never a problem with 40s vs 80s being twice as fast. That's the same thing as "taking half as long".
  • JarredWalton - Friday, September 29, 2006 - link

    But that works only for instances where higher is better, without becoming confusing. What's 10% faster than 100 seconds? I suppose you can say it's 90.91 seconds if you want. I prefer to stick with your math and simply state that 110 seconds is 10% slower. :)
  • Kougar - Friday, September 29, 2006 - link

    Great idea to create that Kentsfield compatible motherboard list! I've heard that a board needs an EPS12v connector to be truly compatible though... can you confirm this? Already seeing good results with the Abit AW9D-MAX being able to overclock these!

    Since "Core" processors are the best thing out there for gaming, I can't see why anyone would turn down a "Core" based Kentsfield; just overclock it. Worst case, you get the same performance as an overclocked C2D and have some extra cores... ;)

    Again, thanks for the mainboard list!
  • yacoub - Friday, September 29, 2006 - link

    Now give me a storage technology (ie, hard drive) that can multi-task and really feed the CPUs, RAM, and GPU effectively. Currently hard drives are by far the biggest bottleneck.

    I really don't want to get too involved with RAID; I just want a new technology that allows for much quicker data access and multi-tasking ability over high-bandwidth lines.

    Also, affordable plz. ;)
  • Madellga - Friday, September 29, 2006 - link

    Anand, I was going to ask if you could tell us which mobos will support the quad core, but after reading the article I noticed that you already promised a list.

    I look forward to seeing the list, and thanks for thinking of it :)
  • imaheadcase - Thursday, September 28, 2006 - link

    I was thinking of upgrading to a Core 2 Duo at the end of this month; what is the price range on these being released in November? These are gonna be like $1000 till more are released, I'm assuming...
  • coldpower27 - Friday, September 29, 2006 - link

    It's going to be the Core 2 Extreme QX6700, so it's a $999 US processor, which is pretty standard. What's most interesting, though, is what's coming in Q1 2007 with the Core 2 Quad Q6600.
  • bob661 - Friday, September 29, 2006 - link

    quote:

    It's going to be the Core 2 Extreme QX6700, so it's a $999 US processor, which is pretty standard. What's most interesting, though, is what's coming in Q1 2007 with the Core 2 Quad Q6600.
    Seriously doubt it will cost anywhere near the present cost of the X6800.
  • coldpower27 - Saturday, September 30, 2006 - link

    Well, the present cost of the X6800 is around $975 US. When the Kentsfield first comes out in Extreme form I expect it to cost $1150 US due to slight gouging, but the price should fall back to $999 US once the initial burst of demand passes and supply builds up.

    $1300 US might exist at vendors that overcharge extremely, but not as the de facto standard price for the QX6700.
  • bob661 - Thursday, September 28, 2006 - link

    They'll definitely be more expensive than the present X6800's. I'm thinking about $1300 retail.
  • JackPack - Friday, September 29, 2006 - link

    They already said $999....
  • bob661 - Friday, September 29, 2006 - link

    quote:

    They already said $999....
    And by the time the Intel fanbois are done it will be $1300 and OOS.
  • Questar - Friday, September 29, 2006 - link

    What do you care? You're not going to buy one anyway.
  • kleinwl - Thursday, September 28, 2006 - link

    I for one am extremely excited about quad core. If the pricing stays relatively neutral to the Core 2 Duo units (minus a speed bump or two), these cores could become mainstream very quickly. If they do become mainstream, similar to how Pentium D and Core 2 units are, I expect that we will see a push for more software developers to take advantage of them, at which point they will be much more useful than a mere dual or single core.

    There are many ways that we can take advantage of quad cores now... for example, distributed computing, power multi-tasking (ripping DVDs/CDs while watching a movie and playing WoW), or just offloading physics to the unused cores. Even if these are normally idle, the ability to run 4 things at once is a huge advantage when things are time critical (i.e. I need to finish so my wife can check her email).

    Anyway, I feel like I'm stuck in the stone age now with my (still respectable) Venice 3000+. Even overclocked to 2.3GHz, the computer feels slower than my brother's Core 2 Duo lappy.
  • kmmatney - Thursday, September 28, 2006 - link

    I think that offloading physics onto another core will be the next big thing, so at least a dual core will be needed to get the best experience out of the next generation of games. All of the gaming consoles are multicore, so games will naturally start to use all the cores on a PC.
  • coldpower27 - Friday, September 29, 2006 - link

    I'd rather have a discrete PPU for that, like the Ageia thingy, than have the general purpose quad core do it; the CPU should be dedicated to other things like AI.
  • JarredWalton - Friday, September 29, 2006 - link

    I too would rather have a $275 add-in card instead of getting something just as powerful (possibly more so) as part of the CPU. I would much rather spend $1000 for 2.93 GHz and two cores than $1000 for 2.67 GHz and four cores.... (Sarcasm intended)

    Long term, I expect PPU-type cores and even GPU cores to make their way into the CPU package. GPUs have an issue, though, as they demand tons of bandwidth and lots of memory that they have essentially exclusive access to. But by 2010 I'm almost certain we will have new cores in the CPU. Or maybe Intel will just make SSE6 or whatever a full physics specification, and then state that there are only x number of Physics pipelines per CPU package?
  • coldpower27 - Saturday, September 30, 2006 - link

    I am not sure where you are going with this. I don't see the general purpose processor within the CPU being more powerful than the dedicated PPU for a while. I am much more for specialized processors with a specific purpose, letting the general purpose processor handle something else. If they can make the physics run on the CPU faster than a dedicated unit can, then OK, do physics on the CPU.

    Sorry, my sarcasm detection is broken today, so just to say: I would prefer a lower clocked Kentsfield over a slightly higher clocked Core 2 Duo (Conroe), given that the pricing is the same.

    Maybe we will have an integrated platform with the CPU/GPU/PPU functions all together. However, I like the ability to upgrade each piece of the puzzle rather than being forced to upgrade a single "processor" all at once.
  • theteamaqua - Thursday, September 28, 2006 - link

    as a gamer i would say this thing is useless
  • brownba - Friday, September 29, 2006 - link

    quote:

    as a gamer i would say this thing is useless


    as an intelligent person, I would say your comments are useless.
  • wilki24 - Friday, September 29, 2006 - link

    Useless for gaming? Hardly.

    IDF Alan Wake demo: http://www.siliconvalleysleuth.com/2006/09/intel_s...

    Notice the water? The lighting? The physics when the tornado rips apart those buildings?

    Try utterly necessary for gaming. :-)
  • Griswold - Friday, September 29, 2006 - link

    You are certainly easily convinced by marketing yip-yap. I'm all for multicore systems, but before I believe that you absolutely need 4 cores for this kind of visuals, I want to see a process- and load logfile of the 4 cores.

    Quite frankly, 80% of the eye candy seen in that movie is done by the GPU and not by the CPU. Considering all the whining from game developers about multithreading games, I find it hard to believe that streaming/threading a game on 4 cores will work so much better than doing the same on 2 cores. In fact, I'm willing to bet, the exact same stuff is doable on a dual core machine without a noticeable performance hit.
  • Questar - Friday, September 29, 2006 - link

    quote:

    In fact, I'm willing to bet, the exact same stuff is doable on a dual core machine without a noticeable performance hit.


    Which game developer did you say you worked for?
  • theteamaqua - Friday, September 29, 2006 - link

    http://www.tomshardware.com/2006/09/10/four_cores_...

    Also, games are heavily GPU dependent; I'd much rather blow $1000 on a G80 or R600 than this.

    As for servers, well, like I said, I said it as a gamer. I don't own a server, I just wanna play games at 60fps, and jumping from Core 2 Duo to Kentsfield won't get me much FPS at all...

    In fact it's slower if apps don't use the 3rd and 4th cores. Check out that link I gave: Core 2 Quad at 3GHz is on par with Core 2 Duo at 2.93GHz.
  • rqle - Thursday, September 28, 2006 - link

    "as a gamer i would say this thing is useless"

    Kinda sad to hear a gamer say that. As a gamer I found dual core and multicore very beneficial. A side server, a game server, even downloading large game updates (BitTorrent, game patch updater, etc.) while gaming proved very helpful with multicore. We run numerous apps while gaming; though they're not threaded, there's still a large performance difference.
  • Aikouka - Thursday, September 28, 2006 - link

    I don't agree. People said the same about dual-core processing when I remarked that I was going to buy one. Ya know what, I'm glad I ignored their naysaying, as I absolutely love my dual-core processor.

    The ability to run another application and not hamper your gaming is great. I personally run two LCDs at once and I do a lot while gaming (I play WoW mainly, so there's a decent amount of downtime, so watching videos, browsing the web, etc is all possible).

    So, once games begin to adapt to multiple cores (you may see support for dual-cores before quad-core, although Alan Wake is supposedly better on quad-cores), I'll need more cores as games begin to gobble up multiple cores :P.

    Oh, and if you've never used multiple monitors in DualView (nVidia's moniker for multiple split (but conjoined) desktops), you should try it... it's fantastic :O.
  • theteamaqua - Thursday, September 28, 2006 - link

    "So, once games begin to adapt to multiple cores"

    There's the keyword: not many games use dual core, let alone 4 cores. Crysis's recommended system was a Pentium D or AMD X2 with a GeForce 7800XTX or ATI X850.
