
47 Comments


  • Dman23 - Tuesday, January 29, 2013 - link

    You should also be using GLBenchmark when testing GPU performance on mobile products! The Kishonti guys truly do create great cross-platform analysis tools for Windows, Mac, and Android. Please include these. Reply
  • Dman23 - Tuesday, January 29, 2013 - link

    Also, I should add that you should use their CLBenchmark Suite for comparing the computational performance of different platforms using OpenCL. Reply
  • Dman23 - Tuesday, January 29, 2013 - link

    Also, if you are trying to make your mobile benchmark suite platform-agnostic, you should seriously consider a game that is available on multiple platforms (e.g. Windows and Mac) for a more apples-to-apples comparison of gaming performance, such as The Witcher 2, Borderlands 2 or Batman: Arkham City. Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    We're not trying for platform agnostic. Mostly, gaming tests on other platforms would only tell us how well optimized the game is for other platforms, which is more of an OS/gaming question than a laptop question. Reply
  • Dman23 - Wednesday, January 30, 2013 - link

    Why wouldn't you want to be platform-agnostic? What is the point of doing Windows-only gaming tests if the idea is to test all mobile platforms including Android and OS X??

    Also, based upon your claim that "gaming tests on other platforms only tell us how well optimized the game is", why have gaming tests at all?? If you truly believe that statement, then basically you're saying that gaming tests in general are rigged and "only tell us how well optimized that game is" for that platform.

    Come on now, just because hardcore gaming is predominantly on the Windows platform doesn't mean you should exclude other mobile platforms such as OS X or Android/Chrome. Stick to AnandTech's tradition of providing a very comprehensive, unbiased review process and either provide a more apples-to-apples comparison of gaming performance using cross-platform games, or don't include it at all if you believe that gaming benchmarks are rigged to begin with.
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Be reasonable. First, most of the games we test are only available on Windows. Of the games that we test that are available on OS X, past indications are that OS X performance is far worse than Windows performance on the same games. So, either OS X is poorly optimized for gaming or the games are poorly optimized for OS X. Until that changes, why continue to try and compare apples and oranges? I figure people already know the limitations of OS X in regards to gaming (and if OS X doesn't support games well, Linux is even lower down the hierarchy), and unless/until that changes there's no sense beating a dead horse and wasting the time of our reviewers.

    This is not to say that we won't add GLBenchmark or CLBenchmark, but I'll leave gaming tests on OS X to Anand and Vivek when/if they run such tests.
    Reply
  • Dman23 - Wednesday, January 30, 2013 - link

    First of all, who says they don't support gaming on OS X? If that were the case, then they wouldn't even bother implementing a whole online platform, Game Center, in the first place, let alone integrate it into the OS itself!! And like I said before, there are a whole bunch of high-end games like Borderlands 2 and The Witcher 2 that work great on Apple's current mobile platforms, not to mention a whole myriad of other good games that they sell on the App Store or that you can purchase via Steam.

    Look, I get it. You're probably a Windows user who thinks that in order to measure high-end gaming performance or be a high-end mobile gamer, you have to use Windows. This is completely untrue. That may have been true 10 years ago, but not today. All these mobile products from Apple, Dell, HP, Lenovo, Acer, etc. use similar graphics chipsets and processors from Intel, AMD, and NVIDIA. This is why it is important to provide cross-platform gaming performance when reviewing each product, as a measure of what strengths and weaknesses each of them has.

    If you're going to have a Gaming Performance section in your reviews for mobile products such as laptops / tablets, you should include games that are available on all mobile platforms! This includes Chrome, Mac OS X, and Linux. (Hell, even games that are only on Windows and Mac OS X would be a good first step.)

    If you're not willing to implement that because you think it is a "waste of time" to compare ALL major mobile platforms, and instead just have a section SOLELY on Windows gaming, then you should (like I've said before) either think long and hard about having a section on gaming performance as a way to gauge the strength of a mobile laptop / tablet, or update the title of this post to "Anandtech's Mobile WINDOWS Benchmark Suite", because you sure aren't trying to provide an unbiased benchmarking suite that compares and contrasts other platforms and products.

    Btw, your argument that gaming performance is poor on Mac OS X, as a basis for not doing cross-platform gaming tests, is really short-sighted. The whole point of reviewing a mobile product and the platform it's based on is to understand the strengths and weaknesses of said product/platform. And who knows, exposing the weaknesses of a platform when it comes to gaming in either Chrome or Mac OS X might provide an incentive for the company to improve upon those weaknesses. Isn't that really the whole point of reviewing products at a site like AnandTech, so that down the line the company can improve upon the weaknesses that AnandTech has exposed??

    I guess I'd like to end by saying that by not providing these cross-platform comparisons, you are doing a disservice to the readers of AnandTech and, in my view, watering down what makes AnandTech great... namely that it covers ALL major platforms / products and provides comprehensive benchmarking suites across all of them. (Much to the chagrin of fanboys who only want to hear about their "favorite" platform while dismissing / degrading other platforms and products.)

    Jarred, I hope you take these suggestions to heart, because I would hate to see AnandTech become more of a platform-centric site (i.e. HotHardware, etc.) instead of the platform-agnostic site that makes it so unique and great!
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    You're talking to the guy who does all of the Windows laptop benchmarking. The point is to benchmark games that are relevant and new, so if we only choose games that are available on all platforms we inherently limit ourselves. I've already ruled out Borderlands 2 because it's not particularly demanding on hardware, not because it's available on other platforms. Reply

    Ultimately, I'm going to create a list of games for our laptop reviews, and if the games are available on OS X and Anand wants to run them, great. I'm not going to ask our other editors (e.g. Dustin) to do extra work benchmarking games that aren't meaningful to most users just to enable cross-platform comparisons.

    There are really only a couple non-Windows laptops to consider, all of them from Apple, so what you're talking about is having MacBook reviews run as many gaming tests as possible. That's fine and I hope they will do that when it comes time for another MacBook review.

    Or to put it another way, I wouldn't expect Anand to skip Final Cut Pro or iMovie tests (or anything else he wants to test) just because those programs aren't available on Windows. The fact is, the vast majority of games are still Windows-only affairs. Steam is changing that (slowly), and we might even see Steam Box push things onto Linux.

    Right now, this is about testing the majority of laptops. Overlap with tablets will occur in some areas, and likewise with MacBooks, but it's impractical (e.g. a waste) to try to only test cross-platform offerings. I could have put "Windows" in the title, but I figured that should have been immediately apparent from the context of the article.
    Reply
  • Notmyusualid - Saturday, February 02, 2013 - link

    Agreed.

    We game on Windows. Please continue testing games mostly on Windows.

    But I did enjoy the Anand article, which compared Linux / Windows gaming performance I must say.

    But regardless, I don't need the headache of using a *Nix operating system any more than I'm required to. And I can just imagine how buggy gaming under Linux would be...oh, and with Crossfire too? I'd cut my wrists.
    Reply
  • aryonoco - Tuesday, January 29, 2013 - link

    Since most tablet and smartphone benchmarks are JS benchmarks, I think it would be good to include at least one JS benchmark here as well, to put things into perspective.

    I nominate Mozilla's Kraken for this; it's complex enough to be meaningful on a laptop, but it should still be very easy to run and very repeatable.
    Reply
  • HibyPrime1 - Tuesday, January 29, 2013 - link

    They're trying to get away from JS benchmarks on mobile, so I don't think introducing them on another platform is a good idea. Not to mention JS performance on any non-Atom/Bobcat laptop is more than adequate.

    The problem is JS on full voltage x86 CPUs is basically just a browser benchmark, not a hardware benchmark.

    I'd like to see Cinebench 10 come back. It's by far the most reliable indicator of single-threaded performance I've seen - you see almost perfectly linear scaling with clock speed within a given architecture, and comparisons across architectures almost always show what you would expect.

    11.5 is just as reliable as far as I can tell. The problem is that 11.5 numbers aren't comparable to 10, and for the most part only Cinebench 10 numbers exist for the older processors that people are looking to upgrade from.
    Reply
  • mayankleoboy1 - Wednesday, January 30, 2013 - link

    If you plan to use any JS benchmark, use Google's Octane only. It is the most comprehensive and most real-world benchmark, as agreed by both Mozilla and Google developers.

    Kraken and SunSpider are obsolete and heavily optimized for in every major browser.
    RoboHornet and RoboHornet Pro are a joke, which Google and Mozilla avoid.
    Reply
  • Blacksn0w - Tuesday, January 29, 2013 - link

    I would like to see a battery test for mobile gaming; really any game would probably do, just to give an indication of how long I would be able to game disconnected from the mains, e.g. during a long commute, airplane or train ride.
    Really, no other background tasks should be run at the same time, as the scenario probably excludes access to any reasonably fast internet connection.
    Reply
  • QChronoD - Wednesday, January 30, 2013 - link

    I agree that vanilla Minecraft isn't the most demanding of games, but adding a few graphics mods can bring most systems to their knees. It's still probably one of the more popular OpenGL games, and the fact that it's running on Java makes it a good test of single-core performance.

    Optifine, GLSL Shaders, and a 128x texture pack drop the performance on my i7-920 & GF560 from ~80 to ~30 FPS. I would imagine that a more extreme shader (adding god rays, motion blur and DOF) and a 256x or 512x pack would start to stress a 680 or 7970.
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    We've discussed this in the past, but basically we don't benchmark heavily modded games -- for one, mods are seldom optimized well for all platforms/configurations, and for two it just opens up a huge can of worms. Besides, the number of people playing Minecraft with tons of demanding mods is pretty trivial in comparison to the number playing the largely stock Minecraft. Reply
  • ltcommanderdata - Wednesday, January 30, 2013 - link

    I suggest Max Payne 3 as a RAGE engine proxy for GTA V, which should be of interest to a lot of people. Despite all the angst over whether GTA V is coming to PC, if history is any indication, since it'll be launching for consoles in the spring, it'll be out for PC before the end of the year.

    Unigine Heaven could be of some use as a multiplatform benchmark supporting Windows, Mac, and Linux and testing the status of OpenGL drivers vs DirectX drivers in Windows.

    F1 2012 is the latest EGO engine game, and its system requirements are slightly higher than DiRT Showdown's.

    Bioshock Infinite could be something to look forward to.

    For OpenCL benchmarks, are the new OpenCL filters in Photoshop CS6 suitable as a benchmark?
    Reply
  • riddler9 - Wednesday, January 30, 2013 - link

    UE3 Epic Citadel was just released and it has a benchmarking mode too. This benchmark looks like a hardcore gaming benchmark. Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Erm... are you talking about the version for iOS and Android? Because this article is specifically about laptop/desktop testing. Plus, it sounds like only the Android version will have benchmarking support -- I've heard Apple isn't all that keen on authorizing apps with benchmarking ability, though perhaps that's just an urban legend. Reply
  • dananski - Wednesday, January 30, 2013 - link

    To be fair to riddler9, I often get confused by the use of the word 'mobile' on this site to refer to devices such as laptops, as opposed to 'mobile phones'. I suppose it's a British thing - "Can I nab yer mobile fer a minute? Gotta ring me nan."

    Reading the article before posting helps though ;-)
    Reply
  • karasaj - Wednesday, January 30, 2013 - link

    I know you briefly mentioned it, but please add Heart of the Swarm when it is out! If you're curious about the best way to test it, throwing an army against an army at maxed late game (2v2?) would be best. It's a better indication than anything from single-player. (I don't remember how you did it last time though.) Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    That's the problem: if HOTS is anything like the first SC2, the playback mode will require you to view the whole battle to get to the benchmark portion. Still, it can be done if that's the only way to do it. We actually did a variety of tests with SC2; one was single-player, but several were multiplayer with lots of units later in the game. The ones with lots of units were even more CPU limited, and I think Anand mostly used them for CPU testing. Reply
  • IanCutress - Wednesday, January 30, 2013 - link

    Hey Jarred, a C++ AMP benchmark would be great. It would automatically run on the most powerful AMP device on the machine (either multi-dGPU, dGPU, iGPU or CPU) and can provide a comparison point for testing GPU/AMP simulations while on-the-go. Have a look at the C++ AMP example site at MS, and run the n-body simulation in MultiAMP mode (it will default to the best mode regardless of dGPU, iGPU, CPU or SLI/CFX) with a fixed number of bodies. Either take FPS or GFLOPs, and it only takes 5 seconds to get a result.

    Ian
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Is there a pre-compiled version available? Seems like you've used this before, so can you send me your binary? Reply
  • Rick83 - Wednesday, January 30, 2013 - link

    Especially with mobile devices, it can be important to protect data from physical theft, which is where encryption comes in.
    On laptops used in most businesses, enabling this is mandatory.

    What's important to know, then, is:
    1) Does the standard installation provide an easy way to encrypt? (built-in HDD/SSD encryption support, BitLocker easy to set up (TPM present), pre-installed software offering encryption capability)

    2) How does this affect some of the I/O- and CPU-heavy benchmarks (or how many CPU cycles are lost in a max-I/O situation)?

    3) How does a standard encryption setup (dm_crypt would be one choice, BitLocker or TrueCrypt as alternatives) perform on the device?

    An additional battery life test would be nice to see (encryption on vs off) but as those take a lot of time to run, I wouldn't want to impose that upon anyone.

    Point number 3, arguably the most important, is very simple to test: set up a bootable USB key (in the dm_crypt case) with a Linux system that performs the benchmarks, writes the result into the benchmark database, and then reboots. All it takes is some free disk space on the integrated storage so that it can perform write benchmarks (ideally raw, as otherwise the NTFS driver might have a minute impact on the results, but a big file, loop-mounted as a block device, would also work).
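    Point 2 above reduces to simple arithmetic once throughput has been measured with and without the encryption layer. A minimal sketch, with entirely hypothetical numbers (not measurements from any real device):

```python
def encryption_overhead(raw_mb_s: float, encrypted_mb_s: float) -> float:
    """Return the sequential throughput lost to encryption, as a percentage."""
    return (raw_mb_s - encrypted_mb_s) / raw_mb_s * 100.0

# Hypothetical SSD: 500 MB/s raw vs. 410 MB/s through dm-crypt
loss = encryption_overhead(500.0, 410.0)
print(f"Encryption overhead: {loss:.1f}%")  # 18.0%
```

    The same comparison run during a CPU-heavy benchmark would show how many cycles the cipher steals in a max-I/O situation.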
    Reply
  • DanNeely - Wednesday, January 30, 2013 - link

    I'd be interested in whether TPMs are available in any consumer systems too. I haven't seen them mentioned anywhere; but when they're conspicuously absent on Dell's Latitude pages, I'm not sure I can draw any conclusions from that.

    This is of interest to me because TrueCrypt doesn't fully support Win8 yet, and BitLocker's usability is badly degraded without a TPM.
    Reply
  • Death666Angel - Wednesday, January 30, 2013 - link

    I'm not sure if this is completely relevant to this benchmark thread, but I would be very interested in seeing low-wattage CPUs (35W quad, 17W dual) tested for their turbo capabilities. This could be done using any number of torture tests or games at extremely high settings (to be more "realistic", although torture programs would point to realistic results down the road, when the laptop is a year old and starting to build up dust and grime in the cooling system). You would run these tests and record performance metrics and clock frequencies. This way we can see how the cooling system handles the load and how the limited TDP range allows for simultaneous GPU/CPU turbo modes. :) Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    This of course would be more a test of the laptops using these CPUs than of the CPUs themselves I think. With enough cooling, 17W and 35W chips should be able to run at near-max Turbo constantly, but in most Ultrabooks and smaller laptops they can't do so because the cooling is insufficient. I've got the ASUS UX51VZ to review still, and that's definitely something I'll look at. Reply
  • Death666Angel - Wednesday, January 30, 2013 - link

    Yeah, I realize that the test I proposed is not strictly a component test but more of a platform/OEM test. :) Still, very interesting and with many components being equal in a lot of laptops these days, very important for purchasing decisions. Looking forward to your review! :D

    But wasn't there an article even here on AT showing that, even with sufficient cooling, Ultrabook ULV chips did not reach both the max GPU and max CPU turbo because of their limited TDP capacity? Or is it really purely a thermal limitation? I also remember reading that in AMD ULV chips (the A8-4555M and A10-4655M especially), the limited TDP (19/25W) limits the turbo modes of the GPU/CPU, so there is a trade-off between the two. That is why, especially in CPU-intensive games, an A8-4555M with a small discrete graphics card can be better, even if the dGPU is inferior to the iGPU, because the CPU part can turbo higher with the iGPU part powered off.
    Wow, I hope I made myself understood. :D
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Yes, you're correct. If you try to run HD 4000 and the CPU at full load (e.g. playing almost any game), the 17W TDP comes into play. My best estimate is that the CPU cores in a ULV IVB processor can draw around 15W and the HD 4000 can draw around 10W, so something has to give. Oddly, even though the HD 4000 showed clocks around 900MHz (just 20% lower than the max), actual performance was down more like 20-40% from standard voltage IVB mobile chips, indicating the clocks reported by HWiNFO may not be accurate. Reply
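    The arithmetic behind that estimate is worth making explicit. A quick sketch using Jarred's rough wattages (his estimates, not published Intel figures):

```python
def required_power_cut(cpu_w: float, gpu_w: float, tdp_w: float) -> float:
    """Percentage by which combined draw must drop to fit under the TDP cap."""
    demand = cpu_w + gpu_w
    return max(0.0, (demand - tdp_w) / demand * 100.0)

# ~15 W CPU cores + ~10 W HD 4000 against a 17 W package limit
cut = required_power_cut(15.0, 10.0, 17.0)
print(f"Combined draw must fall by ~{cut:.0f}%")  # ~32%
```

    A required cut of roughly a third is consistent with the observed 20-40% performance deficit versus standard-voltage chips.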
  • Death666Angel - Wednesday, January 30, 2013 - link

    Thanks for the follow up on that! :D

    Yeah, I'd be very interested in articles that could tackle those things with laptops. I'm in the market for ultra mobile (13.3", below 1.6kg) laptops or even tablet/laptop hybrids. Nearly all reasonably priced options have ULV processors (most Intel, some AMD). But gaming is still an important thing for me, although not top priority. I don't need to be playing the latest Hitman @ full res. But getting playable rates with Portal2/CoD4 comparable games in non-lowest settings is something I would like very much (currently I have an i3-330UM and used that for some gog.com stuff... worked okay).
    Especially with Intel pursuing faster and faster iGPUs while continually reducing the power envelope, I think it is important to put their feet to the fire so to speak.
    Looking forward to seeing those kinds of things! Keep up the good work you are already doing! :)
    Reply
  • Nexing - Wednesday, January 30, 2013 - link

    So far, the professional audio world has been completely sidelined in this regard. Latency-wise, computers, and notebooks even more so, have presented problems not yet properly addressed by manufacturers.
    The DAWbench test is widely utilized, particularly by live acts and nowadays by DJs going pro, but it is nowhere to be seen on the computer-review side, and hence missing from manufacturers' radar. I do hope and expect AnandTech helps to bridge this longstanding gap.
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Given the need for additional software besides just the DAW Bench files, this is probably too much to coordinate -- we'd need professional software (that most likely none of us have used). Is something like DPC Latency Checker sufficient, or is that too simplistic? Reply
  • Nexing - Wednesday, January 30, 2013 - link

    DPC is the ONLY way to start configuring a laptop that has already been bought and deployed...
    And it only shows latency peaks over a timeline. Most available solutions are in the area of disabling running services (antivirus, Bluetooth, wireless, and anything else not connected with the actual performance), or sequentially tweaking preferences and options in the involved software. Lastly, one continues by disabling plug-ins or reducing the number of musical layers because the notebook cannot actually handle what is being put through it... and it usually shows this with dropouts, clicks or BSODs.

    In these times, when laptops achieve C9 latencies at 1600 MHz with 16GB or more of RAM, and responsive SSDs give us access to a new, lower floor of latencies, regular computer users who visit these technical websites cannot imagine that OVERALL latencies reach well over 5 ms from pressing a MIDI controller button to the FireWire or USB out (not even counting the extra latency introduced by the required external soundcard)... and easily exceed that figure, commonly several times over. These numbers affect performing musicians.
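    The end-to-end figure is a sum over the whole signal chain, not just the soundcard spec. A toy model with invented per-stage numbers (the 2.9 ms term is a hypothetical 128-sample DAW buffer at 44.1 kHz) shows how easily the total passes 5 ms:

```python
# Hypothetical per-stage latencies along a live-performance signal chain
stages_ms = {
    "MIDI controller / USB input": 1.5,
    "OS / driver (DPC etc.)": 1.0,
    "DAW buffer (128 samples @ 44.1 kHz)": 2.9,
    "USB/FireWire output + soundcard DA": 1.5,
}
total = sum(stages_ms.values())
print(f"end-to-end: {total:.1f} ms")  # 6.9 ms
```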

    As things stand, as a computer-buying segment we mostly opt for Macs to reduce those critical risks in the midst of a musical performance. Or, if we are Windows users, we buy notebooks with ExpressCard connectors so we have the choice of solving the last-stage latency problem commonly introduced by certain USB/FireWire chipsets.

    Basically, the problem is that there is nowhere to find a public benchmark or a standard that shows those latency numbers for commercially available notebooks. On the contrary, latencies for high-end soundcards are standard in their specifications, but one cannot perform without the other, and we have no way to know before buying what performance we will get.
    I understand that this is a complex task, and I am sure specialized studio or pro audio communities will gladly help to define the needed standards. So far there has been no communication to fill the computer-to-pro-audio gap, just unilateral efforts like DAWbench... It would be in the interest of quality manufacturers, users and reviewers.
    Reply
  • Agustin - Thursday, January 31, 2013 - link

    Hey Jarred, if you are planning to do the test bench with W8, you will have a problem: if you check the DPC latency, you will find that in W8 the latency is ALWAYS at 1000 us, which is the highest value you can have and still get a pleasant experience with your sound system. This is something Microsoft implemented on purpose in the Windows 8 kernel to reduce the power draw of the CPU, especially in tablets. It's not a problem for the common user, and the only way to reduce the DPC value is to stress the CPU to the max, at which point the values drop drastically, to the order of 2 to 10 us.

    RightMark Audio Analyzer is another program you can use to check the dB the integrated chip can deliver.

    And sorry for my English, Jarred; I'm from Argentina, Santiago del Estero.
    Reply
  • ToTTenTranz - Wednesday, January 30, 2013 - link

    I think Skyrim should be tested with some of the quality enhancement mods.
    In my opinion, it doesn't even make much sense to play Skyrim in a PC without using mods, since it will provide a generational leap in image quality.

    Just go to the Steam Workshop, choose the top 5 IQ mods and list them in the benchmark description. It should be easy to reproduce for any reader.

    I think XCOM and Orcs Must Die 2 should only be present in benchmarks for iGPUs and low-end discrete GPUs (if there will ever be such a thing in the future).
    I don't think it would be very interesting to see 300 FPS in XCOM when testing the higher-end GeForce GTX 78x and Radeon HD 89xx.
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    We've commented on this in the past (and above I made a short comment in regards to Minecraft mods), but the short summary is: no way. Simply making a game more demanding isn't the goal; the goal is to test popular games the way most people will play them. On laptops, we're already running a couple tiers down from desktop GPUs in terms of performance, so making a game even more demanding (and often overflowing the limited VRAM with some mods) just doesn't make sense.

    As for Orcs Must Die 2 and XCOM, your opinion is duly noted. It looks like they won't make the cut, which means we're still looking for a good strategy game or two to add to the list.
    Reply
  • bostontx - Wednesday, January 30, 2013 - link

    I love the Civ5 benchmarks, since it is one game that seems to stress both CPU & GPU. Also, it's my go-to game when I'm bored at the airport or at my hotel on the road. As laptop graphics get better, the playability of the later stages of the game should improve. Reply
  • powerarmour - Wednesday, January 30, 2013 - link

    Catzilla please :) Reply
  • IanCutress - Wednesday, January 30, 2013 - link

    I find the Catzilla final score is too heavily influenced by the 'load time', which is for whatever reason part of their score calculation. This means that an odd read due to cache or other factors can impact the score a fair bit. I'm tempted to use it in my motherboard reviews when multi-GPU is sorted and I upgrade the drivers, although I'd be using the CPU+GPU score result, not the 'final score'. Reply
  • dragosmp - Wednesday, January 30, 2013 - link

    In the last year, frame-time testing has developed into a pretty useful tool to assess the overall performance of a system. Primarily it has been used for graphics card testing, but it can be used just as well for CPUs. It should be an excellent test for laptops, since it would assess how smooth the frame delivery of the overall system is, including the CPU, GPU and driver(s).

    My opinion, and hopefully it's shared, is that within a given time it's better to test a few representative games thoroughly for FPS and frame time than more games only for FPS.

    My choices: no 3DMark, only games: Far Cry 3, Dirt 3, Civ 5 (late-game sim) and Skyrim.
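    A minimal sketch of the frame-time analysis described above: given a list of per-frame render times (milliseconds, FRAPS-style), report average FPS alongside the 99th-percentile frame time, which exposes stutter that a plain FPS average hides. The sample data is invented for illustration:

```python
def frame_time_stats(frame_times_ms):
    """Return (average FPS, 99th-percentile frame time in ms)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    ordered = sorted(frame_times_ms)
    # nearest-rank 99th percentile
    idx = max(0, round(0.99 * len(ordered)) - 1)
    return avg_fps, ordered[idx]

# Nine smooth frames at ~16.7 ms plus one 50 ms hitch
times = [16.7] * 9 + [50.0]
fps, p99 = frame_time_stats(times)
print(f"avg {fps:.1f} FPS, 99th percentile frame time {p99:.1f} ms")
```

    The average here is close to 50 FPS, yet the worst frame takes three times as long as the typical one, which is exactly the smoothness information an FPS-only chart throws away.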
    Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    As noted in the article, I primarily include 3DMark because I can't go back and retest most of the old laptops -- they're sent back when we finish the reviews. Thus, it's good to have at least a few graphics tests that give us an estimate of graphics performance over a long period of time. Besides that, 3DMark is extremely easy to install and run, taking less than an hour to complete several runs of each version, and it can be automated (start, go make lunch or write something, come back and it's done).

    As for thoroughly testing a few select games rather than less in-depth testing of more games, we'll see what others think. In the past, the general consensus has been "more more more" for both games and testing. I think cutting down the list to a few titles just encourages companies to highly optimize for those titles and ignore all others, so I'm strongly opposed to such a course.
    Reply
  • Alkapwn - Wednesday, January 30, 2013 - link

    One of the things I would like to see is some sort of benchmark of how well the device handles full workloads before it generates enough heat to throttle the GPU/CPU.

    On the various notebooks and portables I've tested, I ran Prime95 and FurMark, or Intel's tool, or some combination of stress-test tools, to see how long it takes to go from room-temperature idle to CPU/GPU throttle temperatures.

    Great laptop designs handle heat well, whereas others do not.
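    The time-to-throttle measurement described above could be reduced to a small script over a logged clock trace. A sketch with an invented trace; a real run would log one sample per second from a monitoring tool such as HWiNFO:

```python
def seconds_until_throttle(clock_mhz, rated_mhz):
    """Return the index (in seconds, one sample per second) of the first
    sample below the rated clock, or None if the system never throttled."""
    for second, mhz in enumerate(clock_mhz):
        if mhz < rated_mhz:
            return second
    return None

trace = [3100, 3100, 3100, 3000, 2400, 2400]  # hypothetical MHz samples
print(seconds_until_throttle(trace, 3100))  # 3
```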
    Reply
  • mrdude - Wednesday, January 30, 2013 - link

    Ditto. I'd definitely urge you guys to pay closer attention to throttling and TDP limitations as chip makers push the bounds of performance in smaller form factors. When comparing competing laptops and devices with similar hardware, the cooling is likely to be the biggest determining factor with respect to performance differentiation. Reply
  • slashdotcomma - Wednesday, January 30, 2013 - link

    I would be interested in seeing WiFi performance at, say, 25', 50', and 100', or whatever distances are deemed valuable. I find that WiFi performance, even with the same chipset, varies based on the mobile device's wireless antenna and placement. I know typical reviews show 2.4GHz and 5GHz speeds, but it would be nice to know what to expect from the antenna strength too. Reply
  • IanCutress - Wednesday, January 30, 2013 - link

    Hopefully Jarred has access to an open field and it never rains :) I would have to go around to the 20+ flats in the local vicinity and ask them all to turn off their microwaves and home WiFi. Reply
  • JarredWalton - Wednesday, January 30, 2013 - link

    Bingo, Ian. Not only do I not have a great testing location (there are at present nine 2.4GHz networks in range of my house), but we have other editors in different locations with different testing equipment (routers) and environments. I think the best we can do is to do a short test or two at "ideal" range (less than five feet) and at a moderate distance of 25 feet from the router, with the understanding that results aren't really comparable between reviews. Reply
  • Boogaloo - Wednesday, January 30, 2013 - link

    I'd like to see Dwarf Fortress benchmarks on mobile devices. Most laptops have sub-par graphics performance but decent to good processors these days, and Dwarf Fortress is obviously almost entirely CPU (and apparently memory bandwidth) bound. All laptops should be capable of running it at least decently, but it really hammers your CPU once the fort population rises above 100. It's a good game that can produce a meaningful benchmark on any laptop, not just the ones with good dedicated graphics. Reply
