
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<docs>http://www.rssboard.org/rss-specification</docs>
<atom:link rel="self" type="application/rss+xml" href="http://www.anandtech.com/rss/" />
<title>AnandTech</title>
<description>This channel features the latest computer hardware related articles.</description>
<link>http://www.anandtech.com</link>
<language>en-us</language>
<copyright>Copyright 2013 AnandTech</copyright> 
<dc:creator>Anand Lal Shimpi</dc:creator>

    
<item>
    
        <title>The HTC One Review</title>
    <author>Brian Klug</author>
    <description><![CDATA[ <p>
	It is nearly impossible to begin a review of the HTC One (formerly known by its codename, M7) without some context, so I&rsquo;ll start much the same way I did my <a href="http://www.anandtech.com/show/6754/hands-on-with-the-htc-one-formerly-m7">impressions piece</a>: by stating that HTC is in an interesting position as a result of last year&rsquo;s product cycle. If there&rsquo;s one thing Anand has really driven home for me in my time writing for AnandTech, it&rsquo;s that in the fast-paced mobile industry, a silicon vendor or OEM only has to miss one product cycle in a very bad way to end up in a very difficult position. The reality is that HTC&rsquo;s last product cycle delivered products with mostly solid industrial design and specs, but not the right wins with mobile operators in the United States, and not the right marketing message abroad. It&rsquo;s easy to armchair-quarterback the previous cycle now that we have a year of perspective, but that&rsquo;s where things stand. HTC now needs a winner more than ever.</p>
<p>
	For 2013 HTC is starting out a bit differently. Rather than announce the entire lineup of phones, it&rsquo;s beginning with the interestingly-named HTC One. It&rsquo;s just the HTC One &mdash; no S or X or V or any other monikers at all. It&rsquo;s clear that the HTC One is the unadulterated representation of HTC&rsquo;s vision for what the flagship of its smartphone lineup should be. HTC is different from other OEMs in that it only makes smartphones, and as a result the flagship clearly defines the rest of the product portfolio below it. With the One it looks as though HTC is making that kind of statement by literally letting it define the entire One brand.</p>
<p>
	Enough about HTC&rsquo;s position and strategy; these are things that mostly interest enthusiasts and the industry, but aren&rsquo;t really relevant to consumers or to the review of a single product. Let&rsquo;s talk about the HTC One.</p>
]]></description>
    <link>http://www.anandtech.com/show/6747/htc-one-review</link>
    <pubDate>Fri, 05 Apr 2013 20:50:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6747:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>OpenCL Support Coming To Adobe Premiere Pro for Windows</title>
    <author>Ryan Smith</author>
<description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6881/opencl-support-coming-to-adobe-premiere-pro-for-windows"><img src="http://images.anandtech.com/doci/6881/Adobe.png" alt="" /></a></p><p>
	Taking place next week is the National Association of Broadcasters&rsquo; annual trade show, NAB 2013. Though most of the announcements coming out of NAB are for highly specialized products &ndash; rackmount video encoders, broadcast-quality software, etc &ndash; there are usually a few announcements applicable to the wider world. And Adobe and AMD are getting the jump on one of them with an <a href="http://www.amd.com/us/press-releases/Pages/amd-and-adobe-2013apr5.aspx">early announcement of OpenCL support for Premiere Pro</a>.</p>
<p>
	Premiere Pro is Adobe&rsquo;s popular non-linear video editor (NLE), which in version CS5 (2010) added support for a collection of GPU-accelerated effects via Adobe&rsquo;s Mercury Playback Engine. However, support was initially limited to NVIDIA cards due to the engine&rsquo;s use of CUDA, leaving AMD out in the cold; this stemmed in part from the fact that <a href="http://www.anandtech.com/show/3972/nvidia-gtc-2010-wrapup/5">Adobe was not satisfied</a> with the state of OpenCL at the time. On the Mac this changed somewhat in CS6, when Adobe added OpenCL support for some (but not quite all) effects, while the PC version of CS6 continued to be CUDA powered.</p>
<p>
	Jumping forward, with the yet-to-be-named upcoming version of Premiere Pro &ndash; currently dubbed Premiere Pro CS <em>Next</em> &ndash; Adobe is bringing broader OpenCL support to the Windows market, in effect finally enabling hardware processing on AMD GPUs. As is often the case, AMD has been working directly with Adobe to get OpenCL integrated into Premiere Pro, and in fact today&rsquo;s announcement comes by way of AMD rather than Adobe. Adobe for their part isn&rsquo;t saying much about Premiere Pro <em>Next</em> at this time &ndash; traditionally Adobe saves that for their own events &ndash; but at a minimum it looks like OpenCL is coming to parity with CUDA (or close enough). With Adobe consistently working to expand their usage of GPU processing, and with more than a year of experience with AMD&rsquo;s GCN architecture under their belt, it will be interesting to see if Premiere Pro CS <em>Next</em> will add support for new effects on top of OpenCL support for their existing GPU-accelerated effects.</p>
<p>
	Anyhow, for AMD this is of course a big deal. While some other NLEs like Sony Vegas have supported hardware accelerated effects with their cards for some time, Premiere Pro represents a sizable part of the NLE market that they were previously locked out of. Especially since this lets AMD leverage their APU advantage, including both the consumer A-series and the rarely mentioned <a href="http://www.anandtech.com/show/6139/amd-introduces-firepro-a300-a320-apus-trinity-for-graphics-workstations">FirePro APUs</a>. That the A-series is being supported is actually a big deal in and of itself since Premiere Pro CS6&rsquo;s CUDA path only officially supports a <a href="http://www.adobe.com/products/premiere/tech-specs.html">small number</a> of high-end NVIDIA consumer cards, so this marks a major broadening of support on Adobe&rsquo;s part.</p>
<p>
	Finally, <a href="http://blogs.amd.com/work/2013/04/04/sneak-peek-adobe-premiere-pro-next-gpu-performance-testing/">AMD has a blog up</a> offering a sneak peek at performance, though as with any vendor-published benchmarks, they should be taken with a grain of salt. Performance aside, it&rsquo;s interesting to note that Adobe looks to be keeping their CUDA code path, as AMD&rsquo;s test configurations indicate that the NVIDIA cards are using the CUDA code path even on Premiere Pro <em>Next</em>. Having separate code paths is not all that unusual in the professional world, as in cases like these it means each GPU family gets an optimized code path for maximum performance, but it does mean Adobe is putting in extra work to make it happen.</p>
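<p>
	The sketch below is purely illustrative C (none of these names come from Adobe or AMD); it simply shows the general shape of the pattern being described: selecting an optimized per-vendor GPU path at runtime, with a software fallback when neither API is available.</p>
<pre>
/* Illustrative sketch only, not Adobe code or API: pick the best
 * available render path, keeping a tuned CUDA path for NVIDIA hardware
 * and an OpenCL path for other capable GPUs. */
#include &lt;stdio.h&gt;

typedef enum { PATH_SOFTWARE, PATH_CUDA, PATH_OPENCL } render_path;

static render_path select_render_path(int nvidia_gpu_present,
                                       int opencl_gpu_present)
{
    if (nvidia_gpu_present)
        return PATH_CUDA;     /* keep the existing, tuned CUDA path  */
    if (opencl_gpu_present)
        return PATH_OPENCL;   /* new: AMD and other OpenCL devices   */
    return PATH_SOFTWARE;     /* no supported GPU: render on the CPU */
}

int main(void)
{
    static const char *names[] = { "software", "CUDA", "OpenCL" };
    /* Hypothetical machines: NVIDIA-equipped, AMD-equipped, neither. */
    printf("NVIDIA system: %s\n", names[select_render_path(1, 0)]);
    printf("AMD system:    %s\n", names[select_render_path(0, 1)]);
    printf("No GPU:        %s\n", names[select_render_path(0, 0)]);
    return 0;
}
</pre>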
]]></description>
    <link>http://www.anandtech.com/show/6881/opencl-support-coming-to-adobe-premiere-pro-for-windows</link>
    <pubDate>Fri, 05 Apr 2013 13:45:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6881:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>Best Budget Laptops, April 2013</title>
    <author>Jarred Walton</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6880/best-budget-laptops-april-2013"><img src="http://images.anandtech.com/doci/6880/ASUS-A55A-Red (1)_575px.jpg" alt="" /></a></p><p><p>
	Last week we launched a new sort of buyer&rsquo;s guide for AnandTech with our <a href="http://www.anandtech.com/show/6865/best-budget-ultrabook-march-2013">Best Budget Ultrabook</a> recommendation. We&rsquo;ll be fleshing out the &ldquo;best XYZ&rdquo; recommendations for other components and categories over the coming months, but for now my focus is on the notebook sector, and the plan is to have a new recommendation for laptops every Friday. Last week was a budget Ultrabook, and this week is the true budget category for all laptops. Let me know what you&rsquo;d like me to cover next, keeping in mind that there are probably five or six categories of laptop that I&rsquo;ll rotate through on a regular basis.</p>
<p>
	With that out of the way, let&rsquo;s talk briefly about the budget laptop sector. Laptops comprise everything from Chromebooks to ultraportables/thin and lights, and on up to beefy gaming systems. You won&rsquo;t find us recommending a Chromebook as a gaming laptop for what should be obvious reasons, but otherwise it&rsquo;s basically wide open. For the budget category, I&rsquo;m going to try to keep recommendations under $500, with some leeway to go as high as $600 if there&rsquo;s a really special offering. That gives me plenty of choices, and while I&rsquo;ll try to avoid short-term sales, it&rsquo;s difficult to gauge availability if interest suddenly spikes thanks to an article. To that end (and thanks to reader feedback), while there will be a primary recommendation, I&rsquo;m going to throw in a few alternatives as well&mdash;no more of that &quot;one size fits all&quot; funny stuff!</p>
<p>
	Haswell and Richland laptops are still hiding just over the horizon, but I&rsquo;m pretty confident that at least for the next couple months we won&rsquo;t see either new processor challenge the budget category we&rsquo;re looking at today. Besides, Trinity and Ivy Bridge laptops are still able to handle just about anything you might want to run&mdash;for that matter, even Sandy Bridge and Llano can be sufficient. In short, I&rsquo;m not too worried about performance compromises even when looking at sub-$500 laptops.</p>
<p>
	Where you will have to make some sacrifices is in areas like display quality (seriously: are there <em>any</em> budget laptops with good displays out there?), build quality, and perhaps battery life and features. Size is another area where you&rsquo;ll likely end up with a ubiquitous 15.6&rdquo; LCD, or alternatively an 11.6&rdquo; or 10.1&rdquo; netbook. After surveying the options&mdash;I focused mostly on Amazon.com, Newegg.com, and a few other major retailers&mdash;I found quite a few laptops that end up being similar in both features and performance, not to mention price. All things being equal, I&rsquo;d rather have Ivy Bridge than Sandy Bridge, or Trinity than Llano. You can find the older parts for as little as $325-$375 in some cases, but the best option I can find right now comes from ASUS.</p>
<p>
	<strong>Best Budget Notebook: <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV5150%2f">ASUS A55A, $430 (i3-3110M)</a></strong></p>
<p align="center">
	<a href="http://www.anandtech.com/show/6880/best-budget-laptops-april-2013"><img alt="" src="http://images.anandtech.com/doci/6880/ASUS-A55A-Black%20(5)_575px.jpg" /></a></p>
<p>
	The A55A-AH31 is available in <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV5150%2f">black</a>, <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV50V0%2f">blue</a>, <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV18SE%2f">red</a>, <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV196U%2f">pink</a>, or <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009OV192E%2f">white</a>, all with the same core features and specs and mostly with the same price of $430&mdash;the pink model currently goes for $506 while the white offering costs $510. The specs are reasonable as well: Ivy Bridge Core i3-3110M (2.4GHz, no Turbo, HD 4000 iGPU), 4GB RAM, and a 750GB hard drive. You also get two USB 3.0 ports (one USB 2.0) and&mdash;wait for it!&mdash;a &ldquo;glorious&rdquo; 1366x768 display (like I said, you have to compromise somewhere). The laptop is also a bit chunky at 5.8 pounds, but battery life is at least okay at 4-5 hours of moderate use. However you slice it, I find $430 to be an excellent price for a good laptop, and you still get a reasonable keyboard layout and build quality.</p>
<p>
	<strong>Best Budget Gaming Notebook: <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.toshibadirect.com%2ftd%2fb2c%2fcdetland.to%3fpoid%3d2000044676">Toshiba L850D, $500 (A10-4600M)</a></strong></p>
<p>
	If you want something that can handle moderate gaming as well, you have two options: get an Intel system with a discrete GPU from NVIDIA, or buy something with an AMD Trinity APU. My alternate choice is going to take the Trinity route, and it looks like the best way to get Trinity A10 (because A8 and especially A6 tend to be too slow to really do gaming justice) is to go straight to either Toshiba or HP. Of the two, I&rsquo;m going to give the edge to Toshiba, based on pricing.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6880/best-budget-laptops-april-2013"><img alt="" src="http://images.anandtech.com/doci/6880/01_Toshiba_L850D_front_575px.jpg" /></a></p>
<p>
	Sadly, where Toshiba previously had the Satellite L840D, L850D, and L870D (14&rdquo;, 15.6&rdquo;, and 17.3&rdquo;, respectively), it appears only the L850D remains available in AMD trim&mdash;the L840 and L870 are both Intel-only now. The good news is that where pricing on laptops equipped with the AMD A10-4600M tends to hover around $650 (which is frankly too much), <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.toshibadirect.com%2ftd%2fb2c%2fcdetland.to%3fpoid%3d2000044676">Toshiba&rsquo;s L850D starts at $400</a> with an A6-4400M and you can upgrade to the A10-4600M for $100. The base configuration ends up being an A10-4600M, 4GB DDR3-1600, 640GB 5400RPM HDD, 1366x768 LCD, and all the other typical accessories. Note that there&rsquo;s currently a $150 instant rebate going on (and it&rsquo;s frequently around), so at present the $500 offer lasts through April 8, 2013.</p>
<p>
	The HP offerings are virtually identical to Toshiba in features, but now you can choose between a 15.6&rdquo; <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.shopping.hp.com%2fen_US%2fhome-office%2f-%2fproducts%2fLaptops%2fHP-ENVY%2fB5Y73AV%3bpgid%3dc7twGfjc0ptSRpIq7ZUcoGXQ0000Czl2cCIT%3bsid%3dIYPFMKrUzZqZNPuJXm-fpHPbFZ8AZZ7I5NaL7ExxK1hImyGv-aje_jjH%3fHP-ENVY-dv6z-7200-Notebook-PC">HP ENVY dv6z-7200</a> ($530 base price, $630 with A10-4600M) or a 17.3&rdquo; <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.shopping.hp.com%2fen_US%2fhome-office%2f-%2fproducts%2fLaptops%2fHP-Pavilion%2fB8B23AV%3bpgid%3dc7twGfjc0ptSRpIq7ZUcoGXQ0000yIAIG8aG%3bsid%3dwYnY5y0bgKYu5nxGurl_b_UU9ZUdsifATCKWO8u-y1JVTKZgGaLdE_LT%3fHP-Pavilion-g7z-2200-Notebook-PC">HP Pavilion g7z-2200</a> ($480 base price, $580 with A10-4600M). Both of those upgraded prices push the limits of what I would consider &ldquo;budget&rdquo;, but they&rsquo;re still reasonable alternatives if you want a laptop that can handle most games at moderate to high detail settings. The dv6z comes with a base configuration that&rsquo;s slightly higher than the g7z, incidentally: 6GB RAM and a 640GB HDD compared to 4GB RAM and 500GB HDD. The LCDs are also obviously different, and HP even offers a 1080p upgrade on the dv6z, but at $150 it&rsquo;s definitely not budget material; the 17.3&rdquo; panel is 1600x900 while the stock 15.6&rdquo; panel is the bog standard 1366x768. That said, the HD 7660G iGPU in the A10 APU is better suited to gaming at 768p than it is 900p.</p>
<p>
	There are a few likely reasons for the current pricing. One explanation is that people have been largely underwhelmed with Windows 8, so discounting the laptops can help move product. I&rsquo;m not a huge fan of Windows 8, but I&rsquo;ve found that installing <a href="http://www.classicshell.net/">Classic Shell</a> or <a href="http://www.stardock.com/products/start8/">Start 8</a> is enough to fix 90% of my gripes (YMMV). Another likely factor is that the laptop OEMs are working hard to clear out existing inventory before the upgraded Haswell and Richland models arrive, and hopefully we&rsquo;ll start to see Richland before the end of May (if not sooner).</p>
<p>
	<strong>Honorable Mention: <a href="http://play.google.com/store/devices/details?id=chromebook_acer_c710&amp;utm_campaign=en&amp;utm_source=en-ha-na-us-plas&amp;utm_medium=ha">Acer C7 Chromebook, $200 (Celeron 867)</a></strong></p>
<p align="center">
	<a href="http://www.anandtech.com/show/6880/best-budget-laptops-april-2013"><img alt="" src="http://images.anandtech.com/doci/6880/Acer%20AC710%20left%20facing_575px.jpg" /></a></p>
<p>
	There are other options of course, mostly with even older hardware, or slower and more specialized hardware. Acer&rsquo;s latest C7 Chromebook has been selling really well, thanks largely to the <a href="https://play.google.com/store/devices/details?id=chromebook_acer_c710&amp;utm_campaign=en&amp;utm_source=en-ha-na-us-plas&amp;utm_medium=ha">$200 price tag</a>. If you&rsquo;re tied into the cloud rather than doing local storage, it can be a great alternative to more expensive laptops, plus you&rsquo;re less likely to get distracted by games (given the rather poor performance and support for such). It&rsquo;s definitely more compelling than an Atom-based netbook, and ChromeOS requires far less of the Celeron CPU than Windows 8.</p>
<p>
	<strong>Honorable Mention: <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.officemax.com%2ftechnology%2fcomputers%2flaptop-computers%2fproduct-prod4040009">Lenovo G570, $330 (i3-2370M)</a></strong></p>
<p align="center">
	<a href="http://www.anandtech.com/show/6880/best-budget-laptops-april-2013"><img alt="" src="http://images.anandtech.com/doci/6880/Lenovo%20G570%204334DBU%2015.6-Inch%20Laptop%20(Black)_575px.jpg" /></a></p>
<p>
	Another really inexpensive laptop to consider is the Lenovo G570, currently <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.officemax.com%2ftechnology%2fcomputers%2flaptop-computers%2fproduct-prod4040009">selling for a mere $330 at OfficeMax</a>. You get a Sandy Bridge i3-2370M CPU, 4GB RAM, 500GB HDD, and around 5 hours of battery life. Performance should still be reasonable compared to Ivy Bridge, though the HD 3000 graphics are a big step down from HD 4000 so even light gaming is almost too much to expect.</p>
<p>
	There are plenty of other laptops in the $300 range with AMD&rsquo;s C-series and E-series APUs (not to mention the dog that is Intel Atom), but while battery life might be good, just about everything else is too slow for me to personally recommend such a laptop. Then again, people are okay with tablets that offer even less performance in many cases, so consider your own wants and shop accordingly. In the meantime, if you have a favorite budget laptop that you feel we&rsquo;ve neglected, by all means let us know in the comments, and as noted above, let me know what category of laptops you&rsquo;d like me to analyze next Friday.</p>
<div>Gallery: <a href="/Gallery/Album/2717" target="_blank">Best Budget Laptops, April 2013</a><div><a href="/Gallery/Album/2717#1" target="_blank"><img src="http://images.anandtech.com/galleries/2717/02_Toshiba_L850D_front-right_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2717#2" target="_blank"><img src="http://images.anandtech.com/galleries/2717/03_Toshiba_L850D_front-left_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2717#3" target="_blank"><img src="http://images.anandtech.com/galleries/2717/04_Toshiba_L850D_low-angle_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2717#4" target="_blank"><img src="http://images.anandtech.com/galleries/2717/05_Toshiba_L850D_top_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2717#5" target="_blank"><img src="http://images.anandtech.com/galleries/2717/06_Toshiba_L850D_right-edge_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2717#6" target="_blank"><img src="http://images.anandtech.com/galleries/2717/07_Toshiba_L850D_left-edge_thumb.jpg" width="85" height="85" border="0"/></a></div></div></p>]]></description>
    <link>http://www.anandtech.com/show/6880/best-budget-laptops-april-2013</link>
    <pubDate>Fri, 05 Apr 2013 00:50:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6880:news</guid>
    <category><![CDATA[ Mobile]]></category>
</item>  
    
    
<item>
    
        <title>Facebook Announces Home, and HTC First</title>
    <author>Brian Klug</author>
<description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6879/facebook-announces-home-and-htc-first"><img src="http://images.anandtech.com/doci/6879/facebook_575px.jpg" alt="" /></a></p><p>
	Today saw some interesting news from Facebook, which announced a new Android experience named Facebook Home, alongside a new handset from HTC that will run it out of the box: the rather ironically named HTC First. This isn&#39;t Facebook or HTC&#39;s first attempt at a Facebook-specific handset either, if you recall the HTC Status, codenamed ChaCha.</p>
<p>
	First is Home, which is part Android launcher replacement, part Facebook skin. Home replaces the lock screen, application launcher, notifications, and chat system with Facebook-designed counterparts. Home overlays Facebook and Instagram views atop the lock screen, statuses and stories alike. The UI looks like a drastic departure from Android and takes cues from webOS, with lots of transparent regions. Facebook has dubbed its chat interface on Home &#39;Chat Heads&#39;; it intercepts SMS notifications from Android and augments them with Facebook chat as well, and these persist throughout the UI atop active applications. Facebook has videos explaining <a href="http://www.youtube.com/watch?v=tWKE0HTl0ig">Cover Feed</a>, <a href="http://www.youtube.com/watch?v=HKyO0hJEp-g">Notifications</a>, and <a href="http://www.youtube.com/watch?v=9p_y9dAK94Q">Chat Heads</a> on their YouTube page.</p>
<p>
	Rather than build a complete platform from scratch, it makes sense for Facebook to leverage Android and deliver something like Home, which essentially is part launcher replacement, part UI skin. Home will be available on April 12 through the Play Store for a limited subset of devices - the HTC One X, X+, Samsung Galaxy S 3, and Note 2, with the Galaxy S 4 and HTC One following in the coming months. What&#39;s interesting is that, on the subset of devices Home will work with, it&#39;s a bit more than a launcher replacement, since it intercepts some notifications from the OS.</p>
<p>
	The other part of the announcement is an HTC-made phone called the HTC First. It&#39;s a midrange HTC handset which will be, well, the first home to Facebook Home (First, get it? Phew). I&#39;ve put together a table with the specifications of the HTC First.</p>
<div align="center">
	<table align="center" border="0" cellpadding="0" cellspacing="1" width="575">
		<tbody>
			<tr class="tgrey">
				<td align="center" colspan="5">
					HTC First Specifications</td>
			</tr>
			<tr class="tlblue">
				<td class="tlgrey" width="180">
					Device</td>
				<td align="center" valign="middle" width="392">
					HTC First</td>
			</tr>
			<tr>
				<td class="tlgrey">
					SoC</td>
				<td align="center" valign="middle">
					1.4 GHz Snapdragon 400<br />
					(MSM8930AA, 2x Krait, Adreno 305 GPU)</td>
			</tr>
			<tr>
				<td class="tlgrey">
					RAM/NAND/Expansion</td>
				<td align="center" valign="middle">
					1 GB LPDDR2, 16 GB Storage with USB-OTG</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Display</td>
				<td align="center" valign="middle">
					4.3-inch 720p LCD</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Network</td>
				<td align="center" valign="middle">
					GSM/EDGE: 850/900/1800/1900 MHz<br />
					WCDMA: 850/1900/2100 MHz<br />
					LTE: 700/850/AWS/1900 MHz</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Dimensions</td>
				<td align="center" valign="middle">
					4.96 x 2.56 x 0.35 inches, 4.37 oz</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Camera</td>
				<td align="center" valign="middle">
					5.0 MP F/2.0 (Rear), 1.6 MP F/2.2 (Front)</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Battery</td>
				<td align="center" valign="middle">
					2000 mAh 3.8V</td>
			</tr>
			<tr>
				<td class="tlgrey">
					OS</td>
				<td align="center" valign="middle">
					Android 4.1 with Facebook Home</td>
			</tr>
			<tr>
				<td class="tlgrey">
					Connectivity</td>
				<td align="center" valign="middle">
					NFC, BT 4.0, 802.11a/b/g/n</td>
			</tr>
		</tbody>
	</table>
</div>
<p>
	The First is at present officially an AT&amp;T exclusive, which is further backed up by the presence of Band 17 LTE. AT&amp;T is taking <a href="http://att.com/htcfirst">preorders</a> for the HTC First, which will go on sale for $99 with a two-year contract starting April 12th.</p>
<p>
	<strong>Update:</strong> I clarified the LTE band situation on the HTC First, which reflects the newer AT&amp;T RFP with LTE on both its current WCDMA bands (850/1900 MHz) in addition to Band 17 (700) and 4 (AWS 1700/2100 MHz).</p>
<p>
	Source: <a href="http://www.htc.com/www/smartphones/htc-first/">HTC</a> (First Page), <a href="http://att.com/htcfirst">AT&amp;T</a> (Preorders), <a href="http://newsroom.fb.com/News/597/Introducing-Home">Facebook</a> (Introducing Home)</p>
<p>
	<div>Gallery: <a href="/Gallery/Album/2713" target="_blank">Facebook Announces Home, and HTC First</a><div><a href="/Gallery/Album/2713#1" target="_blank"><img src="http://images.anandtech.com/galleries/2713/Chat-Head-Preview_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2713#2" target="_blank"><img src="http://images.anandtech.com/galleries/2713/Chat-Head-Thread_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2713#3" target="_blank"><img src="http://images.anandtech.com/galleries/2713/CoverFeed_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2713#4" target="_blank"><img src="http://images.anandtech.com/galleries/2713/Launcher_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2713#5" target="_blank"><img src="http://images.anandtech.com/galleries/2713/Notifications_thumb.jpg" width="85" height="85" border="0"/></a></div></div></p>
]]></description>
    <link>http://www.anandtech.com/show/6879/facebook-announces-home-and-htc-first</link>
    <pubDate>Thu, 04 Apr 2013 15:23:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6879:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>The Gadget Show Live, April 2013: Technology in the UK</title>
    <author>Ian Cutress</author>
    <description><![CDATA[ <p>
	On Tuesday this week I went to The Gadget Show Live, a trade and public show about technology and entrepreneurs in the UK. There are some interesting developments in home-grown talent...</p>
]]></description>
    <link>http://www.anandtech.com/show/6878/the-gadget-show-live-april-2013-technology-in-the-uk</link>
    <pubDate>Thu, 04 Apr 2013 06:30:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6878:news</guid>
    <category><![CDATA[ Trade Shows]]></category>
</item>  
    
    
<item>
    
        <title>The Great Equalizer 3: How Fast is Your Smartphone/Tablet in PC GPU Terms</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p>
	For the past several days I&#39;ve been playing around with Futuremark&#39;s new <a href="http://www.anandtech.com/show/6876/microsofts-surface-pro-vs-android-devices-in-3dmark">3DMark for Android</a>, as well as <a href="http://www.anandtech.com/show/6872/the-great-equalizer-apple-android-windows-tablets-compared-using-gldxbenchmark-27">Kishonti&#39;s GL and DXBenchmark 2.7</a>. All of these tests are scheduled to be available on Android, iOS, Windows RT and Windows 8 - giving us the beginning of a very wonderful thing: a set of benchmarks that allow us to roughly compare mobile hardware across (virtually) all OSes. The computing world is headed for convergence in a major way, and with benchmarks like these we&#39;ll be able to better track everyone&#39;s progress as the high performance folks go low power, and the low power folks aim for higher performance.</p>
<p>
	The previous two articles I did on the topic were really focused on comparing smartphones to smartphones, and tablets to tablets. What we&#39;ve been lacking however has been perspective. On the CPU side we&#39;ve known how fast Atom was for quite a while. Back in 2008 I concluded that a 1.6GHz single core Atom processor delivered performance similar to that of a 1.2GHz Pentium M, or a mainstream Centrino notebook from 2003. Higher clock speeds and a second core would likely push that performance forward by another year or two at most. Given that most of the ARM based CPU competitors tend to be a bit slower than Atom, you could estimate that any of the current crop of smartphones delivers CPU performance somewhere in the range of a notebook from 2003 - 2005. Not bad. But what about graphics performance?</p>
]]></description>
    <link>http://www.anandtech.com/show/6877/the-great-equalizer-part-3</link>
    <pubDate>Thu, 04 Apr 2013 01:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6877:news</guid>
    <category><![CDATA[ Tablets]]></category>
</item>  
    
    
<item>
    
        <title>MyDigitalSSD BP4 2.5&quot; &amp; mSATA (240GB) Review</title>
    <author>Kristian Vättö</author>
    <description><![CDATA[ <p>
	When I reviewed <a href="http://www.anandtech.com/show/6527/mydigitalssd-smart-bp3-msata-ssd-review">MyDigitalSSD&#39;s BP3</a>, I have to say I was positively surprised. A relatively unknown manufacturer combined with a Phison controller is not the most promising mix. With SandForce you at least know what to expect, but our experience with Phison-based SSDs is limited, and <a href="http://www.anandtech.com/show/6325/crucial-v4-256gb-review">Crucial&#39;s v4</a> definitely didn&#39;t build a golden image of Phison as a controller maker, which made me very skeptical about the BP3 when I first got it. Fortunately, MyDigitalSSD proved me wrong. The BP3 turned out to be not the highest-performing drive, but rather a very good bang for the buck. It was noticeably cheaper than other mSATA offerings on the market, which made it an alluring option for value-oriented mSATA buyers.</p>
<p style="text-align: center;">
	<a href="http://www.anandtech.com/show/6809/mydigitalssd-bp4-25-msata-240gb-review"><img alt="" src="http://images.anandtech.com/doci/6809/cover_575px.jpg" /></a></p>
<p>
	Almost immediately after our BP3 and SMART review went up, MyDigitalSSD told me that the successor to the BP3 was just around the corner: the BP4. From a hardware standpoint not much has changed in the BP4; the only major change is the move from 24nm to 19nm NAND. However, there have been some big changes on the firmware front, and MyDigitalSSD is promising some pretty impressive performance figures and very affordable prices. Do their claims hold up? Read on to find out!</p>
]]></description>
    <link>http://www.anandtech.com/show/6809/mydigitalssd-bp4-25-msata-240gb-review</link>
    <pubDate>Wed, 03 Apr 2013 11:13:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6809:news</guid>
    <category><![CDATA[ Storage]]></category>
</item>  
    
    
<item>
    
        <title>The Great Equalizer Part 2: Surface Pro vs. Android Devices in 3DMark</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p>
	While we&#39;re still waiting for Windows RT and iOS versions of the latest 3DMark, there is one cross-platform comparison we can make: Ivy Bridge/Clover Trail to the Android devices we just tested in 3DMark.</p>
]]></description>
    <link>http://www.anandtech.com/show/6876/microsofts-surface-pro-vs-android-devices-in-3dmark</link>
    <pubDate>Tue, 02 Apr 2013 14:26:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6876:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>3DMark for Android: Performance Preview</title>
    <author>Anand Lal Shimpi &amp; Brian Klug</author>
    <description><![CDATA[ <p>
	As I mentioned in&nbsp;<a href="http://www.anandtech.com/show/6872/the-great-equalizer-apple-android-windows-tablets-compared-using-gldxbenchmark-27">our coverage of GL/DXBenchmark 2.7</a>, with the arrival of Windows RT/8 we&#39;d finally see our first truly cross-platform benchmarks. Kishonti was first out of the gate, although Futuremark was first to announce its cross-platform benchmark, simply called 3DMark.</p>
<p>
	Currently available for x86 Windows 8 machines, Futuremark has Android, iOS and Windows RT versions of 3DMark nearing release. Today the embargo lifts on the&nbsp;<a href="">Android version of 3DMark</a>, with iOS and Windows RT to follow shortly.</p>
]]></description>
    <link>http://www.anandtech.com/show/6875/3dmark-for-android-performance-preview</link>
    <pubDate>Tue, 02 Apr 2013 07:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6875:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>The AnandTech Podcast: Episode 18</title>
    <author>Anand Lal Shimpi</author>
<description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6874/the-anandtech-podcast-episode-18"><img src="http://images.anandtech.com/doci/6874/Podcast_A.jpg" alt="" /></a></p><p>
	We are back with the first half of our return to podcasting. While Brian finishes up his review of the One, Ryan and I took some time to recap (almost) everything in the world of GPUs for the past couple of weeks. We&#39;ll be doing a mobile-focused podcast after the One review hits to discuss it, the SGS4, and more.</p>
<p style="text-align: center; ">
	<audio controls="" src="http://images.anandtech.com/reviews/Podcast/AnandTech_Podcast_018.mp3" tabindex="0"></audio></p>
<p>
	<em><strong>The AnandTech Podcast - Episode 18</strong></em><br />
	<em>featuring Anand Shimpi &amp; Ryan Smith</em></p>
<p>
	<em><a href="http://itunes.apple.com/us/podcast/anandtech-podcast-m4a-feed/id554054212">iTunes</a><br />
	RSS -&nbsp;<a href="http://www.anandtech.com/rss/podcastmp3">mp3</a>,&nbsp;<a href="http://www.anandtech.com/rss/podcastm4a">m4a</a><br />
	Direct Links -&nbsp;<a href="http://images.anandtech.com/reviews/Podcast/AnandTech_Podcast_018.mp3">mp3</a>,&nbsp;<a href="http://images.anandtech.com/reviews/Podcast/AnandTech_Podcast_018.m4a">m4a</a></em></p>
<p>
	<em>Total Time: &nbsp;1 hour &nbsp;5 minutes</em></p>
<p>
	<em>Outline - hh:mm</em></p>
<div>
	The AnandTech Redesign - 00:00</div>
<div>
	NVIDIA&#39;s GTC 2013 - 00:14</div>
<div>
	NVIDIA&#39;s Maxwell &amp; Volta (2014/2016 GPU Architectures) - 00:18</div>
<div>
	NVIDIA&#39;s Kayla - 00:22</div>
<div>
	Predicting Logan&#39;s GPU Performance - 00:24</div>
<div>
	AMD Dev Rel at GDC 2013 - 00:27</div>
<div>
	AMD&#39;s Radeon Sky &amp; Cloud Gaming - 00:35</div>
<div>
	AMD&#39;s Radeon HD 7790 - 00:41</div>
<div>
	NVIDIA GeForce GTX 650 Ti Boost - 00:43</div>
<div>
	The Sweet Spot between $100 and $200 - 00:47</div>
<div>
	FCAT - 00:51</div>
<p>
	As always, comments are welcome and appreciated.&nbsp;</p>
]]></description>
    <link>http://www.anandtech.com/show/6874/the-anandtech-podcast-episode-18</link>
    <pubDate>Mon, 01 Apr 2013 21:55:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6874:news</guid>
    <category><![CDATA[ Podcast]]></category>
</item>  
    
    
<item>
    
        <title>NVIDIA’s GeForce 700M Family: Full Details and Specs</title>
    <author>Jarred Walton</author>
    <description><![CDATA[ <p>
	With spring now well under way and the pending launch of Intel&rsquo;s Haswell chips, OEMs always like to have &ldquo;new&rdquo; parts across the board, and so once more we&rsquo;re getting a new series of chips from NVIDIA, the 700M parts. We&rsquo;ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA&rsquo;s very successful <a href="http://www.anandtech.com/show/5697/nvidias-geforce-600m-series-keplers-and-fermis-and-die-shrinks-oh-my">launch of mobile Kepler</a>; since that time, the number of laptops shipping with NVIDIA dGPUs compared to AMD dGPUs appears to have shifted even more in NVIDIA&rsquo;s favor. Not surprisingly, with TSMC still on 28nm, NVIDIA isn&rsquo;t launching a new architecture, but they&rsquo;ll be tweaking Kepler to keep it going through 2013.</p>
]]></description>
    <link>http://www.anandtech.com/show/6873/nvidias-geforce-700m-family-full-details-and-specs</link>
    <pubDate>Mon, 01 Apr 2013 09:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6873:news</guid>
    <category><![CDATA[ Mobile]]></category>
</item>  
    
    
<item>
    
        <title>The Great Equalizer: Apple, Android &amp; Windows Tablet GPUs Compared using GL/DXBenchmark 2.7</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p>
	For the past few years we have been lamenting the state of benchmarks for mobile platforms. The constant refrain from those who had been around long enough to remember when all PC benchmarks were terrible was to wait for the release of Windows 8 and RT. The release of those two OSes would bring many of the traditional PC benchmark vendors into the fray. While we&#39;re expecting to see new Android, iOS, Windows RT and Windows 8 benchmarks from Futuremark and Rightware, it&#39;s our old friends at Kishonti who are first out of the gate with a cross-OS/API/platform benchmark. GLBenchmark has existed on both Android and iOS for a while now, but we&#39;re finally able to share information and performance data using DXBenchmark - GLB&#39;s analogue for Windows RT/8.</p>
]]></description>
    <link>http://www.anandtech.com/show/6872/the-great-equalizer-apple-android-windows-tablets-compared-using-gldxbenchmark-27</link>
    <pubDate>Sun, 31 Mar 2013 23:58:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6872:news</guid>
    <category><![CDATA[ Tablets]]></category>
</item>  
    
    
<item>
    
        <title>A Comment on PC Gaming Battery Life</title>
    <author>Vivek Gowri</author>
<description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life"><img src="http://images.anandtech.com/doci/6871/_DSC1610_575px_575px.JPG" alt="" /></a></p><p>
	During the process of writing <a href="http://www.anandtech.com/show/6858/the-razer-edge-review">the Razer Edge review</a>, I spent a lot of my time gaming on battery. The Edge is marketed as being a mobile PC gaming console, and is pretty well suited in that role with one caveat - battery life. Razer quotes 1-2 hours of gaming battery life on the internal 41.44Wh battery, with those figures doubling when the extended battery is inserted in the gamepad controller. The range makes sense; playing Angry Birds would be understandably less strenuous than, say, Skyrim or Crysis.&nbsp;</p>
<p>
	In real-world testing that holds up - I saw just over two hours of Dirt 3 playing time, and around 3.5 hours when playing the decade-old Quake III: Arena. But something I missed was that Jarred had actually developed and done some repeatable, instrumented gaming battery life testing in his <a href="http://www.anandtech.com/show/5772/mobile-ivy-bridge-and-asus-n56vm-preview/7">preview of the ASUS N56VM</a>, one of the first systems we tested with the mobile Ivy Bridge platform. I can&rsquo;t honestly remember why we didn&rsquo;t put more systems through this test, but such is life.&nbsp;</p>
<p>
	The test itself is pretty simple: looping the four 3DMark06 gaming tests at 1366x768 in the balanced power profile and the display set to 100nits, with the GPU specifically set in the balanced performance setting (usually by default on battery it&rsquo;s set to maximum battery saving except in the High Performance profile). Jarred ran the test on the N56VM twice, once with the HD 4000 and once with the Fermi-based 40nm GT 630M that our international-spec N56VM test unit had, as well as the Sandy Bridge-based ASUS K53E (i5-2520M and HD3000) and the Compal-built AMD Llano reference platform that we looked at way back in June 2011.&nbsp;</p>
<p>
	Naturally, my first inclination was to run it on my Edge evaluation unit - so I did. Twice, in fact, both with and without the extended battery. I also had a Sony VAIO T13 ultrabook on hand, a pretty run of the mill entry-level ultrabook from summer 2012, so I ran that too. The spec rundown: i5-3317U, HD 4000, 4GB of memory, 500GB 5400RPM hard drive, 32GB SSD cache, 45Wh battery, a mostly terrible 1366x768 13.3&rdquo; TN display, and Windows 7. Advance apologies for not having a more recent AMD-based system in this comparison, ideally I&rsquo;d have a Trinity system to compare against but I&rsquo;m on the road and had to go with what I had near me.&nbsp;</p>
<p>
	<a href="http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life"><img alt="Battery Life - Gaming" src="http://images.anandtech.com/graphs/graph6871/53928.png" /></a></p>
<p>
	<a href="http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life"><img alt="Battery Life - Gaming Efficiency" src="http://images.anandtech.com/graphs/graph6871/53927.png" /></a></p>
<p>
	The Edge checks in at 1:12 on the internal 41.44Wh battery and 2:20 with the extended battery (82.88Wh combined capacity), roughly where I expected given the <a href="http://www.anandtech.com/show/6858/the-razer-edge-review/6">real-world testing</a> done previously. That works out to efficiency in the 1.75-1.8 minutes per watt-hour range. The ultrabook platform is a good deal more efficient than the Edge, which makes sense given the power consumption delta between GT 640M LE and HD 4000, but at the cost of substantially reduced performance. The Edge would likely hit close to the same 2.5 minutes per watt-hour number as the ultrabook if the discrete graphics were disabled and the test run on the HD 4000.</p>
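<p>
	For anyone unfamiliar with the minutes-per-watt-hour figure used here and in the graphs, it is simply run time divided by battery capacity. The back-of-the-envelope sketch below (plain C, using only the run times quoted above) shows the arithmetic; the graphed numbers come from the instrumented runs themselves, so they may round slightly differently.</p>
<pre>
/* Back-of-the-envelope check of the minutes-per-watt-hour metric,
 * using the Razer Edge run times and capacities quoted in the text. */
#include &lt;stdio.h&gt;

static double min_per_wh(int hours, int minutes, double capacity_wh)
{
    return (hours * 60 + minutes) / capacity_wh;
}

int main(void)
{
    /* 1:12 on the 41.44Wh internal battery, 2:20 on 82.88Wh combined. */
    printf("internal battery: %.2f min/Wh\n", min_per_wh(1, 12, 41.44));
    printf("with extended:    %.2f min/Wh\n", min_per_wh(2, 20, 82.88));
    return 0;
}
</pre>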
<p>
	The point of comparison that I&#39;m really interested in is actually the AMD platform. I wish I had a Trinity system nearby to run this on, but Llano does pretty well from an efficiency standpoint, and a system based around the more powerful Trinity could be a very viable alternative. &nbsp;It&#39;s a platform that seems pretty well suited to the demands of mobile gaming, with a good balance between power consumption and graphics performance. I know that Razer has pretty close ties with both Intel and especially Nvidia, so I never expected them to go the Trinity route, but it&#39;d be interesting to see a different company explore it.</p>
<p>
	<strong>Update:</strong> Jarred also ran the same test on the <a href="http://www.anandtech.com/show/5831/amd-trinity-review-a10-4600m-a-new-hope/8">AMD Trinity prototype</a>. Turns out Trinity actually does worse in this test than Llano, likely thanks to the higher performance GPU. That of course was a prototype system, so performance and battery life with a retail Trinity platform might prove to be better.</p>
<p>
	<a href="http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life"><img alt="" src="http://images.anandtech.com/doci/6871/_DSC1593_678x452.JPG" /></a></p>
<p>
	Two and a half hours of real gaming isn&#39;t great, but to be honest, considering the power draw and sheer amount of battery capacity on board with the extended battery, I&rsquo;m not sure that anything else can top that number right now. There just isn&#39;t another system that can hit 1.8 minutes per watt-hour while gaming with a battery larger than 80Wh. The cut-down version of HD 4000 in the ultrabook platform is more power efficient, but the performance tradeoffs are simply too significant to consider it adequate for gaming unless the titles you are playing are quite old. And even then, there aren&#39;t any ultrabooks with more battery capacity than the Edge offers.&nbsp;</p>
<p>
	What needs to be kept in mind here is that gaming essentially represents the worst-case real world usage scenario for battery life. Doesn&rsquo;t matter what type of device, you&rsquo;ll blow through the battery pretty quickly if you&rsquo;re gaming on it, even if it&rsquo;s just Fruit Ninja. My Galaxy Nexus has a Gameboy emulator runtime of roughly 4.5 hours, which is pretty awful considering that Pokemon Silver is one of the least graphically-intensive games out there. Our GLBenchmark-based 3D battery life test for <a href="http://images.anandtech.com/graphs/graph6330/50111.png">phones</a> and <a href="http://images.anandtech.com/graphs/graph6472/51767.png">tablets</a> sheds some light on just how quickly it can drain - less than 6 hours for both generations of Retina iPad, a bit under 4 hours for the Nexus 7, 3 hours and 9 minutes for the iPhone 5, just over two hours for both versions of HTC One X (Tegra 3 and Snapdragon). For a quick comparison versus a dedicated handheld gaming device, Sony quotes the PS Vita gaming battery life in the 3-5 hour range, and real world reports commonly place it around 3.5-4 hours.&nbsp;</p>
<p>
	<a href="http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life"><img alt="" src="http://images.anandtech.com/doci/6871/_DSC1536_575px_575px.JPG" /></a></p>
<p>
	So while two hours may seem short, for a device running full PC games on real PC hardware at respectable (read: playable) framerates, that&rsquo;s actually about as good as it gets in today&rsquo;s world. That should improve going forward - Haswell&rsquo;s idle power improvements won&rsquo;t have an impact, but as GPUs become more efficient, attaining this level of performance will require less power. But as GPU performance becomes &ldquo;cheaper&rdquo; from a power envelope standpoint, an increase in display resolution starts to make sense, and then we arrive back at the battery life conversation. I expect a lot of the current tablet PC issues to be fixed by Haswell (idle power consumption, Thunderbolt, etc), but the shrink to 14nm in Broadwell and Skylake will probably be what gets us the best of both worlds from a performance and power draw perspective. For now, it&rsquo;s hard to knock the Edge for battery life - it simply faces limitations from power and thermal standpoints that apply to every PC on the market right now, and makes some pretty logical compromises based on the technology available.&nbsp;</p>
]]></description>
    <link>http://www.anandtech.com/show/6871/a-comment-on-pc-gaming-battery-life</link>
    <pubDate>Sat, 30 Mar 2013 08:35:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6871:news</guid>
    <category><![CDATA[ Mobile]]></category>
</item>  
    
    
<item>
    
        <title>Capsule Review: Logitech&#39;s G100s, G500s, and G700s Gaming Mice</title>
    <author>Dustin Sklavos</author>
    <description><![CDATA[ <p>
	The dirty secret of gaming peripherals is that if they&#39;re good quality products in general, they&#39;re often going to be head and shoulders above hardware marketed toward the regular consumer. For whatever reason, high rent keyboards and mice just aren&#39;t marketed to consumers who&#39;ll often settle on an inexpensive wireless mouse and keyboard combination. This was strangely evident in Logitech&#39;s pre-G-branding era, and while the G branding is ultimately a good thing, some users are liable to miss out on some fantastic quality kit.</p>
]]></description>
    <link>http://www.anandtech.com/show/6855/capsule-review-logitechs-g100s-g500s-and-g700s-gaming-mice</link>
    <pubDate>Sat, 30 Mar 2013 00:01:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6855:news</guid>
    <category><![CDATA[ Mouse]]></category>
</item>  
    
    
<item>
    
        <title>A Visual Guide to the iPhone 5 on T-Mobile</title>
    <author>Anand Lal Shimpi</author>
<description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6866/test-iphone5-tmo"><img src="http://images.anandtech.com/doci/6866/iPhone5-284_575px.jpg" alt="" /></a></p><p style="text-align: left; ">
	<span style="text-align: left; ">This is normally Brian&#39;s beat but with him busy putting the finishing touches on his review of <a href="http://anandtech.com/show/6851/the-htc-one-a-remarkable-device-anands-mini-review">HTC&#39;s One</a>, I thought I&#39;d help out. We&#39;re still seeing (and hearing) a lot of confusion about what T-Mobile announced earlier this week with regards to existing and future iPhone support on its network. Brian <a href="http://www.anandtech.com/show/6860/apple-to-ship-updated-a1428-iphone-5-with-aws-wcdma-enabled-for-tmobile-usa">already went through all of this in his excellent article</a> on the topic, but seeing continued confusion I thought I&#39;d whip up a few diagrams to help explain.</span></p>
<p style="text-align: left; ">
	For the purposes of this article I&#39;m focusing on compatibility for the current AT&amp;T iPhone 5 (hardware model A1428) as well as the new unlocked iPhone 5 (also hardware model A1428) that will be shipping starting April 12.</p>
<p style="text-align: left; ">
	The easiest question to answer is whether existing AT&amp;T iPhone 5s that have been unlocked will work on T-Mobile&#39;s recently deployed LTE network. The answer is an emphatic yes. The original AT&amp;T iPhone 5 was designed to work on LTE band 17 (700MHz) and band 4 (1700MHz), a superset of T-Mobile&#39;s LTE deployment (band 4). If you&#39;re in one of the few areas with T-Mobile LTE service and have an existing unlocked AT&amp;T iPhone 5, the combination will work just fine. Apple will need to release an updated carrier bundle (.ipcc file) for the phones, which I assume is coming soon - but there&#39;s no hardware change required.</p>
<p style="text-align: center; ">
	<a href="http://www.anandtech.com/show/6866/test-iphone5-tmo"><img height="343" id="LTE" src="http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/lte.png" width="512" /></a><br />
	<input onclick="document.getElementById('LTE').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/lte.png'" type="button" value="LTE Deployment" /> <input onclick="document.getElementById('LTE').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/lte-original.png'" type="button" value="Original iPhone 5 (A1428)" /> <input onclick="document.getElementById('LTE').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/lte-new.png'" type="button" value="New iPhone 5 (A1428)" /></p>
<p>
	The new unlocked iPhone 5 that will be available via T-Mobile doesn&#39;t add any additional functionality in this case. As you can see, both A1428 revisions support the same LTE bands.</p>
<p>
	Where it gets <em>somewhat</em>&nbsp;complicated is in the 3G WCDMA discussion. I emphasize somewhat because it&#39;s really not that hard to understand. The complexity comes from the fact that there are a number of names and acronyms here that aren&#39;t well understood by most who aren&#39;t of Klug-descent. If we focus on the frequency bands themselves and ignore their common names, things are a bit easier to understand.</p>
<p>
	The original AT&amp;T iPhone 5 supported 3G operation on band 5 (850MHz) and band 2 (1900MHz). Only band 2 overlaps with T-Mobile&#39;s network. The problem with 1900MHz on T-Mobile is that the majority of that spectrum is used for 2G and hasn&#39;t yet been migrated over to 3G. The bulk of T-Mobile&#39;s 3G currently exists in band 4 (1700MHz uplink, 2100MHz downlink), which isn&#39;t supported on the existing AT&amp;T iPhone 5.</p>
<p>
	After April 12th, the new unlocked A1428 iPhone 5 with band 4 WCDMA support will begin rolling out and should have much better coverage on T-Mobile&#39;s 3G network as a result. The diagram and toggles below help illustrate this:</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6866/test-iphone5-tmo"><img height="343" id="WCDMA" src="http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/wcdma.png" width="512" /></a><br />
	<input onclick="document.getElementById('WCDMA').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/wcdma.png'" type="button" value="WCDMA Deployment" /> <input onclick="document.getElementById('WCDMA').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/wcdma-original.png'" type="button" value="Original iPhone 5 (A1428)" /> <input onclick="document.getElementById('WCDMA').src='http://images.anandtech.com/reviews/smartphones/apple/iphone5tmo/wcdma-new.png'" type="button" value="New iPhone 5 (A1428)" /></p>
<p style="text-align: left; ">
	The original A1428 iPhone 5 lacks band 4 support, which means it&#39;ll only support WCDMA on band 2. The only problem here, as I mentioned above, is that T-Mobile&#39;s 3G deployment on band 2 isn&#39;t ubiquitous - so in many cases you&#39;ll fall back to 2G/EDGE speeds.&nbsp;The new iPhone 5 simply enables band 4 WCDMA support.</p>
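<p style="text-align: left; ">
	To put the same WCDMA picture in another form, the sketch below (hypothetical C, with the band lists taken from the description above rather than from any Apple or T-Mobile data structure) simply checks which of T-Mobile&#39;s 3G bands each A1428 revision can actually use.</p>
<pre>
/* Hypothetical sketch: which T-Mobile WCDMA bands can each A1428 use?
 * Band lists follow the article: the original AT&amp;T iPhone 5 supports
 * bands 2 and 5, the new unlocked A1428 adds band 4, and T-Mobile 3G
 * lives mostly on band 4 with some band 2. */
#include &lt;stdio.h&gt;

static int supports(const int *bands, int count, int band)
{
    for (int i = 0; i &lt; count; i++)
        if (bands[i] == band)
            return 1;
    return 0;
}

int main(void)
{
    const int original_a1428[] = { 2, 5 };     /* original AT&amp;T iPhone 5 */
    const int new_a1428[]      = { 2, 4, 5 };  /* new unlocked iPhone 5  */
    const int tmobile_3g[]     = { 2, 4 };     /* T-Mobile WCDMA bands   */

    for (int i = 0; i &lt; 2; i++) {
        int band = tmobile_3g[i];
        printf("WCDMA band %d: original A1428 %s, new A1428 %s\n", band,
               supports(original_a1428, 2, band) ? "yes" : "no",
               supports(new_a1428, 3, band) ? "yes" : "no");
    }
    return 0;
}
</pre>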
<p style="text-align: left; ">
	There&#39;s one other benefit to the new iPhone 5. DC-HSPA+ (42Mbps max downlink) is now supported on all bands as well. Although it was never (and likely will never ever be) used by AT&amp;T, DC-HSPA+ was a feature of the iPhone 5. T-Mobile on the other hand does use carrier aggregation on WCDMA in some markets and the new A1428 will benefit from higher speeds in those situations.&nbsp;</p>
<p style="text-align: left; ">
	That&#39;s it.</p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6866/test-iphone5-tmo</link>
    <pubDate>Fri, 29 Mar 2013 12:21:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6866:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>Unreal Engine 4 Infiltrator Demo from GDC [video]</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6869/unreal-engine-4-infiltrator-demo-from-gdc-video"><img src="http://images.anandtech.com/doci/6869/Screen Shot 2013-03-29 at 11.52.39 AM_575px.png" alt="" /></a></p><p><p>
	The folks at <a href="http://kotaku.com/impressive-unreal-engine-4-graphics-demo-infiltrator-462820497">Kotaku spotted a video of Epic&#39;s Unreal Engine 4 demo</a> from GDC on YouTube. The video is really a tech demo of Epic&#39;s UE4 technology and not any upcoming IP, but in usual Epic fashion it looks downright impressive. Even more exciting is the fact that the demo was run entirely on a single GeForce GTX 680. With the <a href="http://www.anandtech.com/show/6770/sony-announces-playstation-4-pc-hardware-inside">next generation of game consoles</a> targeting GPU performance under that of a 680, the performance target makes sense.</p>
<p>
	Fortnite will be Epic&#39;s first game to use Unreal Engine 4. We can also expect UE4 to make its way onto mobile devices thanks to the excellent UDK for Android and iOS.</p>
<p style="text-align: center; ">
	<iframe allowfullscreen="" frameborder="0" height="360" src="http://www.youtube.com/embed/dO2rM-l-vdQ" width="640"></iframe></p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6869/unreal-engine-4-infiltrator-demo-from-gdc-video</link>
    <pubDate>Fri, 29 Mar 2013 11:57:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6869:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>AMD Teases Official Radeon HD 7990</title>
    <author>Ryan Smith</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6868/amd-teases-official-radeon-hd-7990"><img src="http://images.anandtech.com/doci/6868/7990Car_575px.jpg" alt="" /></a></p><p><p>
	Continuing our AMD GDC 2013 coverage, the other bit of major AMD news coming out of GDC 2013 involves a consumer product after all. But not strictly as a product announcement. Rather this is more of a product <em>tease</em> coming out of AMD. That tease? The Radeon HD 7990.</p>
<p>
	Since the latter half of 2012, AMD partners such as PowerColor and Asus have been offering what we&rsquo;ve been calling &ldquo;officially unofficial&rdquo; Radeon HD 7990 cards. Officially, AMD doesn&rsquo;t have a 7990 SKU, but at the same time AMD will approve multi-GPU Tahiti designs, and bless them with the right to be called a 7990. So officially the 7990 doesn&rsquo;t exist yet, and yet unofficially it&rsquo;s been offered for months now.</p>
<p>
	With that in mind, the 7990 will be moving from officially unofficial status to just outright official status. Ending their GDC presentation, AMD&rsquo;s final item was a tease of their official Radeon HD 7990 design, with word that it&rsquo;s coming soon. Real soon in fact, as we later found out DICE had been using some of these 7990 cards to <a href="https://www.facebook.com/AMDGaming/posts/554763727889662">power their Battlefield 4 demo elsewhere at GDC</a>.</p>
<p>
	As this is a teaser AMD isn&rsquo;t saying anything about the card beyond the fact that it&rsquo;s a dual Tahiti card just as the unofficially official 7990s were. But even from the few pictures they&rsquo;ve strategically provided we can infer a few things.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6868/amd-teases-official-radeon-hd-7990"><img alt="" src="http://images.anandtech.com/doci/6868/7990_3.jpg" /></a></p>
<p>
	First and foremost, it&rsquo;s a completely open-air cooler. AMD&rsquo;s previous dual-GPU cards have all employed some kind of blower; up through the 300W 5970 they were full blowers, and the 375W 6990 was a split blower. Open air coolers generally offer high performance, but they do require a breezy case since they&rsquo;re not very capable of pushing hot air out on their own. The design essentially punts cooling off to the case, which is not always a bad thing since this affords much larger &ndash; and thereby slower and quieter &ndash; cooling fans.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6868/amd-teases-official-radeon-hd-7990"><img alt="" src="http://images.anandtech.com/doci/6868/7990_2.jpg" /></a> &nbsp; <a href="http://www.anandtech.com/show/6868/amd-teases-official-radeon-hd-7990"><img alt="" src="http://images.anandtech.com/doci/6868/7990_1.jpg" /></a></p>
<p>
	Second of all, we can see something about the power delivery system. Two 8-pin PCIe connectors are visible, which would put board power at or under 375W. AMD has always shipped their cards with the proper connectivity to pull their rated power at stock, so as long as they&rsquo;re holding to that, this puts an upper limit on where the 7990&rsquo;s stock TDP would be. This would be notably lower than the unofficial cards, which are closer to 500W (though admittedly also designed for liberal overclocking). Meanwhile we can also just see the edge of the VRM circuitry; the Volterra &lsquo;C&rsquo; is visible on the edge of what appears to be a <a href="http://www.cooperindustries.com/content/public/en/bussmann/electronics/products/coiltronics_inductorandtransformermagnetics/high_current_inductors/multi-phase_powercoupledinductorsformulti-phaseapplications/cl0904.html">Volterra multi-phase inductor</a>. Volterra is widely considered to offer some of the best VRM circuitry in the industry, and has been found on previous generation AMD dual-GPU cards.</p>
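<p>
	For reference, the 375W figure falls straight out of PCIe power delivery arithmetic &ndash; 75W from the slot plus 150W per 8-pin connector &ndash; and the same arithmetic reproduces the ratings of AMD&rsquo;s earlier dual-GPU cards. A quick back-of-the-envelope sketch:</p>
<pre>
# PCIe power budget arithmetic behind the figures quoted above.
SLOT_W = 75          # PCIe x16 slot
SIX_PIN_W = 75       # per 6-pin PCIe connector
EIGHT_PIN_W = 150    # per 8-pin PCIe connector

radeon_5970 = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300W ceiling, the "300W 5970"
radeon_6990 = SLOT_W + 2 * EIGHT_PIN_W           # 375W ceiling, the "375W 6990"
official_7990 = SLOT_W + 2 * EIGHT_PIN_W         # 375W ceiling implied by the card teased here

print(radeon_5970, radeon_6990, official_7990)   # 300 375 375
</pre>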
<p>
	In any case, we&rsquo;ll be following this up as AMD releases more information. The fact that an official 7990 is appearing now makes it hard to argue that AMD isn&rsquo;t late to the party &ndash; we&rsquo;re coming up on a full year since the GeForce GTX 690 &ndash; but with AMD keeping Tahiti through Q4 there&rsquo;s really no reason not to do it. So we&rsquo;ll have to see just what AMD comes up with, and how their design differs from the unofficial cards that have come before it.</p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6868/amd-teases-official-radeon-hd-7990</link>
    <pubDate>Fri, 29 Mar 2013 08:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6868:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>AMD Announces &quot;Radeon Sky&quot; Family of Server-Cloud Video Cards</title>
    <author>Ryan Smith</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards"><img src="http://images.anandtech.com/doci/6867/SkyLogoBig.jpg" alt="" /></a></p><p><p>
	Catching up on announcements from GDC 2013, we&rsquo;ll kick things off with AMD. Though AMD doesn&rsquo;t traditionally use GDC to formally launch consumer products since it&rsquo;s not a consumer show, when it comes to professional hardware it&rsquo;s another story. To that end AMD is using GDC 2013 to announce some new professional products.</p>
<p>
	AMD&rsquo;s first announcement out of the show is a new line of server video cards. Dubbed the Radeon Sky family, these are new cards based on AMD&rsquo;s existing Pitcairn and Tahiti GPUs, targeted for use in cloud gaming deployments. These are passively cooled cards intended to be sold directly to cloud gaming providers, and despite the Radeon name are <strong>not</strong> consumer cards.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards"><img alt="" src="http://images.anandtech.com/doci/6867/Sky900.jpg" /></a><br />
	<em>AMD Radeon Sky 900</em></p>
<p>
	Altogether there will be 3 Sky cards, the 900, 700, and 500. The 900 is a dual-Tahiti card (not too unlike the FirePro S10000), meanwhile the 700 is a single-GPU Tahiti card, and finally the Sky 500 is a Pitcairn card. We know the number of enabled CUs for each along with memory clockspeeds, but AMD has not published the core clockspeeds. Given the power and thermal constraints of a server environment, it&rsquo;s a safe bet these won&rsquo;t be as high as on consumer desktop Radeon cards.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards"><img alt="" src="http://images.anandtech.com/doci/6867/SkySlide_575px.jpg" /></a></p>
<p>
	Coupled with the Sky series is a new piece of software from AMD called RapidFire technology. Unfortunately we don&rsquo;t have much in the way of details on it at this time. The first two aspects of it &ndash; &ldquo;low latency&rdquo; and &ldquo;HD image quality&rdquo; &ndash; are essentially just branding for AMD&rsquo;s Video Codec Engine (VCE) hardware video encoder. We do not yet know quite where AMD is going on the subject of multiple video streams; this may be their branding for utilizing virtualization to run multiple game instances on a single GPU in order to achieve high server densities, but we&rsquo;ll follow this up when we know more.</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards"><img alt="" src="http://images.anandtech.com/doci/6867/AMDRapidfire_575px.jpg" /></a></p>
<p>
	Radeon Sky cards will start shipping in Q2 of this year, and AMD is already demoing them to press and partners alike now. AMD is making a direct-pitch play for the cloud gaming space, joining the rest of the gaming industry in believing that cloud gaming stands to become a multi-billion dollar market very quickly, and one they want to be a part of. The Radeon Sky series will be going up against NVIDIA&rsquo;s GRID cards in this space, which are the hardware backing their GeForce GRID initiative.</p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6867/amd-announces-radeon-sky-family-of-servercloud-video-cards</link>
    <pubDate>Fri, 29 Mar 2013 07:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6867:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>HP EliteBook Folio 9470m Ultrabook Review: Ultrabooks in Enterprise</title>
    <author>Dustin Sklavos</author>
    <description><![CDATA[ <p>
	Something funny happened when a lot of us weren&#39;t really paying attention last year: Intel&#39;s nascent &quot;ultrabook&quot; specification and definition quietly expanded and, in the process, sort of redefined what a notebook was. In their own circular way, Intel created a brand and changed the way notebooks were built (with ULV Ivy Bridge leading the way); I&#39;m sure it&#39;s no coincidence that this trademarked product name has only squeezed AMD further. Ultrabooks that were 14&quot; and larger weren&#39;t as rigidly confined by the definition as ones below that threshold, but they&#39;re still smaller creatures than the notebooks of old. HP&#39;s EliteBook Folio 9470m is one of these newer beasts.</p>
]]></description>
    <link>http://www.anandtech.com/show/6852/hp-elitebook-folio-9470m-ultrabook-review-ultrabooks-in-enterprise</link>
    <pubDate>Fri, 29 Mar 2013 00:01:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6852:news</guid>
    <category><![CDATA[ Ultrabook]]></category>
</item>  
    
    
<item>
    
        <title>Best Budget Ultrabook, March 2013</title>
    <author>Jarred Walton</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6865/best-budget-ultrabook-march-2013"><img src="http://images.anandtech.com/doci/6865/vizio-ct14 (1)_575px.jpg" alt="" /></a></p><p><p>
	We&rsquo;re planning to start a regular revolving list of recommended products at AnandTech&mdash;sort of like a mini buyer&rsquo;s guide focused on a single product or component. Anand has asked me to kick things off with a look at the notebook market. Initially, I wasn&rsquo;t sure if I could find anything I was really comfortable recommending, considering Haswell is right around the corner and Richland APUs have been announced by AMD and should start showing up in laptops in the next month or two. But then I took a look around and found that there are some decent laptops that appear to be on clearance, making way for the next round of new products.</p>
<p>
	With that in mind, I tried to find what I felt was the most compelling offering right now, and somewhat surprisingly I ended up with an Ultrabook. To be clear, Ultrabooks and ultraportables aren&rsquo;t the be-all, end-all of laptops; they&rsquo;re great for portability, and performance for general use applications is usually adequate, but they&rsquo;re generally not gaming powerhouses and even battery life often gets compromised in pursuit of a small size. I&rsquo;ll have some other recommendations for laptops over the coming weeks, but for now I&rsquo;m specifically looking at the ultrathin class of offerings. The key things I like to see in an Ultrabook are pure solid state storage, a good LCD, and the lower the price the better; that brings me to VIZIO.</p>
<p>
	Last month, Vivek posted his thoughts on the <a href="http://www.anandtech.com/show/6543/vizio-thin-light-ct15-something-new-and-edgy">VIZIO CT15 &ldquo;Thin+Light&rdquo; laptop</a>. Note that the Windows 8 variants updated the touchpad and that they&rsquo;re better than the original release, but they&rsquo;re still not perfect. However, a good price can go a long way towards making a product acceptable, and right now the CT14 with 128GB SSD is <a href="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&amp;url=http%3a%2f%2fwww.amazon.com%2fdp%2fB009PJHE4W%2fref%3dsr_1_29%3ftag%3ddosk-20">available for $680 at Amazon.com</a>. (Note that this is basically a discontinued product, and even VIZIO has a list price of $599 on their store&mdash;except they&rsquo;re sold out. <em>[<strong>Update:</strong> Now back in stock at $849]</em>) Not only does that make this one of the least expensive Intel-based ultraportables around right now, but for the price you still get a 128GB SSD, Core i5-3317U processor, and best yet: a 1600x900 high quality IPS display.</p>
<p>
	Does that make this the best current ultraportable? For the money, I would say yes, though I don&rsquo;t know how long supplies will last. There are other issues that need to be mentioned as well: the keyboard still flexes a fair amount when you type (and lacks backlighting), the RAM can&rsquo;t be upgraded, opening the lid can be a bit more difficult than I&rsquo;d like, and battery life is merely so-so. However, when you look at competing offerings you often end up with other compromises (poor quality LCDs being a major one), and most of those cost more than the CT14 and come with a hard drive and caching SSD.</p>
<p>
	Bottom line: if you have to buy a laptop&nbsp;<em>today</em>&nbsp;and you want an Ultrabook or ultraportable, unless you&rsquo;re willing to pay substantially more money (like $1100+), this is the one I&rsquo;d recommend. Otherwise, wait and see what Haswell and Richland bring to the table in terms of Ultrabook/ultraportable options; they&rsquo;ll likely cost more than the VIZIO&rsquo;s sub-$700 price, but they&rsquo;re likely to make up for that with higher performance, improved battery life, and better build quality. VIZIO will likewise be offering touchscreen updates of the CT14 and CT15 with the new CPUs/APUs and they also fix the keyboard complaints; we just need to see where they&rsquo;re priced when they launch. Interestingly, I believe VIZIO is forgoing the Ultrabook aspect on the refresh, as they&#39;re switching to standard voltage CPUs. <em>[<strong>Update: </strong>VIZIO just posted the <a href="http://www.vizio.com/thin-light/specs">updated specs and MSRPs</a> for the touchscreen models. Hopefully street pricing is quite a bit lower.]</em></p>
<p>
	<div>Gallery: <a href="/Gallery/Album/2703" target="_blank">Best Budget Ultrabook, March 2013</a><div><a href="/Gallery/Album/2703#1" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (1)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2703#2" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (2)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2703#3" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (3)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2703#4" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (4)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2703#5" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (5)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2703#6" target="_blank"><img src="http://images.anandtech.com/galleries/2703/vizio-ct14 (6)_thumb.jpg" width="85" height="85" border="0"/></a></div></div></p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6865/best-budget-ultrabook-march-2013</link>
    <pubDate>Thu, 28 Mar 2013 18:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6865:news</guid>
    <category><![CDATA[ Mobile]]></category>
</item>  
    
    
<item>
    
        <title>The Razer Edge Review</title>
    <author>Vivek Gowri</author>
    <description><![CDATA[ <p>
	This story starts in a dark meeting room in the back of Razer&rsquo;s booth at CES 2012. I&rsquo;m sitting with CEO Min-Liang Tan, who is walking me through the intriguing Project Fiona concept gaming tablet. A number of major manufacturers announced Tegra 3 and OMAP4-based Android slates at CES 2012, but Project Fiona stood out &ndash; instead of an ARM SoC, it ran a Core i7 CPU, Nvidia graphics, and Windows 7. At the time my guess was an i7-2617M ultra-low voltage processor and an Nvidia GT 520M, though I never got a confirmation on either spec from Razer. In addition to the powerhouse specs, the tablet had handles resembling two Wii nunchucks attached to either side. Even with a small 10.1&rdquo; display, the performance-oriented silicon and gamepad combined for a tablet that was big, hot, and heavy. Clearly, not an ideal solution, but the concept had obvious potential and was almost a lock to reach production eventually. It walked away from Las Vegas with a handful of awards.</p>
<p>
	Eventually turned out to be today, and so we have the Razer Edge. Read on for our full review.</p>
]]></description>
    <link>http://www.anandtech.com/show/6858/the-razer-edge-review</link>
    <pubDate>Thu, 28 Mar 2013 11:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6858:news</guid>
    <category><![CDATA[ Tablets]]></category>
</item>  
    
    
<item>
    
        <title>Rosewill Blackhawk Ultra Case Review: Were It Not For Competition</title>
    <author>Dustin Sklavos</author>
    <description><![CDATA[ <p>
	We&#39;ve long maintained that Rosewill&#39;s <a href="http://www.anandtech.com/show/4648/rosewill-thor-v2-the-god-of-cooling-and-silence">Thor v2</a> is one of the best deals floating around for enthusiasts. In that enclosure, Rosewill has a product that&#39;s fairly feature rich, quiet, and offers stellar performance. Yet the Thor v2 isn&#39;t the flagship of their enclosure line, but today we have that flagship in house. Given its predecessor&#39;s stellar performance, expectations are pretty high for the Blackhawk Ultra.</p>
]]></description>
    <link>http://www.anandtech.com/show/6854/rosewill-blackhawk-ultra-case-review-were-it-not-for-competition</link>
    <pubDate>Thu, 28 Mar 2013 00:01:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6854:news</guid>
    <category><![CDATA[ Cases/Cooling/PSUs]]></category>
</item>  
    
    
<item>
    
        <title>HandBrake to Get QuickSync Support</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6864/handbrake-to-get-quicksync-support"><img src="http://images.anandtech.com/doci/6864/media2_575px.jpg" alt="" /></a></p><p><p>
	The latest version of Intel&#39;s Media SDK open sourced a key component of the <a href="http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/8">QuickSync</a> pipeline that would allow the open source community to begin to integrate QuickSync into their applications (if you&#39;re not familiar with QS, it&#39;s Intel&#39;s hardware accelerated video transcode engine included in most modern Core processors). I <a href="http://www.anandtech.com/show/6665/intels-quick-sync-coming-soon-to-your-favorite-open-source-transcoding-applications-">mentioned this open source victory back at CES this year</a>, and today the HandBrake team is officially announcing support for QuickSync.&nbsp;</p>
<p>
	The support has been in testing for a while, but the HandBrake folks say that they expect to get comparable speedups to other QuickSync enabled applications.</p>
<p>
	No word on exactly when we&#39;ll see an official build of HandBrake with QuickSync support, although I fully expect Intel to want to have something neat to showcase QuickSync performance on Haswell in June. I should add that this won&#39;t apply to OS X versions of HandBrake unfortunately; enabling that will require some assistance from Apple and Intel - there&#39;s no Media SDK available for OS X at this point, and I don&#39;t know that OS X exposes the necessary hooks to get access to QuickSync.</p>
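<p>
	For those who already drive HandBrake from scripts, a QuickSync-enabled build shouldn&#39;t change the workflow much &ndash; you&#39;d simply select the QuickSync encoder instead of x264. A rough sketch of what that might look like once an official build ships (the qsv_h264 encoder name is my assumption, not something the HandBrake team has confirmed):</p>
<pre>
# Hypothetical sketch: invoking a QuickSync-enabled HandBrakeCLI build from a script.
# The "qsv_h264" encoder name is an assumption about the eventual build, not a confirmed name.
import subprocess

subprocess.check_call([
    "HandBrakeCLI",
    "-i", "input.mkv",    # source
    "-o", "output.mp4",   # destination
    "-e", "qsv_h264",     # hardware encoder selection (name assumed)
    "-q", "22",           # constant quality target
])
</pre>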
</p>]]></description>
    <link>http://www.anandtech.com/show/6864/handbrake-to-get-quicksync-support</link>
    <pubDate>Wed, 27 Mar 2013 21:01:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6864:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>Intel&#39;s PixelSync &amp; InstantAccess: Two New DirectX Extensions for Haswell</title>
    <author>Anand Lal Shimpi</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6863/intels-pixelsync-instantaccess-two-new-directx-extensions-for-haswell"><img src="http://images.anandtech.com/doci/6863/oit_575px.jpg" alt="" /></a></p><p><p>
	As Intel continues its march towards performance relevancy in the graphics space with <a href="http://www.anandtech.com/show/6355/intels-haswell-architecture">Haswell</a>, it should come as no surprise that we&#39;re hearing more GPU related announcements from the company. At this year&#39;s Game Developer Conference, Intel introduced two new DirectX extensions that will be supported on Haswell and its integrated GPU. The first extension is called PixelSync (yep, Intel branding appears alive and well even on the GPU side of the company - one of these days the HD moniker will be dropped and Core will get a brother). PixelSync is Intel&#39;s hardware/software solution to enable Order Independent Transparency (OIT) in games. The premise behind OIT is the quick sorting of transparent elements so they are rendered in the right order, enabling some pretty neat effects in games if used properly. Without OIT, game designers are limited in what sort of scenes they can craft. It&#39;s a tough problem to solve, but one that has big impacts on game design.</p>
<p>
	Although OIT is commonly associated with DirectX 11, it&#39;s not a feature of the API but rather something that&#39;s enabled using the API. Intel&#39;s claim is that current implementations of OIT require unbounded amounts of memory and memory bandwidth (bandwidth requirements can scale quadratically with shader complexity). Given that Haswell (and other integrated graphics solutions) will be more limited on memory and memory bandwidth than the highest end discrete GPUs, it makes sense that Intel is motivated to find a smaller footprint and more bandwidth efficient way to implement OIT.</p>
<p>
	The hardware side of PixelSync is simply the enabling of programmable blend operations on Haswell. On PC GPU architectures, all frame buffer operations flow through fixed function hardware with limited flexibility. Interestingly enough, this is one area where the mobile GPUs have moved ahead of the desktop world - NVIDIA&#39;s Tegra GPUs reuse programmable pixel shader ALUs for frame buffer ops. The Haswell implementation isn&#39;t as drastic a change. There are still fixed function ROPs, but the Haswell GPU core now includes hardware that locks and forces the serialization of memory accesses when triggered by the PixelSync extension. With 3D rendering being an embarrassingly parallel problem, having many shader instructions working on overlapping pixel areas can create issues when running things like OIT algorithms. What PixelSync does is allow the software to tell the hardware that, for a particular segment of code, any shaders running on the same pixel(s) need to be serialized rather than run in parallel. The serialization is limited to directly overlapping pixels, so performance should remain untouched for the rest of the code. This seemingly simple change goes a long way to enabling techniques like OIT, as well as giving developers the option of creating their own frame buffer operations.</p>
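<p>
	A trivial way to see why that per-pixel serialization matters: the &quot;over&quot; blend used for transparency isn&#39;t commutative, so two transparent fragments landing on the same pixel produce different colors depending on which one is composited first. A small sketch of the math (my own illustration, not Intel&#39;s code):</p>
<pre>
# Why per-pixel ordering matters for transparency: "over" blending is not commutative.
def over(src_rgb, src_alpha, dst_rgb):
    """Composite a fragment (src) over the current pixel color (dst)."""
    return tuple(src_alpha * s + (1.0 - src_alpha) * d for s, d in zip(src_rgb, dst_rgb))

background = (0.0, 0.0, 0.0)
red_glass  = ((1.0, 0.0, 0.0), 0.5)   # nearer fragment
blue_glass = ((0.0, 0.0, 1.0), 0.5)   # farther fragment

# Correct back-to-front order: blue first, then red over it.
correct = over(*red_glass, over(*blue_glass, background))
# Same two fragments arriving in the opposite order.
wrong = over(*blue_glass, over(*red_glass, background))

print(correct)  # (0.5, 0.0, 0.25)
print(wrong)    # (0.25, 0.0, 0.5) -- different result, hence the need to serialize and sort
</pre>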
<p>
	The software side of PixelSync is an evolved version of Intel&#39;s own Order Independent Transparency algorithm that leverages high quality compression to reduce memory footprint and deliver predictable performance. Intel has talked a bit about an <a href="http://download-software.intel.com/sites/default/files/m/d/4/1/d/8/AdaptiveTransparency_GDC2011.pdf">earlier version of this algorithm here</a> for those interested.</p>
<p>
	Intel claims that two developers have already announced support for PixelSync, with Codemasters producer Clive Moody (GRID 2) appearing in the Intel press release excited about the new extension. Creative Assembly also made an appearance in the PR, claiming the extensions will be used in Total War: Rome II.</p>
<p>
	The second extension, InstantAccess, is simply Intel&#39;s implementation of zero copy. Although Intel&#39;s processor graphics have supported unified memory for a while (CPU + GPU share the same physical memory), the two processors don&#39;t get direct access to each other&#39;s memory space. Instead, if the GPU needs to work on something the CPU has in memory, it needs to make its own copy of it first. The copy process is time consuming and wasteful. As we march towards true heterogeneous computing, we need ways of allowing both processors to work on the same data in memory. With InstantAccess, Intel&#39;s graphics driver can deliver a pointer to a location in GPU memory that the CPU can then access directly. The CPU can work on that GPU address without a copy and then release it back to the GPU. AMD introduced support for something similar back with Llano.</p>
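<p>
	A crude analogy for the difference (my own illustration, with a memoryview standing in for the pointer the driver would hand back):</p>
<pre>
# Crude zero-copy analogy: a memoryview stands in for the pointer the driver returns.
gpu_buffer = bytearray(8)            # pretend this allocation lives in the GPU's address space

# Traditional path: the CPU works on a copy, which then has to be written back.
cpu_copy = bytearray(gpu_buffer)     # copy out (time and bandwidth spent)
cpu_copy[0] = 0xFF
gpu_buffer[:] = cpu_copy             # copy back in

# InstantAccess-style path: the CPU writes through a view of the same allocation.
view = memoryview(gpu_buffer)        # no copy made
view[1] = 0xAB                       # visible to the "GPU" immediately, nothing to copy back

print(list(gpu_buffer))              # [255, 171, 0, 0, 0, 0, 0, 0]
</pre>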
</p>]]></description>
    <link>http://www.anandtech.com/show/6863/intels-pixelsync-instantaccess-two-new-directx-extensions-for-haswell</link>
    <pubDate>Wed, 27 Mar 2013 21:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6863:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>FCAT: The Evolution of Frame Interval Benchmarking, Part 1</title>
    <author>Ryan Smith</author>
    <description><![CDATA[ <p>
	In the last year stuttering, micro-stuttering, and frame interval benchmarking have become a very big deal in the world of GPUs, and for good reason. Through the hard work of the <a href="http://techreport.com/">Tech Report&rsquo;s</a> Scott Wasson and others, significant stuttering issues were uncovered involving AMD&rsquo;s video cards.</p>
<p>
	At the same time however, as AMD has fixed those issues, both AMD and NVIDIA have laid out their concerns over existing testing methodologies and why these may not be the best approaches. To that end, today NVIDIA will be introducing a new tool for evaluating frame latencies: FCAT. As we will see, the Frame Capture Analysis Tool stands to change frame interval benchmarking as we know it, and for the better.</p>
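<p>
	For readers new to this style of testing, the underlying math is straightforward: rather than counting frames per second, you take per-frame timestamps (from FRAPS, or from FCAT&rsquo;s captured output), difference consecutive ones, and look at the distribution of those intervals instead of just the average. A minimal sketch of the idea:</p>
<pre>
# Minimal frame interval analysis: the average hides a stutter that the distribution reveals.
def frame_intervals(timestamps_ms):
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

# Illustrative timestamps (ms): mostly ~16.7ms apart, with one 45ms stutter in the middle.
stamps = [0.0, 16.7, 33.4, 50.1, 95.1, 111.8, 128.5, 145.2]
intervals = frame_intervals(stamps)

print("average interval: %.1f ms" % (sum(intervals) / len(intervals)))  # ~20.7 ms, looks fine
print("worst interval:   %.1f ms" % max(intervals))                     # 45.0 ms, the stutter
</pre>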
]]></description>
    <link>http://www.anandtech.com/show/6862/fcat-the-evolution-of-frame-interval-benchmarking-part-1</link>
    <pubDate>Wed, 27 Mar 2013 09:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6862:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>First Impressions: Kinesis Advantage Mechanical Ergonomic Keyboard</title>
    <author>Jarred Walton</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6861/first-impressions-kinesis-advantage-mechanical-ergonomic-keyboard"><img src="http://images.anandtech.com/doci/6861/Kinesis Advantage (2)_575px.jpg" alt="" /></a></p><p><p>
	Earlier this month I posted my <a href="http://www.anandtech.com/show/6819/truly-ergonomic-computer-keyboard-review-one-month-with-the-teck">review of the TECK</a>, an ergonomic keyboard with mechanical switches that&rsquo;s looking to attract users interested in a high quality, highly ergonomic offering who don&rsquo;t mind the rather steep learning curve or the price. The <a href="http://www.trulyergonomic.com/store/index.php">TECK</a> isn&rsquo;t the only such keyboard, of course, and I decided to see what other mechanical switch ergonomic keyboards I could get for comparison. Next up on the list is the granddaddy of high-end ergonomic keyboards, the <a href="http://www.kinesis-ergo.com/">Kinesis Contour Advantage</a>.</p>
<p>
	Similar to <a href="http://www.anandtech.com/show/6682/first-impressions-the-teck-ergonomic-mechanical-keyboard">what I did with the TECK</a>, I wanted to provide my first impressions of the Kinesis, along with some thoughts on the initial switch and the learning curve. This time, I also made the effort to put together a video of my first few minutes of typing. It actually wasn&rsquo;t as bad as with the TECK, but that&rsquo;s likely due to the fact that I already lost many of my typing conventions when I made that switch earlier this year. I&rsquo;ll start with the video, where I take a typing test on four different keyboards and provide some thoughts on the experience, and then I&rsquo;ll provide a few other thoughts on the Kinesis vs. TECK comparison. It&rsquo;s far too early to determine which one I&rsquo;ll end up liking the most, but already I do notice some differences.</p>
<p>
	<iframe allowfullscreen="" frameborder="0" height="371" src="http://www.youtube.com/embed/dJMl4O6uO9M" width="660"></iframe></p>
<p>
	Compared to the TECK&mdash;as well as many other keyboards&mdash;the Kinesis Advantage feels quite large. Part of that is from the thickness of the keyboard, with the palm rests and middle section being much thicker than on other keyboards. Looking at the way my hands rest on the Advantage, though, I have to say it seems like it should be a good fit for me once I adapt to the idiosyncrasies. I discussed some of the changes in the above video, but let me go into some additional detail on the areas that appear to be causing me the most trouble (and this is after the initial several hours of training/adapting to the modified layout).</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6861/first-impressions-kinesis-advantage-mechanical-ergonomic-keyboard"><img alt="" src="http://images.anandtech.com/doci/6861/Kinesis%20Advantage%20(1)_575px.jpg" /></a></p>
<p>
	My biggest long-term concern is with the location of the CTRL and ALT keys. As someone who uses keyboard shortcuts frequently, I&rsquo;m very accustomed to using my pinkies to hit CTRL. Reaching up with my thumb to hit CTRL is going to take some real practice, but I can likely come to grips with that over the next few weeks. Certain shortcuts are a bit more complex, however&mdash;in Photoshop, for instance, I routinely use &ldquo;Save for Web&hellip;&rdquo;, with the shortcut CTRL+ALT+SHIFT+S; take one look at the Kinesis and see how easy that one is to pull off! Similarly, the locations of the cursor keys, PgUp/PgDn, and Home/End keys are really going to take some time for me to adjust to. On the TECK I actually didn&rsquo;t mind having them located under the palms of the hands, but here the keys are split between both hands and aren&rsquo;t centralized.</p>
<p>
	With that said, the Kinesis keyboards do have some interesting features that may mitigate such concerns. For one, there&rsquo;s a built-in function for reprogramming any of the keys, so it&rsquo;s possible with a little effort to change the layout. Of course, for that to be useful you also need to figure out a &ldquo;better&rdquo; layout than the default, and I&rsquo;m not sure what that might be&mdash;plus I wanted to give the default layout a shot first. The Advantage also features macro functionality, allowing you to program up to 24 macros of approximately 55 keystrokes. Truth be told, I haven&rsquo;t even tried the macros or key mapping features yet, but I can at least see how they might prove useful.</p>
<p>
	There are a few other items to mention for my first impressions. One is that I didn&rsquo;t like the audible beeping from my speakers at all; I think the keys sound plenty loud when typing (not that they&rsquo;re loud, necessarily, but they&rsquo;re not silent either), so adding a beep from the speakers wasn&rsquo;t useful for me. Thankfully, it&rsquo;s very easy to disable the sounds with a quick glance at the manual. Another interesting feature is built-in support for the Dvorak layout (press PROGRAM+SHIFT+F5 to switch between QWERTY and Dvorak; note that switching will lose any custom key mappings). Finally, unlike the TECK, Kinesis also includes a USB hub (two ports at the bottom-back of the keyboard near the cable connection).</p>
<p>
	As far as typing goes, the Cherry MX Brown switches so far feel largely the same to me as on the TECK. I haven&rsquo;t experienced any issues with &ldquo;double pressing&rdquo; of keys yet, but then I didn&rsquo;t have that happen with the TECK for a couple weeks either. Right now, it&rsquo;s impossible for me to declare which keyboard is better in terms of ergonomics&mdash;and in fact, even after using both for a month I fear I might not be able to come to a firm opinion on the matter&mdash;but one thing I do know is that looking at the video above, I can see that my hands and arms move far less when typing on both the TECK and Kinesis. I also know that at least from a technology standpoint, the Kinesis is more advanced than the TECK, what with a USB hub, key remapping, and macro functionality, but it&rsquo;s also more expensive thanks to those features.</p>
<p>
	Reviews of this nature are inherently something that will take a while and they end up being quite subjective, but within the next few months I hope to have a better idea of which mechanical switch ergonomic keyboard I like the most&hellip;and I have at least one if not two more offerings coming my way. Hopefully you can all wait patiently while I put each through a month or so of regular use. If you&rsquo;re looking to spend $200+ on a high quality ergonomic keyboard, you&rsquo;ll probably be willing to wait a bit longer, but if not I believe many of the companies will offer you a 60-day money back guarantee&mdash;the TECK and Kinesis both offer such a guarantee if you&rsquo;re interested in giving one a try.</p>
<p>
	<div>Gallery: <a href="/Gallery/Album/2701" target="_blank">First Impressions: Kinesis Advantage Mechanical Ergonomic Keyboard</a><div><a href="/Gallery/Album/2701#1" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (2)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2701#2" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (3)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2701#3" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (4)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2701#4" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (5)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2701#5" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (6)_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2701#6" target="_blank"><img src="http://images.anandtech.com/galleries/2701/Kinesis Advantage (7)_thumb.jpg" width="85" height="85" border="0"/></a></div></div></p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6861/first-impressions-kinesis-advantage-mechanical-ergonomic-keyboard</link>
    <pubDate>Wed, 27 Mar 2013 03:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6861:news</guid>
    <category><![CDATA[ Keyboard]]></category>
</item>  
    
    
<item>
    
        <title>Apple to Ship Updated A1428 iPhone 5 With AWS WCDMA Enabled for T-Mobile USA</title>
    <author>Brian Klug</author>
    <description><![CDATA[ <p align="center"><a href="http://www.anandtech.com/show/6860/apple-to-ship-updated-a1428-iphone-5-with-aws-wcdma-enabled-for-tmobile-usa"><img src="http://images.anandtech.com/doci/6860/20120921-_DSC3625_575px.jpg" alt="" /></a></p><p><p>
	Back when I did my <a href="http://www.anandtech.com/show/6541/the-state-of-qualcomms-modems-wtr1605-and-mdm9x25">Qualcomm modems and transceivers piece</a>, I gained a deeper understanding about the cellular RF engineering side of the handset puzzle. Specifically, how an OEM can enable LTE on some bands and not enable WCDMA on those very same bands. The interesting and relevant takeaway from the whole exploration is that all ports on the transceiver are created equal, and that if an OEM implements LTE on a particular band, that usually means that the device design can inherit support for 3G WCDMA on that same band, given the right power amplifier. I alluded at the end of the article to the fact that if you see an OEM implement band 4 on LTE and not band 4 on WCDMA, it&#39;s just a matter of a firmware lock and appropriate certifications to enable it, and what I was alluding directly to was the A1428 iPhone 5.</p>
<p>
	Today T-Mobile USA formalized their LTE plans and <a href="http://newsroom.t-mobile.com/articles/t-mobile-makes-un-carrier-moves">announced</a> that the Samsung Galaxy Note 2 (as <a href="http://www.anandtech.com/show/6386/samsung-galaxy-note-2-review-t-mobile-/9">predicted</a>), Blackberry Z10, and Sonic 2.0 hotspots would immediately work with their Band 4 LTE&nbsp;which is either 5 or 10 MHz FDD&nbsp;depending on market. In addition the upcoming HTC One and Samsung Galaxy S 4 will support T-Mobile LTE. The operator also launched its LTE&nbsp;network in&nbsp;Baltimore, Houston, Kansas City, Las Vegas, Phoenix, San Jose, Calif., and Washington, D.C. I plan to drive up to Phoenix at some point this week and test the network out there.&nbsp;</p>
<p>
	Among all that other news, however, was word that T-Mobile would finally be <a href="http://newsroom.t-mobile.com/articles/t-mobile-unleashes-iphone-5">carrying the iPhone 5</a>, specifically an updated version of the A1428 hardware model which included Band 4 (AWS) LTE support. This was the variant aimed at AT&amp;T specifically, with both Band 4 and Band 17 LTE included, in addition to a number of other bands as we noted in <a href="http://www.anandtech.com/show/6330/the-iphone-5-review/18">the iPhone 5 review</a>. As I mentioned earlier, what&#39;s interesting about A1428 is that it always had the necessary power amplifiers for AWS WCDMA, but only enabled it on LTE. The hardware could support AWS WCDMA, but that was locked out in firmware &mdash; until now. Apple gave a <a href="http://www.engadget.com/2013/03/26/iphone-5-att-aws-unlocked-plans-t-mobile/">statement</a> to Engadget which confirmed my earlier suspicions &ndash; beginning 4/12, Apple will ship a new&nbsp;A1428 with different firmware onboard that enables AWS WCDMA. There won&#39;t be any software update for existing A1428 owners; if you bought an AT&amp;T iPhone 5, you unfortunately won&#39;t be able to get AWS WCDMA&nbsp;on T-Mobile overnight. Instead, newly shipping A1428 models will simply carry different firmware that enables the AWS paths through the transceiver for WCDMA. I&#39;m unclear how Apple will choose to differentiate the two identical A1428 hardware models for users or on its own spec lists; either way there will be an old version and a new version which differ in this regard. In addition, existing A1428 hardware without AWS WCDMA support will be phased out.</p>
<p>
	In fact, the A1428 with AWS&nbsp;WCDMA enabled carries the same FCC-ID; it&#39;s still&nbsp;BCG-E2599A. I was surprised to see that Apple has already processed the Class II Permissive Change and added the necessary Band 4 (AWS) WCDMA tests, dated today, March 26th, 2013.&nbsp;</p>
<p align="center">
	<a href="http://www.anandtech.com/show/6860/apple-to-ship-updated-a1428-iphone-5-with-aws-wcdma-enabled-for-tmobile-usa"><img alt="" src="http://images.anandtech.com/doci/6860/Screen Shot 2013-03-26 at 4.00.16 PM_575px.png" style="height: 312px; width: 550px;" /></a><br />
	From the FCC&#39;s A1428 Class 2 Permissive Change Cover Letter</p>
<p>
	So there we have it: the new A1428 with AWS&nbsp;WCDMA for T-Mobile is identical to the previous A1428 hardware; it&#39;s just a matter of enabling those modes in the transceiver for WCDMA. The hardware will also support DC-HSPA+ (42.2 Mbps downlink) on AWS, which means speedy fallback if you detach from LTE and are in a T-Mobile market with two WCDMA carriers.</p>
<p>
	Source: <a href="http://newsroom.t-mobile.com/articles/t-mobile-makes-un-carrier-moves">T-Mobile</a> (LTE and Uncarrier), <a href="http://newsroom.t-mobile.com/articles/t-mobile-unleashes-iphone-5">T-Mobile</a> (iPhone 5), <a href="http://www.engadget.com/2013/03/26/iphone-5-att-aws-unlocked-plans-t-mobile/">Engadget</a></p>
<p>
	<div>Gallery: <a href="/Gallery/Album/2700" target="_blank">Apple to Ship Updated A1428 iPhone 5 With AWS WCDMA Enabled for T-Mobile USA</a><div><a href="/Gallery/Album/2700#1" target="_blank"><img src="http://images.anandtech.com/galleries/2700/20120921-_DSC3625_thumb.jpg" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2700#2" target="_blank"><img src="http://images.anandtech.com/galleries/2700/Screen Shot 2013-03-26 at 3.58.07 PM_thumb.png" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2700#3" target="_blank"><img src="http://images.anandtech.com/galleries/2700/Screen Shot 2013-03-26 at 3.59.09 PM_thumb.png" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2700#4" target="_blank"><img src="http://images.anandtech.com/galleries/2700/Screen Shot 2013-03-26 at 4.04.08 PM_thumb.png" width="85" height="85" border="0"/></a><a href="/Gallery/Album/2700#5" target="_blank"><img src="http://images.anandtech.com/galleries/2700/Screen Shot 2013-03-26 at 4.05.26 PM_thumb.png" width="85" height="85" border="0"/></a></div></div></p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6860/apple-to-ship-updated-a1428-iphone-5-with-aws-wcdma-enabled-for-tmobile-usa</link>
    <pubDate>Tue, 26 Mar 2013 18:26:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6860:news</guid>
    <category><![CDATA[ Smartphones]]></category>
</item>  
    
    
<item>
    
        <title>Nanoxia Deep Silence Cases Officially Available Stateside</title>
    <author>Dustin Sklavos</author>
    <description><![CDATA[ <p><p>
	Ever since we reviewed the Nanoxia <a href="http://anandtech.com/show/6479/nanoxia-deep-silence-1-case-review-you-asked-for-it-you-got-it">Deep Silence 1</a> and <a href="http://anandtech.com/show/6742/nanoxia-deep-silence-2-case-review-less-of-what-we-needed">Deep Silence 2</a> enclosures, they&#39;ve essentially been setting the standard for what a silent enclosure can and should be at their respective price points. However, at the time of each review, those respective price points were largely hypothetical. They were targets, but Nanoxia was still in talks for Stateside distribution, and that had been the refrain every time a comparison to either enclosure was brought up.</p>
<p>
	<a href="http://www.anandtech.com/show/6859/nanoxia-deep-silence-cases-officially-available-stateside"><img alt="" src="http://images.anandtech.com/doci/6479/Small (3 of 15).jpg" style="width: 484px; height: 565px;" /></a></p>
<p>
	I don&#39;t typically like doing news posts going &quot;hey, this is shipping now,&quot; but the DS1 is a Bronze Editor&#39;s Choice winner and a highly sought after case. NewEgg is now officially taking preorders for the <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.newegg.com%2fProduct%2fProduct.aspx%3fItem%3dN82E16811281001">Nanoxia Deep Silence 1</a> and <a href ="http://detonator.dynamitedata.com/cgi-bin/redirect.pl?user=u00000626&url=http%3a%2f%2fwww.newegg.com%2fProduct%2fProduct.aspx%3fItem%3dN82E16811281002">Deep Silence 2</a>, with a shipping date of April 10th for the DS1 and 11th for the DS2. That would be exciting enough, but Nanoxia seems to have priced the Deep Silence cases exceedingly aggressively. My conclusions had always been predicated on both availability and on Nanoxia hitting price points that seemed frankly pie in the sky, but as it turns out, the DS1 is up for preorder for just $109, and the DS2 is going to go for just $89. At those price points, both cases are going to be incredibly tough to beat in the market.</p>
</p>]]></description>
    <link>http://www.anandtech.com/show/6859/nanoxia-deep-silence-cases-officially-available-stateside</link>
    <pubDate>Tue, 26 Mar 2013 13:36:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6859:news</guid>
    <category><![CDATA[ Cases/Cooling/PSUs]]></category>
</item>  
    
    
<item>
    
        <title>NVIDIA GeForce GTX 650 Ti Boost Review: Bringing Balance To The Force</title>
    <author>Ryan Smith</author>
    <description><![CDATA[ <p>
	Launching today is NVIDIA&#39;s answer to AMD&#39;s Radeon HD 7790 and 7850, the GeForce GTX 650 Ti Boost. The GTX 650 Ti Boost is based on the same GK106 GPU as the GTX 650 Ti and GTX 660, and is essentially a filler card to bridge the gap between them. By adding GPU boost back into the mix and using a slightly more powerful core configuration, NVIDIA intends to plug their own performance gap and at the same time counter AMD&rsquo;s 7850 and 7790 before the latter even reaches retail.</p>
]]></description>
    <link>http://www.anandtech.com/show/6838/nvidia-geforce-gtx-650-ti-boost-review-</link>
    <pubDate>Tue, 26 Mar 2013 08:00:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6838:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
    
<item>
    
        <title>AMD Comments on GPU Stuttering, Offers Driver Roadmap &amp; Perspective on Benchmarking</title>
    <author>Ryan Smith</author>
    <description><![CDATA[ <p>
	AMD remained curiously quiet as to exactly why its hardware and drivers were so adversely impacted by new FRAPS based GPU testing methods. While our own foray into evolving GPU testing will come later this week, we had the opportunity to sit down with AMD to understand exactly what&rsquo;s been going on.</p>
<p>
	What follows is based on our meeting with some of AMD&#39;s graphics hardware and driver architects, where they went into depth in all of these issues. In the following pages we&rsquo;ll get into a high-level explanation of how the Windows rendering pipeline works, why this leads to single-GPU issues, why this leads to multi-GPU issues, and what various tools can measure and see in the rendering process.</p>
]]></description>
    <link>http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps</link>
    <pubDate>Tue, 26 Mar 2013 02:28:00 EDT</pubDate>
    <guid isPermaLink="false">tag:www.anandtech.com,6857:news</guid>
    <category><![CDATA[ GPUs]]></category>
</item>  
    
</channel>
</rss>