In the last couple of years, every show we have attended has brought up the same questions about the Internet of Things: where do we see it going, when is it going to take off, what volume should we expect, and in what form factor will it arrive? You may have noticed that AnandTech has been relatively light on IoT coverage, perhaps for good reason: there is an awful lot of awful hardware moving around at low cost, and most of it is still stuck in a perpetual beta stage.

When the Internet of Things Becomes Useful

For me (Ian) personally, IoT has to satisfy two crucial criteria: it has to improve my daily flow, and the devices have to work together seamlessly. At the minute, most IoT fails on both counts, especially sportswear that merely tells you what you have done. I don’t necessarily want to know what I have done (yes, I get the big-data-and-improve angle); I want to get things done quicker with the equipment at hand. Something as simple as pre-opening a Chrome window at the webpage I need on my PC while I’m in the kitchen washing up, so that it is there when I get to the computer. Or something that will tell me if the fridge/freezer door has been left open for more than five minutes again. Or a series of power-monitoring outlets that I can access from a central application, to see where my energy provider is clearly pulling its numbers out of thin air. Or preloading the next few music tracks/videos in my stream when I’m in another part of the house, so that when I get there it isn’t buffering the content, or the advertisements, for the first fifteen seconds. Does that sound like too much?
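One of those wishes, the fridge-door alert, is simple enough to sketch. Below is a minimal, purely illustrative Python monitor; the class, its names, and the five-minute threshold are my own invention, not any shipping product's API:

```python
import time


class DoorMonitor:
    """Tracks how long a fridge/freezer door has been open and
    flags it once a threshold is exceeded."""

    def __init__(self, threshold_s=300, clock=time.monotonic):
        self.threshold_s = threshold_s
        self.clock = clock       # injectable clock, handy for testing
        self.opened_at = None    # None means the door is closed

    def on_event(self, door_open):
        """Feed sensor transitions: True = opened, False = closed."""
        if door_open and self.opened_at is None:
            self.opened_at = self.clock()
        elif not door_open:
            self.opened_at = None

    def should_alert(self):
        """True once the door has stayed open past the threshold."""
        return (self.opened_at is not None
                and self.clock() - self.opened_at >= self.threshold_s)
```

A real device would feed on_event() from a door sensor and push the alert over whatever transport the ecosystem provides; the interesting part is only the bookkeeping, which is tiny.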

As we take baby steps into that future, ASUS is betting on the Google Weave project. Weave is designed to be the standardized API glue that lets Weave-certified products communicate with each other, with the cloud, and with the devices a user owns, covering both local communication within a network and communication over the internet.
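To make the "API glue" idea concrete, here is a hypothetical sketch of building and validating a Weave-style JSON command. The command names and schema below are invented for illustration and are not Weave's actual vocabulary:

```python
import json

# Invented schema in the spirit of a certified-device command set;
# these names are illustrative, not Google Weave's real vocabulary.
SCHEMA = {
    "powerSwitch.setState": {"state": str},
    "powerSwitch.getReading": {},
}


def make_command(name, **params):
    """Serialize a command after checking it against the schema."""
    spec = SCHEMA.get(name)
    if spec is None:
        raise ValueError(f"unknown command: {name}")
    for key, typ in spec.items():
        if not isinstance(params.get(key), typ):
            raise TypeError(f"{name} requires parameter {key!r} "
                            f"of type {typ.__name__}")
    return json.dumps({"name": name, "parameters": params})
```

The point of a shared schema like this is exactly the interoperability pitch above: any certified device or app that speaks the format can validate and act on the message without vendor-specific glue.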

As ASUS feels its way into the IoT space over the next few years, I was told that Weave will be one of the standards it keeps close to, in both hardware and software.

ASUS already offers a number of ‘SmartHome’ products, as shown above, and the assumption is that many of these will, down the line, fit into that Weave mold. My contention is that there are two very distinct levels of IoT product: those designed to be binary (a power switch that is simply on/off), and those designed to provide feedback (a power switch that also gives you a reading). It is the feedback devices that need to be configured so users can interpret the data. For example, if I have several power switches that can read power consumption, I want to be able to view how all of them are doing, and perhaps record that data. If the software only allows me to connect with one at a time, or only view one at a time, that leaves it in the hands of the crowd that only wants binary operation, and I would argue it is the non-binary crowd who will be the early adopters. The same goes for smart locks or temperature sensors: if I have several smart locks around the house, I want to know the status of each one, and perhaps I want push notifications when one is used (or abused).
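As a sketch of what that non-binary, multi-device view could look like in software (the outlet interface here is a stand-in I invented, not any vendor's API):

```python
from datetime import datetime, timezone


def poll_outlets(outlets):
    """Read every outlet once and return a timestamped snapshot.

    `outlets` maps a label to a zero-argument callable returning
    watts; in a real deployment each callable would wrap a vendor
    or Weave call to the physical device.
    """
    readings = {name: float(read()) for name, read in outlets.items()}
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
        "total_w": sum(readings.values()),
    }
```

Appending each snapshot to a file on a timer is all it takes to get the recorded history argued for above; the hard part in practice is that every vendor's outlet exposes a different interface, which is precisely the gap a standard like Weave is meant to close.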

I always say that a lot of IoT devices have potential; the issue for me is always going to be utility, configuration, ease of use, and software. The obvious prediction for now is that there will be several standards that won’t interoperate, and a user will have to invest in a particular ecosystem to get the best benefit. That sounds like the Android vs iOS smartphone debate all over again. Fingers crossed it runs a bit smoother for IoT.

Comments

  • WhisperingEye - Wednesday, January 20, 2016 - link

    Panzerknacker- I don't understand why you replied to a phone question with a router question.
  • Xajel - Sunday, January 24, 2016 - link

    The main reason behind this is that most consumer devices can hardly saturate a 1Gb connection; only some extreme cases, like a heavy home media server feeding multiple 4K streams, come close.

    So for a consumer, 1Gb is enough, and there are no consumer devices that can make use of 10Gb.

    Some advanced/enthusiast users use link aggregation as the backbone of their network (NAS -> Switch <- HTPC/Main PC) so it can serve multiple streams at the same time without any drops, but that is rare, as 1Gb is already enough for most users.

    Maybe pro users, like professional video editors, need these 10Gb links, but it's rare to see that in a home; a person who needs 10Gb Ethernet is already using a high-end workstation with professional systems, so it's not a consumer-oriented product any more.

    Personally, I thought about having 10Gb as a backbone for my home network just to be future-proof, but after looking again I found it too expensive. I can do 2x 1Gb link aggregation at much lower cost, and it will still serve me well for a few years ahead (NAS + HTPC + router + switch, all with LA connections).
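One caveat to the link-aggregation backbone described above: hash-based aggregation pins each flow to a single member link, so two 1Gb links help multiple simultaneous streams but never speed up a single transfer. A toy model (round-robin placement as a stand-in for real LACP hashing, which keys on addresses/ports):

```python
def aggregate_throughput_mbps(flows_mbps, links=2, link_mbps=1000):
    """Toy model of link aggregation: each flow is pinned to one
    member link (round-robin here as a simplification of LACP
    hashing), and each link caps out at link_mbps."""
    loads = [0.0] * links
    for i, flow in enumerate(flows_mbps):
        loads[i % links] += flow
    # A link delivers at most its line rate regardless of demand.
    return sum(min(load, link_mbps) for load in loads)
```

So a 2x 1Gb aggregate carries two 800 Mb/s streams in full, but a single 1.5 Gb/s transfer still tops out at 1 Gb/s, which is why aggregation suits the multi-stream NAS case and not one fast client.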
  • rhx123 - Tuesday, January 19, 2016 - link

    Using their own standard instead of TB3 just screams of vendor lock-in, especially when TB3 can do 36 Gbps over an active cable.
  • SirKnobsworth - Tuesday, January 19, 2016 - link

    According to Tom's, they actually require 2 type C cables too.
  • Alexvrb - Tuesday, January 19, 2016 - link

    Your statement is ironic to me because thunderbolt is itself... a proprietary standard. For external graphics we really need a standardized "Type G" port or something that can provide all the bandwidth by itself. But that will probably never happen. For that matter, even a much tamer enclosure hosting "only" up to 150W GPUs would still be a huge boost for a laptop.
  • nathanddrews - Tuesday, January 19, 2016 - link

    So... what are the implications of USB-C displacing HDMI and DisplayPort connectors? I know that it technically is DP over alternate mode, but it's clearly very popular. It seems like many new displays have it built in. Adaptive Sync? Latency? Would there be a penalty of some kind for sending video output through the USB bus instead of directly from the GPU?

    My only experience is with a first-generation USB display that sucked immense balls.
  • SirKnobsworth - Tuesday, January 19, 2016 - link

    It's just a multiplexer that sends the signal over unused pins. You only get two lanes (as opposed to the usual 4), but that's fine as long as you don't need 4k60.
  • Ryan Smith - Tuesday, January 19, 2016 - link

    You can get 4 lanes of DP. It just uses up all the differential pairs, so you have to give up USB 3.x to get it (which is why DP 1.3 is going to be such a big deal).
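For reference, the lane arithmetic behind this exchange can be sketched with approximate DP 1.2 (HBR2) figures; the timing numbers below are back-of-envelope, not spec-exact:

```python
# Why two DP 1.2 (HBR2) lanes fall short of 4K60 while four suffice.
# Approximate figures: HBR2 runs at 5.4 Gbit/s per lane with 8b/10b
# coding (80% payload efficiency); 3840x2160@60 with reduced blanking
# needs roughly a 533 MHz pixel clock at 24 bits per pixel.
HBR2_RAW_GBPS = 5.4
CODING_EFFICIENCY = 0.8
PIXEL_CLOCK_MHZ = 533
BITS_PER_PIXEL = 24


def lane_budget_gbps(lanes):
    """Usable bandwidth for a given DP lane count at HBR2."""
    return lanes * HBR2_RAW_GBPS * CODING_EFFICIENCY


needed_gbps = PIXEL_CLOCK_MHZ * BITS_PER_PIXEL / 1000  # ~12.8 Gbit/s
```

Two lanes carry about 8.64 Gbit/s, short of the ~12.8 Gbit/s a 4K60 stream needs, while four lanes carry about 17.28 Gbit/s with room to spare, which is the trade-off between keeping USB 3.x and getting full 4K60 described above.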
