How pricey are the hyperscalers' Internet of Things offerings?

Lead Researchers: Owen Rogers, Research Director - Digital Economics Unit, and Christian Renaud, Research Director - Internet of Things

451 Research recently published its Technology & Business Insight: The Economics of IoT report, which compared the pricing models and costs of the cloud IoT offerings of three leading hyperscalers – AWS, Google and Microsoft. Respondents to 451 Research's Voice of the Enterprise: IoT survey identify these three large public cloud providers as the leading IoT platform vendors.

Arriving at these results was no simple task. Costs related to AWS IoT Core, Google IoT Core and Microsoft Azure IoT Hub are structured so differently that manual cost comparisons are almost impossible. Even where pricing models appear simple, subtle differences can have significant cost implications.

For instance, one hidden element that affects cost is that each provider has its own definition of what constitutes a message. Some IoT platforms do not charge for 'keep alive' messages of just a few bytes, while others do. In fact, some cloud providers round such small messages up to the nearest kilobyte, meaning a 64-byte 'ping' message is billed for 16 times the capacity it actually uses.
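As a rough illustration of the effect, the sketch below assumes a 1KB billing block (an assumption for the example, not any provider's published tariff) and shows how rounding turns a 64-byte keep-alive into a full kilobyte of billed traffic:

```python
# A minimal worked example of how per-message rounding inflates the bill for
# tiny messages. The 1KB billing block is an illustrative assumption, not any
# provider's published tariff.
import math

BLOCK_BYTES = 1024  # assumed: billing rounds each message up to a 1KB block


def billed_bytes(message_bytes: int) -> int:
    """Bytes a single message is billed as, after rounding up to the block size."""
    return math.ceil(message_bytes / BLOCK_BYTES) * BLOCK_BYTES


ping = 64  # a small 'keep alive' ping, in bytes
inflation = billed_bytes(ping) / ping
print(f"A {ping}-byte ping is billed as {billed_bytes(ping)} bytes "
      f"- {inflation:.0f}x the data it actually carries.")
```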

To account for these differences and complete their analysis effectively, the analysts identified nine pricing parameters that have a bearing on the cost of the hyperscalers' cloud IoT platforms. Even so, those parameters yield millions of permutations in which one provider can be more cost-effective than another. The only practical approach was to price every combination and interpret the differences using machine-learning techniques.

The analysts constructed a Python simulation that performed 10 million comparisons of US pricing, with each simulated scenario using a randomly selected configuration of the nine price-impacting parameters. The resulting data was then fed into an analytics platform that produced a Chi-square automatic interaction detection (CHAID) decision tree. The tree, with a predictive strength of 96%, revealed which combinations of parameters drove each provider (AWS, Google or Microsoft) to be the cheapest.
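At a much smaller scale, the simulate-then-classify approach looks roughly like the sketch below. The three parameters, the two hypothetical tariffs and the scenario count are illustrative placeholders rather than the report's nine parameters or any provider's real price list, and scikit-learn's CART-based DecisionTreeClassifier stands in for the CHAID tree produced by the analytics platform:

```python
# A simplified sketch of the simulate-then-classify approach. All parameter
# ranges and tariffs below are hypothetical placeholders for illustration only.
import math
import random

from sklearn.tree import DecisionTreeClassifier

random.seed(42)

MINUTES_PER_MONTH = 60 * 24 * 30


def simulate_scenario() -> dict:
    """Draw a random deployment configuration (hypothetical parameter ranges)."""
    return {
        "devices": random.randint(100, 1_000_000),
        "messages_per_device_per_min": random.randint(1, 60),
        "message_size_kb": random.choice([0.5, 1, 2, 4, 8, 16]),
    }


def monthly_messages(s: dict) -> int:
    return s["devices"] * s["messages_per_device_per_min"] * MINUTES_PER_MONTH


def price_provider_a(s: dict) -> float:
    # Placeholder tariff: per-message charge, each message rounded up to 1KB blocks.
    blocks = monthly_messages(s) * math.ceil(s["message_size_kb"])
    return blocks / 1_000_000 * 1.00


def price_provider_b(s: dict) -> float:
    # Placeholder tariff: a per-device fee plus a cheaper flat per-message charge.
    return s["devices"] * 0.05 + monthly_messages(s) / 1_000_000 * 0.40


features, labels = [], []
for _ in range(10_000):  # the report ran roughly 10 million comparisons
    s = simulate_scenario()
    cheapest = "A" if price_provider_a(s) <= price_provider_b(s) else "B"
    features.append([s["devices"], s["messages_per_device_per_min"], s["message_size_kb"]])
    labels.append(cheapest)

# Fit a shallow decision tree to surface which parameter thresholds drive
# each hypothetical provider to be the cheapest.
tree = DecisionTreeClassifier(max_depth=3).fit(features, labels)
print(f"Training accuracy: {tree.score(features, labels):.2%}")
```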

Overall, the 451 Research Digital Economics Unit's methodology deduced that Microsoft Azure is the cheapest at scale, but AWS is the least expensive IoT provider for deployments of fewer than 20,000 devices, each sending an average of three or fewer messages of under 6KB per minute. Google was not found to be exclusively the cheapest in any scenario.
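Encoded as a rough rule of thumb (and only a rule of thumb, since the full decision tree in the report captures many more parameter interactions), the headline finding might look like the sketch below, which assumes the 6KB threshold refers to average message size:

```python
# A rough rule of thumb encoding only the headline finding above; the report's
# full CHAID decision tree captures many more parameter interactions and
# exceptions. Assumes the 6KB threshold refers to average message size.

def likely_cheapest_provider(devices: int,
                             messages_per_device_per_min: float,
                             avg_message_size_kb: float) -> str:
    """Return the provider the headline finding suggests is cheapest."""
    small_deployment = (devices < 20_000
                        and messages_per_device_per_min <= 3
                        and avg_message_size_kb < 6)
    return "AWS" if small_deployment else "Microsoft Azure"


print(likely_cheapest_provider(5_000, 2, 1))      # -> AWS
print(likely_cheapest_provider(100_000, 10, 8))   # -> Microsoft Azure
```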

To learn more about what factors drive the cloud-related costs of Internet of Things deployments, and in which scenarios each cloud provider has a cost advantage, register below for our webinar on June 13 at 1:00 pm ET.


Empowered users drive Contextual Experiences

Contextual experiences are driven by changes in user behavior, empowered by technologies such as smartphones, machine learning and the cloud. However, they have as much to do with demographic and lifestyle changes as they do with the technology that enables them. Given that 80% of online purchases in 2018 will be influenced by mobile, and that within a decade the average person will have more conversations with bots than with other humans each day, organizations that fail to deliver contextual experiences will be passed over for those that do.

Users, not organizations, will increasingly determine how they consume information, engage with brands and get work done. Today's empowered users can now dictate the terms of their business engagements. According to data from 451 Research's Voice of the Connected User Landscape (VoCUL): 1H 2017 Corporate Mobility and Digital Transformation, 82% of businesses say that machine learning for automated contextual recommendations is important to creating personalized experiences. Growth in data for contextualized experiences, empowered by technologies such as mobile, cloud and machine learning, will create a significant gap between digital leaders and laggards when it comes to using technologies for strategic innovation.


As intelligence becomes pervasive, data becomes the ultimate asset

Matt Aslett, Research Director, Data Platforms & Analytics
John Abbott, Founder & Distinguished Analyst

'Intelligence' is the ability to capture, analyze, understand and act on information, including the ability to recognize patterns, comprehend ideas, plan, predict, problem-solve, identify actions and make faster decisions. Traditionally, business intelligence has almost exclusively involved humans analyzing data generated by enterprise applications. But we are now in the midst of a revolution toward 'Pervasive Intelligence,' fueled by self-service analytics, the Internet of Things (IoT), artificial intelligence (AI), machine learning and deep learning, and business process automation tools and techniques – and enabled by the new economics of generating, storing and processing data.

Pervasive Intelligence has the potential to rapidly change the technology product and services landscape, and we expect it to be a significant catalyst for the rapid evolution of products and services. Vendors, applications and services with the analytics and AI capabilities to translate data into value and intelligence will survive and thrive; those without will fall by the wayside. Incumbent data platforms and analytics vendors, however, hold the best cards: their established installed customer bases and substantial cash reserves enable them to acquire potential challengers and invest in research and development.

[Image: Pervasive Intelligence – Data as the Ultimate Asset]

Industry perspectives on securing the enterprise from Voice of the Service Provider

451 Research's Voice of the Service Provider service helps to both qualify and quantify buying behaviors, business drivers and strategic priorities for the expanding universe of public cloud providers, hosters, MSPs, telcos, systems integrators, SaaS companies and colos.

The infrastructures and architectures of most large enterprises are starting to resemble those of service providers – with networked datacenters, diverse communication channels, distributed user communities, elastic and scalable services, and agile delivery. This constantly evolving and expanding ecosystem often sees a rapid influx of new technologies and capabilities, matched only by the growing and intensifying threat landscape arising from these advancements. Not only are these environments far more difficult and complex to secure than traditional, centralized IT, but there is also a steep learning curve to fully grasp the intricacies of a 'service provider like' model. With this in mind, 451 Research recently asked several service providers what advice they have for enterprises moving in this direction.

Enterprise infrastructures have evolved beyond being built solely to offer intra-company connectivity and services to employees. They now deliver services directly to customers, enable and power new products and services, and integrate with a long chain of partners, suppliers and distributors. However, securing customer-facing cloud environments that operate at this scale is significantly more challenging. Organizations have to take into account factors such as threat detection and regulatory compliance across multi-cloud environments, identity management, and disparate tools for edge device authentication. Assuming all clouds are alike is one of the main mistakes enterprises make. Security experts at service providers of all sizes have many recommendations for enterprises, such as adopting a defense-in-depth strategy with comprehensive controls, leveraging APIs and automation to the greatest extent possible, and partnering as needed to fill skills and capability gaps.

The cloud transformation journey: Great expectations lead to a brave new world

It's easy to think of cloud adoption as a one-time event – you choose a cloud, you consume that cloud, and the rest is history – but realistically, for most enterprises it is an incremental and iterative process. Traditional refresh cycles drive periodic purchases of new hardware and updates of software; in the cloud, rapidly growing feature sets and on-demand consumption require frequent reassessment of the venues and technologies that best meet changing enterprise needs. No one wants to switch providers all the time – enterprises naturally try to optimize what they already have, and providers that are best at accommodating them are likely to have the most loyal customers.

Our cloud transformation journey model shows the enterprise cloud cost experience over time, representing the costs payable at each stage of the enterprise journey to value-adding 'utility' IT. We identify the cycle of cloud consumption – migration and implementation, cost savings and cost increases, governance and optimization, and transformative value. The ups and downs of this experience and the time required to realize value vary by company and by application; with experience and automation, the amplitude of the curve flattens and the time to value shortens. And then the cycle starts all over again. Read the full report here.

[Image: The Cloud Transformation Journey]