Podcast: State of the Internet amid coronavirus pandemic

With the nation in shutdown mode induced by the coronavirus pandemic, in-home media usage has soared, along with increased bandwidth demand from students and those working remotely. Despite these pressures, connectivity has remained strong, and consumers have expressed interest in adding more streaming services to their entertainment options, according to Craig Matsumoto, senior analyst at 451 Research, and John Fletcher, media analyst at Kagan, a research unit within S&P Global Market Intelligence. Both were guest speakers on the latest episode of "MediaTalk," an S&P Global Market Intelligence podcast.

Video: Voice of the Enterprise Digital Pulse Coronavirus Flash Survey



In our Voice of the Enterprise: Digital Pulse, Coronavirus Flash Survey, we measured the impact of the pandemic on business operations and strategy. Some of our key findings, including our 451 Take, are summarized in the video above, and you can find additional insights on the S&P Global Market Intelligence website.

As the impact of the pandemic continues, we are sharing all of our public-facing insight on this COVID-19 microsite.

Top Reports from January 1 - March 16, 2020

Below is a summarized list of the reports we have made publicly available since the start of Q1 2020. Much more content is available in 451 Research's Dashboard. Log in or Apply for Trial access to browse all 451 content.

Applied Infrastructure & DevOps Research

Enterprise Network Automation Gets Competitive

Cloud & Managed Services Research
Cloud Trends in 2020: The Year of Complexity, and its Management
Access to Talent Driving Managed Services Opportunity
How to Navigate 451's Cloud-Native Research in 2020
The Old Managed Service and the Sea: Cloud Economics Trends in 2020

Customer Experience & Commerce Research
Digital Wallet Adoption: a Merchant and Consumer Perspective

Data, AI & Analytics Research
The Importance of Cloud-based AI Platforms for Enterprise AI
451 Perspective: Continuous Intelligence Emerges as Successor to Real-time Analytics

Datacenter Services & Infrastructure Research
Featured Data - United Arab Emirates: Leased Datacenter Market
The Trends Driving the Multi-Tenant Datacenter and Services Industry in 2020

Information Security Research
Perceptions Toward the Security Risk of Cloud are Shifting

Internet of Things Research
The Great Crew Change: A Demographic Time Bomb Driving Adoption of Industrial IoT, AR and AI
The Evolution and Complexity of the Games Industry Applied at the Edge and in the Cloud

Workforce Productivity & Collaboration Research
Coronavirus will Disrupt your Workforce: Ensure that you have the right Tooling Strategy for Remote Workers
Coronavirus will Disrupt your Workforce: Here are 10 Ways to Mitigate the Business Impact
The 10 Workforce Productivity and Collaboration Trends You Need to Know in 2020

2020 Trends in IT
Download these complimentary excerpts of our 2020 Preview reports.
We completed our 2020 Preview webinar series in February. All webinars are available to watch on demand here.

Is your AI Infrastructure Prepared to Meet Future Demands?

Written by: Senior Research Associate Jeremy Korn and Research Vice President Nick Patience

Many organizations are underprepared for the demands AI and machine learning applications will place on their infrastructure, but they are prepared to spend money to change that situation.

Those are just a couple of conclusions we can draw from our new Voice of the Enterprise: AI and Machine Learning Infrastructure 2019 survey. Almost half (45%) of enterprises indicate that their current AI infrastructure will not be able to meet future demands (see Figure 1), which prompts a few questions:

• Why is that?
• What do they propose to do about it?
• Are they prepared to spend money to fix the problem?

Figure 1: Status of enterprise infrastructure for AI
Why is that?


Broadly speaking, data is the reason infrastructure needs to be overhauled to deliver AI at scale: 89% of respondents in our survey expect the volume of data used in their machine learning workloads to increase in the next year, and almost half project an increase of 25% or more. Much of that growth will come from unstructured data, since the most transformative use cases of AI and machine learning involve gaining insight from unstructured data, be it text, images, audio or video.

What do they propose to do about it?

Organizations understand that taking advantage of AI at scale is not simply a case of scaling existing infrastructure. New infrastructure is needed to cope with the demands of machine learning workloads, including scalable storage, dedicated accelerators and low-latency networks, and it needs to be deployed across a variety of execution venues.

Enterprises also express a variety of concerns about their AI infrastructures, from the security of these systems to the opacity of data management capabilities. Overhauling AI infrastructure demands more than just buying better hardware; it will require new tools and updates to architectural paradigms.

Are they prepared to spend money to fix the problem?

Yes, they are. Our survey shows that 83% of responding enterprises say they will expand AI infrastructure budgets next year, with 39% of those projecting an increase of 25% or more. Spending on cloud-based AI platforms will lead the charge, with 89% of respondents planning to increase spending on them in the next year.

Our Voice of the Enterprise: AI and Machine Learning Infrastructure 2019 survey contains a lot more data on subjects such as spending decision-makers, the specific points in the machine learning process that put strain on infrastructure, the types of AI-specific infrastructure components organizations are looking to buy, the areas in which skill shortages are most acute, and how often and where machine learning models are trained and deployed.

For more insight, check out this free Market Insight report.


Taking a New Approach to Unstructured Data Management

Written by: Steven Hill, Senior Analyst, Applied Infrastructure and Storage Technologies, 451 Research

Enterprise storage has never been easy. Business depends on data, and all things data begin and end at storage, but the way we handle data in general, and unstructured data in particular, hasn't evolved at the same pace as other segments of the IT industry. Sure, we've made storage substantially faster and higher in capacity, but we haven't dealt with the real problems of storage growth caused by that increased performance and density, much less the challenges of managing data growth that now spans multiple, hybrid storage environments across the world. The truth is, you can't control what you can't see. As a result, a growing number of businesses are paying a great deal of money to store multiple copies of the same data over and over, or, perhaps even worse, keeping multiple versions of that same data without any references between them at all.

This massive data fragmentation across multiple storage platforms can be one of the major sources of unchecked storage growth, and it compounds the new risks of a "keep everything" approach to data management. Privacy regulations like the EU's GDPR and California's CCPA require a complete reevaluation of storage policies across many vertical markets to ensure compliance with new rules for securing, protecting, delivering, redacting and anonymizing data containing personally identifiable information (PII), and for authenticating its deletion on demand. While this can be a manageable problem for database information, it is a far greater challenge for unstructured data such as documents, video and images, which make up a growing majority of enterprise data storage. Without some form of identification, this data goes "dark" soon after it leaves the direct control of its creator, and regulations like GDPR don't make a distinction between structured and unstructured data.
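To make the identification challenge concrete, here is a minimal Python sketch of the kind of scanning that gives unstructured files a classification before they go "dark." It is an illustration only, not a description of any particular product: the folder name, the PII patterns and the classification labels are all assumptions for demonstration.

import re
from pathlib import Path

# Illustrative-only PII patterns; real classifiers cover many more identifiers.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_file(path: Path) -> dict:
    """Return simple classification metadata for one unstructured file."""
    text = path.read_text(errors="ignore")
    found = [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]
    return {"path": str(path), "contains_pii": bool(found), "pii_types": found}

if __name__ == "__main__":
    # Hypothetical folder of documents; point this at a real share or export.
    for doc in Path("documents").glob("*.txt"):
        print(classify_file(doc))

Recording even this minimal classification alongside each file gives downstream retention and deletion policies something concrete to act on.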

There can be a number of perfectly good reasons for maintaining similar or matching data sets at multiple locations, such as data protection or increased availability. The real challenge lies in maintaining policy-based control of that data regardless of physical location, while at the same time making it available to the right people for the right reasons. Documents and media such as images, audio and video make up a growing percentage of overall business data, and companies have a vested interest in continuing to use that data. At the same time, failing to manage all of it properly carries serious legal ramifications that could cost companies millions.

The cloud has changed the IT delivery model forever, and with a hybrid infrastructure, business IT is no longer limited by space, power and capital investment. Decisions about workload and data placement can now be based on the best combination of business needs, economics, performance and availability rather than on location alone, but with that freedom comes a need to extend data visibility, governance and policy to data wherever it may be. In this context, data fragmentation across multiple systems is almost inevitable, so it really comes down to accepting this as a new challenge and adopting next-generation storage management based on an understanding of what our data is, rather than where it is.

Mass data fragmentation is a problem that existed before the cloud, but fortunately the technology needed to fix it is already available. From an unstructured data perspective, we believe this involves embracing a modern approach that can span data silos for backups, archives, file shares, test and development data sets, and object stores on a platform that bridges on-premises, public cloud and edge environments. A platform-based approach can give you visibility into your data wherever it resides and, more importantly, can help you maintain greater control by reducing the number of data copies, managing storage costs, and keeping your data compliant and properly backed up. We also think an ideal solution seamlessly blends legacy, file-based storage with the management flexibility and scalability offered by metadata-based object storage. This requires a fundamental shift in the way we've addressed unstructured data management in the past, but it's a change that offers greater data availability and storage-level automation, and provides a new set of options for controlling and protecting business data that is both a major business asset and a potential liability if not handled correctly.
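As a rough illustration of what managing by "what the data is" can look like in practice, the short Python sketch below writes an object with user-defined classification metadata attached. It uses Amazon S3's boto3 client simply as a stand-in for any metadata-capable object store; the bucket name, keys and metadata fields are assumptions, and a real deployment would map them to its own policy engine.

import boto3

# Minimal sketch: attach descriptive metadata at write time so policies can be
# driven by what the data is, not just where it sits. All names are illustrative.
s3 = boto3.client("s3")

def store_with_classification(bucket: str, key: str, body: bytes, contains_pii: bool) -> None:
    """Upload an object along with user-defined classification metadata."""
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=body,
        Metadata={
            "classification": "pii" if contains_pii else "general",
            "retention-policy": "review-on-request" if contains_pii else "standard",
        },
    )

if __name__ == "__main__":
    # Hypothetical example object.
    store_with_classification("example-archive-bucket", "reports/q1-summary.txt", b"...", contains_pii=False)

Because the classification travels with the object rather than living only in a separate catalog, policy decisions can follow the data as it moves between venues, which is the essence of location-independent governance.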