Is Your AI Infrastructure Prepared to Meet Future Demands?

Written by: Senior Research Associate Jeremy Korn and Research Vice President Nick Patience

Many organizations are underprepared for the demands AI and machine learning applications will place on their infrastructure, but they are prepared to spend money to change that situation.

Those are just a couple of conclusions we can draw from our new Voice of the Enterprise: AI and Machine Learning Infrastructure 2019 survey. Almost half (45%) of enterprises indicate that their current AI infrastructure will not be able to meet future demands (see Figure 1), which prompts a few questions:

• Why is that?
• What do they propose to do about it?
• Are they prepared to spend money to fix the problem?

Figure 1: Status of enterprise infrastructure for AI

Why is that?

Broadly speaking, data is the reason infrastructure needs to be overhauled to deliver AI at scale: 89% of respondents in our survey expect the volume of data used in their machine learning workloads to increase in the next year, and almost half project an increase of 25% or more. Much of that growth will come from unstructured data, since the most transformative use cases of AI and machine learning involve gaining insight from unstructured data, be it text, images, audio or video.

What do they propose to do about it?

Organizations understand that, for them to take advantage of AI at scale, it is not simply a case of scaling existing infrastructure. New infrastructure is needed to cope with the demands of machine learning workloads, including new scalable storage, dedicated accelerators and low-latency networks. These need to be deployed across a variety of execution venues.

Enterprises also express a variety of concerns about their AI infrastructures, from the security of these systems to the opacity of data management capabilities. Overhauling AI infrastructure demands more than just buying better hardware; it will require new tools and updates to architectural paradigms.

Are they prepared to spend money to fix the problem?

Yes, they are. Our survey shows that 83% of responding enterprises say they will expand AI infrastructure budgets next year, with 39% of those projecting an increase of 25% or more. Spending on cloud-based AI platforms will lead the charge, with 89% of respondents planning to increase spending on them in the next year.

Our Voice of the Enterprise: AI and Machine Learning Infrastructure 2019 survey contains a lot more data on subjects such as spending decision-makers, the specific points in the machine learning process that put strain on infrastructure, the types of AI-specific infrastructure components organizations are looking to buy, the areas in which skill shortages are most acute, and how often and where machine learning models are trained and deployed.

For more insight, check out this free Market Insight report.


Featured Data: VotE Survey Results Indicate that Automation Investment is on the Rise

Contributed by Principal Analyst Carl Lehmann

According to our latest Voice of the Enterprise – Digital Pulse: Budgets & Outlook survey data, 58% of respondents indicated that their current IT environments are either highly or mostly reliant on manual processes. As they move toward digital transformation, these organizations need to take on new IT projects that will allow them to adapt to digital disruption, adopt new technologies and meet new business objectives. Automation is a key tool that can free IT teams to refocus their time on these new projects, so it is sensible that 75% of VotE respondents said they expect to increase their automation investment in the next 12 months.

I see this trend as a key driver for digital business and the future of hybrid IT as businesses transform for the future. This will be an important element as I collect my insights for my presentation – “Automation Effectiveness – Are You Prepared to Compete for the Future?” – at the 15th annual Hosting & Cloud Transformation Summit this coming September. Register today to get the insights you will need as you navigate the digital revolution.

451 Research Shines as a “Go-To” Analyst Firm as We Start Our 20th Year

Over the past 19 years, 451 Research has worked hard to deliver unique and impactful insight and data about innovation in the technology industry, and we are thankful for the recognition that our clients give us. However, it is always special when your work is recognized by the tech industry and the analyst relations community more broadly. In just the past year, 451 Research and our analysts have been acknowledged as “highly valued” by three different analyst relations communities – further cementing our reputation as one of the technology industry’s premier firms.


Taking a New Approach to Unstructured Data Management

Written by: Steven Hill, Senior Analyst, Applied Infrastructure and Storage Technologies, 451 Research

Enterprise storage has never been easy. Business depends on data – and all things data begin and end at storage – but the way we handle data in general, and unstructured data in particular, hasn't really evolved at the same pace as other segments of the IT industry. Sure, we've made storage substantially faster and higher in capacity, but we haven't dealt with the real problems of storage growth caused by this increased performance and density, much less the challenges of managing data growth that now spans multiple hybrid storage environments across the world. The truth is, you can't control what you can't see, and as a result a growing number of businesses are paying a great deal of money to store multiple copies of the same data over and over – or, perhaps even worse, keeping multiple versions of that same data without any references between them at all.
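
To make the duplicate-copy problem concrete, here is a minimal sketch of how content addressing can surface redundant copies across storage locations by hashing file contents. It is purely illustrative – the silo paths are hypothetical, and real storage platforms do this at far greater scale:

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file's contents in chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def find_duplicates(silos: list[Path]) -> dict[str, list[Path]]:
    """Group files from several storage silos by content hash.

    Any hash mapping to more than one path is a copy the business
    is paying to store (at least) twice.
    """
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for silo in silos:
        for path in silo.rglob("*"):
            if path.is_file():
                by_hash[sha256_of(path)].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}


if __name__ == "__main__":
    # Hypothetical mount points standing in for separate storage silos.
    duplicates = find_duplicates([Path("/mnt/nas_archive"), Path("/mnt/cloud_sync")])
    for digest, copies in duplicates.items():
        print(f"{digest[:12]}... stored {len(copies)} times:")
        for copy in copies:
            print(f"  {copy}")
```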

This massive data fragmentation across multiple storage platforms can be one of the major sources of unchecked storage growth, and added to that are the new risks of a "keep everything" approach to data management. Privacy initiatives like the EU's GDPR and California's CCPA (2018) require a complete reevaluation of storage policies across many vertical markets to ensure compliance with new rules for securing, protecting, delivering, redacting, anonymizing and authenticating the deletion of data containing personally identifiable information (PII) on demand. While this can be a more manageable problem for database information, it's a far greater challenge for the unstructured data – documents, video and images – that makes up a growing majority of enterprise data storage. Without some form of identification, this data goes "dark" soon after it leaves the direct control of its creator, and initiatives like GDPR don't make a distinction between structured and unstructured data.
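
As a rough illustration of why unstructured data "goes dark," the sketch below scans a hypothetical file share for a few common PII patterns – the kind of inventory a GDPR deletion request forces you to build. The patterns and the /srv/fileshare path are simplified assumptions; production PII discovery relies on much richer detection:

```python
import re
from pathlib import Path

# Deliberately simple, illustrative patterns; real PII discovery uses far
# richer models and jurisdiction-specific rules.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\+?\d[\d ()-]{7,}\d\b"),
}


def scan_for_pii(root: Path) -> dict[Path, list[str]]:
    """Return, per text document under root, the PII categories it appears to hold."""
    findings: dict[Path, list[str]] = {}
    for path in root.rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [name for name, rx in PII_PATTERNS.items() if rx.search(text)]
        if hits:
            findings[path] = hits
    return findings


# A deletion request can then start from an inventory instead of the dark.
for doc, categories in scan_for_pii(Path("/srv/fileshare")).items():
    print(doc, "->", ", ".join(categories))
```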

There can be a number of perfectly good reasons for maintaining similar or matching data sets at multiple locations, such as data protection or increased availability. The real challenge lies in maintaining policy-based control of that data regardless of physical location, while at the same time making it available to the right people for the right reasons. Documents and media such as images, audio and video make up a growing percentage of overall business data, and companies have a vested interest in making continued use of that data. At the same time, failing to manage all of it properly can carry serious legal ramifications that could potentially cost companies millions.

The cloud has changed the IT delivery model forever, and with a hybrid infrastructure, business IT is no longer limited by space, power and capital investment. Decisions about workload and data placement can now be based on the best combination of business needs, economics, performance and availability rather than on location alone, but with that freedom comes a need to extend data visibility, governance and policy to data wherever it may be. In this context, data fragmentation across multiple systems is almost inevitable, so it really comes down to accepting this as a new challenge and adopting next-generation storage management based on an understanding of what our data is, rather than where it is.

Mass data fragmentation is a problem that existed before the cloud, but fortunately the technology needed to fix it is already available. From an unstructured data perspective, we believe this involves embracing a modern approach that can span data silos – backups, archives, file shares, test and development data sets, and object stores – in a way that bridges on-premises, public cloud and edge environments. A platform-based approach can give you visibility into your data wherever it resides and, more importantly, can help you maintain greater control by reducing the number of data copies, managing storage costs, and ensuring your data stays compliant and properly backed up. We also think an ideal solution seamlessly blends legacy, file-based storage with the management flexibility and scalability offered by metadata-based object storage. This requires a fundamental shift in the way we've addressed unstructured data management in the past, but it's a change that offers greater data availability and storage-level automation, and provides a new set of options for controlling and protecting business data that is both a major business asset and a potential liability if not handled correctly.
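
As a sketch of what managing data by what it is, rather than where it is, might look like at the metadata layer, the snippet below attaches policy attributes – PII status, retention period, location – to each object record and drives decisions from them. The schema is a hypothetical illustration, not any vendor's API:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import Optional


@dataclass
class ObjectRecord:
    """A stored object plus the policy metadata that travels with it."""
    key: str
    location: str            # "on-prem", "public-cloud", "edge", ...
    contains_pii: bool = False
    retention_days: int = 365
    created: date = field(default_factory=date.today)

    def expired(self, today: Optional[date] = None) -> bool:
        today = today or date.today()
        return today > self.created + timedelta(days=self.retention_days)


# Policy decisions key off what the data is, not where it happens to sit.
catalog = [
    ObjectRecord("invoices/2019/0001.pdf", "on-prem",
                 contains_pii=True, retention_days=2555),  # e.g. a 7-year rule
    ObjectRecord("marketing/banner.png", "public-cloud",
                 retention_days=90, created=date(2018, 1, 1)),
]

for obj in catalog:
    if obj.contains_pii:
        print(f"{obj.key}: in scope for GDPR/CCPA deletion requests ({obj.location})")
    if obj.expired():
        print(f"{obj.key}: past retention; candidate for defensible deletion")
```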

Has Integrated Automation conquered the land RPA and AI once battled for?

Contributed by Principal Analyst Carl Lehmann 

Much the way Winter came for the Game of Thrones heroes in the new season (we promise this is the only Game of Thrones reference, and we will not share any spoilers), there is talk spreading in the tech industry that Integrated Automation has come to displace tools like robotic process automation (RPA). We certainly don't disagree; in fact, we predicted back in 2017 that RPA companies would likely not survive as stand-alone vendors.

In this report from April 2017, we predicted that RPA vendors focused only on automating repetitive tasks – while very welcome in many IT departments in the short term – would be less likely to survive as stand-alone vendors than more sophisticated platforms that can call upon various machine learning (ML) technologies to add contextual awareness and guide unstructured interactions toward desired outcomes. Our analysts considered that even RPA platforms that can automate based on rules, conditional routing and logical operations, and modify their behavior based on what they learn, would likely be subsumed into the ML platforms of hyperscale CSPs, IT leviathans and the tool kits of larger systems integrators. In our opinion, it was unlikely RPA would last long as a stand-alone product. The sketch after this paragraph illustrates the distinction.
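
To ground the distinction this prediction rests on, here is a hypothetical, minimal contrast between a rules-only RPA step and one where an ML model adds contextual awareness by reading unstructured text before the rule fires. The routing logic and toy classifier are illustrative stand-ins, not any vendor's product:

```python
from dataclasses import dataclass


@dataclass
class Invoice:
    amount: float
    free_text_note: str


def rules_only_route(inv: Invoice) -> str:
    """Classic RPA: deterministic rules and conditional routing."""
    return "auto-approve" if inv.amount < 500 else "manager-queue"


def toy_classifier(text: str) -> str:
    """Stand-in for a trained text-classification model."""
    return "dispute" if "disagree" in text.lower() else "routine"


def ml_assisted_route(inv: Invoice) -> str:
    """Platform-style step: a model reads the unstructured note for
    context before the deterministic rule fires."""
    if toy_classifier(inv.free_text_note) == "dispute":
        return "exceptions-queue"
    return rules_only_route(inv)


inv = Invoice(amount=120.0, free_text_note="We disagree with this charge.")
print(rules_only_route(inv))   # auto-approve: the rule alone misses the context
print(ml_assisted_route(inv))  # exceptions-queue: the model catches the dispute
```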

Again in August 2017, our analyst team noted a rising trend of BPM software transforming into a process- and content-oriented application development and runtime platform, a category we coined the 'digital automation platform' (DAP). DAPs, as described in the report, will emerge as uniform development, integration and runtime environments that enable intelligent process automation (IPA) – a managerial discipline focused on intuitive user experiences, contextual awareness and transparent execution. Much like what others are describing as Integrated Automation today, a DAP would combine RPA capabilities – to create software 'bots' that automate repetitive human activities in business processes – and AI integration – to surface 'next best guess' activities for application developers and users (process stakeholders) and extract insight – in one solution. In particular, the report said RPA would “likely become a core enabling technology in several DAP vendors' offerings.”

In short, DAPs and Integrated Automation sound less like the death of RPA and similar technologies, and more like the next logical step in accelerating business operations and making them more efficient. Both describe feature-rich development platforms for content- and process-oriented applications, plus a method to extract knowledge from automated execution to meet enterprises' innovation and operational efficiency needs. In fact, our most recent research on this evolution (in this spotlight report, now available for public access) covers why we believe the core tools needed to discover and affect how value and advantage are created include next-generation DAPs, RPA technology, hybrid integration platforms (HIPs) and process mining technology (PMT) platforms. 451 Research clients can access all Market Insight reports on RPA, DAP and beyond in our Research Dashboard. Don't have access? Apply for a Trial.
