Celebrating the 15th Annual Hosting & Cloud Transformation Summit – Presenter Q&A with William Fellows

Interview with William Fellows, Research Vice President

Next up in our Presenter Q&A series: Research Vice President William 'Wif' Fellows, who will lead the “Making Infrastructure Invisible – Simplifying the Jumble of Clouds, Containers and Venues” session at the Hosting & Cloud Transformation Summit (HCTS) on September 24 at 3:40 pm PT in Orovada Breakout Room 1 at the Aria in Las Vegas.

Q: What will you be discussing in this session?

A: The emergence of clouds, containers and microservices promises to dramatically simplify the deployment of applications and services in support of digital transformation, which enterprises tell us remains the organizing principle for their IT activity. However, this potential will only be realized if management and orchestration across these hybrid deployments are handled effectively and transparently, so end users can be assured that services will meet their performance standards. In this session, we’ll discuss how to approach these challenges.

Celebrating the 15th Annual Hosting & Cloud Transformation Summit – Presenter Q&A with Al Sadowski

Interview with Al Sadowski, Research Vice President

In May, we opened registration for our 15th annual Hosting & Cloud Transformation Summit, and the response has been incredible! We are excited to bring our leading insights on disruptive and innovative technologies back to Las Vegas. With two months to go before showtime, we will start by introducing our presenting analysts and their session topics through a new series of Q&As.

First up, Research Vice President Al Sadowski:

Q: What will you be discussing in this session?

A: The title of my session is “Voice of the Service Provider: How Do Service Providers Stack Up?” I’ll be sharing insights into how service providers are transforming their own businesses to remain relevant in the eyes of enterprises. Supported by our latest survey data, the session will go into detail on differentiation strategies, budgets and spending comparisons, and product roadmaps. It will be followed by a panel discussion with hardware and software vendors focused on enabling service providers with enterprise-facing products, as well as on service provider best practices.

Q: Why is this topic significant?

A: Service providers are no longer just purveyors of compute and storage; they are now seen as partners for services beyond basic infrastructure and as key allies in hybrid architectures. But nearly all have a long way to go to automate and affordably scale a business that can deliver reliable, secure and high-performance products and services. According to our Voice of the Service Provider: Infrastructure Evolution 2019 survey results, 94% of service providers admit their IT environments will require partial or complete transformation over the next three years!


Featured Data: VotE Survey Results Indicate that Automation Investment is on the Rise

Contributed by Principal Analyst Carl Lehmann

According to our latest Voice of the Enterprise – Digital Pulse: Budgets & Outlook survey data, 58% of respondents indicated that their current IT environments are either highly or mostly reliant on manual processes. As they move toward digital transformation, these organizations need to take on new IT projects that will allow them to adapt to digital disruption, adopt new technologies and meet new business objectives. Automation is a key tool for freeing IT teams to refocus their time on these new projects, so it is no surprise that 75% of VotE respondents indicated they expect to increase their automation investment in the next 12 months.

I see this trend as a key driver for digital business and the future of hybrid IT as businesses transform. It will be an important element as I gather insights for my presentation – “Automation Effectiveness – Are You Prepared to Compete for the Future?” – at the 15th annual Hosting & Cloud Transformation Summit this September. Register today to get the insights you will need as you navigate the digital revolution.

451 Research Shines as a “Go-To” Analyst Firm as We Start Our 20th Year

Over the past 19 years, 451 Research has worked hard to deliver unique and impactful insight and data about innovation in the technology industry, and we are thankful for the recognition that our clients give us. However, it is always special when your work is recognized by the tech industry and the analyst relations community more broadly. In just the past year, 451 Research and our analysts have been acknowledged as “highly valued” by three different analyst relations communities – further cementing our reputation as one of the technology industry’s premier firms.


Taking a New Approach to Unstructured Data Management

Written by: Steven Hill – Senior Analyst, Applied Infrastructure and Storage Technologies, 451 Research

Enterprise storage has never been easy. Business depends on data – and all things data begin and end at storage – but the way we handle data in general, and unstructured data in particular, hasn’t really evolved at the same pace as other segments of the IT industry. Sure, we’ve made storage substantially faster and higher capacity, but we haven’t dealt with the real problems of storage growth caused by this increased performance and density, much less the challenges of managing data growth that now spans multiple hybrid storage environments across the world. The truth is, you can’t control what you can’t see, and as a result a growing number of businesses are paying a great deal of money to store multiple copies of the same data over and over – or, perhaps even worse, to keep multiple versions of that same data without any references between them at all.
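To make that visibility problem concrete, here is a minimal sketch – plain Python with hypothetical mount points, not any vendor’s tooling – of how content hashing can surface identical copies of the same data sitting on different file shares:

```python
# Minimal sketch: surface duplicate copies of the same content across
# storage locations by hashing file contents. Paths are hypothetical.
import hashlib
from collections import defaultdict
from pathlib import Path

def content_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(roots):
    """Group files by content digest; groups with >1 entry are duplicate copies."""
    groups = defaultdict(list)
    for root in roots:
        for path in Path(root).rglob("*"):
            if path.is_file():
                groups[content_digest(path)].append(path)
    return {digest: paths for digest, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    # Hypothetical mount points for two file shares
    for digest, paths in find_duplicates(["/mnt/share_a", "/mnt/share_b"]).items():
        print(digest[:12], "->", [str(p) for p in paths])
```

Even a toy crawl like this shows why copy sprawl goes unnoticed: without something comparing content across locations, each share looks perfectly reasonable on its own.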

This massive data fragmentation across multiple storage platforms can be one of the major sources of unchecked storage growth, and added to that are the new risks of a “keep everything” approach to data management. Privacy regulations like the EU’s GDPR and California’s CCPA of 2018 require a complete reevaluation of storage policies across many vertical markets to ensure compliance with new requirements for securing, protecting, delivering, redacting, anonymizing and verifiably deleting data containing personally identifiable information (PII) on demand. While this can be a more manageable problem for database information, it’s a far greater challenge for unstructured data such as documents, video and images, which make up a growing majority of enterprise data storage. Without some form of identification, this data goes “dark” soon after it leaves the direct control of its creator, and regulations like GDPR don’t distinguish between structured and unstructured data.
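As a rough illustration of what that identification step might look like, the following is a minimal sketch – with illustrative regex patterns and a hypothetical file-share path, not a production classifier – that flags text documents likely to contain PII so they can at least be located when a deletion or redaction request arrives:

```python
# Minimal sketch: flag unstructured documents that likely contain PII so they
# can be located for deletion or redaction requests. Real classification is
# far more sophisticated; the patterns and paths here are illustrative only.
import re
from pathlib import Path

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-. ]\d{3}[-. ]\d{4}\b"),
}

def classify_document(path: Path) -> set:
    """Return the set of PII categories detected in a text document."""
    text = path.read_text(errors="ignore")
    return {name for name, pattern in PII_PATTERNS.items() if pattern.search(text)}

def scan_share(root: str):
    """Yield (path, categories) for every document that appears to contain PII."""
    for path in Path(root).rglob("*.txt"):
        categories = classify_document(path)
        if categories:
            yield path, categories

if __name__ == "__main__":
    for path, categories in scan_share("/mnt/file_share"):  # hypothetical mount
        print(path, sorted(categories))
```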

There can be a number of perfectly good reasons for maintaining similar or matching data sets in multiple locations, such as data protection or increased availability. The real challenge lies in maintaining policy-based control of that data regardless of physical location, while at the same time making it available to the right people for the right reasons. Documents and media such as images, audio and video make up a growing percentage of overall business data, and companies have a vested interest in making continued use of that data. At the same time, failing to manage all this data properly carries serious legal ramifications that could potentially cost companies millions.

The cloud has changed the IT delivery model forever, and with a hybrid infrastructure, business IT is no longer limited by space, power and capital investment. Decisions about workload and data placement can now be based on the best combination of business needs, economics, performance and availability rather than on location alone; but with that freedom comes a need to extend data visibility, governance and policy to data wherever it may be. In this context, some data fragmentation across multiple systems is almost inevitable, so it really comes down to accepting this as a new challenge and adopting next-generation storage management based on an understanding of what our data is, rather than where it is.

Mass data fragmentation is a problem that existed before the cloud, but fortunately the technology needed to fix it is already available. From an unstructured data perspective, we believe this means embracing a modern approach that spans data silos – backups, archives, file shares, test and development data sets, and object stores – and bridges on-premises, public cloud and edge environments. A platform-based approach can give you visibility into your data wherever it resides and, more importantly, can help you maintain greater control by reducing the number of data copies, managing storage costs, and ensuring your data stays in compliance and is backed up properly. We also think an ideal solution seamlessly blends legacy, file-based storage with the management flexibility and scalability offered by metadata-based object storage. This requires a fundamental shift in the way we’ve addressed unstructured data management in the past, but it’s a change that offers greater data availability and storage-level automation, and provides a new set of options for controlling and protecting business data that is both a major business asset and a potential liability if not handled correctly.
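One way to picture “managing by what the data is rather than where it is” is a small policy check keyed off object metadata. The sketch below uses a made-up, in-memory data model – not any particular platform’s API – to show classification and retention policy being evaluated against an object regardless of the venue it currently lives in:

```python
# Minimal sketch: policy follows the data via metadata rather than location.
# The catalog and policy model here are illustrative, not any vendor's API.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class DataObject:
    key: str
    location: str                              # e.g. "on-prem-nas", "aws-s3", "edge-site-7"
    created: date
    tags: dict = field(default_factory=dict)   # e.g. {"classification": "pii"}

@dataclass
class Policy:
    classification: str
    retention_days: int
    allowed_locations: set

def evaluate(obj: DataObject, policies: dict) -> list:
    """Return policy violations for an object, wherever it happens to live."""
    policy = policies.get(obj.tags.get("classification"))
    if policy is None:
        return ["unclassified data: cannot apply policy"]
    violations = []
    if obj.location not in policy.allowed_locations:
        violations.append(f"{obj.key}: not permitted in {obj.location}")
    if date.today() - obj.created > timedelta(days=policy.retention_days):
        violations.append(f"{obj.key}: past {policy.retention_days}-day retention")
    return violations

if __name__ == "__main__":
    policies = {"pii": Policy("pii", retention_days=365, allowed_locations={"on-prem-nas"})}
    obj = DataObject("hr/offer_letter.docx", "aws-s3", date(2018, 1, 15), {"classification": "pii"})
    for violation in evaluate(obj, policies):
        print(violation)
```

In a real platform this catalog, the classification tags and the policy engine would live inside the storage layer itself – which is exactly the kind of storage-level automation described above.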