Celebrating the 15th Annual Hosting & Cloud Transformation Summit – Presenter Q&A with Carl Lehmann

Interview with Carl Lehmann, Principal Analyst

Automation remains one of the hottest topics in IT. Principal Analyst Carl Lehmann knows this all too well, and will bring his insight to the “Automation Effectiveness – Are you Prepared to Compete for the Future?” session at the Hosting & Cloud Transformation Summit (HCTS), taking place September 24 at 4:20pm PDT in the Orovada Breakout Room 2 at the Aria in Las Vegas.

Q: What will you be discussing in this session?
A: Aggressive digital business strategies and the scarcity of IT expertise on the open market have made the automation of IT processes a high priority. There are several fundamental levels of automation available to enterprise IT organizations. In general, they include infrastructure and cloud services provisioning, continuous deployments as part of DevOps strategy, and IT services management automation. All require new approaches to enable IT toolchain interoperability across what is now a highly distributed and diverse hybrid IT architecture. New approaches to IT automation will be the topic of this moderated discussion panel.

Q: Why is this topic significant?
A: To compete effectively in the modern digital business era, enterprises must seek new competitive advantages. While there are many ways to achieve this, common to nearly all enterprises is the need to improve business and IT process efficiencies, and to automate these improvements so they execute reliably and consistently.

In a recent 451 Research survey, we asked business and IT decision-makers about their current and expected future automation efforts. Of the 881 respondents, 50% stated that automation in their IT environment is mostly manual, with some automated processes; 75% said they expect this to change, anticipating that IT automation will increase over the next 12 months.

Celebrating the 15th Annual Hosting & Cloud Transformation Summit – Presenter Q&A with William Fellows

Interview with William Fellows, Research Vice President

Next up in our Presenter Q&A series: Research Vice President William 'Wif' Fellows, who will lead the “Making Infrastructure Invisible – Simplifying the Jumble of Clouds, Containers and Venues” session at the Hosting & Cloud Transformation Summit (HCTS) on September 24 at 3:40 pm PT, in the Orovada Breakout Room 1 at the Aria in Las Vegas.

Q: What will you be discussing in this session?

A: The emergence of clouds, containers and microservices promises to dramatically simplify the deployment of applications and services in support of digital transformation, which enterprises tell us remains the organizing principle for their IT activity. However, this potential will only be realized if management and orchestration across these hybrid deployments is done effectively and with transparency to assure end users that services will meet their performance standards. We’ll discuss:

IBM acquires Red Hat, but what does that mean?

Key Analyst: Jay Lyman, Principal Analyst, Cloud Native and DevOps

Earlier this week, IBM announced its intent to acquire Red Hat for $33.4 billion, causing quite a buzz across the entire enterprise IT industry. That makes sense: the deal is one of the industry’s biggest software acquisitions and IBM’s largest acquisition to date, with repercussions for key segments including cloud computing, hybrid cloud, automation and DevOps, containers and Kubernetes.

Red Hat’s acquisition feels like the end of an era, as well as the end of the road for the company as an independent entity. This open source software pioneer helped force the enterprise IT industry, including software giants such as Microsoft and VMware, to take open source seriously. IBM acquiring Red Hat marks the beginning of a new age of open source – one populated by established giants and newer open source endeavors such as the dozens of projects that surround and support Kubernetes. That new age is also all about hybrid cloud and cloud-native applications. From a technology M&A standpoint, at $33.4 billion this acquisition marks the largest software transaction we’ve tracked since 2002 – which is impressive and validates the importance and prominence of open source software in the enterprise IT industry.

The deal is also very much about hybrid cloud. According to our Voice of the Enterprise: Cloud Hosting and Managed Services – Budgets and Outlook survey from earlier this year, 58% of respondents said their organization is pursuing a hybrid strategy. Almost half (46%) of respondents said they expect their businesses to increase vendor spending and spread it out across more vendors, either new or existing, as part of their hybrid strategy. IBM and other major public cloud providers like AWS, Microsoft and Google have also broadened their reach in hybrid cloud, but it has been limited mainly to their own clouds integrating with on-premises environments. In IBM’s case, all those public cloud players are already key partners and integration points for Red Hat’s software such as RHEL and OpenShift – expanding IBM’s hybrid cloud reach even further. This deal also follows collaboration and integration that IBM and Red Hat have already accomplished on software such as Linux, OpenStack, and containers and Kubernetes.

Although the deal is beneficial for both companies, there is some risk. Upholding Red Hat’s culture could be a challenge during the transition, despite both companies acknowledging its importance. To maintain Red Hat’s strategy and momentum and take full advantage of it, IBM will need to preserve Red Hat’s globally distributed workforce, with its many remote employees, rather than consolidate it. Allowing Red Hat to operate as an independent unit would prove beneficial for everyone.

The deal is also an interesting consolidation of PaaS, given IBM based its former Bluemix PaaS software – later to become its CaaS – on the open source Cloud Foundry software and Red Hat's OpenShift PaaS is based on different open source code. For a deeper analysis on the deal, current clients can check out our deal analysis in our Research Dashboard.

The Trouble with Cloud “Repatriation”

Key analysts: Liam Eagle - Research Manager, Voice of the Enterprise: Cloud, Hosting & Managed Services, and Melanie Posey - Research Vice President and General Manager, Voice of the Enterprise

We have all known someone who regretted getting a tattoo. A seemingly permanent choice they loved ten years ago is now covered up or removed. But who knows where they will be in another ten years? In a nutshell, this analogy describes the phenomenon many IT analysts call “cloud repatriation” – the shift of workloads from public cloud to local infrastructure environments.

We have also called this phenomenon “cloud repatriation,” though not without some healthy internal debate about the validity of the phrase (yes – it rivaled the infamous “what color is the dress” debate for us). The debate focused on the word “repatriation” and how it suggests a transition to a permanent state of being. That doesn’t accurately describe an enterprise’s relationship with IT infrastructure. Much like our acquaintance full of tattoo regret, the priorities, needs and sometimes the entire business model of an enterprise change, making that new infrastructure less effective or appropriate than it was when first deployed.
[Figure 1: The Trouble with Cloud Repatriation]
The term “repatriation” takes a lot for granted. It assumes a permanent outcome or “permanent residency,” if you will. It assumes an improvement of some kind. And it assumes some failure to deliver on the part of public cloud. But what about the rolling back, for instance, of a failed cloud migration project? Is that repatriation? What if they try again in a few months?

In our Voice of the Enterprise (VoTE) Cloud Transformation, Organizational Dynamics 2017 survey, 34% of respondents said they had moved their workloads from a public cloud to a private environment (cloud or otherwise). Notably, when we asked them to cite their reasons for the move, many of those matched the reasons businesses give for shifting to the public cloud in the first place: performance/availability issues, high cost, latency issues, security and more.
[Figure 2: The Trouble with Cloud Repatriation]
These reasons appeared in our survey again when we asked respondents about their reasons for using multiple infrastructure environments to operate individual workloads. Forty-seven percent (47%) of them said improving performance/availability was one of the main reasons for leveraging multiple infrastructure environments.

What does this mean? At 451 Research, we believe the phenomenon we are all hinting at isn’t a repatriation or a reverse migration, but a cloud evolution. Our data shows more businesses see the value of a hybrid IT strategy. In fact, 58% of VoTE survey respondents said that they are “moving toward a hybrid IT environment that leverages both on-premises systems and off-premises cloud/hosted resources in an integrated fashion.” Hybrid IT offers the opportunity to build a framework for workload portability and mobility to match the ever-changing needs (or tastes, looking back at the tattoo analogy) of an organization. While this strategy isn’t the solution for all IT organizations, it is true that VMs are rarely stagnant, because the best execution venue (our term for “home”) changes based on the available resources and shifting requirements around issues like performance, security and availability.

So, here is our problem: how will we rename cloud repatriation? We have played with many ideas like workload (re)balancing and liquid workloads (no, we are not suggesting you water your workload like a plant – water and electronics still do not mix), but those don’t sit well with us either. Tweet us your ideas for a new term, one that fits the "fluid" nature of the best execution venue for workloads and hybrid IT. 


Too Many Imperatives, Too Little Time

By Bob Winter

As Managing Director for part of 451’s Advisory team, I work with clients across all of the 451 channels. My team in particular is charged with helping marketers answer the most vexing questions their customers are asking them, using a combination of custom research and analyst insights.

Usually about mid-year, it becomes clear to me that there are three or four questions that thread through the dozens of engagements we do every year.

This year those questions, of course, can all be traced back to “The Cloud”.

Specifically, it has become clear that the cloud is entering the “Early Majority” market, where mainstream, conservative businesses are embracing the move to a hybrid cloud model of digital architecture.

Some of the questions this raises are:
-How do we determine what workloads to move to the cloud?
-How do I adjust my development model to adapt to this transition?
-How do I deploy in this blurred environment where Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Containers seem to be converging and offering multiple options to achieve the same result?
-What are the risks – and benefits – of moving my applications?

We address these questions in a Black and White paper we recently produced for Red Hat, entitled:

“THE IMPERATIVE FOR HYBRID IT: CLOUD AND INNOVATION IN THE MODERN ERA” By Donnie Berkholz, Research Director, Development, DevOps and IT Ops, 451 Research

You can download the paper here.

If you have any questions, feel free to contact me: bob.winter@the451group.com, or Donnie Berkholz: donnie.berkholz@451research.com.