Future of Productivity Software Part 2: The Future of Work is WorkOps

Blog post contributed by Chris Marsh, Research Director - Workforce Productivity & Compliance

In the first blog in this three-part series, we described how the report questions the ‘best of breed will win’ narrative and illustrated how the white space within the category opens the door to a new software archetype – the ‘Workforce Intelligence Platform’ (WIP). In part 2, we describe the new ‘WorkOps’ behaviors the WIP will underpin, as detailed in our Technology & Business Insight (TBI) report, The Future of Productivity Software: New Work Archetypes, WorkOps and the Workforce Intelligence Platform.

Few doubt that the nature of work is being transformed by innovative new technologies, yet there is no agreed way to describe this future of work. Enterprises struggle to conceive of it, and vendors struggle to succinctly describe what they are enabling. At 451 Research, we use the term ‘WorkOps’ to describe this future. In an obvious parallel with DevOps, WorkOps aims to unify business goals, work design and work execution – by having intelligence, workflow automation, collaboration and reporting flexibly tessellate across the work lifecycle for more rapid and responsive execution. WorkOps manifests in the local agility of the team, or ‘TeamOps’, which is in turn fueled by ‘SoloOps’ – the productivity individual members of the workforce gain from the emerging capabilities at their disposal.

Several trends we outline more fully in the TBI report underpin the emergence of WorkOps, TeamOps and SoloOps. Application estates will grow, but more work will be executed across these apps than within them. Improved search, intelligence, automation and connectivity bolster that transversal work. As a result, collaboration becomes more purposeful and the real-time modeling of work becomes more achievable. Highly flexible resource management allows work to pass out of formal reporting hierarchies into looser forms of self-organization.

We describe how WorkOps is conceived less formally than project management but is more open to change and adaptation, as it tools the decentralization of work into teams where everyone essentially becomes a ‘project manager.’ WorkOps is an agile and lean method, yet one with a broader focus across the spectrum of work scenarios. In the Future of Productivity TBI report, we also provide:
  • A ‘hierarchy of employee motivation’ and ‘four pillars of employee engagement’, which describe how technology needs to support SoloOps and TeamOps in executing improved business outcomes.
  • A cross-plot illustration of how WorkOps relates to other work styles.
  • A graphical representation of what we mean by ‘the liquid enterprise’, the basis for enterprises’ future digital competitiveness.
In the final blog in this series, we will focus on how, with these changes, the common product value archetypes – such as managing work, collaborating around it, accessing applications, creating assets and reporting on work – are becoming less coherent as ways to explain modern work. The newly emerging archetypes will require vendors to design their messaging around them.

Prepping for HCTS – Q&A with Research Vice President Andy Lawrence

Our next HCTS 2018 speaker is very familiar with the Summit’s main stage: Research Vice President Andy Lawrence. Andy specializes in datacenter efficiency, market evolution and automation, and serves as Executive Director of Research at our sister company, Uptime Institute.

Q: What did you discuss last year? 

A: For the past four years, I have presented a keynote and led a discussion at HCTS focused on datacenter evolution as demands – including IT, engineering and market demands – change. Because the headline theme has not changed much, at times I have worried the audience would find the subject of limited relevance or repetitive, but none of these discussions or keynote presentations has come across as remotely similar to the one the year before. In fact, these sessions ended up being lively, informative and sometimes surprising each year. The one constant in these discussions is our belief that datacenter designers and operators must be more responsive to the demands of their customers, because the way datacenters are designed and run is becoming ever more bundled with the services they support. This came across clearly at HCTS 2017, as datacenter operators discussed how they are moving to lower-cost, easily replicated, large-scale granular designs to meet the needs of cloud service providers.

Q: What was your biggest take away from last year’s HCTS?

A: One of the points we made last year is that there is strong pressure on suppliers to keep prices low to encourage business, but service providers and enterprises also want the supplier’s datacenter to meet many other demands, including a high level of resiliency and agility, great connectivity, reliable analytics and energy efficiency. This year, as last year, demand for datacenter capacity is strong, very cloudy, and coming in large chunks – for which service providers do not want to pay too much. It is always intriguing to see if and how these competing demands are met.

Q: What will you be discussing in this session?

A: Recently, an entirely new category (though we at 451 Research have talked about it for the past few years) has begun to emerge: edge datacenters. Edge is raising all kinds of questions: Is it all just hype? Who will own them? What will they look like? How will resiliency be achieved? Where will they be located, and why?

Edge is often described as “paradigm-shifting” and “revolutionary,” but it is also dismissed as just a new iteration of an existing technology or market. In my session, I will argue that, to a degree, both descriptions are correct: Edge computing means local IT connected to a core – a model we have had in offices, factories, retail outlets, labs and telco facilities since the 1980s. At the same time, when you think of Edge as an extension of an advanced cloud infrastructure fabric – adding local compute and storage, supporting automation and self-optimization, and aggregating, policing and managing traffic – its new importance is obvious.

Q: Why should HCTS attendees find this session valuable/what can they hope to gain?

A: While some trends are hard to predict, the drivers for Edge computing are so strong that the only disagreement is not about whether the trend will kick in, but about when it will and how big the market and build-out will be. As discussed in 451 Research’s 2017 report, “Datacenters at the Edge,” 5G, CDNs, video, IoT, Edge analytics, augmented reality and driverless cars are just some of the drivers, but there are many others. This trend is not a swing of the pendulum from core to edge, but the beginning of a new and complementary build-out that has a lot of suppliers, operators and investors pretty excited.

Forward-looking analysis by 451 Research’s IoT team and Uptime Institute Research’s infrastructure analysts, along with in-depth surveys, all points to a big build-out of edge capacity. In Uptime Institute’s recent annual global datacenter operators’ survey, 40% of respondents said their organization will require edge computing, while another 30% were not sure. Who will own and operate what in the future is also an intriguing discussion: 37% of the Uptime survey respondents said they will use a mix of their own and colocation datacenters to manage and host their infrastructure. Does this suggest that while the enterprise datacenter footprint may shrink in the middle and core layers, it will grow at the edge? I will try to add context and answer these questions at HCTS.

Q: Why are you excited to attend this year’s HCTS?

A: Elsewhere at HCTS, the topic that most catches my eye is “building partnerships with hyperscalers.” While cloud is undoubtedly the architecture of the future, I still want to hear more about how hyperscalers will address concerns about governance, transparency, resiliency, lock-in and openness, and even the role of large monopoly players in the IT ecosystem and in large economies. For me, discussions of cost, APIs, functions and services are interesting, but the enterprise managers I talk to are also concerned about strategic risk. The most successful service providers and big cloud providers will need to score well on all these fronts.

We are excited to welcome Andy back to the main stage at HCTS, which will be held at the Bellagio in Las Vegas, September 24-26. Register for HCTS 2018 to hear from all the speakers featured so far in our ongoing series, including Melanie Posey, whose Q&A we shared previously.

Research Team News: New Roles for Seven Key 451 Research Analysts

Our global team of analysts works diligently to provide critical and timely insight into disruptive, innovative technologies and the companies building them out. We appreciate this hard work and are proud when our analysts get to reap its rewards. With that in mind, we are excited to share that seven key analysts have recently moved into new roles at 451 Research.

Sheryl Kingstone, Matt Aslett and Christian Renaud have been promoted to Research Vice President. 

Together with Nick Patience, Matt will lead the expanded Data, Analytics and AI team. Matt has been with 451 Research for nearly 11 years, primarily focusing on data management, data catalogs, business intelligence and analytics, and data science management, but recently he has been spending more of his time on AI. He is widely recognized in the industry, speaking at several industry and client events every year, and was named among the top 200 Thought Leaders in the field of Big Data and Analytics by AnalyticsWeek.

Sheryl will continue to lead the Customer Experience & Commerce research, and she has also assumed the role of General Manager of Voice of the Connected User Landscape research. Sheryl is known for her expertise in customer experience software markets spanning ad tech, marketing, sales, commerce and service. On top of her regular research and speaking engagements, Sheryl contributes to Smart Customer Service. As the General Manager for Voice of the Connected User Landscape, she leverages her years of survey research experience to drive a significant part of our Customer Insight research.

Christian will lead the 451 Research Internet of Things (IoT) research team. Christian has spearheaded the development of our IoT coverage across all our research products: Market Insight, Technology & Business Insight, Voice of the Enterprise, Voice of the Connected User Landscape and Market Monitor. Examples include the Economics of IoT Technology & Business Insight report, which he worked on with Owen Rogers, and a Market Insight report about the FMCSA Electronic Logging Device (ELD) mandate. He frequently speaks at various events about emerging technology, entrepreneurship and trends. 

Dan Thompson has been promoted to Research Director – Multi-tenant Datacenters. In this new role, Dan will lead the North American MTDC team. Dan has covered multi-tenant datacenter landscapes in Japan, Indonesia, Singapore and Malaysia, as well as parts of the US. He is particularly focused on multi-tenant datacenters that are attempting to move up the stack to offer additional services beyond colocation and connectivity, including disaster recovery, security and cloud. Not only has Dan proven himself as a great analyst, but he has also been an important mentor within the organization. 

Newly promoted to Research Director, Jordan McKee leads our Customer Experience & Commerce coverage. Jordan is well known for his research focused on digital transformation across the commerce value chain, emphasizing major trends impacting retail, payment networks, payment processors and point-of-sale providers. Jordan regularly speaks at industry events and engages with the media, and he was listed on the Electronic Transactions Association's Forty Under 40 list for 2018.

Pat Daly and Teddy Miller have been promoted to Analyst. Pat has been a key member of both the Information Security and IoT teams. An alumnus of our first Research Associate class, he progressed rapidly through the development model and has established himself as the go-to analyst for insights on IoT security, including the protection of critical infrastructure, transportation and medical devices. He recently contributed to the Market Insight report covering the DUO Security acquisition.

Starting as an intern at 451 Research, Teddy has become a key member of the Multi-tenant Datacenter team with his coverage of the Chinese datacenter market. Teddy has made it possible for us to expand our coverage of this market and maintain our granular understanding of it as seen in his Market Map reports.  

Congratulations Matt, Sheryl, Christian, Dan, Jordan, Pat and Teddy! And thank you to all our analysts for your hard work. If you would like to explore their available research further, either log in to the Research Dashboard or apply for trial access.

Future of Productivity Software Part 1: The Tortoise and the Hare

Blog post contributed by Chris Marsh, Research Director - Workforce Productivity & Compliance

As the tale goes, the hare races off, leaving the tortoise far behind, only to slow down and allow the tortoise to triumph. The fable reminds me of the productivity software category. Growing innovation across collaboration, content and work management, workspaces, asset creation tools and other segments has made a once-dull category exciting again. It has heralded rising stars set to become the new incumbents, and some have even pushed the ‘best of breed will determine the future of IT’ narrative as part of the excitement. The narrative typically states that a few winners emerge across the product categories and play nicely with each other to serve future enterprise needs. We think this is premature:

  • With enterprises scrambling to give effect to their digital transformations, ‘best of breed’ has in fact been the path of least resistance – a better way of doing something everyone already understands; in this instance, a better way to manage content, collaborate or provide access to applications.
  • For these vendors, the opportunity to disrupt within their segment to drive market share has downplayed the need to disrupt beyond it to position for a future of work that will look very different from the way it does now.
  • These vendors risk being the hare in the fable by not understanding the tortoise. The “tortoise” isn’t the legacy players, but rather the vendors tooling the proverbial category “white space” that best-of-breed vendors have left wide open.
Existing product segments largely reflect the symbiosis of legacy tools, ways of organizing, and existing buyers and buying rationales – all of which must now be recast given the imperative to digitally transform. Productivity software will, as a result, have more opportunities to evolve from intersections across the category’s segments, catalyzed by new technologies such as AI and machine learning, than it will from innovations within each segment alone. Significant white space is defined by bringing together reach across the spectrum of work scenarios with the ability to manage them with sophistication, whereas most tooling has historically traded off one for the other. We at 451 Research believe this white space will be inhabited by a new software archetype we term the ‘Workforce Intelligence Platform’ (WIP).

Catalyzed by intelligence, hybrid integration, decentralized workflow automation and, likely, growing attention from the mega caps, the WIP will exert a strong gravitational pull on the entire category. This doesn’t have to imply a pendulum swing to all-in-one solutions; the way we currently conceptualize the WIP entertains several enablers of what we term transversal WorkOps, the subject of the second blog in this series. The inference, though, is that where we are now – legacy vendors disrupted by better alternatives within the same segment – just marks the beginning of even more innovation to come.

In the “The Future of Productivity Software” Technology & Business Insight (TBI) report, we:

  • Highlight the chronic challenges enterprises face with their workforce productivity, which the SaaS and best of breed explosion hasn’t done much to alleviate.
  • Provide a schema showing how value across the tooling pyramid is inverting.
  • Propose a technical architecture for the WIP, map the positions of each productivity software segment relative to the white space it inhabits and outline in detail the directional travel of each towards or away from that space.
We will continue to discuss the future of productivity software and the corresponding TBI report in this three-part series. In the next blog, we will outline what we call “new WorkOps behaviors” that will support the emergence of what we term the ‘liquid enterprise’ – new digital-native businesses that will define future business competitiveness.

Webinar - The Age of Consumption: HCTS 2018 Q&A

451 Research’s 14th annual Hosting & Cloud Transformation Summit (HCTS) in Las Vegas from September 24-26 is coming up fast! HCTS welcomes executives in the hosting, cloud, datacenter and managed services sectors to hear timely, actionable insight into the competitive dynamics of innovation from 451 Research analyst talent and guest speakers including AWS, Microsoft, Huawei, ING, DH Capital and more. In this webinar, we will discuss what to expect from this year’s Summit, revisit some highlights from previous years, and preview upcoming sessions with three 451 Research Vice Presidents who will be taking the main stage at HCTS: Kelly Morgan, Melanie Posey and Al Sadowski. Join this Q&A webinar to explore:
  • What is the Age of Consumption and what should we expect to see from it?
  • Why should organizations have a strong business-IT alignment when moving forward with a Digital Transformation strategy?
  • What, if any, key shifts are happening in the multi-tenant datacenter space as a result of the age of consumption?
  • Why are hyperscalers both a friend AND foe to other service providers?
The webinar will be held August 21, 2018, at 2:00 pm ET. Register to attend below. 
 