Future of Productivity Software Part 2: The Future of Work is WorkOps

Blog post contributed by Chris Marsh, Research Director - Workforce Productivity & Compliance

In the first blog in this three-part series, we questioned the ‘best of breed will win’ narrative and illustrated how the white space within the category opens the door to a new software archetype – the ‘Workforce Intelligence Platform’ (WIP). In part two, we describe the new ‘WorkOps’ behaviors the WIP will underpin, as detailed in our Technology & Business Insight (TBI) report The Future of Productivity Software: New Work Archetypes, WorkOps and the Workforce Intelligence Platform.

Few doubt that the nature of work is being transformed by innovative new technologies, yet there is no agreed way to describe this future of work. Enterprises struggle to conceive of it, and vendors struggle to describe succinctly what they are enabling. At 451 Research, we use the term ‘WorkOps’ to help describe this future. In an obvious parallel with DevOps, WorkOps aims to unify business goals, work design and work execution – by having intelligence, workflow automation, collaboration and reporting flexibly tessellate across the work lifecycle for more rapid and responsive execution. WorkOps manifests in the local agility of the team, or ‘TeamOps’, which is in turn fueled by ‘SoloOps’ – the productivity individual members of the workforce gain from the emergent capabilities at their disposal.

Several trends we outline more fully in the TBI report underpin the emergence of WorkOps, TeamOps and SoloOps. Application estates will grow, but more work will execute across these apps than within any single one of them. Improved search, intelligence, automation and connectivity bolster that transversal work. As a result, collaboration becomes more purposeful and the real-time modeling of work becomes more achievable. Highly flexible resource management allows work to pass out of formal reporting hierarchies into looser forms of self-organization.

We describe how WorkOps is conceived less formally than project management but is more open to change and adaptation, as it tools the decentralization of work into teams where everyone essentially becomes a ‘project manager.’ WorkOps is an agile and lean method, yet with a broader focus across the spectrum of work scenarios. In the Future of Productivity TBI report, we also provide:
  • A ‘hierarchy of employee motivation’ and ‘four pillars of employee engagement’ which describe how technology needs to support SoloOps and TeamOps in executing improved business outcomes.
  • A cross-plot illustration of how WorkOps relates to other work styles.
  • A graphical representation of what we mean by ‘the liquid enterprise’ – the basis for enterprises’ future digital competitiveness.
In the final blog in this series, we will focus on how, with these changes, the common product value archetypes – managing work, collaborating around it, accessing applications, creating assets, reporting on work – are becoming less coherent as ways to explain modern work, and how the emerging archetypes will require vendors to design their messaging around them.

Future of Productivity Software Part 1: The Tortoise and the Hare

Blog post contributed by Chris Marsh, Research Director - Workforce Productivity & Compliance

As the tale goes, the hare races off, leaving the tortoise far behind, only to slow down and allow the tortoise to triumph. The fable reminds me of the productivity software category. Growing innovation across collaboration, content and work management, workspaces, asset creation tools and other segments has made a once dull category exciting again. It has heralded rising stars poised to become the new incumbents, and some have even pushed a ‘best of breed will determine the future of IT’ narrative as part of the excitement. That narrative typically states that a few winners across the product categories will emerge and play nicely with each other to serve future enterprise needs. We think this is premature:

  • With enterprises scrambling to give effect to their digital transformations, ‘best of breed’ has in fact been the path of least resistance – a better way of doing something everyone already understands. In this instance, a better way to manage content, collaborate or provide access to applications.
  • For these vendors, the opportunity to disrupt within their own segment to drive market share has downplayed the need to disrupt beyond it to position for a future of work that will look very different from the way it does now.
  • These vendors risk being the hare in the fable by not understanding the tortoise. The “tortoise” isn’t the legacy players, but rather the vendors tooling the proverbial category “white space” that best-of-breed vendors have left wide open.
Existing product segments largely reflect the symbiosis of legacy tools, ways of organizing, and existing buyers and buying rationales – all of which must now be recast given the imperative to digitally transform. Productivity software will, as a result, have more opportunities to evolve from intersections across the category’s segments, catalyzed by new technologies such as AI and machine learning, than it will from innovations within each segment alone. Significant white space is defined by combining reach across the spectrum of work scenarios with the ability to manage those scenarios with sophistication, whereas most tooling has historically traded off one for the other. We at 451 Research believe this white space will be inhabited by a new software archetype we term the ‘Workforce Intelligence Platform’ (WIP).

Catalyzed by intelligence, hybrid integration, decentralized workflow automation and, likely, growing attention from the mega-caps, the WIP will have a strong gravitational pull on the entire category. This doesn’t have to imply the pendulum swings back to all-in-one solutions; the way we currently conceptualize the WIP entertains several enablers of what we term transversal WorkOps, the subject of the second blog in this series. The inference, though, is that where we are now – legacy vendors disrupted by better alternatives within the same segment – just marks the beginning of even more innovation yet to come.

In “The Future of Productivity Software” Technology & Business Insight (TBI) report, we:

  • Highlight the chronic challenges enterprises face with their workforce productivity, which the SaaS and best of breed explosion hasn’t done much to alleviate.
  • Provide a schema showing how value across the tooling pyramid is inverting.
  • Propose a technical architecture for the WIP, map the positions of each productivity software segment relative to the white space it inhabits and outline in detail the directional travel of each towards or away from that space.
We will continue to discuss the future of productivity software and the corresponding TBI report in this three-part series. In the next blog, we will outline what we call “new WorkOps behaviors” that will support the emergence of what we term the ‘liquid enterprise’ – the new digital-native businesses that will define future business competitiveness.

The Trouble with Cloud “Repatriation”

Key analysts: Liam Eagle - Research Manager, Voice of the Enterprise: Cloud, Hosting & Managed Services, and Melanie Posey - Research Vice President and General Manager, Voice of the Enterprise

We have all known someone who regretted getting a tattoo. A seemingly permanent choice they loved ten years ago is now covered up or removed – and who knows where they will be in another ten years. In a nutshell, this analogy describes the phenomenon many IT analysts call “cloud repatriation” – the shift of workloads from public cloud to local infrastructure environments.

We have also called this phenomenon “cloud repatriation,” though not without some healthy internal debate about the validity of the phrase (yes – it rivaled the infamous “what color is the dress” debate for us). The debate focused on the word “repatriation” and how it suggests a transition to a permanent state of being. That doesn’t accurately describe an enterprise’s relationship with IT infrastructure. Much as with our acquaintance full of tattoo regret, the priorities, needs and sometimes the entire business model of an enterprise change over time, making that new infrastructure less effective or appropriate than it was when first deployed.
[Figure 1: The Trouble with Cloud Repatriation]
The term “repatriation” takes a lot for granted. It assumes a permanent outcome or “permanent residency,” if you will. It assumes an improvement of some kind. And it assumes some failure to deliver on the part of public cloud. But what about the rolling back, for instance, of a failed cloud migration project? Is that repatriation? What if they try again in a few months?

In our Voice of the Enterprise (VotE) Cloud Transformation, Organizational Dynamics 2017 survey, 34% of respondents said they had moved workloads from a public cloud to a private environment (cloud or otherwise). Notably, when we asked them to cite their reasons for the move, many matched the reasons businesses give for shifting to the public cloud in the first place: performance/availability issues, high cost, latency, security and more.
[Figure 2: The Trouble with Cloud Repatriation]
These reasons appeared in our survey again when we asked respondents why they use multiple infrastructure environments to operate individual workloads. Forty-seven percent of them said improving performance/availability was one of the main reasons for leveraging multiple infrastructure environments.

What does this mean? We at 451 Research believe the phenomenon we are all hinting at isn’t a repatriation or a reverse migration, but a cloud evolution. Our data shows that more businesses see the value of a hybrid IT strategy. In fact, 58% of VotE survey respondents said they are “moving toward a hybrid IT environment that leverages both on-premises systems and off-premises cloud/hosted resources in an integrated fashion.” Hybrid IT offers the opportunity to build a framework for workload portability and mobility to match the ever-changing needs (or tastes, looking back at the tattoo analogy) of an organization. While this strategy isn’t the solution for every IT organization, it is true that VMs are rarely static, because the best execution venue (our term for “home”) changes based on the available resources and shifting requirements around issues like performance, security and availability.
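To make the “best execution venue” idea more concrete, here is a minimal, purely illustrative sketch in Python – the venue names, attributes and weights are hypothetical, not a 451 Research model – of how a workload’s shifting requirements could be scored against candidate venues:

```python
# Hypothetical sketch: scoring candidate execution venues for a workload.
# Venue names, attribute ratings and weights are illustrative only.

WORKLOAD_REQUIREMENTS = {
    # relative importance of each attribute for this workload (sums to 1.0)
    "performance": 0.4,
    "security": 0.3,
    "cost_efficiency": 0.2,
    "latency": 0.1,
}

VENUES = {
    # 1-5 ratings an IT team might assign to each venue for this workload
    "public_cloud":  {"performance": 4, "security": 3, "cost_efficiency": 3, "latency": 3},
    "private_cloud": {"performance": 3, "security": 5, "cost_efficiency": 2, "latency": 4},
    "on_prem":       {"performance": 3, "security": 4, "cost_efficiency": 2, "latency": 5},
}


def best_execution_venue(requirements, venues):
    """Return the venue with the highest weighted score for this workload."""
    def score(ratings):
        return sum(weight * ratings[attr] for attr, weight in requirements.items())
    return max(venues, key=lambda name: score(venues[name]))


if __name__ == "__main__":
    print(best_execution_venue(WORKLOAD_REQUIREMENTS, VENUES))
    # Shift the weights (e.g., performance up to 0.7) and the answer can change -
    # which is the point: the "best" venue moves as requirements move.
```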

So, here is our problem: how will we rename cloud repatriation? We have played with many ideas like workload (re)balancing and liquid workloads (no, we are not suggesting you water your workload like a plant – water and electronics still do not mix), but those don’t sit well with us either. Tweet us your ideas for a new term, one that fits the "fluid" nature of the best execution venue for workloads and hybrid IT. 


Creating Value with an Enterprise Data Bazaar

Lead researcher: Katy Ring, Research Director – IT Services

At 451 Research, we believe ‘the enterprise data bazaar’ can help organizations that aim to become more agile by using data to inform the direction and development of their businesses. The phrase ‘enterprise data bazaar’ describes an environment where many people across an organization can access and leverage data to build data-driven products.

To achieve this, businesses need unified data management layers, so that data scientists and subject matter experts can decide how to deal with the stored data. These layers enable datasets – or a data lake – to provide value without siloing information within the organization. However, many organizations have ended up with what could be described as a ‘data swamp’ – a single environment housing large volumes of raw data that cannot be easily accessed for any purpose, let alone multiple uses. Creating a data bazaar with these management layers would break apart the swamp: security sits at the foundation of the approach, with data governance and self-service data preparation functionality built out on top.

When we speak to clients that have data lakes, many are struck by the realization that they did not fully comprehend the risks associated with what they have built. Companies struggle to audit their lakes as part of compliance measures, since each source system has different governance and security policies. This struggle is caused by the self-service nature of a data lake, where data can be accessed for nearly any purpose, making it unclear whether a company has protected PII data as required by regulations like GDPR.

When companies find themselves in this scenario, vendors and service providers, working alongside an internal chief data officer (CDO), can help get the business back on track. Together, this group can work out a remedy for the situation. One solution is to build a “sandbox” environment that includes company-wide policy, controls and metadata management, plus a ‘citizen’ data integrator tool that lets users feed back or develop analytics on how they are using the data. With this type of tool, users can still access data in a self-service way, while that access is overseen by the IT group or CDO before anything moves to production as a data product.
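As a minimal, hypothetical sketch – not any particular vendor’s product or API; the catalog, tags and functions below are invented for illustration – the governance behind such a sandbox might pair metadata-driven PII masking on self-service reads with an explicit approval step before a derived dataset becomes a production data product:

```python
# Hypothetical sketch of a governed self-service sandbox:
# dataset metadata drives PII masking on read, and a dataset needs an
# explicit IT/CDO approval before it is promoted to production.

CATALOG = {
    "customers_raw": {
        "columns": {"customer_id": [], "email": ["pii"], "country": []},
        "approved_for_production": False,
    },
}


def read_for_sandbox(dataset, rows):
    """Return rows with PII-tagged columns masked for sandbox users."""
    pii_cols = {c for c, tags in CATALOG[dataset]["columns"].items() if "pii" in tags}
    return [{c: ("***" if c in pii_cols else v) for c, v in row.items()} for row in rows]


def promote_to_production(dataset, approved_by=None):
    """Only an explicit IT/CDO approval flips a dataset into a production data product."""
    if not approved_by:
        raise PermissionError(f"{dataset}: promotion requires IT/CDO approval")
    CATALOG[dataset]["approved_for_production"] = True


if __name__ == "__main__":
    rows = [{"customer_id": 1, "email": "a@example.com", "country": "UK"}]
    print(read_for_sandbox("customers_raw", rows))  # email comes back masked
    promote_to_production("customers_raw", approved_by="cdo@company.example")
```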

In addition to this self-service ‘sandbox’ data preparation layer, IT service providers can help companies with data governance and the data supply chain. Such providers assist in sourcing, managing and enriching the data, and sell managed services for policing data consumption. For example, in an audit, organizations need to know what data they hold, who uses it and for what. This requirement provides a strong opportunity for developing the enterprise data bazaar.

Furthermore, the self-service analytics and governance layers need to be architected the right way to enable a range of use cases over time, and this is often not what results from the development of a single-use-case project. This is why the CDO role is so important: this individual is the internal champion with the authority to get agreement on a company-wide strategy for the capture, management and sharing of data.

Katy Ring, research director of IT services at 451 Research, examines the benefits of the enterprise data bazaar and the technologies, service providers and strategies used to enable it in her Technology & Business Insight report on the Enterprise Data Bazaar. Learn more about this report.

Congratulations Ian Hughes: Doctor IoT!

Last week, 451 Research’s senior analyst for the Internet of Things, Ian Hughes, was awarded an honorary degree of Doctor of Technology by Solent University.

In many ways, such an honor was inevitable for Ian, at least in our not-so-humble opinion. With 30 years of experience in the industry, Ian served as an emerging technologies developer, software engineer, cross-industry app developer and video game programmer before joining the 451 Research family. He has numerous patents to his name, including ones covering virtual sporting event simulations, activity tracking and location-based services.

Most notably, Ian became an entrepreneur in 2009 when he started his own company, Feeding Edge. During this time, he regularly appeared as “Super Geek” on the UK children’s program “The Cool Stuff Collective,” built virtual multi-user training hospitals and wrote a regular technology column for a fashion magazine. If you follow Ian on social media, you know he also published two sci-fi novels under his online name, Epredator.

Ian has been a familiar face at Solent University for years, speaking to students and faculty about augmented and virtual reality for business and industrial use, as well as the gaming industry – his first love. He also chairs the BCS Animation and Games specialist group at the university and is a STEMnet ambassador. We couldn’t be more proud of our “Dr. IoT” and hope you join us in congratulating him!

Ian’s most recent work at 451 Research includes a Technology & Business Insight report entitled “Exploring Industrial Internet of Things Adoption Rates and Maturity Across Industry Types,” a webinar on the same topic, which can be accessed on our blog, and numerous Market Insight reports tackling many areas of the IoT space. Be sure to also check out his interview with Solent University.