
Continuing our guest blog series on all things BPM, independent analyst Sandy Kemsley lends her expertise on the cloud and microservices. In this post, Sandy explores the power and flexibility that the cloud and microservices bring to the process space and offers her view on the advantages companies can gain by using them. We couldn’t agree more, as both the cloud and microservices are important aspects of our process offering. Explore how we’re innovating in this space through our recent Activiti 7 open source project.


My usual research and writing topics focus on business process management (BPM) and systems (BPMS), content management, case management, social enterprise and a number of other related application-level technologies. Sometimes, however, I like to dig down into the technical underpinnings to explain how to deploy applications that are resilient, scalable and available. Today, that means looking briefly at cloud architectures and microservices, then looping back to what that means for BPMSs and other applications.

Cloud Architectures

In the early days of cloud computing, “cloud” was synonymous with multi-tenanted public cloud infrastructure, such as that provided by Google or Amazon. Today, however, it has come to mean other types of virtualized platforms that use on-premises hardware, managed private servers or a combination of public and private. What’s cloud-like about these non-public installations is that they can aggregate physical servers into a pool for use by virtual environments – either full virtual machine (VM) images, or containers that share some of the underlying kernel resources – and manage the dynamic expansion and contraction of resource requirements. They can even manage a deployment that spans public and private hardware as a single environment, either by specifying which VMs/containers should run on private or public hardware based on data sovereignty or external access requirements, or by allowing “cloudbursting” from private to public hosting to manage demand spikes. There are huge savings in training technical administrators and DevOps teams by using the same tools for public, hybrid or private cloud deployments, e.g., deploying applications in Docker containers and using Kubernetes for managing those containers, regardless of whether you’re using OpenStack to create your private cloud infrastructure or Amazon’s AWS for public cloud.

To sum up, you need to move beyond full VM virtualization to containerization in order to reap savings in computing resources, then allow those containers to float on a sea of virtualized infrastructure that can scale on demand, and span private and public hosts.
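
As a small illustration of that portability, here’s a minimal sketch (in Java, with hypothetical environment variable names) of a containerized service that pulls its wiring from the environment, so the same image runs unchanged wherever the orchestrator schedules it, private or public:

    // A minimal sketch of environment-driven configuration, so one container image
    // can run unchanged on private or public infrastructure. The variable names
    // (PROCESS_ENGINE_URL, AUDIT_LOG_URL) are hypothetical.
    public class ServiceConfig {

        // Read a setting from the container's environment, falling back to a default.
        static String fromEnv(String name, String fallback) {
            String value = System.getenv(name);
            return (value == null || value.isBlank()) ? fallback : value;
        }

        public static void main(String[] args) {
            String engineUrl = fromEnv("PROCESS_ENGINE_URL", "http://localhost:8080/engine");
            String auditUrl = fromEnv("AUDIT_LOG_URL", "http://localhost:8081/audit");
            System.out.println("Process engine endpoint: " + engineUrl);
            System.out.println("Audit log endpoint: " + auditUrl);
        }
    }

The deployment environment – a Kubernetes manifest, a Docker run command or a cloud console – supplies the values, not the code.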

Microservices

Service-oriented architecture – where services of one system/application are offered to another using a standard protocol – is not new. The consumption of services has shifted from heavier SOAP interfaces to lighter-weight RESTful web services, but the basic idea is the same: your application wants to perform a function that is offered by another application, so you make a call to that application, pass it some data, and it performs an action or passes back a result.

Microservices are (arguably) just a modern interpretation of service-oriented architecture, but are much more about how a system that offers services is designed: instead of a single monolithic system with multiple service endpoints, it’s a set of independently-deployable services that communicate with each other using technology-agnostic protocols. This allows individual microservices to be scaled as required, and updated to meet new business requirements without redeploying the entire system. Considering how this works with the cloud infrastructure described above, individual microservices can be virtualized in their own container(s), and different microservices from the same logical system can be deployed on private or public cloud infrastructure.
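
To make that concrete, here’s a minimal sketch of a single, independently deployable service using only the JDK’s built-in HTTP server; the service and its /claims/validate endpoint are hypothetical, standing in for one small piece of a larger process application:

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    // A minimal sketch of one independently deployable service. The endpoint and
    // response are hypothetical placeholders for a real business capability.
    public class ClaimValidationService {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

            server.createContext("/claims/validate", exchange -> {
                byte[] body = "{\"valid\": true}".getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().add("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });

            server.start(); // runs in its own process/container, scaled independently
        }
    }

Packaged into its own container, a service like this can be redeployed or scaled without touching the rest of the system.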

The code used to call microservices isn’t much different from calling a monolithic system via its service endpoints; in production, however, the microservices offer greater resilience and scalability, and can be upgraded to newer versions more quickly with less regression testing.
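
For example, the caller below neither knows nor cares whether the endpoint is a standalone microservice or a facade on a monolith; the URL is hypothetical and matches the sketch above:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // A sketch of the calling side: the client code looks the same whether the
    // endpoint is backed by a microservice or a monolith.
    public class ClaimValidationClient {
        public static void main(String[] args) throws Exception {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:8080/claims/validate"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString("{\"claimId\": \"C-1234\"}"))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }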


What Does This Mean For BPM?

Getting back to BPM, think about how cloud infrastructure impacts your BPMS deployment:

  • Cloud architectures require a different BPM architecture. Not all BPM solutions are built for cloud-native architectures: a monolithic BPMS stuffed into a Docker container will not be able to leverage the advantages of modern cloud infrastructures, but must be deployed to the public cloud in an “all or nothing” fashion. Carefully weigh your current and future needs as you evaluate investments.
  • Application portability drives agility. Deploying your BPMS on a cloud infrastructure allows your deployment teams to leverage private, hybrid or public cloud infrastructure to support your business strategies: for example, new requirements for mobile process participants coupled with an ability to quickly take advantage of public cloud infrastructures outside your firewall may give you a competitive advantage. Administration is also streamlined, with BPMS administration separated from the “bare metal” administration issues, and deployment teams using unified tooling for cloud administration, monitoring, and scaling.
  • Old problems, better solutions. BPM deployments are often at the leading edge of an organization’s use of technology, creating complexity for BPMS vendors in scalability, user interactions, distributed automation and security. Cloud services provide options for faster, simpler and more cost-effective solutions, such as replacing traditional multi-tenancy with newer serverless approaches. Many BPM deployments can also benefit from cloud-based intelligent services – machine learning, blockchain, IoT and more – to solve old BPM problems with better solutions.
  • Microservices drive real value. Rather than old-school monolithic BPM, a microservice- and container-enabled BPM architecture reduces the headache of creating infrastructures to address new use cases. DevOps teams can roll out new services quickly with less risk due to service decoupling, and the distributed nature of such architectures seamlessly facilitates public-facing and behind-the-firewall applications.

Use cases for external participants and mobile devices have driven the rise in the number of (public) cloud BPMS offerings, and systems that have traditionally been deployed on premises are scrambling to find ways to offer cloud capabilities without compromising security or data sovereignty.

That’s where microservices come into play. If your BPMS is designed using a microservice architecture, each of the service containers can be deployed independently of the others, either on private or public infrastructure, while still sharing data between services. Furthermore, scaling individual services – such as your process engine or audit logging, two notorious performance troublemakers – can be done without having to scale up all of the other services in step. You may even be able to skip some of the services offered as part of the BPMS, such as single sign-on, in favor of comparable services that you’re already using for other applications.
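
As an illustration – not the exact API of any particular BPMS, so check your product’s documentation – a caller that starts a process instance only talks to the engine’s endpoint; if that engine is its own service, it can be scaled or even swapped behind that URL without the caller changing:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // A sketch of starting a process instance through a BPMS that exposes its
    // engine as a REST service. The endpoint path and JSON payload are illustrative.
    public class StartProcessExample {
        public static void main(String[] args) throws Exception {
            String json = "{\"processDefinitionKey\": \"expense-approval\"}";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://bpm.example.com/runtime/process-instances"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(json))
                    .build();

            // Non-blocking call: the engine service scales independently of the caller.
            HttpClient.newHttpClient()
                    .sendAsync(request, HttpResponse.BodyHandlers.ofString())
                    .thenAccept(r -> System.out.println("Engine responded: " + r.statusCode()))
                    .join();
        }
    }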

These benefits of a microservices design apply to any type of enterprise system, and microservices support should be on your checklist for systems that you’re building or buying. If you’re building applications on a platform such as a microservices-based BPMS, it’s a lot easier to gain these benefits, and to optimize a private, hybrid or public cloud deployment.

Are you currently using or considering a cloud and/or microservices-based infrastructure? Let us know how you’re using this technology to innovate. Join the conversation on Twitter: @skemsley, @alfresco


