Thursday, April 20, 2017

bpmNEXT 2017 (part5)

Last half day of bpmNEXT is starting. Today I will be up as the second presenter of the day, doing my demo on our case modeler and continuous task optimization.

Making business processes dance to the user's tune
Paul Holmes-Higgin - Flowable Project

Paul introduced Flowable, a recent fork of Activiti (where most of that engineering team followed), and how it can be used for case management due to its dynamic modification capabilities that are coming in v6. In the example he showed, he dynamically added a new (not predefined) task or even a complete new process into a running instance. 

Supporting unstructured work
Kris Verlaenen - Red Hat

I presented on some of the challenges our customers are seeing in the context of unstructured work.  Firstly, modeling unstructured work confronts us with the limitations of existing standards like BPMN2, so we are presenting a higher-level model where the work is modeled as a number of stages, each containing any number of activities.  It is however a visualization layer, and developers can still rely on the capabilities of the underlying specification for the execution semantics.  Secondly, since unstructured work is typically unplanned, we are using continuous optimization (using OptaPlanner) for task assignment, so we can help users work on the most important tasks first, taking into account users' constraints and preferences.
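To make the task-assignment idea concrete, here is a deliberately simplified sketch of priority-aware assignment.  The data model (task priority, required skill, user skills) is hypothetical, and the real solution uses OptaPlanner's metaheuristic solver rather than this greedy loop - this only illustrates the kind of constraints being balanced.

```python
def assign_tasks(tasks, users):
    """Assign each task to the eligible user with the lowest current load,
    handling the highest-priority tasks first (greedy sketch, not OptaPlanner)."""
    load = {u["name"]: 0 for u in users}
    assignments = {}
    # Highest priority first, so important work is placed before queues fill up.
    for task in sorted(tasks, key=lambda t: -t["priority"]):
        eligible = [u for u in users if task["skill"] in u["skills"]]
        if not eligible:
            continue  # no user has the required skill
        best = min(eligible, key=lambda u: load[u["name"]])
        assignments[task["id"]] = best["name"]
        load[best["name"]] += task["duration"]
    return assignments

tasks = [
    {"id": "t1", "priority": 5, "skill": "claims", "duration": 2},
    {"id": "t2", "priority": 9, "skill": "claims", "duration": 1},
    {"id": "t3", "priority": 1, "skill": "legal", "duration": 3},
]
users = [
    {"name": "alice", "skills": {"claims"}},
    {"name": "bob", "skills": {"claims", "legal"}},
]
result = assign_tasks(tasks, users)
print(result)
```

The "continuous" part of the real solution means re-solving as new unplanned tasks arrive, rather than running this once.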

As an intermezzo, Lloyd presented on the Business Architecture Meta-Model, where they are trying to link many of the concepts that are used at different levels, connecting Business Architecture to BPM concepts, as we all agree that terms like activity or process can be very ambiguous.  It's an OMG RFP, and Lloyd invited everyone who might be interested to take a look.

Digital strategy deployment using business capabilities
Denis Gagne - Trisotech

Denis presented the Trisotech Digital Enterprise Suite.  They are combining models and concepts at different levels (from strategies and capabilities all the way down to BPMN, DMN and CMMN models, etc.) and linking them so you can trace the relationships between all of these.  In the demo he showed an example where you could see how capabilities were implemented using processes (BPMN2) and decision logic (DMN).  Their landscaping tool (as they call it) can be used for brainstorming and collecting ideas, ranging from free-flow input to approaches based on existing models.  These ad-hoc models (the result of the brainstorm) can later be linked to future models all the way down to the implementation.  Because models are connected, they can be used for executive overviews (for example related to maturity, performance, technical debt, etc.).  All of this information goes into one big digital enterprise graph.

With that, presentations are done.  Time to wrap up and start preparing for next year !

bpmNEXT 2017 (part 4)

Through a lens Starkly: transforming data into business information
E. Scott Menter - BPLogix

BPLogix is virtualizing data (possibly coming from many different data sources) and aggregating it to give business users visibility into what is going on.  The data flow analyzer allows you to drill down into your data; you can for example see how a chart combines different queries to visualize key indicators (without knowing the technical details of the data).  Their tool allows users to define processes as a combination of activities with eligibility criteria (as they presented last year).  Combined with the data analyzer, this allows you to clearly visualize what happened and based on what information (and to drill down further into where that data came from, etc.).

Process modeling and metrics: the next generation
Max Young - Capital Labs

Capital Labs is adding a third dimension to processes.  When performing simulation, it is critical that it matches reality.  Their tool, called BPM Scout, allows you to import processes from different tools and then perform visual simulations on top of those.  It supports multiple (3D) layers, where a process on the top layer can for example trigger rules on a different layer, etc.  Users can define their own advanced KPIs so you can show the value the customer can expect.  It also interacts with management tools and allows you to generate full documentation (including simulation information) for your process.  Finally, it allows you to export all your processes to, for example, IBM Blueworks (even with an application generated for you to start the process).

BPM with humans in the age of digital transformation
Francois Bonnet - ITESOFT W4

Francois showed their tool that assists humans in their job.  At process definition time, it can not only validate your BPMN processes, but also suggest to the user what the solution might be (for example adding missing elements).  The process can also be simulated step by step.  Simulation can even compare the execution of a specific instance to the plan (as a Gantt chart), showing whether you are ahead of or behind schedule.  Simulation supports concepts like timers, signal events, etc.

Taking BPMN to infinity and beyond
Jakob Freund - Camunda

Camunda is presenting their next-generation "Big Workflow" engine.  They have written a new version of their workflow engine with the idea of being able to scale it infinitely.  It uses publish-subscribe at its core: rather than storing the state of an instance as a row in a traditional database, they generate and store events instead of updating that one row.  The events are written to a log file on the file system, and in a distributed setting the log is replicated.  This allows them to scale to 100x the throughput of their traditional engine.
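The event-sourcing idea behind this can be sketched in a few lines.  This is illustrative only (not Camunda's actual implementation, and the event names are made up): every state change is appended to a log, and the current state of an instance is derived by replaying its events.

```python
log = []  # append-only event log (a file or replicated stream in practice)

def emit(instance_id, event_type, data=None):
    """Record a state change as an immutable event instead of updating a row."""
    log.append({"instance": instance_id, "type": event_type, "data": data})

def current_state(instance_id):
    """Replay the log to reconstruct the instance's current activity."""
    state = None
    for event in log:
        if event["instance"] != instance_id:
            continue
        if event["type"] == "INSTANCE_STARTED":
            state = "started"
        elif event["type"] == "ACTIVITY_ENTERED":
            state = event["data"]
        elif event["type"] == "INSTANCE_COMPLETED":
            state = "completed"
    return state

emit("order-1", "INSTANCE_STARTED")
emit("order-1", "ACTIVITY_ENTERED", "approve-order")
print(current_state("order-1"))  # prints approve-order
```

Because appends are sequential and the log can be partitioned and replicated, this design scales horizontally in a way row-level updates in a shared database do not.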

Getting to know your users with Brazos CX insights
Scott Francis and Ivan Kornienko - BP3 Global

BP3 is presenting Brazos CX Insights to help developers improve the usability of their applications for end users.  The tool continuously measures your application (load time, how the application is used, form validation errors, etc.).  Analysis shows you how much time users spend in different parts of your application, how much each area of your application is used (or not used), what the common validation errors are, etc.  If necessary you can drill down even further to get more detailed information, all the way down to a timeline visualization of an individual session, or get average data for a large number of sessions.  Obviously all this information should be used to improve the usability of your application.

Cognitive customer service
Pramod Sachdeva - Princeton Blue

Princeton Blue is presenting their cognitive customer service solution.  It monitors customer interactions (from different possible sources like calls, emails, etc.).  Rules can be used to proactively monitor this data and create escalations, which can then be handled appropriately (for example through a process).  During such an escalation, it can actually recommend what the most appropriate action might be.  Reports can be generated to slice and dice all this information in different ways (based on topic, customer type, sentiment, etc.).

Wednesday, April 19, 2017

bpmNEXT 2017 (part 3)

Starting the second day at bpmNEXT, where Edson is kicking off (with Bruce) on the DMN execution engine we've been working on in the Drools project.

An executable DMN solution for business users
Bruce Silver - methodandstyle.com, Edson Tirelli - Red Hat

Bruce and Edson are showing the first complete implementation of DMN. It's a collaboration of multiple companies, where Trisotech is providing a DMN modeler, Red Hat has a completely open-source DMN execution engine (as part of the Drools project - the first and currently the only implementation passing the full TCK) and Method and Style is offering a methodology and guardrails around it.
Bruce did a demo where he showed a decision table that defines some decision logic, and validation that helps you find potential issues with the table.  Using a loan pre-qualification example, he showed a DRD that helps find the interest rate for an applicant.  After executing the rules, the decision can be visualized by annotating it on top of the DRD diagram.
Edson zoomed in on various topics: how validation is done at several levels, an example using advanced expressions that depend on level 3 DMN support, how to extend the language with a custom function, and different ways of execution (from embedded to a REST service in the cloud).
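For readers new to DMN, the core evaluation idea of a decision table can be sketched as plain code.  The rates and thresholds below are invented for illustration; this is not the Drools DMN engine (which implements the full FEEL specification), just the semantics of a unique-hit table where rules don't overlap and the matching rule determines the output.

```python
def interest_rate(credit_score, loan_amount):
    """Unique-hit decision table sketch: exactly one rule matches per input."""
    rules = [
        # (condition on the inputs, output value)
        (lambda s, a: s >= 700 and a <= 300000, 3.5),
        (lambda s, a: s >= 700 and a > 300000, 3.9),
        (lambda s, a: 600 <= s < 700, 4.8),
        (lambda s, a: s < 600, None),  # null result: not pre-qualified
    ]
    for condition, rate in rules:
        if condition(credit_score, loan_amount):
            return rate
    return None

print(interest_rate(720, 250000))  # 3.5
print(interest_rate(550, 100000))  # None
```

The validation Bruce demoed catches exactly the problems this encoding makes easy to introduce: overlapping rules, gaps in the input space, and so on.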

Boost business process agility with DMN
Eduardo Chiocconi - Oracle

Oracle is showing their cloud-based DMN service that allows you to create and publish a DMN decision service.  Using an expense approval process, they showed how to first model the decision using DMN and then integrate the resulting decision service into a process.  This is targeting business professionals rather than IT personnel.  By adding a decision service to the process (the tool is aware of which decision services are available) and mapping the inputs and outputs (graphically, avoiding any scripting), the DMN service can be integrated into the process.  By extracting the decision logic from the process itself, it gets a separate life cycle and can be updated dynamically.

Making the standard real: the DMN TCK
Keith Swenson - Fujitsu

Keith presented the DMN TCK.  It's a set of DMN models (focusing on the most important use cases), input data sets and expected results (using an XML format for both).  Once a runner is provided (that is able to invoke the implementation of a particular vendor), it produces a CSV with the results.  DMN supports multiple levels of compliance, where level 3 includes full FEEL support.  There is currently only a limited set of test cases, but Keith asked everyone to submit their own test cases to extend the TCK.  Great results in one year, given that this basically started from a discussion at his bpmNEXT presentation last year.
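The runner's job is conceptually simple, as this hedged sketch shows: feed each test case's inputs to the vendor's engine, compare against the expected results, and write a CSV report.  (The real TCK stores inputs and expected results as XML files; the model name, test case and toy "engine" below are made up to show the flow.)

```python
import csv
import io

def run_tck(test_cases, evaluate):
    """`evaluate` stands in for a vendor's engine: (model, inputs) -> actual result."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["model", "test", "result"])
    for case in test_cases:
        actual = evaluate(case["model"], case["inputs"])
        outcome = "SUCCESS" if actual == case["expected"] else "FAILURE"
        writer.writerow([case["model"], case["name"], outcome])
    return buffer.getvalue()

# One hypothetical test case and a trivial stand-in engine:
cases = [{"model": "greeting.dmn", "name": "001",
          "inputs": {"name": "John"}, "expected": "Hello John"}]
report = run_tck(cases, lambda model, inputs: "Hello " + inputs["name"])
print(report)
```

Because every vendor runs the same cases and publishes the same CSV shape, the results are directly comparable across implementations.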

Decision enabled robotic process automation
Larry Goldberg - Sapiens DECISION

Larry presented a use case where processes, decisions and robots are combined.  Incoming data is first sent to a decision service (which he called the brain) to determine which robots (using RPA) need to be triggered to collect additional information from back-end systems.  Further validations by the brain and/or manual checks continue until the brain is satisfied.  This allowed the company to increase the number of transactions it was handling while at the same time reducing the number of FTEs required to perform the work (as a lot of the data collection was automated using RPA).  Larry showed how Sapiens DECISION can be used to define the underlying processes and decisions.

Accelerating digital transformation with an Open Cloud Platform
Harald Schubert - SAP

SAP presented their Cloud Platform, and more specifically the Workflow service within it.  Harald showed how to build a new process from scratch, with a simple process that included a human task.  When an instance of this process was started through their UI, the task ended up in the task inbox as expected.  Internally they are using the open-source Activiti engine for execution.  SAP Cloud Platform comes with a lot of out-of-the-box platform services that can be integrated into the process, like for example a gamification service (and associated UI).  This service was called through the SAP Integration layer (as the process currently only supports REST calls).

bpmNEXT 2017 (part 2)

This afternoon the demos are starting, where the conference is again using the Ignite format: every presenter first gives a 5-minute presentation (15 seconds per slide) followed by 20 minutes of live demo.  This format is used to force presenters to focus on the demo itself.

Creating a Digital Workforce with Robotic Process Automation
Anthony Yung - Kofax

Kofax is showing their Robotic Process Automation (RPA) solution.  As a use case he showed a "customer due diligence" process (also called Know Your Customer) where a custom application is used to collect the necessary information about a customer, and then a "robot" is used to analyze some of that information, for example doing a Google search on the customer and analyzing the results.
He showed the Kofax designer where the Google search was defined as a number of manual steps (open the Google page, enter the query, perform the search, collect the results from the results page, etc.), without requiring any scripting.  This "robot" was then exposed as a service (available through REST) so it can be reused.  These robots can then be called from your business processes as well.

BPM with Blockchain
Miguel Valdes Faura and Lionel Palachin - Bonitasoft

Bonitasoft integrated their BPM platform with blockchain, mainly to achieve the following advantages: allowing multiple partners to trust a common process, customer engagement, and end-to-end traceability.  The use case they showed was a car order management process, where a car is sold to a customer (including payment and notification).  The car is modeled as a blockchain asset, and they implemented several connectors to interact with the blockchain from the process, for example to create a transaction, etc.
They made a case for integrating BPM and blockchain both ways: having your processes interact with blockchain, but also having companies that build on top of blockchain use BPM to offer custom applications for their use case.
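The traceability property that makes this interesting for multi-partner processes can be illustrated with a minimal hash chain (illustrative only; the demo used a real blockchain platform with Bonita connectors, not this toy, and the transaction fields are invented).

```python
import hashlib
import json

def add_block(chain, transaction):
    """Append a transaction; each block commits to the previous block's hash,
    so tampering with any past event changes every later hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"tx": transaction, "prev": prev_hash}, sort_keys=True)
    block = {"tx": transaction, "prev": prev_hash,
             "hash": hashlib.sha256(payload.encode()).hexdigest()}
    chain.append(block)
    return block

chain = []
add_block(chain, {"asset": "car-42", "event": "ordered"})
add_block(chain, {"asset": "car-42", "event": "paid"})
# Any partner can verify the chain links up end to end:
intact = all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain)))
print(len(chain), "blocks, intact:", intact)
```

This is why multiple partners can trust the shared process history: no single party can quietly rewrite a past step of the car's life cycle.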

Real-time Process Deviance Monitoring
Michal Rosik - Minit

Minit is using process mining techniques for deviance monitoring (i.e. searching for abnormal behavior at runtime).  Their tool allows you to look at the collected data (where it's not a requirement that the use case is already modeled as a process) from different perspectives, like frequency (how many times an activity is executed), time (which activities are causing delays), financial (what's the cost associated with each activity), etc.  The mined process diagram is annotated graphically with the relevant information.  By defining which variants are acceptable, they can monitor at runtime for any deviations, using dashboards that show runtime information and allow you to drill down in case deviations are detected.  This is also applicable in the context of IoT, where a much larger number of events is typically expected.
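The frequency and time perspectives boil down to simple aggregations over an event log, as this sketch shows (the log format and numbers are hypothetical; real tools also mine the process diagram itself from the log).

```python
from collections import defaultdict

events = [  # (case id, activity, duration in minutes)
    ("c1", "register", 5), ("c1", "review", 30), ("c1", "approve", 10),
    ("c2", "register", 4), ("c2", "review", 90), ("c2", "reject", 2),
]

frequency = defaultdict(int)   # frequency perspective: executions per activity
total_time = defaultdict(int)  # time perspective: where the delay (and cost) sits
for case, activity, duration in events:
    frequency[activity] += 1
    total_time[activity] += duration

# "review" stands out on the time perspective even though its frequency
# is the same as "register":
print(dict(frequency))
print(dict(total_time))
```

Deviance monitoring then compares each running case's sequence of activities against the set of accepted variants and flags anything outside it.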

Analytics for leveraging BPM assessment and management action
Jude Chagas Pereira, Frank Kowalkowski, Gil Laware - IYCON

IYCON's Afterspyre uses analytics to look at data (pulled in from different data sources) to help analysts get better insights and make better decisions.  For example, it can look at existing BPM data and help decide which processes are the best candidates for optimization, etc.  It can also compare two processes to detect how similar they are, or rank them based on different attributes.  Charts present this information at a higher level for managers to consume.  It also supports semantic analysis of the text and keywords used in processes.

The recipe for successful digital transformation
Derek Miers - Structure Talent, MWD Advisors

Derek made the case that digital transformation isn't just "process + technology, and stir".  BPM isn't the silver bullet that will solve everything; you have to engage your audience to get there.  Business transformation requires you to rethink and change everything you do (up to the entire structure of your company), and it needs to be customer-centric.
He presented a framework for business transformation that starts from understanding your customer's experience.  He mentioned that BPM sometimes seems to focus much more on improving existing processes rather than helping people rethink them and co-create their future.  Sometimes you need to redesign outside-in.  And BPM vendors should think about how they can help their customers do that (rather than optimizing processes they don't need).

Now we're off to the roof top for some drinks and dinner !

Tuesday, April 18, 2017

bpmNEXT 2017 (part 1)

Back in Santa Barbara this year to attend the bpmNEXT conference, where I will be speaking on Thursday.  But before that happens, we have a full 3 days of presentations and (even more importantly) demos from a lot of different vendors and experts.

BPM 2017-2021: Outlook for the Next Five Years
Nathaniel Palmer

Nathaniel started with his view of the BPM market.  In 2016, he predicted that the 3 R's (Robots, Rules and Relationships) would define the BPM market.  It's clear that rules have a significant impact on BPM nowadays as a way to drive decisions (for example with DMN).  According to Google Trends, Robotic Process Automation (RPA) is gaining more attention as well.  The interest in business processes is pretty stable (where BPMN is kinda following the same trend, just at a smaller scale).  Digital Transformation is a term that has grown in a similar way, although BPM is still the overarching term that combines all of this.
Automation is forcing us to step away from the traditional architecture associated with BPM.  The future of BPMS vendors isn't just about process management but also includes automation, machine learning and decision management, all driven by an event-driven foundation.
Rather than predicting where BPM is going, he suggested we would all work towards defining it ourselves.  Or as he said, let's all "make automation great again"!

The Top 10 Technologies that will impact BPM in the next 5 years
Jim Sinur - Aragon Research

Jim, with his 50 (!) years of experience in IT, and many years of experience as analyst for BPM (for Gartner and Aragon Research nowadays), highlighted a few technologies he believes are going to be more and more important, including for example:
  • Predictive apps get smarter (decision management is key)
  • Big data and learning (using machine learning, deep learning and cognitive computing)
  • Internet of things (standardization kicking off there now - resulting in a lot of smart devices at the edges and more goal-driven decentralized management)
  • Rise of chatbots (moving to full language and action)
  • Virtual Reality
  • Work hubs (workbenches focused on specific roles)
  • Drones
  • Blockchain
Jim believes the Digital Business Platform (DBP) is what combines (or will combine) all these technologies (kinda disagreeing with Nathaniel there) as a place where business and IT collaborate.  Things like digital identity (including your preferences) and change management (across technologies) will be key.

The New Wave of Automation
Neil Ward-Dutton - MWD Advisors

Neil explained how a major shift in our experience of automation is underway.  Traditionally, we have been trained to work around the limitations of automation (we are, for example, all used to pushing keys on computers the entire day), but that is changing: automation is now changing for us.  Neil introduced 3 layers of change, called the 3 I's: Interaction (sending and responding more like humans - like chatbots), Insight (interactive analytics - like recommendation engines) and Integration (resources being exposed with open interfaces - like smart infrastructure).  The main drivers from his point of view are rapidly changing technology, business pressures and familiarity with automation.  Use cases range from automating high-volume routine tasks to low-volume expert assistance and everything in between (making everyone as good as the best).
He concluded with some guidance for the audience (if we want to help define the future): embrace the shift to self-service, the shift to networked (cloud-based) platforms and the shift to learning systems.

The Great Migration: How to survive the leap from BPM as we knew it to the era of the digital workforce
Clay Richardson - Digital FastForward

Clay believes 75% of current BPM programs won't survive the shift to digital.  He is no longer a BPM analyst at Forrester, as he wanted to focus more on actually helping customers make the digital step (not just getting them excited).  The challenge is not necessarily the technology, but how to get (and keep) the right skills and mindset for digital transformation.  You will have to use approaches like hiring new talent, reinventing the workforce or outsourcing innovation (or all of the above).  And help teams design, validate and learn (using new methodologies and tools).  It's not just about what you learn but how you learn: it needs to be interactive and immersive (learning gamification).  And put these people in front of the customer (even if it means leaving their comfort zone) so they learn about what it is they need.
Want to take this gamification to the extreme? Apparently you learn better when combining learning with escape rooms - even including the zombies - looking forward to that experience ! :)

Saturday, April 15, 2017

bpmNEXT and Red Hat Summit

We have a lot of interesting things in the works for Drools, jBPM and OptaPlanner, and we are going to show you!  And what better opportunity to take a look under the hood at what is coming than joining us for a session, a side talk, or over happy hour at the upcoming conferences?

Here is a short list of the sessions we have at two great conferences this month.  First of all, at bpmNEXT, Edson will talk about DMN support and I will give a talk on unstructured cases.

Red Hat Summit is 2 weeks later, and there will be a larger number of presentations around BPM (and a lot more, of course)!  We hope to meet you there!

Oh, and check the bottom of this post for a discount code for the Red Hat Summit registration!


Santa Barbara, California April 18-20, 2017