Wednesday, November 20, 2019

Kogito deep dive video from Devoxx

This year at Devoxx Belgium, Maciej, Edoardo and Mario held a 3h deep dive on Kogito.  Since Devoxx is awesome enough to share the recordings of all their presentations online, I wanted to give everyone the opportunity to go and watch it!



I also helped out at the Red Hat booth for 2 days, which was a great opportunity to sync up with a lot of people and do some Kogito evangelization.  And I was there live for the big announcement of the Quarkus 1.0 release!



Wednesday, September 18, 2019

Etymology of Kogito

After writing up an introduction to our Kogito effort, it seems people are interested in hearing a little bit more about the name: where it comes from, what the logo means, and (what seems to be the most important question) how to pronounce Kogito.  Yes, there even was a JIRA issue [KOGITO-284] opened to address this!

First, the name Kogito itself comes from:
"Cogito, ergo sum"
a Latin philosophical proposition by René Descartes, usually translated into English as "I think, therefore I am" [Wikipedia].  So Kogito simply means "I think", and refers to how users encode business knowledge in various formats (processes, rules, constraints, etc.).  The 'c' was replaced with a 'k' as a reference to Kubernetes, our target cloud platform, and to KIE, where the 'k' stands for knowledge.

"Kogito, ergo automate" therefore means, "I think, therefore I automate" and refers to the use of business automation to encode business knowledge.



Our logo is a reference to Odin, the Norse god who gave up an eye for wisdom [Wikipedia].
“According to mythology, Odin ventured to the mystical Well of Urd at the base of the world-tree that holds the cosmos together. The well was guarded by Mimir, a shadowy being who becomes all knowing by drinking the magical waters. Odin asked for a drink and Mimir replied that Odin must sacrifice an eye for a drink. Odin gouged out his own eye, dropped it into the well, and was allowed to drink from the waters of cosmic knowledge.”
Finally, how do you pronounce Kogito?  Since it comes from the Latin phrase "Cogito, ergo sum", the obvious first question is how to pronounce that.  As it turns out, that's not an easy question to answer either, but in the end the Italians on our team proclaimed this to be the only correct pronunciation:
[ˈkoː.d͡ʒi.to]
so that's with the emphasis on the first syllable, and the 'g' pronounced as 'dji'.  Or (if, like me, you're not skilled in phonetic notation at all ;)) just listen to the video below:


Some good news though: since it seems no mortal is able to consistently pronounce it this way, other pronunciations are completely fine too!

Monday, September 16, 2019

An intro to Kogito

The KIE team has been working for quite a few months on the Kogito project, our next-gen solution leveraging processes and rules for building intelligent cloud-native applications.

 

What are we trying to achieve?  When you, as a developer or team, are building intelligent cloud-native applications, Kogito wants to help by letting you use processes and rules in a way that matches that ecosystem.  Kogito focuses on making it as easy as possible for developers to turn a set of processes and/or rules into your own domain-specific cloud-native service (or set of services).


This is a continuation of the efforts of the KIE team (including the Drools, jBPM, OptaPlanner and AppFormer teams) to offer pure open-source solutions for business rules, business processes and constraint solving.  The KIE team however decided to start a new effort targeting this goal specifically, for the following reasons:
  • Technology-driven: As you will see below, there's a lot of great technology available for building cloud-native applications, but to be able to fully leverage these technologies in the context of business automation, we had to make a few radical changes.

  • Focus and innovation: We wanted to focus specifically on what is needed to build next-gen cloud-native applications, and how you can leverage processes and rules in this context.  This allows us to offer something that really fits this ecosystem and doesn't bring in additional baggage that isn't relevant.
So while this effort builds on years of experience and battle-tested capabilities, this also allowed us to leave some baggage behind and focus 100% on the problem at hand.

Kogito, ergo cloud
 
When you're building cloud-native applications, there's a lot of great technology out there (some of which you're probably already using).  Kogito is closely aligned with and leverages these technologies, so you can build highly scalable cloud-native services with extremely quick startup times and a low footprint.  Picking up some of these technologies and truly taking advantage of them sometimes required quite radical changes (so this is definitely not a lift-and-shift of our existing engines; it is built from the ground up).

For example:
  • Kubernetes is our target platform for building and managing containerized applications at scale.
  • Quarkus is the new Kubernetes-native Java stack that you can leverage when building Kogito applications, and it's a game changer.  But don't worry: if you are building your applications with Spring Boot, we will help you with that as well!
  • GraalVM allows you to use native compilation, resulting in extremely quick startup times (a native Kogito service starts about 100x faster, ~0.003s) and minimal footprint, which is almost a necessity in this ecosystem nowadays, especially if you are looking at small serverless applications.  If you're interested in what's behind this, I recommend reading Mario's blog about it.
  • Building serverless applications?  Leverage Knative and Kogito together so your applications can scale up, or down to zero, based on demand.
  • Kogito applications behave like any other service you build, so you can instantly leverage technologies like Prometheus and Grafana for monitoring and analytics with optional extensions.
  • Internally we leverage quite a lot of other core middleware technologies like Kafka, Infinispan, Keycloak, etc.  This means we take care of setting these up (on demand, for example for our internal messaging, persistence and security requirements), but we strongly encourage you to start leveraging these technologies for your own use cases as well.

Kogito, ergo developer

We want to make the life of developers easy, by offering them instant productivity and making sure we integrate well with how they are building their applications.  So rather than asking developers to come to us with their requirements, we are coming to them!
  • The tooling required to build your processes and rules needs to be closely integrated with the workflow the developer is already using to build cloud-native services.  Therefore we have spent a lot of time on making this tooling embeddable.  For example, we just released the first alpha release of our VSCode extension (see video below, credits to Alex), which allows you to edit your processes (still using the BPMN 2.0 standard) from within VSCode, next to your other application code.  We're working on a similar experience for Eclipse Che.
  • Instant productivity means it should be trivial to develop, build and deploy your service locally, so you can test and debug without delay (see the test sketch after this list).  Both Quarkus and Spring Boot offer a dev mode to achieve this, with Quarkus even offering live reload of your processes and rules in your running application (extremely useful in combination with the advanced debug capabilities).
  • Once you're ready to start deploying your service into the cloud, we take advantage of the Operator Framework to guide you through every step.  The operator automates a lot of the steps for you.  For example, you can just give it a link to where your application code lives in git, and the operator can check it out, build it (including native compilation if necessary) and deploy the resulting service.  We are working on extending this to also provision (on demand) more of the optional services that you might need (like, for example, a Keycloak instance for security, or Infinispan for your persistence requirements).  We also offer a Command Line Interface (CLI) to simplify some of these tasks.
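
As a minimal sketch of what that local testing experience can look like, assuming a Quarkus-based Kogito project where a hypothetical "orders" process has been exposed as a REST endpoint (the path and payload below are illustrative, not actual generated output):

    import static io.restassured.RestAssured.given;
    import static org.hamcrest.CoreMatchers.notNullValue;

    import io.quarkus.test.junit.QuarkusTest;
    import org.junit.jupiter.api.Test;

    // Boots the service in-process, so the generated endpoint can be
    // exercised like any other REST resource.
    @QuarkusTest
    public class OrdersEndpointTest {

        @Test
        public void startsAnOrderInstance() {
            given()
                .contentType("application/json")
                // domain-specific JSON, matching the process variables
                .body("{\"order\": {\"article\": \"laptop\", \"amount\": 1}}")
            .when()
                .post("/orders")
            .then()
                .statusCode(200)
                .body("id", notNullValue());
        }
    }

In Quarkus dev mode, a similar request against the running service picks up changes to the underlying BPMN file without a restart.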

Kogito, ergo domain

Kogito has a strong focus on building your own domain-specific services.  While we hope you can leverage our technology to significantly help with that, we want developers to be able to build the service they need, exactly how they want it.  As a result, the fact that Kogito does a lot of the hard work under the hood is typically hidden, and your service exposes itself like any other, with its own domain-specific APIs.
To achieve this, Kogito relies a lot on code generation.  By doing so we can take care of 80% of the work, as we can generate a domain-specific service (or services) for you, based on the process(es) and/or rule(s) you have written.  For example, a process for onboarding employees could result in remote REST API endpoints being generated that you can use to onboard new employees or get information on their status (all using domain-specific JSON data).
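
To make this concrete, here is a rough, hand-written sketch of the shape of such a generated endpoint, loosely modeled on the public org.kie.kogito.process API (the actual generated code differs; the class and variable names here are purely illustrative):

    import java.util.HashMap;
    import java.util.Map;

    import javax.inject.Inject;
    import javax.inject.Named;
    import javax.ws.rs.Consumes;
    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    import org.kie.kogito.Model;
    import org.kie.kogito.process.Process;
    import org.kie.kogito.process.ProcessInstance;

    // Sketch of the kind of resource generated for a BPMN process with
    // id "onboarding"; the process variables become a typed model class.
    @Path("/onboarding")
    public class OnboardingResource {

        @Inject
        @Named("onboarding")
        Process<OnboardingModel> process;

        @POST
        @Consumes(MediaType.APPLICATION_JSON)
        @Produces(MediaType.APPLICATION_JSON)
        public OnboardingModel onboard(OnboardingModel data) {
            // each POST starts a new process instance carrying the
            // domain-specific data (the employee being onboarded)
            ProcessInstance<OnboardingModel> instance = process.createInstance(data);
            instance.start();
            return instance.variables();
        }
    }

    // Simplified stand-in for the model class generated from the
    // process variables; callers only ever see this domain data.
    class OnboardingModel implements Model {
        public String employeeName;
        public String status;

        @Override
        public Map<String, Object> toMap() {
            Map<String, Object> map = new HashMap<>();
            map.put("employeeName", employeeName);
            map.put("status", status);
            return map;
        }

        @Override
        public void fromMap(Map<String, Object> params) {
            this.employeeName = (String) params.get("employeeName");
            this.status = (String) params.get("status");
        }
    }

The important point is that nothing in the exposed API talks about process engines: clients simply POST employee data and get employee data back.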


Additionally, domain-specific data can also be exposed (through events or in a data index) so it can easily be consumed and queried by other services.


Architecture

When using Kogito, you're still building a cloud-native application as a set of independent domain-specific services, collaborating to achieve some business value.  The processes and/or rules you use to describe the behavior are executed as part of the services you create, highly distributed and scalable (no centralized orchestration service).  But (by using this additional compilation step) the runtime your service uses is completely optimized for what your service needs, nothing more.

If you need long-lived processes, runtime state can be persisted externally in a data grid like Infinispan.  Each service also produces events that can be consumed.  For example, using Apache Kafka, these events can be aggregated and indexed in a data index service, which offers advanced query capabilities (using GraphQL).
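
As a small illustration of how another service might tap into that event stream, here is a minimal sketch of a plain Kafka consumer (the topic name is an assumption for illustration; the actual topic depends on your configuration):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class ProcessEventTail {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "process-event-tail");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // assumed topic name; the service publishes its events as JSON
                consumer.subscribe(Collections.singletonList("kogito-processinstances-events"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // each record describes a process instance state change,
                        // including the domain-specific variables
                        System.out.println(record.value());
                    }
                }
            }
        }
    }

This is the kind of stream the data index service consumes to build its GraphQL-queryable view.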


What's coming next?

At this point, Kogito 0.3.0 is the latest release (from August 23rd), but we have much more coming on our roadmap before our 1.0.0 release which is targeted towards the end of the year. 


Get started

And now I believe you are ready to give it a try yourself, so please do and let us know!  You can start by building one of the out-of-the-box examples, or by creating your first project from scratch.  Follow our getting started documentation here!  You will see that you can build your own domain-specific service in minutes.

Or if you want to watch a small presentation (and demo!) from Maciej, check out his latest DevNation Live talk here.

Wednesday, April 24, 2019

bpmNEXT 2019 impressions, day 3

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


The last (half) day, where I have to present as well (third talk of the day).

A Well-Mixed Cocktail: Blending Decision and RPA Technologies in 1st Gen Design Patterns
Lloyd Dugan

Lloyd introduced an RPA-enabled case management platform used to determine eligibility under the Affordable Care Act.  Using Sapiens for decisions and Appian for BPM, approximately 4000 people use this as a work management application (where work is assigned to people so they can work through it).  To achieve higher throughput, they combined this with RPA robots that emulate the behavior of the users.  He showed (unfortunately in a prerecorded video, not a live demo) how they implemented the robots to perform some of the work (up to 50% of the total work done by the users!).  The robots learned how to soft-fail if there were issues (in which case the work would go back into the queue), needed to accommodate for latency, etc.




Emergent Synthetic Process
Keith Swenson - Fujitsu

Keith presented a way to adapt processes to different contexts (for example, slightly different regulations or approaches in different countries) by generating a customized process for your specific context when you start the process.  Rather than encoding processes in a procedural manner (after A, do B), he uses "service descriptions" to define the tasks and their preconditions.  You can then generate a process by specifying your goal and context and working backwards to create a customized process.  This makes it easy to add new tasks to these processes (as the logic is much more declarative and therefore additive).
The demo showed a travel application with approval by different people. Service descriptions can have required tasks, required data, etc.  The process is generated by working backwards from the goal, adding required steps one by one.  Different countries can add their own steps, leading to small customizations in the generated process.




Automating Human-Centric Processes with Machine Learning
Kris Verlaenen - Red Hat

I was up next!  I presented on how to combine process automation and Machine Learning (ML) to create a platform that combines the benefits of explicitly encoding business logic (using business processes, rules, etc.) with the ability to become more intelligent over time by observing and learning from the data during execution.  The focus was on introducing "non-intrusive" ways of combining processes with ML, to assist users with performing their tasks rather than to try and replace them.
The demo used the it-orders application (one of our out-of-the-box case management demos, which employees can use to order laptops) and focused on three main use cases:
  • Augmenting task data:  While human actors are performing tasks in your processes or cases, we can observe the data and try to predict task outcomes based on task inputs.  Once the ML model (a random forest, implemented with the SMILE library) has been trained a little, it can start augmenting the data with possible predictions, but also with the confidence it has in each prediction, the relative importance of the input parameters, etc. (see the training sketch after this list).  In this case, the manager approving the order would be able to see this augmented data in their task form and use it to make the right decision.
  • Recommending tasks:  Case management allows users to add additional dynamic tasks to running cases in specific situations (even though they weren't modeled in the case upfront).  Similarly, these can be monitored, and ML can be used to detect patterns.  These can be turned into recommendations, where a user is presented with a suggestion to do (or assign) a task based on what the ML algorithm has learned.  This can significantly help users not to forget things, and assists them by preparing most of the work (they simply have to accept the recommendation).
  • Optimizing processes based on ML:  One of the advantages of the random forest algorithm is that you can inspect the decision trees being trained to see what they have learned so far.  Since ML also has disadvantages (it can be biased, or it simply learns from what is being done, which is not necessarily correct behavior), analyzing what was learned and integrating this back into the process (and/or rules, etc.) has significant advantages as well.  We extended the existing case with additional logic (for example, an additional decision service to determine whether some manager approvals could be automated, or additional ad-hoc tasks that would be triggered under certain circumstances), so that some of the patterns detected by ML would be encoded and enforced by the case logic itself.
These non-intrusive ways of combining processes with ML are very complementary (they let us take advantage of both approaches, which mitigates some of the disadvantages of ML) and allow users to start benefiting from ML and build up confidence in small, incremental steps.
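
For those curious what the training side of the first use case can look like, here is a minimal, illustrative sketch using the SMILE library (SMILE 2.x API; the CSV file and column names are hypothetical, and the real integration observes live task data rather than reading a file):

    import java.util.Arrays;

    import org.apache.commons.csv.CSVFormat;

    import smile.classification.RandomForest;
    import smile.data.DataFrame;
    import smile.data.formula.Formula;
    import smile.io.Read;

    public class TaskOutcomeTrainer {

        public static void main(String[] args) throws Exception {
            // historical task data: task inputs plus the observed "outcome" column
            DataFrame data = Read.csv("approval-tasks.csv",
                    CSVFormat.DEFAULT.withFirstRecordAsHeader());

            // train a random forest to predict the outcome from the inputs
            RandomForest model = RandomForest.fit(Formula.lhs("outcome"), data);

            // predict the outcome for a task, with class probabilities that
            // can be surfaced as a confidence value in the task form
            double[] posteriori = new double[2]; // assuming a binary approve/reject outcome
            int predicted = model.predict(data.get(0), posteriori);
            System.out.printf("prediction=%d, confidence=%.2f%n",
                    predicted, posteriori[predicted]);

            // relative importance of the input parameters, also shown to the user
            System.out.println("feature importance: " + Arrays.toString(model.importance()));
        }
    }

The posterior probabilities and feature importances map directly to the confidence and input-importance information shown in the manager's task form.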




ML, Conversational UX, and Intelligence in BPM
Andre Hofeditz, Seshadri Sreeniva - SAP SE

SAP is presenting "live processes" that are created by combining predefined building blocks, running on their platform with support for conversational user experience, decision management, task inbox, etc.  

The SAP API Business Hub has been extended to also include live processes.  Using an employee onboarding scenario, they show how a running instance can be "configured" (only in specific situations, which you can define during authoring), after which you can change the template and generate a new variant.  The process visibility workbench allows you to generate a customizable UI for monitoring the progress of your processes.
Next, they show how you can extend the platform by using recipes, which can be imported in SAP web IDE and deployed into the platform, adding additional capabilities that will be available in your live processes from that point forward.
Finally, they showed an intelligent assistant, a sort of chatbot that can respond to voice.  It can give an aggregated view of your tasks, complete tasks through the conversational UI, etc.  They showed how the chatbot can be programmed by defining tasks with triggers, requirements and actions, which can then be deployed as a microservice on the SAP cloud.




DMN TCK
Keith Swenson 

Keith explained the efforts that are going into the DMN TCK, a set of tests to verify the compliance of DMN engines.  Running these tests takes a large number of models and test cases (currently over a thousand and still growing) and checks the results.  He explained some of the challenges and opportunities in this context (e.g. error handling).
While many vendors claim DMN compatibility, Red Hat is one of the few vendors that actually has the results to prove it!



That concludes bpmNEXT 2019!  As in previous years, I very much enjoyed the presentations, but probably even more the discussions during the breakouts and evenings.

Wednesday, April 17, 2019

bpmNEXT 2019 impressions, day 2 (part 2)

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


BPM, Serverless and Microservices: Innovative Scaling on the Cloud
Thomas Bouffard, Philippe Laumay - Bonitasoft

Bonitasoft explained how they are containerizing their BPM platform as microservices and using new technologies like serverless.

The demo shows a payroll process with some human interaction.  They simulate a load of 100 users using a script that runs requests simultaneously; under this load, the UI becomes unresponsive.  After containerizing their application, they use Kubernetes to scale the engine up to 3 pods, increasing the capacity the engine can handle.
In a second step, they externalize some of the work that the process is doing as an asynchronous lambda function.  This moves some of the CPU usage outside of the BPM platform, making it easier for the engine itself to scale.




Performance Management for Robots
Mark McGregor and Alessandro Manzi - Signavio

Signavio shared their strategy on combining robots with human actors and how to manage your robotic workforce.  They recommend treating robots similarly to human resources, by making sure they have clear job descriptions, are evaluated and, if necessary, fired.

Signavio has traditionally offered tools to analyze the performance of your existing processes, which can for example be used to identify tasks that would be appropriate for RPA.  Once identified, simulation allows you to figure out what the consequences of applying robots to some of the work would be.  For example, the solution using robots might have a higher cost (due to licenses, for instance) but decrease cycle time, allowing you to make a conscious decision.  Detailed analysis of the robots' performance once applied could even lead to a robot being fired if it performs badly.




The Case of the Intentional Process
Paul Holmes-Higgin, Micha Kiener - Flowable

Flowable is using "micro-processes" (in BPMN) to manage chatbots and describe their behavior.  Around these micro-processes, they use case management (CMMN) to link different chatbots together and make sure they are used in the right context.

In the demo they have different types of chatbots to assist banking customers with various kinds of functions.  In their cases they model "intents" that the chatbot will try to detect, and how to respond when an intent should be executed (using signal events that trigger a specific process fragment), all context-aware.
Using the chatbot's conversational UI, you can run through a process, where the chatbot asks for all the appropriate information step by step, following the process and collecting the results.  If necessary, it can recommend switching to a different chatbot or to a real human user.  The chatbot interface also supports various commands (e.g. /create task ...) and shows other relevant information (e.g. open tasks) in the UI.
When continuing the conversation through WhatsApp (which has the limitation that you can only show text, no buttons), the chatbot is smart enough to be aware of those limitations and falls back to text-based replies (e.g. type "yes" or "no") instead of buttons or forms.




Industry Round Table: Advancing the Value Proposition of 'Intelligent Automation'

Nathaniel launched a panel discussion about the name: should we move away from BPM to "Intelligent Automation"?
  • Even though some products offer a unified platform to apply BPM, DM, RPA, AI and integration, customers are often still doing it in a siloed way; customers need to think more holistically
  • Intelligent might refer more to A.I., automation might refer more to RPA -> taking advantage of some of the hype around these technologies to sell the portfolio.
  • We might be selling BPM technology, but we might be marketing more a vision that is broader than that.
  • BPM is not just technology, it's a methodology
  • Intelligent Process Automation?
  • Workflow or orchestration is becoming more popular again (as a name for the technology) but is less marketable
  • What are we doing?
    • Technology to help by automating some of the work
    • "Free the humans"
That concludes the second day; half a day left tomorrow, where I will be presenting.

Tuesday, April 16, 2019

bpmNEXT 2019 impressions, day 2

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


Keynote: Best of Breed: Rolling Your Own Digital Automation Platform using BPMS and Microservices
Sandy Kemsley

Sandy Kemsley (who is also blogging about the various presentations) started off day 2 with a keynote on how large customers are building their own digital automation platforms in-house, leveraging available technologies like BPM.  Nowadays, a best-of-breed approach is replacing the legacy "monolith".  In the last decade, the BPMS became the new monolith, because it was trying to fill a gap in app development (with constantly increasing requirements around forms, graphical modeling, BAM, etc.), which led to large suites including one specific solution for each of these requirements.  Agility, however, is a new competitive differentiator.
The new Digital Automation Platform is much more a (dynamic) collection of independent microservices, where the best-of-breed approach allows you to swap services in or out.


This might not be the solution for everyone (yet), but might be interesting for small to mid-sized companies looking for a COTS system to manage core processes, or for large companies with a large development team.
As a lesson for vendors, she recommends separating components and pricing accordingly, and making sure you can build microservices for your processes and decisions.




Business Automation as a Service
Denis Gagne - Trisotech

Trisotech presented their business automation as a service offering, allowing business users to express their logic in a simple way and now also execute it directly.

The demo starts with a simple process to turn on the light when a twitter message is received.  After defining this simple process within the tool, it's deployed into the cloud with one click, and the lamp he brought on stage starts flashing every time someone tweets #Trisotech.  Next, the process is extended to include a sentiment analysis service, to analyze the text included in the tweet, after which the light starts flashing green (or yellow or red) for every positive (or neutral or negative) tweet.
Next, a more complex example is used to track customer leads.  When going to a demo website, you can register your details and the process will route your request to the right sales person, email you the slides and register you in the CRM system.

Trisotech is working closely with Red Hat, so it's great to see how they have built this tool to allow people to quickly create and deploy processes and decisions into the cloud.




Business-Composable Services for the Mortgage Industry     
Bruce Silver - Method and Style

Bruce shares his experience of using Trisotech to model a use case in the mortgage industry, using a combination of BPMN and DMN.

Applying for a loan requires quite complex decision models to determine eligibility, loan amount, etc.  The mortgage industry has standardized quite a lot of this, which enables creating some form of reusable service.  A DMN model is used to describe the logic, using FEEL for the expressions.  While the logic is sometimes complex, the resulting model should still be understandable by domain experts.
The input data is a standardized XML format (MISMO), which is mapped to a more DMN-friendly format (including validation, etc.) using a separate process that is deployed as a service as well.  Similarly, the input can also be a PDF file, in which case a different process is used to extract the data.  Using a simple test web page to provide the inputs (generated as part of the process deployment), the service produces the expected results.




Industry Round Table: The Coming Impact of Decision Services and Machine Learning on Business Automation

Another panel, this time focusing on decision services and A.I.
  • Consensus (at least here it seems) that decision management has great synergy with process automation.
  • Standards are really important, although not all vendors are using BPMN or DMN, which is fine
    • DMN is not backed as much by the big vendors (Red Hat is one of them though), so its future is still much less clear
  • Need to define and demystify A.I. as there are various types of intelligence
  • Challenges with "black box" A.I. that cannot clearly explain its decisions
  • Ethical considerations
    • Automation is disrupting labor force
    • Some decisions are now being implemented in cold hard code
  • The required skillset to deal with A.I. is only increasing

bpmNEXT 2019 impressions, day 1 part 2

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


Democratizing machine learning with BPM
Scott Menter - BP Logix

In the demo from BP Logix, they show how they have integrated Machine Learning into their Process Director to start using it in combination with processes.

In this case, we are trying to predict employee attrition (whether an employee is likely to leave the company).  You start by creating a learner object.  After selecting a data source (a database) and possibly some transformations, you select which inputs you want to use (the tool gives information or even suggestions on the available data, and can visualize characteristics of the data you selected) and train the model with the selected data.
The trained model can then be used, for example, in a form to show the potential attrition rate while you are filling in information about an employee, or in a process to drive a decision.

By integrating the learner objects into the Process Director, the learning curve to start using this is much lower, as it's all integrated in one solution (even if the learner objects might actually be created by a different actor).




Leveraging process mining to enable human and robot collaboration
Michal Rosik - Minit

Minit suggests using process mining to improve your RPA strategy.  The strategy is two-fold: (1) use it to pick the right process to apply RPA to, and to select the right activity and person, to get a higher chance of success (as 40% of RPA projects fail); and (2) use it to monitor the results, to make sure everyone is happy.

They apply this to a purchase process, where standard process mining detects various bottlenecks (filling in the right order number, etc.).  The tool lets you drill down several layers to inspect the details of a selected activity, for example how the human actor uses a combination of the browser, Skype, etc., and the steps they take (possibly in multiple variations) to get the necessary information.  These detailed steps can then be used as a basis to generate the RPA script.
After applying the RPA robots to automate some of the steps, the same process mining can be used to monitor and compare the results.  For example, the average completion time might not have improved as expected, in which case we can analyze why that might be (for example that the bots are creating an increased load on the system, causing performance issues).
Finally, Minit dashboards expose all this information in interactive BI charts.




Process mining and DTO - How to derive business rules and ROI from the data
Massimiliano Delsante, Luca Fontanili - Cognitive Technology

Cognitive Technology is moving from traditional process mining to creating a Digital Twin of your Organization (DTO).  This includes process discovery, cost analysis, simulation, etc., but also, for example, a new feature to derive actual business rules from the data (rather than traditional probabilities).

The demo shows the use case of closing a bank account.  They can generate a BPMN diagram from the mining data, and now they even detect correlations for decisions (gateways) using machine learning, to discover the underlying conditions.  After verification by a human actor and/or simulation, these conditions can be added to the process.  The decision can also be extracted separately as DMN, called from the process model.  Finally, simulation can be used to identify possible improvements, for example by applying RPA to automate some of the tasks: the simulation engine can generate new data with the suggested improvements, and this data can then be mined again to verify the results.




Is the Citizen Developer Story a Fairytale?
Neil Miller, KissFlow

KissFlow is a no-code platform for citizen developers.  Neil started by showing the runtime application first: various kinds of forms to start a process, track current state, perform work, etc.  These forms have some pretty advanced features for loading form data, printing, getting assistance, etc.
Next, we shifted to the tool used to create this.  First the forms, composed of various field types, ranging from text fields and dropdowns to tables and advanced fields like signatures.  The process editor is a drag-and-drop tool with a quite different visualization: still a flow chart, but kept as simple as possible for citizen developers (with inline editing of actions inside the diagram, etc.), which reminded me a lot of Zapier for defining integrations.
They are also working on a new KissFlow version 3.0, which will be available soon.  The forms and process modeling still look pretty similar, but this new version adds various features to simplify collaboration: threads where people are collaborating, more adaptive processes using kanban boards, more extensive reports, etc.




Insightful process analysis
Jude Chagas Pereira, Frank Kowalkowski, Gil Laware

Wizly is a tool that allows you to run analysis on collected log data, to do things like compliance checks, correlation checks, relationship and sentiment analysis, etc.

The demo shows a call center use case.  After loading the data of about 2000 cases into the tool, the flow model can be generated from the log data and analytics can be run.  The compliance analysis shows various information about the paths that are (or are not) being executed.  Next, we can run further analysis, in this case zooming in on baggage-related problems.  This allows us to find possible causes (like canceled flights), but also to filter down further to get even more insights.
DNA analysis detects possible paths and can visualize relations in your data (with the capability to filter further down if necessary).  Finally, fourbox plots the data on some form of bubble chart.  They were only able to show some of the features, as they explained there are a lot more analytical capabilities under the hood.




Improving the execution of work with an AI driven automation platform
Kramer Reeves, Michael Lim, Jeff Goodhue - IBM

IBM has worked hard in the last few years to integrate some of their offerings into one unified platform, which they presented here.

The demo starts with the authoring, where the case builder, process designer and decision center are combined to define the business logic.  Next, we switched to the runtime UI, where new cases can be started and managed, and we ran through a few steps of the case.
Next they showed some more advanced integrations: a robot is launched to automatically perform one of the steps, a chatbot helps find the data I need, analysis charts help with decision making, etc.  The final step is to use Watson AI to make recommendations.
Finally, we got a look at the new Business Automation Studio, where you can build business applications in a low-code manner.  You can create forms for business users, and these can be linked (by associating actions with the different buttons) to call new pages or backend functions.




Wrapping up
That concludes day 1 (at least for you; we still have a wine and beer tasting and dinner :-)).  If you are interested in another view of what's happening here, be sure to check out Sandy Kemsley's blog: she is covering the different presentations as well, and has a lot more experience doing this, having covered BPM and related technologies as an independent analyst for many years :)

Monday, April 15, 2019

bpmNEXT 2019 impressions

Back again this year in Santa Barbara for three days of discussions at bpmNEXT between (mostly) vendors about what we collectively believe the future might look like, what challenges we face and how we can solve them.

And I will be blogging about my impressions of the presentations here.

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


BPM 2019-2023: Outlook for the next five years 
Nathaniel Palmer

Nathaniel kicked it off with his traditional presentation looking forward and predicting what we might see in the next few years in the context of “Intelligent Automation” (yes, the conference is still called bpmNEXT, but the consensus seems to be that this term better covers what we are discussing here).  He actually started by looking back at the predictions from 5 years ago, when he predicted the 3 R’s (Rules, Relationships and Robots).  This all seemed fairly accurate, as “robots” are here now, and they definitely need rules.  And the data is scattered across many systems (over 13 systems on average, many external) that all need to be related.  Some of the 2019 predictions are that, by 2022:
  • 50% of the work will be done by robots
  • 70% of the work will be done on third-party cloud platforms: this means the “Intelligent Automation Platform” architecture, first presented a few years ago, has been updated a little to reflect this (the event bus is now much more inherently part of the cloud itself)
  • 80% of the user interaction will be done through an interface other than the smart phone (think smart speakers), moving from a worklist metaphor to much more conversational interaction.
Since the concept of work is now much broader (including robots, autonomous intelligence, decision services, etc.), what’s the best way to represent and model this?  Traditional flow charts have reached their limits in representing more adaptive requirements.
He also made a case for intelligent automation shifting towards short-term decisions based on the most recent live events: business value is typically much higher when the response time is as small as possible and decisions are based on the most recent data, rather than calculating the best approach long upfront (much like Waze can give me a much better route than just finding the shortest route upfront).
Or as Nathaniel summarized it himself:

But we are responsible for making that happen, as “the best way to predict the future is to create it”!




Technology combinations that digitally deliver
Jim Sinur

Jim is making a case for open “Digital Business Platforms” where emerging digital technologies can be combined and are at the basis for digital transformation.  Quite a few of these exist and could be considered proven solutions but typically only in specific areas (e.g. AI, analytics, BPM, collaboration, cloud or IoT).

He then described how combining some of these creates so-called productive pairs: for example CJM (customer journey mapping) + BPM, BPM + AI / RPA / PM (process mining), or IoT + AI.  Or even triplets: BPM + PM + RPA, BPM + IoT + AI, Arch + Low Code + RPA, Workflow + Content + Collaboration, and Unified Communication + AI + BPM.  Combining these technologies creates a platform with particular advantages for certain sets of use cases.
Jim made a case for vendors to collaborate on this, because some of these technologies can be very complementary and improve customer experience and satisfaction.




Industry round table: how cloud architecture is redefining product suites and automation platform strategies

Next up is a panel, with participants from IBM, Bonitasoft and our own Phil Simpson from Red Hat.  This is much more difficult to summarize, so I'll just aggregate some of the ideas.

Why / how cloud?
  • Unified platform that gives access to independent services in a containerized way
  • Public / private / hybrid and multi-cloud
  • Componentisation to guarantee elasticity of the solution
  • Pick and choose the features you need rather than installing one big monolith
  • Making the platform easily available where you are measuring
Vendor-neutral cloud strategy?
    • Standards can be useful to achieve this to some extent
  • There will always be differentiators built on top
  • It's going to be hard to do this without more standardization (e.g. common data model)
Partner ecosystem?
  • Use of an integration and api mgmt solution to be less dependent on specific integration
  • Partner ecosystem is changing, from partners implementing solutions to partners offering value-add on top
  • Cloud as a vehicle towards a unified target ecosystem, open-source as a way to generate collaboration
  • System integrators are evolving to become more like consultants delivering best practices, rather than tackling the technical challenges alone
  • Partners build relationships and focus on specific verticals or areas
Do the standards invented decades ago need to evolve to adapt to this new reality?
  • The advantage is that standards like BPMN and DMN are independent of the technology used to do the actual orchestration
  • Connection to events is missing?
  • Running into the limitations on how the standards can be interpreted vs how they were written, which leads to various challenges

Re-aligning BPM in the age of intelligent automation
Malcolm Ross, Appian

Appian believes that Intelligent Automation needs to be an effort to seamlessly offer RPA, AI, integration and BPM together (rather than isolated silos for each of them).  BPM is the glue that brings all of this together: for example, it combines RPA with humans, AI optimizes BPM, and BPM integrates systems.

The demo shows an invoice processing application that is being enhanced with RPA.  After attaching my invoice in the UI, I can send it to the robot automation desktop, where data can be extracted and, for example, uploaded to an FTP site.  BPM can be used to handle exceptions during this part (creating new tasks for a human to solve manually).  Data from both BPM and RPA also needs to be combined into a holistic view of what is happening.
From the authoring point of view, you can set up integrations separately using various connectors (to for example RPA systems but for example also Google NLP) and then use these by calling them from the process.
Interesting analogy on why and when to use RPA: RPA is like ibuprofen, where service integration would be like amoxicillin.  Ibuprofen solves the short-term pain and is easily accessible, while amoxicillin is much more difficult to get but solves the issue at the root by killing the infection.  There is clearly a market for both.

Thursday, April 11, 2019

bpmNEXT 2019 and Red Hat Summit 2019

In the next few months, I will have the opportunity to present at both bpmNEXT and Red Hat Summit.

bpmNEXT

Next week (April 15 - 17), bpmNEXT is taking place again in Santa Barbara, where lots of vendors in the BPM space (or whatever you prefer calling it nowadays - business automation, workflow, orchestration) are coming to showcase and discuss some of their latest achievements.  Check out the conference agenda for the full schedule.  I will be presenting on Wednesday on:

Automating Human-Centric Processes with Machine Learning
Kris Verlaenen, Red Hat
Many business processes involve human actors to perform some of the steps that are required to achieve the business goal.  In this context, human actors are typically expensive, can cause unwanted delay or become a bottleneck.  Automating some of these tasks can have a tremendous return on investment, and Machine Learning brings the missing bits to seamlessly automate work initially assigned to humans as soon as there is enough confidence in the expected outputs.  By integrating Machine Learning into our Red Hat Business Automation portfolio, customers can gradually start using Machine Learning to assist or gradually even replace the human actor(s) in a very simple manner, without even having to make any changes to the process definitions in their organisation.
Last year, the recordings were available almost immediately, so that should give everyone an opportunity to take advantage of the great presentations that are typically shown there.  I will also blog my impressions during the event, as in previous years.



 

I will also attend Red Hat Summit this year, which is taking place in Boston on May 7-9.  I have a mini-theater session where I will be talking about all the work we have been doing on our next-generation architecture for cloud-native business automation.  We have a lot of exciting things in the works, so I'm really glad we'll be able to share them with everyone soon.

Automating business operations in a hybrid cloud world
Kris Verlaenen, Red Hat
Business automation helps you automate the many processes and decisions in your applications. In the context of a hybrid cloud, we have been working on our next generation architecture to support process automation and decision management in a true cloud-native manner, taking full advantage of the cloud infrastructure and many of the recent technical advances in that context.  We will demonstrate how business automation simplifies building your own domain-specific applications, leveraging extremely small and efficient execution yet still taking advantage of a lot of the capabilities business automation could offer you as well (from managing human interaction, auditing to monitoring and admin operations).
If you are attending, feel free to reach out if you want to meet up!  But we'll share the same information with the wider community as well, of course, so stay tuned!