
Friday, April 24, 2020

Virtual Red Hat Summit 2020, April 28-29

Next week Red Hat Summit 2020 will be held, not in San Francisco as we were hoping, but as a virtual event.  While this unfortunately means we won't be able to meet in person, a lot of the keynotes and breakout sessions will be held online.

Virtual Red Hat Summit is completely FREE, so if you haven't done so yet, register today!

Below is an overview of various sessions around business automation.  So whether you're looking for the latest news on Kogito, our next-gen cloud-native business automation toolkit, want to learn how to leverage Red Hat Process Automation Manager and Decision Manager for use cases that involve microservice orchestration or machine learning, or want to hear from our customers, there is something for you.  But take a look at the full agenda as well.

There will also be an opportunity to come and chat with us in the community area.  After signing in, click Explore and open up the "Middleware & cloud applications" Community Central chat room to ask questions!  Or you can just join our KIE chat channels we announced recently anytime.

Below is the list of presentations around business automation that I am aware of!

The state-of-the-art of developer tools to build business-intelligent apps for RHPAM v7 and Kogito
Eder Ignatowicz (Red Hat), Alex Porcelli (Red Hat)

Empowering Amadeus’ competitive advantage with cloud-native decision making on Quarkus
Matteo Casalino (Amadeus), Giacomo Margaria (Amadeus), Mario Fusco (Red Hat)

Modern business workflows as microservices: How we won with Red Hat Process Automation Manager
Mauro Vocale (Red Hat), Giovanni Marigi (Red Hat)

Why building intelligent cloud-native business applications is easier with Kogito
Kris Verlaenen (Red Hat)

Cloud, sweet cloud: Feeling at home with serverless decision making using Kogito and Camel-K
Daniele Zonca (Red Hat), Edoardo Vacchi (Red Hat), Luca Burgazzoli (Red Hat)

Integrating scalable machine learning into business workflows
Rui Vieira (Red Hat)

Solve the unsolvable: Why artificial intelligent systems can solve planning problems better than humans
Satish Kale (Red Hat), Geoffrey De Smet (Red Hat)

Transforming decision automation to be cloud-based and FaaS-like at BBVA
Antonio Valle Gutierrez (BBVA), Beatriz Alzola (BBVA), Marcos Regidor (Red Hat)
This one is available on demand, so there is no specific time slot.

Wednesday, November 20, 2019

Kogito deep dive video from Devoxx

This year at Devoxx Belgium, Maciej, Edoardo and Mario held a 3-hour deep dive on Kogito.  Since Devoxx is awesome enough to share the recordings of all their presentations online, I wanted to give everyone the opportunity to go and watch it!



I also had the opportunity to help out at the Red Hat booth for 2 days, which was a great chance to sync up with a lot of people and do some Kogito evangelization.  And I was there live for the big announcement of the Quarkus 1.0 release!



Wednesday, April 24, 2019

bpmNEXT 2019 impressions, day 3

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


The last (half) day, where I will be presenting as well (as the 3rd talk of the day).

A Well-Mixed Cocktail: Blending Decision and RPA Technologies in 1st Gen Design Patterns
Lloyd Dugan

Lloyd introduced an RPA-enabled case management platform used in the context of determining eligibility for the Affordable Care Act. Using Sapiens for decisions and Appian for BPM, approximately 4000 people use this as a work management application (where work is assigned to people so they can work through it).  To achieve higher throughput, they combined this with RPA robots that emulate the behavior of the users.  He showed (unfortunately in a prerecorded video, not a live demo) how they implemented the robots to perform some of the work (up to 50% of the total work done by the users!). The robots learned how to soft-fail if there were issues (in which case the work would go back into the queue), needed to accommodate for latency, etc.




Emergent Synthetic Process
Keith Swenson - Fujitsu

Keith presented a way to customize processes to different contexts (for example slightly different regulations / approaches in different countries) by being able to generate a customized process for your specific context (when you start the process).  Rather than encoding processes in a procedural manner (after A do B), he is using "service descriptions" to define the tasks and the preconditions. You can then generate a process from this by specifying your goal and context and working backwards to create a customized process from this.  This allows you to add new tasks to these processes easily (as this is much more declarative logic and therefore additive).
The demo showed a travel application with approval by different people. Service descriptions can have required tasks, required data, etc.  The process is generated by working backwards from the goal, adding required steps one by one.  Different countries can add their own steps, leading to small customizations in the generated process.
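The backward-generation idea can be sketched in a few lines (all names and the data model here are hypothetical, not Fujitsu's actual implementation): each service description declares what data it requires and what it produces, and a plan is assembled by working backwards from the goal.

```java
import java.util.*;

// Hypothetical sketch of "emergent synthetic" process generation: tasks are
// declared as service descriptions (data they require, data they produce),
// and a process is assembled by working backwards from the goal.
public class SyntheticProcess {
    record Service(String name, Set<String> requires, Set<String> produces) {}

    // Work backwards from the goal data items, prepending whichever service
    // produces each still-missing item, until only the initial context remains.
    static List<String> generate(Set<String> goal, Set<String> context, List<Service> services) {
        List<String> plan = new ArrayList<>();
        Deque<String> missing = new ArrayDeque<>(goal);
        Set<String> seen = new HashSet<>();
        while (!missing.isEmpty()) {
            String item = missing.pop();
            if (context.contains(item) || !seen.add(item)) continue;
            Service s = services.stream()
                    .filter(sv -> sv.produces().contains(item))
                    .findFirst()
                    .orElseThrow(() -> new IllegalStateException("no service produces " + item));
            if (!plan.contains(s.name())) plan.add(0, s.name());  // prepend: it must run earlier
            s.requires().forEach(missing::push);                  // its inputs become new subgoals
        }
        return plan;
    }

    public static void main(String[] args) {
        List<Service> services = List.of(
                new Service("file travel request", Set.of(), Set.of("travel request")),
                new Service("get manager approval", Set.of("travel request"), Set.of("approval")),
                new Service("book travel", Set.of("approval"), Set.of("booking")));
        // A country-specific variant only needs to register an extra description.
        System.out.println(generate(Set.of("booking"), Set.of(), services));
    }
}
```

Because the logic is declarative, adding a country-specific step is just one more service description rather than a change to a hard-wired flow.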




Automating Human-Centric Processes with Machine Learning
Kris Verlaenen - Red Hat

I was up next!  I presented on how to combine Process Automation and Machine Learning (ML) to create a platform that combines the benefits of encoding business logic using business processes, rules, etc., but at the same time can become more intelligent over time by observing and learning from the data during execution.  The focus was on introducing "non-intrusive" ways of combining processes with ML, to assist users with performing their tasks rather than trying to replace them.
The demo was using the it-orders application (one of our out-of-the-box case management demos that employees can use to order laptops) that focused on 3 main use cases:
  • Augmenting task data:  While human actors are performing tasks in your processes or cases, we can observe the data and try to predict task outcomes based on task inputs.  Once the ML algorithm (a Random Forest, with the SMILE library as the implementation) has been trained a little, it can start augmenting the data with possible predictions, but also with the confidence it has in that prediction, the relative importance of the input parameters, etc.  In this case, the manager approving the order would be able to see this augmented data in their task form and use it to make the right decision.
  • Recommending tasks:  Case management allows users to add additional dynamic tasks to running cases (even though they weren't modeled in the case upfront) in specific situations.  Similarly, these can be monitored and ML can be used to detect patterns.  These can be turned into recommendations, where a user is presented with a recommendation to do (or assign) a task based on what the ML algorithm has learned.  This can significantly help users not forget things, or assist them by preparing most of the work (they simply have to accept the recommendation).
  • Optimizing processes based on ML: One of the advantages of the Random Forest algorithm is that you can inspect the decision trees that are being trained to see what they have learned so far.  Since ML also has disadvantages (that it can be biased or that it is simply learning from what is being done, which is not necessarily correct behavior), analyzing what was learned so far and integrating this back into the process (and/or rules etc.) has significant advantages as well.   We extended the existing case with additional logic (like for example an additional decision service to determine whether some manager approvals could be automated, or additional ad-hoc tasks included in the case that would be triggered under certain circumstances), so that some of the patterns detected by ML would be encoded and enforced by the case logic itself.
These non-intrusive ways of combining processes with ML are very complementary (they allow us to take advantage of both approaches, which mitigates some of the disadvantages of ML) and allow users to start getting the advantages of ML and build up confidence in small, incremental steps.
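As a rough illustration of the first use case (a hand-rolled sketch, not the actual jBPM/SMILE integration: the interface, the toy frequency "model" and the 0.9 threshold are all made up for this example), a prediction service observes completed tasks and only suggests or automates an outcome once its confidence is high enough:

```java
import java.util.*;

// Hypothetical prediction-service hook: train on completed tasks, and for new
// tasks return a predicted outcome plus a confidence score. The engine only
// auto-completes above a threshold; otherwise the prediction just augments
// the task form shown to the human actor.
public class TaskPrediction {
    record Prediction(Map<String, Object> outcome, double confidence) {}

    interface PredictionService {
        Prediction predict(Map<String, Object> taskInputs);
        void train(Map<String, Object> taskInputs, Map<String, Object> taskOutputs);
    }

    // Toy "model": remembers how often a given input combination was approved.
    static class FrequencyModel implements PredictionService {
        private final Map<String, int[]> counts = new HashMap<>(); // key -> [approved, total]

        public void train(Map<String, Object> in, Map<String, Object> out) {
            int[] c = counts.computeIfAbsent(String.valueOf(in), k -> new int[2]);
            if (Boolean.TRUE.equals(out.get("approved"))) c[0]++;
            c[1]++;
        }

        public Prediction predict(Map<String, Object> in) {
            int[] c = counts.getOrDefault(String.valueOf(in), new int[2]);
            if (c[1] == 0) return new Prediction(Map.of(), 0.0);
            boolean approved = c[0] * 2 >= c[1];
            double conf = Math.max(c[0], c[1] - c[0]) / (double) c[1];
            return new Prediction(Map.of("approved", approved), conf);
        }
    }

    public static void main(String[] args) {
        PredictionService svc = new FrequencyModel();
        Map<String, Object> order = Map.of("item", "laptop", "price", 1200);
        for (int i = 0; i < 10; i++) svc.train(order, Map.of("approved", true));

        Prediction p = svc.predict(order);
        // Confidence-gated: auto-complete only above the threshold.
        String action = p.confidence() >= 0.9 ? "auto-complete" : "augment form";
        System.out.println(p.outcome() + " conf=" + p.confidence() + " -> " + action);
    }
}
```

The threshold is what keeps this non-intrusive: below it, the human stays in the loop and the prediction is just extra information on the form.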




ML, Conversational UX, and Intelligence in BPM
Andre Hofeditz, Seshadri Sreeniva - SAP SE

SAP is presenting "live processes" that are created by combining predefined building blocks, running on their platform with support for conversational user experience, decision management, task inbox, etc.  

SAP API Business Hub has been extended to also include live processes. Using an employee onboarding scenario, they show how a running instance can be "configured" (only in specific situations, which you can define during authoring), after which you can change the template and generate a new variant.  The process visibility workbench allows you to generate a customizable UI for monitoring the progress of your processes.
Next, they show how you can extend the platform by using recipes, which can be imported in SAP web IDE and deployed into the platform, adding additional capabilities that will be available in your live processes from that point forward.
Finally, they showed an intelligent assistant that is a sort of chatbot that can respond to voice.  It can give an aggregated view of your tasks, complete the tasks through the conversational UI, etc.  They showed how the chatbot can be programmed by defining tasks with triggers, requirements and actions, which can then be deployed as a microservice on the SAP cloud.




DMN TCK
Keith Swenson 

Keith explained the efforts going into the DMN TCK, a set of tests to verify the compliance of DMN engines.  The runner takes a large number of models and test cases (currently over a thousand and still growing) and checks the results.  He explained some of the challenges and opportunities in this context (e.g. error handling).
While many vendors claim DMN compatibility, Red Hat is one of the few vendors that actually has the results to prove it!



That concludes bpmNEXT 2019!  As in previous years, I very much enjoyed the presentations, but probably even more the discussions during the breakouts and evenings.

Wednesday, April 17, 2019

bpmNEXT 2019 impressions, day 2 (part 2)

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


BPM, Serverless and Microservices: Innovative Scaling on the Cloud
Thomas Bouffard, Philippe Laumay - Bonitasoft

Bonitasoft is explaining how they are containerizing their BPM platform as micro-services and using new technology like serverless.

The demo shows a payroll process with some human interaction.  They simulate a load of 100 simultaneous users using a script; when they do this, the UI itself becomes unresponsive.  After containerizing their application, they use Kubernetes to scale the engine up to 3 pods, increasing the capacity the engine can handle.
In a second step, they externalize some of the work that the process is doing as an asynchronous lambda function.  This moves some of the CPU usage outside of the BPM platform, making it easier for the engine itself to scale.




Performance Management for Robots
Mark McGregor and Alessandro Manzi - Signavio

Signavio is sharing their strategy on combining robots with human actors and how to manage your robotic workforce.  They recommend treating robots similar to human resources, by making sure they have clear job descriptions, are evaluated and if necessary fired.

Signavio has traditionally offered tools to analyze the performance of your existing processes, which they can for example use to identify tasks that would be appropriate for applying RPA.  Once identified, simulation allows you to figure out what the consequences would be of applying robots to perform some of the work.  For example, the solution using robots might have a higher cost (due to licenses, for example) but decrease cycle time, allowing you to make a conscious decision.  Detailed analysis of the performance of the robots once applied could lead to the robot being fired if it's performing badly.




The Case of the Intentional Process
Paul Holmes-Higgin, Micha Kiener, Flowable

Flowable is using "micro-processes" to manage chatbots and describe their behavior.  Around these micro-processes (using BPMN), they use case mgmt (CMMN) to link different chatbots together to make sure they are being used in the right context.

In the demo they have different types of chatbots, to assist banking customers with various kinds of functions.  In their cases they model "intents" that the chatbot will be trying to detect, and how to respond in case the intent should be executed (using signal events that will trigger a specific process fragment), all context-aware.
Using a chatbot conversational UI, you can run through a process, where the chatbot asks for all the appropriate information step by step by following the process and collecting the results.  If necessary, it can recommend switching to a different chatbot or a real human user.  The chatbot interface also supports various commands (e.g. /create task ...), and shows other relevant information (e.g. open tasks) in the UI.
When trying to continue the conversation through WhatsApp (which has the limitation you can only show text, no buttons), the chatbot is smart enough to be aware of those limitations and fall back to text-based replies (e.g. type "yes" or "no") instead of buttons or forms.
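The intent-to-fragment routing and the channel fallback could be sketched roughly like this (a toy keyword matcher with hypothetical names, not Flowable's actual mechanism):

```java
import java.util.*;

// Minimal sketch (all names hypothetical) of the two ideas above: keyword-based
// intent detection that triggers a process fragment, and a channel-aware reply
// that falls back to plain text when the channel cannot render buttons.
public class ChatbotIntents {
    record Channel(String name, boolean supportsButtons) {}

    static final Map<String, String> INTENT_TO_FRAGMENT = Map.of(
            "block card", "card-blocking-process",
            "open account", "account-opening-process");

    static String detectIntent(String message) {
        String lower = message.toLowerCase();
        return INTENT_TO_FRAGMENT.keySet().stream()
                .filter(lower::contains)
                .findFirst()
                .map(INTENT_TO_FRAGMENT::get)
                .orElse("handover-to-human");  // no intent matched: escalate
    }

    static String confirm(String question, Channel channel) {
        // Text-only channels (like WhatsApp) cannot render buttons or forms,
        // so fall back to a typed yes/no reply.
        return channel.supportsButtons()
                ? question + " [Yes] [No]"
                : question + " (type \"yes\" or \"no\")";
    }

    public static void main(String[] args) {
        System.out.println(detectIntent("I need to block card, it was stolen"));
        System.out.println(confirm("Block your card now?", new Channel("whatsapp", false)));
    }
}
```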




Industry Round Table: Advancing the Value Proposition of 'Intelligent Automation'

Nathaniel is launching a panel discussion about the name: should we move away from BPM to "Intelligent Automation"?
  • Even though some products offer a unified platform to apply BPM, DM, RPA, AI and integration, customers are often still doing it in a siloed way, customers need to think more holistically
  • Intelligent might refer more to A.I., automation might refer more to RPA -> taking advantage of some of the hype around these technologies to sell the portfolio.
  • We might be selling BPM technology, but we might be marketing more a vision that is broader than that.
  • BPM is not just technology, it's a methodology
  • Intelligent Process Automation?
  • workflow or orchestration is becoming more popular again (for the technology) but is less marketable
  • What are we doing?
    • Technology to help by automating some of the work
    • "Free the humans"
That concludes the second day; half a day left tomorrow, where I will be presenting.

Tuesday, April 16, 2019

bpmNEXT 2019 impressions, day 2

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


Keynote: Best of Breed: Rolling Your Own Digital Automation Platform using BPMS and Microservices
Sandy Kemsley

Sandy Kemsley (who is also blogging about the various presentations) is starting off day 2 with a keynote on how large customers are building their own digital automation platforms in house, leveraging available technologies like BPM.  Nowadays, a best-of-breed approach is replacing the legacy "monolith".  In the last decade, the BPMS became the new monolith because it was trying to fill a gap in app development (with constantly increasing requirements around forms, graphical modeling, BAM, etc.), which led to large suites including one specific solution for each of these requirements.  Agility, however, is a new competitive differentiator.
The new Digital Automation Platform is much more a (dynamic) collection of independent microservices, where the best-of-breed approach allows you to swap services in or out.


This might not be the solution for everyone (yet), but could be interesting for small to mid-sized companies looking for a COTS system to manage core processes, or for large companies with a large development team.
As a lesson for vendors, she recommends separating components and pricing accordingly, and making sure you can build microservices for your processes and decisions.




Business Automation as a Service
Denis Gagne - Trisotech

Trisotech is presenting their business automation as a service offering, allowing business users to express their logic in a simple way and now execute it directly as well.

The demo starts with a simple process to turn on the light when a twitter message is received.  After defining this simple process within the tool, it's deployed into the cloud with one click, and the lamp he brought on stage starts flashing every time someone tweets #Trisotech.  Next, the process is extended to include a sentiment analysis service, to analyze the text included in the tweet, after which the light starts flashing green (or yellow or red) for every positive (or neutral or negative) tweet.
Next, a more complex example is used to track customer leads.  When going to a demo website, you can register your details and the process will route your request to the right sales person, email you the slides and register you in the CRM system.

Trisotech is working closely with Red Hat, so it's great to see how they have built this tool to allow people to quickly create and deploy processes and decisions into the cloud.




Business-Composable Services for the Mortgage Industry     
Bruce Silver - Method and Style

Bruce shares his experience of using Trisotech to model a use case in the mortgage industry, using a combination of BPMN and DMN.

Applying for a loan requires quite complex decision models to determine eligibility, loan amount, etc.  The mortgage industry has standardized quite a lot of this, which enables creating some form of reusable service.  A DMN model is used to describe the logic, using FEEL for the expressions.  While the logic is sometimes complex, the resulting model should still be understandable by domain experts.
The data used as input is a standardized XML format (MISMO), which is mapped to a more DMN-friendly format (including validation etc.) using a separate process that is deployed as a service as well.  Similarly, the input can also be a PDF file, in which case a different process is used to extract the data.  Using a simple test web page to provide the inputs (generated as part of the process deployment), the service produces the expected results.




Industry Round Table: The Coming Impact of Decision Services and Machine Learning on Business Automation

Another panel, this time focusing on decision services and A.I.
  • Consensus (at least here it seems) that decision management has great synergy with process automation.
  • Standards are really important, although not all vendors are using BPMN or DMN, which is fine
    • DMN is not backed as much by the big vendors (Red Hat is one of them though), so its future is still much less clear
  • Need to define and demystify A.I. as there are various types of intelligence
  • Challenges with "black box" A.I. that cannot clearly explain why
  • Ethical considerations
    • Automation is disrupting labor force
    • Some decisions are now being implemented in cold hard code
  • The required skillset to deal with A.I. is only increasing

bpmNEXT 2019 impressions, day 1 part 2

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


Democratizing machine learning with BPM
Scott Menter - BP Logic

In the demo from BP Logix, they show how they have integrated Machine Learning into their Process Director to start using it in combination with processes.

In this case we are trying to make a prediction on employee attrition (whether they are likely to leave the company).  You start by creating a learner object. After selecting a datasource (a database) and possibly some transformation, you can select which inputs you want to use (giving information or even suggestions on the available data, or visualizing characteristics about the data you selected) and train the model with the selected data.
The trained model can then be used, for example, in a form to show the potential attrition rate while you are filling in information about an employee.  Or it can be used in a process to drive a decision.

By integrating the learner objects into the Process Director, the learning curve to start using this is much lower, as it's all integrated in one solution (even if the learner objects might actually be encoded by a different actor).




Leveraging process mining to enable human and robot collaboration
Michal Rosik - Minit

Minit suggests using process mining to improve your RPA strategy.  The strategy is two-fold: (1) use it to pick the right process to apply RPA to, and select the right activity and person, to get a higher degree of success (as 40% of RPA projects fail); and (2) monitor the results to make sure everyone is happy.

They apply this to a purchase process, where various bottlenecks are detected around filling in the right order number, etc. (using standard process mining).  They allow you to drill down several layers to inspect the details of the selected activity: how, for example, the human actor uses a combination of the browser, Skype, etc., and the steps they take (possibly in multiple variations) to get the necessary information.  These detailed steps could then be used as a basis to generate the RPA script.
After applying the RPA robots to automate some of the steps, the same process mining can be used to monitor and compare the results.  For example, the average completion time might not have improved as expected, in which case we can analyze why that might be (for example that the bots are creating an increased load on the system, causing performance issues).
Finally, Minit dashboards expose all this information in interactive BI charts.




Process mining and DTO - How to derive business rules and ROI from the data
Massimiliano Delsante, Luca Fontanili - Cognitive Technology

Cognitive Technology is moving from traditional process mining to creating a Digital Twin of your Organization (DTO).  This includes process discovery, cost analysis, simulation, etc. but for example also a new feature to derive actual business rules from the data (rather than traditional probabilities).

The demo is showing the use case of closing a bank account.  They can generate a BPMN diagram from the mining data, but now they even detect correlations for decisions (gateways) using machine learning, to also discover the conditions that are underlying.  After verification by a human actor and/or simulation, these conditions can be added to the process.  The decision can also be extracted separately using DMN, called from the process model. Finally, simulation can be used to identify possible improvements by applying for example RPA to automate some of the tasks: the simulation engine can generate new data with the suggested improvements, and this data can then be mined again to verify the results.
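The rule-derivation step can be illustrated with a toy example (a one-level decision stump with made-up data, far simpler than what the product actually does): given mined observations of which branch a gateway took, search for the numeric threshold that best separates them.

```java
import java.util.*;

// Hedged sketch of deriving a gateway condition from mined data: for each
// (amount, branchTaken) observation at a decision point, try each observed
// amount as a candidate threshold and keep the one that classifies the most
// observations correctly. Real tools use richer ML; this just shows the idea.
public class RuleDiscovery {
    record Observation(double amount, String branch) {}

    static double bestThreshold(List<Observation> data, String branchAbove) {
        double best = Double.NaN;
        int bestCorrect = -1;
        for (Observation cand : data) {                 // candidate split points
            double t = cand.amount();
            int correct = 0;
            for (Observation o : data) {
                String predicted = o.amount() >= t ? branchAbove : "other";
                String actual = o.branch().equals(branchAbove) ? branchAbove : "other";
                if (predicted.equals(actual)) correct++;
            }
            if (correct > bestCorrect) { bestCorrect = correct; best = t; }
        }
        return best;
    }

    public static void main(String[] args) {
        // Mined instances of "close account" cases: large balances were escalated.
        List<Observation> log = List.of(
                new Observation(100, "auto-close"),
                new Observation(900, "auto-close"),
                new Observation(5000, "manual-review"),
                new Observation(12000, "manual-review"));
        double t = bestThreshold(log, "manual-review");
        System.out.println("amount >= " + t + " -> manual-review");
    }
}
```

A condition recovered this way would still go through human verification and/or simulation, as described above, before being added to the process or a DMN decision.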




Is the Citizen Developer Story a Fairytale?
Neil Miller, KissFlow

KissFlow is a no-code platform for citizen developers.  Neil starts by showing the runtime application first, with various kinds of forms to start a process, track current state, perform work, etc.  These forms have various pretty advanced features for loading form data, printing, getting assistance, etc.
Next, we shifted to the tool used to create this.  First the forms: composed of various field types, from text fields and dropdowns to tables and advanced fields like signatures.  The process itself is built in a drag-and-drop tool using a quite different visualization that is still a flow chart but tries to be as simple as possible for citizen developers (with inline editing of actions inside the diagram, etc. - which reminded me a lot of Zapier for defining integrations).
They are also working on a new KissFlow version 3.0, which will be available soon.  The forms and process modeling still look pretty similar, but this new version is adding various features to simplify collaboration, having things like threads where people are collaborating, more adaptive processes, using kanban boards, more extensive reports, etc.




Insightful process analysis
Jude Chagas Pereira, Frank Kowalkowski, Gil Laware

Wizly is a tool that allows you to run analysis on collected log data, to do things like compliance checks, correlation checks, relationship and sentiment analysis, etc.

The demo shows a call center use case.  After loading the data of about 2000 cases into the tool, the flow model can be generated from the log data and you can start running analytics.  The compliance analysis shows various information about the paths that are being executed (or not).  Next, we can run further analysis, in this case zooming in on baggage-related problems.  This allows us to find possible causes (like canceled flights) but also to filter down to get even more insights.
DNA analysis detects possible paths and can visualize relations between your data (with the capability to filter further down if necessary).  Finally, fourbox plots the data on some form of bubble chart.  They were only able to show some of the features, as they explained they have a lot more analytical capabilities under the hood.




Improving the execution of work with an AI driven automation platform
Kramer Reeves, Michael Lim, Jeff Goodhue - IBM

IBM has worked hard in the last few years to integrate some of their offerings into one unified platform, that they are presenting here.

This demo starts with the authoring, where the case builder, process designer and decision center are combined to define the business logic.  Next, we switched to the runtime UI, where new cases can be started and managed, and we ran through a few steps of the case.
Next they showed some more advanced integrations: a robot is launched to automatically perform one of the steps, interaction with a chatbot to help find the data I need, analysis charts to help with the decision making, etc.  The final step is to use Watson AI to make recommendations.
Finally, we got a look of the new Business Automation Studio, where you can build business applications in a low-code manner.  You can create forms for business users, and these can be linked (by associating actions with the different buttons) to call new pages or backend functions.




Wrapping up
That concludes day 1 (at least for you; we still have a wine and beer tasting and dinner :-)).  If you are interested in another view of what's happening here, be sure to check out Sandy Kemsley's blog; she is blogging about the different presentations as well and has a lot more experience doing this, as an independent analyst covering BPM and related technologies for many years :)

Monday, April 15, 2019

bpmNEXT 2019 impressions

Back again this year in Santa Barbara for 3 days of discussions at bpmNEXT between (mostly) vendors on what we collectively believe the future might look like, and what challenges we face and how we can solve them.

And I will be blogging about my impressions of the presentations here.

This is part of a 5-part blog series on bpmNEXT 2019:
Day 1
Day 1 (part 2)
Day 2
Day 2 (part 2)
Day 3


BPM 2019-2023: Outlook for the next five years 
Nathaniel Palmer

Nathaniel is kicking it off with his traditional presentation looking forward and predicting what we might see in the next few years in the context of “Intelligent Automation” (yes, the conference is still called bpmNEXT, but the consensus seems to be that this term better covers what we are discussing here).  He actually started by looking back at the predictions from 5 years ago, where he predicted the 3 R’s (Rules, Relationships and Robots).  This all seemed fairly accurate, as “robots” are here now, and they definitely need rules. And the data is scattered across many systems (over 13 systems on average, many external) that all need to be related. Some of the 2019 predictions are that by 2022,
  • 50% of the work will be done by robots
  • 70% of the work will be done on third-party cloud platforms: that means the “Intelligent Automation Platform” architecture that was presented for the first time a few years ago has been updated a little to reflect this (the event bus is now much more inherently part of the cloud itself)
  • 80% of the user interaction will be done through an interface other than the smart phone (think smart speakers), moving from a worklist metaphor to much more conversational interaction.
Since the concept of work is now much broader (including robots, autonomous intelligence, decision services, etc.), what’s the best way to represent and model it, given that traditional flow charts have reached their limits in representing more adaptive requirements?
He also made a case for intelligent automation to shift to making much more short-term decisions based on the most recent live events, as the business value is typically much higher if the response time is as small as possible and based on the most recent data, rather than calculating the best approach long upfront (like how Waze can give me a much better route than just finding the shortest route upfront).
Or as Nathaniel summarized it himself:

But we are responsible for making that happen, as “the best way to predict the future is to create it”!




Technology combinations that digitally deliver
Jim Sinur

Jim is making a case for open “Digital Business Platforms” where emerging digital technologies can be combined and are at the basis for digital transformation.  Quite a few of these exist and could be considered proven solutions but typically only in specific areas (e.g. AI, analytics, BPM, collaboration, cloud or IoT).

He then described how combining some of these creates so-called productive pairs: for example CJM (customer journey mapping) + BPM, BPM + AI / RPA / PM (process mining), or IoT + AI.  Or even triplets: BPM + PM + RPA, BPM + IoT + AI, Arch + Low Code + RPA, Workflow + Content + Collaboration, and Unified Communication + AI + BPM.  Combining these technologies creates a platform with particular advantages for achieving certain sets of use cases.
Jim made a case for vendors to collaborate on this, because some of these technologies can be very complementary and improve customer experience and satisfaction.




Industry round table: how cloud architecture is redefining product suites and automation platform strategies

Next is a panel, with participants from IBM, Bonitasoft and our own Phil Simpson from Red Hat.  This is much more difficult to summarize, so I'll just aggregate some of the ideas.

Why / how cloud?
  • Unified platform that gives access to independent services in a containerized way
  • Public / private / hybrid and multi-cloud
  • Componentisation to guarantee elasticity of the solution
  • Pick and choose the features you need rather than installing one big monolith
  • Making the platform easily available where you are measuring
Vendor-neutral cloud strategy?
  • Standards can be useful to achieve this to some extent
  • There will always be differentiators built on top
  • It's going to be hard to do this without more standardization (e.g. common data model)
Partner ecosystem?
  • Use of an integration and api mgmt solution to be less dependent on specific integration
  • Partner ecosystem is changing, from partners implementing solutions to partners offering value-add on top
  • Cloud as a vehicle towards a unified target ecosystem, open-source as a way to generate collaboration
  • System integrators are evolving to become more like consultants delivering best practices rather than tackling the technical challenges alone
  • Partners build relationships and focus on specific verticals or areas
Do the standards invented decades ago need to evolve to adapt to this new reality?
  • The advantage is that standards like BPMN and DMN are independent of the technology used to do the actual orchestration
  • Connection to events is missing?
  • Running into the limitations on how the standards can be interpreted vs how they were written, which leads to various challenges

Re-aligning BPM in the age of intelligent automation
Malcolm Ross, Appian

Appian believes that Intelligent Automation needs to be an effort to seamlessly offer RPA, AI, Integration and BPM together (rather than isolated silos for each of them).  BPM is the glue that brings all of this together: for example, BPM combines RPA with human work, AI can optimize BPM, and BPM integrates systems.

The demo showed an invoice processing application that is being enhanced with RPA.  After attaching my invoice in the UI, I can send it to the robotic automation desktop, where data can be extracted and, for example, uploaded to an FTP site.  BPM can be used to handle exceptions during this part (creating new tasks for a human to solve manually).  Data from both BPM and RPA also needs to be combined into a holistic view of what is happening.
From the authoring point of view, you can set up integrations separately using various connectors (to for example RPA systems but for example also Google NLP) and then use these by calling them from the process.
Interesting analogy on why and when RPA: RPA is like ibuprofen, where service integration would be like amoxicillin.  Ibuprofen solves the short-term pain and is easily accessible, whereas amoxicillin is much harder to get but solves the issue at the root by killing the infection.  There clearly is a market for both.

Thursday, April 11, 2019

bpmNEXT 2019 and Red Hat Summit 2019

In the next few months, I will have the opportunity to present at both bpmNEXT and Red Hat Summit.

bpmNEXT

Next week (April 15 - 17), bpmNEXT is taking place again in Santa Barbara, where lots of vendors in the BPM space (or whatever you prefer calling it nowadays - business automation, workflow, orchestration) are coming to showcase and discuss some of their latest achievements.  Check out the conference agenda for the full schedule.  I will be presenting on Wednesday on:

Automating Human-Centric Processes with Machine Learning
Kris Verlaenen, Red Hat
Many business processes involve human actors to perform some of the steps that are required to achieve the business goal.  In this context, human actors are typically expensive, can cause unwanted delay or become a bottleneck.  Automating some of these tasks can have a tremendous return on investment, and Machine Learning brings the missing bits to seamlessly automate work initially assigned to humans as soon as there is enough confidence in the expected outputs.  By integrating Machine Learning into our Red Hat Business Automation portfolio, customers can gradually start using Machine Learning to assist or gradually even replace the human actor(s) in a very simple manner, without even having to make any changes to the process definitions in their organisation.
Last year, the recordings were available almost immediately, so that should give everyone an opportunity to take advantage of the great presentations that are typically shown there.  I will also blog my impressions during the event, as in previous years.



 

I will also attend Red Hat Summit this year, which is taking place in Boston on May 7-9.  I have a minitheater session where I will be talking about all the work we have been doing on our next generation architecture for cloud-native business automation.  We have a lot of exciting things we've been working on, so I'm really excited we'll be able to share this information with everyone soon.

Automating business operations in a hybrid cloud world
Kris Verlaenen, Red Hat
Business automation helps you automate the many processes and decisions in your applications. In the context of a hybrid cloud, we have been working on our next generation architecture to support process automation and decision management in a true cloud-native manner, taking full advantage of the cloud infrastructure and many of the recent technical advances in that context.  We will demonstrate how business automation simplifies building your own domain-specific applications, leveraging extremely small and efficient execution while still taking advantage of many of the capabilities business automation can offer (from managing human interaction and auditing to monitoring and admin operations).
If you are attending, feel free to reach out if you want to meet up!  But of course we'll share the same information with the wider community as well, so stay tuned !


Wednesday, May 9, 2018

BBVA is a Red Hat Innovation Award 2018 winner !


BBVA is a customer-centric, global financial services group based in Spain. BBVA chose to make Red Hat technology a key piece of its new cloud platform.  BBVA built its platform using several solutions, including Red Hat OpenStack Platform, Red Hat OpenShift Container Platform and Red Hat JBoss BPM Suite.


You can read the full success story here.

We have been working very closely with them over the last year, and I'm very excited to see all of that hard work showcased here! I have a presentation on Thursday with BBVA on this topic, so if you are attending Red Hat Summit, hope to see you there !

Antonio Valle Gutierrez (BBVA), Marcos Regidor, Kris Verlaenen (Red Hat)


[If you're not attending, I recently did a presentation + demo at bpmNEXT about our cloud strategy as well, so if you're interested in the topic, hope this might be useful as well]

Tuesday, May 8, 2018

Red Hat Summit in San Francisco 2018 (May 8th - 10th)

This year, Red Hat Summit is again in San Francisco, and I'm excited to be able to attend again.  We're kicking off today, but I personally have a presentation on Thursday afternoon (co-speaking with a customer on our BPM cloud strategy) and a jBPM Birds of a Feather session on Wednesday with Maciej (where anyone can just walk in to talk or ask questions), but more about that later.


We're kicking off with the keynotes.  If you're interested, you can watch the keynotes live at https://www.redhat.com/en/summit/2018 (or in replay later). This first keynote already includes a live demo (by Burr Sutter and team) showing a true hybrid cloud (combining private and public cloud).

And an awesome extra: this year Business Optimizer (OptaPlanner) was used for scheduling all the different sessions, and got an honorable mention from Jim Whitehurst during his keynote. 
"It really is a phenomenal tool!"
Jim Whitehurst - Red Hat President and CEO
Business Optimizer is part of our Red Hat Process Automation Manager product (formerly known as Red Hat JBoss BPM Suite).  Congratulations to the team!
 
This year has record attendance (7,000+), a ton of breakout sessions (325) and even more opportunities to talk to the experts directly.  If you are around and would like to talk, we'll be happy to see you in one of our sessions !  There are quite a few people from our team here to help you with whatever questions you might have.

Antonio Valle Gutierrez (BBVA), Marcos Regidor, Kris Verlaenen (Red Hat)

Kris Verlaenen, Maciej Swiderski (Red Hat) - Moscone West - 2103

So just reach out (at the booth or through social media) and we would love to hear your thoughts.  Hope to see you here !

Thursday, April 19, 2018

bpmNEXT 2018 day 3

Part of a series of blog posts about bpmNEXT 2018:
bpmNEXT 2018 kicking off!
bpmNEXT 2018 day 1 (part 2)
bpmNEXT 2018 day 2
bpmNEXT 2018 day 2 (part 2)
bpmNEXT 2018 day 3 

When Artificial Intelligence meets Process-Based Applications
Nicolas Chabanoles and Nathalie Cotté - Bonitasoft

Rather than just relying on reporting for insight into your processes, adding Artificial Intelligence takes it a step further towards creating a shorter feedback loop, doing predictions, etc.  Using a custom loan request application, they showed how data is extracted from the database into Elasticsearch, after which they build a predictive model from it.  Using process mining techniques they apply time-based analysis to predict the likelihood that certain requests can still meet their SLA (or not).  Based on that information the operational people can decide which requests to prioritize.


Understanding Your Models and What They Are Trying to Tell You
Tim Stephenson - Know Process

Searching and looking into your process models can become complex if you have lots of processes.  By indexing these processes, a quite nifty query language can then be used to query those models: search for processes, the data and resources involved, etc.  While the theory sounds nice, in reality it didn't always seem to be that simple.


Exploiting Cloud Infrastructure for Efficient Business Process Execution
Kris Verlaenen - Red Hat

My own presentation, about executing processes in a cloud environment in a distributed manner.  We've introduced services like controllers (to keep track of your engines running everywhere and manage them) and smart routers (to route requests to the right engine and aggregate data across them).  Our monitoring console allows you to connect to any engine out there (in this case an engine embedded in a sample order application deployed on a Minishift instance).  In the demo I showed how you can then update the SLA expectations of the embedded process and, after deploying this new version of the project, monitor for changes.


Dynamic Work Assignment
Lloyd Dugan - Serco

In the context of Obamacare, Lloyd presented a use case of improving task assignment by scoring tasks (based on eligibility, severity, etc.) and assigning them to the right people.  Replacing a model where tasks were mostly just kept in queues and workers needed to go and pick them up themselves, it also allows tasks to be put "on hold" so they temporarily don't show up in the queues.
Geoffrey would have loved to see this; the combination of process and rules was nice for solving this issue, but imho it's missing an actual constraint solving component (like OptaPlanner) to do the score calculations.

Finally, Keith Swenson presented about the DMN Technology Compatibility Toolkit (TCK).  Unfortunately I had to leave early to catch my flight, so just linking the video here.




bpmNEXT 2018 day 2 (part 2)

Part of a series of blog posts about bpmNEXT 2018:
bpmNEXT 2018 kicking off!
bpmNEXT 2018 day 1 (part 2)
bpmNEXT 2018 day 2
bpmNEXT 2018 day 2 (part 2)
bpmNEXT 2018 day 3 

RPA Enablement: Focus on Long-Term Value and Continuous Process Improvement
Massimiliano Delsante - Cognitive Technology Ltd.

The myInvenio tool can be used to discover processes based on data already collected.  It derives the process (the tasks, actors, sequence, etc.) from the data and cross-checks that with the cases that are already recorded (for example to see which are deviating, where time is spent, etc.).
This information can then be used to derive which activities might be the best candidates for automation.  By running a simulation, you can decide for example to add two robots for automating one of the steps (at least the simple cases) and keep one employee for the more complex and exceptional cases.
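Process discovery tools like this typically start from an event log and build a directly-follows relation between activities.  A toy sketch of that first step (illustrative only — the log and activity names are invented, and myInvenio's actual algorithms do far more):

```python
from collections import Counter

# Event log: each trace is the ordered list of activities for one case.
log = [
    ["receive", "check", "approve", "pay"],
    ["receive", "check", "reject"],
    ["receive", "check", "approve", "pay"],
]

# Directly-follows relation: how often activity A is followed by B.
df = Counter()
for trace in log:
    for a, b in zip(trace, trace[1:]):
        df[(a, b)] += 1

for (a, b), n in sorted(df.items()):
    print(f"{a} -> {b}: {n}")
```

From these counts a mining tool can reconstruct the control flow and flag the rare paths (here, the single `check -> reject` case) as deviations.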



Integration is Still Cool, and Core in your BPM Strategy
Ben Alexander - PMG.net

PMG provides drag-and-drop low-code processes with pre-built connectors.  The process included human tasks for approval, but also supported integration with email, phone, text, Slack, etc.  It contacts external services (like Azure ML) for risk assessment, and included some RPA integration.



Making Process Personal
Paul Holmes-Higgin and Micha Kiener - Flowable

Chat is becoming more and more important as a communication channel for customers.  Flowable showed an example of how banks are using lots of different channels to communicate with customers, like a chatbot, and using BPMN2 and CMMN during conversations.
A digital assistant constantly helps the client advisor during his conversation by creating (sub)cases, advising actions, etc.  For example, it can help enter a client address change, validate the information, ask for confirmation, send confirmation emails, involve a compliance officer if necessary, etc. Behind the scenes, the digital assistant is backed by a process (with forms etc.).  Finally, Machine Learning can be integrated to replace some of the manual steps.



Robotics, Customer Interactions, and BPM
Francois Bonnet - ITESOFT

A demo with an actual (3D-printed, open-source) robot !  Francois brought a robot with video and voice recognition capabilities.  The robot could be used, for example, in a shop for greeting clients.  Voice recognition can be used to start a process (for example when the customer comes in).  The robot can respond to several commands, follow, do face recognition, take pictures, etc., all by configuring various processes.  The voice and face recognition isn't always working perfectly yet, but interesting to see anyway !



The Future of Voice in Business Process Automation
Brandon Brown - K2

Voice recognition can be used to create a chatbot.  The chatbot can for example be used to request PTO or get your tasks (and complete or even delegate them).  But chatbots aren't great for everything.  Some data is just easier to provide in a structured form.  But even forms can be enhanced with for example sentiment analysis (to automatically update the data based on the sentiment detected from the text provided in the form).  You can then for example create standard processes for how to respond to certain sentiments.



State Machine Applied to Corporate Loans Process
Fernando Leibowich Beker - BeeckerCo

Processes can be unstructured and rely on rules to define when tasks should be triggered or not.  The demo used the IBM BPM state machine in combination with IBM ODM, where the rules define what the next state will be based on the current state and the input.
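The general pattern — a rules table mapping (current state, input) to the next state — can be sketched generically (the states and events below are invented for illustration, not taken from the demo):

```python
# Transition rules: (current_state, input_event) -> next_state
RULES = {
    ("submitted", "approve"): "under_review",
    ("submitted", "reject"): "declined",
    ("under_review", "approve"): "approved",
    ("under_review", "request_docs"): "awaiting_docs",
    ("awaiting_docs", "docs_received"): "under_review",
}

def next_state(state: str, event: str) -> str:
    # Look up the rule for this state/event combination
    try:
        return RULES[(state, event)]
    except KeyError:
        raise ValueError(f"No rule for event '{event}' in state '{state}'")

# Drive a loan case through a sequence of events
state = "submitted"
for event in ("approve", "request_docs", "docs_received", "approve"):
    state = next_state(state, event)
print(state)  # approved
```

In the product setup described above, the rules table would live in the rule engine (ODM) so business users can change the transitions without touching the process (BPM) side.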

Wednesday, April 18, 2018

bpmNEXT 2018 day 2

Part of a series of blog posts about bpmNEXT 2018:
bpmNEXT 2018 kicking off!
bpmNEXT 2018 day 1 (part 2)
bpmNEXT 2018 day 2
bpmNEXT 2018 day 2 (part 2)
bpmNEXT 2018 day 3 

An awesome surprise this year: the videos from yesterday are already available on YouTube!  So I've updated my posts from yesterday with the links - amazing job!

BPM 2018-2022: Outlook for the Next Five Years
Nathaniel Palmer

Nathaniel is starting with an outlook of where we are (maybe?) going in the next few years.  The three R's that will define BPM in his point of view are Robots, Rules and Relationships.  With everything running in the cloud.  And using Blockchain ;) 
Interaction has already significantly changed (with everyone having a smartphone), but he predicts the smartphone (as we know it) will go away in the next five years - with consumer adoption of new interfaces accelerating even more.
Robots (including any kind of smart device or service) will represent customers in various interactions, and will do a lot of the work done by employees nowadays - even autonomously. This all will have an impact on application architectures, almost introducing a 4th tier into typical 3-tier architectures.
The future-proof BPM platform (aka the Digital Transformation Platform) brings together various capabilities (like Workflow Mgmt, Decision Mgmt, Machine Learning, etc.) - possibly from different vendors - processing events from many different sources (services, IoT devices, robots, etc.).
And he ended with the advice that the best way to invent the future is to help create it !


A Next-Generation Backendless Workflow Orchestration API for ISVs
Brian Reale and Taylor Dondich, ProcessMaker

ProcessMaker is showcasing their cloud-based process service.  It exposes a REST API for interacting with it and has connectors to various external services.  The service does not come with a BPMN2 designer, but it accepts BPMN2 and offers a programmatic interface to create processes as well.  They also introduced a "simplified" designer that ISVs can use to define processes (which exports to BPMN2 underneath) but hides a lot of the more complex constructs.




CapBPM’s IQ – No-code BPM development – Turning Ideas into Value
Max Young, Capital Labs

To avoid being locked into one vendor, IQ is offering a generic web-based user interface for BPM that can be used on top of various underlying BPM platforms.  On the authoring side you can define process and data models and do different kinds of analysis.  In the end, it generates open-source application code that works with a specific product (that your developers can use as a starting point).



Monitoring Transparency for High-Volume, Next-Generation Workflows
Jakob Freund and Ryan Johnston - Camunda

Camunda is showing Zeebe, their next generation process execution platform.  The demo starts when an arbitrage opportunity is detected, and then does various risk calculations. Zeebe Simple Monitor is a web-based monitoring tool to look at deployed processes and running instances.  With Optimize you can create and look at reports based on the various events that Zeebe is generating, including charts, heat maps, alerts, etc.
And as a treat, they showed a Doom-like easter egg inside their Cockpit, where you can walk through your process "dungeon" and shoot tokens with your shotgun :)

bpmNEXT 2018 day 1 (part 2)

Part of a series of blog posts about bpmNEXT 2018:
bpmNEXT 2018 kicking off!
bpmNEXT 2018 day 1 (part 2)
bpmNEXT 2018 day 2
bpmNEXT 2018 day 2 (part 2)
bpmNEXT 2018 day 3 

Decision as a Service (DaaS): The DMN Platform Revolution
Denis Gagné - Trisotech

Denis is showing the progress Trisotech has made in offering DMN modeling and execution capabilities as a service.  The DMN Modeler is a complete modeling environment for DMN, including collaboration, simulation, test cases, searching, etc. After creating a DMN model, he showed various ways of creating a new DMN decision service to expose.
Next, this can be deployed into the cloud (including using our own Drools or Red Hat Decision Manager DMN engine).  Once deployed, it can be tested with a simple HTML form, it has a REST API, the debugging environment allows you to look at the requests that were actually made, etc.  Using an API management tool, you can add even more features like authorization.
Finally, it's of course possible to include these decision services into your processes.
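Calling such a deployed decision service typically boils down to a single HTTP POST.  A minimal sketch, assuming a hypothetical endpoint and payload shape (the real URL and input/output format depend entirely on how and where the service was deployed):

```python
import json
import urllib.request

def evaluate_decision(base_url: str, inputs: dict) -> dict:
    # POST the decision inputs as JSON and return the decoded result.
    # The "/evaluate" path is an assumption, not a documented API.
    req = urllib.request.Request(
        f"{base_url}/evaluate",
        data=json.dumps(inputs).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example call (would require a real deployed service):
# result = evaluate_decision("https://example.com/dmn/loan-approval",
#                            {"Applicant Age": 35, "Credit Score": 720})
```

This request/response shape is also what makes it easy to invoke the decision from a process: a service task just needs the endpoint and the input mapping.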



Timing the Stock Market with DMN
Bruce Silver - methodandstyle.com

Bruce implemented a DMN model for predicting when to buy and sell stocks.  Based on historical stock data, it uses a DMN model to detect patterns (based on local min and max, smoothing, etc.).  This service is then orchestrated by a process (using Microsoft Flow) that downloads 1 year of data for specific stocks, processes it and presents the results - using various connectors (to get information from and into Excel, call the REST decision service, etc.).  His goal was to show how a non-programmer like himself can use DMN for real-life use cases that can then be fully executed. And you should buy his DMN Cookbook for all the details :)
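The local-min/max detection his model builds on can be illustrated with a naive sketch (no smoothing, and not Bruce's actual logic — just the basic idea of marking turning points in a price series):

```python
def local_extrema(series: list[float]) -> list[tuple[int, str]]:
    # Mark indices that are strictly higher ("max") or strictly
    # lower ("min") than both neighbours.
    out = []
    for i in range(1, len(series) - 1):
        prev, cur, nxt = series[i - 1], series[i], series[i + 1]
        if cur > prev and cur > nxt:
            out.append((i, "max"))
        elif cur < prev and cur < nxt:
            out.append((i, "min"))
    return out

print(local_extrema([10, 12, 11, 9, 13, 13, 8]))  # [(1, 'max'), (3, 'min')]
```

In a DMN model the same logic would be expressed declaratively (e.g. in FEEL expressions or decision tables) rather than in imperative code.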


Smarter Contracts with DMN
Edson Tirelli - Red Hat

One of the challenges of using Blockchain for smart contracts is that some of the languages used there (for example in Ethereum) aren't always easy to understand or use (especially for non-experts).  The goal Edson had up front was to try to use DMN instead, as a language for smart contracts that users can understand.  Using an example of selling a property, he showed how some of the logic was externalized from the contract into a DMN decision service.  The contract raises an event, which the Ethereum Oracle picks up to contact the DMN service (running in the cloud).  Using a simple web app to initialize and finalize the sale, you could see the Blockchain being updated with all the relevant data.
Pretty cool, although as Edson is my colleague I am obviously biased ;)



Designing the Data-Driven Company
Jochen Seemann - MID GmbH


The Business Decision Map is a way to represent decisions at different levels: tactical decisions, operational decisions and business events. Using the example of a car rental company, it allows you to represent the decisions they need to make at the different levels.  Using the MID Innovator tool, these decisions can be represented using DMN.  But other options like PMML and Machine Learning can also be combined.



Using Customer Journeys to Connect Theory with Reality
Till Reiter and Enrico Teterra - Signavio

Since the focus of any company should be on the customer, Signavio developed a new notation for representing customer journeys and linking those to processes and business intelligence.  Using the example of a communication company where a customer has a connectivity issue, they showed an end-to-end example.  The customer (with different moods) goes through various steps, and traffic lights link these to actual data collected at runtime, or to the business process involved.  Drilling into the data, it became apparent that a process improvement to reduce the number of field visits would be worth the effort, and everything was linked to the data to substantiate that claim.



Discovering the Organizational DNA
Jude Chagas Pereira, IYCON
Frank Kowalkowski, Knowledge Consultants, Inc.

Afterspyre offers various kinds of analytics to help organizations make the right decisions.  By modeling your organizational DNA (like objectives, technology solutions, datacenters, etc.), the tool can then find relationships between all of these (for example which datacenter is running which objectives).  Other options include sentiment analysis (based on feedback from customers), affinity matrices (checking how well different things go together), ranking (comparing different options with each other), etc.

Tuesday, April 17, 2018

bpmNEXT 2018 kicking off !

Part of a series of blog posts about bpmNEXT 2018:
bpmNEXT 2018 kicking off!
bpmNEXT 2018 day 1 (part 2)
bpmNEXT 2018 day 2
bpmNEXT 2018 day 2 (part 2)
bpmNEXT 2018 day 3 

Attending the bpmNEXT event again this year in Santa Barbara.  Have been looking forward to this event for quite a few months, so happy to be able to join again this year.  Will try to blog about my impressions.  My presentation itself will be on day 3!


Welcome and Business of BPM Kickoff
Bruce Silver

Bruce started with a kickoff and introduction, explaining why bpmNEXT is different from other BPM events out there (on purpose!), trying to bring together some of the best and brightest people leading BPM efforts across the globe.  And he's right (at least in my opinion), bpmNEXT is different, which is why I enjoy returning to it every year.


The Future of Process in Digital Business
Jim Sinur - Aragon Research

Jim is pitching how process is now part of a much bigger 'digital' shift.  The focus is on the customer journey (or employee or partner journey), to make everything smarter, faster and better - hopefully resulting in new business opportunities, better customer loyalty, agility, etc.  A lot of different technologies (including BPM and DMN of course but also AI, chatbots, self service, etc.) are all converging towards the same goals.  Rather than just data, the focus is moving more to intelligence.  And rather than doing it all at once, he presented 10 mini journeys that can get you closer one step at a time, focused on one specific area they have seen customers have success in (content, collaboration, process, persona, customer interaction, analytics, AI, agile, low code and business functions).  He zoomed in on areas like the decision management framework and customer journey mapping.  But processes are still at the center of IT innovation, although they are driven by much more, including AI, wearables, etc.




A new architecture for automation
Neil Ward-Dutton - mwd advisors

Neil is trying to summarize for us a lot of the discussions he's been having with their community related to automation.  There is an abundance of technology (all playing a part in automation), resources (with cloud), competitors, etc., generating lots of expectation (and investigation) but also fear, chaos and disruption.  Customers need a way to organize this tsunami of technologies.
Neil introduced a model for representing how work gets done.  Customers need to think about how this applies to them, ranging from very programmatic (P) work (like straight-through processes), through transactional (T), to very exploratory (E) work (like case mgmt).  Depending on your focus, different technologies (AI, Decision Mgmt, Machine Learning, RPA, etc.) might play a role.  With a rapidly moving technology market, customers might end up with a combination of many of those.


After these introductory talks, the ignite presentations are kicking off.

Secure, Private, Decentralized Business Processes for Blockchains
Vanessa Bridge - ConsenSys

ConsenSys is using BPMN in combination with Blockchain.  By using processes to interact with the Blockchain, it simplifies how to work with smart contracts and takes advantage of some of the process capabilities (e.g. timers) for some of the logic.  They presented two use cases: a token sale and anonymous voting.
Whenever a request for buying tokens comes in, the process is responsible for creating the smart contract (encrypting some of the information), checking the funds available, passing along the tokens, etc.
The voting system allows you to enter some information about the vote itself and who should participate.  Again a smart contract is created, allowing participants to register and cast their vote (again encrypted).



Turn IoT Technology into Operational Capability
Pieter van Schalkwyk - XMPro

IoT devices produce a lot of data, but how do you create the glue that connects this data to your operational decisions?  By creating data flows (in this case from a cooling tower, for example), you can combine data from different listeners, transform it, and take actions (using a library of extensible components).  Active listeners look for the relevant data from the IoT devices and can then, for example, trigger a BPM tool, call an AI predictive service running in the cloud, etc.  Doing so can transform your Internet of Things into an Internet of People, helping the people making the operational decisions as much as possible.


Business Milestones as Configuration: Process Director App Events
Scott Menter - BPLogix

One of the challenges of executing processes is how to easily get an idea of their status in a way that makes sense at the business level. (Low-level) app events (coming from your processes) are given business context (making them business events) and combined to keep track of business goals.  A journal then collects these business events so they can be inspected by business users, reacted on, etc.
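The pattern — enriching low-level app events with business context and collecting them in a journal — can be sketched roughly like this (the event names, fields and goal are made up for illustration):

```python
from datetime import datetime, timezone

# The journal collects business events for business users to inspect.
JOURNAL = []

def record(app_event: str, business_event: str, goal: str) -> dict:
    # Wrap a low-level app event with business context and journal it.
    event = {
        "app_event": app_event,
        "business_event": business_event,
        "goal": goal,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    JOURNAL.append(event)
    return event

record("form_submitted:loan42", "Loan application received", "loan-approval")
record("task_done:credit_check", "Credit check completed", "loan-approval")

# Progress towards a business goal = the business events collected for it
progress = [e["business_event"] for e in JOURNAL if e["goal"] == "loan-approval"]
print(progress)
```

The point of the configuration-over-code approach described above is that this mapping from app events to business events is defined declaratively, not hand-written per process.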

 

More coming after lunch.