
Thursday, April 16, 2020

New community channels on Zulip Chat

We're happy to announce the immediate availability of a new public chat channel for all projects under the KIE umbrella, i.e. the Kogito, Drools, jBPM and OptaPlanner communities!
Zulip Chat channels: https://kie.zulipchat.com/


Inside our KIE organization you will find various streams where you can follow any of the topic discussions, create your own topic to ask a question, or even help out others.  Since most of the developers use this for their day-to-day discussions as well, you will find a lot of experts there, and a ton of information.

Please join our community of Kogito, Drools, jBPM, and OptaPlanner experts, hang out, learn and become part of the next generation of cloud-native business automation!

Monday, September 16, 2019

An intro to Kogito

The KIE team has been working for quite a few months on the Kogito project, our next-gen solution leveraging processes and rules for building intelligent cloud-native applications.

 

What are we trying to achieve?  Basically, when you as a developer or team are building intelligent cloud-native applications, Kogito wants to help you by letting you use processes and rules in a way that matches that ecosystem.  Kogito focuses on making it as easy as possible for developers to turn a set of processes and/or rules into your own domain-specific cloud-native (set of) service(s).


This is a continuation of the efforts of the KIE team (including the Drools, jBPM, OptaPlanner and AppFormer teams) to offer pure open-source solutions for business rules, business processes and constraint solving.  However, the KIE team decided to start a new effort targeting specifically this goal, for the following reasons:
  • Technology-driven: As you will see below, there's a lot of great technology available for building cloud-native applications, but to be able to fully leverage these technologies in the context of business automation, we had to make a few radical changes.

  • Focus and innovation: We wanted to focus specifically on what is needed to build next-gen cloud-native applications, and how you can leverage processes and rules in this context.  This allows us to offer something that really fits this ecosystem and doesn't bring in additional baggage that isn't relevant.
So while this effort builds on years of experience and battle-tested capabilities, this also allowed us to leave some baggage behind and focus 100% on the problem at hand.

Kogito, ergo cloud
 
When you're building cloud-native applications, there's a lot of great technology out there (some of which you're probably already using).  Kogito is closely aligned with and leverages these technologies, so you can build highly scalable cloud-native services with extremely quick startup times and a low footprint. Picking up some of these technologies and truly taking advantage of them sometimes required quite radical changes (so this is definitely not a lift-and-shift of our existing engines; it was built from the ground up).

For example:
  • Kubernetes is our target platform for building and managing containerized applications at scale.
  • Quarkus is the new native Java stack for Kubernetes that you can leverage when you build Kogito applications and it's a game changer.  But don't worry, if you are building your applications with Spring Boot, we will help you with that as well!
  • GraalVM allows you to use native compilation, resulting in extremely quick startup times (a native Kogito service starts about 100x faster, in roughly 0.003s) and minimal footprint, which is almost a necessity in this ecosystem nowadays, especially if you are looking at small serverless applications.  If you're interested in what's behind this, I recommend reading Mario's blog about it.
  • Building serverless applications? Leverage Knative and Kogito together so your applications can scale up or down to zero based on the need.
  • Kogito applications behave like any other service you build, so you can instantly leverage technologies like Prometheus and Grafana for monitoring and analytics with optional extensions.
  • Internally we leverage quite a lot of other core middleware technologies like Kafka, Infinispan, Keycloak, etc.  This means we take care of setting these up (on demand, for example for our internal messaging, persistence and security requirements), but we strongly encourage you to start leveraging these technologies for your own use cases as well.

Kogito, ergo developer

We want to make the life of developers easy by offering them instant productivity and making sure we integrate well with how they are building their applications.  So rather than asking developers to come to us with their requirements, we are coming to them!
  • The tooling required to build your processes and rules needs to be closely integrated with the workflow the developer is already using to build cloud-native services.  Therefore we have spent a lot of time on making this tooling embeddable.  For example, we just released the first alpha of our VSCode extension (see video below, credits to Alex), which allows you to edit your processes (still using the BPMN 2.0 standard) from within VSCode, next to your other application code.  We're working on a similar experience for Eclipse Che.
  • Instant productivity means it should be trivial to develop, build and deploy your service locally so you can test and debug without delay.  Both Quarkus and Spring Boot offer a dev mode to achieve this, Quarkus even offering live reload of your processes and rules in your running application (extremely useful in combination with the advanced debug capabilities).
  • Once you're ready to start deploying your service into the cloud, we take advantage of the Operator Framework to guide you through every step.  The operator automates a lot of the steps for you.  For example, you can just give it a link to where your application code lives in git, and the operator can check it out, build it (including native compilation if necessary) and deploy the resulting service.  We are working on extending this to also provision (on demand) more of the optional services that you might need (like a Keycloak instance for security, or Infinispan for your persistence requirements).  We also offer a Command Line Interface (CLI) to simplify some of these tasks.

Kogito, ergo domain

Kogito has a strong focus on building your own domain-specific services.  While we hope you can leverage our technology to significantly help with that, we want developers to be able to build the service they need, exactly how they want it.  As a result, the fact that Kogito is doing a lot of the hard work is typically hidden, and your service exposes itself like any other, with its own domain-specific APIs.
To achieve this, Kogito relies a lot on code generation.  By doing so we can take care of 80% of the work, as we can generate a domain-specific service (or set of services) for you, based on the process(es) and/or rule(s) you have written.  For example, a process for onboarding employees could result in remote REST API endpoints being generated that you can use to onboard new employees or get information on their status (all using domain-specific JSON data).
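To make that concrete, here is a minimal, hypothetical sketch of the shape such a generated endpoint could take for the onboarding example.  All names (the /onboarding path, OnboardingResource, Employee) are illustrative assumptions, not the actual generated code:

```java
// Illustrative sketch only: shows the *shape* of a domain-specific generated
// endpoint; a real Kogito service generates this kind of resource for you.
import java.util.UUID;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/onboarding")
public class OnboardingResource {

    public static class Employee {
        public String id;
        public String name;
        public String status;
    }

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public Employee onboard(Employee employee) {
        // The generated code would start a process instance here and map the
        // process variables back onto the domain object; we just simulate that.
        employee.id = UUID.randomUUID().toString();
        employee.status = "onboarding-started";
        return employee;
    }
}
```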


Additionally, domain-specific data can also be exposed (through events or in a data index) so it can easily be consumed and queried by other services.


Architecture

When using Kogito, you're still building a cloud-native application as a set of independent domain-specific services, collaborating to achieve some business value.  The processes and/or rules you use to describe the behavior are executed as part of the services you create, highly distributed and scalable (no centralized orchestration service).  But (by using this additional compilation step) the runtime your service uses is completely optimized for what your service needs, nothing more.

If you need long-lived processes, runtime state can be persisted externally in a data grid like Infinispan.  Each service also produces events that can be consumed.  For example, using Apache Kafka these events can be aggregated and indexed in a data index service, offering advanced query capabilities (using GraphQL).


What's coming next?

At this point, Kogito 0.3.0 is the latest release (from August 23rd), but we have much more coming on our roadmap before our 1.0.0 release, which is targeted towards the end of the year.


Get started

And now I believe you are ready to give it a try yourself, so please do and let us know!  You can start by building one of the out-of-the-box examples, or by creating your first project from scratch.  Follow our getting started documentation here!  You will see you can build your own domain-specific service in minutes.

Or if you want to watch a small presentation (and demo!) from Maciej, check out his latest DevNation Live talk here.

Monday, July 16, 2018

Red Hat Process Automation Manager v7.0

jBPM is completely open-source, and therefore most of my blogs are typically about the latest and greatest features just introduced in the community.  However, Red Hat also offers a supported version, with the testing, certification and maintenance releases necessary for enterprise production use (for a quick intro to the potential differences, take a look here, for example).

And recently, as announced in this press release, Red Hat unveiled Red Hat Process Automation Manager 7.  The most obvious change you might notice immediately is that the product was renamed: it was formerly known as Red Hat JBoss BPM Suite.  Since jBPM has evolved beyond just BPM - with features such as decision management, case management and constraint solving closely integrated - it was time to reflect that in the product naming as well.  Similarly, Red Hat Decision Manager 7 was released a few months ago, focusing on the Drools and OptaPlanner bits.

However, nothing changes structurally.  Red Hat Process Automation Manager is based on jBPM (to be more precise, on the jBPM 7.7.0.Final release) and is actually a super-set of Red Hat Decision Manager, so it includes all the rules and constraint solving capabilities as well (Drools and OptaPlanner).  Since it is completely open-source, you will see the same set of components there as in the community: the process execution server (kie-server), the web-based console (business-central aka the workbench - for both authoring and runtime deployment and administration), smart router, controller, Eclipse tooling, etc.  OpenShift images and templates (supporting these capabilities in the cloud) are also available for those targeting cloud deployment.

Red Hat Process Automation Manager also includes an advanced open source user experience platform from Red Hat partner Entando. It can be used to quickly develop modern UI/UX layers for user interaction with business process applications, including a drag & drop UI development tool with widgets to create task lists, forms, process graphs, etc.

Red Hat Process Automation Manager is part of the Business Automation portfolio, which includes Red Hat Process Automation Manager and Red Hat Decision Manager, but also the Red Hat Mobile Application Platform and in the future also big data analytics through Daikon.

More questions?  Take a look at the product pages !

Friday, July 13, 2018

Maciej Swiderski is the new jBPM community lead


I am very glad to be able to announce that Maciej (aka "Magic") Swiderski will officially become the new jBPM community lead.

Maciej is one of the most productive engineers I have ever known.  And while that has led to huge expectations whenever he starts working on something new, he somehow manages to constantly over-deliver anyway.  To be fair, I have to say "officially", as he's been doing the bulk of that work for a long time.  Everyone who has ever interacted with the community no doubt knows him, and his blog might be even more famous: probably almost any customer question is answered in one of the numerous posts he has written over the last few years.  I remember exchanging emails with him in 2010, in the early days of jBPM 5, but he was active in the community even before that.  He joined full-time a few years later and has taken care of anything related to process execution ever since.  Nowadays, he's involved in so much (from case management to our cloud story) and producing so much work that I saw no other solution than to just make him responsible for it ;-)

Well deserved, and long overdue!  Congratulations Maciej.

PS: I'm not going anywhere in case anyone is wondering - still 100% involved - but given Maciej's continuous focus on the community, and with the team growing, this is the right move!

Friday, September 22, 2017

Watch Drools, jBPM and OptaPlanner Day LIVE (Sept 26)

We will be streaming all the sessions of the Drools, jBPM and OptaPlanner Day in New York on September 26th 2017 LIVE!  Check the full agenda here.

Use the following link to watch: http://red.ht/2wuOgi1

Or watch the embedded live stream here (part 1 and part 2).


Tuesday, August 8, 2017

jBPM 7.1 available

Now that we have moved to a more agile delivery model with monthly community releases, we are happy to announce the availability of jBPM 7.1.0.Final.

You can find all information here:
Downloads
Documentation
Release notes
Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

The focus of this release has mostly been on improving the experience and capabilities for process and task administrators.  These admins keep an eye on your infrastructure, making sure the execution of all processes in your system is healthy and resolving any issues that show up.

To make it easier for these process and task administrators to do their work, we have added a bunch of improvements and new features for them:
  • Error management: Errors that happen during the execution of your processes (or tasks, jobs, etc.) are now better detected and stored.  This could for example be an (unhandled) exception during the execution of your process instance or a job that has hit its retry limit.  
    • At the core engine level, errors are stored in the database and can be acknowledged.  Note that the engine will always guarantee a consistent state for all your process instances, so when an exception like this happens, the engine is rolled back to the last known state and the error is logged.
    • Through the web-based console, process admins can examine in detail any errors that occurred through the new Execution Errors view, acknowledge them and, where possible, take action to resolve the issue (see the sketch after this list for doing the same programmatically).
    • The process instance list has been extended with a new column to show any errors possibly related to that instance.

  • Quick filters: Searching for information about specific process instances, tasks, jobs or errors is now easier thanks to a new search tab where you can find the data you need by adding quick filters (for example the state of your process instances, the time they were started, the name of the task, etc.)
  • Navigation: new actions have been added to the process instance, task, jobs and errors views to more easily navigate between them where appropriate.  For example, you can navigate to the errors associated with a specific process instance (if any), or take a look at the process instance associated with a specific task or job.
  • Task admin view: the task view that was included in previous versions has been split into two separate views:
    • Tasks: Aims to be used by task operators / end users to work on tasks assigned (or potentially assigned) to them
    • Task administration: Designed to be used by administrators to manage tasks belonging to other users. This perspective is only available to users with the roles admin or process-admin. It is similar to the "Admin" filter tab in the old tasks perspective.
  • Project and team metrics
    • A brand new dashboard is now available for every project listed in the authoring library. After opening the project details page, a metrics card shows up on the right side of the screen. The card shows the history of contributions (commits) made to that specific project over time. Clicking on the View All link gives access to the full dashboard, which shows several metrics about the project's contributions.

    • A brand new dashboard has also been added to the Teams page. A metrics card on the right side shows the history of all contributions (commits). Clicking on the View All link gives access to a full dashboard showing overall contributions metrics. 
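Coming back to the error management capabilities above: below is a minimal sketch of how a process admin could list and acknowledge execution errors programmatically.  It assumes a ProcessInstanceAdminService obtained from your environment (for example injected via CDI); exact class locations and signatures may differ slightly between versions, so check the javadocs:

```java
// A sketch of acknowledging execution errors through the jBPM admin services.
import java.util.Collection;
import org.jbpm.services.api.admin.ProcessInstanceAdminService;
import org.kie.api.runtime.query.QueryContext;
import org.kie.internal.runtime.error.ExecutionError;

public class ErrorCleanup {

    public void acknowledgeAll(ProcessInstanceAdminService adminService, long processInstanceId) {
        // Fetch the unacknowledged (false) errors for the given process instance.
        Collection<ExecutionError> errors =
                adminService.getErrorsByProcessInstanceId(processInstanceId, false, new QueryContext());
        for (ExecutionError error : errors) {
            System.out.println("Resolving: " + error.getErrorMessage());
            // Mark the error as acknowledged once the underlying issue is resolved.
            adminService.acknowledgeError(error.getErrorId());
        }
    }
}
```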


More detail can be found in the full release notes.  Especially to our process and task administrators: enjoy!

Drools, jBPM and OptaPlanner are switching to agile delivery!

Edson recently blogged about how Drools, jBPM and OptaPlanner are moving towards a more agile delivery.  The goal is to be able to release new features much quicker and more often to the community, by having monthly community releases.

Since this obviously has an impact on our entire community (hopefully an overall positive one, of course ;)), I wanted to highlight some of the most important consequences as well:
  • More frequent releases give the community earlier access to new features
  • Reducing the scope of each release allows us to do more predictable releases
  • Since bug fixes are included in each release as usual, users will be able to pick those up quicker
As a result, starting with v7.0 a few weeks ago, you should see releases more often now.  It does mean that each individual release will be smaller in size.  But overall we believe we will be able to deliver new features and fixes faster and more predictably!

Feel free to take a look at Edson's blog for a little more detail.

Tuesday, August 1, 2017

Drools, jBPM and OptaPlanner Day: September 26 / 28, 2017 (NY / Washington)

Red Hat is organizing a Drools, jBPM and OptaPlanner Day in New York and Washington later this year to show how business experts and citizen developers can use business processes, decisions and other models to develop modern business applications.
This free full-day event will focus on some key aspects, and several of the community experts will be there to showcase some of the more recent enhancements, for example:
  • Using the DMN standard (Decision Model and Notation) to define and execute decisions
  • Moving from traditional business processes to more flexible and dynamic case management
  • The rise of cloud for modeling, execution and monitoring
The event is aimed at IT executives, architects, software developers, and business analysts who want to learn about the latest open source, low-code application development technologies.

Detailed agenda and list of speakers can be found on each of the event pages.

Places are limited, so make sure to register asap!

Tuesday, July 4, 2017

Take a look at jBPM 7.0

It's been a while since we released a new major version of the jBPM project, but I'm happy to announce that jBPM 7.0.0.Final is now available.
For those not yet familiar with our project, jBPM is a completely open-source Business Process Management (BPM) and case management solution.  It supports the full life cycle of processes and cases, from authoring tools through execution all the way to monitoring and management.
For the readers that don't have too much time to read all of the details below, some of the major new features include:
  • Case management capabilities
  • New simplified authoring experience for creating projects
  • Business dashboards
  • Process and task admin API
  • Process and task console can connect to any (set of) execution server(s)
  • Preview of a new process designer and form modeler
  • A new security management UI
  • Upgrades to Java 8, WildFly 10, EAP 7, etc.
You can find all information here:
Downloads
Documentation
Release notes

Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

A quick introduction to some of the most important features is available below.

Case management 

Case management has been a hot topic in the BPM world for a few years now (and maybe even longer under terms like flexible and adaptive processes).  Case management use cases are different from more traditional business processes since they (typically) require more flexibility and support more unstructured and unplanned work. Rather than following a nicely predefined plan from start to finish, actions are more ad-hoc decisions; what to do next is based more on the data associated with the case, and the end user needs to be given the flexibility to decide what to do next (although recommendations are welcome).
Ever since v5, our core engine has had a lot of advanced features to support more flexible and adaptive use cases. While we did introduce some case management building blocks in v6 already, v7 comes with much more extensive support for case management use cases:
  • Core engine: extended to support more advanced features like case file, ad hoc and dynamic work, stages and milestones, case roles, etc.  All these features are available through the remote API as well.
  • The web-based authoring environment has been extended to support defining your own cases, with a case project wizard, additional case building blocks and properties in the process editor, etc.
  • A new web-based case management UI that showcases how you can use the latest features and manage cases.  This UI is built from a number of independent UI building blocks that you can use in your own application as well.
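As a small illustration of the core engine capabilities, here is a sketch of starting a case through the case management API.  The deployment and case definition ids are made-up placeholders, and the method names follow the 7.0 API as we understand it; verify against the documentation:

```java
// A sketch of starting a case via the jBPM case management API (org.jbpm.casemgmt.api).
import java.util.HashMap;
import java.util.Map;
import org.jbpm.casemgmt.api.CaseService;
import org.jbpm.casemgmt.api.model.instance.CaseFileInstance;

public class CarInsuranceClaim {

    public String startClaim(CaseService caseService) {
        Map<String, Object> data = new HashMap<>();
        data.put("claimReport", "rear-end collision, no injuries");

        // Build the case file: the data container shared by all case activities.
        CaseFileInstance caseFile = caseService.newCaseFileInstance(
                "org.jbpm:insurance:1.0", "insurance-claims.CarInsuranceClaimCase", data);

        // Start the case; the returned case id is used for all further interactions,
        // such as triggering ad hoc fragments or adding dynamic tasks.
        return caseService.startCase(
                "org.jbpm:insurance:1.0", "insurance-claims.CarInsuranceClaimCase", caseFile);
    }
}
```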

New authoring experience

The experience you get when you open the workbench for the first time, create a new project (or import an example one) and create your first processes, data models and forms has been updated significantly.



Business dashboards

While it was possible to create your own dashboards in v6 using the (separate) dashbuilder application, dashbuilder has been refactored completely to better align with the workbench technology.  It is now possible to do all of this from within the workbench, and to integrate it in your own applications as well.



Process and task admin api

A new API has been introduced that includes powerful capabilities for process and task administrators. The process admin API allows you to:
  • get all process definition nodes
  • cancel node instance
  • retrigger node instance
  • update timer (absolute or relative)
  • list timer instances
  • trigger node
The task admin API allows you to:
  • add/remove potential owners, excluded owners and business admins
  • add/remove task inputs and outputs
  • list/create/cancel escalations and notifications
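A minimal sketch of how these admin APIs could be used, assuming the services are injected or looked up in your environment; method names follow the 7.0 API, but verify the exact signatures against the javadocs:

```java
// A sketch of the new process and task admin services (org.jbpm.services.api.admin).
import org.jbpm.services.api.admin.ProcessInstanceAdminService;
import org.jbpm.services.api.admin.UserTaskAdminService;
import org.kie.api.task.model.OrganizationalEntity;
import org.kie.internal.task.api.TaskModelProvider;

public class AdminOperations {

    public void fixStuckInstance(ProcessInstanceAdminService processAdmin,
                                 long processInstanceId, long nodeInstanceId) {
        // Re-execute a node instance that failed, e.g. after an external system recovered.
        processAdmin.retriggerNodeInstance(processInstanceId, nodeInstanceId);
    }

    public void reassignTask(UserTaskAdminService taskAdmin, long taskId) {
        // Add an extra potential owner without removing the existing ones (false).
        OrganizationalEntity user = TaskModelProvider.getFactory().newUser("mary");
        taskAdmin.addPotentialOwners(taskId, false, user);
    }
}
```
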
Process and task console separate from execution server

Our web-based management and monitoring console used an embedded execution server in v6 to execute all process and task operations.  We also offered a standalone process execution server.  In v7, the monitoring console is a UI front-end only; all requests for process and task data, and operations on them, are delegated to a standalone execution server.  The main advantage is that the console can now 'connect' to basically any (set of) execution servers out there.

When multiple independent kie-servers are used, you can either connect to a specific one or use the smart router to aggregate information across multiple servers:
  • requests can be sent to the smart router, which will figure out which of the known kie-server instances the request should be sent to
  • when trying to retrieve information, the smart router can collect information from different servers and aggregate that information for you
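For completeness, here is a sketch of talking to a standalone kie-server from your own Java code using the kie-server client API (the same remote API the console delegates to); the URL, credentials and container/process ids are placeholders:

```java
// A sketch of starting a process on a remote kie-server via the client API.
import org.kie.server.api.marshalling.MarshallingFormat;
import org.kie.server.client.KieServicesClient;
import org.kie.server.client.KieServicesConfiguration;
import org.kie.server.client.KieServicesFactory;
import org.kie.server.client.ProcessServicesClient;

public class RemoteProcessStarter {

    public long start() {
        // Configure a REST connection to the execution server (placeholder credentials).
        KieServicesConfiguration config = KieServicesFactory.newRestConfiguration(
                "http://localhost:8080/kie-server/services/rest/server", "kieserver", "kieserver1!");
        config.setMarshallingFormat(MarshallingFormat.JSON);

        KieServicesClient client = KieServicesFactory.newKieServicesClient(config);
        ProcessServicesClient processClient = client.getServicesClient(ProcessServicesClient.class);

        // Start a process instance in the given container (deployment).
        return processClient.startProcess("evaluation_1.0.0", "evaluation");
    }
}
```
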
Preview of new form modeler

The form modeler has been upgraded significantly as well.  The new form layout system (based on the Bootstrap Grid system) allows more advanced and flexible layouts, new widgets, generation of forms, a Java-based file format and much more.  We will do a few more feature enhancements before removing the old form modeler in a future minor release.



Preview of a new process designer

We are working on a completely new web-based process designer, and this release introduces an early preview (supporting only a small subset of the full feature set).  The idea is to move away from a very developer-focused UI and introduce an easier-to-use interface for different kinds of users.  Properties behave much more like advanced forms (rather than a table of key-value pairs) and the user is assisted as much as possible (using content assist, etc.).

Currently it is still recommended to use the existing designer for modeling your business processes (since the capabilities of the new one are still limited) but feel free to give it a try and let us know what you think.


A new security management UI

While it was already possible to define users and groups (and their relationship), a new security management UI allows you to define permissions for all of these users (or groups).  You can control who can use which part of the UI, but also which projects users have access to, etc.



Decision Model and Notation (DMN)

Drools has introduced support for the DMN standard, and since jBPM integrates closely with Drools for rule execution, you can now trigger DMN rules from a business rule task.
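Under the covers this goes through the Drools DMN runtime.  As a minimal sketch, this is roughly how a DMN model can be evaluated programmatically (the namespace, model and decision names are placeholders for your own model; check the Drools DMN documentation for the exact API):

```java
// A sketch of evaluating a DMN decision via the Drools DMN runtime.
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.dmn.api.core.DMNContext;
import org.kie.dmn.api.core.DMNModel;
import org.kie.dmn.api.core.DMNResult;
import org.kie.dmn.api.core.DMNRuntime;

public class DmnEvaluation {

    public Object evaluate() {
        // Assumes the DMN model is on the classpath of the kjar/project.
        KieContainer kieContainer = KieServices.Factory.get().getKieClasspathContainer();
        DMNRuntime dmnRuntime = kieContainer.newKieSession().getKieRuntime(DMNRuntime.class);

        DMNModel model = dmnRuntime.getModel("http://www.example.org/dmn", "ApprovalModel");
        DMNContext context = dmnRuntime.newContext();
        context.set("Applicant Age", 25);

        DMNResult result = dmnRuntime.evaluateAll(model, context);
        return result.getDecisionResultByName("Approval").getResult();
    }
}
```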

Other features
  • Minimum Java version was upgraded to Java 8
  • Support for WildFly 10 and EAP 7
  • New preferences page
  • Data source management
Please take a look at the full release notes for more details. jBPM integrates closely with the Drools and OptaPlanner projects (for business rules and constraint optimization respectively), so take a look at the Drools and OptaPlanner release announcements for more details on new features you can use in combination with your processes as well!

Thursday, June 23, 2016

Process-driven applications on Red Hat Summit 2016

Next week (June 27 - July 1st 2016), Red Hat Summit and DevNation are taking place again, in San Francisco.  As usual, it's a huge event with a ton of interesting talks.  Learn the latest and greatest about all the different products Red Hat offers (cloud, data, automation, integration, you name it), with something for everyone (admins, architects, managers, etc.).






I'll be doing a session on Tuesday June 28, 3:30 - 4:30pm, on Process-driven applications: let BPM do some of your work.  It will be a (quick) overview of what we've been building out over the last few years with jBPM (and Red Hat JBoss BPM Suite as the product offering) and how you can use it to build your applications.  The idea is that you can choose which building blocks you find valuable, while we keep trying to add more and more value (for example in the context of case management or rapid application development).

I hope to see some of you in San Francisco.  Feel free to come and ask questions at my session, find me (or some of the other engineering team members) at the middleware booth, say hi if you see me walking around, or drop me a message if you can't find me but would like to meet up ;)

There are a lot of other interesting sessions, but highlighting a few related ones (in chronological order):

Friday, April 29, 2016

bpmNEXT 2016 retrospective

Concluding with a few impressions from bpmNEXT last week.  While it's impossible to summarize everything that happened there (I guess you could just join next year), here are some of my key takeaways:
  • BPM has reached the maturity stage; we're past the hype phase (at least that's what some of the analysts seem to be saying).  Innovation is still there, although more from the smaller players (typically in specific areas).  The main features of BPMSs are well understood.  Big vendors are trying to differentiate in other areas (sometimes even moving away from the BPM name).  As a result, BPM is becoming 'invisible': it's always there, people can always rely on the power it provides, but it has become more mainstream.
  • Are BPM products growing or are they becoming part of a bigger ecosystem?  And what should we call this bigger entity then?  Luckily we didn't go into finding a new name, but there seems to be some agreement that we are (still) struggling with defining what BPM is (even after a few decades!).
  • Open-source is an important aspect of the BPM ecosystem, for commoditization, free entry and innovative research.
  • A common misunderstanding is that low-code BPM is for business analysts only.  Low-code BPM tries to lower the entry barrier by hiding some of the underlying complexity and offering an easier-to-use user interface and experience.  While this is an absolute requirement to get business analysts involved, low-code BPM can be just as useful for the hardcore developer (as long as they still have full control and can take advantage of the full power of the engine)!
  • Most BPM vendors seem to be moving towards supporting 'adaptive cases' or more 'unstructured processes' as well.  While there might be various approaches (like using the ad-hoc sub-process in BPMN2, calling a separate CMMN case, or some custom solution), I expect more convergence in the next few years.
  • DMN was a hot topic amongst several vendors, gaining a lot of traction it seems.  CMMN seems to be struggling more though, and a healthy part of the discussion was around what we might learn from this and where it should lead us.
  • Fun fact for those that attended: A tractor can actually look like a bison ! During one of the demos, Watson was used to do automatic recognition of images.  After uploading the image of a tractor, Watson decided it might be a bison.  While most of us found that funny (and it made bison one of the buzzwords of the conference), it might seem that Watson was right after all: apparently bison is a brand of tractors as well.  As we probably should have expected, AI is already smarter than us.
I blogged earlier about each of the presentations and demos on-site: day 1 (part 1 and part 2), day 2 (part 3 and part 4) and day 3 (part 5).  Recordings should hopefully be available soon on the bpmNEXT website as well.

Personally, I'd like to thank Bruce Silver and Nathaniel Palmer (and everyone who helped, in or outside the spotlight) for organising this great conference!  And all the attendees as well, for the interesting discussions.  It's a unique experience to have vendors discuss strategy in such an open way.  And the venue and conference schedule are ideal for continuing discussions over lunch or during the evening (with a nice beer on the rooftop).

Already hoping I'll be able to come back next year!

Thursday, April 21, 2016

bpmNEXT 2016 (part 5)

Final half-a-day of bpmNEXT presentations and demos, before heading back home.

Intent-driven and future-proof user experiences
Appian talked about UIs.  They have created an architecture called Sail UI using a server-side approach to UI and focusing on the intent of the UI (rather than the details / technology) so it can evolve over time. The same UI design can be applied to radically different architectures like GWT, Facebook React or native mobile apps.  The UI adapts automatically based on the environment (for example barcode scanning component behaves radically differently on desktop vs mobile).

Continuous integration: tools to empower DevOps in process-based application
Bonitasoft talked about testing processes using CI when making live changes to your processes and applications.  Using Docker, JUnit and Jenkins, they run various tests on newer versions of the process to detect regressions.

Combining DMN with BPMN and CMMN - the open source way
Camunda showed how they implemented DMN (at least parts of it) as a decision service.  DMN can be called from BPMN or CMMN using a decision task, or standalone.  Their Cockpit application allows you to figure out why certain decisions have been made at runtime (for specific instances) by looking at the audit data, annotated on top of the decision table itself.

How I learned to tell the truth with BPM
Gene Rawles is adding another dimension (yes, literally, 3D) to modeling processes, where you can have processes at different layers (2D) and use lines to connect them (in 3D), for example to simplify explaining what's actually going on.  They allow importing processes from different systems, and are not limited to processes: there can also be a rules or SOA (services) layer.

Self-managed agile organizations
Keith Swenson is ending the conference presentations with a talk on various topics:
  • Self-management.  Using the term 'sociocracy', it's about self-managed teams which are highly decentralised, and about collaboration (using consensus rather than voting, to get everyone on board with the decision).  How can we support these teams better?
  • He made a distinction between token-based and state-based engines - where jBPM (since v5) definitely falls in the second category - wondering if there's a way to describe the difference, and whether we should consider these in combination with BPMN or CMMN.
  • He raised the question whether there should be one (1!) open-source DMN engine for everyone to use, although this sparked the question whether that would be a reference implementation, which one to use, and whether there would still be differentiation possible for vendors.
  • And he wrapped up talking about the future, where he believes in a more conversational model (focusing on easy interactions with the user rather than on the process itself).

Unfortunately I'll have to miss the wrap up session, heading back to LA slightly early to catch my flight.

bpmNEXT 2016 (part 4)

The afternoon of day 2 is starting (after a long lunch break):

Decision modeling service
Oracle presented their decision modeling service, based on DMN, for extracting decision logic (for example from the process).  After a quick introduction to DMN and the FEEL expression language, the demo dived into two examples to calculate cost and request approval.

Dynamic decision models
Jacob from OpenRules presented their decision analysis capabilities.  Decisions are typically a combination of a set of different rules, and using a web-based UI the user can activate and deactivate specific rules dynamically, to see how they influence the decision.  But they can even do what-if analysis to find the optimal solution, all based on the decision model already defined.

The dirty secret in process and decision management
Sapiens Decision Suite analyzes business data (i.e. at the business level - business-user friendly) that goes into decisions.  After hooking this business data (defined as a 'logical unit') up to actual data sources (supporting different types), you can generate a web service that represents the decision service.  Rather than the traditional approach of passing all the (low-level) data to the decision service to get your result (which might not even be possible in big data use cases), this allows you to pass only business-level keys; the rest of the data is fetched on the fly from the underlying systems.

Business process management in the cloud: changing the playing field
IBM's perspective on (running processes on) the hybrid cloud and using analytics in there.  The demo is running a few processes on IBM BPM on Cloud and using services like Watson.  The claim process used Watson to recognize an uploaded image (as a car for example) and Spark machine learning for predictive analytics (based on previous data, create a model about how likely are we going to accept a claim).  The magic seemed to be in the services though, as from the process perspective it's just a matter of doing the right REST calls.

Model, generate, compile in the cloud and deploy ready-to-use mobile process apps
Orchestra BPMS is offering the ability to generate mobile applications for processes.  Rather than using a generic out-of-the-box mobile application, they offer different building blocks (to for example start new instances, a task list, audit capabilities, etc.) and after making your choice the application can be compiled and downloaded for iOS and Android.

Dynamic validation of integrated BPMN, CMMN and DMN
Trisotech is enabling companies to do digital transformation by using a 'digital enterprise graph' of the organisation that allows you to link concepts in different models. The 'Kommunicator' tool supports BPMN, CMMN (using a case task in the process model to call the case) and DMN (whenever you have a decision task).  Animation technology (across the different models) can be used for learning, validation and communication.

Simplified CMMN
Lloyd Dugan made (what he called) a modest proposal to drive adoption of CMMN, especially for use cases where BPMN really struggles to model in an understandable way.  For example by eliminating some of the CMMN constructs, mapping case management concepts to business architecture concepts, etc.  Could CMMN be the 'unification' that brings BPMN and DMN into a bigger world that BPMN and DMN can't describe on their own currently?

Wednesday, April 20, 2016

bpmNEXT 2016 (part 3)

Starting day 2 of bpmNEXT.  [I had to give my own presentation in this slot and write this up afterwards, resulting in slightly shorter write-ups.]

Cloud architecture accelerating innovation in application development
Salesforce showcased how to develop applications within their tool.  They offer two options at the modeling level: one more data-driven (listening for and triggering data changes), and WorkRelay, a more traditional process option available through their AppExchange partner ecosystem.  The demo included a significant mobile part and showed, for example, the Wave Analytics capability that you can leverage.

One model, three dimensions: flow, case and time into a unified development
BP Logix presented a model where you define when tasks should execute as a combination of precedence (did x happen), eligibility (constraints) and necessity (do you need me).  Tasks will execute if all three are satisfied.  It also allows explicitly modeling and keeping track of goals etc.

Building advanced case-driven applications
In my own demo, I tried to explain how we are supporting adaptive and dynamic cases (which we see as an extension of processes rather than something different) by supporting more advanced features like case roles, milestones, etc. on top of what we already have: a flexible BPMN2 process engine.  We also allow you to build custom applications - specifically tailored to the use case you're trying to solve - by combining and customizing generic UI building blocks.
I'll do a more detailed blog, covering a lot of what's under the hood, once I get back from traveling.

BPM and enterprise social networks for flexible case management
Francois Bonnet talked about combining process with social network information into an aggregated view of what is happening at runtime. Information gathered in social profiles, and signals exchanged between processes and the social network, enable features like linking social discussions with processes, an aggregated timeline (showing what happened in the process in combination with other social events), assignment using social data, etc.  At the model level, additional constructs allow specifying how signals should be exchanged.

A business process application with no process
Scott Francis from BP3 presented a use case related to outpatient care, where patient, doctor and nurses need to collaborate. With a strong focus (and track record) on design, the UIs nicely support mobile devices (a necessity in this context), where the application adapts itself to the available space (on the tablet or phone).  Regardless of where you are and what device you use, you always have access to the latest state.

bpmNEXT 2016 (part 2)

Continuation of the bpmNEXT impressions.  The BPM analyst panel especially was very interesting and sparked a lot of discussion!


BPM Analyst Panel
Maureen Fleming, Sandy Kemsley, Clay Richardson, Jim Sinur, Neil Ward-Dutton
From the analyst point of view, some remarkable trends:
  • Customers don't necessarily want to start from process anymore; UI and UX are increasingly the first focus.
  • Rather than a focus on process, we need immediate data analytics and decisions.
  • We need to get customers involved in the processes (apps, smart devices, etc.) and collect the data to drive these processes (and make them more transparent).
  • Customers are looking for low-code solutions, even if that means not using BPM(S).  It just needs to be good enough.
  • Open-source vendors are crucial for innovation and free(mium) entry. 
  • Low-code BPM isn't about business analyst and prototype applications only, it's also for developers and enterprise applications.
  • A lot has changed over the last 15 years: computing power and data are everywhere, and millennials think differently.
Unanimous agreement that we need to continue this discussion over wine and beer later!

Process design and automation for a new economy
Ian Ramsey (8020 BPM) presented a model (targeting processes in the services sector) based on tasks, events and an event engine, to solve issues like maintainability, exceptions, etc.  [Basically a declarative way to define processes rather than a procedural flow.]
A process is composed of stages, each containing tasks.  Data is defined as (high-level) entities with possible states.  Tasks define start and outcome events (referencing the data states mentioned earlier), and the tool can generate a flowchart by combining all tasks.  He showed how you can define a rather complex model this way, and how it dynamically adapts itself if you remove a task.

Process intelligence for the digital age: combining intelligent insights with process mining
SAP is combining process visibility and process intelligence with process mining to solve problems in real time.  In the demo, dashboards are used to get insights into orders being processed and figure out where they are stuck, to take immediate action.
Process mining is used to do root cause analysis; it dynamically discovers the process model by looking at all the (low-level) events captured at runtime across systems (even for unstructured processes).  It allows you to filter on various properties of the live running instances and then see the process model for this selection, to detect issues with that subset.  You can then define additional KPIs to easily keep track of these instances.

Process Intelligence
Signavio presented a new component related to process analysis.  After uploading various logs and defining KPIs, it will perform an analysis and show dashboards that allow you to drill down into the details.  The data can be overlaid on top of the process model to for example show most used paths, deviations from the process model, etc.

Leveraging cognitive computing and decision management to deliver actionable customer insight
Princeton Blue showed their cognitive computing solution, which monitors customer events (from, for example, social media or other unstructured data) and combines them with BPM, BRMS and CEP.  Pramod used an elaborate example that collected data from customer profiles and Twitter feeds and analyzed it using rules and event processing.  Based on IBM Watson, they were able to categorize user sentiment as positive, negative, angry, etc. (and he mentioned that open-source alternatives like Drools could also be used as the rules and complex event processing engine).

Dealing with digital disruption
Jim Sinur talked in more depth about digital transformation, which companies cannot ignore if they want to stay relevant in the future (as it's going to happen anyway).  But this digital transformation introduces disruption on multiple levels, with strong demand for continuous business agility and change.  His advice was to focus on differentiating features for your business.  You'll inevitably end up with legacy and new systems working together (at least in the short term).  And process might be the secret sauce of digital business.

Now we're off to the roof for some wine tasting, and no doubt to continue the discussions.

Tuesday, April 19, 2016

bpmNEXT 2016 (part 1)

This year, I'm attending bpmNEXT for the first time.  

It's a nice conference where BPM vendors, analysts and researchers collectively show their innovation, vision and research, and discuss among each other.  From the conference website (bpmnext.com):
Now in its fourth year, bpmNEXT is the definitive showcase of the next generation of Business Process Management software – including intelligent operations, the Internet of Things (IoT), case management, business decision management, and goal-directed processes.
bpmNEXT has consistently been the defining launch pad for the ideas and innovations surrounding technology-led business process innovation. Presentations are not case studies of past successes, but vision statements on where we need to go next.
This is no typical “how to” conference aimed at BPM newcomers. It is designed specifically for those already chest deep in BPM and wanting to get in early on the next generation of process innovation technology – touch it, see it, and influence it.
I'm presenting on 'Case-driven applications' on the second day, and looking forward to seeing what others have planned as well.  And to the discussions and networking around it of course - probably one of the best selling points of this conference (although the Santa Barbara location comes close as well ;)).

Here's a quick overview (from my point of view) about some of the key topics of the first half of day 1.

BPM 2016-2020
Bruce Silver and Nathaniel Palmer are kicking off with an outlook for the next 5 years: how all companies (Tesla was given as a prime example) are becoming software companies, where there's an increasing focus on process and the battle for the end user is turning into a battle of easy-to-use UIs (not one UI, but a combination of multiple different UIs).

Schroeder's BPM revisited
Neil Ward-Dutton talked about digital transformation as a digital thread that you have to weave globally across your entire organization (and even outside that boundary, to customers), focusing on decisions, agility, etc.  There's a need for a platform to manage your knowledge efficiently.  BPM offers a lot of building blocks to achieve this, but maybe it's time to rethink / repackage some of this to focus on the customer's bigger picture.

Positioning Business Modeling
Panel: Clay Richardson, Denis Gagne, Kramer Reeves (Bruce Silver moderator)
BPM modeling can no longer be seen as something separate: when talking about modeling, there are other related areas like modeling cases, decisions, organizations, KPIs, etc.  Is this 'business modeling', how do we bring these together, and how do we as vendors sell this story to analysts and customers?
  • Incredible appetite from business analysts for building (almost any kind of) models to quickly prototype
  • How do we measure ROI?
    • BPM has operational automation benefits
    • Is it about speed to market?
    • Capturing knowledge in a standardized way (for long-term sustainability)
  • Is the BPM approach sustainable?  Can we keep adding more models / adding more capabilities as minimal requirements to BPM products?
  • Aren't some models throw-away (high-level communication vehicles rather than operational model)?
While there were no clear answers to some of these questions, the consensus seemed to be that we can all work together on this to make a bigger pie for all of us.

Building a Value-Added BPM Business
Panel: Pramod Sachdeva, Scott Francis, Jonathan Sapir (Nathaniel Palmer moderator)
How, from a customer's point of view, is the market changing?  What are customers looking for?
  • Customers are looking for much more than just a process, they are looking for complete solutions.
  • Is BPM shifting, or are the applications that we are building with BPM shifting?  Are BPM products growing or are they becoming part of a bigger ecosystem?
  • Can we get end users more involved?  Can everyone (end users, developers, QE, etc.) each participate at their level? What is preventing this, as this isn't purely a technological issue.
  • What skillsets are required when looking for business analysts or IT to be involved in such an effort?
  • We need to be realistic about what end users can change, by constraining what they can change we might be able to allow them to do so.  For example by extracting decision logic in rules (that could be updated).
  • We want to give users a low-code environment where they can take control.

jBPM 6.4.0.Final

The jBPM 6.4.0.Final release is now available.  It brings a new look and feel and a select set of new features, some extremely powerful, like the advanced query capabilities described below.

A highlight of some of the most important changes is included below; full details can be found in the release notes.
To get started:
Downloads
Documentation
Release Notes

Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.
jBPM 6.4 is released alongside Drools (for business rules); check out the new features in the Drools release blog.
We thank everyone who helped out!

Core engine

The query capabilities of the engine have been extended significantly, to support searching for process instances or tasks in combination with process instance or task data, to almost any complexity (including complex constraints), in a very efficient manner (see the sketch below).  This is available through both the Java and remote APIs.  Maciej already blogged about this in more detail.
Other improvements include:
 - improved auditing for task data (so it can be queried more easily)
 - pluggable notifications for task deadlines
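As a sketch of the advanced queries mentioned above (based on how Maciej described the API; the datasource JNDI name, table and column names are examples, registering the same query twice is omitted for brevity, and exact class locations may vary per version):

```java
// A sketch of the jBPM 6.4 advanced query API: register a SQL-backed query
// definition, then run it with filter parameters and a result mapper.
import java.util.List;
import org.jbpm.kie.services.impl.query.SqlQueryDefinition;
import org.jbpm.kie.services.impl.query.mapper.ProcessInstanceQueryMapper;
import org.jbpm.services.api.model.ProcessInstanceDesc;
import org.jbpm.services.api.query.QueryService;
import org.jbpm.services.api.query.model.QueryParam;
import org.kie.api.runtime.query.QueryContext;

public class ActiveInstanceQuery {

    public List<ProcessInstanceDesc> findActive(QueryService queryService) {
        // Register a query definition backed by a plain SQL expression
        // (the datasource JNDI name is an example).
        SqlQueryDefinition query =
                new SqlQueryDefinition("getAllProcessInstances", "java:jboss/datasources/ExampleDS");
        query.setExpression("select * from ProcessInstanceLog");
        queryService.registerQuery(query);

        // Run it with a filter (status 1 = active) and a mapper that turns
        // the rows into ProcessInstanceDesc objects.
        return queryService.query("getAllProcessInstances",
                ProcessInstanceQueryMapper.get(), new QueryContext(),
                QueryParam.equalsTo("status", 1));
    }
}
```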

Workbench

The entire workbench was updated with a new look and feel (based on Bootstrap 3 and PatternFly), making it look much cleaner. We hope you like it!


Translations for Russian and Traditional Chinese were also added.

Other improvements include:
 - duplicate version detection for projects
 - ability to disable automatic building of projects

User and group management UI

You can now manage your users and groups inside the workbench, using a new perspective to manage users and groups (and assign users to groups).  It is based on a pluggable implementation that supports different kinds of authentication/security domains, configurable in the application server.



New execution server management UI

The execution server UI (to create and manage deployments to one or multiple kie-server instances) has been redesigned.



Runtime Console

The task list can now also visualize task data, configurable as additional columns in the task table (similar to what was provided for process instances in the 6.3 release).



Designer

Our web-based process designer includes a new 'Process Documentation' panel that gives an overview of the most important information for each of the nodes in the process (details that aren't typically visible in the diagram) and can be printed (to paper or PDF) for documentation purposes.



Dashboards

New process and task dashboards are included in the workbench, showing all your important runtime information.


Thursday, November 12, 2015

Process-driven applications at Devoxx 2015

I just finished my presentation at Devoxx this year, on process-driven applications.  The conference is a lot of fun, with a big Red Hat presence, talking about JBoss Middleware, OpenShift, Red Hat Developer program and a lot more.



It wasn't the regular presentation where we try to showcase all capabilities and list all the features of our project (as most people probably have seen one of those at some point already), but I wanted to focus on something special instead.

The project has evolved significantly in the last few years, and I believe we have now reached a point where we have a lot of building blocks in place to help you develop your application.

Rather than focusing on the technology, process-driven application development starts from a different goal: building something customized to what you need. By taking advantage of the workbench, you can build and execute your processes as usual, but rather than relying on the generic tooling we provide, you have access to all the data and features we offer out-of-the-box and can combine them in a customized way.

In the demo, I built out a small expenses process, and a custom screen (using AngularJS) that can list my current expenses and create new ones just the way I want to.  You can even add some small dashboards to keep track of the number of open expense reports or a quick overview of how many expense reports you submitted in the last year and when.

I also showed how to support more flexible and adaptive cases, where you want to give the end user the capability to make decisions or to dynamically add new tasks (all the way to the extreme where you don't define anything upfront but start a new ad-hoc case).  And obviously you can combine both, creating a custom application to drive your patient cases:


My slides are available here.  The presentation itself was recorded and is available (for free) as well: