Friday, September 22, 2017

Watch Drools, jBPM and Optaplanner Day LIVE (Sept 26)

We will be streaming all the sessions of the Drools, jBPM and Optaplanner day in New York on September 26th 2017 LIVE !  Check the full agenda here.

Tuesday, August 8, 2017

jBPM 7.1 available

Since we have moved to a more agile delivery with monthly community releases, we are happy to announce the availability of jBPM 7.1.0.Final.

You can find all information here:
Release notes
Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

The focus of this release has been mostly on improving the experience and the capabilities for process and task administrators.  These admins keep an eye on your infrastructure, making sure the execution of all processes in your system is in good health, and resolve any issues that might show up.

To make it easier for these process and task administrators to do their work, we have added a bunch of improvements and new features for them:
  • Error management: Errors that happen during the execution of your processes (or tasks, jobs, etc.) are now better detected and stored.  This could for example be an (unhandled) exception during the execution of your process instance or a job that has hit its retry limit.  
    • At the core engine level, errors are stored in the database and can be acknowledged.  Note that the engine will always guarantee a consistent state for all your process instances, so when an exception like this happens, the engine is rolled back to the last known state and the error is logged.
    • Through the web-based console, process admins can take a detailed look at any exception that might have happened via the new Execution errors view, acknowledge errors and, if possible, take action to resolve the issue.
    • The process instance list has been extended with a new column to show any errors possibly related to that instance.

  • Quick filters: Searching for information about specific process instances, tasks, jobs or errors is now easier thanks to a new search tab where you can find the data you need by adding quick filters (for example related to the state of your process instances, the time they were started, the name of the task, etc.)
  • Navigation: new actions have been added to the process instance, task, jobs and errors views to more easily navigate between them where appropriate.  For example, you can navigate to the errors associated with a specific process instance (if any) or take a look at the process instance associated with a specific task or job.
  • Task admin view: the task view that was included in previous versions has been split into two separate views:
    • Tasks: Aims to be used by task operators / end users to work on tasks assigned (or potentially assigned) to them
    • Task administration: Designed to be used by administrators, to manage tasks belonging to other users. This perspective is only available for users with roles admin and process-admin. It is similar to the former "Admin" filter tab on the former tasks perspective.
  • Project and team metrics
    • A brand new dashboard is now available for every project listed in the authoring library. After opening the project details page, a metrics card shows up on the right side of the screen. The card shows the history of contributions (commits) made to that specific project over time. Clicking on the View All link gives access to the full dashboard, which shows several metrics about the project’s contributions.

    • A brand new dashboard has also been added to the Teams page. A metrics card on the right side shows the history of all contributions (commits). Clicking on the View All link gives access to a full dashboard showing overall contributions metrics. 
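To make the error management flow above a bit more concrete, here is a toy sketch of the "store, list, acknowledge" cycle.  All names here are hypothetical illustrations, not the actual jBPM API: the real engine persists errors in the database and exposes them through the admin services and the Execution errors view.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch (hypothetical names, not the jBPM API): errors raised during
// process / task / job execution are persisted and can later be acknowledged.
public class ExecutionErrorStore {
    public static class ExecutionError {
        public final long processInstanceId;
        public final String message;
        public boolean acknowledged = false;
        public ExecutionError(long processInstanceId, String message) {
            this.processInstanceId = processInstanceId;
            this.message = message;
        }
    }

    private final List<ExecutionError> errors = new ArrayList<>();

    // Called by the engine when e.g. an unhandled exception rolls back an instance.
    public void report(long processInstanceId, String message) {
        errors.add(new ExecutionError(processInstanceId, message));
    }

    // The "Execution errors" view lists unacknowledged errors like this.
    public List<ExecutionError> unacknowledged() {
        List<ExecutionError> result = new ArrayList<>();
        for (ExecutionError e : errors) {
            if (!e.acknowledged) result.add(e);
        }
        return result;
    }

    // An admin marks the errors of an instance as handled.
    public void acknowledge(long processInstanceId) {
        for (ExecutionError e : errors) {
            if (e.processInstanceId == processInstanceId) e.acknowledged = true;
        }
    }
}
```

The key point is that acknowledging does not delete anything: the error history stays available, only the "needs attention" list shrinks.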

More detail can be found in the full release notes.  To our process and task administrators especially: enjoy !

Drools, jBPM and Optaplanner are switching to agile delivery!

Edson recently blogged about how Drools, jBPM and Optaplanner are moving towards a more agile delivery.  The goal is to be able to release new features much quicker and more often to the community, by having monthly community releases.

Since this obviously has an impact on our entire community (hopefully overall a positive one of course ;)), we wanted to highlight some of the most important consequences as well:
  • More frequent releases give the community earlier access to new features
  • Reducing the scope of each release allows us to do more predictable releases
  • Since bug fixes are included in each release as usual, users will be able to pick those up quicker too
As a result, starting with v7.0 a few weeks ago, you should see releases more often now.  It does mean that each individual release will be smaller in size.  But overall we believe we will be able to deliver new features and fixes faster and more predictably !

Feel free to take a look at Edson's blog for a few more details.

Tuesday, August 1, 2017

Drools, jBPM and Optaplanner Day: September 26 / 28, 2017 (NY / Washington)

Red Hat is organizing a Drools, jBPM and Optaplanner Day in New York and Washington later this year to show how business experts and citizen developers can use business processes, decisions and other models to develop modern business applications.
This free full day event will focus on some key aspects and several of the community experts will be there to showcase some of the more recent enhancements, for example:
  • Using the DMN standard (Decision Model and Notation) to define and execute decisions
  • Moving from traditional business processes to more flexible and dynamic case management
  • The rise of cloud for modeling, execution and monitoring
The event targets IT executives, architects, software developers, and business analysts who want to learn about the latest open source, low-code application development technologies.

A detailed agenda and list of speakers can be found on each of the event pages.

Places are limited, so make sure to register asap !

Tuesday, July 4, 2017

Take a look at jBPM 7.0

It's been a while since we released a new major version of the jBPM project, but I'm happy to announce that jBPM 7.0.0.Final is now available.
For those not yet familiar with our project, jBPM is a completely open-source Business Process Management (BPM) and case management solution.  It supports the full life cycle of processes and cases, from authoring tools through execution all the way to monitoring and management.
For the readers that don't have too much time to read all of the details below, some of the major new features include:
  • Case management capabilities
  • New simplified authoring experience for creating projects
  • Business dashboards
  • Process and task admin api
  • Process and task console can connect to any (set of) execution server(s)
  • Preview of a new process designer and form modeler
  • A new security management UI
  • Upgrades to Java8, WildFly 10, EAP 7, etc.
You can find all information here:
Release notes

Ready to give it a try but not sure how to start?  Take a look at the jbpm-installer chapter.

A quick introduction to some of the most important features is available below.

Case management 

Case management has been a hot topic in the BPM world for a few years now (and maybe even longer under terms like flexible and adaptive processes etc.).   Case management use cases differ from more traditional business processes since they (typically) require more flexibility and support more unstructured and unplanned work. Rather than following a nicely predefined plan from start to finish, actions are decided more ad hoc, what to do next is based more on the data associated with the case, and the end user needs to be given the flexibility to decide what to do next (although recommendations are welcome).
Ever since v5, our core engine has had a lot of advanced features to support more flexible and adaptive use cases. While we did introduce some case management building blocks in v6 already, v7 comes with much more extensive support for case management use cases:
  • Core engine: extended to support more advanced features like case file, ad hoc and dynamic work, stages and milestones, case roles, etc.  All these features are available through the remote API as well.
  • The web-based authoring environment has been extended to support defining your own cases, with a case project wizard, additional case building blocks and properties in the process editor, etc.
  • A new web-based case management UI that showcases how you can use the latest features and manage cases.  This UI is built from a number of independent UI building blocks that you can use in your own application as well.
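To illustrate the data-driven nature of case management, here is a toy sketch of the case file + milestone idea.  The names and the condition are hypothetical illustrations (not the jBPM case API): the point is that a milestone completes as soon as its condition on the case file data holds, rather than at a fixed point in a sequential flow.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Toy sketch (hypothetical names, not the jBPM case API): the case file is the
// shared data of the case, and milestones are completed based on that data.
public class CaseSketch {
    private final Map<String, Object> caseFile = new HashMap<>();
    private final Set<String> completedMilestones = new HashSet<>();

    public void putData(String key, Object value) {
        caseFile.put(key, value);
        // Data-driven: re-evaluate milestone conditions on every case file change.
        if (Boolean.TRUE.equals(caseFile.get("documentsVerified"))
                && Boolean.TRUE.equals(caseFile.get("paymentReceived"))) {
            completedMilestones.add("Order ready to ship");
        }
    }

    public boolean isCompleted(String milestone) {
        return completedMilestones.contains(milestone);
    }
}
```

Contrast this with a traditional process, where "ready to ship" would be a node you reach only after traversing a fixed sequence of activities.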

New authoring experience

The experience you get when you open the workbench for the first time, create a new project (or import an example one) and create your first processes, data models and forms has been updated significantly.

Business dashboards

While it was already possible to create your own dashboards in v6 using the (separate) dashbuilder application, dashbuilder has been refactored completely to better align with the workbench technology.  It is now possible to do all of this from within the workbench, and to integrate it in your own applications as well.

Process and task admin api

A new API has been introduced that includes powerful capabilities for process and task administrators. The process admin API allows you to:
  • get all process definition nodes
  • cancel node instance
  • retrigger node instance
  • update timer (absolute or relative)
  • list timer instances
  • trigger node
The task admin API allows you to:
  • add/remove potential owners, excluded owners and business admins
  • add/remove task inputs and outputs
  • list/create/cancel escalations and notifications
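As a small illustration of what two of the process admin operations amount to, here is a toy in-memory sketch of cancelling and retriggering a node instance.  The class and method names are hypothetical (the real operations live in the jBPM admin services and are also exposed remotely); it only shows the semantics: retriggering replaces a stuck node instance with a fresh one for the same node.

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch (hypothetical, not the real jBPM admin services) of the
// "cancel node instance" and "retrigger node instance" operations above.
public class ProcessAdminSketch {
    public static class NodeInstance {
        public final long id;
        public final String nodeName;
        public NodeInstance(long id, String nodeName) {
            this.id = id;
            this.nodeName = nodeName;
        }
    }

    private final Map<Long, NodeInstance> active = new HashMap<>();
    private long nextId = 1;

    // Trigger a node: a new active node instance appears.
    public long trigger(String nodeName) {
        long id = nextId++;
        active.put(id, new NodeInstance(id, nodeName));
        return id;
    }

    // Cancel: the node instance is simply removed from the active set.
    public void cancelNodeInstance(long nodeInstanceId) {
        active.remove(nodeInstanceId);
    }

    // Retrigger: cancel the stuck instance and start a fresh one for the same node.
    public long retriggerNodeInstance(long nodeInstanceId) {
        NodeInstance ni = active.get(nodeInstanceId);
        if (ni == null) throw new IllegalArgumentException("unknown node instance");
        cancelNodeInstance(nodeInstanceId);
        return trigger(ni.nodeName);
    }

    public int activeCount() { return active.size(); }
}
```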
Process and task console separate from execution server

Our web-based management and monitoring console used an embedded execution server in v6 to execute all process and task operations.  We also offered a standalone process execution server.  In v7 the monitoring console is a UI front-end only: all requests for process and task data, and operations on them, are delegated to a standalone execution server.   The main advantage is that the console can now 'connect' to basically any (set of) execution servers out there.  

When multiple independent kie-servers are used, you can either connect to a specific one or use the smart router to aggregate information across multiple servers:
  • requests can be sent to the smart router, which will figure out which of the known kie-server instances the request should be sent to
  • when trying to retrieve information, the smart router can collect information from different servers and aggregate that information for you
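Conceptually, the aggregation part of the router boils down to something like the following toy sketch (hypothetical code, not the actual smart router): collect process-instance pages from several kie-servers and merge them into one list, here sorted newest first the way the console's process instance list shows them.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of the smart router's aggregation (hypothetical, not the real router).
public class SmartRouterSketch {
    public static class ProcessInstance {
        public final String serverId;
        public final long startTime;
        public ProcessInstance(String serverId, long startTime) {
            this.serverId = serverId;
            this.startTime = startTime;
        }
    }

    // Merge the per-server result lists and sort newest first.
    public static List<ProcessInstance> aggregate(List<List<ProcessInstance>> perServer) {
        List<ProcessInstance> all = new ArrayList<>();
        for (List<ProcessInstance> page : perServer) all.addAll(page);
        all.sort((a, b) -> Long.compare(b.startTime, a.startTime));
        return all;
    }
}
```

A real router additionally has to deal with paging, sorting keys other than start time, and servers that are temporarily unreachable; the sketch only shows the merge idea.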
Preview of new form modeler

The form modeler has been upgraded significantly as well.  The new form layout system (based on the Bootstrap Grid system) allows more advanced and flexible layouts, new widgets, generation of forms, a Java-based file format and much more.  We will do a few more feature enhancements before we remove the old form modeler in one of the future minor releases.

Preview of a new process designer

We are working on a completely new web-based process designer, and this release introduces an early preview (which only supports a small subset of the full feature set).  The idea is to move away from a very developer-focused UI and introduce an easier to use interface for different kinds of users.  Properties behave much more like advanced forms (rather than a table of key-value pairs) and the user is assisted wherever possible (using content assist, etc.).

Currently it is still recommended to use the existing designer for modeling your business processes (since the capabilities of the new one are still limited) but feel free to give it a try and let us know what you think.

A new security management UI

While it was already possible to define users and groups (and their relationship), a new security management UI allows you to define permissions for all of these users (or groups).  You can control who can use which part of the UI, but also which projects users have access to, etc.

Decision Model and Notation (DMN)

Drools has introduced support for the DMN standard, and since jBPM integrates closely with Drools for rule execution, you can now trigger DMN rules from a business rule task.

Other features
  • Minimum Java version was upgraded to Java8
  • Support for WildFly 10 and EAP7
  • New preferences page
  • Data source management
Please take a look at the full release notes for more details. jBPM is integrated closely with the Drools and Optaplanner projects (for business rules and constraint optimization respectively), so take a look at the Drools and Optaplanner release announcements for more details on new features that you can use in combination with your processes as well !

Thursday, April 27, 2017

BPM at Red Hat Summit 2017 !

Red Hat Summit is happening next week and I'm very excited to be able to join again! As I blogged before, there are going to be a ton of presentations related to BPM, BRMS and beyond.  And this year we are joining with a pretty big team, so I expect a lot of interesting side conversations as well.

Apart from the presentations, there are going to be a lot of other interesting things as well, like training, workshops, Birds of a Feather sessions, etc.  In the exhibition area, there are going to be numerous booths where you will be able to talk to us (and other experts) or come and ask questions. The Red Hat Summit Party is even going to be at a Red Sox game at Fenway Park (they are playing the Baltimore Orioles)!

If you want to meet up (with any of us), during the day or in the evening, feel free to let me know !  Or just come to one of my presentations and ask questions there :-)

Thursday, April 20, 2017

bpmNEXT 2017 (part 5)

The last half day of bpmNEXT is starting. Today I will be up as the second presenter of the day, doing my demo on our case modeler and continuous task optimization.

Making business processes dance to the user's tune
Paul Holmes-Higgin - Flowable Project

Paul introduced Flowable, a recent fork of Activiti (which most of that engineering team followed), and how it can be used for case management thanks to the dynamic modification capabilities that are coming in v6. In the example he showed, he dynamically added a new (not predefined) task or even a complete new process into a running instance. 

Supporting unstructured work
Kris Verlaenen - Red Hat

I presented on some of the challenges our customers are seeing in the context of unstructured work.  Firstly, modeling unstructured work confronts us with the limitations of existing standards like BPMN2, so we are presenting a higher-level model where the work is modeled as a number of stages, each containing any number of activities.  It is however a visualization layer, and developers can still rely on the capabilities of the underlying specification for the execution semantics.  Secondly, since unstructured work is typically unplanned, we are using continuous optimization (using OptaPlanner) for task assignment, so we can help users work on the most important tasks first, taking into account users' constraints and preferences.

As an intermezzo, Lloyd presented on the Business Architecture Meta-Model, where they are trying to link a lot of the concepts that are used at different levels, linking Business Architecture to BPM concepts, as we can all agree that using a term like activity or process can be very ambiguous.  It's an OMG RFP, and Lloyd invited everyone who might be interested to take a look.

Digital strategy deployment using business capabilities
Denis Gagne - Trisotech

Denis presented the Trisotech Digital Enterprise Suite.  They are combining models and concepts at different levels (like strategies and capabilities, all the way to BPMN, DMN and CMMN models, etc.), and linking them so the relationships between all of these can be traced.   In the demo he showed an example where you could see how capabilities were implemented using processes (BPMN2) and decision logic (DMN).  Their landscaping tool (as they call it) can be used for brainstorming and collecting ideas, from free flow to based on existing models / approaches.  These ad-hoc models (the result of the brainstorm) can later be linked to future models, all the way up to the implementation.  Because models are connected, they can be used to get executive overviews (for example related to maturity, performance, technical debt, etc.).  All of this information goes into one big digital enterprise graph.

With that, presentations are done.  Time to wrap up and start preparing for next year !

bpmNEXT 2017 (part 4)

Through a lens Starkly: transforming data into business information
E. Scott Menter - BPLogix

BPLogix is virtualizing data (possibly coming from a lot of different data sources) and aggregating it to give business users visibility into what is going on.  The data flow analyzer allows you to drill down into your data; you can for example see how a chart combines different queries to visualize key indicators (without knowing the technical details of the data).  Their tool allows users to define processes as a combination of activities with eligibility criteria (as they presented last year).  Combining this with their data analyzer allows you to clearly visualize what happened and on what information it was based (and to drill down further into where that data came from, etc.).

Process modeling and metrics: the next generation
Max Young - Capital Labs

Capital Labs is adding a third dimension to processes.  When performing simulation, it is critical that it matches reality.  Their tool, called BPM Scout, allows you to import processes from different tools and then perform visual simulations on top of those.  But it allows multiple (3D) layers, where a process on the top layer can trigger for example rules on a different layer, etc.  Users can define their own advanced KPIs so you can show the value that the customer can expect.  It also interacts with management tools and allows you to generate full documentation (including simulation information) for your process.  And it allows you to export all your processes to, for example, IBM Blueworks (even with an application generated for you to start the process).

BPM with humans in the age of digital transformation
Francois Bonnet - ITESOFT W4

Francois is showing their tool, which tries to assist humans in their job.  At process definition time, it can not only validate your BPMN processes, but also suggest to the user what the solution might be (for example adding missing elements).  The process can also be simulated step by step.  Simulation can even compare the execution of a specific instance to the plan (Gantt chart), showing whether you are ahead of or behind schedule.  Simulation supports concepts like timers and signal events etc.

Taking BPMN to infinity and beyond
Jakob Freund - Camunda
Camunda is presenting their next generation "Big Workflow" engine.  They have written a new version of their workflow engine with the idea of being able to scale it infinitely.  It is using publish-subscribe at its core: rather than storing the state of an instance as a row in a traditional database, they are generating and storing events instead of updating that one row.  The events are written to a log file on the file system, and in a distributed setting it is replicated. This allows them to scale to 100x the throughput of their traditional engine.
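The event-sourcing idea described above can be sketched in a few lines.  This is a toy illustration (hypothetical event names, nothing to do with Camunda's actual engine): instead of mutating one row per instance, you append immutable events to a log and fold over them to recover the current state.

```java
import java.util.List;

// Toy sketch of event sourcing: state is derived by replaying the event log.
public class EventLogSketch {
    public static String currentState(List<String> events) {
        String state = "NONE";
        for (String event : events) {
            switch (event) {
                case "INSTANCE_CREATED":   state = "ACTIVE";    break;
                case "TASK_COMPLETED":     state = "ACTIVE";    break;
                case "INSTANCE_COMPLETED": state = "COMPLETED"; break;
            }
        }
        return state;
    }
}
```

Because the log is append-only, it can be replicated cheaply across nodes, which is what makes the distributed scaling story possible.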

Getting to know your users with Brazos CX insights
Scott Francis and Ivan Kornienko - BP3 Global

BP3 is presenting Brazos CX insights to help developers improve the usability of their applications for end users.  The tool is continuously measuring your application (load time, how the application is used, form validation errors, etc.).  Analysis shows you how much time users spend in different parts of your application, how much each area of your application is used (or not used), what the common validation errors are, etc.  And if necessary you can drill down even further to get more detailed information, all the way down to a timeline visualization of an individual session, or get average data for a large number of sessions.  Obviously all this information should be used to improve the usability of your application.

Cognitive customer service
Pramod Sachdeva - Princeton Blue

Princeton Blue is presenting their cognitive customer service solution.  It is monitoring customer interactions (from different possible resources like calls, emails, etc.).  Rules can be used to proactively monitor this data and create escalations, that can then be handled appropriately (for example through a process).  During this escalation, it can actually recommend what the most appropriate action might be.  Reports can be generated to slice-and-dice all this information in different ways (based on topic, customer type, sentiment, etc.).

Wednesday, April 19, 2017

bpmNEXT 2017 (part 3)

Starting the second day at bpmNEXT, where Edson is kicking off (with Bruce) on the DMN execution engine we've been working on in the Drools project.

An executable DMN solution for business users
Bruce Silver - Method and Style, Edson Tirelli - Red Hat

Bruce and Edson are showing the first complete implementation of DMN. It's a collaboration of multiple companies, where Trisotech is providing a DMN modeler, Red Hat has a completely open-source DMN execution engine (as part of the Drools project - the first and currently the only implementation passing the full TCK) and Method and Style is offering a methodology and guardrails around it.
Bruce did a demo where he showed a decision table to define some decision logic and validation that will help you find potential issues with the table. Using a pre-qualification of a loan example, he showed a DRD to help find the interest rate of an applicant.  After executing the rule, the decision can be visualized by annotating it on top of the DRD diagram.
Edson zoomed in on various topics: how validation is done at several levels, an example using advanced expressions depending on level 3 DMN support, how to extend the language with a custom function and different ways of execution (from embedded to a REST service in the cloud).
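To give a feel for the kind of decision logic in the demo, here is a toy sketch of a loan interest-rate decision expressed as code.  The thresholds and rates are made-up illustrations (not the actual model): each `if` mirrors one rule row of a DMN decision table, with the condition on the input (credit score) and the output (rate).

```java
// Toy sketch of a DMN-style decision table: credit score -> interest rate.
// All numbers are hypothetical, purely for illustration.
public class InterestRateDecision {
    public static double rate(int creditScore) {
        if (creditScore >= 750) return 3.5;   // rule 1: excellent credit
        if (creditScore >= 650) return 4.25;  // rule 2: good credit
        return 6.0;                           // rule 3: default for lower scores
    }
}
```

In DMN the same logic would be a declarative table with FEEL expressions, so business users can read and validate it without touching code; this sketch only shows the underlying semantics.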

Boost business process agility with DMN
Eduardo Chiocconi - Oracle

Oracle is showing their cloud-based DMN service that allows you to create and publish a DMN decision service.  Using an expense approval process, they showed how to first model the decision using DMN and then integrate the resulting decision service into a process. This is targeting business professionals rather than IT personnel.  By adding a decision service to the process (it is aware of which decision services are available) and mapping the inputs and outputs (in a graphical way, avoiding any scripting), the DMN service can be integrated in the process. By extracting the decision logic from the process itself, it has been given a separate life cycle and can be updated dynamically.

Making the standard real: the DMN TCK
Keith Swenson - Fujitsu

Keith presented the DMN TCK.  It's a set of DMN models (focusing on the most important use cases), input data sets and expected results (using an XML format for both).  Once a runner is provided (that is able to invoke the implementation of a particular vendor), it produces a CSV with the results. DMN supports multiple levels of compliance, where level 3 includes full FEEL support.  There is currently only a limited set of test cases, but Keith is asking everyone to submit their own test cases to extend the TCK.  Great results in one year, given that this basically started from a discussion at his bpmNEXT presentation last year.

Decision enabled robotic process automation
Larry Goldberg - Sapiens DECISION

Larry presented a use case where processes, decisions and robots were combined.  Data being sent in is first sent to a decision service (which he called the brain) to determine which robots (using RPA) need to be triggered to collect additional information from back-end systems.  Further validations by the brain and/or manual checks continue until the brain is satisfied.  This allowed the company to increase the number of transactions they were handling while at the same time reducing the number of FTEs required to perform the work (as a lot of the data collection was automated using RPA). Larry showed how Sapiens DECISION can be used to define the underlying processes and decisions.

Accelerating digital transformation with an Open Cloud Platform
Harald Schubert - SAP

SAP presented their Cloud Platform, and more specifically the Workflow service in there.  Harald showed how to start a new process from scratch, with a simple process that included a human task.  When starting an instance of this process through their UI, this task obviously ended up in the task inbox. Internally they are using the open-source Activiti engine for execution.  SAP Cloud Platform comes with a lot of out-of-the-box platform services that can be integrated into the process, like for example a gamification service (and associated UI).  This service was called through the SAP Integration layer (as the process currently only supports REST calls).

bpmNEXT 2017 (part 2)

This afternoon the demos are starting, where the conference is again using the Ignite format: every presenter first has a 5-minute presentation (15s per slide) followed by 20min of live demo.  This format is used to force presenters to focus on the demo itself.

Creating a Digital Workforce with Robotic Process Automation
Anthony Yung - Kofax

Kofax is showing their Robotic Process Automation (RPA) solution.  As a use case he showed a "Customer due diligence process" (also called Know Your Customer) where a custom application is used to collect the necessary information about a customer and then a "robot" is used to analyze some of that information, for example doing a Google search on this customer and analyzing the results.
He showed the Kofax designer where the Google search was defined as a number of manual steps (open the Google page, put in the query, perform the query, collect the results from the results page, etc.), without requiring any scripting.  This "robot" was then exposed as a service (available through REST) so it can be reused.  These robots can then be called from your business processes as well.

BPM with Blockchain
Miguel Valdes-Faura and Lionel Palachin - Bonitasoft

Bonitasoft integrated their BPM platform with blockchain, mainly to achieve the following advantages: allowing multiple partners to trust the common process, improving customer engagement and providing end-to-end traceability. The use case they showed was a car order management process, where a car is being sold to a customer (including payment and notification).  The car is modeled as a blockchain asset, and they implemented several connectors to interact with the blockchain from the process, for example to create a transaction, etc.
They made a case for integrating BPM and blockchain in both directions: having your processes interact with blockchain, but also having companies that build on top of blockchain use BPM to offer custom applications for their use case.

Real-time Process Deviance Monitoring
Michal Rosik - Minit

Minit is using process mining techniques for deviance monitoring (i.e. searching for abnormal behavior at runtime). Their tool allows you to look at collected data (where it's not a requirement that the use case is already modeled as a process) from different perspectives, like frequency (how many times an activity is executed), time (which activity is causing delays), financial (what's the cost associated with each activity), etc.  The mined process diagram is annotated graphically with the relevant information.  By defining which variants are acceptable, at runtime they can monitor for any deviations using dashboards that show runtime information and allow you to drill down in case deviations are detected.  It is also applicable in the context of IoT, where a much larger number of events is typically expected.
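The "frequency" perspective mentioned above can be sketched very simply.  This is a toy illustration (hypothetical code, not Minit's product): given an event log of executed activities, count how often each one occurs; the mining tool then annotates the process diagram with exactly these counts.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy sketch of the frequency perspective of process mining:
// count activity occurrences in an event log.
public class MiningSketch {
    public static Map<String, Integer> frequency(List<String> eventLog) {
        Map<String, Integer> counts = new HashMap<>();
        for (String activity : eventLog) {
            counts.merge(activity, 1, Integer::sum); // increment the activity's count
        }
        return counts;
    }
}
```

The time and financial perspectives work the same way, folding durations or costs per activity instead of counts.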

Analytics for leveraging BPM assessment and management action
Jude Chagas Pereira, Frank Kowalkowski, Gil Laware - IYCON

Afterspyre uses analytics to look at data (pulled in from different data sources) to help analysts get better insights and make better decisions. For example, it can look at existing BPM data and help decide which processes are the best candidates for optimization, etc.  It can also compare two different processes to detect how similar they are, or rank them based on different attributes.  Charts present this information at a higher level for managers to consume.  It also supports semantic analysis of text and keywords used in processes.

The recipe for successful digital transformation
Derek Miers - Structure Talent, MWD Advisors

Derek made the case that digital transformation isn't just process + technology & stir.  BPM isn't the silver bullet that will solve everything.  You have to engage your audience to get there.  Business transformation requires you to rethink and change everything you do (up to the entire structure of your company) and it needs to be customer-centric.
He presented a framework for business transformation that starts from understanding your customer's experience.  He mentioned that BPM sometimes seems to focus much more on improving existing processes rather than helping people rethink them and co-create their future. Sometimes you need to redesign outside-in.  And BPM vendors should think about how they can help their customers do that (rather than optimizing processes they don't need).

Now we're off to the roof top for some drinks and dinner !

Tuesday, April 18, 2017

bpmNEXT 2017 (part 1)

Back in Santa Barbara this year to attend the bpmNEXT conference, where I will be speaking on Thursday.  But before that happens, we have a full 3 days of presentations and (even more important) demos from a lot of different vendors and experts.

BPM 2017-2021: Outlook for the Next Five Years
Nathaniel Palmer

Nathaniel started with his view of the BPM market. In 2016, he predicted that the 3 R's (Robots, Rules and Relationships) would be defining the BPM market.  It's clear that rules have a significant impact on BPM nowadays as a way to drive decisions (for example with DMN). According to Google Trends, Robotic Process Automation (RPA) is gaining more attention as well.  The interest in business processes is pretty stable (where BPMN is kinda following the same trend, just at a smaller scale). Digital Transformation is a term that has grown and follows a similar trend, although BPM is still the overarching term that combines all of this.
Automation is forcing us to step away from the traditional architecture associated with BPM.  The future of BPMS vendors isn't just about process management but also includes automation, machine learning and decision management, all driven by an event-driven foundation.
Rather than predicting where BPM is going, he suggested we would all work towards defining it ourselves.  Or as he said, let's all "make automation great again"!

The Top 10 Technologies that will impact BPM in the next 5 years
Jim Sinur - Aragon Research

Jim, with his 50 (!) years of experience in IT, and many years of experience as analyst for BPM (for Gartner and Aragon Research nowadays), highlighted a few technologies he believes are going to be more and more important, including for example:
  • Predictive apps get smarter (decision management is key)
  • Big data and learning (using machine learning, deep learning and cognitive computing)
  • Internet of things (standardization kicking off there now - resulting in a lot of smart devices at the edges and more goal-driven decentralized management)
  • Rise of chatbots (moving to full language and action)
  • Virtual Reality
  • Work hubs (workbenches focused on specific roles)
  • Drones
  • Blockchain
Jim believes the Digital Business Platform (DBP) is what combines (or will combine) all these technologies (kinda disagreeing with Nathaniel there) as a place where business and IT collaborate.  Things like digital identity (including your preferences) and change management (across technologies) will be key.

The New Wave of Automation
Neil Ward-Dutton - MWD Advisors

Neil explained how a major shift in our experience of automation is underway.  Traditionally, we have been trained to work around limitations of automation (we are for example all used to pushing keys on computers the entire day), however that is changing, where automation is now changing for us.  Neil introduced 3 layers of change, called the 3 I's: Interaction (sending and responding more like humans - like chatbots), Insight (interactive analytics - like recommendation engines) and Integration (resources being exposed with open interfaces - like smart infrastructure).  Main drivers from his point of view are the rapidly changing technology, business pressures and familiarity with automation.  Use cases range from automating high-volume routine tasks to low-volume expert assistance and everything in between (make everyone as good as the best).
He concluded with some guidance for the audience (if we want to help define the future): embrace the shift to self-service, the shift to networked (cloud-based) platforms and the shift to learning systems.

The Great Migration: How to survive the leap from BPM as we knew it to the era of the digital workforce
Clay Richardson - Digital FastForward

Clay believes 75% of the current BPM programs won't survive the shift to digital.  He is no longer a BPM analyst at Forrester, as he wanted to focus more on actually helping customers make the digital step (not just getting them excited). The challenge is not necessarily the technology but how to get (and keep) the right skills and mindset for digital transformation.  You will have to use approaches like hiring new talent, reinventing the workforce or outsourcing innovation (or all of the above). And help teams to design, validate and learn (using new methodologies and tools).  And it's not just about what you learn but how you learn: it needs to be interactive and immersive (learning gamification).  And put these people in front of the customer (even if it means leaving their comfort zone) so they learn about what it is they need.
Want to take this gamification to the extreme? Apparently you learn better when combining learning with escape rooms - even including the zombies - looking forward to that experience ! :)

Saturday, April 15, 2017

bpmNEXT and Red Hat Summit

We have a lot of interesting things in the works with Drools, jBPM and Optaplanner and we are going to show you! And what better opportunity to take a look under the hood at what is coming than joining us on a session, side talk or over a happy hour in the upcoming conferences?

Here is a short list of the sessions we have at two great conferences this month.  First of all, at bpmNEXT Edson will talk about DMN support and I will give a talk on unstructured cases.  

Red Hat Summit is 2 weeks later and there will be a larger number of presentations around BPM (and a lot more of course)! We hope to meet you there!

Oh, and check the bottom of this post for a discount code for the Red Hat Summit registration!

Santa Barbara, California April 18-20, 2017