BPMN 2.0 Extensions: Data, Content, Rules and Goals

We use a subset of BPMN 2.0 graphical modeling concepts for our adaptive process visualization in Papyrus ACM. To keep this post short and keep the general discussion about flowcharts product-independent, I moved that part to ‘The Problem with Flowcharts’. Here I focus on how we reduce the limitations of BPMN within the Papyrus Platform.

Flowcharts are like all models: THEY ARE WRONG, but some models are still useful. The usefulness of BPMN for adaptive processes lies in its artifact representation, pools, swimlanes and its extensibility. While we avoid mandating flowcharts, it makes sense to add data, content and rules while organizing the process around GOALS in BPMN, mostly because the notation is reasonably well known by BPM experts. Once all goals and related tasks and rules have been defined, the case can also be looked at in the BPMN flowchart, which is editable and graphically shows the actual state of the case execution. All the data, content, rules and GUI forms are displayed, as well as comment stickers, and can be directly edited by a skilled person. While we can export the BPMN formats in XPDL, there is no standard for the extensions. We can, however, export them in any XML format if so desired. The complete process with all elements is stored, version controlled and deployed within the domain network of the Papyrus Platform.

For the business user we provide a ‘Case Builder’ GUI where users can select GOALS from a library and drag them into a case. Incoming content events do not have to be predefined. Classification will assign the content/message to the right case and/or activity. Users can select or create new GOALS to deal with it. The goal orientation is a powerful ‘KEEP-IT-SIMPLE’ approach. The User-Trained Agent (process mining) will recommend adding a certain goal for a type of message if users have done this before in similar situations. Goals can contain goal-rules, and authorized business users can add or modify goal, global or local business rules using a Natural Language Rule (NLR) editor that shows all data in the context and verifies rule validity.
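To make the goal and goal-rule idea more concrete, here is a minimal Python sketch of a case assembled from a goal library. It is not the Papyrus Case Builder API; all class and field names are hypothetical, and the lambda merely stands in for what a natural-language rule would compile to.

```python
# A minimal sketch (not the Papyrus Case Builder API) of a goal-oriented case
# assembled from a goal library; all names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Goal:
    name: str
    rules: List[Callable[[Dict], bool]] = field(default_factory=list)  # goal-rules over case data

    def achieved(self, case_data: Dict) -> bool:
        # A goal is met when all of its rules evaluate to true in the case context.
        return all(rule(case_data) for rule in self.rules)

@dataclass
class Case:
    goals: List[Goal] = field(default_factory=list)

    def add_goal(self, goal: Goal) -> None:
        # Business users drag goals from a library into the running case.
        self.goals.append(goal)

    def open_goals(self, case_data: Dict) -> List[str]:
        return [g.name for g in self.goals if not g.achieved(case_data)]

# Example: a "Complaint resolved" goal whose rule mirrors what a natural-language
# rule such as "the customer has received a response within 5 days" could become.
complaint_goal = Goal(
    name="Complaint resolved",
    rules=[lambda d: d.get("response_sent") and d.get("days_open", 99) <= 5],
)

case = Case()
case.add_goal(complaint_goal)
print(case.open_goals({"response_sent": True, "days_open": 3}))  # -> []
```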

Creating or editing rules in the LIVE case with data context in any language.

Why extend BPMN 2.0 for business user presentation?

Despite the progress from 1.1, the enhanced and additional definitions are still prone to very ambiguous models. An ‘Activity’ can still represent any number of different functions, and the new event types lack detail on how they interact with the flow. There is still no modeling of artifact methods, attributes and states, and no business rules. The proposed UML-like data modeling is effectively non-existent. All inbound and outbound content, as well as GUI artefacts and rules, will still have to be handled outside the BPMN model. Hence, no model preservation, no round-trip, no usability by the business, and thus a lot of project management bureaucracy.

Because of the above, the BPMN 2.0 standard can only represent a small part of a complete process, and it is neither usable as-is for very dynamic processes, nor is it easy enough for business users to describe their processes with. Especially the interaction of various processes is extremely difficult to design and coordinate. Business users cannot create event-handling exceptions for intersecting processes. Therefore we moved the intersection away from sending and receiving events to asynchronous goals.


BPMN View for Process Goals in Swimlanes and Groups

In a standard BPMN model the only real-world entities are the acting agents (users), and their decisions to perform functions on artifacts cannot be modeled. BPMN 2.0, like all flowchart models, is STILL functionally blind to the inner workings of the major elements of a business process (content, data, context and related rules) and therefore to its decision logic. Because the data, content and rule functions have to be programmed, implementation requires long projects, and thus orthodox flowcharts reduce the agility of a business rather than improve it.

We extended BPMN 2.0 to support a bottom-up modeling approach, where knowledgeable or skilled actors assign real-world entities with state changes and rules to process goals that are linked to top-down business targets, producing a much more realistic and, most of all, adaptable model. Users find it much easier to interact with real-world entities and their states than with abstract flows. Rather than forcing agents to perform in a certain way, the system simply enforces the basic rules of the game and creates substantial transparency and therefore flexibility and adaptability. Process management has to offer complex real-world models of people acting as a social group on business entities. Socializing to define flowcharts still creates the above limitations.

While BPMN is not a tool for business people to design large, complex process networks, the representation can be helpful for understanding once the processes are organized around business goals and controlled mostly by rules rather than by gateway flow-logic.


ACM Empowerment or BPM Enforcement?

Some BPM consultants propose that processes are the most important corporate asset. I disagree, because a process is an abstract entity that produces no value. Value is defined by human interaction and perception in the real world. While abstract processes promise to make that human interaction more controllable, they ignore human nature and workplace psychology, much as socialism and communism do. These are idealistic concepts that fail in the real world of individual human agents. People are at their best when they feel that their individual contribution is valued. Therefore the idea of empowerment – making people responsible for their work – has been around for some time.

Why would you want to empower employees?

Empowerment is often misunderstood as decision-making authority for everyone about anything. Empowerment is about assigning authority, goals and means to the right people. Some attempts at empowerment have failed to show the hoped-for results because they followed the idea that all people are the same. The most important element of empowerment is the realization that people are essential, but that they have different skills and interests. Not clever and dumb, or lazy and hardworking, but just people in the wrong place. Empowerment is a better way of people management that enables a stronger customer focus.

I go with the 80/20 Pareto rule: 20% of people are responsible for 80% of results. But one cannot fire the other 80% of people, as the remaining ones would again split along the same ratio. It means that only 20% of people have the interest and capability to take responsibility. I see them as process owners (POs) who can be given goals to chase. 20% is actually enough, as the ideal team size is ten, which provides one team leader and one assistant/stand-in per team.

Empowerment requires two important elements: first, a people focus as above, and second, business and process transparency. Transparency is best achieved by a collaborative process support infrastructure such as provided by Adaptive Case Management. It won’t be your run-of-the-mill BPM/SOA software. Transparency enables monitoring of each team’s (business) goal achievement to verify whether goals are set sensibly and well understood. There are no flows for the process of a PO, but simply a hierarchy of goals and tasks, driven by states and events. POs are empowered to question goals and can decide to change the way their goals are achieved.

What does Papyrus ACM provide to empower people in a process-owning business?

  1. Forget flowcharts and focus on GOALS: A process flow focus requires bureaucracy for process improvements. Goal orientation enables the business to simply link all management hierarchies together and does not require additional metrics. The goal is the metric, and it must be directly linked to the perception of the customer.
  2. Define the process owners: Model the structure of process owners for all processes that you need to improve. Map out through which real-world deliverables they serve each other. Define the hierarchy of goals.
  3. Customer service goals: The goal-achievement (customer satisfaction) optimization loop must be the responsibility of that process owner. A normal flowchart cannot redesign and optimize itself.
  4. Real-time business data: The process owner needs real-time business data to measure his goal achievement and the authority to execute towards those goals. IT is essential here to add the transparency that the PO and the executive need.
  5. Define real-world entities only: Real world entities to be delivered to a customer according to cost and quality goals are plausible. Process flow is abstract and not understood.
  6. States – Events – Rules: Real-world entities that have plausible states and are linked together with rules create plausible process states and can easily be improved by adding new entities, new rules and new actors without redesign (see the sketch after this list).
  7. Do not fragment process execution: Most BPM analysts and consultants propose that you need multiple best-of-breed BPM products to cover all process needs. That is not sensible, because each process might change over time to require different process concepts. The process environment must allow new process concepts to be defined and used within the same infrastructure. Structured (20%), dynamic (60%) and ad-hoc social processes (20%) must all be handled in one system.
  8. Service and Support processes are the same: Deliverables in all process types are achievable with the same adaptive, goal oriented state/event/rule model. Using an adaptive process model reduces the need for support meta-processes such as change management dramatically. The functional change management of the processes must be part of the process platform and executable by the process owners.
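As referenced in item 6, here is a minimal Python sketch of how real-world entities with states, linked by rules, can drive process state without a predefined flow. It is an assumption-laden simplification with purely illustrative names, not the Papyrus state/event/rule engine.

```python
# A minimal sketch (not Papyrus itself) of item 6 above: real-world entities
# with states, linked by rules, so process state emerges from entity states
# instead of from a predefined flow. All names are illustrative.

from typing import Callable, Dict, List

class Entity:
    def __init__(self, name: str, state: str):
        self.name, self.state = name, state

class RuleEngine:
    def __init__(self):
        # Each rule inspects the entity states and may react to a change.
        self.rules: List[Callable[[Dict[str, Entity]], None]] = []

    def on_change(self, entities: Dict[str, Entity]) -> None:
        for rule in self.rules:
            rule(entities)

entities = {
    "order":   Entity("order", "received"),
    "invoice": Entity("invoice", "missing"),
}
engine = RuleEngine()

# Rule: once the order is approved, an invoice becomes due.
engine.rules.append(
    lambda e: setattr(e["invoice"], "state", "due")
    if e["order"].state == "approved" and e["invoice"].state == "missing" else None
)

# An event changes an entity state; rules propagate the consequences.
entities["order"].state = "approved"
engine.on_change(entities)
print(entities["invoice"].state)  # -> "due"
```

Adding a new entity, rule or actor simply extends this model; nothing has to be redrawn in a flowchart.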

Conclusion:

So why would empowerment work better than stick and carrot, known as reward and punishment, or strict quality monitoring? Each action in an empowered organization drives productivity forward, while rigidly planned organizations (hierarchically or not) waste bureaucratic energy on analysis and designs, policing procedures and reward/punishment systems. Not only that, but each controlling or monitoring action, and each reward will cause counterproductive forces in the organization.


Why Adaptive Case Management?

What keeps the managers, directors and executives of a business awake at night, tossing and turning? The long-range questions about where the economy is going and what the competition is or will be doing? Does anyone think they are wondering whether they should install BPM or ACM? Much more likely their twisted bedsheets are caused by the two related questions:

  1. Are we doing the right things?
  2. Are we doing things right?

One cannot separate these two questions and consider one or the other as more or less important. Yes, 1. ought to be asked first, with 2. right thereafter. Not to do so is similar to considering effectiveness and efficiency independently. That is unfortunately one of the major fallacies of business management today. Too often efficiency is the only consideration when poor effectiveness might be the cause of the lack of profitability. The efficiency and cost-cutting focus applied in too many BPM implementations would tie knowledge workers into straitjackets and kick them downhill, while bureaucrats debate their rolling speed in governance meetings. It still makes me wonder …

Doing the right things right can only be improved by one essential element of business management: TRANSPARENCY. Is transparency about management demanding information from the business units? No, transparency has to start from the top down. Executives first need to make their strategy, directors their targets, and managers (process owners) their goals transparent to those who execute. That consolidated view allows one to look at the things the business is doing and to discuss whether they are being done right. A business should even be as transparent as sensibly possible to its customers.

ACM provides real-time transparency and empowerment!

Information about how things are going only makes sense in one way: REAL-TIME! There is little benefit in knowing through business intelligence that things went wrong last year; one ought to know right at the moment they go wrong to enable steps to make them right! How can one ensure that things that go wrong are being corrected? The people with the right know-how need to be empowered to take ACTION. But which action will get things right? Who knows? Will it always be the same action that fixes an apparently similar problem? Can a rigid, flowcharted execution ensure that nothing goes wrong? Not likely, except for a few simple business activities like moving a box from one room to the other. Empowering the right people to do the right things right puts the organization onto auto-pilot. Empowerment is not about using Twitter, YouTube and iPhone apps but about authority, goals and means.

Knowing whether the business is doing the right things right for a certain outcome requires immediate feedback from the customer. Does that mean that the customer ought to be connected in real time to the customer-focused processes of your business? Yes, that is exactly what focused means.

At this point Adaptive Case Management comes into play. It is essentially empowerment technology that puts the customers and actors in the driver’s seat. No amount of social networking will improve flowcharted processes before, during or after things go wrong. ACM allows the business to selectively and securely empower all the people who do things and those for whom things are being done. ACM is about communication and process as ONE! ACM leaves the automation of the low-value, highly repetitive administration tasks to BPM, but it provides the platform for the high-value, unique, skill- and knowledge-intensive customer service processes. That is where customer loyalty is created and maintained.

But that is not the only place where ACM empowers, because it also interconnects the management layers and enables continuous innovation and optimization without ANY bureaucratic governance overhead. Reorganizing a business could become an exercise that executives and directors perform by rearranging tactical targets on their iPad.

Sounds like a dream? Yes, it does. Doesn’t it?


Can BPM encode knowledge?

There are BPM proponents who say that using structured processes should be seen as experience encoded into process flowcharts during analysis. I disagree, because a procedure that may have worked in the past is not goal oriented. Experience is actionable knowledge gained by an individual and is not something one can copy. Experience is applied by people at each individual activity on the way towards the goal. The legal requirement to document a process refers to the actual instance/enactment; it does not mean all processes have to be the same and follow the same outdated, encoded knowledge.

Can we be sure? Yes. The higher you go in the management hierarchy, the fewer predefined processes you will find, need and be able to work with. Otherwise, why would one need management and executives? What if executives themselves could lay out a capability map, create a set of goals, list the data entities they want information on, link it all to the content that describes their strategy, and pass the requests to the assigned process owners, who then assign activities to the experts, while everyone watches it happening in real time and makes changes as it happens? Now that is a sensible goal for technology, and it has been mine for over ten years.

Only businesses that have managed to use technology as an innovation enabler are shooting past those that control IT and/or processes through governance, centers of excellence and best practices. Each day a business does not innovate it falls behind, because the economy is a six-lane highway and the speed limits go up each year. If you stop to execute lengthy innovation processes to figure out whether you need to go straight or exit, you will get run over. Missing the right exit will cost you time and money. Businesses take thousands of those decisions each day, and the more of them are automated, the less a business considers its direction in relation to outside conditions. Evolutionary change can’t be encoded into idea -> invention -> innovation processes. Innovation happens every minute between our ears.

I propose that the only way to improve both understanding and decision making is to offer a real-time perspective on what is currently happening. You watch and steer NOW, because the past can’t be undone, no matter how well you might capture it statistically. The future can’t be enforced or predicted regardless of how well the outdated statistics fit under the bell curve. It all comes down to using experience in the light of current knowledge.

While Enterprise 2.0 proposes to empower people in real time by copying social networking benefits from the Internet, it is very questionable whether this will actually work. One cannot force people to share knowledge, and experience is even harder to share. Corporate politics also stop people from sharing knowledge willingly, regardless of whether they can or not. ACM gathers the knowledge from people’s ACTIONS, so knowledge sharing is no longer dependent on the willingness of the owner.

ACM, in contrast, is a navigation system for the economy highway with real-time traffic information. The important element is a change in the DESIGN paradigm that is missed by ‘social’ proponents. It is neither simply socializing during the BPM design phase, nor is it adding social communication/collaboration to a pre-designed process. The social aspect is what we defined in ACM as ‘moving the process creation into the execution.’

“Adaptive process technology exposes structured (business data) and unstructured (content) information to the business actors of structured (business) and unstructured (social) organizations to interactively create, modify and securely execute – with knowledge gathered during execution – structured (process) and unstructured (case) work in a transparent and auditable manner.”

The key to ADAPTIVE is that the actor may get a suggested activity, but he can take a different decision (given the authority); his action is recorded and he may even be prompted to explain his decision. The gathered knowledge is reusable in the template and available to other actors, but it will never be turned into hardcoded knowledge that replaces human intuition.

CEOs will need to become as technology-savvy as their CIOs and understand the immense change potential that IT can create if it is not held back by bureaucracy. Change and innovation are pulled forward by the gravity of the fitness landscape of your organization, and they are created by people enabled and empowered by technology.


Process Mining versus User-Trained Agent

ISIS Papyrus Adaptive Case Management is on a much higher level than any other implementation. The key element is the User-Trained Agent (UTA), which is enabled by the combination of the model-preserving concept and the object-relational database of the Papyrus Platform.

So how is the UTA different from the ideas of process mining as pursued by Christian W. Günther and Wil M.P. van der Aalst at the Eindhoven University of Technology?

The first point is: where do you mine from? If you perform process mining on a normal BPM system there is little to be mined, as processes are executed as-is with few decisions taken. Van der Aalst also pursued a direction of integrating ADEPT2 with process mining, because there at least the process can be changed by the user. The idea was that process mining can deliver information about how process models need to be changed, while adaptive process provides the tools to safely make changes. Process mining moved along for a while, with Van der Aalst admitting later that “the problem lies with process mining assumptions that do not hold in less-structured, real-life environments, and thus ultimately make process mining fail there”. So the problem is the same one as with ADEPT2: the real world is simply not a university test lab.

To improve process mining, it would have to be done from logs of user interaction in freely collaborative environments, such as email or case management. However, allowing users to work in less restrictive environments leads to more diverse behavior and therefore to “spaghetti” process models that, while an accurate picture of reality in principle, do not allow meaningful abstraction and are useless to process analysts. The graph below has been mined from machine test logs using the Heuristics Miner, showing events as activity nodes in the process model. As everything is connected, it is impossible to interpret a process flow direction or any cause-and-effect dependencies. That led to the concept of the Fuzzy Miner, which combines analysis of the significance and correlation of graph elements to achieve abstraction and aggregation. The Fuzzy Miner uses a very flexible algorithm, but tuning it to improve results can be tedious. Van der Aalst says “that process mining, in order to become more meaningful, and to become applicable in a wider array of practical settings, needs to address the problems it has with unstructured processes.”

Process Mining with Edge Filtering (Source: Van der Aalst)
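To illustrate the spaghetti problem and the idea of filtering edges by significance, here is a toy Python sketch of mining a directly-follows graph from an event log. It is only a conceptual illustration under my own simplifications, not the actual Heuristics Miner or Fuzzy Miner algorithms.

```python
# A toy illustration (not the actual Heuristics or Fuzzy Miner algorithms) of
# why unfiltered mined models become "spaghetti": a directly-follows graph is
# built from an event log, then low-significance edges are filtered out.

from collections import Counter
from itertools import pairwise  # Python 3.10+

log = [
    ["receive", "classify", "approve", "archive"],
    ["receive", "classify", "reject", "archive"],
    ["receive", "approve", "classify", "archive"],   # diverse user behaviour
    ["receive", "classify", "approve", "archive"],
]

# Count how often each activity directly follows another across all cases.
edges = Counter(pair for trace in log for pair in pairwise(trace))

# Keep only edges whose relative frequency passes a significance threshold;
# lowering the threshold quickly reconnects everything with everything.
threshold = 0.5
max_count = max(edges.values())
model = {edge: n for edge, n in edges.items() if n / max_count >= threshold}
for (a, b), n in sorted(model.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}  ({n}x)")
```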

I chose a completely different approach to adaptiveness and to process mining because I approached it from a human perspective. Who knows how to execute a certain process step? The actor! He should in principle be able to create and change anything, and I need to learn from his actions. The business user, however, has no notion of flowcharts. He just works with process resources, such as content. First, our process and data model is built using an object meta-model. Therefore ANYTHING can be adapted at runtime or in the template by any authorized user. For process mining, the Papyrus User-Trained Agent does not look at process logs of enactment. It performs pattern recognition on the data objects and their relationships in the complete state space at the time the user performs an action.

The UTA analyzes which elements of the pattern are relevant for its subsequent repeated actions. That includes information about previously executed steps and their results. Because the data model of the process execution is fully accessible in real time, the pattern recognition is performed in REAL-TIME each time a process-relevant action is performed by a user. To utilize the gathered pattern knowledge, each time a change is triggered in the state space observed by the agent, it tries to identify a matching pattern and, if it finds one, recommends that action to the user. The larger plan behind that action is not relevant to the UTA. A sequence of recognized patterns will create a fully executable process, while allowing the process knowledge to be shared between multiple and differing executions. If the actor performs the recommended action, the confidence level of that recommendation is raised; if not, the pattern is analyzed for differences from the recommended one, and if it is the same, the confidence level is lowered. One or more of the most confident recommendations can be presented to the actor to choose from. UTAs can run in parallel and observe all kinds of state spaces for specific roles, queues or arbitrary groups of objects. The UTA does not only learn process execution knowledge, but also, for example, whether a prospect is considered creditworthy or whether his behavior seems fraudulent. The UTA does not abstract, deduce or induce rules. It builds on the premise that most of the relevant data that cause a user to perform the current action are in the state space, and thus performs a transductive pattern-recognition transfer from actor to agent. That is the content of my 2007 UTA patent in short.
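To make the mechanism tangible, here is a highly simplified Python sketch of the recommend-and-feedback loop described above. It is my own illustrative reading of the idea, not the patented UTA implementation; the similarity measure, thresholds and field names are all assumptions.

```python
# A highly simplified sketch of the User-Trained Agent idea (not the patented
# implementation): snapshots of the case state space are matched against
# previously observed patterns, the best match yields a recommendation, and
# user acceptance raises or lowers its confidence.

from dataclasses import dataclass
from typing import Dict, List, Optional

StateSpace = Dict[str, str]  # e.g. {"document.type": "complaint", "customer.tier": "gold"}

@dataclass
class Pattern:
    features: StateSpace   # the state-space snapshot observed with the action
    action: str            # what the user did in that situation
    confidence: float = 0.5

def similarity(a: StateSpace, b: StateSpace) -> float:
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys) if keys else 0.0

def recommend(patterns: List[Pattern], state: StateSpace) -> Optional[Pattern]:
    # Pick the most confident, best-matching pattern above a match threshold.
    candidates = [(similarity(p.features, state) * p.confidence, p) for p in patterns]
    score, best = max(candidates, key=lambda c: c[0], default=(0.0, None))
    return best if score > 0.4 else None

def feedback(pattern: Pattern, accepted: bool) -> None:
    # Acceptance raises confidence, rejection lowers it (bounded to [0, 1]).
    pattern.confidence = min(1.0, pattern.confidence + 0.1) if accepted \
        else max(0.0, pattern.confidence - 0.1)

patterns = [Pattern({"document.type": "complaint", "customer.tier": "gold"},
                    action="create goal 'Retention offer'")]
hit = recommend(patterns, {"document.type": "complaint", "customer.tier": "gold"})
if hit:
    feedback(hit, accepted=True)   # the user followed the recommendation
    print(hit.action, hit.confidence)
```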

The most interesting part of machine learning lies in a truly dynamic environment, because there any guidance to the user can substantially speed up and improve decision making. The dynamics arise because unknown events can influence the process at any time. What most people don’t realize is that it is not just the outside that is uncontrollable and unexpected; any ACTION taken by a user can have unforeseen consequences. These are the most important cases to learn from, and in a BPM system they then have to be solved OUTSIDE the process system. That simply is not right! But that does not stop BPM vendors from simply calling their product ‘adaptive’ and claiming that they can do everything ACM does with ad-hoc workflows and social interaction. That is simply not correct, because those are not context-linked to the case, its data and content.

We are now entering the BUZZWORD BINGO stage of ADAPTIVE process with vendors left and right claiming to be adaptive in nature, often enough while name-dropping all other attributes as well (agile, flexible, dynamic, social). Adaptive process has to involve improvement feedback from the enactment into the process template, either by the actors or by some software mechanism. I propose that in the real world, the feedback and improvement must not just be about the process steps but about any entity or resource in the process/case. The theoretical adaptive approach only focuses on the flowchart improvement and that is its downfall, just like with orthodox BPM vendors claiming to be adaptive. A truly adaptive solution must enable process creation from scratch and allow it to be turned into a fully automated process with backend integration without needing upfront flow analysis.


The Scientific Roots of ACM

The idea that a predefined flowcharted process needs to be continuously adapted by users or the software has been around since before 1995. I started to develop the Papyrus Platform in 1997 with the premise of a fully adaptive application. Before that, formal research took place, for example by Manfred Reichert and Peter Dadam and others at the University of Ulm, as well as by Christian W. Günther and Wil M.P. van der Aalst at the Eindhoven University of Technology. While Reichert and Dadam focused on user-interactive flowcharts with their ADEPT system, Günther and Van der Aalst searched for the holy grail in process mining from enactment logs.

ADEPT1 from 1998 mainly focused on ad-hoc flexibility and distributed execution of processes, plus process schema evolution, which allows for automatic propagation of process changes to already running instances. With ADEPT2, processes are modelled by applying high-level change operations, starting from scratch. These change operations with pre- and post-conditions try to ensure the structural correctness of the resulting process graph. While that enables syntax-driven modeling and guidance for the process designer, it limits (by design) the functionality of the result. ADEPT2 also uses a formally independent data flow. Data exchange between activities happens via global process variables stored as different versions of data objects. This leads, however, to different versions of a data object within AND-splits and XOR-joins. While that can cause issues, it is required for correct rollback of process instances when activities fail. ADEPT2 offers a usable solution for process flow improvement, but it is built on a purely academic process theory that leaves many aspects of real-world business needs – such as resources – in the dust.

Process Mining in ADEPT2 (Source: Van der Aalst)
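The versioned global process variables mentioned above can be pictured with a short conceptual Python sketch. This is my own simplification for illustration, not ADEPT2's actual data model; the class and method names are invented.

```python
# A conceptual sketch (not ADEPT2's actual implementation) of versioned global
# process variables: each activity write creates a new version of a data
# object, so a failed branch can be rolled back to an earlier version.

class VersionedVariable:
    def __init__(self, name: str, value):
        self.name = name
        self.versions = [value]          # version 0 is the initial value

    def write(self, value) -> int:
        # Every activity write creates a new version instead of overwriting.
        self.versions.append(value)
        return len(self.versions) - 1    # version id the activity produced

    def read(self, version: int = -1):
        return self.versions[version]

    def rollback_to(self, version: int) -> None:
        # Undo all versions written after the given one (e.g. a failed branch).
        self.versions = self.versions[: version + 1]

amount = VersionedVariable("claim_amount", 0)
v1 = amount.write(1200)      # branch A of an AND-split writes its result
v2 = amount.write(900)       # branch B writes a competing version
amount.rollback_to(v1)       # branch B fails; its version is discarded
print(amount.read())         # -> 1200
```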

Concepts of ADEPT2 can be found in some BPMN-based systems as well as in Papyrus, particularly the object data model and the model-preserving approach. I see a process, however, as state/event driven by its data objects, content and activities, with the flow being just one way to look at it. I do not see the need for a formally error-free process, because exceptions can happen at any point in time and their resolution is a normal activity, typically executed by a superuser or expert, who could immediately improve the process template if necessary. I also see rules as essential to map controls across multiple processes, while ADEPT2 only uses normal decision gateways. In ADEPT2 each task has a fixed state engine, while I see a state engine as a freely definable object property. That points to another major difference between my research and that of the ADEPT2 team, because I built the data elements and the process execution ON TOP of the same freely definable object model. This further means that all data and process objects have a visible existence OUTSIDE the process and can, for example, be linked into several cases, where their state changes will influence multiple task states. Object integrity can always be ensured through transaction locks.
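As a minimal sketch of that last point, the Python fragment below treats a state machine as just another property of a freely definable business object that several cases can observe. The names and the observer mechanism are assumptions made for illustration, not the Papyrus object model.

```python
# A minimal sketch, under assumed names, of a state engine as an ordinary
# object property: the same object can be linked into several cases, and its
# state changes notify every linked task.

from typing import Callable, Dict, List

class StateMachine:
    def __init__(self, initial: str, transitions: Dict[str, List[str]]):
        self.state, self.transitions = initial, transitions

    def fire(self, target: str) -> bool:
        if target in self.transitions.get(self.state, []):
            self.state = target
            return True
        return False

class BusinessObject:
    def __init__(self, name: str, machine: StateMachine):
        self.name, self.machine = name, machine
        self.observers: List[Callable[[str], None]] = []   # tasks in any case

    def change_state(self, target: str) -> None:
        if self.machine.fire(target):
            for notify in self.observers:
                notify(self.machine.state)    # one object drives many task states

contract = BusinessObject(
    "contract",
    StateMachine("draft", {"draft": ["signed"], "signed": ["archived"]}),
)
contract.observers.append(lambda s: print(f"Case A task sees contract: {s}"))
contract.observers.append(lambda s: print(f"Case B task sees contract: {s}"))
contract.change_state("signed")
```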

ADEPT2 is also available as a commercial product.


BPM is about Cost-Cutting

Whenever the discussion starts there are those who try to enlighten us that BPM ‘is not about technology’ and ‘not about flowcharts’ but a management concept and a methodology. Some call it a practice. In these times I would call BPM common-sense management. It is no longer new or a mystery! We are rather going backwards with BPM to Tayloristic management concepts. Is there really anyone left in a management position in anything larger than an SMB who does NOT KNOW what the idea behind process management is? If yes, he or she shouldn’t be there. And once someone understands the acronym, does he really need more than a day to understand what the idea is and how to implement it as a management approach? There are numerous books that will tell you how to do it. So many different BPM approaches have popped up over the years, from inside-out via top-down to bottom-up, so clearly what we need now is outside-in. Why does anyone still need to be consulted on the principle approach: define work into process structures with real-world handovers, focus on customer outcomes, assign owners and empower them with goals, authority and means – and OFF YOU GO! It doesn’t take an Einstein to conclude that these ought to be aligned with business strategy!

I think the reason for all this consulting is that no one in management wants to be left standing when the music stops (it seems we are playing musical chairs). Many BPM projects are hidden manpower/cost-reduction schemes and no one wants to be the bad guy. To me BPM is not a magic discipline but rather the employment of common business sense that anyone with half a brain can understand in a day. Yes, we all want to improve how we work (meaning how we do processes), with less waste, clear goals and all necessary information. However, in these times the discussion of BPM without software is totally ridiculous. IT is the one and only core competitive means if businesses know how to use it. Why would anyone even think about doing process management without software?

What is wrong with the idea of BPM as implemented in many BPMS is that huge amounts of time have to be spent analyzing process flowcharts that people then have to adhere to. That really does come from BPM as a methodology. Only a small percentage of work (20%) might be that stable. During the ACM Tweetjam I tweeted that ‘Process is not an assembly line!’ and Connie Moore of Forrester Research retweeted ‘Agree, agree, agree!’ (Phew, I am not alone.) Businesses no longer want those complex analysis projects, and the rush to buy SharePoint, despite all its shortcomings, rather than large ECM suites sends a clear message to all of us.

The most popular fad is now to buy drop-in process packages for a BPMS to speed up implementation. That is fine, but not new. Businesses often buy ERP because they are buying hardcoded processes to improve the way their business works, lacking the skill to do it themselves. That is about all the hardcoded processes a business can survive. Encoding more processes in BPM is not beneficial.

I give you five good reasons:

  1. Business is about people and not about processes and control.
  2. People are about relationships and not about performance oriented pay.
  3. Good business relationships bring value to everyone – customers, employees, and owners.
  4. Open communication is the only way to improve relationships (not a CRM DB …)
  5. IT is the only tool to improve communications in an enterprise.

Is cost cutting by BPM automation really the only answer to increased competition, less loyal customers, and less predictable business cycles? There are those who propose that it is possible to increase revenues, improve service quality and at the same time reduce costs by means of various magic methodologies and technologies. Yes, cutting costs is the easy way out to make results look better. But it is dangerous too. BP was cutting costs while drilling oil wells in the Gulf. In the long run that damages any business, which is why the next CEO will have an even harder time pulling it up from its sickbed using the same old snake oil. For BP that might be too late.

Well, you might say, A FOOL WITH A TOOL IS STILL A FOOL on any level, so no amount of technology will save a weak CEO. True. But the greatest of people WILL BE held back by the bureaucracy of methodology and by those hardcoded processes made for low-cost, unskilled, and easily replaceable staff. No, I am not making this up! Agile, flowcharted BPM or hardcoded ERP processes may have cut costs, but they don’t make a business more competitive, despite all claims to the contrary! HP commissioned Coleman Parkes Research in February and March 2010 to look at how current IT budgets were spent and how organizations estimated the cost of lost time, effort and opportunities due to innovation gridlock. Of 560 interviews, half of the business executives said that 40 percent of budgets are spent on mission-critical systems, 30 percent on legacy systems, and only 30 percent on new IT initiatives (of which half fail or underperform), which prevented their organizations from keeping up with the competition. If executives don’t understand IT and its power, they are like old generals who fight all the old battles again despite a change in war technology. And clearly, they will lose!

There isn’t any proof that a business gets long-term benefits from analytic BPM and complex BPMS implementations. Technology must allow processes to evolve into any structure they might need at any point in time without bureaucratic overhead. The focus must not be to cut cost, but to make the best people (knowledge workers) the most productive and effective. The most important knowledge workers of a business are at the top management level. This is where IT still fails to deliver business value.
