Why product training is so important

by Frank 23. June 2013 06:00

I run a company called the Knowledgeone Corporation that produces a software application called RecFind 6, which provides records management, document management, workflow, document imaging, email management and general business process management functionality. Every installation is different because we configure RecFind 6 to the exact requirements of each customer. All installations include some form of business process management and many include a reasonable degree of complexity, especially when integrating with other systems.

We are always proposing to new and existing customers and negotiating contracts, and the one item in the pricing summary that is always under attack is training. As well as questioning the need for face-to-face training, many customers also try to reduce the cost by training just a core group that will then train the rest of the staff who will use the new system.

I try to explain that effective and complete training is fundamental to the success of the project; that training isn’t a cost, it is an investment in success. I rarely win.

I also try to convince my customers of the importance of ongoing training for new releases and new employees but I am rarely successful.

I try to explain that cutting costs on training is a sure-fire way to ensure that the project will never be as successful as it could be. I rarely win this argument either.

And finally, I always recommend that an IT person attends the training course because his/her services will be needed by the application administrator throughout the year. This rarely happens.

Yet, time after time and in example after example, projects (not just ours) are significantly less successful than they should be because someone in management decided to cut costs by skimping on training; by not training operational staff in how to use the product in the most cost-effective and productive way possible.

If you skimp on training you are almost certainly dooming your project to failure.

Lack of knowledge on how to best use a product is an insidious cancer. The project may begin with a big bang and lots of congratulations but deep within your organization the cancer has already started to grow. “I don’t like this product.” “This product is too hard to use.” “I can’t find anything with this product.” “My staff don’t understand this product.”

By year two, many people and departments simply don’t use the product any more. By year three there is a concerted push to find a replacement for this product that “is too hard to use. No one understands it.” The replacement project manager or application owner, who hasn’t been trained, is unable to address the complaints and soon also decides that the problem is with the product. It would be a bad career move to decide anything else.

In year four the organization begins looking for a replacement product. In year five, at great expense they select a replacement product and then lower costs by skimping on training. The cycle starts again.

If you skimp on training and re-training your project is doomed to failure.

How many expensive failures does it take before we learn the lesson?

Training is an investment in productivity, not a cost.

Records Management in the 21st century; you have computers now, do it differently

by Frank 1. June 2013 06:32

I own and run a computer software company called the Knowledgeone Corporation and we have specialised in what is now known as enterprise content management software since 1984 when we released our first product DocFind. We are now into the 8th iteration of our core and iconic product RecFind and have sold and installed thousands of RecFind sites where we manage corporate records and electronic documents.

I have personally worked with hundreds of customers to ensure that we understand and meet their requirements, and I have also designed and specified every product we have delivered over the last 29 years, so while I have never been a practicing records manager, I do know a great deal about records and document management and the vagaries of the practice all around the world.

My major lament is that many records managers today still want to run their ‘business’ in exactly the same way it was run 30 or 50 or even a hundred years ago. That is, as a physical model even when using computers and automated solutions like our product RecFind 6. This means we still see overly complicated classification systems and overcomplicated file numbering systems and overcomplicated manual processes for the capture and classification of paper, document images, electronic documents and emails.

It is a mindset that is locked in the past and can’t see beyond the confines of the file room.

I also still meet records managers who believe each and every employee has a responsibility to ‘become’ a junior records manager and both fully comprehend and religiously follow all of the old-fashioned and hopelessly overcomplicated and time-consuming processes laid out for the orderly capture of corporate documents.

I have news for all those locked-in-the-past records managers. Your approach hasn’t worked in the last 30 years and it certainly will not work in the future.

Smart people don’t buy sophisticated computer hardware and application software and then try to replicate the physical model for little or no benefit. Smart people look at what a computer system can do as opposed to 20,000 linear feet of filing shelves or 40 Compactuses and 30 boxes of filing cards and immediately realize that they have the power to do everything differently, faster, more efficiently and infinitely smarter. They also realize that there is no need to overburden already busy end users by forcing them to become very bad and very inconsistent junior records managers. End users are not hired to be records managers; they are hired to be engineers, sales people, accountants, PAs, etc., and most already have 8 hours of work a day without you imposing more on them.

There is always a better way and the best way is to roll out a records and document and email management system that does not require your end users to become very bad and inconsistent junior records managers. This way it may even have a chance of actually working.

Please throw that old physical model away. It has never worked well when applied to computerised records, document and email management and it never will. Remember that famous adage, “The definition of insanity is to keep doing the same thing and to expect the results to be different”?

I guarantee two things:

1. Your software vendor’s consultant is more than happy to offer advice and guidance; and

2. He/she has probably worked in significantly more records management environments than you have and has a much broader range of experience than you do.

It doesn’t hurt to ask for advice and it doesn’t hurt to listen.

Can you save money with document imaging?

by Frank 4. November 2012 06:00

I run a software company called Knowledgeone Corporation that produces an enterprise content management solution called RecFind 6 that includes extensive document imaging capabilities. We have thousands of customers around the world and as far as I can see most use RecFind 6 for document imaging of one kind or another.

This certainly wasn’t the case twenty years ago when document imaging tools were difficult to use and were expensive stand-alone ‘specialised’ products. Today however, almost every document management or records management product includes document imaging capabilities as a normal part of the expected functionality. That is, document imaging has gone from being an expensive specialised product to just a commodity, an expected feature in almost any information management product.

This means most customers have a readily available, easy-to-use and cost-effective document imaging tool at their fingertips. That being the case there should be no excuse for not utilizing it to save both time and money. However, I guarantee that I could visit any of my customers and quickly find unrealised opportunities for them to increase productivity and save money by using the document imaging capabilities of my product RecFind 6. They don’t even have to spend any money with me because the document imaging functions of RecFind 6 are integrated as ‘standard’ functionality and there is no additional charge for using them.

So, why aren’t my customers and every other vendor’s customers making best use of the document imaging capabilities of their already purchased software?

In my experience there are many reasons but the main ones are:

Lack of knowledge

To the uninitiated, document imaging may look simple, but there is far more to it than first appears, and unless your staff have hands-on experience there is unlikely to be an ‘expert’ in your organization. For this reason I wrote a couple of Blogs earlier this year for the benefit of my customers: Estimating the cost of your next imaging job and The importance of document imaging. This was my attempt to add to the knowledge base about document imaging.

Lack of ownership

The need for document imaging transects the whole enterprise but there is rarely any one person or department charged with ‘owning’ this need and with applying best-practice document imaging policies and procedures to ensure that the organization obtains maximum benefits across all departments and divisions. It tends to be left to the odd innovative employee to come up with solutions just for his or her area.

Lack of consultancy skills

We often say that before we can propose a solution we need to know what the problem is. The way to discover the true nature of a problem is to deploy an experienced consultant to review and analyse the supposed problem and then present an analysis, conclusions and recommendations that should always include a cost-benefit analysis. In our experience very few organizations have staff with this kind of expertise.

Negative impact of the Global Financial Crisis that began in 2008

All over the world since 2008 our customers have been cutting staff and cutting costs and eliminating or postponing non-critical projects. Some of this cost cutting has been self-defeating and has produced negative results and reduced productivity. One common example is the cancelling or postponing of document imaging projects that could have significantly improved efficiency, productivity and competitiveness as well as reducing processing costs. This is especially true if document imaging is combined with workflow to better automate business processes. I also wrote a Blog back in July 2012 for the benefit of our customers to better explain just what business process management is all about, called Business Process Management, just what does it entail?

In answer to the original question I posed, yes, you can save money utilizing simple document imaging functionality, especially if you combine the results with new workflow processes to do things faster, more accurately and smarter. It is really a no-brainer and it should be the easiest cost justification you have ever written.
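To show how simple that cost justification can be, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (staff numbers, retrieval times, hourly costs, scanning rates) is a hypothetical placeholder; substitute your own measured values before relying on the result.

```python
# Hypothetical back-of-the-envelope cost justification for a document imaging project.
# Every figure below is a placeholder; substitute your own measured values.

STAFF_HANDLING_PAPER = 40          # people who file and retrieve paper daily
RETRIEVALS_PER_DAY = 12            # paper retrievals per person per day
MINUTES_PER_PAPER_RETRIEVAL = 8    # walk to the file room, search, re-file
MINUTES_PER_DIGITAL_RETRIEVAL = 1  # search for and open the scanned image
HOURLY_COST = 45.0                 # fully loaded cost per staff hour ($)
WORKING_DAYS = 230                 # working days per year

PAGES_TO_SCAN = 250_000            # initial backfile conversion
COST_PER_PAGE = 0.10               # scanning labour and preparation per page ($)
ANNUAL_ONGOING_COST = 15_000.0     # day-forward scanning, QA and storage ($)

minutes_saved = MINUTES_PER_PAPER_RETRIEVAL - MINUTES_PER_DIGITAL_RETRIEVAL
hours_saved_per_year = (STAFF_HANDLING_PAPER * RETRIEVALS_PER_DAY
                        * minutes_saved * WORKING_DAYS) / 60

annual_saving = hours_saved_per_year * HOURLY_COST
setup_cost = PAGES_TO_SCAN * COST_PER_PAGE

print(f"Hours saved per year:   {hours_saved_per_year:,.0f}")
print(f"Annual labour saving:   ${annual_saving:,.0f}")
print(f"One-off scanning cost:  ${setup_cost:,.0f}")
print(f"Net saving, year one:   ${annual_saving - setup_cost - ANNUAL_ONGOING_COST:,.0f}")
```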

We have already seen how most information management solutions like RecFind 6 have embedded document imaging capabilities so most of you should have existing and paid-for document imaging functionality you can leverage off.

All you really need to do to save your organization money and improve your work processes is look for and then analyse any one of many document imaging opportunities within your organization.

A clue: wherever there is paper, there is a document imaging opportunity.

Are you also confused by the term Enterprise Content Management?

by Frank 16. September 2012 06:00

I may be wrong but I think it was AIIM that first coined the phrase Enterprise Content Management to describe both our industry and our application solutions.

Whereas the term isn’t as nebulous as Knowledge Management it is nevertheless about as useful when trying to understand what organizations in this space actually do. At its simplest level it is a collective term for a number of related business applications like records management, document management, imaging, workflow, business process management, email management and archiving, digital asset management, web site content management, etc.

To simple people like me the more appropriate term or label would be Information Management but as I have already covered this in a previous Blog I won’t belabour the point in this one.

When trying to define what enterprise content management actually means or stands for we can discard the words ‘enterprise’ and ‘management’ as superfluous to our needs and just concentrate on the key word ‘content’. That is, we are talking about systems that in some way create and manage content.

So, what exactly is meant by the term ‘content’?

In the early days of content management discussions we classified content into two broad categories, structured and unstructured. Basically, structured content had named sections or labels and unstructured content did not. Generalising even further we can say that an email is an example of structured content because it has commonly named, standardised and accessible sections or labels like ‘Sender’, ‘Recipient’, ‘Subject’ etc., that we can interrogate and rely on to carry a particular class or type of information. The same general approach would regard a Word document as unstructured because the content of a Word document does not have commonly named and standardised sections or labels. Basically a Word document is an irregular collection of characters that you have to parse and examine to determine content.

Like Newtonian physics, the above generalisations do not apply to everything and can be argued until the cows come home. In truth, every document has an accessible structure of some kind. For example, a Word document has an author, a size, a date written, etc. It is just that it is far easier to find out who the recipient of an email was than the recipient of a Word document. This is because there is a common and standard ‘Tag’ that tells us who the recipient is of an email and there is no such common and standard tag for a Word document.

In our business we call ‘information about information’ (e.g., the recipient and date fields on an email) Metadata. If an object has recognizable Metadata then it is far easier to process than an object without recognizable Metadata. We may then say that adding Metadata to an object is the same as adding structure.
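As a minimal illustration of the difference, the sketch below (in Python, using the standard email library; the sample message and memo text are made up) reads the metadata of an email directly from its named headers, whereas extracting the same facts from an unstructured block of text means falling back on parsing and guesswork:

```python
from email import message_from_string
import re

# A structured object: named, standardised headers (metadata) we can interrogate directly.
raw_email = """From: alice@example.com
To: bob@example.com
Subject: Contract renewal
Date: Sun, 16 Sep 2012 09:15:00 +1000

Bob, please review the attached renewal before Friday."""

msg = message_from_string(raw_email)
print(msg["From"], "|", msg["To"], "|", msg["Subject"])   # metadata looked up by name

# An unstructured object: a plain block of text with no standard tags, so
# "who is this for?" has to be guessed by parsing the content itself.
word_doc_text = "Memo for Bob Smith regarding the contract renewal, written by Alice Jones."
recipient_guess = re.search(r"for ([A-Z][a-z]+ [A-Z][a-z]+)", word_doc_text)
print("Guessed recipient:", recipient_guess.group(1) if recipient_guess else "unknown")
```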

Adding structure is what we do when we create a Word document using a template or when we add tags to a Word document. We are normalizing the standard information we require in our business processes so the objects we deal with have the structure we require to easily and accurately identify and process them.

This is of course one of the long-standing problems in our industry: we spend far too much time and money trying to parse and interpret unstructured objects when we should be going back to the coal face and adding structure when the object is first created. This is of course relatively easy to do if we are creating the objects (e.g., a Word document) but not easy to achieve if we are receiving documents from foreign sources like our customers, our suppliers or the government. Unless you are the eight-hundred-pound gorilla (like Walmart) it is very difficult to force your partners to add the structure you require to make processing as fast and as easy and as accurate as possible.

There have been attempts in the past to come up with common ‘standards’ that would have regulated document structure but none have been successful. The last one was when XML was the bright new kid on the block and the XML industry rushed headlong into defining XML standards for every conceivable industry to facilitate common structures and to make data transfer between different organizations as easy and as standard as possible. The various XML standardisation projects sucked up millions or even billions of dollars but did not produce the desired results; we are still spending billions of dollars each year parsing unstructured documents trying to determine content.

So, back to the original question, what exactly is Enterprise Content Management? The simple answer is that it is the business or process of extracting useful information from objects such as emails and PDFs and Word documents and then using that information in a business process. It is all about the process of capturing Metadata and content in the most accurate and expeditious manner possible so we can automate business processes as much as possible.

If done properly, it makes your job more pleasant, saves your organization money and makes your customers and suppliers happier. As such it sounds a lot like motherhood (who is going to argue against it?) but it certainly isn’t manna from heaven. There is always a cost and it is usually significant. As always, you reap what you sow; effort and cost produce rewards.

Is content management something you should consider? The answer is definitely yes with one proviso; please make sure that the benefits are greater than the cost.

 

Is Information Management now back in focus?

by Frank 12. August 2012 06:00

When we were all learning about what used to be called Data Processing we also learned about the hierarchy or transformation of information. That is, “data to information to knowledge to wisdom.”

Unfortunately, as information management is part of what we call the Information Technology industry (IT) we as a group are never satisfied with simple self-explanatory terms. Because of this age-old flaw we continue to invent and hype new terms like Knowledge Management and Enterprise Content Management most of which are so vague and ill-defined as to be virtually meaningless but nevertheless, provide great scope for marketing hype and consultants’ income.

Because of the ongoing creation of new terminology and the accompanying acronyms we have managed to confuse almost everyone. Personally I have always favoured the term ‘information management’ because it tells it like it is and it needs little further explanation. In the parlance of the common man it is an “old un, but a good un.”

The thing I most disliked about the muddy knowledge management term was the claim that computers and software could produce knowledge. That may well come in the age of cyborgs and true artificial intelligence but I haven’t seen it yet. At best, computers and software produce information which human beings can convert to knowledge via a unique human cognitive process.

I am fortunate in that I have been designing and programming information management solutions for a very long time so I have witnessed first-hand the enormous improvements in technology and tools that have occurred over time. Basically this means I am able to design and build an infinitely better information management solution today than I could have twenty-nine years ago when I started this business. For example, the current product RecFind 6 is a much better, more flexible, more feature-rich and more scalable product than the previous K1 product and it in turn was an infinitely better product than the previous one called RecFind 5.

One of the main factors in them being better products than their predecessors is that each time we started afresh with the latest technology; we didn’t build on the old product, we discarded it completely and started anew. As a general rule of thumb I believe that software developers need to do this on around a five-year cycle. Going past the five-year life cycle inevitably means you end up compromising the design because of the need to support old technology. You are carrying ‘baggage’ and it is like trying to run a marathon with a hundred-pound (45 kg) backpack.

I recently re-read an old 1995 white paper I wrote on the future of information management software which I titled “Document Management, Records Management, Image Management Workflow Management...What? – The I.D.E.A”. I realised after reading this old paper that it is only now that I am getting close to achieving my lofty ambitions as espoused in the early paper. It is only now that I have access to the technology required to achieve my design ambitions. In fact I now believe that despite its 1995 heritage this is a paper every aspiring information management solution creator should reference because we are all still trying to achieve the ideal ‘It Does Everything Application’ (but remember that it was my I.D.E.A. first).

Of course, if you are involved in software development then you realise that your job is never done. There are always new features to add, there are always new releases of products like Windows and SQL Server to test and certify against, and there are always new releases of development tools like Visual Studio and HTML5 to learn and start using.

You also realise that software development is probably the dumbest business in the world to be part of with the exception of drug development, the only other business I can think of which has a longer timeframe between beginning R&D and earning a dollar. We typically spend millions of dollars and two to three years to bring a brand new product to market. Luckily, we still have the existing product to sell and fund the R&D. Start-ups however, don’t have this option and must rely on mortgaging the house or generous friends and relatives or venture capital companies to fund the initial development cycle.

Whatever the source of funding, from my experience it takes a brave man or woman to enter into a process where the first few years are all cost and no revenue. You have to believe in your vision, your dream and you have to be prepared for hard times and compromises and failed partnerships. Software development is not for the faint hearted.

When I wrote that white paper on the I.D.E.A. (the It Does Everything Application, or my ‘idea’ or vision at that time) I really thought that I was going to build it in the next few years; I didn’t think it would take another fifteen years. Of course, I am now working on the next release of RecFind so it is actually more than fifteen years.

Happily, I now market RecFind 6 as an information management solution because information management is definitely back in vogue. Hopefully, everyone understands what it means. If they don’t, I guess that I will just have to write more white papers and Blogs.

What is a ‘Prescriptive’ RFQ/RFP and why is it bad?

by Frank 22. July 2012 06:02

Twenty years ago our main way of competing for business was to respond to Request For Quotes (RFQ) and Request For Proposals (RFP). Our sales people and technical people spent months on end laboriously responding to detailed questionnaires and spreadsheets with only a small chance of winning because of the number of vendors invited to respond. Luckily, this is no longer the main way we compete for business and we now complete only a fraction of the RFQ/RFPs we used to; much to the relief of my hard-working sales and pre-sales staff.

Now we only respond to RFQs and RFPs if we have prior engagement, plus the opportunity for questions and dialogue during the process, together with a good fit for our software and services (we sell information management software and services). We also heavily qualify every opportunity and the first step is to speed-read and scan all proposal documents for what we call ‘road blocks’.

Road blocks are contractual conditions, usually mandatory ones, which would automatically disqualify us from responding. Sometimes these are totally unfair, one-sided and non-commercial contractual conditions and sometimes they are mandatory features we don’t have and often, the road block is simply the prescriptive nature of the request document.

By prescriptive I mean that the request document is spelling out in detail exactly how the solution should work down to the level of screen design, architecture and keystrokes. In most cases prescriptive requests are the result of the author’s experience with or preference for another product.

As we produce a ‘shrink-wrapped’ or ‘off-the-shelf’ product, the RecFind 6 suite, we aren’t able to change the way it looks and works and nor can we change the architecture. In almost every case we could solve the business problem but not in the exact way specified by the author. Because our product RecFind 6 is highly configurable and very flexible we can solve almost any information management or business process management problem but in our particular way with our unique architecture and our unique look and feel.

In the old days we may have tried to enter into a dialog with the client to see if our solution, although working differently to the way the author envisioned a solution working, would be acceptable.  The usual answer was, “Why don’t you propose your solution and then we will decide.” Sometimes we did respond and then learned to our chagrin that our response was rejected because it didn’t meet some of the prescriptive requirements. Basically, a big waste of time and money. So, we no longer respond to prescriptive RFQs/RFPs.

But, why is a prescriptive RFQ/RFP a bad thing for the client? Why is it a bad practice to be avoided at all costs?

It is a bad thing because it reduces the client’s options and severely narrows the search for the best solution. In our experience, a prescriptive RFQ/RFP is simply the result of someone either asking for the product they first thought of, or of someone who is so inflexible that he/she isn’t able to think outside the box and isn’t open to innovative solutions.

The end result of a prescriptive RFP/RFQ is always that the client ends up with a poor choice; with a third best or worse solution to the problem.

The message is very simple. If you want to find the best possible solution don’t tell the vendors what the solution is. Rather tell the vendors what the problem is and give them the opportunity to come up with the most innovative and cost-effective solution possible.  Give them the opportunity to be innovative and creative; don’t take away these so very important options.

Please do yourself, and your organization, a favour. If you want the best possible solution clearly explain what the problem is and then challenge the vendors to come up with their best shot. Prescriptive requirements always deny you the best solution.

Business Process Management, BPM, BPO; just what does it entail?

by Frank 15. July 2012 06:00

Like me I am sure that you have been inundated with ads, articles, white papers and proposals for something called BPM or BPO, Business Process Management, Business Process Outsourcing and Business Process Optimisation.

Do you really understand what it all means?

BPM and BPO certainly aren’t new; there have been many companies offering innovative and often cutting-edge technology solutions for many years. The pioneering days were probably the early 1980s. One early innovator I can recall (and admired) was Tower Technology, because their office was just across from our old offices in Lane Cove.

In the early days BPM was all about imaging and workflow and forms. Vendors like Tower Technology used early versions of workflow products like Staffware and a whole assortment of different imaging and forms products to solve customer processing problems. It involved a lot of inventing and a lot of creative genius to make all those disparate products work and actually do what the sales person promised. More often than not the final solution didn’t quite work as promised and it always seemed to cost a lot more than quoted.

Like all new technologies everyone had to go through a learning process and like most new technologies, for many years the promises were far ahead of what was actually delivered.

So, is it any different today? Is BPM a proven, reliable and feature-rich and mature technology?

The answer dear friends is yes and no; just as it was twenty-five or more years ago.

There is a wonderful Latin phrase ‘Caveat Emptor’ which means “Let the buyer beware”. Caveat Emptor applies just as much today as it did in the early days because despite the enormous technological progress we have all witnessed and experienced we are still pushing the envelope. We are still being asked to do things the current software and hardware can’t quite yet handle. The behind the scenes technicians are still trying to make the product do what the sales person promised in good faith (we hope) because he didn’t really understand his product set.

Caveat Emptor means it is up to the buyer to evaluate the offering and decide if it can do the job. Of course, if the vendor lies or makes blatant false claims then Caveat Emptor no longer applies and you can hit them with a lawsuit.  However, in reality it is rarely as black and white as that. The technology is complex and the proposals and explanations are full of proprietary terminology, ambiguities, acronyms and weaselly words.

Like most agreements in life you shouldn’t enter into a BPM contract unless you know exactly what you are getting into. This is especially true with BPM or BPO because you are talking about handing over part of your core business processes to someone else to ‘improve’. If you don’t understand what is being proposed then please hire someone who does; I guarantee it will be worth the investment. This is especially true if you are outsourcing customer or supplier facing processes like accounts payable and accounts receivable. Better to spend a little more up front than suffer cost overruns, failed processes and an inbox full of complaints.

My advice is to always begin with some form of a consultancy to ‘examine’ your processes and produce a report containing conclusions and recommendations. The vendor may (should) offer this as part of its sales process and it may be free or it may be chargeable.  Personally, I believe in the old adage that you get what you pay for so I would prefer to pay to have a qualified and experienced professional consultant do the study. The advantage of paying for the study is that you then ‘own’ the report and can then legally provide it to other vendors to obtain competitive quotes.

You should also have a pretty good idea of what the current processing is costing you in both direct and indirect costs (e.g., lost sales, dissatisfied customers, unhappy staff, etc.) before beginning the evaluation exercise. Otherwise, how are you going to be able to judge the added value of the vendor’s proposal?

In my experience the most common set of processes to be ‘outsourced’ are those to do with accounts payable processing. This is the automation of all processes beginning with your purchase order (and its line items), the delivery docket (proof of receipt), invoices (and line items) and statements. The automation should reconcile invoices to delivery dockets and purchase orders and should throw up any discrepancies such as items invoiced but not delivered, variations in price, etc. Vendors will usually propose what is commonly called an automatic matching engine; the software that reads all the documents and does its best to make sure you only pay for delivered goods that are exactly as ordered.
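As a rough sketch of what such a matching engine does behind the scenes (a toy Python example with invented line items, not a description of any particular vendor's product), a three-way match reconciles each invoice line against the purchase order and the delivery docket and flags anything that does not agree:

```python
# Toy three-way match: reconcile invoice lines against the purchase order
# and the delivery docket, flagging discrepancies for a human to review.
# All of the data below is made up for illustration.

purchase_order = {"widget-a": {"qty": 100, "price": 2.50},
                  "widget-b": {"qty": 40,  "price": 7.00}}
delivery_docket = {"widget-a": 100, "widget-b": 35}          # items actually received
invoice = {"widget-a": {"qty": 100, "price": 2.50},
           "widget-b": {"qty": 40,  "price": 7.25}}          # items the supplier billed

def three_way_match(po, delivered, inv):
    issues = []
    for item, line in inv.items():
        if item not in po:
            issues.append(f"{item}: invoiced but never ordered")
            continue
        received = delivered.get(item, 0)
        if line["qty"] > received:
            issues.append(f"{item}: invoiced {line['qty']} but only {received} delivered")
        if line["price"] != po[item]["price"]:
            issues.append(f"{item}: price variance {line['price']} vs {po[item]['price']} on the PO")
    return issues

for issue in three_way_match(purchase_order, delivery_docket, invoice):
    print("DISCREPANCY:", issue)
# Only lines with no discrepancies would be passed straight through for payment.
```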

If the vendor’s proposal is to be attractive it must replace your manual processing with an automated model that is faster and more accurate. Ideally, it would also be more cost-effective but even if it is more costly than your manual direct cost estimate it should still solve most of your indirect cost problems like unhappy suppliers and late payment fees.

In essence, there is nothing magical about BPM and BPO; it is all about replacing inefficient manual processes with much more efficient automated ones using clever computer software. The magic, if that is the word to use, is about getting it right. You need to know what the current manual processing is costing you. You need to be absolutely sure that you fully understand the vendor’s proposal and you need to build in metrics so you can accurately evaluate the finished product and clearly determine if it is meeting its stated objectives.

Please don’t enter into negotiations thinking that if it doesn’t work you can just blame the vendor. That would be akin to cutting off your nose to spite your face. Remember Caveat Emptor; success or failure really depends upon how well you do your job as the customer.

Are you running old and unsupported software? What about the risks?

by Frank 29. April 2012 20:59

Many years ago we released a 16 bit product called RecFind version 3.2 and we made a really big mistake. We gave it so much functionality (much of it way ahead of its time) and we made it so stable that we still have thousands of users.

It is running under operating systems like XP that it was never developed or certified for and is still ‘doing the job’ for hundreds of our customers. Most frustratingly, when we try to get them to upgrade they usually say, “We can’t justify the expense because it is working fine and doing everything we need it to do.”

However, RecFind 3.2 is decommissioned and unsupported, and the databases it uses (Btrieve, Disam and an early version of SQL Server) are also no longer supported by their vendors.

So our customers are capturing and managing critical business records with totally unsupported software. Most importantly, most of them also do not have any kind of support agreement with us (and this really hurts because they say they don’t need a support agreement because the system doesn’t fail) so when the old system catastrophically fails, which it will, they are on their own.

Being a slow learner, ten years ago I replaced RecFind 3.2 and RecFind 4.0 with RecFind 5.0, a brand new 32 bit product. Once again I gave it too much functionality and made it way too stable. We now have hundreds of customers still using old and unsupported versions of RecFind 5.0 and when we try to convince them to upgrade we get that same response, “It is still working fine and doing everything we need it to do.”

If I was smarter I would have built-in a date-related software time bomb to stop old systems from working when they were well past their use-by date. However, that would have been a breach of faith so it is not something we have or will ever do. It is still a good idea, though probably illegal, because it would have protected our customers’ records far better than our old and unsupported systems do now.

In my experience, most senior executives talk about risk management but very few actually practice it. All over the world I have customers with millions of vital business records stored and managed in systems that are likely to fail the next time IT updates desktop or server operating systems or databases. We have warned them multiple times but to no avail. Senior application owners and senior IT people are ignoring the risk and, I suspect, not making senior management aware of the inevitable disaster. They are not managing risk; they are ignoring risk and just hoping it won’t happen in their reign.

Of course, it isn’t just our products that are still running under IT environments they were never designed or certified for; this is a very common problem. The only worse problem I can think of is the ginormous amount of critical business data being ‘managed’ in poorly designed, totally insecure and teetering-on-failure, unsupportable Access and Excel systems; many of them in the back offices of major banks and financial institutions. One of my customers described the 80 or so Access systems that had been developed across his organization as the world’s greatest virus. None had been properly designed, none had any security and most were impossible to maintain once a key employee or contractor had left.

Before you ask, yes we do produce regular updates for current products and yes we do completely redesign and redevelop our core systems like RecFind about every five years to utilize the very latest technology. We also offer all the tools and services necessary for any customer to upgrade to our new releases; we make it as easy and as low cost as possible for our customers to upgrade to the latest release but we still have hundreds of customers and many thousands of users utilizing old, unsupported and about-to-fail software.

There is an old expression that says you can take a horse to water but you can’t make it drink. I am starting to feel like an old, tired and very frustrated farmer with hundreds of thirsty horses on the edge of expiration. What can I do next to solve the problem?

Luckily for my customers, Microsoft Windows Vista was a failure and very few of them actually rolled it out. Also, luckily for my customers, SQL Server 2005 was a good and stable product and very few found it necessary to upgrade to SQL Server 2008 (soon to be SQL Server 2012). This means that most of my customers using old and unsupported versions of RecFind are utilizing XP and SQL Server 2005, but this will soon change and when it does my old products will become unstable and even just stop working. It is just good luck and good design (programmed tightly to the Microsoft API) that some (e.g., 3.2) still work under XP. RecFind 3.2 and 4.0 were never certified under XP.

So we have a mini-Y2K coming but try as I may I can’t seem to convince my customers of the need to protect their critical and irreplaceable (are they going to rescan all those documents from 10 years ago?) data. And, as I alluded to above, I am absolutely positive that we are only one of thousands of computer software companies in this same position.

In fairness to my customers, the Global Financial Crisis of 2008 was a major factor in the disappearance of upgrade budgets. If the call is to either upgrade software or retain staff then I would also vote to retain staff. Money is as tight as it has ever been and I can understand why upgrade projects have been delayed and shelved. However, none of this changes the facts or averts the coming data-loss disaster.

All over the world government agencies and companies are managing critical business data in old and unsupported systems that will inevitably fail with catastrophic consequences. It is time someone started managing this risk; are you?

 

Workflow – What does it really entail?

by Frank 8. April 2012 06:00

Workflow has been defined as “the glue that binds business processes together.” Depending upon your background and experience that particular definition may or may not be as clear as mud. Despite having been a key factor in business application processing for a very long time workflow is still very poorly understood by many in business and is more often than not too narrowly defined.

For example, you do not need to pay big bucks for a heavy-duty workflow package and all the services associated with it to implement workflow in your organization. Workflow is really about automating some business process using whatever tool is appropriate. You can automate a business process with Word or Excel or Outlook for that matter and the most common starting point is to first capture a paper document as a digital document using simple tools like a document scanner. You don’t even need a computer (apart from the human brain, the world’s best computer) to implement workflow.

Designing and implementing workflow is more about the thought processes, about evaluating what you are doing and why you are doing it and then trying to figure out a better and more efficient way to do it. It is about documenting and analysing a current business process and then redesigning it to make it more appropriate and more efficient. It is by making it more efficient that you make productivity gains; ideally, you end up doing more with less and adding more value.

You shouldn’t undertake any investigation of new workflows without first having defined objectives and metrics. You should also always begin with some basic questions of your staff or end-users:

  1. What are you doing now that you think could be done better?
  2. What aren’t you doing now that you think you should be doing?
  3. What are you doing now that you don’t think is necessary?

I call these the three golden questions and they have served me well throughout my consulting career. They are simple enough and specific enough that most end-users can relate to them and produce answers. These three simple questions provide the foundation for any business process re-engineering to come. They are also the catalyst to kick off the required thought processes in your end-users. Out of these three simple questions should come many more questions and answers and the information you need to solve the problem.

In every case in the past I have been able to add value well before using tools and creating workflows just by suggesting changes to current manual business processes. As I said earlier, workflow is really about thought processes, “How can I do this in a better and more efficient way?”

Adding value always begins by saving time and money and usually also entails providing better access to information. Real value in my experience is about ensuring that workers have access to the precise information they need (not more and not less) at the precise time they need it (not earlier and not later). It sounds simple but it is the root of all successful business processes, that is, “please just give me what I need when I need it and then I can get the job done.” Modern ‘just-in-time’ automated production lines only work if this practice is in place; it is fundamental to the low-cost, efficient and high-quality production of any product or service.

When something ‘just works’ very few of us notice it but when something doesn’t work well it frustrates us and we all notice it. Frustrated workers are not happy or productive workers. If we do our job well we take away the sources of frustration by improving work processes to the point where they ‘just work’ and are entirely appropriate and efficient and allow us to work smoothly and uninterrupted without frustration and delays. This should be our objective when designing new workflows.

Metrics are important and should always be part of the project. You begin by taking measurements at the beginning and then, after careful analysis, predict what the measurements will be after the project. You must have a way of measuring, using criteria agreed beforehand with your end-users, whether or not you have been successful and to what degree. It is a very bad tradesperson who leaves without testing his work. We have all had experiences with bad tradespeople who want to be paid and gone before you test the repaired appliance, roof or door. Please do not be a bad tradesperson.

Metrics are the way we test our theory. For example, “If we re-engineer this series of processes the way I have recommended you will save two hours of time per staff member per day and will be able to complete the contract review and sign off within two days instead of seven days.” The idea is to have something finite to measure against. We are talking quantitative as opposed to qualitative measurement. An example of a qualitative measurement would be, “If we re-engineer this series of processes the way I have recommended everyone will be happier.” Metrics are a quantitative way to measure results.
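To make that concrete, here is a minimal sketch (all figures invented) of how agreed baseline, predicted and measured values can be compared at the end of the project:

```python
# Hypothetical before/after metrics for a re-engineered contract review process.
# Replace the baseline, predicted and measured figures with your own agreed values.

baseline  = {"hours_per_staff_per_day": 6.0, "review_turnaround_days": 7}
predicted = {"hours_per_staff_per_day": 4.0, "review_turnaround_days": 2}
measured  = {"hours_per_staff_per_day": 4.5, "review_turnaround_days": 3}  # post-project

for metric in baseline:
    target_gain = baseline[metric] - predicted[metric]
    actual_gain = baseline[metric] - measured[metric]
    achieved = 100 * actual_gain / target_gain if target_gain else 0
    print(f"{metric}: predicted gain {target_gain}, actual gain {actual_gain} "
          f"({achieved:.0f}% of target)")
```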

In summary, implementing workflow should always be about improving a business process; about making it better, more appropriate and more efficient. Any workflow project should begin with the three golden questions and must include defined objectives and quantitative metrics. The most important tool is the human brain and the thought processes that you will use to analyse current processes and design improved processes. Every new workflow should add value; if it doesn’t you should not be doing it.

Critically, workflow must be about improving the lot of your staff or end users. It is about making a process easier, more natural, less frustrating and even, more enjoyable. The staff or end users are the only real judges because no matter how clever you think your solution is if they don’t like it, it will never work.

The Importance of Document Imaging

by Frank 1. April 2012 06:00

 

Document imaging, or the scanning of paper documents, has been around a long time. Along with workflow, it was the real beginning of office automation.

Document imaging did for office automation what barcode technology did for physical records management and asset management. That is, it allowed manual processes to be automated and improved; it provided tangible and measurable productivity improvements as well as demonstrably better access to information for the then fledgling knowledge worker.

Today we have a paradox: we seem to take document imaging for granted, yet we still don’t utilize it to anything like its full capabilities. Most people use document scanners of one kind or another, usually on multi-function devices, but we still don’t appear to use document scanning nearly enough to automate time-consuming and often critical business processes.

I don’t really know why not, because it isn’t a matter of missing technology; we seem to have every type of document scanner imaginable and every type of document scanning software conceivable. We just seem to be stuck in the past, or we are just not applying enough thought to analysing our day-to-day business processes; we have become lazy.

Business processes based on the circulation of paper documents are archaic, wasteful, inefficient and highly prone to error because of lost and redundant copies of paper documents; in fact they are downright dangerous. Yet, every organization I deal with still has critical business processes based on the circulation of paper. How incredibly careless or just plain stupid is that?

Let’s look at it from the most basic level. How many people can read a paper document at any one point in time? The answer is one and one only. How many people can look at a digital image of a document at any one point in time? The answer is as many as need to. How hard is it to lose or damage a paper document? The answer is that it is really, really easy to lose or damage or deface a paper document. How hard is it to lose or damage or deface or even change a secure digital copy of a document? The answer is that it is almost impossible in a well-managed document management system.

So why are we still circulating paper documents to support critical business processes? Why aren’t we simply digitising these important paper documents and making the business process infinitely faster and more secure? For the life of me, I can’t think of a single valid reason for not digitising important paper documents. The technology is readily available with oodles of choice and it isn’t difficult to use and it isn’t expensive. In fact, digitizing paper will always save you money.

So why do I still see so many organizations large and small still relying on the circulation of paper documents to support important business processes? Is it a lack of thought or a lack of imagination or a lack of education? Can it really be true that thirty years after the beginning of the office automation revolution we still have tens of thousands or even millions of so called knowledge workers with little knowledge of basic office automation? If so, and I believe it is true from my observations, then it is a terrible reflection on our public and corporate education systems.

In a world awash in technology like computers, laptops, iPhones and iPads how can we be so terribly ignorant of the application and benefits of such a basic and proven technology as document imaging?

Some of the worst examples can be found in large financial organizations like banks and insurance companies. The public perception is that banks are right up there with the latest technology and most people look at examples like banking and payment systems on smartphones as examples of that. But, go behind the front office to the back office and you will usually see a very different world; a world of paper and manual processes, many on the IT department’s ‘backlog’ of things to attend to, eventually.

Here is a really dumb example of this kind of problem. I recently decided to place a term deposit with an online bank. Everything had to be done online and the website didn’t even offer PDFs for download, which would have been useful for reading through the pages of information at your leisure and finding out what information they required so you could have it handy when completing the forms on the website.

I managed to find a phone number and rang them up and asked for the documentation in PDF form, only to be told they were paperless and that everything had to be done online. So I persisted, going from page to page on the website, never knowing what would be required next until the last page and yes, you guessed right. On the very last page the instructions were to print out the completed forms, sign them and mail them in. Paperless for me, much to my inconvenience, and paper for them, again much to my inconvenience. There is really no excuse for this kind of brainless twaddle that puts the consumer last. Their processes obviously required a signature on a paper document so the whole pretence of an online process was a sham; their processes required paper.

Hopefully, when they received my paper documents they actually scanned and digitized them but I am willing to bet that if I could get into their back office I would find shelf after shelf of cardboard file folders and paper documents. Hopefully, next time I ring up they can actually find my documents. Maybe I could introduce them to the revolutionary new barcode technology so they could actually track and manage their paper documents far more efficiently?

The message is a simple one. If you have business processes based on the circulation of paper you are inefficient and are wasting money and the time of your staff and customers. You are also taking risks with the integrity of your data and your customer’s data.

Please do everyone a favour and look carefully at the application of document imaging, a well-proven, affordable, easy to implement and easy to manage business process automation tool.

 
