How to clean up your shared drives, Frank’s approach

by Frank 22. August 2014 06:00

In my time in this business (enterprise content management, records management, document management, etc.) I have been asked to help with a ‘shared drive problem’ more times than I can remember. This particular issue is analogous to the paperless office problem. Thirty years ago, when I started my company, I naively thought that both problems would be long gone by now, but they are not.

I still get requests for purely physical records management solutions and I still get requests to assist customers in sorting out their shared drives problems.

The tools and procedures to solve both problems have been around for a long time but for whatever reason (I suspect lack of management focus) the problems still persist and could be described as systemic across most industry segments.

Yes, I know that you can implement an electronic document and records management system (we have one called RecFind 6) and take away the need for shared drives and physical records management systems completely but most organizations don’t and most organizations still struggle with shared drives and physical records. This post addresses the reality.

Unfortunately, the most important ingredient in any solution is ‘ownership’ and that is as hard to find as it ever was. Someone with authority, or someone who is prepared to assume authority, needs to take ownership of the problem in a benevolent-dictator way and just steam-roll a solution through the enterprise. It isn’t solvable by committees and it requires a committed, driven person to make it happen. These kinds of people are in short supply, so if you don’t have one, bring one in.

In a nutshell there are three basic problems apart from ownership of the problem.

1. How to delete all redundant information;

2. How to structure the ‘new’ shared drives; and

3. How to make the new system work to most people’s satisfaction.

Deleting redundant Information

Rule number one is don’t ever ask staff to delete the information they regard as redundant. It will never happen. Instead, tell staff that you will delete all documents in your shared drives with a created or last-updated date older than a nominated cut-off date (say, one year in the past) unless they tell you specifically which ‘older’ documents they need to retain. Just saying “all of them” is not an acceptable response. Give staff a month’s notice and then delete everything that has not been nominated as important enough to retain. Of course, take a backup of everything before you delete, just in case. This is tough love, not stupidity.
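As a sketch of how such a deletion pass might be automated, here is a minimal Python example. The retention list, the one-year cut-off and the file names are all my own illustrative assumptions; a real clean-up would log its actions and be run against the backup first.

```python
import time
from pathlib import Path

# Hypothetical retention list: the files staff specifically nominated to keep.
RETENTION_LIST = {"budget-2014.xlsx", "master-contract.docx"}
CUTOFF_SECONDS = 365 * 24 * 3600  # the nominated one-year cut-off

def find_deletion_candidates(root, now=None):
    """Return files last modified before the cut-off and not nominated for retention."""
    now = time.time() if now is None else now
    candidates = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.name not in RETENTION_LIST:
            if now - path.stat().st_mtime > CUTOFF_SECONDS:
                candidates.append(path)
    return sorted(candidates)
```

Reporting candidates first (rather than deleting in the same pass) gives staff one last chance to object before the tough love is applied.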

Structuring the new shared drives

If your records manager insists on using your already overly complex, hierarchical corporate classification scheme or taxonomy as the model for the new shared drive structure, politely ask them to look for another job. Do you want this to work or not?

Records managers, archivists and librarians (and scientists) understand and love complex classification systems. However, end users don’t understand them, don’t like them and won’t use them. End users have no wish to become part-time records managers; they have their own work to do, thank you.

By all means make the new structure a subset of the classification system, major headings only and no more than two levels if possible. If it takes longer than a few seconds to decide where to save something or to find something then it is too complex. If three people save the same document in three different places then it is too complex. If a senior manager can’t find something instantly then it is too complex. The staff aren’t to blame, you are.

I have written about this issue previously and you can reference a white paper at this link, “Do you really need a Taxonomy?”

The shared drives aren’t where we classify documents; they are where we make it as easy and as fast as possible to save, retrieve and work on documents; no more, no less. Proper classification (if I can use that term) happens later, when you use intelligent software to automatically capture, analyse and store documents in your document management system.

Please note, shared drives are not a document management system and a document management system should never just be a copy of your shared drives. They have different jobs to do.

Making the new system work

Let’s fall back on one of the oldest acronyms in business, KISS, “Keep It Simple Stupid!” Simple is good and elegant, complex is bad and unfathomable.

Testing is a good example of where the KISS principle must be applied. Asking all staff to participate in the testing process may be diplomatic but it is also suicidal. You need to select your testers. You need to pick a small number of smart people from all levels of your organization. Don’t ask for volunteers, you will get the wrong people applying. Do you want participants who are committed to the system working, or those who are committed to it failing? Do you want this to succeed or not?

If I am pressed for time I use what I call the straight-line method. Imagine all staff in a straight line from the most junior to the most senior. Select from both ends, the most junior and the most senior. Chances are that if the system works for this subset it will also work for all the staff in between.
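The straight-line method is simple enough to sketch in a few lines of Python; the staff list and sample size here are illustrative only.

```python
def straight_line_sample(staff_by_seniority, n_each_end=3):
    """Pick testers from both ends of the seniority line (most junior + most senior)."""
    if len(staff_by_seniority) <= 2 * n_each_end:
        return list(staff_by_seniority)  # small team: everyone tests
    return list(staff_by_seniority[:n_each_end]) + list(staff_by_seniority[-n_each_end:])
```

The input list is assumed to be sorted from most junior to most senior; everyone in between is skipped on the theory that a system usable at both extremes is usable in the middle.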

Make it clear to all that the shared drives are not your document management system. The shared drives are there for ease of access and to work on documents. The document management system has business rules to ensure that you have inviolate copies of important documents plus all relevant contextual information. The document management system is where you apply business rules and workflow. The document management system is all about business process management and compliance. The shared drives and the document management system are related and integrated but they have different jobs to do.

We have shared drives so staff don’t work on documents on ‘private’ drives, inaccessible and invisible to others. We provide a shared drive resource so staff can collaborate and share information and easily work on documents. We have shared drives so that when someone leaves we still have all their documents and work-in-process.

Please do all the complex processes required in your document management system using intelligent software, automate as much as possible. Productivity gains come about when you take work off staff, not when you load them up with more work. Give your staff as much time as possible so they can use their expertise to do the core job they were hired for.

If you don’t force extra work on your staff and if you make it as easy and as fast as possible to use the shared drives then your system will work. Do the opposite and I guarantee it will not work.

Document Imaging, Forms Processing & Workflow – A Guide

by Frank 28. July 2014 06:00

Document imaging (scanning) has been a part of most business processing since the early 1980s. We, for example, produced our first document-imaging-enabled version of RecFind in 1987. So it isn’t new technology; it is now low-risk, tried and proven technology.

Even in this age of electronic documents most of us still receive and have to read, analyse and process mountains of paper.

I don’t know of any organization that doesn’t use some form of document imaging to help process paper documents. Conversely, I know of very few organizations that take full advantage of document imaging to gain maximum value from it.

For example, just scanning a document as a TIFF file and then storing it on a hard drive somewhere is almost a waste of time. Sure, you can then get rid of the original paper (but most don’t) but you have added very little value to your business.

Similarly, capturing a paper document without contextual information (Metadata) is not smart because you have the document but none of the important transactional information. Even converting a TIFF document to a PDF isn’t smart unless you first OCR (Optical Character Recognition) it to release the important text ‘hidden’ in the TIFF file.
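As a concrete sketch of both points, here is a minimal Python example. The metadata field names are my own invention, and the OCR helper assumes the third-party pytesseract package (plus the Tesseract engine) is installed; treat it as an illustrative sketch, not a recommended stack.

```python
# Hypothetical set of contextual metadata we refuse to capture a scan without.
REQUIRED_METADATA = {"title", "sender", "date_received", "doc_type"}

def validate_capture(record):
    """Refuse to store a scanned image without its contextual metadata."""
    missing = REQUIRED_METADATA - record.keys()
    if missing:
        raise ValueError(f"scan rejected, missing metadata: {sorted(missing)}")
    return record

def tiff_to_searchable_pdf(tiff_path, pdf_path):
    """OCR a scanned TIFF into a text-searchable PDF (one possible approach)."""
    import pytesseract            # third-party; needs the Tesseract engine installed
    from PIL import Image
    pdf_bytes = pytesseract.image_to_pdf_or_hocr(Image.open(tiff_path), extension="pdf")
    with open(pdf_path, "wb") as f:
        f.write(pdf_bytes)        # the PDF now carries the text 'hidden' in the TIFF
```

The point of the validation step is exactly the one above: an image without its transactional context is close to worthless, so make the capture process enforce the context.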

I would go even further and say that if you are not taking the opportunity to ‘read’ and ‘capture’ key information from the scanned document during the scanning process (Forms Processing) then you aren’t adding anywhere near as much value as you could.

And finally, if you aren’t automatically initiating workflow as the document is stored in your database then you are criminally missing an opportunity to automate and speed up your internal business processes.

To give it a rating scale, just scanning and storing TIFF files is a 2 out of 10. If this is your score you should be ashamed to be taking a pay packet. If you are scanning, capturing contextual data, OCRing, Forms Processing, storing as a text-searchable PDF and initiating workflow then you get a 10 out of 10 and you should be asking your boss for a substantial raise and a promotion.

How do you rate on a scale of 0 to 10? How satisfied is your boss with your work? Are you in line for a raise and a promotion?

Back in the 1980s the technology was high-risk, expensive and proprietary and few organizations could afford the substantial investment required to scan and process information with workflow.

Today the technology is low cost and ubiquitous. There is no excuse for not taking full advantage of document imaging functionality.

So, where do you start?

As always, you should begin with a paper-flow analysis. Someone needs to do an inventory of all the paper you receive and produce and then document the business processes it becomes part of.

For every piece of paper you produce you should be asking “why?” Why are you producing paper when you could be producing an electronic document or an electronic form?

In addition, why are you producing multiple copies? Why are you filing multiple copies? What do your staff actually do with the paper? What happens to the paper when it has been processed? Why is it sitting in boxes in expensive off-site storage? Why are you paying to rent space for that paper month after month after month? Is there anything stored there that could cause you pain in any future legal action?

And most importantly, what paper can you dispose of?

For the paper you receive you need to work out what is essential and what can be discarded. You should also talk to your customers, partners and suppliers and investigate if paper can be replaced by electronic documents or electronic forms. Weed out the non-essential and replace whatever you can with electronic documents and electronic forms. For example, provide your customers, partners and suppliers with Adobe electronic forms to complete, sign and return or provide electronic forms on your website for them to complete and submit.

Paper is the enemy, don’t let it win!

Once you have culled all the paper you can, you then need to work out how to process the remaining paper in the most efficient and effective manner possible and that always ends up as a Business Process Management (BPM) exercise. The objectives are speed, accuracy, productivity and automation.

Don’t do anything manually if you can possibly automate it. This isn’t 30 years ago when staff were relatively cheap and computers were very expensive. This is now when staff are very expensive and computers are very cheap (or should I say low-cost?).

If you have to process paper the only time it should be handled is when it is taken from the envelope and fed into a document scanner. After that, everything should be automated and electronic. Yes, your records management department will dutifully want to file paper in file folders and archive boxes but even that may not be necessary.  Don’t accept the mystical term ‘compliance’ as a reason for storing paper until you really do understand the compliance legislation that applies to your business. In most cases, electronic copies, given certain safeguards, are acceptable.

I am willing to bet that your records manager will be operating off a retention schedule that is old, out-of-date, modified from another schedule, copied, modified again and ‘made-to-fit’ your needs. It won’t be his/her fault because I can probably guarantee that no budget was allocated to update the retention schedule on an ongoing basis. I am also willing to bet that no one has a copy of all of the current compliance rules that apply to your business.

In my experience, more than ninety percent of the retention schedules in use are old, out of date and inappropriate for the business processes they are being applied to. Most are also way too complicated and crying out for simplification. Bad retention schedules (and bad retention practices – are you really destroying everything as soon as you are allowed?) are the main reason you are wasting thousands or millions of dollars a year on redundant offsite storage.

Do your research and save a fortune! Yes, records are very important and do deserve your attention because if they don’t get your attention you will waste a lot of money and sooner or later you will be penalised for holding information you could have legally destroyed a long time ago. A good records practice is an essential part of any corporate risk management regime. Ignore this advice at your peril.

Obviously, processing records efficiently requires software. You need a software package that can:

  1. Scan, OCR and Forms Process paper documents.
  2. Capture and store scanned images and associated Metadata plus any other kind of electronic document.
  3. Define and execute workflow.
  4. Provide search and inquiry capabilities.
  5. Provide reporting capabilities.
  6. Audit all transactions.

The above is obviously a ‘short-list’ of the functionality required but you get the idea. There must be at least several hundred proven software packages in the world that have the functionality required. Look under the categories of:

  1. Enterprise Content Management (ECM, ECMS)
  2. Records Management (RM, RMS)
  3. Records and Document Management
  4. Document Management (DM, DMS)
  5. Electronic Document and Records Management (EDRMS)
  6. Business Process Management (BPM)

You need to define your business processing requirements beginning with the paper flow analysis mentioned earlier. Then convert your business processing requirements into workflows in your software package. Design any electronic forms required and where possible, re-design input paper forms to facilitate forms processing. Draw up procedures, train your staff and then test and go live.

The above paragraph is obviously a little short on detail but I am not writing a “how-to” textbook, just a simple guide. If you don’t have the necessary expertise then hire a suitably qualified and experienced consultant (someone who has done it before many times) and get productive.
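To make the six capabilities listed earlier more concrete, here is a toy Python sketch of the capture, store, workflow and audit chain. Every name in it is my own illustration; a real ECM/EDRMS product implements this with a database, a workflow engine and a proper security model.

```python
import datetime

class DocumentStore:
    """Toy sketch of the capture -> store -> workflow -> audit chain."""

    def __init__(self, on_stored=None):
        self.documents = {}
        self.audit_log = []
        self.on_stored = on_stored  # workflow hook fired on every capture

    def capture(self, doc_id, content, metadata):
        """Store a scanned image (or any electronic document) with its metadata."""
        self.documents[doc_id] = {"content": content, "metadata": metadata}
        self.audit_log.append(
            (datetime.datetime.now(datetime.timezone.utc), "capture", doc_id)
        )
        if self.on_stored:
            self.on_stored(doc_id, metadata)  # e.g., route an invoice for approval

    def search(self, **criteria):
        """Naive metadata search: every criterion must match exactly."""
        return [doc_id for doc_id, doc in self.documents.items()
                if all(doc["metadata"].get(k) == v for k, v in criteria.items())]
```

The key design point is that the workflow hook fires automatically at capture time; no one has to remember to start the business process.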

Or, you can just put it off again and hope that you don’t get caught.


Are you still losing information in your shared drives?

by Frank 18. November 2012 06:00

Organizations both large and small, government and private, have been accumulating electronic documents in shared drives since time immemorial (or at least since the early 1980s, when networked computers and file servers became part of the business world). Some organizations still have those early documents, “just in case”.

Every organization has some form of shared drives whether or not they have an effective and all-encompassing document management system in place (and very few organizations even come close to meeting this level of organization).

All have megabytes (10^6, or one million bytes) of information stored in shared drives, the vast majority have gigabytes (10^9), many now have terabytes (10^12) and the worst have petabytes (10^15).

With all the IT consultants now fixated on “Big Data” and how to solve the rapidly growing problem, it won’t be long before we are into really big numbers like exabytes (10^18), zettabytes (10^21) and finally, when civilization collapses under the weight, yottabytes. For the record, a yottabyte is 10^24 bytes, or one quadrillion gigabytes, or to keep it simple, one septillion bytes. And believe me the problem is real, because data breeds faster than rabbits and mice.

Most of this electronic information is unstructured (e.g., Word and text files of various kinds) and most of it is unclassified (other than maybe being in named folders or sub-folders or sub-sub-folders). None of it is easily searchable in a normal lifetime and there are multiple copies and versions some of which will lead to legal and compliance nightmares.

The idea of assigning retention schedules to these documents is laughable and in general everyone knows about the problem but no one wants to solve it. Or, more precisely, no one wants to spend the time and money required to solve this problem. It is analogous to the billions of dollars being wasted each year by companies storing useless old paper records in dusty offsite storage locations; no one wants to step up and solve the problem. It is a race to see which will destroy civilization first, electronic or paper records.

When people can’t find a document they create a new one. No one knows which is the latest version and no one wants to clean up the store in case they accidentally delete something they will need in a month or a year (or two or three). Employees often spend far more (frustrating) time searching for a document to use as a template or starting point than it would take to create a new one from scratch.

No one knows what is readable (WordStar anyone?) and no one knows what is relevant and no one knows what should be kept and what should be destroyed. Many of the documents have become corrupted over time but no one is aware of this.

Some organizations have folders and sub-folders defined in their shared drives which may at one time have roughly related to the type of documents being stored within them. Over time, different people had different ideas about how the shared drives and folders should be organized and they have probably been changed and renamed and reorganized multiple times. Employees, however, didn’t always follow the rules, so there are misfilings, dangerous copies and orphans everywhere.

IT thinks it is an end user problem and end users think it is an IT problem.

The real problem is that most of these unstructured documents are legal records (evidence of a business transaction) and some are even vital records (essential to the ongoing operation of the entity). Some could be potentially damaging and some could be potentially beneficial but no one knows. Some could involve the organization in legal disputes, some could involve the organization in compliance disputes and some could save the organization thousands or millions of dollars; but no one knows.

Some should have been properly destroyed years ago (thus avoiding the aforementioned legal and compliance disputes) and some should never have been destroyed (costing the organization evidence of IP ownership or a billable transaction). But, no one knows.

However, everyone does know that shared drives waste an enormous amount of people’s time and are a virtual ‘black hole’ for both important documents and productivity.

There is a solution to the shared-drives problem but it can’t happen until some bright and responsible person steps up and takes ownership of both the problem and the solution.

For example, here is my recommendation using our product RecCapture (other vendors will have similar products; ours is designed to capture all new and modified electronic documents fully automatically, according to a set of business rules you develop for your organization). RecCapture is an add-on to RecFind 6 and uses the RecFind 6 relational database to store all captured documents.

RecCapture allows you to:

  • Develop and apply an initial set of document rules (which to ignore, which to keep, how to store and classify them, etc.) based on what you know about your shared drives (and yes, the first set of rules will be pretty basic because you won’t know much about the vast amount of documents in your shared drives).
  • Use these rules to capture and classify all corporate documents from your shared drives and store and index them in the RecFind 6 relational SQL database (the initial ‘sweep’).
  • Once they are in the relational database you can then utilize advanced search and global change capabilities to further organize and classify them and apply formal retention schedules. You will find that it is a thousand times easier to organize your documents once they are in RecFind 6.
  • Once the documents are saved in the RecFind 6 database (we maintain them in an inviolate state as indexed Blobs) you can safely and confidently delete most of them from your shared drives.
  • Then use these same document rules (continually being updated as you gain experience and knowledge) to automatically capture all new and modified (i.e., new versions) electronic documents as they are stored in your shared folders. Your users don’t need to change the way they work because the operation of RecCapture is invisible to them; it is a server-centric (not user-centric), fully automatic background process.
  • Use the advanced search features, powerful security system and versioning control of RecFind 6 to give everyone appropriate access to the RecCapture store so users can find any document in seconds thus avoiding errors and frustration and maximizing productivity and job satisfaction.
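For illustration only, here is how such a set of capture rules might look in Python. RecCapture’s actual rule format is not public, so the patterns, actions and category names below are entirely my own invention.

```python
from fnmatch import fnmatch

# Illustrative first-sweep rules: which files to ignore, which to keep, how to classify.
RULES = [
    {"pattern": "*.tmp",  "action": "ignore"},
    {"pattern": "~$*",    "action": "ignore"},          # Office lock files
    {"pattern": "*.docx", "action": "keep", "category": "Correspondence"},
    {"pattern": "*.pdf",  "action": "keep", "category": "Published"},
]

def apply_rules(filename, rules=RULES, default_category="Unclassified"):
    """Return (action, category) for a file seen during the shared drives sweep."""
    for rule in rules:
        if fnmatch(filename, rule["pattern"]):
            return rule["action"], rule.get("category", default_category)
    return "keep", default_category   # first sweep: keep anything unmatched
```

The first rule set is deliberately crude (keep everything you don’t recognise); as the post says, you refine the rules as you learn what is actually in the shared drives.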

RecCapture isn’t expensive, it isn’t difficult to set up and configure and it isn’t difficult to maintain. It can be installed, configured and operational in a few days. It doesn’t interfere with your users and doesn’t require them to do anything other than their normal work.

It captures, indexes and classifies documents of any type. It can also be used to automatically abstract any text-based document during the capture process. It makes all documents findable online (full text and Metadata) via a sophisticated search module (Boolean, Metadata and range searching, etc.) and a military-strength security regime.

Accredited users can access the document store over the network and over the Internet. Stored documents can be exported in native format or industry-standard XML. It is a complete and easy-to-implement solution to the shared drives problem.

I am sure that Knowledgeone Corporation isn’t the only vendor offering modern tools like RecFind 6 and RecCapture so there is no excuse for you continuing to lose documents in your shared drives.

Why don’t you talk to a few enterprise content software vendors and find a tool that suits you? You will be amazed at the difference in your work environment once you solve the shared drives problem.  Then ask the boss for a pay rise and a promotion; you deserve it.

Can you save money with document imaging?

by Frank 4. November 2012 06:00

I run a software company called Knowledgeone Corporation that produces an enterprise content management solution called RecFind 6 that includes extensive document imaging capabilities. We have thousands of customers around the world and as far as I can see most use RecFind 6 for document imaging of one kind or another.

This certainly wasn’t the case twenty years ago when document imaging tools were difficult to use and were expensive stand-alone ‘specialised’ products. Today however, almost every document management or records management product includes document imaging capabilities as a normal part of the expected functionality. That is, document imaging has gone from being an expensive specialised product to just a commodity, an expected feature in almost any information management product.

This means most customers have a readily available, easy-to-use and cost-effective document imaging tool at their fingertips. That being the case there should be no excuse for not utilizing it to save both time and money. However, I guarantee that I could visit any of my customers and quickly find unrealised opportunities for them to increase productivity and save money by using the document imaging capabilities of my product RecFind 6. They don’t even have to spend any money with me because the document imaging functions of RecFind 6 are integrated as ‘standard’ functionality and there is no additional charge for using them.

So, why aren’t my customers and every other vendor’s customers making best use of the document imaging capabilities of their already purchased software?

In my experience there are many reasons but the main ones are:

Lack of knowledge

To the uninitiated, document imaging may look simple, but there is far more to it than first appears, and unless your staff have hands-on experience there is unlikely to be an ‘expert’ in your organization. For this reason I wrote a couple of blog posts earlier this year for the benefit of my customers: Estimating the cost of your next imaging job and The importance of document imaging. This was my attempt to add to the knowledge base about document imaging.

Lack of ownership

The need for document imaging cuts across the whole enterprise, but there is rarely any one person or department charged with ‘owning’ this need and with applying best-practice document imaging policies and procedures to ensure that the organization obtains maximum benefits across all departments and divisions. It tends to be left to the odd innovative employee to come up with solutions just for his or her area.

Lack of consultancy skills

We often say that before we can propose a solution we need to know what the problem is. The way to discover the true nature of a problem is to deploy an experienced consultant to review and analyse the supposed problem and then present an analysis, conclusions and recommendations that should always include a cost-benefit analysis. In our experience very few organizations have staff with this kind of expertise.

Negative impact of the Global Financial Crisis that began in 2008

All over the world since 2008 our customers have been cutting staff, cutting costs and eliminating or postponing non-critical projects. Some of this cost cutting has been self-defeating and has produced negative results and reduced productivity. One common example is the cancelling or postponing of document imaging projects that could have significantly improved efficiency, productivity and competitiveness as well as reducing processing costs. This is especially true if document imaging is combined with workflow to better automate business processes. I also wrote a blog post back in July 2012 for the benefit of our customers, Business Process Management, just what does it entail?, to better explain just what business process management is all about.

In answer to the original question I posed, yes you can save money utilizing simple document imaging functionality especially if you combine the results with new workflow processes to do things faster, more accurately and smarter. It is really a no-brainer and it should be the easiest cost justification you have ever written.

We have already seen how most information management solutions like RecFind 6 have embedded document imaging capabilities so most of you should have existing and paid-for document imaging functionality you can leverage off.

All you really need to do to save your organization money and improve your work processes is look for and then analyse any one of many document imaging opportunities within your organization.

A clue, wherever there is paper there is a document imaging opportunity.

Will the Microsoft Surface tablet unseat the iPad?

by Frank 28. October 2012 06:00

I run a software company called Knowledgeone Corporation that produces a content management system called RecFind 6. We need to be on top of what is happening in the hardware market because we are required to support the latest devices such as Apple’s iPad and Microsoft’s Surface tablet. Our job after all is to capture and manage content and the main job of devices like the iPad and Surface tablet is to allow end users to search for and display content.

At this time we plan to support both with our web client but each device has its special requirements and we need to invest in our software to make sure it perfectly suits each device. The iPad is by now a well-known partner but the Surface tablet is still something of a mystery and we await the full local release and our first test devices.

As we produce business software for corporations and government, our focus is on the use of tablets in a business scenario. This means using the tablets for both input and output: capturing information and documents from the end user as well as presenting information and documents to the end user.

When looked at from a business perspective the Surface tablet starts to be a much better proposition for us than the iPad. I say this because of three factors: connectivity, screen size and open file system. To my mind these are the same three factors that severely limit the use of the iPad in a business environment.

Let me elaborate; I can connect more devices to the Surface, the slightly larger screen makes it easier to read big or long documents and the open file system allows us to easily upload and download whatever documents the customer wants. Ergo, the Surface is a much more useful product for our needs and the needs of our corporate and government customers.

So, after a superficial comparison, the Surface appears to have it all over the iPad or does it?

Maybe not, given the early reviews of the buggy nature of Windows RT. Maybe not, given that Windows 8 will never be as easy to use or as intuitive as iOS. Maybe not, given that the iPad just works and no end user ever needed a training course or user manual. I very much doubt that end users will ‘learn’ Windows 8 as easily as they learnt iOS.

One unkind reviewer even referred to the Surface as a light-weight notebook. I don’t agree, though with its attached keyboard it is very close. I do think it is different to a notebook and I do applaud Microsoft for its investment and innovation. I think the Surface is a new product as opposed to a new-generation notebook and I think most end users will see it that way too.

As is often the case both products have strengths and weaknesses and the real battle is yet to come as early adopters buy the Surface and test it. This is a critical time for acceptance and I hope Microsoft hasn’t released this product before it is ready. The early reviews I have read about the RT version are not encouraging, especially as everyone still has awful memories of the Vista experience.

Microsoft is super brave because it is releasing two new products at the same time, the Surface hardware and Windows 8. Maybe it would have been smarter to get Windows 8 out and proven before loading it on the Surface, but my guess is that Microsoft marketing is in one hell of a hurry to try to turn the iPad tide around. There must be a lot of senior executives in Microsoft desperate to gain control of the mobile revolution in the same way they dominated the PC revolution. The Surface plus Windows 8 is a big-bang approach rather than the more conservative get-wet-slowly approach and I sincerely wish them all the best because we all need a much better tablet for business use. Apple also needs a little scare to remind it to be more respectful of the needs of its customers. Competition is always a good thing for consumers and Apple has had its own way with the iPad for too long now.

Don’t get me wrong, I love my iPad but I am frustrated with its shortcomings and I am hoping that more aggressive competition will force them to lift their game and stop being so damn arrogant.

I am about to place my orders for some Surface tablets for testing as soon as the Windows 8 Pro version is available and promise an update sometime soon about what we find. Watch out for an update in a month or so.

Are you also confused by the term Enterprise Content Management?

by Frank 16. September 2012 06:00

I may be wrong but I think it was AIIM that first coined the phrase Enterprise Content Management to describe both our industry and our application solutions.

Whereas the term isn’t as nebulous as Knowledge Management, it is nevertheless about as useful when trying to understand what organizations in this space actually do. At its simplest level it is a collective term for a number of related business applications like records management, document management, imaging, workflow, business process management, email management and archiving, digital asset management, web site content management, etc.

To simple people like me the more appropriate term or label would be Information Management but as I have already covered this in a previous Blog I won’t belabour the point in this one.

When trying to define what enterprise content management actually means or stands for we can discard the words ‘enterprise’ and ‘management’ as superfluous to our needs and just concentrate on the key word ‘content’. That is, we are talking about systems that in some way create and manage content.

So, what exactly is meant by the term ‘content’?

In the early days of content management discussions we classified content into two broad categories, structured and unstructured. Basically, structured content had named sections or labels and unstructured content did not. Generalising even further we can say that an email is an example of structured content because it has commonly named, standardised and accessible sections or labels like ‘Sender’, ‘Recipient’, ‘Subject’ etc., that we can interrogate and rely on to carry a particular class or type of information. The same general approach would regard a Word document as unstructured because the content of a Word document does not have commonly named and standardised sections or labels. Basically a Word document is an irregular collection of characters that you have to parse and examine to determine content.

Like Newtonian physics, the above generalisations do not apply to everything and can be argued until the cows come home. In truth, every document has an accessible structure of some kind. For example, a Word document has an author, a size, a date written, etc. It is just that it is far easier to find out who the recipient of an email was than the recipient of a Word document. This is because there is a common and standard ‘Tag’ that tells us who the recipient is of an email and there is no such common and standard tag for a Word document.

In our business we call ‘information about information’ (e.g., the recipient and date fields on an email) Metadata. If an object has recognizable Metadata then it is far easier to process than an object without recognizable Metadata. We may then say that adding Metadata to an object is the same as adding structure.
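The difference is easy to see in a few lines of Python: an email’s structure lets us ask for the sender or subject by name, something a plain, unstructured document cannot offer. The raw message below is a made-up example for illustration only.

```python
from email import message_from_string

# A minimal raw email (invented addresses). The headers are the
# Metadata: commonly named, standardised sections we can query directly.
raw = (
    "From: alice@example.com\n"
    "To: bob@example.com\n"
    "Subject: Q3 contract renewal\n"
    "\n"
    "Please find the renewal terms attached.\n"
)

msg = message_from_string(raw)

# 'Information about information' is available by name,
# with no need to parse the body at all.
print(msg["From"])     # alice@example.com
print(msg["Subject"])  # Q3 contract renewal
```

For a Word document there is no equivalent standard tag for the recipient; you would have to parse the whole body and guess, which is exactly the expensive exercise described above.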

Adding structure is what we do when we create a Word document using a template or when we add tags to a Word document. We are normalizing the standard information we require in our business processes so the objects we deal with have the structure we require to easily and accurately identify and process them.

This is of course one of the long-standing problems in our industry, we spend far too much time and money trying to parse and interpret unstructured objects when we should be going back to the coal face and adding structure when the object is first created. This is of course relatively easy to do if we are creating the objects (e.g., a Word document) but not easy to achieve if we are receiving documents from foreign sources like our customers, our suppliers or the government. Unless you are the eight-hundred pound gorilla (like Walmart) it is very difficult to force your partners to add the structure you require to make processing as fast and as easy and as accurate as possible.

There have been attempts in the past to come up with common ‘standards’ that would have regulated document structure but none have been successful. The last one was when XML was the bright new kid on the block and the XML industry rushed headlong into defining XML standards for every conceivable industry to facilitate common structures and to make data transfer between different organizations as easy and as standard as possible. The various XML standardisation projects sucked up millions or even billions of dollars but did not produce the desired results; we are still spending billions of dollars each year parsing unstructured documents trying to determine content.

So, back to the original question, what exactly is Enterprise Content Management? The simple answer is that it is the business or process of extracting useful information from objects such as emails and PDFs and Word documents and then using that information in a business process. It is all about the process of capturing Metadata and content in the most accurate and expeditious manner possible so we can automate business processes as much as possible.

If done properly, it makes your job more pleasant, saves your organization money and makes your customers and suppliers happier. As such it sounds a lot like motherhood (who is going to argue against it?) but it certainly isn’t like manna from heaven. There is always a cost and it is usually significant. As always, you reap what you sow; effort and cost produce rewards.

Is content management something you should consider? The answer is definitely yes with one proviso; please make sure that the benefits are greater than the cost.


Is Information Management now back in focus?

by Frank 12. August 2012 06:00

When we were all learning about what used to be called Data Processing we also learned about the hierarchy or transformation of information. That is, “data to information to knowledge to wisdom.”

Unfortunately, as information management is part of what we call the Information Technology industry (IT) we as a group are never satisfied with simple self-explanatory terms. Because of this age-old flaw we continue to invent and hype new terms like Knowledge Management and Enterprise Content Management most of which are so vague and ill-defined as to be virtually meaningless but nevertheless, provide great scope for marketing hype and consultants’ income.

Because of the ongoing creation of new terminology and the accompanying acronyms we have managed to confuse almost everyone. Personally I have always favoured the term ‘information management’ because it tells it like it is and it needs little further explanation. In the parlance of the common man it is an “old un, but a good un.”

The thing I most disliked about the muddy knowledge management term was the claim that computers and software could produce knowledge. That may well come in the age of cyborgs and true artificial intelligence but I haven’t seen it yet. At best, computers and software produce information which human beings can convert to knowledge via a unique human cognitive process.

I am fortunate in that I have been designing and programming information management solutions for a very long time so I have witnessed first-hand the enormous improvements in technology and tools that have occurred over time. Basically this means I am able to design and build an infinitely better information management solution today than I could have twenty-nine years ago when I started this business. For example, the current product RecFind 6 is a much better, more flexible, more feature-rich and more scalable product than the previous K1 product and it in turn was an infinitely better product than the previous one called RecFind 5.

One of the main factors in them being better products than their predecessors is that each time we started afresh with the latest technology; we didn’t build on the old product, we discarded it completely and started anew. As a general rule of thumb I believe that software developers need to do this on around a five-year cycle. Going past the five-year life cycle inevitably means you end up compromising the design because of the need to support old technology. You are carrying ‘baggage’ and it is synonymous with trying to run the marathon with a hundred-pound (45 kg) backpack.

I recently re-read an old 1995 white paper I wrote on the future of information management software which I titled “Document Management, Records Management, Image Management Workflow Management...What? – The I.D.E.A”. I realised after reading this old paper that it is only now that I am getting close to achieving the lofty ambitions espoused in that earlier paper. It is only now that I have access to the technology required to achieve my design ambitions. In fact I now believe that despite its 1995 heritage this is a paper every aspiring information management solution creator should reference because we are all still trying to achieve the ideal ‘It Does Everything Application’ (but remember that it was my I.D.E.A. first).

Of course, if you are involved in software development then you realise that your job is never done. There are always new features to add and there are always new releases of products like Windows and SQL server to test and certify against and there are always new releases of development tools like Visual Studio and HTML5 to learn and start using.

You also realise that software development is probably the dumbest business in the world to be part of with the exception of drug development, the only other business I can think of which has a longer timeframe between beginning R&D and earning a dollar. We typically spend millions of dollars and two to three years to bring a brand new product to market. Luckily, we still have the existing product to sell and fund the R&D. Start-ups however, don’t have this option and must rely on mortgaging the house or generous friends and relatives or venture capital companies to fund the initial development cycle.

Whatever the source of funding, from my experience it takes a brave man or woman to enter into a process where the first few years are all cost and no revenue. You have to believe in your vision, your dream and you have to be prepared for hard times and compromises and failed partnerships. Software development is not for the faint hearted.

When I wrote that white paper on the I.D.E.A. (the ‘It Does Everything Application’ or, my ‘idea’ or vision at that time) I really thought that I was going to build it in the next few years; I didn’t think it would take another fifteen years. Of course, I am now working on the next release of RecFind so it is actually more than fifteen years.

Happily, I now market RecFind 6 as an information management solution because information management is definitely back in vogue. Hopefully, everyone understands what it means. If they don’t, I guess that I will just have to write more white papers and Blogs.

Does the customer want to deal with a sales person?

by Frank 8. July 2012 06:00

We are in the enterprise content management business or more explicitly in the information management business and we provide a range of solutions including contract management, records management, document management, asset management, HR management, policy management, etc. We are a software company that designs and develops its own products. We also develop and provide all the services required to make our products work once installed at the customer’s site.

However, we aren’t in the ‘creating innovative software’ business even though that is what we do; we are really in the ‘selling our innovative software’ business because without sales there would be no business and no products and no services (and no employees).

We have been in business for nearly 30 years and have watched and participated as both technology and practices have evolved over that time. Some changes are easy to see. For example, we no longer produce paper marketing collateral; we produce all of our marketing collateral in HTML or PDF form for delivery via our website and email. We also now market to the world via our website and the Internet, not just to our ‘local’ area.

Another major area of change has been the interface between the customer and the vendor. Many companies today no longer provide a human-face interface. Most big companies and government agencies no longer maintain a shopfront; they require you to deal with them via a website. Some don’t even allow a phone call or email; your only contact is via a web form.

Sometimes the website interface works but mostly it is a bit hit and miss and a very frustrating experience as the website fails or doesn’t offer the option you need. My pet hate is being forced to fill in a web form and then never hearing back from the vendor. Support is often non-existent or very expensive. From my viewpoint, a major failing of the modern paradigm is that I more often than not cannot get the information I need to evaluate a product from the website. This is when I try to find a way to ask them to please have a sales person contact me as I need to know more about their product or service.

I look forward to a sales person contacting me because I know what I want and I know what questions I need answers to. However, the sad truth is that I am rarely contacted by a sales person (and I refuse to speak to anyone from an Indian call centre because I have no wish to waste my time). However, experience with my customers and prospects tells me that not everyone is as enamoured with sales people as I am. In fact, many of the people I have contact with are very nervous of sales people, some are even afraid of them.

Unfortunately for me, we aren’t in a business where we can sell our products and services via a webpage and cart checkout. We need to understand the customer’s business needs before we can provide a solution so we need to employ high quality sales people who are business savvy and really understand business processes. It is not until I know enough to be able to restate the customer’s requirement in detail that I am in a position to make a sale. Conversely, the customer isn’t going to buy anything from me until he/she is absolutely sure I understand the problem and can articulate the solution.

So, in my industry I rely on a human interface and that usually means a sales person. But, do I really need a sales person and do my customers and prospective customers really want to speak to a sales person? Is there a more modern alternative? Please trust me when I say I have pondered this question many, many times.

Those in my business (selling information management solutions) will know how hard it is to find a good sales person and how hard it is to keep them. The good ones are less than ten-percent of the available pool and even after you hire them they are still besieged by offers from recruiters. Finding and retaining good sales people is in my opinion the biggest problem facing all the companies in our industry. They are also the most expensive of human resources and after paying a recruitment fee and a big salary you are then faced with the 80:20 rule; that is, 20% of the sales force produces 80% of your revenues.

Believe me, if I could find a way to meet my sales targets without expensive and difficult to manage sales people I would. However, as our solutions are all about adapting our technology to the customer’s often very complex business processes this is not a solution that can be sold via a website or automated questionnaire; it requires a great deal of skill and experience.

So for now dear customer, please deal with my sales person; he or she is your best chance of solving that vexing problem that is costing your organization money and productivity. All you really need to do is be very clear about what you want and very focussed on the questions you want answered. There is nothing to be afraid of because if you do your homework you will quickly be able to differentiate the good sales person from the bad sales person and then take the appropriate action. I never deal with a bad sales person and nor should you. I also really enjoy dealing with a professional sales person who knows his/her business and knows how to research and qualify my needs.

A good sales person uses my time wisely and saves me money. A bad sales person doesn’t get the chance to waste my time. This should be your approach too; be happy and willing to deal with a sales person but only if he/she is a professional and can add value to your business.

Sales people call this the value proposition. More explicitly; if the sales person is not able to articulate a value proposition to the customer that resonates with the customer then he/she shouldn’t be there. Look for the value proposition; if it isn’t apparent, close the meeting. Make each and every sales person understand, if they aren’t able to articulate a value proposition for your business then there is no point in continuing the conversation.

Dealing with a sales person isn’t difficult; it is all up to you to know what you want (the value proposition) and what questions to ask. Do your preparation and you will never fear a sales person again.


Where have all the (good) applicants gone?

by Frank 24. June 2012 06:00

I am told again and again by the popular press and unpopular politicians (is there any other kind?) that we in Australia have a skills shortage. I agree but with a strong proviso; we have a skills shortage but we don’t have an applicant shortage.

We have been advertising for a support specialist (we actually hired one), software sales people and experienced .NET programmers. We are trying to grow and expand and the lack of good quality staff is the major impediment.

I have placed the ads on SEEK, on LinkedIn and am also using the services of several recruiting firms so we at least have a wide coverage.

The initial problem is that the majority of candidates either don’t read the ad or don’t understand the ad or just plain ignore the requirements in the ad. Please note that we are talking about very clear and unambiguous requirements like:

  • Please note that applications without a personalised cover letter articulating why you have the right attributes to be successful in this role will not be considered.
  • Previous applicants need not apply and all applicants must be Australian citizens or legal residents.

We also list skill or experience prerequisites which most applicants also either misread, don’t understand or just plain ignore. Again, we list them very clearly as follows:

  • You will have 3+ years’ experience programming in .NET (preferably VB)
  • You will have 3+ years’ experience working with SQL Server (2005/2008)
  • Experience with most of the following: .NET 3.5, ASP, AJAX, LINQ, Threading, Web Services, JavaScript, IIS

Of course, as you may guess, the next biggest problem is that the claims in the resume/curriculum vitae simply do not match reality. We, for example, now test all programming applicants and less than ten percent of the people we interview come even close to passing a simple programming test. For example, applicants who claim to be certified and experts in topics like SQL are unable to answer even the most elementary questions about SQL.

The funniest (strangest?) thing is that invariably, when we ask them after the test why they rated themselves as a 9 out of 10 in SQL but don’t seem to know anything about SQL, they still rate themselves as a 9 out of 10. It is at that point that you realise there is no point in continuing the interview.

We have now changed our approach and in order not to waste time we conduct a simple phone interview with applicants before deciding to bring them in. As you would guess, most never get past the simple phone interview.

In a nutshell, the ‘norm’ appears to be that applicants ignore the requirements in the ad and also lie about their experience and skills in their resumes. Sometimes the lies are so obvious it is funny. For example, we always check applicants in social networking sites like LinkedIn. The differences between the public profile on LinkedIn and the resume we receive are often amazing; different companies, different titles, different dates of employment. It reminds me of that old question, “Are you lying now or were you lying then?” As soon as we see big differences between the LinkedIn profile and the resume we lose interest.

Recruiters are also, in the main, simply hopeless. They want a huge fee for placing an ad on SEEK and sending you a resume. Most don’t interview candidates or screen them in any way or even check references and none take any responsibility. Most beg for an appointment so they can really understand your requirements and then totally ignore them after taking up an hour or two of your valuable time.

However, even after the ‘information-gathering’ appointment and us supplying the recruiter with detailed written requirements the first few resumes we receive are usually nothing like what we asked for. Invariably, when I summon up enough patience to call them and ask why they wasted my time sending me resumes that are nothing like our requirements the answer is usually, “Oh, I thought you might be interested in this one.” Luckily I am not in the habit of gnashing my teeth or I would have none left.

Let me translate that response, “I am a recruiter on a low base salary and high commission and I can’t pay the mortgage on my girlfriend’s flat unless you take one of my candidates so I am going to send you whatever I have in the hope I can earn some commission.”

Then there is the question of literacy and professionalism or the lack thereof. To be fair, a lot of our programming candidates (most actually) are new to this country and English isn’t their first language so we expect to see some unusual phrasing and sentence construction in the resume. Most programming candidates however, despite language difficulties, do a pretty good job in the resume. It is only when we do a phone interview that we discover the candidate’s real grasp of English and unfortunately, for most new arrivals, I can’t employ them in my development environment if they can’t communicate technical matters and nuances at an expert level. It isn’t my job to teach them English.

The real surprise, or shock, is the number of ‘sales professionals’ who can’t spell or construct a sentence or even format a document despite English being their first language. I need these people to be able to construct well-written, cohesive selling proposals for my clients and if the resume is an indicator of their abilities then they fail abysmally.

More importantly, you have to ask: if this is the effort they put into an extremely important document selling themselves, what hope do you have of getting a well-written and totally professional proposal for your customers? We simply reject any sales candidate with a poorly written and formatted resume.

It is strange that most resumes from programming candidates who are also recent arrivals to our country are generally much better written than the resumes of so-called experienced sales professionals who were schooled here. There is obviously something seriously amiss with our education system and the standards of the companies they worked for previously.

The sad bottom line is that out of one hundred applicants we will only want to interview ten and out of those ten only one will prove to be suitable. I would like to say that this is a one-percent success rate but it isn’t because the one good candidate always gets several offers and the chance of actually hiring them is no better than one in two. This gives me a success rate of, at best, one in two hundred.

My theory is that there is a major mismatch between available candidates and the available positions with a lot of poorly qualified people in the market and very few highly qualified people in the market. So we definitely have an ‘available’ skills shortage. It is an awful thing to say but I can only see this situation getting worse in the next few years as our economy slows down because the few good people are going to stay where they are and wait out the bad times.

Where are those cyborgs I see in movies like Prometheus; how much do I have to pay and how long do we have to wait?


What is really involved in converting to a new system?

by Frank 27. May 2012 06:00

Your customer’s old system is now way past its use by date and they have purchased a new application system to replace it. Now all you have to do is convert all the data from the old system to the new system, how hard can that be?

The answer is that it can be very, very hard to get right and it can take months or years if the IT staff or the contractors don’t know what they are doing. In fact, the worst case is that no one can actually figure out how to do the data conversion so you end up two years later still running the old, unsupported and now about-to-fail system. The really bad news is that this isn’t just the worst-case scenario, it is the most common scenario and I have seen it happen time and time again.

People who are good at conversions are good because they have done it successfully many times before. So, don’t hire a contractor based on potential and a good sales spiel, hire a contractor based on record, on experience and on a good many previous references. The time to learn how to do a conversion isn’t on your project.

I will give you guidelines on how to handle a data conversion but as every conversion is different, you are going to have to adapt my guidelines to your project and you should always expect the unexpected. The good news is that if you have a calm, logical and experienced head then any problem is solvable. We have handled hundreds of conversions from every type of system imaginable to our RecFind product and we have never failed even though we have run into every kind of speed bump imaginable. As they say, “expect the best, plan for the worst, and prepare to be surprised.”

1.    Begin by reviewing the application to be converted by looking at the ‘screens’ with someone who uses the system and understands it. Ask the user what fields/data they want to convert. Take screenshots for your documentation. Remember that a field on the screen may or may not be a field in the database; the value may be calculated or generated automatically. Also remember that even though a screen may be called say “File Folder” that all the fields you can see may not in fact be part of the file folder table, they may be ‘linked’ fields in other tables in the database.

2.    You need to document and understand the data model, that is, all the tables and fields and relationships you will need to convert. See if someone has a representation of the data model but, never assume it is up to date. In fact, always assume it is not up to date. You need to work with an IT specialist (e.g., the database administrator) and utilize standard database tools like SQL Server Management Studio to validate the data model of the old system.

3.    Once you think you understand the data model and data to be converted you need to document your thoughts in a conversion report and ask the customer to review and approve it. You won’t get it right first time and expect this to be an iterative process. Remember that the customer will be in ‘discovery’ mode also.

4.    Once you have acceptance of the data to be converted you need to document the data mapping. That is, show where the data will go in the new application. It would be extremely rare that you would be able to duplicate the data model from the old application; it will usually be a case of adapting the data from the old system to the different data model of the new application. Produce a data mapping report and submit it to the customer for sign-off. Again, don’t expect to get this right the first time; it is also an iterative process because both you and the customer are in discovery mode.

5.    Expect that about 20% or more of the data in the old system will be ‘dirty’; that is, bad or duplicate and redundant data. You need to make a decision about the best time to clean up and de-dupe the data. Sometimes it is in the old application before you convert but often it is in the new application after you have converted because the new application has more and better functionality for this purpose. Whichever method you choose, you must clean up the data before going live in production.

6.    Expect to run multiple trial conversions. The customer may have approved a specification but reading it and seeing the data exposed in the new application are two very different experiences. A picture is worth a thousand words and no one is smart enough to know exactly how they want their data converted until they actually see what it looks like and works like in the new application. Be smart and bring in more users to view and comment on the new application; more heads are better than one and new users will always find ways to improve the conversion. Don’t be afraid of user opinion, actively encourage and solicit it.

7.    Once the data mapping is approved you need to schedule end-user training (as close as possible to the cutover to the new system) and the final conversion prior to cutover.
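The advice in step 2, to trust the live database catalogue rather than old diagrams, can be sketched in a few lines. The post talks about SQL Server and Management Studio; purely for illustration the sketch below uses Python’s built-in sqlite3 with made-up table names, but the idea of interrogating the catalogue itself carries over directly.

```python
import sqlite3

# Stand-in for the legacy database (table names are invented).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE FileFolder (Id INTEGER PRIMARY KEY, Title TEXT)")
con.execute("CREATE TABLE Document (Id INTEGER PRIMARY KEY, FolderId INTEGER)")

# List every table and its columns straight from the catalogue,
# rather than trusting a possibly out-of-date data model diagram.
tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
for t in tables:
    cols = [c[1] for c in con.execute(f"PRAGMA table_info({t})")]
    print(t, cols)
```

On SQL Server the equivalent queries go against the `INFORMATION_SCHEMA` views; the point is the same: validate the real schema before you map a single field.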

Of course for the above process to work you also need the tools required to extract data from the old system and import it into the new system. If you don’t have standard tools you will have to write a one-off conversion program. The time to write this is after the data mapping is approved and before the first trial conversion. To make our life easy we designed and built a standard tool we call Xchange; it can connect to any data source and then map and write data to our RecFind 6 system. However, this is not an easy program to design and write and you are unlikely to be able to afford to do this unless you are in the conversion business like we are. You are therefore most likely going to have to design and write a one-off conversion program.

One alternative tool you should not ignore is Microsoft’s Excel. If the old system can export data in CSV format and the new system can import data in CSV format then Excel is the ideal tool for cleaning up, re-sequencing and preparing the data for import.
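The same clean-up and de-dupe pass (step 5 above) can also be scripted directly. Here is a minimal sketch using Python’s standard csv module; the data and the de-dupe key (a normalised title) are invented for illustration, and in practice you would read the real CSV file exported from the old system.

```python
import csv
import io

# Simulated export from the legacy system (made-up rows).
raw_csv = (
    "Title,Date\n"
    " Contract 2011-044 ,2011-03-04\n"
    "contract 2011-044,2011-03-04\n"
    "Policy Manual v2,2012-07-19\n"
)

seen, cleaned = set(), []
for row in csv.DictReader(io.StringIO(raw_csv)):
    row = {k: v.strip() for k, v in row.items()}  # trim stray whitespace
    key = row["Title"].lower()                    # normalised de-dupe key (an assumption)
    if key not in seen:                           # keep only the first occurrence
        seen.add(key)
        cleaned.append(row)

print(len(cleaned))  # 2 rows survive: the case/spacing duplicate is dropped
```

Whether you do this in Excel or in a script, the important thing is that the rules (what counts as a duplicate, what counts as dirty) are agreed with the customer before the final conversion run.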

And finally, please do not forget to sanity check your conversion. You need to document exactly how many records of each type you exported so you can ensure that exactly the same number of records exist in the new system. I have seen far too many examples of a badly managed conversion resulting in thousands or even millions of records going ‘missing’ during the conversion process. You must have a detailed record count going out and a detailed record count going in. The last thing you want is a phone call from the customer a month or two later saying, “it looks like we are missing some records.”
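The record-count reconciliation is simple enough to automate. The sketch below compares hypothetical per-table counts captured before export and after import; the table names and numbers are invented, but the shape of the check is exactly what is described above.

```python
# Hypothetical per-table record counts: captured from the old system
# before export, and from the new system after import.
exported = {"FileFolder": 15230, "Document": 88412, "Person": 940}
imported = {"FileFolder": 15230, "Document": 88409, "Person": 940}

def reconcile(before, after):
    """Return the tables whose counts don't match, with the shortfall."""
    return {t: before[t] - after.get(t, 0)
            for t in before if before[t] != after.get(t, 0)}

missing = reconcile(exported, imported)
print(missing)  # {'Document': 3} -- three records went 'missing' in transit
```

A non-empty result means you stop and investigate before cutover, not a month later when the customer calls.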

Don’t expect the conversion to be easy and do expect it to be an iterative process. Always involve end-users and always sanity check the results.  Take extra care and you will be successful.
