Are you still losing information in your shared drives?

by Frank 18. November 2012 06:00

Organizations large and small, government and private, have been accumulating electronic documents in shared drives since time immemorial (or at least since the early 1980s, when networked computers and file servers became part of the business world). Some organizations still have those early documents, “just in case”.

Every organization has some form of shared drives, whether or not it has an effective and all-encompassing document management system in place (and very few organizations even come close to meeting this level of organization).

All have megabytes (10^6, or one million, bytes or characters) of information stored in shared drives, the vast majority have gigabytes (10^9), many now have terabytes (10^12) and the worst have petabytes (10^15).

As all the IT consultants are now fixated on “Big Data” and how to solve the rapidly growing problem, it won’t be long before we are into really big numbers like exabytes (10^18), zettabytes (10^21) and finally, when civilization collapses under the weight, yottabytes. For the record, a yottabyte is 10^24 bytes, which is one quadrillion gigabytes or, to keep it simple, one septillion bytes. And believe me, the problem is real because data breeds faster than rabbits and mice.
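To put those prefixes in perspective, here is a minimal sketch (in Python, purely for illustration) that walks a raw byte count up through the SI units above, one factor of 1,000 at a time:

```python
# Illustrative only: scale a raw byte count into the SI units listed above.
UNITS = ["bytes", "kilobytes", "megabytes", "gigabytes", "terabytes",
         "petabytes", "exabytes", "zettabytes", "yottabytes"]

def human_readable(num_bytes: float) -> str:
    """Divide by 1,000 until the number is small enough to read."""
    for unit in UNITS:
        if num_bytes < 1000:
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1000
    return f"{num_bytes:.1f} {UNITS[-1]}"

print(human_readable(3_500_000_000_000))   # 3.5 terabytes
print(human_readable(10 ** 24))            # 1.0 yottabytes
```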

Most of this electronic information is unstructured (e.g., Word and text files of various kinds) and most of it is unclassified (other than maybe being in named folders or sub-folders or sub-sub-folders). None of it is easily searchable in a normal lifetime, and there are multiple copies and versions, some of which will lead to legal and compliance nightmares.

The idea of assigning retention schedules to these documents is laughable and in general everyone knows about the problem but no one wants to solve it. Or, more precisely, no one wants to spend the time and money required to solve this problem. It is analogous to the billions of dollars being wasted each year by companies storing useless old paper records in dusty offsite storage locations; no one wants to step up and solve the problem. It is a race to see which will destroy civilization first, electronic or paper records.

When people can’t find a document they create a new one. No one knows which is the latest version and no one wants to clean up the store in case they accidentally delete something they will need in a month or a year (or two or three). Employees often spend far more (frustrating) time searching for a document to use as a template or precedent than it would take to create a new one from scratch.

No one knows what is readable (WordStar anyone?) and no one knows what is relevant and no one knows what should be kept and what should be destroyed. Many of the documents have become corrupted over time but no one is aware of this.

Some organizations have folders and sub-folders defined in their shared drives which may at one time have roughly corresponded to the type of documents being stored within them. Over time, different people had different ideas about how the shared drives and folders should be organized, and they have probably been changed and renamed and reorganized multiple times. Employees, however, didn’t always follow the rules, so there are misfilings, dangerous copies and orphans everywhere.

IT thinks it is an end user problem and end users think it is an IT problem.

The real problem is that most of these unstructured documents are legal records (evidence of a business transaction) and some are even vital records (essential to the ongoing operation of the entity). Some could be potentially damaging and some could be potentially beneficial, but no one knows. Some could involve the organization in legal disputes, some could involve the organization in compliance disputes and some could save the organization thousands or millions of dollars; but no one knows.

Some should have been properly destroyed years ago (thus avoiding the aforementioned legal and compliance disputes) and some should never have been destroyed (costing the organization evidence of IP ownership or a billable transaction). But, no one knows.

However, everyone does know that shared drives waste an enormous amount of people’s time and are a virtual ‘black hole’ for both important documents and productivity.

There is a solution to the shared-drives problem but it can’t happen until some bright and responsible person steps up and takes ownership of both the problem and the solution.

For example, here is my recommendation using our product RecCapture (other vendors will have similar products, designed as ours is to capture all new and modified electronic documents fully automatically according to a set of business rules you develop for your organization). RecCapture is an add-on to RecFind 6 and uses the RecFind 6 relational database to store all captured documents.

RecCapture allows you to:

  • Develop and apply an initial set of document rules (which to ignore, which to keep, how to store and classify them, etc.) based on what you know about your shared drives (and yes, the first set of rules will be pretty basic because you won’t know much about the vast amount of documents in your shared drives). A sketch of what such rules might look like follows this list.
  • Use these rules to capture and classify all corporate documents from your shared drives and store and index them in the RecFind 6 relational SQL database (the initial ‘sweep’).
  • Once they are in the relational database you can then utilize advanced search and global change capabilities to further organize and classify them and apply formal retention schedules. You will find that it is a thousand times easier to organize your documents once they are in RecFind 6.
  • Once the documents are saved in the RecFind 6 database (we maintain them in an inviolate state as indexed Blobs) you can safely and confidently delete most of them from your shared drives.
  • Then use these same document rules (continually updated as you gain experience and knowledge) to automatically capture all new and modified (i.e., new versions of) electronic documents as they are stored in your shared folders. Your users don’t need to change the way they work because the operation of RecCapture is invisible to them; it is a server-centric (not user-centric), fully automatic background process.
  • Use the advanced search features, powerful security system and versioning control of RecFind 6 to give everyone appropriate access to the RecCapture store so users can find any document in seconds, avoiding errors and frustration and maximizing productivity and job satisfaction.
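To make the idea of a rule set concrete, here is a minimal sketch in Python. It is entirely hypothetical: the rule names, folder mappings and file-server path are invented for illustration and bear no relation to RecCapture’s actual rule configuration; it simply shows the shape of a first pass (which files to ignore, and how to classify the rest by folder name).

```python
# Hypothetical first-pass capture rules for sweeping a shared drive.
# All names and paths below are invented; this is not RecCapture syntax.
import os

IGNORE_EXTENSIONS = {".tmp", ".bak", ".lnk"}       # rule: which to ignore
CLASSIFY_BY_FOLDER = {                             # rule: classify by folder
    "contracts": "Legal - Contracts",
    "invoices": "Finance - Accounts Payable",
}
DEFAULT_CLASS = "Unclassified - Review Later"      # rule: keep everything else

def classify(path):
    """Return a classification for a file, or None to skip it."""
    ext = os.path.splitext(path)[1].lower()
    if ext in IGNORE_EXTENSIONS:
        return None
    parts = [p.lower() for p in path.split(os.sep)]
    for folder, record_class in CLASSIFY_BY_FOLDER.items():
        if folder in parts:
            return record_class
    return DEFAULT_CLASS

# The initial 'sweep': walk the share and report what would be captured.
for root, _dirs, files in os.walk(r"\\fileserver\shared"):
    for name in files:
        full_path = os.path.join(root, name)
        record_class = classify(full_path)
        if record_class is not None:
            print(f"capture: {full_path} -> {record_class}")
```

As I said above, the first cut will be basic; the point is that the rules are explicit, reviewable and easy to refine as you learn what is actually in your shared drives.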

RecCapture isn’t expensive, it isn’t difficult to set up and configure and it isn’t difficult to maintain. It can be installed, configured and operational in a few days. It doesn’t interfere with your users and doesn’t require them to do anything other than their normal work.

It captures, indexes and classifies documents of any type. It can also be used to automatically abstract any text-based document during the capture process. It makes all documents findable online (full text and metadata) via a sophisticated search module (Boolean, metadata and range searching, etc.) and a military-strength security regime.

Accredited users can access the document store over the network and over the Internet. Stored documents can be exported in native format or industry-standard XML. It is a complete and easy-to-implement solution to the shared drives problem.

I am sure that Knowledgeone Corporation isn’t the only vendor offering modern tools like RecFind 6 and RecCapture so there is no excuse for you continuing to lose documents in your shared drives.

Why don’t you talk to a few enterprise content management software vendors and find a tool that suits you? You will be amazed at the difference in your work environment once you solve the shared drives problem. Then ask the boss for a pay rise and a promotion; you deserve it.

Can you save money with document imaging?

by Frank 4. November 2012 06:00

I run a software company called Knowledgeone Corporation that produces an enterprise content management solution called RecFind 6 that includes extensive document imaging capabilities. We have thousands of customers around the world and as far as I can see most use RecFind 6 for document imaging of one kind or another.

This certainly wasn’t the case twenty years ago when document imaging tools were difficult to use and were expensive stand-alone ‘specialised’ products. Today however, almost every document management or records management product includes document imaging capabilities as a normal part of the expected functionality. That is, document imaging has gone from being an expensive specialised product to just a commodity, an expected feature in almost any information management product.

This means most customers have a readily available, easy-to-use and cost-effective document imaging tool at their fingertips. That being the case there should be no excuse for not utilizing it to save both time and money. However, I guarantee that I could visit any of my customers and quickly find unrealised opportunities for them to increase productivity and save money by using the document imaging capabilities of my product RecFind 6. They don’t even have to spend any money with me because the document imaging functions of RecFind 6 are integrated as ‘standard’ functionality and there is no additional charge for using them.

So, why aren’t my customers and every other vendor’s customers making best use of the document imaging capabilities of their already purchased software?

In my experience there are many reasons but the main ones are:

Lack of knowledge

To the uninitiated, document imaging may look simple, but there is far more to it than first appears, and unless your staff have hands-on experience there is unlikely to be an ‘expert’ in your organization. For this reason I wrote a couple of Blogs earlier this year for the benefit of my customers: Estimating the cost of your next imaging job and The importance of document imaging. This was my attempt to add to the knowledge base about document imaging.

Lack of ownership

The need for document imaging transects the whole enterprise but there is rarely any one person or department charged with ‘owning’ this need and with applying best-practice document imaging policies and procedures to ensure that the organization obtains maximum benefits across all departments and divisions. It tends to be left to the odd innovative employee to come up with solutions just for his or her area.

Lack of consultancy skills

We often say that before we can propose a solution we need to know what the problem is. The way to discover the true nature of a problem is to deploy an experienced consultant to review and analyse the supposed problem and then present an analysis, conclusions and recommendations that should always include a cost-benefit analysis. In our experience very few organizations have staff with this kind of expertise.

Negative impact of the Global Financial Crisis that began in 2008

All over the world since 2008 our customers have been cutting staff, cutting costs and eliminating or postponing non-critical projects. Some of this cost cutting has been self-defeating and has produced negative results and reduced productivity. One common example is the cancelling or postponing of document imaging projects that could have significantly improved efficiency, productivity and competitiveness as well as reducing processing costs. This is especially true if document imaging is combined with workflow to better automate business processes. I also wrote a Blog back in July 2012 for the benefit of our customers to better explain just what business process management is all about, called Business Process Management, just what does it entail?

In answer to the original question I posed: yes, you can save money utilizing simple document imaging functionality, especially if you combine the results with new workflow processes to do things faster, more accurately and smarter. It is really a no-brainer and it should be the easiest cost justification you have ever written.
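For the sake of illustration, here is what that cost justification can look like as simple arithmetic. Every figure below is a made-up assumption; plug in your own numbers.

```python
# Back-of-the-envelope payback calculation for an imaging project.
# All inputs are hypothetical assumptions; substitute your own figures.
staff_count = 20               # staff who handle paper documents daily
minutes_saved_per_day = 15     # retrieval/refiling time saved per person
hourly_cost = 40.0             # loaded hourly cost per employee ($)
working_days = 230             # working days per year
project_cost = 25_000.0        # scanners, setup and initial back-scanning

annual_saving = (staff_count * (minutes_saved_per_day / 60)
                 * hourly_cost * working_days)
print(f"Annual saving: ${annual_saving:,.0f}")                      # $46,000
print(f"Payback period: {project_cost / annual_saving:.1f} years")  # 0.5 years
```

With these (hypothetical) inputs the project pays for itself in about six months, which is the kind of result that makes the justification easy to write.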

We have already seen how most information management solutions like RecFind 6 have embedded document imaging capabilities so most of you should have existing and paid-for document imaging functionality you can leverage off.

All you really need to do to save your organization money and improve your work processes is look for and then analyse any one of many document imaging opportunities within your organization.

A clue: wherever there is paper, there is a document imaging opportunity.

Will the Microsoft Surface tablet unseat the iPad?

by Frank 28. October 2012 06:00

I run a software company called Knowledgeone Corporation that produces a content management system called RecFind 6. We need to be on top of what is happening in the hardware market because we are required to support the latest devices such as Apple’s iPad and Microsoft’s Surface tablet. Our job after all is to capture and manage content and the main job of devices like the iPad and Surface tablet is to allow end users to search for and display content.

At this time we plan to support both with our web client but each device has its special requirements and we need to invest in our software to make sure it perfectly suits each device. The iPad is by now a well-known partner but the Surface tablet is still something of a mystery and we await the full local release and our first test devices.

As we produce business software for corporations and government our focus is on the use of tablets in a business scenario. This means using the tablets for both input and output, meaning capturing information and documents from the end user as well as presenting information and documents to the end user.

When looked at from a business perspective the Surface tablet starts to be a much better proposition for us than the iPad. I say this because of three factors: connectivity, screen size and an open file system. To my mind these are the same three factors that severely limit the use of the iPad in a business environment.

Let me elaborate: I can connect more devices to the Surface, the slightly larger screen makes it easier to read big or long documents and the open file system allows us to easily upload and download whatever documents the customer wants. Ergo, the Surface is a much more useful product for our needs and the needs of our corporate and government customers.

So, after a superficial comparison, the Surface appears to have it all over the iPad. Or does it?

Maybe not, given the early reviews of the buggy nature of Windows RT. Maybe not, given that Windows 8 will never be as easy to use or as intuitive as iOS. Maybe not, given that the iPad just works and no end user ever needed a training course or user manual. I very much doubt that end users will ‘learn’ Windows 8 as easily as they learnt iOS.

One unkind reviewer even referred to the Surface as a light-weight notebook. I don’t agree, though with its attached keyboard it is very close. I do think it is different to a notebook and I do applaud Microsoft for its investment and innovation. I think the Surface is a new product as opposed to a new-generation notebook and I think most end users will see it that way too.

As is often the case both products have strengths and weaknesses and the real battle is yet to come as early adopters buy the Surface and test it. This is a critical time for acceptance and I hope Microsoft hasn’t released this product before it is ready. The early reviews I have read about the RT version are not encouraging, especially as everyone still has awful memories of the Vista experience.

Microsoft is super brave because it is releasing two new products at the same time, the Surface hardware and Windows 8. Maybe it would have been smarter to get Windows 8 out and proven before loading it on the Surface, but my guess is that Microsoft marketing is in one hell of a hurry to try to turn the iPad tide around. There must be a lot of senior executives in Microsoft desperate to gain control of the mobile revolution in the same way they dominated the PC revolution. The Surface plus Windows 8 is a big-bang approach rather than the more conservative get-wet-slowly approach and I sincerely wish them all the best because we all need a much better tablet for business use. Apple also needs a little scare to remind it to be more respectful of the needs of its customers. Competition is always a good thing for consumers and Apple has had its own way with the iPad for too long now.

Don’t get me wrong, I love my iPad but I am frustrated with its shortcomings and I am hoping that more aggressive competition will force Apple to lift its game and stop being so damn arrogant.

I am about to place my orders for some Surface tablets for testing as soon as the Windows 8 Pro version is available and I promise to report on what we find. Watch out for an update in a month or so.

What is the future for real IT professionals?

by Frank 21. October 2012 06:00

I own and run a software company called Knowledgeone Corporation that produces an enterprise content management solution called RecFind 6. As such, our business is the design and programming of complex, heavy-duty application software. This means that we do the hard stuff, including all of the invention, and that I need really clever and innovative and productive IT people (mainly programmers) to work for me.

I have written previously about how hard it is nowadays to find the quality of people I need, see my previous blog entitled “Where have all the good applicants gone?” However, there is an even bigger problem in our industry with an ongoing fall in standards that began way back with the Y2K problem in the late 1990s, as everyone panicked about the problem of date handling once the year 2000 clicked over.

The problem was basically one of greed: emerging countries like India realized there was a lot of money in providing IT expertise and started mass-producing so-called ‘experts’ and shipping them all over the world. Very soon a resume or list of qualifications or certifications was all that was needed to convince paper-bound and rules-bound bureaucrats that an individual had the requisite skills to either immigrate or be awarded a work permit.

And of course, young people in countries like India and Pakistan and the Philippines moved into the IT industry not because they were motivated by the prospect of becoming IT professionals but because it was their ticket out of poverty and an entry opportunity into countries like the USA, Canada and Australia. So, we started to fill up the ranks of IT professionals with people who did not have the aptitude or motivation, just a strong desire for a better life (and who can blame them?).

Greed reared its ugly head again as local executives linked bigger bonuses to lower costs and the Indian companies further reduced ‘real’ qualification requirements to increase the supply of experts. Universities also got in on the act, again motivated by greed (more students equals more income), and standards were again lowered to create a production-line mentality: “Just pump more out of the system, we can sell them overseas!”

The law of averages applies: as you gradually add less talented and less well qualified people to the talent pool, the ‘average’ standard drops. It is analogous to starting with a glass of the best Scotch whisky and then gradually adding more and more water. After a while it isn’t worth drinking because it isn’t whisky any more, it is just flavoured water. We have similarly diminished our IT talent pool (especially in the ranks of programmers) to the degree where the average programmer can’t actually program.

For a long while we imported tens of thousands of these less-than-adequate programmers and they filled up the holes in mainly large enterprises like banks and finance companies and the public sector, where they could hide their lack of real expertise. However, and unfortunately for them, the Global Financial Crisis (GFC) has accelerated the growth of outsourcing (back to even less qualified people in places like India, Pakistan and the Philippines) and our recent immigrants are now losing their jobs to their countrymen back home. I find this ironic but maybe you don’t agree.

In another previous blog, the world according to Frank, I predicted a significant rise in unemployment numbers within our IT industry. I also said it has been happening for some time but that the real numbers won’t be clear until around mid-2013.

Greed will continue to drive the outsourcing phenomenon just as it will continue to drive the lowering of standards and the overall effect on our industry will be significant as the available pool of real talent becomes smaller and smaller. Similarly, local opportunities for real professionals are disappearing fast. Many of you will end up having to help justify your boss’s big bonus by approving software created overseas when it isn’t really up to scratch and many more of you will be relegated to fixing the crappy code being delivered to your company from the outsourced incompetents. Not a good future for real professionals and definitely not an environment of high job satisfaction.

When I began as a programmer in the 1960s everyone I worked with was highly motivated and everyone had a high aptitude because it was such a difficult industry to enter. You had no chance of working for a mainframe vendor unless you scored at least an A+ on the infamous IBM or Burroughs or ICL or GE or CDC aptitude tests. We were a very small and very exclusive group and to my mind, a dedicated band of professionals who were in IT because we loved it and were really good at it. The average level of expertise was extraordinarily high and this is now patently no longer the case because our industry has changed dramatically since those early and halcyon days.

So what is the future for real IT professionals who are in this industry because they love it and are really good at it? Like with all things, I believe there is good news and there is bad news.

The good news is that as a true IT professional your value is higher than ever, probably much higher than the less-than-competent manager who is interviewing you realizes. This is because many incompetent programmers have now managed to become incompetent managers, and this situation protects incompetent programmers but punishes highly competent ones. Basically, your manager isn’t smart enough to recognize how different you are to the average programmer in his team. This makes getting paid what you are really worth very difficult.

Ergo, if you are really good at what you do and want to be paid what you are worth and want to do challenging and satisfying work, your only chance is to select a company doing challenging work and a smart manager to be interviewed by. Oh, and don’t select a company with a greedy CEO who is looking to increase his bonus by outsourcing (regardless of the result) and lowering costs to impress the board and/or shareholders. Sounds like a tough ask to me; thank God I am self-employed.

Would I recommend the IT industry to any young person today in high school contemplating a future career? No, I probably wouldn’t. I would probably recommend accountancy, business studies, medicine or dentistry instead. So where am I going to find the really bright, talented and motivated programmers I need in the future? It almost doesn’t bear thinking about, but maybe it is an opportunity, as most problems are.

We need a new way to select and train IT professionals; the universities are simply not doing a good enough job. Is there anyone out there with the money, ideas and knowledge willing to set up a new kind of highly selective IT training program? If so, please contact me, I will be more than happy to be one of your first customers.

Are you also confused by the term Enterprise Content Management?

by Frank 16. September 2012 06:00

I may be wrong but I think it was AIIM that first coined the phrase Enterprise Content Management to describe both our industry and our application solutions.

Whereas the term isn’t as nebulous as Knowledge Management, it is nevertheless about as useful when trying to understand what organizations in this space actually do. At its simplest level it is a collective term for a number of related business applications like records management, document management, imaging, workflow, business process management, email management and archiving, digital asset management, web site content management, etc.

To simple people like me the more appropriate term or label would be Information Management but as I have already covered this in a previous Blog I won’t belabour the point in this one.

When trying to define what enterprise content management actually means or stands for we can discard the words ‘enterprise’ and ‘management’ as superfluous to our needs and just concentrate on the key word ‘content’. That is, we are talking about systems that in some way create and manage content.

So, what exactly is meant by the term ‘content’?

In the early days of content management discussions we classified content into two broad categories: structured and unstructured. Basically, structured content had named sections or labels and unstructured content did not. Generalising even further, we can say that an email is an example of structured content because it has commonly named, standardised and accessible sections or labels like ‘Sender’, ‘Recipient’ and ‘Subject’ that we can interrogate and rely on to carry a particular class or type of information. The same general approach would regard a Word document as unstructured because the content of a Word document does not have commonly named and standardised sections or labels. Basically, a Word document is an irregular collection of characters that you have to parse and examine to determine content.

Like Newtonian physics, the above generalisations do not apply to everything and can be argued until the cows come home. In truth, every document has an accessible structure of some kind. For example, a Word document has an author, a size, a date written, etc. It is just that it is far easier to find out who the recipient of an email was than the recipient of a Word document. This is because there is a common and standard ‘Tag’ that tells us who the recipient is of an email and there is no such common and standard tag for a Word document.
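A minimal sketch makes the point concrete. Using nothing but Python’s standard library, the labelled sections of an email are directly addressable; no parsing of the body is needed:

```python
# The 'structure' of an email: common, standard tags we can interrogate.
from email import message_from_string

raw = """From: frank@example.com
To: reader@example.com
Subject: Shared drives
Date: Sun, 16 Sep 2012 06:00:00 +1000

Body text goes here.
"""

msg = message_from_string(raw)
print(msg["To"])       # reader@example.com -- one lookup, no parsing
print(msg["Subject"])  # Shared drives

# A Word document offers no such common tags: to find its 'recipient'
# you would have to open it and parse the text itself.
```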

In our business we call ‘information about information’ (e.g., the recipient and date fields on an email) Metadata. If an object has recognizable Metadata then it is far easier to process than an object without recognizable Metadata. We may then say that adding Metadata to an object is the same as adding structure.

Adding structure is what we do when we create a Word document using a template or when we add tags to a Word document. We are normalizing the standard information we require in our business processes so the objects we deal with have the structure we require to easily and accurately identify and process them.
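As a small example of adding structure at creation time, here is a sketch that stamps standard metadata onto a Word document as it is written. It assumes the third-party python-docx library (one tool among many; anything that writes the document’s core properties would do):

```python
# Sketch: add structure (metadata) to a Word document when it is created.
# Assumes the third-party python-docx library (pip install python-docx).
from docx import Document

doc = Document()
doc.add_paragraph("Supply contract between Acme and Widget Co.")

# Core properties are the closest thing a .docx has to an email's tags.
props = doc.core_properties
props.title = "Supply Contract"
props.author = "frank"
props.category = "Legal - Contracts"    # our own classification convention
props.keywords = "contract; acme; 2012"

doc.save("contract.docx")
```

Documents stamped this way can be identified and processed downstream almost as reliably as an email’s ‘Sender’ and ‘Subject’ fields.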

This is of course one of the long-standing problems in our industry: we spend far too much time and money trying to parse and interpret unstructured objects when we should be going back to the coal face and adding structure when the object is first created. This is of course relatively easy to do if we are creating the objects (e.g., a Word document) but not easy to achieve if we are receiving documents from foreign sources like our customers, our suppliers or the government. Unless you are the eight-hundred-pound gorilla (like Walmart) it is very difficult to force your partners to add the structure you require to make processing as fast and as easy and as accurate as possible.

There have been attempts in the past to come up with common ‘standards’ that would have regulated document structure but none have been successful. The last one was when XML was the bright new kid on the block and the XML industry rushed headlong into defining XML standards for every conceivable industry to facilitate common structures and to make data transfer between different organizations as easy and as standard as possible. The various XML standardisation projects sucked up millions or even billions of dollars but did not produce the desired results; we are still spending billions of dollars each year parsing unstructured documents trying to determine content.

So, back to the original question, what exactly is Enterprise Content Management? The simple answer is that it is the business or process of extracting useful information from objects such as emails and PDFs and Word documents and then using that information in a business process. It is all about the process of capturing Metadata and content in the most accurate and expeditious manner possible so we can automate business processes as much as possible.

If done properly, it makes your job more pleasant, saves your organization money and makes your customers and suppliers happier. As such it sounds a lot like motherhood (who is going to argue against it?) but it certainly isn’t manna from heaven. There is always a cost and it is usually significant. As always, you reap what you sow; effort and cost produce rewards.

Is content management something you should consider? The answer is definitely yes with one proviso; please make sure that the benefits are greater than the cost.


Could you manage all of your records with a mobile device?

by Frank 2. September 2012 06:00

I run a software company and I design and build an enterprise strength content management system called RecFind 6 which among other things, handles all the needs of physical records management.

This is fine if I have a big corporate or government customer because the cost is appropriate to the scale of the task at hand. However it isn’t fine when we receive lots of inquiries from much smaller organizations, like small law firms, that need a records management solution but only have a very small budget.

A very recent inquiry from a small but successful engineering company was also a problem because they didn’t have any IT infrastructure. They had no servers and used Google email. However, they still had a physical records management problem as well as an electronic document management problem, but our solution was way outside their ballpark.

Like any businessman I don’t like to see business walk away especially after we have spent valuable consultancy time helping the customer to understand the problem and define the need.

We have had a lot of similar inquiries lately and it has started me thinking about the need for a new type of product for small business, one that doesn’t require the overhead and expense of an enterprise-grade solution and doesn’t require in-house servers or a high maintenance cost.

Given our recent experience building a couple of iOS (for the iPhone and iPad) and Android (for any Android phone or tablet) apps I am of the opinion that any low cost but technically clever and easy-to-use solution should be based around a mobile device like a smart phone or tablet.

The lack of an in-house server wouldn’t be a problem because we would host the solution servers at a data centre in each country we operate in. Programming it wouldn’t be a problem because that is what we do and we already have a web services API as the foundation.

The only challenge I see is the need to get really creative about the functionality and the user interface. There is no way I can implement all the advanced functionality of the full RecFind 6 product on a mobile device and there is no way I can re-use the user interface from either the RecFind 6 smart-client or web-client. Even scaled down the user interface would be unsuitable for a mobile device; it needs a complete redesign. It isn’t just a matter of adapting to different form factors (screen sizes), it is about using the mobile device in the most appropriate way. It is about designing a product that leverages off the unique capabilities of a mobile device, not trying to force fit an application designed for Windows.

The good news is that there is some amazing technology now available for mobile devices that could easily be put to use for commercial business purposes even though a lot of it was designed for lightweight applications and games. Three examples of very clever new software for mobile devices are Gimbal Context Aware, Titanium Mobile SDK and Vuforia Augmented Reality. But these three development products are just the tip of the iceberg; there is a plethora of clever development tools and new products both in the market and coming to market in the near future.

As a developer, right now the Android platform looks to be my target. This is mainly because of the amount of software being developed for Android and because of the open nature of Android. It allows me to do far more than Apple allows me to do on its sandboxed iOS operating system.

Android also makes it far easier for me to distribute and support my solutions. I love iOS but Apple is just a little too anal and controlling to suit my needs. For example, I require free access to the file system and Apple doesn’t allow that. Nor does it give me the freedom I need to be able to attach devices my customers will need; no standard USB port is a huge pain for application developers.

I am sorry that I don’t have a solution for my smaller customers yet but I have made the decision to do the research and build some prototypes. RecFind 6 will be the back-end residing on a hosted server (in the ‘Cloud’) because it has a superset of the functionality required for my new mobile app. It is also the perfect development environment because the RecFind 6 Web Services SDK makes it easy for me to build apps for any mobile operating system.

So, I already have the backend functionality, the industrial-strength and scalable relational database and the Web Services API plus expertise in Android development using Eclipse and Java. Now all I have to do to produce my innovative new mobile app is find the most appropriate software and development platforms and then get creative.
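To give a feel for the shape of such an app, here is an entirely hypothetical sketch of a thin mobile client talking to a hosted back end over a web-services API. The host name, endpoint and field names are invented for illustration; this is not the RecFind 6 Web Services SDK, just the general pattern:

```python
# Hypothetical sketch: a thin mobile client registering a physical records
# box with a hosted back end. URL, endpoint and fields are all invented.
import json
import urllib.request

HOST = "https://records.example.com/api"   # hosted server, 'in the Cloud'

def register_box(barcode, title, location):
    """Register a records box scanned with the device's camera."""
    payload = json.dumps(
        {"barcode": barcode, "title": title, "location": location}
    ).encode("utf-8")
    req = urllib.request.Request(
        f"{HOST}/boxes",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# e.g. after the phone scans a barcode:
# print(register_box("BX-00042", "2012 invoices", "Bay 3, Shelf 2"))
```

The client stays deliberately simple because all the heavy lifting (database, security, retention) lives on the hosted server.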

It is the getting creative bit that is the real challenge. Wish me luck and watch this space.


Are you addressing the symptoms or the problem?

by Frank 19. August 2012 06:00

We are a software company building, selling and supporting our product RecFind 6 as an information management system and enterprise content management system. We have an in-house support department (we don’t outsource anything) and thousands of customers that contact it with questions and reports of problems they are having.

However, like I suspect happens at most software vendors, it is often very difficult for my support people to initially diagnose the real problem. Obviously, if there is an error message then it is easier to resolve but in most cases there is no error message, just an explanation of what a user thinks is the product not working properly.

If we can connect to the user’s workstation using GoToAssist then we can usually ‘see’ firsthand what the problem is and then help the customer. However, this is not always possible and in a lot of cases my people are working ‘blind’ via phone or email and the only recourse is a question-and-answer dialog until we get to the point where we can define what the user thinks is going wrong and we can get the history of the problem. That is: “When did it start to happen? What changed? Does it happen with everyone or just some users?” Etc., etc.

My people are pretty good at this process but even they get caught occasionally when the customer describes what he/she thinks the solution is rather than what the problem is. This usually takes the form of the customer telling us the ‘fix’ we need to make to the product to solve his/her ‘problem’. The wise support person will always ask, “What were you trying to do?” Once you can determine what the customer was trying to do, you then understand why they are asking for the particular ‘fix’. In most cases, the real problem is that the customer isn’t using the right functionality and once shown how to use the right functionality the need for a ‘fix’ goes away.

Problems also arise when my support people start mistakenly addressing the symptoms instead of the problem. In all fairness, it is often hard to differentiate the two but you can’t fix a problem by addressing the symptoms; you have to go back further and first define and then fix the root problem. Once the root problem is fixed the symptoms magically disappear.

For example, a customer reports multiple documents being created with the same auto number (i.e., duplicate numbers) as a problem. This isn’t really the problem, though that is how the customer sees it. It is in fact a symptom and a clue to the identification of the real problem. In the above example, the root problem will be either an auto-number algorithm not working properly or an auto-number configuration with a flawed design. The former is what we call a ‘bug’ and the latter is what we call ‘finger trouble’; the configured auto number was working precisely as designed but not as the customer intended.

Bugs we fix in code but finger trouble we fix by first clearly understanding what the customer wants to achieve and then by helping them to configure the functionality so it works as expected.
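For the ‘bug’ case, the classic culprit behind duplicate auto numbers is a non-atomic read-then-write. A minimal sketch (illustrative only; a real product would lean on a database sequence or equivalent):

```python
# Why duplicate auto numbers happen: a read-then-write counter with no lock.
import threading

counter = 0
lock = threading.Lock()

def next_number_buggy():
    global counter
    value = counter + 1   # two threads can read the same 'counter' here...
    counter = value       # ...and both hand out the same number
    return value

def next_number_safe():
    global counter
    with lock:            # read-increment-write is now atomic
        counter += 1
        return counter
```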

All experienced support people get to know the difference between:

  • What the customer thinks is the solution versus the problem; and
  • The symptoms versus the problem.

In my experience these are the two most common challenges faced when handling support calls. Recognizing both as early as possible is critical to achieving a speedy resolution and minimizing frustration. Not recognizing both as early as possible leads to longer resolution times and unhappy customers.

If we extend our support experience to real life we realize that these same two challenges face us in everyday life and in all of our social interactions. It is why we often argue at cross-purposes; each party seeing the problem differently because of different perceptions of what the real problem is.

The challenges of misunderstanding are also often harder to overcome in real life because unlike a support call, which has form and structure, our social interactions are mostly unstructured and opportunistic. We don’t start with a problem, we start with a casual dialog and don’t realize we are about to enter a conflict zone until it sneaks up on us.

So if you find yourself in an argument please pause and take the time to ask yourself and the other party, “Just what is it exactly we are arguing about?” Which, upon reflection, is exactly how we should handle each and every support call.

If we take the time to properly define the real problem we would spend far less time arguing and making people unhappy and far more time enjoying the company of our customers and friends. It is a no-brainer really, who wants to go through life in constant conflict?

For my part, I will just continue to ask, “Before I address your request for a change would you mind explaining what you were actually trying to achieve; can you please show me?” And, “What were you doing when you first saw that problem? Please start from the beginning and walk me through the process.” These two questions have worked for me for a very long time and I certainly hope that they work for you.


Is Information Management now back in focus?

by Frank 12. August 2012 06:00

When we were all learning about what used to be called Data Processing we also learned about the hierarchy or transformation of information. That is, “data to information to knowledge to wisdom.”

Unfortunately, as information management is part of what we call the Information Technology (IT) industry, we as a group are never satisfied with simple self-explanatory terms. Because of this age-old flaw we continue to invent and hype new terms like Knowledge Management and Enterprise Content Management, most of which are so vague and ill-defined as to be virtually meaningless but nevertheless provide great scope for marketing hype and consultants’ income.

Because of the ongoing creation of new terminology and the accompanying acronyms we have managed to confuse almost everyone. Personally I have always favoured the term ‘information management’ because it tells it like it is and it needs little further explanation. In the parlance of the common man it is an “old ’un, but a good ’un.”

The thing I most disliked about the muddy knowledge management term was the claim that computers and software could produce knowledge. That may well come in the age of cyborgs and true artificial intelligence but I haven’t seen it yet. At best, computers and software produce information which human beings can convert to knowledge via a unique human cognitive process.

I am fortunate in that I have been designing and programming information management solutions for a very long time so I have witnessed first-hand the enormous improvements in technology and tools that have occurred over time. Basically this means I am able to design and build an infinitely better information management solution today than I could have twenty-nine years ago when I started this business. For example, the current product RecFind 6 is a much better, more flexible, more feature-rich and more scalable product than the previous K1 product, and it in turn was an infinitely better product than the previous one called RecFind 5.

One of the main factors in them being better products than their predecessors is that each time we started afresh with the latest technology; we didn’t build on the old product, we discarded it completely and started anew. As a general rule of thumb I believe that software developers need to do this on around a five-year cycle. Going past the five-year life cycle inevitably means you end up compromising the design because of the need to support old technology. You are carrying ‘baggage’ and it is synonymous with trying to run the marathon with a hundred-pound (45 kg) backpack.

I recently re-read an old 1995 white paper I wrote on the future of information management software which I titled “Document Management, Records Management, Image Management, Workflow Management...What? – The I.D.E.A.” I realised after reading this old paper that it is only now that I am getting close to achieving the lofty ambitions I espoused in it. It is only now that I have access to the technology required to achieve my design ambitions. In fact I now believe that despite its 1995 heritage this is a paper every aspiring information management solution creator should reference, because we are all still trying to achieve the ideal ‘It Does Everything Application’ (but remember that it was my I.D.E.A. first).

Of course, if you are involved in software development then you realise that your job is never done. There are always new features to add, there are always new releases of products like Windows and SQL Server to test and certify against and there are always new releases of development tools like Visual Studio and HTML5 to learn and start using.

You also realise that software development is probably the dumbest business in the world to be part of, with the exception of drug development, the only other business I can think of which has a longer timeframe between beginning R&D and earning a dollar. We typically spend millions of dollars and two to three years to bring a brand new product to market. Luckily, we still have the existing product to sell and fund the R&D. Start-ups, however, don’t have this option and must rely on mortgaging the house or generous friends and relatives or venture capital companies to fund the initial development cycle.

Whatever the source of funding, from my experience it takes a brave man or woman to enter into a process where the first few years are all cost and no revenue. You have to believe in your vision, your dream and you have to be prepared for hard times and compromises and failed partnerships. Software development is not for the faint hearted.

When I wrote that white paper on the I.D.E.A. (the It Does Everything Application or, my ‘idea’ or vision at that time) I really thought that I was going to build it in the next few years; I didn’t think it would take another fifteen years. Of course, I am now working on the next release of RecFind so it is actually more than fifteen years.

Happily, I now market RecFind 6 as an information management solution because information management is definitely back in vogue. Hopefully, everyone understands what it means. If they don’t, I guess that I will just have to write more white papers and Blogs.

Are you really managing your emails?

by Frank 5. August 2012 06:00

It was a long time ago that we all realized that emails constituted eighty percent or more of business correspondence, and little has changed today. Hopefully, we also realised that most of us weren’t managing emails and that this left a potentially lethal compliance and legal hole to plug.

I wrote some white papers on the need to manage emails back in 2004 and 2005 (“The need to manage emails” and “Six reasons why organizations don’t manage emails effectively”) and when I review them today they are just as relevant as they were eight years ago. That is to say, despite the plethora of email management tools now available most organizations I deal with still do not manage their emails effectively or completely.

As a recent example, we had an inquiry from the records manager at a US law firm who said she needed an email management solution but it had to be a ‘manual’ one where each worker would decide if and when and how to capture and save important emails into the records management system. She went on to state emphatically that under no circumstances would she consider any kind of automatic email management solution.

This is the most common request we get. Luckily, we have several ways to capture and manage emails including a ‘manual’ one as requested as well as a fully automatic one called GEM that analyses all incoming and outgoing emails according to business rules and then automatically captures and classifies them within our electronic records and document management system RecFind 6.

We have to provide multiple options because that is what the market demands but it is common sense that any manual system cannot be a complete solution. That is, if you leave it up to the discretion of the operator to decide which emails to capture and how to capture them then you will inevitably have an incomplete and inconsistent solution. Worse still, you will have no safeguards against fraudulent or dishonest behaviour.

Human beings are, by definition, ‘human’ and not perfect. We are by nature inconsistent in our behaviour on a day-to-day basis. We also forget things and sometimes make mistakes. We are not robots or cyborgs, and consistent, perfect behaviour all controlled by Asimov’s three laws of robotics is a long, long way off for most of us.

This means, dear reader, that we cannot be trusted to always analyse, capture and classify emails in a one-hundred-percent consistent manner. Our excuse is that we are, in fact, just human.

The problem is exacerbated when we have hundreds or even thousands of inconsistent humans (your staff) all being relied upon to behave in an entirely uniform and consistent manner. It is in fact ludicrous to expect entirely uniform and consistent behaviour from your staff and it is bad practice and just plain foolish to roll out an email management system based on this false premise. It will never meet expectations. It will never plug all the compliance and legal holes and you will remain exposed no matter how much money you throw at the problem (e.g., training, training and re-training).

The only complete solution is one based on a fully-automatic model whereby all incoming and outgoing emails are analysed according to a set of business rules tailored to your specific needs. This is the only way to ensure that nothing gets missed. It is the only way to ensure that you are in fact plugging all the compliance and legal holes and removing exposure.
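To show what ‘a set of business rules tailored to your specific needs’ might look like, here is a hypothetical sketch; the rules and classifications below are invented for illustration and are not GEM’s actual configuration:

```python
# Hypothetical business rules for fully automatic email capture.
from email.message import EmailMessage

RULES = [  # (test, classification) -- first matching rule wins
    (lambda m: (m["From"] or "").endswith("@regulator.gov"),
     "Compliance - Regulator Correspondence"),
    (lambda m: "invoice" in (m["Subject"] or "").lower(),
     "Finance - Invoices"),
    (lambda m: (m["From"] or "").endswith("@ourcompany.com"),
     None),  # purely internal traffic: do not capture
]
DEFAULT_CLASS = "General Correspondence"

def classify(msg: EmailMessage):
    """Return a classification for an email, or None to skip it."""
    for test, record_class in RULES:
        if test(msg):
            return record_class
    return DEFAULT_CLASS
```

Because rules like these run in the background against every message, nothing depends on an individual remembering, or choosing, to file an email.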

The fully automatic option is also the most cost-effective by a huge margin.

The manual approach requires each and every staff member to spend (waste?) valuable time every single day trying to decide which emails to capture and then actually going through the process time and time again. It also requires some form of licence per employee or per desktop. This licence has a cost and it also has to be maintained, again at a cost.

The automatic approach doesn’t require the employee to do anything. It also doesn’t require a licence per employee or desktop because the software runs in the background talking directly to your email server. It is what we call a low cost, low impact and asynchronous solution.

The automatic model increases productivity and lowers costs. It provides a complete and entirely consistent email management solution at a significantly lower cost than any ‘manual’ model. So, why is it so hard to convince records managers to go with the fully automatic solution? This is the million-dollar question, though in some large organizations it is a multi-million-dollar question.

My response is that you should not be leaving this decision up to the records manager. Emails are the business of all parts of any organization; they don’t just ‘belong’ to the records management department. Emails are an important part of most business processes particularly those involving clients and suppliers and regulators. That is, the most sensitive parts of your business. The duty to manage emails transects all vertical boundaries within any organization. The need is there in accounts and marketing and engineering and in support and in every department.

The decision on how to manage emails should be taken by the CEO or at the very least, the CIO with full cognizance of the risks to the enterprise of not managing emails in a one-hundred percent consistent and complete manner.

In the end email management isn’t in fact about email management, it is about risk management. If you don’t understand that and if you don’t make the necessary decisions at the top of your organization you are bound to suffer the consequences in the future.

Are you going to wait for the first law suit or punitive fine before taking action?

What is a ‘Prescriptive’ RFQ/RFP and why is it bad?

by Frank 22. July 2012 06:02

Twenty years ago our main way of competing for business was to respond to Requests For Quotes (RFQs) and Requests For Proposals (RFPs). Our sales people and technical people spent months on end laboriously responding to detailed questionnaires and spreadsheets with only a small chance of winning because of the number of vendors invited to respond. Luckily, this is no longer the main way we compete for business and we now complete only a fraction of the RFQs/RFPs we used to, much to the relief of my hard working sales and pre-sales staff.

Now we only respond to RFQs and RFPs if we have prior engagement, plus the opportunity for questions and dialogue during the process, together with a good fit for our software and services (we sell information management software and services). We also heavily qualify every opportunity and the first step is to speed-read and scan all proposal documents for what we call ‘road blocks’.

Road blocks are contractual conditions, usually mandatory ones, which would automatically disqualify us from responding. Sometimes these are totally unfair, one-sided and non-commercial contractual conditions; sometimes they are mandatory features we don’t have; and often the road block is simply the prescriptive nature of the request document.

By prescriptive I mean that the request document is spelling out in detail exactly how the solution should work down to the level of screen design, architecture and keystrokes. In most cases prescriptive requests are the result of the author’s experience with or preference for another product.

As we produce a ‘shrink-wrapped’ or ‘off-the-shelf’ product, the RecFind 6 suite, we aren’t able to change the way it looks and works, nor can we change the architecture. In almost every case we could solve the business problem but not in the exact way specified by the author. Because our product RecFind 6 is highly configurable and very flexible we can solve almost any information management or business process management problem, but in our particular way with our unique architecture and our unique look and feel.

In the old days we may have tried to enter into a dialog with the client to see if our solution, although working differently to the way the author envisioned, would be acceptable. The usual answer was, “Why don’t you propose your solution and then we will decide.” Sometimes we did respond and then learned to our chagrin that our response was rejected because it didn’t meet some of the prescriptive requirements. Basically, a big waste of time and money. So, we no longer respond to prescriptive RFQs/RFPs.

But, why is a prescriptive RFQ/RFP a bad thing for the client? Why is it a bad practice to be avoided at all costs?

It is a bad thing because it reduces the client’s options and severely narrows the search for the best solution. In our experience, a prescriptive RFQ/RFP is simply the result of someone either asking for the product they first thought of, or of someone who is so inflexible that he/she isn’t able to think outside the box and isn’t open to innovative solutions.

The end result of a prescriptive RFP/RFQ is always that the client ends up with a poor choice: a third-best or worse solution to the problem.

The message is very simple. If you want to find the best possible solution don’t tell the vendors what the solution is. Rather, tell the vendors what the problem is and give them the opportunity to come up with the most innovative and cost-effective solution possible. Give them the opportunity to be innovative and creative; don’t take away these so very important options.

Please do yourself, and your organization, a favour. If you want the best possible solution clearly explain what the problem is and then challenge the vendors to come up with their best shot. Prescriptive requirements always deny you the best solution.
