Why aren’t you managing your emails?

by Frank 16. January 2020 15:08

  

Emails long ago grew to make up eighty percent or more of business correspondence. Most records managers also realize that most of us aren’t managing emails well, and that this has left a potentially lethal compliance and legal hole to plug.

I have written numerous papers and posts on the need to manage emails, such as:

The need to manage emails differently to paper

Managing Emails, how hard can it be?

I am willing to bet you are still not managing your emails effectively

How to simplify Electronic Document & Email Management

Why are your staff still manually capturing & classifying electronic documents & emails?

The need to manage Emails

Six reasons why organizations do not manage Emails effectively

When I review them today, they are just as relevant as they were years ago. That is to say, despite the plethora of email management tools now available, many organizations still do not manage their emails effectively or completely.

The Manual Model

As a recent example, we had an inquiry from the records manager at a US law firm who said she needed an email management solution, but it had to be a ‘manual’ one where each worker would decide if, when and how to capture and save important emails into the records management system. She went on to state emphatically that under no circumstances would she consider any kind of automatic email management solution.

We have to provide multiple options, such as GEM and the Button, because that is what the market demands, but it is common sense that any manual system cannot be a complete or consistent solution. If you leave it up to the discretion of the end user to decide which emails to capture and how to capture them, then you will inevitably have an incomplete and inconsistent store of emails. Worse still, you will have no safeguards against fraudulent or dishonest behavior.

Human beings are, by definition, ‘human’ and not perfect. We are by nature inconsistent in our behavior on a day to day basis; we forget things and sometimes we make mistakes. We are not robots or cyborgs and consistent, perfect behavior is beyond us.

As humans, we cannot be trusted to always analyze, capture and classify emails in a one-hundred percent consistent manner.

The problem is exacerbated many times over when we have hundreds or even thousands of inconsistent humans (your staff) all being relied upon to behave in an entirely uniform and consistent manner. It is ludicrous to expect entirely uniform and consistent behavior from your staff, and it is bad practice and just plain foolish to roll out an email management system based on this false premise. It will never meet expectations. It will never plug all the compliance and legal holes and you will remain exposed no matter how much money you throw at the problem (e.g., training, training and re-training).

The Automatic Model

The only complete solution is one based on a fully automatic, rules-driven or AI-driven model whereby all incoming and outgoing emails are analyzed at the email server level according to a set of business rules tailored to your specific needs. This is the only way to ensure that nothing gets missed. It is the only way to ensure that you are in fact plugging all the compliance and legal holes and removing exposure and risk.
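To make the idea concrete, below is a minimal sketch of what rules-based analysis at the server level might look like. It is not how GEM itself works; the domains, keywords and classification names are invented placeholders for whatever business rules you would tailor to your own needs.

    import email
    from email import policy

    # Illustrative rules only: capture anything to or from nominated external
    # domains, or anything whose subject mentions a nominated keyword.
    CAPTURE_DOMAINS = {"client-example.com", "regulator-example.gov"}
    CAPTURE_KEYWORDS = ("contract", "invoice", "complaint")

    def classify(raw_message: bytes):
        """Return (capture, category) for a single email, per the rules above."""
        msg = email.message_from_bytes(raw_message, policy=policy.default)
        addresses = " ".join(str(msg.get(h, "")) for h in ("from", "to", "cc")).lower()
        subject = str(msg.get("subject", "")).lower()

        if any(domain in addresses for domain in CAPTURE_DOMAINS):
            return True, "Correspondence - External"
        if any(keyword in subject for keyword in CAPTURE_KEYWORDS):
            return True, "Correspondence - Contractual"
        return False, None

Because the rules run against every message at the server, the decision is made the same way every time, for every employee, with no one having to remember to do anything.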

The fully automatic option is also the most cost-effective by a huge margin.

The manual approach requires each and every staff member to spend (waste?) valuable time every single day trying to decide which emails to capture and then actually going through the process time and time again. It also requires some form of a license per employee or per desktop. This license has a cost and it also has to be maintained, again at a cost.

The automatic approach doesn’t require the employee to do anything except know how to search for emails in your EDRMS. It also doesn’t require a license per employee or desktop because the software runs in the background talking directly to your email server. It is a low cost, low impact and asynchronous solution.

The automatic model increases productivity and lowers costs. It also provides a complete and entirely consistent email management solution, and at a significantly lower cost than any ‘manual’ model. So, why is it so hard to convince some records managers and/or business owners to go with the fully automatic solution?

Who Decides?

This is not a decision that should be left up to the records manager. Emails are the business of all parts of any organization; they don’t just ‘belong’ to the records management department. Emails are an important part of most business processes particularly those involving clients and suppliers and regulators. That is, the most sensitive parts of your business. The duty to manage emails transects all vertical boundaries within any organization. The need is there in accounts and marketing and engineering and in support and in every department.

The decision on how to manage emails should be taken by the CEO or at the very least, the CIO with full cognizance of the risks to the enterprise of not managing emails in a one-hundred percent consistent and complete manner.

It’s Risk Management

In the end, email management isn’t really about email management; it is about risk management. If you don’t understand that, and if you don’t make the necessary decisions at the top of your organization, you are bound to suffer the consequences in the future.

Are you going to wait for the first lawsuit or punitive fine before taking action?

Document Imaging, Forms Processing & Workflow – A Guide

by Frank 28. July 2014 06:00

Document imaging (scanning) has been a part of most business processing since the early 1980s. We, for example, produced our first document-imaging-enabled version of RecFind in 1987. So it isn’t new technology; it is now low-risk, tried and proven technology.

Even in this age of electronic documents most of us still receive and have to read, analyse and process mountains of paper.

I don’t know of any organization that doesn’t use some form of document imaging to help process paper documents. Conversely, I know of very few organizations that take full advantage of document imaging and gain maximum value from it.

For example, just scanning a document as a TIFF file and then storing it on a hard drive somewhere is almost a waste of time. Sure, you can then get rid of the original paper (but most don’t) but you have added very little value to your business.

Similarly, capturing a paper document without contextual information (Metadata) is not smart because you have the document but none of the important transactional information. Even converting a TIFF document to a PDF isn’t smart unless you first OCR (Optical Character Recognition) it to release the important text ‘hidden’ in the TIFF file.

I would go even further and say that if you are not taking the opportunity to ‘read’ and ‘capture’ key information from the scanned document during the scanning process (Forms Processing) then you aren’t adding anywhere near as much value as you could.
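To illustrate, here is a minimal sketch using the open-source Tesseract OCR engine via the pytesseract library. The invoice-number and total patterns are invented examples of the kind of key fields you might lift from a scanned form; a commercial forms-processing product would typically use zonal OCR against a registered form template instead.

    import re
    from PIL import Image      # pip install pillow
    import pytesseract         # pip install pytesseract (requires the Tesseract binary)

    def capture(tiff_path: str):
        """OCR one scanned page, build a text-searchable PDF and lift two example fields."""
        page = Image.open(tiff_path)
        text = pytesseract.image_to_string(page)

        # A text-searchable PDF rendition of the same page.
        pdf_bytes = pytesseract.image_to_pdf_or_hocr(page, extension="pdf")

        # 'Forms processing' in miniature: pull key metadata out of the OCR text.
        invoice_no = re.search(r"Invoice\s*(?:No\.?|#)\s*(\S+)", text, re.I)
        total = re.search(r"Total\s*[:$]?\s*([\d,]+\.\d{2})", text, re.I)
        metadata = {
            "invoice_number": invoice_no.group(1) if invoice_no else None,
            "total": total.group(1) if total else None,
        }
        return pdf_bytes, text, metadata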

And finally, if you aren’t automatically initiating workflow as the document is stored in your database then you are criminally missing an opportunity to automate and speed up your internal business processes.
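And to make that last step concrete, here is a minimal sketch of kicking off a workflow the moment a captured document is stored. The table names and the ‘Accounts Payable review’ step are invented for the example; in practice your EDRMS or BPM tool would raise its own workflow objects.

    import sqlite3

    def store_and_route(db_path: str, pdf_bytes: bytes, metadata: dict) -> None:
        """Store the captured document and immediately queue its first workflow step."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS documents ("
                    "id INTEGER PRIMARY KEY, invoice_number TEXT, total TEXT, content BLOB)")
        con.execute("CREATE TABLE IF NOT EXISTS workflow_tasks ("
                    "id INTEGER PRIMARY KEY, document_id INTEGER, step TEXT, status TEXT)")

        cur = con.execute(
            "INSERT INTO documents (invoice_number, total, content) VALUES (?, ?, ?)",
            (metadata.get("invoice_number"), metadata.get("total"), pdf_bytes))

        # The workflow starts the instant the document is stored; no one has to remember.
        con.execute(
            "INSERT INTO workflow_tasks (document_id, step, status) VALUES (?, ?, ?)",
            (cur.lastrowid, "Accounts Payable review", "pending"))
        con.commit()
        con.close()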

To give it a rating scale, just scanning and storing TIFF files is a 2 out of 10. If this is your score you should be ashamed to be taking a pay packet. If you are scanning, capturing contextual data, OCRing, Forms Processing, storing as a text-searchable PDF and initiating workflow then you get a 10 out of 10 and you should be asking your boss for a substantial raise and a promotion.

How do you rate on a scale of 0 to 10? How satisfied is your boss with your work? Are you in line for a raise and a promotion?

Back in the 1980s the technology was high-risk, expensive and proprietary and few organizations could afford the substantial investment required to scan and process information with workflow.

Today the technology is low cost and ubiquitous. There is no excuse for not taking full advantage of document imaging functionality.

So, where do you start?

As always, you should begin with a paper-flow analysis. Someone needs to do an inventory of all the paper you receive and produce and then document the business processes it becomes part of.

For every piece of paper you produce you should be asking “why?” Why are you producing paper when you could be producing an electronic document or an electronic form?

In addition, why are you producing multiple copies? Why are you filing multiple copies? What do your staff actually do with the paper? What happens to the paper when it has been processed? Why is it sitting in boxes in expensive off-site storage? Why are you paying to rent space for that paper month after month after month? Is there anything stored there that could cause you pain in any future legal action?

And most importantly, what paper can you dispose of?

For the paper you receive you need to work out what is essential and what can be discarded. You should also talk to your customers, partners and suppliers and investigate if paper can be replaced by electronic documents or electronic forms. Weed out the non-essential and replace whatever you can with electronic documents and electronic forms. For example, provide your customers, partners and suppliers with Adobe electronic forms to complete, sign and return or provide electronic forms on your website for them to complete and submit.

Paper is the enemy, don’t let it win!

Once you have culled all the paper you can, you then need to work out how to process the remaining paper in the most efficient and effective manner possible and that always ends up as a Business Process Management (BPM) exercise. The objectives are speed, accuracy, productivity and automation.

Don’t do anything manually if you can possibly automate it. This isn’t 30 years ago when staff were relatively cheap and computers were very expensive. This is now when staff are very expensive and computers are very cheap (or should I say low-cost?).

If you have to process paper the only time it should be handled is when it is taken from the envelope and fed into a document scanner. After that, everything should be automated and electronic. Yes, your records management department will dutifully want to file paper in file folders and archive boxes but even that may not be necessary.  Don’t accept the mystical term ‘compliance’ as a reason for storing paper until you really do understand the compliance legislation that applies to your business. In most cases, electronic copies, given certain safeguards, are acceptable.

I am willing to bet that your records manager will be operating off a retention schedule that is old, out-of-date, modified from another schedule, copied, modified again and ‘made-to-fit’ your needs. It won’t be his/her fault because I can probably guarantee that no budget was allocated to update the retention schedule on an ongoing basis. I am also willing to bet that no one has a copy of all of the current compliance rules that apply to your business.

In my experience, ninety-percent plus of the retention schedules in use are old, out-of-date and inappropriate for the business processes they are being applied to. Most are also way too complicated and crying out for simplification. Bad retention schedules (and bad retention practices – are you really destroying everything as soon as you are allowed?) are the main reason you are wasting thousands or millions of dollars a year on redundant offsite storage.

Do your research and save a fortune! Yes, records are very important and do deserve your attention because if they don’t get your attention you will waste a lot of money and sooner or later you will be penalised for holding information you could have legally destroyed a long time ago. A good records practice is an essential part of any corporate risk management regime. Ignore this advice at your peril.

Obviously, processing records efficiently requires software. You need a software package that can:

  1. Scan, OCR and Forms Process paper documents.
  2. Capture and store scanned images and associated Metadata plus any other kind of electronic document.
  3. Define and execute workflow.
  4. Provide search and inquiry capabilities.
  5. Provide reporting capabilities.
  6. Audit all transactions.

The above is obviously a ‘short-list’ of the functionality required but you get the idea. There must be at least several hundred proven software packages in the world that have the functionality required. Look under the categories of:

  1. Enterprise Content Management (ECM, ECMS)
  2. Records Management (RM, RMS)
  3. Records and Document Management
  4. Document Management (DM, DMS)
  5. Electronic Document and Records Management (EDRMS)
  6. Business Process Management (BPM)

You need to define your business processing requirements beginning with the paper flow analysis mentioned earlier. Then convert your business processing requirements into workflows in your software package. Design any electronic forms required and where possible, re-design input paper forms to facilitate forms processing. Draw up procedures, train your staff and then test and go live.

The above paragraph is obviously a little short on detail but I am not writing a “how-to” textbook, just a simple guide. If you don’t have the necessary expertise then hire a suitably qualified and experienced consultant (someone who has done it before many times) and get productive.

Or, you can just put it off again and hope that you don’t get caught.

 

Are you really managing your emails?

by Frank 5. August 2012 06:00

It was a long time ago that we all realised that emails were eighty percent or more of business correspondence, and little has changed today. Hopefully, we also realised that most of us weren’t managing emails and that this left a potentially lethal compliance and legal hole to plug.

I wrote some white papers on the need to manage emails back in 2004 and 2005 (“The need to manage emails” and “Six reasons why organizations don’t manage emails effectively”) and when I review them today they are just as relevant as they were eight years ago. That is to say, despite the plethora of email management tools now available most organizations I deal with still do not manage their emails effectively or completely.

As a recent example, we had an inquiry from the records manager at a US law firm who said she needed an email management solution, but it had to be a ‘manual’ one where each worker would decide if, when and how to capture and save important emails into the records management system. She went on to state emphatically that under no circumstances would she consider any kind of automatic email management solution.

This is the most common request we get. Luckily, we have several ways to capture and manage emails including a ‘manual’ one as requested as well as a fully automatic one called GEM that analyses all incoming and outgoing emails according to business rules and then automatically captures and classifies them within our electronic records and document management system RecFind 6.

We have to provide multiple options because that is what the market demands, but it is common sense that any manual system cannot be a complete solution. That is, if you leave it up to the discretion of the operator to decide which emails to capture and how to capture them, then you will inevitably have an incomplete and inconsistent solution. Worse still, you will have no safeguards against fraudulent or dishonest behaviour.

Human beings are, by definition, ‘human’ and not perfect. We are by nature inconsistent in our behaviour on a day to day basis. We also forget things and sometimes make mistakes. We are not robots or cyborgs and consistent, perfect behaviour all controlled by Asimov’s three laws of robotics is a long, long way off for most of us.

This means dear reader that we cannot be trusted to always analyse, capture and classify emails in a one-hundred percent consistent manner. Our excuse is that we are in fact, just human.

The problem is exacerbated when we have hundreds or even thousands of inconsistent humans (your staff) all being relied upon to behave in an entirely uniform and consistent manner. It is in fact ludicrous to expect entirely uniform and consistent behaviour from your staff and it is bad practice and just plain foolish to roll out an email management system based on this false premise. It will never meet expectations. It will never plug all the compliance and legal holes and you will remain exposed no matter how much money you throw at the problem (e.g., training, training and re-training).

The only complete solution is one based on a fully-automatic model whereby all incoming and outgoing emails are analysed according to a set of business rules tailored to your specific needs. This is the only way to ensure that nothing gets missed. It is the only way to ensure that you are in fact plugging all the compliance and legal holes and removing exposure.

The fully automatic option is also the most cost-effective by a huge margin.

The manual approach requires each and every staff member to spend (waste?) valuable time every single day trying to decide which emails to capture and then actually going through the process time and time again. It also requires some form of a licence per employee or per desktop. This licence has a cost and it also has to be maintained, again at a cost.

The automatic approach doesn’t require the employee to do anything. It also doesn’t require a licence per employee or desktop because the software runs in the background talking directly to your email server. It is what we call a low cost, low impact and asynchronous solution.

The automatic model increases productivity and lowers costs. It therefore provides a complete and entirely consistent email management solution and at a significantly lower cost than any ‘manual’ model. So, why is it so hard to convince records managers to go with the fully automatic solution? This is the million dollar question though in some large organizations, it is a multi-million dollar question.

My response is that you should not be leaving this decision up to the records manager. Emails are the business of all parts of any organization; they don’t just ‘belong’ to the records management department. Emails are an important part of most business processes particularly those involving clients and suppliers and regulators. That is, the most sensitive parts of your business. The duty to manage emails transects all vertical boundaries within any organization. The need is there in accounts and marketing and engineering and in support and in every department.

The decision on how to manage emails should be taken by the CEO or at the very least, the CIO with full cognizance of the risks to the enterprise of not managing emails in a one-hundred percent consistent and complete manner.

In the end, email management isn’t really about email management; it is about risk management. If you don’t understand that, and if you don’t make the necessary decisions at the top of your organization, you are bound to suffer the consequences in the future.

Are you going to wait for the first lawsuit or punitive fine before taking action?

Have we really thought about disaster recovery?

by Frank 29. July 2012 06:00

The greatest knowledge-loss disaster I can think of was the destruction of the great library of Alexandria by fire around 642 AD. This was the world’s largest and most complete store of knowledge at the time and it was almost totally destroyed. It would take over a thousand years for mankind to rediscover and regain the knowledge that went up in smoke and to this day we still don’t think we have recovered or re-discovered a lot of what was lost. It was an unmitigated disaster for mankind because nearly all of Alexandria’s records were flammable and most were irreplaceable.

By contrast, we still have far older records from ancient peoples like the Egyptians of five-thousand years ago because they carved their records in stone, a far more durable material.

How durable and protected are your vital records?

I mentioned vital records because disaster recovery is really all about protecting your vital records. If you are a business, a vital record is any record without which your business could not run. For the rest of us, a vital record is irreplaceable knowledge or memories. I bet the first thing you grab when fire or flood threatens your home is the family photo album or, in this day and age, the home computer or iPad or backup drive.

In 1996 I presented a paper to the records management society titled “Using technology as a surrogate for managing and capturing vital paper based records.” The technology references are now both quaint and out-of-date but the message is still valid. You need to use the most appropriate technology and processes to protect your vital records.

Interestingly, the challenges today are far greater than they were in 1996 because of the ubiquitous ‘Cloud’.  If you are using Google Docs or Office 365 or even Apple iCloud who do you think is protecting your vital records? Have you heard the term ‘outage’? Would you leave your children with a stranger, especially a stranger who doesn’t even tell you the physical location of your children? A stranger who is liable to say, “Sorry, it appears that your children are missing but under our agreement I accept no liability.” Have you ever read the standard terms and conditions of your Cloud provider? What are your rights if your vital records just disappear? Where are your children right now?

Some challenges are surprisingly no different because we are still producing a large proportion of our vital records on paper. Apart from its major flaws of being highly flammable and subject to water damage, paper is in fact an excellent medium for the long-term preservation of vital records because we don’t need technology to read it; we may say paper is technology agnostic.

By contrast, all forms of electronic or optical storage are strictly technology dependent. What good is that ten-year-old DAT tape if you no longer have the Pentium computer, SCSI card, cable and Windows 95 drivers to read it? Have you moved your vital records to new technology lately?

And now to the old bugbear (a persistent problem or source of annoyance): a backup is not disaster recovery. If your IT manager tells you that you are OK because he takes backups, you should smack him with your heaviest notebook (not the iPad, the iPad is too light, and definitely not the Samsung tablet, it is too fragile).

I have written about what disaster recovery really involves and described our disaster recovery services, so I won’t repeat it here; I have just provided the link so you can read at your leisure.

Suffice to say, the objective of any disaster recovery process is to ensure that you can keep running your business or life with only a minimal disruption regardless of the type or scale of the disaster.

I am willing to bet that ninety-percent of homes and businesses are unprepared and cannot in any way guarantee that they could continue to run their business or home after a major disaster.

We don’t need to look as far back as 642 AD and the Alexandria Library fire for pertinent examples. How about the tsunami in Japan in 2011? Over 200,000 homes totally destroyed and countless business premises wiped from the face of the earth. Tsunamis, earthquakes, floods, fire and wars are all very real dangers no matter where you live.

However, it isn’t just natural disasters you need to be wary of. A recent study published by EMC Corporation offers a look at how companies in Japan and Asia Pacific deal with disaster recovery. According to the study, the top three causes of data loss and downtime are hardware failure (60%), data corruption (47%), and loss of power (44%).

The study also goes on to analyse how companies are managing backups and concludes, “For all the differences inherent to how countries in the Asia Pacific region deal with their data, there is at least one similarity with the rest of the world: Companies are faced with an increasing amount of data to move within the same backup windows. Many businesses in the region, though, still rely on tape backup systems (38%) or CD-ROMs (38%). On this front, the study found that many businesses (53%) have plans to migrate from tape to a faster medium in order to improve the efficiencies of their data backup and recovery.”

It concludes by estimating where backups are actually stored, “The predominant response is to store offsite data at another company-owned location within the same country (58%), which is followed by at a “third-party site” within the same country.”

I certainly wouldn’t be relying on tape as my only recovery medium and neither would I be relying on data and systems stored at the same site or at an employee’s house. Duplication and separation are the two key principles together with proven and regularly tested processes.
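As a minimal illustration of duplication, separation and a tested process, the sketch below copies files to two separate destinations and verifies each copy against a checksum of the original. The destination paths are placeholders (say, an onsite NAS and an offsite or cloud-synced volume), and a real plan would also need to cover databases, retention and regular full restore tests.

    import hashlib
    import pathlib
    import shutil

    # Placeholder destinations on deliberately separate media/sites.
    DESTINATIONS = [pathlib.Path("/mnt/onsite_backup"), pathlib.Path("/mnt/offsite_backup")]

    def sha256(path: pathlib.Path) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def back_up_and_verify(source_dir: str) -> None:
        """Copy every file to both destinations and prove each copy matches the original."""
        source = pathlib.Path(source_dir)
        for src in source.rglob("*"):
            if not src.is_file():
                continue
            expected = sha256(src)
            for dest_root in DESTINATIONS:
                dest = dest_root / src.relative_to(source)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dest)
                if sha256(dest) != expected:
                    raise RuntimeError(f"Backup of {src} to {dest} failed verification")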

I recently spoke to an IT manager who wasn’t sure what his backup (we didn’t get to disaster recovery) processes were. That was bad enough, but when he checked it turned out that they took a full backup once a month and then incremental backups every day, and that he had not tested the recovery process in years. I sincerely hope that he has somewhere to run and hide when and if his company ever suffers a disaster.

In a nutshell, disaster recovery is all about being able to get up and running again in as short a time as possible even if your building burns to the ground. That in fact is the acid test of any disaster recovery plan. That is, ask your IT manager, “If this building burns down Thursday explain to me how we will be up and operating again on Friday morning.”

If his answer doesn’t fill you with confidence then you do not have a disaster recovery plan.

 

Are you running old and unsupported software? What about the risks?

by Frank 29. April 2012 20:59

Many years ago we released a 16 bit product called RecFind version 3.2 and we made a really big mistake. We gave it so much functionality (much of it way ahead of its time) and we made it so stable that we still have thousands of users.

It is running under operating systems like XP that it was never developed or certified for, and it is still ‘doing the job’ for hundreds of our customers. Most frustratingly, when we try to get them to upgrade they usually say, “We can’t justify the expense because it is working fine and doing everything we need it to do.”

However, RecFind 3.2 is decommissioned and unsupported, and the databases it uses (Btrieve, Disam and an early version of SQL Server) are also no longer supported by their vendors.

So our customers are capturing and managing critical business records with totally unsupported software. Most importantly, most of them also do not have any kind of support agreement with us (and this really hurts because they say they don’t need a support agreement because the system doesn’t fail), so when the old system catastrophically fails, which it will, they are on their own.

Being a slow learner, ten years ago I replaced RecFind 3.2 and RecFind 4.0 with RecFind 5.0, a brand new 32 bit product. Once again I gave it too much functionality and made it way too stable. We now have hundreds of customers still using old and unsupported versions of RecFind 5.0 and when we try to convince them to upgrade we get that same response, “It is still working fine and doing everything we need it to do.”

If I was smarter I would have built-in a date-related software time bomb to stop old systems from working when they were well past their use-by date. However, that would have been a breach of faith so it is not something we have or will ever do. It is still a good idea, though probably illegal, because it would have protected our customers’ records far better than our old and unsupported systems do now.

In my experience, most senior executives talk about risk management but very few actually practice it. All over the world I have customers with millions of vital business records stored and managed in systems that are likely to fail the next time IT updates desktop or server operating systems or databases. We have warned them multiple times but to no avail. Senior application owners and senior IT people are ignoring the risk and, I suspect, not making senior management aware of the inevitable disaster. They are not managing risk; they are ignoring risk and just hoping it won’t happen in their reign.

Of course, it isn’t just our products that are still running under IT environments they were never designed or certified for; this is a very common problem. The only worse problem I can think of is the ginormous amount of critical business data being ‘managed’ in poorly designed, totally insecure and teetering-on-failure, unsupportable Access and Excel systems, many of them in the back offices of major banks and financial institutions. One of my customers described the 80 or so Access systems that had been developed across his organization as the world’s greatest virus. None had been properly designed, none had any security and most were impossible to maintain once a key employee or contractor had left.

Before you ask, yes we do produce regular updates for current products and yes we do completely redesign and redevelop our core systems like RecFind about every five years to utilize the very latest technology. We also offer all the tools and services necessary for any customer to upgrade to our new releases; we make it as easy and as low cost as possible for our customers to upgrade to the latest release but we still have hundreds of customers and many thousands of users utilizing old, unsupported and about-to-fail software.

There is an old expression that says you can take a horse to water but you can’t make it drink. I am starting to feel like an old, tired and very frustrated farmer with hundreds of thirsty horses on the edge of expiration. What can I do next to solve the problem?

Luckily for my customers, Microsoft Windows Vista was a failure and very few of them actually rolled it out. Also, luckily for my customers, SQL Server 2005 was a good and stable product and very few found it necessary to upgrade to SQL Server 2008 (soon to be SQL Server 2012). This means that most of my customers using old and unsupported versions of RecFind are utilizing XP and SQL Server 2005, but this will soon change and when it does my old products will become unstable and even just stop working. It is just good luck and good design (programmed tightly to the Microsoft API) that some (e.g., 3.2) still work under XP. RecFind 3.2 and 4.0 were never certified under XP.

So we have a mini-Y2K coming but try as I may I can’t seem to convince my customers of the need to protect their critical and irreplaceable (are they going to rescan all those documents from 10 years ago?) data. And, as I alluded to above, I am absolutely positive that we are only one of thousands of computer software companies in this same position.

In fairness to my customers, the Global Financial Crisis of 2008 was a major factor in the disappearance of upgrade budgets. If the call is to either upgrade software or retain staff then I would also vote to retain staff. Money is as tight as it has ever been and I can understand why upgrade projects have been delayed and shelved. However, none of this changes the facts or averts the coming data-loss disaster.

All over the world government agencies and companies are managing critical business data in old and unsupported systems that will inevitably fail with catastrophic consequences. It is time someone started managing this risk; are you?

 
