What is the future of RecFind? - The Product Road Map

by Frank 19. May 2014 06:00

First, a little history. We began in 1984 with our first document management application, DocFind, marketed by the then Burroughs Corporation (now Unisys). In June 1986 we sold the first version of RecFind, a fully-featured electronic records management system and a vast improvement on the DocFind product. We then progressively added document imaging, electronic document management and workflow, and finally, with RecFind 6, a brand-new paradigm amalgamating all previous functionality: an information management system able to run multiple applications concurrently with a complete set of enterprise content management functionality. RecFind 6 is the eighth completely new iteration of the iconic RecFind brand.

RecFind 6 was, and is, unique in our industry because it was designed as what was previously called a Rapid Application Development (RAD) system. Unlike earlier examples, we provided the high-level toolset so new applications could be inexpensively ‘configured’ (using the DRM) rather than expensively programmed, and new application tables and fields easily populated using Xchange. It immediately gave every customer the ability to change almost anything they needed changed without having to deal with the vendor (us). Each customer had the same tools we used to configure multiple applications within a single copy of RecFind 6. RecFind 6 was the first ECM product to truly empower the customer and to release them from the expensive and time-consuming process of having to negotiate with the vendor to “make changes and get things done.”

In essence, the future of the RecFind brand can be summarised as more of the same, but as an even easier-to-use and more powerful product. Architecturally, we are moving away from the fat-client model (in our case based on the .NET smart-client paradigm) to the zero-footprint, thin-client model to reduce installation and maintenance costs and to support far more operating system platforms than just Microsoft Windows. The new version 2.6 web-client, for instance, happily runs on my iPad within the Safari browser and provides me with all the information I need on my customers when I travel or work from home (we use RecFind 6 as our Customer Relationship Management system, or CRM). I no longer need a PC at home, nor do I need to carry a heavy laptop through airports.

One of my goals for the remainder of 2014 and into 2015 is to convince my customer base to move from the standard .NET smart-client to the RecFind 6 web-client. This is because the web-client provides tangible, measurable cost benefits and will be the basis for a host of new features as we gradually deprecate the .NET smart-client and expand the functionality of the web-client. We do not believe there is a future for the fat/smart-client paradigm; it has seen its day. Customers are rightfully demanding a zero footprint and support for an extensive range of operating environments and devices, including mobile devices such as smartphones and tablets. Our web-client provides the functionality, mobile device support and convenience they are demanding.

Of course the back-end of the product, the image and data repository, also comes in for major upgrades and improvements. We are sticking with MS SQL Server as our database but will incorporate a host of new features and improvements to better facilitate the handling of ‘big data’. We will continue to research and improve the way we capture, store and retrieve data, and because our customers’ databases are now so large (measured in hundreds of gigabytes), we are making it easier and faster to both back up and audit the repository. The objectives, as always, are scalability, speed, security and robustness.
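None of that is exotic: the backup half of it is standard SQL Server administration. As a minimal sketch (the pyodbc driver, the connection string and the RecFind6 database name are all my illustrative assumptions, not product details), a scheduled full backup of a repository that size might look like this:

```python
import pyodbc  # third-party ODBC driver, assumed for this sketch

# BACKUP DATABASE cannot run inside a transaction, so connect with autocommit.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# COMPRESSION shortens the backup window on multi-hundred-gigabyte databases;
# CHECKSUM verifies page integrity as the backup is written.
cursor.execute(
    "BACKUP DATABASE RecFind6 "
    "TO DISK = N'D:\\Backups\\RecFind6_full.bak' "
    "WITH COMPRESSION, CHECKSUM, STATS = 10"
)
while cursor.nextset():  # drain the progress messages BACKUP emits
    pass
conn.close()
```

The point is simply that the “easier and faster to back up and audit” goal lands on well-trodden SQL Server ground.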

We are also adding new functionality to allow the customer to bypass our standard user interface (e.g., the .NET smart-client or web-client) and create their own user interface or presentation layer. The objective is to make it as easy as possible for the customer to create tailored interfaces for each operating unit within their organization. A simple way to think of this functionality is to imagine a single high-level tool that lets you quickly and easily create your own screens and dashboards and program to our SDK.
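To make the idea concrete, here is a hypothetical sketch of such a tailored presentation layer. Every endpoint, field and parameter name below is invented for illustration; the real RecFind 6 SDK has its own calls:

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://recfind.example.com/api"  # hypothetical endpoint

def search_records(query: str, api_key: str) -> list:
    """Fetch matching repository records to feed a custom screen or dashboard."""
    url = f"{BASE_URL}/records?q={urllib.parse.quote(query)}"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# A screen tailored for an accounts team might surface only invoices,
# hiding every other option the full client would offer.
for record in search_records("invoice 2014", api_key="YOUR-KEY"):
    print(record.get("title"), record.get("dateRegistered"))
```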

On the add-on product front, we will continue to invest in products such as the Button, the MINI API, the SDK, GEM, RecCapture, the High Speed Scanning Module and the SharePoint Integration Module. Even though the base RecFind 6 product has a full complement of enterprise content management functionality, these add-on products provide options requested by our customers; they are generally a way to do things faster and more automatically.

We will continue to provide two approaches for document management; the end-user paradigm (RecFind 6 plus the Button) and the fully automatic capture and classification paradigm (RecFind 6 plus GEM and RecCapture). As has been the case, we also fully expect a lot of our customers to combine both paradigms in a hybrid solution.

The major architectural change is away from the .NET smart-client (fat-client) paradigm to the browser-based thin-client or web-client paradigm. We see this as the future for all application software, unconstrained by the strictures of proprietary operating systems like Microsoft Windows.

As always, our approach, our credo, is that we do all the hard work so you don’t have to. We provide the feature rich, scalable and robust image and data repository and we also provide all of the high level tools so you can configure your applications that access our repository. We also continue to invest in supporting and enhancing all of our products making sure that they have the feature set you require and run in the operating environments you require them to. We invest in the ongoing development of our products to protect your investment in our products. This is our responsibility and our contribution to our ongoing partnership.

 

Is this Microsoft’s worst mistake ever?

by Frank 30. November 2013 06:00

I run a software company called Knowledgeone Corporation that has been developing application solutions for the Microsoft Windows platform since the very first release of Windows. As always, our latest product offering, RecFind 6 version 2.6, has to be tested and certified against the latest release of Windows; in this case, that means Windows 8.1.

Like most organizations, we waited for the Windows 8.1 release before upgrading our workstations from Windows 7. The only exceptions were our developers’ workstations, because we bought them new PCs with Windows 8 pre-installed.

We are now testing the final builds of RecFind 6 version 2.6 and have found a major problem. The problem is that Microsoft in its infinite wisdom has decided that you can’t install Windows 8.1 over a Windows 7 system and retain your already installed applications.

The only solution is to install Windows 8 first and then upgrade Windows 8 to Windows 8.1. However, if you are running Windows 7 Enterprise this won’t work either, and you will be told that you will have to reinstall all of your applications.

I am struggling to understand Microsoft’s logic.

Surely Microsoft wants all its customers to upgrade to Windows 8.1? If so, why has it ‘engineered’ the Windows 8.1 upgrade so customers will be discouraged from using it? Does anyone at Microsoft understand how much work and pain is involved in re-installing all your applications?

No, I am not kidding. If you have a PC, or many PCs, with Windows 7 installed you are going to have to install Windows 8 first in order to keep all of your currently installed applications. Then, after spending many hours installing Windows 8 (it is not a trivial process), you must spend more precious time installing Windows 8.1. Microsoft has ensured that you cannot go directly from Windows 7 to Windows 8.1.

Of course, if you are unlucky, you could be living in a country where Microsoft has blocked the downloading of Windows 8, like Australia. Now you are between a rock and a hard place. Microsoft won’t let you install Windows 8 and if you install Windows 8.1 you face days or weeks of frustrating effort trying to re-install all of your existing applications.

 

Here are some quotes from Microsoft:

“You can decide what you want to keep on your PC. You won't be able to keep programs and settings when you upgrade. Be sure to locate your original program installation discs or purchase confirmation emails if you bought programs online. You'll need these to reinstall your programs after you upgrade to Windows 8.1—this includes, for example, Microsoft Office, Apache OpenOffice, and Adobe programs. It's also a good idea to back up your files at this time, too.”

“If you're running Windows 7, Windows Vista, or Windows XP, all of your apps will need to be reinstalled using the original installation discs, or purchase confirmation emails if you bought the apps online.”

If the management at Microsoft wanted to ensure the failure of Windows 8.1 they couldn’t have come up with a better plan than the one they have implemented. By making Windows 8.1 so difficult to install, Microsoft has ensured that its customers will stick with the tried and proven Windows 7 for as long as possible.

Can anyone at Microsoft explain why they thought this was a good idea?

Why don’t you make it easy for end users to find what they need?

by Frank 8. June 2013 06:00

Many records managers and librarians still hold on to the old paradigm that says if a user wants something they should come through the information management professional. They either believe that the end user can’t be trusted to locate the information, or that the task is so complex that only an information professional can do it in a proper and professional manner.

This approach of tightly controlled access to information has been with us for a very long time; unfortunately, not always to the benefit of the end user. End users often interpret it as a vice-grip on power rather than as a service.

In my experience (many years designing information and knowledge management solutions), most end users would like the option of searching for themselves and then deciding whether or not to request assistance.

Of course, it may also be true that the system in use is so complex or awkward that most end users (usually bereft of training) find it too hard and have to fall back on asking the information professional. However, if this is the case then there will invariably be a backlog of requests, and end users will be frustrated because they have to wait days or weeks for a response. In this environment, end users begin to feel like victims rather than valued customers or ‘clients’.

The obvious answer is to make it easy for end users to find what they are looking for, but this obvious answer seems to escape most of us as we continue to struggle with the obscure vagaries of the existing system and an often impenetrable wall of mandated policies, processes and official procedures.

If we really want a solution, it’s time to step outside the old and accepted model and provide a service that end users actually want, can use and appreciate. If we don’t take a wholly new approach and adopt a very different attitude and set of procedures, then nothing will improve and end user dissatisfaction (and anger) will grow until it reaches the point where they simply refuse to use the system.

End users are not stupid; end users are dissatisfied.

One of the core problems, in my experience, is a failure to accept that the requirements of the core, professional users are very different from the requirements of the end users. At the risk of oversimplifying: end users only need to know what they need to know. End users need a ‘fast-path’ into the system that allows them to find out what they need to know (and nothing more) in the shortest possible time and via the absolute minimum number of keystrokes, mouse-clicks or swipes.

End users need a different interface to a system than professional users do.

This isn’t because they are less smart; it is because the ‘system’ is just one of the many things they have to contend with during a working day. It is not their core focus. They don’t have time (or the interest) to become experts, and nor should they have to.

If end users can’t find stuff it isn’t their fault; it is the system’s fault.

The system, of course, is more than just the software. It is the way menus and options are configured and made available; it is the policies and procedures that govern access and rights to information; it is the attitude of those ‘in power’ to those who are not empowered.

If you want happy and satisfied end users, give them what they need.

Make sure that the choices available to an end user are entirely appropriate to each class of end user. Don’t show them more options than they need, and don’t give them more information than they are asking for. Don’t ask them to navigate down multiple levels of menus before they can ask the question they want to ask; let them ask the question as the very first thing they do in the system. And please don’t overwhelm them with information; just provide exactly and precisely what they asked for.

If you want the end users off your back, give them what they need.

I fall back on my original definition of a Knowledge Management system from 1997, “A Knowledge Management system is one that provides the user with the explicit information required, in exactly the form required, at precisely the time the user needs it.”

With hindsight, my simple definition can be applied to any end user’s needs. That is, please provide a system that provides the end user with the explicit information required, in exactly the form required, at precisely the time the end user needs it.

What could be more simple?

More references:

The IDEA – 1995

Knowledge Management, the Next Challenge? – 1997

Whatever happened to the Knowledge Management Revolution? – 2006

A Knowledge Management System – A Discourse – 2008

 

Records Management in the 21st century; you have computers now, do it differently

by Frank 1. June 2013 06:32

I own and run a computer software company called Knowledgeone Corporation, and we have specialised in what is now known as enterprise content management software since 1984, when we released our first product, DocFind. We are now into the eighth iteration of our core and iconic product RecFind and have sold and installed it at thousands of sites where we manage corporate records and electronic documents.

I have personally worked with hundreds of customers to ensure that we understand and meet their requirements, and I have designed and specified every product we have delivered over the last 29 years. So while I have never been a practicing records manager, I do know a great deal about records and document management and the vagaries of the practice all around the world.

My major lament is that many records managers today still want to run their ‘business’ in exactly the same way it was run 30 or 50 or even a hundred years ago; that is, as a physical model, even when using computers and automated solutions like our product RecFind 6. This means we still see overcomplicated classification systems, overcomplicated file numbering systems and overcomplicated manual processes for the capture and classification of paper, document images, electronic documents and emails.

It is a mindset that is locked in the past and can’t see beyond the confines of the file room.

I also still meet records managers who believe each and every employee has a responsibility to ‘become’ a junior records manager and both fully comprehend and religiously follow all of the old-fashioned, hopelessly overcomplicated and time-consuming processes laid out for the orderly capture of corporate documents.

I have news for all those locked-in-the-past records managers. Your approach hasn’t worked in the last 30 years and it certainly will not work in the future.

Smart people don’t buy sophisticated computer hardware and application software and then try to replicate the physical model for little or no benefit. Smart people look at what a computer system can do, as opposed to 20,000 linear feet of filing shelves or 40 Compactuses and 30 boxes of filing cards, and immediately realize that they have the power to do everything differently: faster, more efficiently and infinitely smarter. They also realize that there is no need to overburden already busy end users by forcing them to become very bad and very inconsistent junior records managers. End users are not hired to be records managers; they are hired to be engineers, sales people, accountants, PAs, etc., and most already have 8 hours of work a day without you imposing more on them.

There is always a better way and the best way is to roll out a records and document and email management system that does not require your end users to become very bad and inconsistent junior records managers. This way it may even have a chance of actually working.

Please throw that old physical model away. It has never worked well when applied to computerised records, document and email management and it never will. Remember that famous adage, “The definition of insanity is to keep doing the same thing and to expect the results to be different”?

I guarantee two things:

1. Your software vendor’s consultant is more than happy to offer advice and guidance; and

2. He/she has probably worked in significantly more records management environments than you have and has a much broader range of experience than you do.

It doesn’t hurt to ask for advice and it doesn’t hurt to listen.

What is the future for real IT professionals?

by Frank 21. October 2012 06:00

I own and run a software company called Knowledgeone Corporation that produces an enterprise content management solution called RecFind 6. As such, our business is the design and programming of complex, heavy-duty application software. This means that we do the hard stuff, including all of the invention, and that I need really clever, innovative and productive IT people (mainly programmers) to work for me.

I have written previously about how hard it is nowadays to find the quality of people I need; see my previous blog entitled “Where have all the good applicants gone?” However, there is an even bigger problem in our industry: an ongoing fall in standards that began way back with the Y2K problem in the late 1990s, as everyone panicked about the problem of date handling once the year 2000 ticked over.

The problem was basically one of greed. Emerging countries like India realized there was a lot of money in providing IT expertise and started mass-producing so-called ‘experts’ and shipping them all over the world. Very soon a resume or a list of qualifications or certifications was all that was needed to convince paper-bound and rules-bound bureaucrats that an individual had the prerequisite skills to either immigrate or be awarded a work permit.

And of course, young people in countries like India and Pakistan and the Philippines moved into the IT industry not because they were motivated by the prospect of becoming IT professionals but because it was their ticket out of poverty and an entry opportunity into countries like the USA, Canada and Australia. So we started to fill the ranks of IT professionals with people who did not have the aptitude or motivation, just a strong desire for a better life (and who can blame them?).

Greed reared its ugly head again as local executives linked bigger bonuses to lower costs, and the Indian companies further diluted ‘real’ qualifications to increase the supply of experts. Universities also got in on the act, again motivated by greed (more students equals more income), and standards were again lowered to create a production-line mentality: “Just pump more out of the system, we can sell them overseas!”

The law of averages applies: as you gradually add less talented and less well qualified people to the talent pool, the ‘average’ standard inevitably falls. It is analogous to starting with a glass of the best Scotch whisky and then gradually adding more and more water. After a while it isn’t worth drinking because it isn’t whisky any more; it is just flavoured water. We have similarly diminished our IT talent pool (especially in the ranks of programmers) to the degree where the average programmer can’t actually program.

For a long while we imported tens of thousands of these less-than-adequate programmers, and they filled up the holes in mainly large enterprises like banks and finance companies and the public sector, where they could hide their lack of real expertise. However, and unfortunately for them, the Global Financial Crisis (GFC) has accelerated the growth of outsourcing (back to even less qualified people in places like India, Pakistan and the Philippines) and our recent immigrants are now losing their jobs to their home countrymen. I find this ironic, but maybe you don’t agree.

In another previous blog, “The World According to Frank”, I predicted a significant rise in unemployment numbers within our IT industry. I also said it has been happening for some time but that the real numbers won’t be clear until around mid-2013.

Greed will continue to drive the outsourcing phenomenon just as it will continue to drive the lowering of standards, and the overall effect on our industry will be significant as the available pool of real talent becomes smaller and smaller. Similarly, local opportunities for real professionals are disappearing fast. Many of you will end up having to help justify your boss’s big bonus by approving software created overseas when it isn’t really up to scratch, and many more of you will be relegated to fixing the crappy code delivered to your company by the outsourced incompetents. Not a good future for real professionals, and definitely not an environment of high job satisfaction.

When I began as a programmer in the 1960s everyone I worked with was highly motivated and everyone had a high aptitude because it was such a difficult industry to enter. You had no chance of working for a mainframe vendor unless you scored at least an A+ on the infamous IBM or Burroughs or ICL or GE or CDC aptitude tests. We were a very small and very exclusive group and to my mind, a dedicated band of professionals who were in IT because we loved it and were really good at it. The average level of expertise was extraordinarily high and this is now patently no longer the case because our industry has changed dramatically since those early and halcyon days.

So what is the future for real IT professionals who are in this industry because they love it and are really good at it? Like with all things, I believe there is good news and there is bad news.

The good news is that as a true IT professional your value is higher, probably much higher, than the less-than-competent manager who is interviewing you knows. This is because many incompetent programmers have now managed to become incompetent managers, and this situation protects incompetent programmers but punishes highly competent ones. Basically, your manager isn’t smart enough to recognize how different you are from the average programmer in his team. This makes getting paid what you are really worth very difficult.

Ergo, if you are really good at what you do, want to be paid what you are worth and want to do challenging and satisfying work, your only chance is to select a company doing challenging work and a smart manager to be interviewed by. Oh, and don’t select a company with a greedy CEO who is looking to increase his bonus by outsourcing (regardless of the result) and lowering costs to impress the board and/or shareholders. Sounds like a tough ask to me; thank God I am self-employed.

Would I recommend the IT industry to any young person today in high school contemplating a future career? No, I probably wouldn’t. I would probably recommend accountancy, business studies, medicine or dentistry instead. So where am I going to find the really bright, talented and motivated programmers I need in the future? It almost doesn’t bear thinking about, but maybe it is an opportunity, as most problems are.

We need a new way to select and train IT professionals; the universities are simply not doing a good enough job. Is there anyone out there with the money, ideas and knowledge willing to set up a new kind of highly selective IT training program? If so, please contact me, I will be more than happy to be one of your first customers.

The PC is dead, or is it?

by Frank 7. October 2012 06:00

The financial and IT news services tell us very pessimistic stories about the major PC players like DELL and HP. The general gist is that sales of PCs are down, sales of tablets are up, and the share prices of DELL and HP are falling. Just yesterday the CEO of HP announced to a stunned market that 2013 will likely be worse than 2012. She also lamented the frequent turnover of HP CEOs since the demise of Carly Fiorina. To my mind that was a strange thing to do when also announcing that she won’t be improving anything and will in fact be in charge when things get worse. The mental picture I get is of the captain steering the ship into the rocks. My guess is that the game of musical chairs at the top of HP will continue for some time yet, because market analysts don’t like bad news and shareholders don’t like falling share prices.

So is the PC dead? Will we see it completely replaced in our homes and offices within a few short years? Are you still planning to buy a new PC? If so, why? Is business still planning to buy more PCs, for example to support Windows 8?  Will business in fact move to Windows 8 in 2013 or 2014 or 2015? Why would anyone be investing in expensive new PC hardware for their home or office? Are there better alternatives available now?

To my mind the global financial crisis that began in 2007/2008 has at least as much to do with falling PC sales as the advent of clever tablets from the likes of Apple. All over the western world people are holding back on spending money and are simply not replacing ‘older’ PCs or notebooks. In fact, I see the current crop of tablets as complementary devices to PCs and notebooks, not replacements. I blogged about this previously in “Why aren’t tablets the single solution yet?” and still believe my arguments to be valid.

My customers, for example, still use PCs in the office to run my enterprise content management system RecFind 6 and use notebooks to run it when travelling. However, they are also now demanding that I provide support for a range of mobile devices including smartphones and tablets. But my customers are not replacing their PCs and notebooks with tablets; they are using tablets in an appropriate way to extend what they can do with mobile workers.

I also think that companies like DELL and HP are their own worst enemies. They have both exhibited a surprising lack of innovation and salesmanship and their marketing people seem to be about five years behind the market. They have both outsourced their services and support to awful Indian call centres and focussed more on reducing costs than on improving customer service. Customers have a way of showing their disapproval by walking away and I believe this is what they are doing.

So whereas I think tablets are the future, I don’t think they are yet capable enough to replace PCs and notebooks in the office environment. I think most people have a tablet in addition to their PC and notebook (and smartphone).

I don’t see tablets, even the next generation, having all the functionality, screen size and power we need to replace PCs in the office. Even in the home, the small screen size of a tablet diminishes its value, as does the lack of applications and connectivity; not everyone wants to replace their working backup drive and USB printer just to accommodate Apple.

I also think that PCs and notebooks are too expensive and that Intel, DELL and HP are too used to big margins. In economics we talk about the intersection of the price and demand curves; the theoretical point at which we make the most money. Set the price too high and you sell fewer and make less money. Set the price too low and you sell more but make less profit. Somewhere in the middle is the point where we set our price to get the optimum sales and profit results.

For example, if Apple priced the New iPad at $5,000 it wouldn’t sell any and it wouldn’t make any money, but if it priced it at $10 it would sell a shed-load but also wouldn’t make any money. At $400 plus it seems to sell as many as it can produce and also makes the maximum profit. Apple has found its optimum price point.
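That trade-off is easy to sketch numerically. With an invented linear demand curve and an assumed $250 unit cost (both numbers are mine, purely for illustration), the profit-maximising price lands in exactly that region:

```python
# Toy demand curve: the higher the price, the fewer units sold (made-up numbers).
def units_sold(price: float) -> float:
    return max(0.0, 10_000 - 18 * price)

UNIT_COST = 250.0  # assumed cost to build one device

def profit(price: float) -> float:
    return (price - UNIT_COST) * units_sold(price)

# Scan candidate prices to find the optimum point described above.
best_price = max(range(250, 601), key=profit)
print(best_price, units_sold(best_price), round(profit(best_price)))
# -> 403 2746.0 420138: the optimum sits near $400; far higher or far lower both lose money.
```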

Every vendor struggles to find the optimum price point, and over time, as technology matures and becomes more common, prices have to drop. I don’t think the prices of PCs and notebooks have dropped enough. It’s just economics, stupid: your PC and notebook prices are way above your optimum price point, and that’s one reason why people are not buying them.

So no, I don’t think PCs are dead. I think their sales have dropped because of a combination of the ongoing global financial crisis and poor management and product decisions from the major players like Intel, DELL and HP. Apple has cleverly capitalised on this situation, it didn’t create it. Apple is clearly innovative, HP and DELL are not.

I believe that we are yet to see at least one more re-invention of the PC and notebook, albeit of a higher quality and with more innovation than Intel’s Ultrabook attempt at reinventing the notebook. The re-invention should also come with a new, lower pricing algorithm, not a raising of prices as attempted by Intel with the Ultrabook range of notebooks.

So, Intel, DELL and HP; the ball is firmly in your court. You all employ scores of really smart and innovative people. Why don’t you give them the challenge? If you come up with a realistically priced and innovative new PC solution I would certainly buy a few. But, please do something about your service levels; I for one am really tired of being bounced around Indian, Singaporean and Philippine call centres. If foreign call centres are part of the new deal I am afraid that I want no part of it. That model is broken. If you want my business then I demand better service.

 

Are you also confused by the term Enterprise Content Management?

by Frank 16. September 2012 06:00

I may be wrong but I think it was AIIM that first coined the phrase Enterprise Content Management to describe both our industry and our application solutions.

Whereas the term isn’t as nebulous as Knowledge Management, it is nevertheless about as useful when trying to understand what organizations in this space actually do. At its simplest level it is a collective term for a number of related business applications: records management, document management, imaging, workflow, business process management, email management and archiving, digital asset management, web site content management, etc.

To simple people like me the more appropriate term or label would be Information Management, but as I have already covered this in a previous blog I won’t belabour the point in this one.

When trying to define what enterprise content management actually means or stands for we can discard the words ‘enterprise’ and ‘management’ as superfluous to our needs and just concentrate on the key word ‘content’. That is, we are talking about systems that in some way create and manage content.

So, what exactly is meant by the term ‘content’?

In the early days of content management discussions we classified content into two broad categories, structured and unstructured. Basically, structured content had named sections or labels and unstructured content did not. Generalising even further we can say that an email is an example of structured content because it has commonly named, standardised and accessible sections or labels like ‘Sender’, ‘Recipient’, ‘Subject’ etc., that we can interrogate and rely on to carry a particular class or type of information. The same general approach would regard a Word document as unstructured because the content of a Word document does not have commonly named and standardised sections or labels. Basically a Word document is an irregular collection of characters that you have to parse and examine to determine content.

Like Newtonian physics, the above generalisations do not apply to everything and can be argued until the cows come home. In truth, every document has an accessible structure of some kind. For example, a Word document has an author, a size, a date written, etc. It is just that it is far easier to find out who the recipient of an email was than the recipient of a Word document. This is because there is a common and standard ‘Tag’ that tells us who the recipient is of an email and there is no such common and standard tag for a Word document.

In our business we call ‘information about information’ (e.g., the recipient and date fields on an email) Metadata. If an object has recognizable Metadata then it is far easier to process than an object without recognizable Metadata. We may then say that adding Metadata to an object is the same as adding structure.
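The difference is easy to see in code. Python’s standard email library can interrogate those common, named labels directly; a minimal sketch (the message text is invented):

```python
from email.parser import Parser

raw_message = """\
From: frank@example.com
To: records@example.com
Subject: Signed contract attached
Date: Mon, 19 May 2014 09:15:00 +1000

Please register the attached contract.
"""

msg = Parser().parsestr(raw_message)
# Structured content: standard, named headers we can rely on to carry
# a particular class of information.
print(msg["From"], msg["To"], msg["Subject"], msg["Date"])
```

There is no equivalent one-liner for asking a Word document who its recipient was; you have to parse the body and guess.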

Adding structure is what we do when we create a Word document using a template or when we add tags to a Word document. We are normalizing the standard information we require in our business processes so the objects we deal with have the structure we require to easily and accurately identify and process them.
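And adding structure at creation time can be as simple as stamping standard metadata onto the document. A sketch using the third-party python-docx package (an assumption for this example; the properties shown are its built-in core properties):

```python
from docx import Document  # third-party python-docx package, assumed here

doc = Document()
doc.add_paragraph("Quarterly report body text goes here...")

# Stamp standard metadata at creation time so downstream processes can read
# these tags instead of parsing the unstructured body.
props = doc.core_properties
props.author = "Frank"
props.subject = "Quarterly report"
props.keywords = "finance; Q3; 2012"
doc.save("quarterly_report.docx")
```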

This is of course one of the long-standing problems in our industry: we spend far too much time and money trying to parse and interpret unstructured objects when we should be going back to the coalface and adding structure when the object is first created. This is relatively easy to do if we are creating the objects (e.g., a Word document) but not easy to achieve if we are receiving documents from foreign sources like our customers, our suppliers or the government. Unless you are the eight-hundred-pound gorilla (like Walmart) it is very difficult to force your partners to add the structure you require to make processing as fast, as easy and as accurate as possible.

There have been attempts in the past to come up with common ‘standards’ that would have regulated document structure, but none have been successful. The last was when XML was the bright new kid on the block and the XML industry rushed headlong into defining XML standards for every conceivable industry, to facilitate common structures and to make data transfer between different organizations as easy and as standard as possible. The various XML standardisation projects sucked up millions or even billions of dollars but did not produce the desired results; we are still spending billions of dollars each year parsing unstructured documents trying to determine content.

So, back to the original question, what exactly is Enterprise Content Management? The simple answer is that it is the business or process of extracting useful information from objects such as emails and PDFs and Word documents and then using that information in a business process. It is all about the process of capturing Metadata and content in the most accurate and expeditious manner possible so we can automate business processes as much as possible.

If done properly, it makes your job more pleasant, saves your organization money and makes your customers and suppliers happier. As such it sounds a lot like motherhood (who is going to argue against it?) but it certainly isn’t manna from heaven. There is always a cost and it is usually significant. As always, you reap what you sow, and effort and cost produce rewards.

Is content management something you should consider? The answer is definitely yes with one proviso; please make sure that the benefits are greater than the cost.

 

Do you really want that job you are applying for?

by Frank 26. August 2012 06:00

I own and run a software company that builds, sells, installs and supports an enterprise content management solution called RecFind 6. As such, I employ programmers, support specialists, accountants, consultants, trainers, pre-sales people and sales people to name but a few categories. This means I am always hiring and always reviewing applications from candidates.

Basically, most of the applications I receive are rubbish. They are badly written, badly formatted, not ‘selling’ documents, and almost never focussed on the position I am advertising. This is very sad, but it does make vetting an avalanche of resumes pretty easy. I would probably spend no more than a minute or two reading each resume in the first pass to separate the real candidates from the flotsam. I move the results into two folders, one called ‘Possible’ and the other called ‘No way’.

This may sound a little impersonal but I have no patience with people who waste my time by firstly not reading the advertised job description properly and then by sending in a non-selling document. In fact, most resumes I see are great big red flags saying, “Please don’t hire me, I am a dope who didn’t read your ad properly and then couldn’t be bothered even getting the spelling and grammar correct or trying to sell myself in any way”.

So my first piece of advice: if you are too lazy to allocate the time and effort required, or simply can’t be bothered to sell yourself in the most professional manner possible, then don’t apply, because all you are doing is wasting your time and the time of any prospective employer. Prospective employers also have long memories, so rest assured your next application to the same firm will be instantly relegated to the waste bin.

I only hire professionals and professionals do not send in a non-professional job application.

I only hire people who respect my time and I only hire people who manage to convince me that they really want the job I am advertising and are the best person for that role.

I figure that the effort you are prepared to expend on what should be your most important task at this time (i.e., finding employment) is indicative of the quality of work I can expect from you as an employee. If you send me a poor quality application then I assume everything you would do for me as an employee will be of a similar poor standard. If you are too lazy or too careless to submit a winning application then I can only assume you would also behave in this manner after employment so I have zero interest in you.

This is the bit I struggle to understand. How come the applicant doesn’t understand the obvious correlation any prospective employer makes between the quality of the job application and the quality of the person?

Please allow me to give you some simple common-sense advice that comes from a very experienced employer of people.

Always:

  • Read the job ad very carefully. Note the prerequisites and requirements; the employer put them in for a reason, and he/she would really appreciate it if you didn’t waste his/her time by applying for a position you do not qualify for.
  • Always include a cover letter personalized for each and every job application. Your objective should be to convince the prospective employer that the job advertised is perfect for you and that you are, in turn, a perfect fit for the job. If your past experience or skillset isn’t a perfect fit, use the cover letter to explain why that isn’t a problem and why you are still the right person for the job being advertised. All potential employers are impressed by someone who takes the time and trouble to align their skills and experience to the job on offer. Most importantly, use words and phrases from the job ad in your cover letter. This helps convince the potential employer that you have really thought about the position and have put intelligent time into your application.
  • Clean up your resume, spell-check and grammar-check it, and convert it to a PDF for a much better and more professional-looking presentation. All potential employers can’t help but appreciate a well-presented and professional-looking resume; it sets you apart.

In the end it is all about the initial impression you convey to the prospective employer. You have one shot so make sure it is a good one.

You need to convince your prospective employer that you selected their advertised job to respond to because it really interests and excites you and that you have the attitude, aptitude, character, experience and skillset required to make the most of this position. You have to convince them that you would be an asset to their organization.

It doesn’t take long to write a personalised cover letter, maybe an hour or two at the most, and it should never be more than one page long. My final advice is that if you don’t think the advertised position is worth an hour or two of your time then don’t respond, because you will be wasting your time. Don’t ‘shotgun’ job opportunities with multiple low-quality and non-selling applications. Instead, focus on just the jobs you really like and submit a smaller number of high-quality, personalised applications. I guarantee that your success rate will be much higher, that you will be invited to more interviews and that you will eventually get the job of your dreams.

The simple message is that you will get out of the process precisely what you put into the process. It is a tough world but in my experience effort is always rewarded. For your sake, please make the effort.

Is Information Management now back in focus?

by Frank 12. August 2012 06:00

When we were all learning about what used to be called Data Processing we also learned about the hierarchy or transformation of information. That is, “data to information to knowledge to wisdom.”

Unfortunately, as information management is part of what we call the Information Technology (IT) industry, we as a group are never satisfied with simple, self-explanatory terms. Because of this age-old flaw we continue to invent and hype new terms like Knowledge Management and Enterprise Content Management, most of which are so vague and ill-defined as to be virtually meaningless but which nevertheless provide great scope for marketing hype and consultants’ income.

Because of the ongoing creation of new terminology and the accompanying acronyms we have managed to confuse almost everyone. Personally I have always favoured the term ‘information management’ because it tells it like it is and it needs little further explanation. In the parlance of the common man it is an “old un, but a good un.”

The thing I most disliked about the muddy knowledge management term was the claim that computers and software could produce knowledge. That may well come in the age of cyborgs and true artificial intelligence but I haven’t seen it yet. At best, computers and software produce information which human beings can convert to knowledge via a unique human cognitive process.

I am fortunate in that I have been designing and programming information management solutions for a very long time, so I have witnessed first-hand the enormous improvements in technology and tools that have occurred over that time. Basically this means I am able to design and build an infinitely better information management solution today than I could have twenty-nine years ago when I started this business. For example, the current product RecFind 6 is a much better, more flexible, more feature-rich and more scalable product than the previous K1 product, and it in turn was an infinitely better product than its predecessor, RecFind 5.

One of the main factors in each product being better than its predecessor is that each time we started afresh with the latest technology; we didn’t build on the old product, we discarded it completely and started anew. As a general rule of thumb, I believe that software developers need to do this on around a five-year cycle. Going past the five-year life cycle inevitably means you end up compromising the design because of the need to support old technology. You are carrying ‘baggage’; it is synonymous with trying to run the marathon with a hundred-pound (45 kg) backpack.

I recently re-read an old 1995 white paper I wrote on the future of information management software, which I titled “Document Management, Records Management, Image Management, Workflow Management...What? – The I.D.E.A”. I realised after reading this old paper that it is only now that I am getting close to achieving the lofty ambitions I espoused in it. It is only now that I have access to the technology required to achieve my design ambitions. In fact, I now believe that despite its 1995 heritage this is a paper every aspiring information management solution creator should reference, because we are all still trying to achieve the ideal ‘It Does Everything Application’ (but remember that it was my I.D.E.A. first).

Of course, if you are involved in software development then you realise that your job is never done. There are always new features to add, there are always new releases of products like Windows and SQL Server to test and certify against, and there are always new releases of development tools like Visual Studio and HTML5 to learn and start using.

You also realise that software development is probably the dumbest business in the world to be part of, with the exception of drug development, the only other business I can think of with a longer timeframe between beginning R&D and earning a dollar. We typically spend millions of dollars and two to three years to bring a brand new product to market. Luckily, we still have the existing product to sell and fund the R&D. Start-ups, however, don’t have this option and must rely on mortgaging the house, generous friends and relatives, or venture capital companies to fund the initial development cycle.

Whatever the source of funding, from my experience it takes a brave man or woman to enter into a process where the first few years are all cost and no revenue. You have to believe in your vision, your dream and you have to be prepared for hard times and compromises and failed partnerships. Software development is not for the faint hearted.

When I wrote that white paper on the I.D.E.A. (the It Does Everything Application; my ‘idea’ or vision at that time) I really thought that I was going to build it in the next few years; I didn’t think it would take another fifteen. Of course, I am now working on the next release of RecFind, so it is actually more than fifteen years.

Happily, I now market RecFind 6 as an information management solution because information management is definitely back in vogue. Hopefully, everyone understands what it means. If they don’t, I guess that I will just have to write more white papers and Blogs.
