Posted in February 2016
It's great to go to conferences because you get to have interesting, often intelligent discussions with great people about hard subjects and problems. Such has been the case over the last year when talking about the difficult issue of content review.
The only conclusion I've been able to reach from the vendors is, "Where you sit is where you stand." The only conclusion I've been able to reach from organizations is, "No one has actually figured it all out yet." The reason is that there are many problems and just as many solution approaches: it is complicated!
Around 1800, Eli Whitney pioneered the concept of standardized, interchangeable parts for manufacturing. Back then the focus was on fighting battles with guns; today's focus is on fighting battles with words, tables, charts, and images. The craftsmen who produced guns one at a time in the 1700's were replaced by craftsmen who produced dies and other tooling for making guns quickly and efficiently. The same is true for our knowledge workers today: individual writers are now being asked to contribute to a more efficient and effective system we call the content supply chain. Here's the problem: how can quality happen when our content does not fit together? And how can reviews ever move the quality and consistency of information forward when structures are ill-defined and incomplete, i.e., not standardized?
There is no common technical standard for the review functions themselves. In DITA, all the vendors seem to hide their review functions in processing instructions, but those functions are represented in very different ways. It is like saying, "We are all the same and compatible because we all live in houses on earth," or "All books fit together because they are made up of words."
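To make the incompatibility concrete, here is a minimal sketch of what "review functions hidden in processing instructions" looks like in practice. The `oxy_comment_start`/`oxy_comment_end` targets imitate Oxygen's convention as I understand it; the `ctm_comment` target is a hypothetical stand-in for some other vendor. Because there is no shared schema, each tool must be taught every other tool's PI vocabulary separately:

```python
import re

# A DITA topic fragment as vendor tools might store it: review comments
# hidden in processing instructions. oxy_comment_* imitates Oxygen's
# convention; ctm_comment is a made-up target for a second vendor.
topic = """<topic id="t1"><title>Checkout</title>
<body><p><?ctm_comment author="lee" text="needs a caveat"?>Check the topic
out only when you are ready to review it.</p>
<p><?oxy_comment_start author="pat" comment="cite the policy"?>Check it
back in quickly.<?oxy_comment_end?></p></body></topic>"""

# One regex finds every PI, but interpreting each payload still
# requires vendor-specific knowledge -- there is no common standard.
PI_PATTERN = re.compile(r"<\?(\w+)((?:\s[^?]*)?)\?>")

def review_pis(xml_text):
    """Return (target, raw payload) for every processing instruction."""
    return [(m.group(1), m.group(2).strip())
            for m in PI_PATTERN.finditer(xml_text)]

for target, payload in review_pis(topic):
    print(target, "->", payload)
```

Running this lists three PIs with three different shapes, which is exactly the interoperability problem: the comments survive as XML, but their meaning does not travel between tools.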
Technology-based tools can help. MS Word can help with spelling and some grammar issues. Readability scales can help you understand the educational level of a document. Acrolinx can take you to another level of content standardization and help improve content across cultures and languages. On the Flesch Reading Ease scale this article gets a score of 59.5, and the average grade level across five applied tests of reading ease is 10.2. (I hope many of you completed high school. The previous article scored 64.3 with a grade level of 8.7, perhaps because I had a clearer idea of what I was talking about.)
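For the curious, the two Flesch formulas behind those scores are simple arithmetic over words, sentences, and syllables. Here is a minimal sketch; the syllable counter is a crude vowel-group heuristic, so its scores will drift from what Word or Acrolinx report, which use better syllable counting:

```python
import re

def syllables(word):
    """Rough syllable estimate: count vowel groups, minimum 1.
    A heuristic, not the dictionary-based counting real tools use."""
    count = len(re.findall(r"[aeiouy]+", word.lower()))
    if word.lower().endswith("e") and count > 1:
        count -= 1  # treat a final 'e' as silent
    return max(count, 1)

def flesch_scores(text):
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syls = sum(syllables(w) for w in words)
    wps = len(words) / sentences      # words per sentence
    spw = syls / len(words)           # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return round(ease, 1), round(grade, 1)

print(flesch_scores("The cat sat on the mat. It was happy."))
```

Short sentences with short words push Reading Ease up and Grade Level down, which is why a clearer article scores "easier."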
Some want to review the whole document, thinking about context and document consistency. Some are hurried and harried and just want the Topics that require their review. Some are forced to take a whole document, say 100 pages, spend the majority of their time on the first 10 pages, and skim the rest as the deadline approaches. And then there's the "you've had it for two weeks and I need it tomorrow" variation. And let's not forget "my boss measures me on this, but my reviewers are not measured or motivated to do a good job."
Much of this depends on purpose, which varies from reviewer to reviewer; the sheer variety of review objectives compounds the problem.
The whole thing can be complicated by the concurrency problem. In the good ole days, everybody got a copy of the Word document, used track changes and comments, and emailed the document back to the author, who created another draft after considering the various comments and suggestions. With a CMS and check-in/check-out, it's unrealistic to check out a large document and take your time reviewing it. It is similarly unrealistic and suboptimal to let everyone do their reviews independent of other people's suggestions.
Content Mapper supports MS Word's review function, and I will state with certainty that reviewer actions can be the most varied in Word. You can add, delete, change, move, and restructure to your heart's content. As with Word-based XML authoring, we confine review comments to the valid UI functions that are supported by the schema.
When we are working in a mixed environment, for example with Oxygen as another editor, we can round-trip review comments from Content Mapper to Oxygen, but we must leave out the review functions and comments that Oxygen does not support, putting them back in upon return to the repository.
This sounds great, and everyone with a connection to the web can access the content, but changes are limited. And what happens if I want to work offline?
Each editor has a different way of supporting review. The UIs are different, and they don't work very well with Word-based or browser-based reviews. The use of these tools is generally limited to technical people. Oxygen has done a good job of integrating browser-based and client-based authoring and review, but what about the bulk of authors who use MS Word?
When you use a PDF review tool, you have a common, agreed-upon process, but you may suffer from a less complete review of content, structure, organization, and purpose. And rationalizing those comments back into the source document is difficult at best.
It seems that the most appropriate level for group review in a modern organization is the Topic. Don't check the content out until you are ready to do the review. Check it back in with your comments quickly. Go back to the Topic later if you want to critique other comments, or stand on the shoulders of giants who have thought of things that you did not. At the end of the day, once again, information consumers will be delighted.
Like I said earlier, it's complicated. Let's work together this year to figure this one out.