Wednesday, 10 October 2012

Measuring the Performance of the Electronic Records Management Program

Too often organizations fail to establish tangible performance metrics to support electronic records management implementation initiatives. Recent AIIM research found that such an absence of metrics results in a lack of top-down executive commitment to approve funding for electronic records management projects.[1] The same research found that, rapidly escalating regulatory compliance and e-discovery costs notwithstanding, only 9% of organizations surveyed have an enterprise-wide electronic records management strategy and systems in place.[2]

There may be a number of reasons for such a poor enterprise records management adoption trend. For one, many organizations tend to have a disproportionate number of disparate document management repositories, resulting in a fragmented departmental approach to records management. This may be further compounded by insufficient investment in an enterprise-wide information management strategy that strives for a consistent and normalized metadata model for all corporate information assets. There may also be a lack of clarity as to what dimensions of electronic records management implementations ought to be measured. One may focus on efficiency metrics associated with how the electronic records management system performs in terms of declaration and classification rates, retrieval times and disposition. Or one may focus on the outcomes associated with electronic records implementation, measured in terms of tangible ROI, such as lower physical storage costs and reduced e-discovery costs, or intangible measures such as improved constituency services.

However, organizational alignment and executive support are imperative determinants of the success of an effective electronic records management program. As one noted authority observed: “Many EDRMS projects are ‘led from the middle’, making them highly susceptible to failure. Without senior management endorsement, the prioritisation of the project and provision of resources required over the typical 12 to 30 month timespan – both of which are prerequisites to success – have a high probability of being withdrawn. Senior managers generally support the idea of good recordkeeping and compliance, but do not know what is involved and do not understand the productivity that is unleashed by a successful project. At the other end of the spectrum, users tend to see an EDRMS project as just another compliance-driven, administrative burden with little personal benefit. Getting engagement up and down the hierarchy is therefore fundamental to the success of an EDRMS implementation.”
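To make the efficiency metrics discussed above concrete, here is a minimal sketch of how such ratios could be computed from audit counts. The field names, sample figures, and the choice of metrics are illustrative assumptions, not drawn from any particular ERM product:

```python
# Illustrative sketch: basic ERM efficiency ratios from a
# hypothetical audit snapshot. All field names and figures
# are invented for demonstration.

def erm_efficiency_metrics(audit):
    """Return simple ratio metrics from an audit snapshot."""
    return {
        # share of eligible documents actually declared as records
        "declaration_rate": audit["records_declared"] / audit["documents_eligible"],
        # share of declared records that have been classified
        "classification_rate": audit["records_classified"] / audit["records_declared"],
        # share of records due for disposition that were dispositioned
        "disposition_rate": audit["records_disposed"] / audit["records_due_for_disposition"],
    }

snapshot = {
    "documents_eligible": 120_000,
    "records_declared": 90_000,
    "records_classified": 81_000,
    "records_due_for_disposition": 10_000,
    "records_disposed": 7_500,
}

metrics = erm_efficiency_metrics(snapshot)
for name, value in metrics.items():
    print(f"{name}: {value:.1%}")
```

Tracking such ratios over time is one way to give executives the tangible numbers this post argues are missing.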

 
Recently, Bruce Miller of www.rimtech.ca provided very useful guidance on successful electronic records management implementation best practices. The link to the article is here, and it will also appear in Canadian Government Executive.



[1] Records Management Strategies, AIIM 2011: “A lack of commitment at the highest levels is the most likely reason that organizations have no records management system, either by default, or by not seeing sufficient need to invest the money.”
[2] Records Management Strategies, AIIM 2011.

Monday, 17 September 2012

Strategic Analytics: The Final Frontier for “Enterprise” Data

Blog Author: Rod MacPherson

Sorry for the bad pun. I have been blogging recently about the incredible achievement of NASA nailing the landing of the latest Mars rover, Curiosity. After a journey of 350,000,000 miles, it landed within 1.5 miles of its target. The jubilation in mission control was unconfined, as the team of mission monitors and engineers openly celebrated the successful milestone in the life of Curiosity, which is expected to relay a constant stream of data about the sustainability of life on the red planet back to Earth over its anticipated 2-year lifetime. (Although who knows: Voyager 1 is STILL relaying data, on a journey from the extreme edge of our solar system that takes 17 hours at the speed of light.) If you want to see where Voyager is, click here. When I last checked, it was 18 billion kilometers from Earth. I digress.
 
If we think about that critical moment in mission control, when NASA personnel learned of the successful touchdown a mere 15 minutes after the event, you realize that they have mastered the art of analytics. Not just about what has happened, but about what was going to be happening (given the lag, and the need to adjust, they needed predictive indicators to make adjustments). They knew what they needed to know, and when they needed to know it. They had designed their data pipeline to provide the most critical information, in the most timely fashion, within the practical limits of electromagnetic physics and our current understanding of quantum mechanics. If only we could do the same for our enterprise data.
 
The reality is, we can. Today’s leaders are often misled and discouraged by the current realities of information management, and cynical about what information is practically available, when, and to what degree of reliability. This cynicism does not arise from the promise of what we as data management professionals are doing and promoting; it arises from the historical prejudice associated with IT and its excessive failure rate.
 
Strategic analytics, and especially the predictive ones, that help decision-makers understand where the organization is going, not just where it has been, offer the greatest potential for organizations seeking to leverage enterprise data and big data social analytics in the most effective manner possible. Wayne Gretzky, perhaps the greatest hockey player of all time, said it best: "I don't skate to where the puck is, I skate to where it’s going."
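A predictive indicator of the kind described above can be as simple as a fitted trend line projected one period forward. The sketch below uses invented quarterly figures purely for illustration; real strategic analytics would of course use richer models and governed data:

```python
# Minimal sketch of a predictive indicator: fit a least-squares
# trend line to a short series and project the next period.
# The quarterly figures are invented for illustration.

def linear_forecast(series, periods_ahead=1):
    """Ordinary least-squares trend line, projected forward."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + periods_ahead)

quarterly_revenue = [100.0, 104.0, 109.0, 115.0]
next_quarter = linear_forecast(quarterly_revenue)
print(round(next_quarter, 1))  # projected figure for the coming quarter
```

The point is not the arithmetic; it is that looking forward, even crudely, answers a different and more valuable question than any backward-looking report.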
 
For the data management professional, our ability to help the C-level understand where the organization is going is probably the single biggest opportunity to prove and validate the value proposition of enhanced data management practices. For those of us designing and implementing strategic analytic solutions, we know that architected, reliable, qualified, relevant and timely data is the only way to do this.

Tuesday, 4 September 2012

Open Data is Driving Enhanced Data Management Practices in Government


Blog Author: Rod MacPherson

We recently hosted an Executive Breakfast Seminar on Open Data at the offices of CORADIX and were pleasantly surprised by the greater-than-expected turnout of senior government personnel wanting to learn more about the risks, challenges, opportunities and benefits of Canada's Open Government Action Plan. (see data.gc.ca)

Like many other jurisdictions, including 30 countries and hundreds of municipal and provincial government organizations, Canada followed the lead of the United States, which launched its open data portal in 2009 (see data.gov). Starting with the release of just a few hundred datasets, the Open Data movement in government entities around the world has led to the release of literally millions of datasets on topics ranging from socio-economic and geo-spatial data to extensive details on government operations.

The Canadian approach to Open Government relies on three pillars: Open Data, Open Information and Open Dialogue. Open Data is about offering government data in a more useful format to enable citizens, the private sector and non-government organizations to leverage it in innovative and value-added ways. Open Information is about proactively releasing information, including on government activities, to Canadians on an ongoing basis; by proactively making government information available, it becomes easier to find and more accessible for Canadians. Open Dialogue is about giving Canadians a stronger say in government policies and priorities, and expanding engagement through Web 2.0 technologies. (My next blog will talk about why Canada differentiates between Open Information and Open Data.)

For many in the room, the session was a sobering wake-up call, as I pointed out the risks associated with releasing open datasets, from the obvious ones, such as the liability and potential embarrassment of releasing erroneous data, to the not-so-apparent ones, such as foreign intelligence organizations combining disparate datasets in ways that expose national security vulnerabilities. (The example I used was a case where a vulnerability was exposed by comparing weather data with emergency response statistics and geo-spatial data, resulting in the ability to predict the impact of threatening weather on the security apparatus.) In addition to the direct risks from the data itself, we also discussed the potential political risks of government-wide releases of certain datasets that lack a common architecture and metadata standards. Exposing these practices (or the lack thereof) could create additional workload and embarrassment for the stewards responsible for those datasets, and more significantly for the political apparatus behind them.

While there was much discussion around the downsides, there was also widespread acknowledgement of the benefits and opportunities, the most significant of which was enabling citizens and industry to generate economic advantage and innovation by leveraging the rich datasets now freely accessible.

For data management practitioners, the Open Data imperative provides the perfect business case, and the perfect timing, to introduce enhanced data management practices in government sector organizations. Data management professionals need to be front and centre in these initiatives, putting in place practices to ensure that open datasets are consistently architected, described and delivered, with a defensible and competent approach to ensuring their quality. Data stewards need to be clearly identifiable, as they will become the focal point for departments and agencies faced with questions and challenges about their data. Finally, as always, none of this is going to happen without an effective governance mechanism to ensure that these practices are being properly employed.
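The "consistently described" practice above can be enforced with something as simple as a pre-publication metadata check. The required fields and the sample record below are hypothetical, not any official Government of Canada schema:

```python
# Hedged sketch: checking an open-data release against a minimal
# metadata standard before publication. The required fields and
# the sample record are invented, not an official schema.

REQUIRED_FIELDS = {"title", "description", "steward", "licence",
                   "format", "last_updated"}

def validate_dataset(descriptor):
    """Return the missing metadata fields, sorted (empty if compliant)."""
    return sorted(REQUIRED_FIELDS - descriptor.keys())

dataset = {
    "title": "Municipal Emergency Response Times",
    "description": "Quarterly response statistics by district.",
    "steward": "records.office@example.gc.ca",
    "format": "CSV",
}

missing = validate_dataset(dataset)
print(missing)  # fields the steward must supply before release
```

A gate like this also makes the steward explicitly accountable: a dataset cannot go out the door without a named owner attached to it.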

As data management professionals, we have always understood the compelling value proposition of improved data management practices for all organizations; however, the lack of a customer focus and a profit imperative has made it difficult to convince government decision-makers to direct more investment to this area. The Open Data imperative breathes considerable new life into that value proposition; the time has come for us to "Carpe diem".

Tuesday, 31 July 2012

Integrating OpenText Content Server 2010 with SharePoint 2010

By: Bruce Miller, RIMtech Inc.
       www.rimtech.ca
      bruce.miller@rimtech.ca

As we all know, OpenText Content Server 2010 (CS 2010) is mandated as the EDRMS platform for electronic recordkeeping at the Government of Canada. Shared Services Canada (SSC) has developed a more or less standardized implementation of CS 2010, known as GCDocs, which Treasury Board hopes will be rolled out at some larger departments within the next year or so.

Many of these same large departments have also implemented SharePoint 2010 as a platform for collaboration and document production. However, SharePoint cannot be used to manage Government of Canada Records. SharePoint is non-compliant with GC recordkeeping requirements (see this report). CS 2010 is mandated for recordkeeping use, yet SharePoint is increasingly being deployed for collaboration. Departments with both platforms are going to need a way to somehow use CS 2010 to manage the records they produce and store in SharePoint.

Enter OpenText’s new solution called AGA, or Application Governance & Archiving. AGA allows the two platforms to be integrated such that records in SharePoint can be moved (archived) to CS 2010 to be managed as records in a fully compliant recordkeeping environment. AGA is a core component of OpenText’s Better Together strategy for integration with Microsoft products.

AGA is a sophisticated offering that allows a document to be moved from SharePoint to CS 2010, where it can be managed as a record. It provides for manual (what OpenText calls “Interactive”) and process-driven (what OpenText calls “Automatic”) modes of operation. There are no fewer than six different ways of transferring documents to CS 2010, depending on how SharePoint is being used and whether or not the document is a record.

The product is thoroughly designed and well thought through. Keep in mind that a document is not just a document in a modern EDRMS platform such as SharePoint or CS 2010: a document has security permissions, metadata, and audit data associated with it, and each of the two platforms has a different format and protocol for each of these three critical elements. Therefore it’s not as simple as it might sound to just “move” a document from one to the other. To their credit, OpenText has taken all of these compatibility differences into account. The platform differences that must be accounted for do, however, lead to a rather dizzying number of integration permutations, which can make the integration of the two rather complex at times.
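To illustrate why "moving" a document is really a translation of its three critical elements, here is a toy sketch. Every field name below is invented for demonstration; this is not the AGA API nor either product's real schema:

```python
# Illustrative sketch only: "moving" a document means translating
# its security permissions, metadata and audit data between two
# platforms' formats. All field names are invented; this is NOT
# the AGA API or either product's actual schema.

def translate_document(sp_doc):
    """Map a hypothetical SharePoint-style item to a hypothetical
    Content Server-style record structure."""
    return {
        "name": sp_doc["Title"],
        # security permissions: per-user access entries
        "acl": [{"user": p["LoginName"], "access": p["Role"]}
                for p in sp_doc["Permissions"]],
        # metadata: normalize field names to the target convention
        "metadata": {k.lower(): v for k, v in sp_doc["Fields"].items()},
        # audit data: carry the event history across
        "audit_trail": [{"event": e["Action"], "when": e["Timestamp"]}
                        for e in sp_doc["AuditEntries"]],
    }

item = {
    "Title": "Q3 Budget Memo",
    "Permissions": [{"LoginName": "jdoe", "Role": "Read"}],
    "Fields": {"Department": "Finance", "Status": "Final"},
    "AuditEntries": [{"Action": "Created", "Timestamp": "2012-07-01"}],
}
record = translate_document(item)
print(record["name"], record["metadata"]["department"])
```

Multiply this single mapping by the modes of operation and the ways a document can arrive, and the "dizzying number of permutations" becomes easy to appreciate.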

The bottom line is that the tool is sufficiently comprehensive to get the job done. Thanks to AGA there is a way that a department’s SharePoint records can be managed via CS 2010. The job of the Records Manager just got a little more complex and a little trickier for sure, but welcome to the modern world of EDRMS!

Bruce Miller

Friday, 20 July 2012

Recent Supreme Court of Canada Copyright Decisions Pave the Way for Research and Innovation by Expanding the Ambit of Fair Use

Copyright law is designed to strike a careful balance between the property rights of authors in their original works and the public interest in fostering economic and social progress.[1] Historically, copyright jurisprudence tended to favor the economic interests of authors by narrowly defining the scope of available defences to copyright infringement claims. Recently, however, the defence of fair dealing has been the subject of heightened judicial analysis.[2] In CCH Canadian Ltd. v. Law Society of Upper Canada, the Supreme Court of Canada laid out a two-pronged test for the determination of a valid fair dealing defence to copyright infringement pursuant to section 29 of the Copyright Act. The first prong of the test is a determination of the purported infringing action: can it be construed as “research or private study”, or “criticism or review”, or does it constitute “news reporting”? If the impugned infringing activity meets the first prong of the test, the second enquiry looks at factors such as “the purpose, character, and amount of the dealing; the existence of any alternatives to the dealing; the nature of the work; and the effect of the dealing on the work” to determine if in fact the appropriation of the work may be construed as fair dealing.

The increased focus on re-balancing competing copyright interests is largely driven by the juxtaposition of globalization and the growth of the digital economy.[3] There are those who argue[4] that the current copyright regime is based on eighteenth-century concepts of property rights that advocated “artificial scarcity…and by analog limitations on copying”. The digital economy, on the other hand, makes copying infinitely easier, resulting in the “democratization” of content.

So what constitutes fair use in the context of the fast-paced and transformative digital economy? The Supreme Court of Canada has ruled on a series of five copyright cases that legal analysts believe may re-balance copyright law by tilting it more toward the public interest. The five cases address a wide spectrum of vexing problems that span copying textbooks, music downloading, and limits on the tariffs which regulators such as the Copyright Board may levy. Michael Geist, a leading legal expert, commented that in these decisions[5] “the court has delivered an undisputed win for fair dealing that has positive implications for education and innovation, while striking a serious blow to copyright collectives such as Access Copyright” and that “the court has recognized that innovation...is crucial to the economy…”. These cases have held that cable companies and internet providers are not required to pay royalties for song previews, as previewing is tantamount to sampling merchandise before consumers decide what to purchase, and that school boards are not required to pay tariffs on selective copying of materials from textbooks for study and research purposes. Of particular significance in these decisions is the liberal interpretation of the meaning of “research”, which “can include many activities that do not demand the establishment of new facts or conclusions. It can be piecemeal, informal, exploratory, or confirmatory. It can in fact be undertaken for no purpose except personal interest…”

These decisions affirm that there is greater good in expanding fair use to copying that might otherwise infringe copyright, in light of the rapid advancement of the digital economy. A broader interpretation of fair use can accelerate innovation and foster economic opportunity. As the Oxford Economics report on the new digital economy observed, “the new digital playing field has all but obliterated the old working models for the music, publishing and film industries…With information becoming a commodity…firms are switching from subscription fees to ‘freemium’ pricing that combines free services with paid-for subscription services…”

The Government of Canada is placing increased emphasis on the strategic value of data, characterizing it as “Canada’s new natural resource”.[6] As an integral part of the Action Plan on Open Government,[7] open data aims to provide “raw data available in machine readable format to citizens, governments, not for profit and private sector organizations to leverage it in innovative and value added ways.” This contemplates the implementation of a licensing scheme that removes the currently restrictive application of section 12 of the Copyright Act, which protects Crown copyright, including compilations of data. A proposed “universal open government license” is designed to remove such restrictions, and the recent Supreme Court of Canada decisions expanding the fair dealing doctrine may further accelerate the process.
____________________________________

[1] Théberge v. Galerie d’Art du Petit Champlain inc., 2002 SCC 34: “The Copyright Act is usually presented as a balance between promoting the public interest in the encouragement and dissemination of works of the arts and intellect and obtaining a just reward for the creator (or, more accurately, to prevent someone other than the creator from appropriating whatever benefits may be generated).”
[2] CCH Canadian Ltd. v. Law Society of Upper Canada, [2004]
[3] The New Digital Economy: How It Will Transform Business, Oxford Economics, 2011. The total size of the digital economy is estimated at $20.4 trillion, equivalent to roughly 13.8% of all sales flowing through the world economy.
[4] How to Fix Copyright, Bill Patry, Oxford University Press.
[5] ESAC v. SOCAN; Rogers v. SOCAN; SOCAN v. Bell (song previews); Alberta v. Access Copyright; Re:Sound.
[6] Tony Clement, July 12, 2012, Winnipeg Free Press

Tuesday, 5 June 2012

A Must Read: A Technology Manifesto for the Cloud, Mobile and Social Media

John Mancini, President of AIIM (www.aiim.org), has just published a thought-provoking book, Occupy IT, that is a must-read for information professionals. Written in a highly engaging manner, it lays out in simple yet compelling fashion a blueprint for how businesses must fully embrace the IT innovations inherent in the intersection of cloud, mobile and social media technologies. The author starts with the following premise: “…in prior decades, new systems were introduced at the very high end of the economic spectrum…Now it is consumers, students and children who are leading the way, with early adopting adults and nimble small to medium size businesses following, and it is the larger institutions who are, frankly, the laggards…”

The author makes a compelling argument that the confluence of rapid IT innovation cycles and consumer led mass adoption means that “…what is transpiring is momentous, nothing less than the planet wiring itself a new nervous system. If your organization is not linked into this nervous system, you will be hard pressed to participate in the planet’s future…”. The shift is toward “systems of engagement” with employees, customers, partners and external constituencies interacting in a dynamic and instantaneous manner unencumbered by the constraints posed by legacy systems.   

The author posits the view that “…the inexorable drive toward Systems of Engagement requires that we think radically differently about IT in our organizations…”. The author’s call to action is in the form of a manifesto for “…creating a framework and a set of imperatives for how we should collectively look at our IT priorities in the era of consumer technologies…”. He calls for five key initiatives: fully embrace cloud-based IT architecture, go mobile, transform the business into a social enterprise, remove paper from business processes by digitizing content, and prepare for capabilities that extract value from big data. These initiatives call for more agile organizational structures and collaboration between business and IT, “…where smaller teams made up of multitaskers and multidimensionally skilled workers with subject matter expertise, business savvy, technology skills, and a range of appropriate interpersonal and ‘political’ skills…”.

The book is not only written in a personalized and engaging manner but is also extremely well researched, with extensive references to empirical studies that support the author’s arguments as well as educate the reader. And there is more: several chapters provide a technology primer on cloud, mobile, social media and records management principles and technologies.
 
The book may be downloaded here: Occupy IT Book

Tuesday, 22 May 2012

Striking a balance between Data Privacy Legislation and Charter Rights relating to Freedom of Expression


Within the Canadian constitutional landscape, data privacy legislation is arguably within the ambit of both federal and provincial jurisdictions. Under the s 91(2) trade and commerce clause of the Constitution Act, 1867, the Parliament of Canada enacted the Personal Information Protection and Electronic Documents Act (PIPEDA), which applies to private sector organizations with commercial activities across Canada. At the same time, provincial legislatures have implemented privacy legislation pursuant to the s 92(13) property and civil rights provision of the Constitution Act. While the constitutionality of PIPEDA may be challenged[1] at a future date,[2] there is acknowledgment of shared jurisdiction between the federal government and the provinces. Organizations within a province are exempt from compliance with Part 1 of PIPEDA if that province's privacy legislation is “substantially similar” in its obligations relating to the collection, use and disclosure of personal or personally identifiable information.

There is yet another dimension of the tension between provincial and federal jurisdictions that surfaced with the recent Alberta Court of Appeal decision in United Food and Commercial Workers Local 401 v Alberta (Attorney General), 2012 ABCA 130. In this appeal the court was asked to determine whether a union’s actions of videotaping individuals who may have crossed picket lines, and then posting the footage to a website, are protected by the Charter of Rights and Freedoms as freedom of expression, or whether they infringe upon privacy rights pursuant to the provisions of the Alberta Personal Information Protection Act (PIPA). The court held that while it is of utmost importance to protect individual privacy rights, particularly in light of technological advances, the definition of personal or personally identifiable information in the Alberta PIPA was too broad and as such encroached upon the equally important right of freedom of expression. The court was particularly concerned about the absence of reasonable limits on what is deemed to constitute personal or personally identifiable information. The court reasoned that “People do not have a right to keep secret everything they do in public, such as crossing picket lines. There is no recognized right to withhold consent to the dissemination of information about unpleasant conduct. Holding people accountable for what they do or do not do in public is a component of the right to free expression.” The court concluded that “While the protection of personal information is important, it is no more important than collective bargaining and the rights of workers to organize.”

This decision may prompt similar challenges with respect to other provincial privacy legislation as well as the federal PIPEDA legislation. Furthermore, this decision brings into focus the role that information management plays in the formulation, implementation and measurement of the efficacy of practices relating to the collection, use, disclosure and disposition of personal and personally identifiable information. There are increasing complexities associated with harmonizing competing values in an effort to balance competing interests domestically, and also as part of international or cross-border obligations relating to the transfer of personal information. For example, the European Union’s Directive on Data Protection prohibits member states from transferring personal data unless the requesting party provides adequate levels of protection in accordance with the provisions of the Directive. Information management professionals need to be increasingly familiar with the complexities of multi-jurisdictional privacy regimes, legislation and regulations in order to implement effective business processes to support the collection, use, disclosure and disposition of personal data. Further compounding the challenges faced by IM professionals is the dynamic nature of a constantly evolving privacy landscape.



[1] http://www.teresascassa.ca/index.php?option=com_k2&view=item&id=96:fresh-questions-about-the-constitutionality-of-pipeda?&Itemid=80. The recent Supreme Court decision in Re Securities Act held that a purported national securities regime was unconstitutional as it encroached upon provincial jurisdiction under section 92(13). The federal government was unable to establish that the absence of a national securities regime would undermine the consistent administration of securities regulation in a way the provinces acting alone could not remedy, and the regime therefore did not fall within the trade and commerce clause of the Constitution Act. A similar legal argument may be advanced with respect to the constitutionality of PIPEDA.
[2] The Province of Quebec initially challenged but later abandoned its action. However, in light of the Supreme Court of Canada decision in Re Securities Act, a possible avenue may have opened up for future constitutional challenges to PIPEDA.