The Oracle Australia and New Zealand Middleware and Technology Blog.

Wednesday, November 26, 2008

The Perfect ECM Architecture - Part 2

Hopefully you enjoyed Part 1 of this article and had a giggle at the analogy!

How should an organisation execute on their ECM requirements? Well, to draw upon a similar analogy: you need a vehicle, so you visit your local showroom. You are shown a real vehicle, go for a test drive, negotiate the price and, when delivery day arrives, the vehicle you selected is waiting for you with a full tank of petrol, ready to drive away. All you have to do is tune in your favourite radio stations, put the driver's seat in the right position and set the mirrors correctly. Buying an ECM solution should be just as straightforward. The solution architecture you get should be like the car you buy from a dealer: the parts come from one manufacturer, have been designed to work together seamlessly from day one, and all you need to do is configure it to your requirements. There should be no need to bolt the solution together and hope it all works, and you shouldn't need to spend time and money with third-party suppliers to help you get it running!

We believe that the perfect ECM architecture should look something like this....

So, how should this all work?

The ECM solution should provide the ability to manage ALL of your unstructured information as easily as you manage your structured information. Managing unstructured data across a relational database AND one or more file stores AND an index is painful, time-consuming and costly. Managing unstructured data WITHIN a relational database is easy, cheap and uses the common database management tools and processes you have already developed and deployed for managing structured data. Unstructured data should be stored, managed and protected within the repository, and able to be leveraged by other systems, including enterprise search, through a common set of open-standards tools and APIs.

The ECM platform should be singular in its architecture and approach. You should not have to use different tools and interfaces for managing different types of unstructured data. It should be easy and seamless to create a new HR policy, for example, through MS Word (and SharePoint?), review and approve the policy using a template-driven workflow, convert it to PDF (or HTML), publish it to your portal, intranet or extranet, apply a retention policy appropriate to the document type and its context, and apply digital rights - all through a single solution.

The ability to archive any information from any system should be a component of the ECM solution. A lot of data exists in file and email systems, for example. The ability to ingest this data, leaving behind a 'stub' of metadata, is a deliverable that most IT managers will be interested in as it provides back-end benefits. Integration with ERP and CRM systems to archive and deliver data is crucial to improving business processes - an example of this is Accounts Payable/Receivable, where delivering a digitised image of an invoice reduces organisational risk and speeds up the payment process. This is driven by image capture, which is ideally also provided as part of the ECM solution and delivers the ability to acquire physical information (the process that occurs immediately after capture, scanning or eFax delivery), tag/categorise/classify it, and ingest it into the repository.
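To make the metadata 'stub' idea concrete, here is a minimal Python sketch. Everything in it is illustrative - the in-memory repository, the `Stub` fields and the id scheme are invented for the example, not any vendor's API - but it shows the essential trade: the bulky content moves into the repository, and only a small metadata record remains where the file used to live.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Stub:
    """Lightweight metadata left behind after a file is archived."""
    name: str
    size: int
    checksum: str
    archive_id: str

def archive_file(name: str, content: bytes, repository: dict) -> Stub:
    """Ingest content into the (in-memory) repository and return a stub.

    In a real deployment the repository would be the ECM store and the
    stub would replace the file on the file or email server.
    """
    checksum = hashlib.sha256(content).hexdigest()
    archive_id = f"doc-{len(repository) + 1}"
    repository[archive_id] = content
    return Stub(name=name, size=len(content), checksum=checksum, archive_id=archive_id)

repo: dict = {}
stub = archive_file("invoice-2008-001.pdf", b"%PDF-1.4 ...", repo)
```

In a real archive the stub would also carry enough linkage for the file server to retrieve the content from the repository on demand.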

Underpinning a deployment of Microsoft SharePoint is another area where a complete ECM solution will provide real benefits to an organisation. We all know what SharePoint is good at, and most organisations are rapidly becoming aware of its limitations. As a user interface supporting office-level collaboration, SharePoint is great. However, for a larger organisation its deployability presents some concerns, typically around the number of repositories it creates to support scalability. Providing a true enterprise repository behind these multiple silos of unstructured data means that common and consistent records and storage policies can be applied, and that data can be shared and leveraged by the organisation as a whole.

An organisation may already have tier-2 or tier-3 solutions in place to manage components of ECM such as records or web content. A complete ECM solution will provide the capability to integrate with these environments to perform one or more of the following functions:

1. Migration - easily support the rapid migration of unstructured information into the enterprise repository, enabling the legacy solution to be decommissioned.
2. Federation - federate ECM functionality, such as unified records management, into the legacy repository, enabling common and consistent records policies to be applied to unstructured data without the need to migrate information.
3. Enterprise search - leverage the legacy solution through enterprise search, enabling the organisation to find unstructured and structured data through a single search regardless of the repository where the data is stored.
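The enterprise-search function can be sketched in a few lines of Python. This is purely illustrative - a real federation layer would call each legacy system's own search API under the caller's security context - but it shows the single-query, many-silos idea:

```python
def federated_search(query: str, repositories: dict) -> list:
    """Run one query across several content silos and merge the hits.

    Each repository here is just a name -> list-of-titles mapping; the
    repository names and titles are invented for the example.
    """
    results = []
    for repo_name, titles in repositories.items():
        for title in titles:
            if query.lower() in title.lower():
                results.append((repo_name, title))
    return sorted(results)

silos = {
    "records": ["HR Policy 2008", "Retention Schedule"],
    "web-cms": ["HR Policy 2008 (published)", "Press Release"],
    "dms":     ["Engineering Drawing 42"],
}
hits = federated_search("hr policy", silos)
```

Here the two copies of the HR policy, living in different silos, both come back from a single query.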

Finally, end-user content consumption and interaction through the organisation's web-presence environments is becoming a key focus for businesses around the world. Recent reports have highlighted that in order for an organisation to be successful in the current and projected economic climate, a closer relationship with its customers and partners needs to be established. This occurs in two key ways - publish and interaction.

The first of these, Publish, predominantly supports the Internet or any other web presence where content consumption is the lead requirement. Content is created and managed within the ECM solution and then, as part of its lifecycle, published to a website. End-users access the site and either navigate or search for information and land at the published content. In order for this to be successful, and for the organisation NOT to end up with a management overhead because they selected separate app-server/portal-server and ECM solutions from different vendors, the ideal approach is to have a single solution in place. This enables the website and the ECM solution to be integrated in such a way that management and administration occur once. New content, with potentially new categorisation resulting in a change to navigation, is created, managed and published, and ALL changes required around the content occur naturally and automatically. A new classification results in the navigation hierarchy being modified - the user, or administrator, does NOT need to create a new 'node' on the menu manually.
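The 'navigation follows classification' behaviour can be sketched as follows. This is a hypothetical Python fragment, not any product's API: it derives the menu tree directly from each item's category path, so publishing content with a new classification automatically creates the matching node.

```python
def build_navigation(items: list) -> dict:
    """Derive a site navigation tree from content classifications.

    Each item carries a category path like "about/careers"; a new path
    automatically yields a new branch in the menu, so nobody edits the
    navigation by hand.
    """
    nav: dict = {}
    for category_path, title in items:
        node = nav
        for part in category_path.split("/"):
            node = node.setdefault(part, {})
        node.setdefault("_pages", []).append(title)
    return nav

content = [
    ("about/careers", "Graduate Program"),
    ("news", "Q3 Results"),
    ("about/careers", "Benefits Overview"),
]
nav = build_navigation(content)
```

Publishing a document under a brand-new category path simply grows the tree; no manual menu maintenance is required.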

The second, Interaction, enables multi-directional web-based communication between an organisation and its customers and partners. Recent developments in web technologies, resulting in the adoption of Web 2.0 and the popularity of social-networking sites such as Facebook and MySpace, have raised questions within organisations about how this can be of real benefit to the business. Qualification has taken place around the investments required to return value to the business, and organisations like Oracle have invested in initiatives like Social CRM and Enterprise 2.0 offerings. The interactive environment will allow employees, customers and partners to consume and contribute information within an easy-to-use (i.e. minimal training required) environment that is totally integrated with the ECM solution. Information from varying systems should be accessible through this single environment, allowing the mashing together of data, e.g. from the ERP, CRM and ECM environments, to provide better context around information and participation in the creation and review process.

All of this functionality and capability is available from a handful of tier-1 ECM vendors, including Oracle. Tier-2 and tier-3 ECM vendors can offer part of the overall ECM requirement, but do so through an imperfect architecture - often requiring multiple UIs or repositories to be deployed, or compromising on functionality or capability in some way.

I look forward to reading your questions and comments about this article - I'm sure it will raise many amongst our readership.


Greening the Data-Centre - the ECM Way

Greening the data-centre is an initiative that any responsible environmentally-aware organisation should be undertaking.

Simple virtualisation at the server level goes a long way to achieving your goals - but so does switching off monitors, ceiling lights and the other items of electrical equipment that are on your desks right now!

One of the major challenges faced by an organisation is the sheer amount of information spread across multiple systems, held in numerous versions or as simple duplicates. Within an organisation, around 85% of information is unstructured data, and of this a high percentage (30-40% in some cases) is duplicated. Implementing an ECM solution properly, one that promotes single-point storage for unstructured data, is a key deliverable in greening the data-centre. Managing a smaller data footprint on fewer servers delivers environmental benefits as well as reducing overall costs in real-dollar terms.
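As a rough illustration of why single-point storage helps, here is a small Python sketch that measures byte-for-byte duplication using a content hash - the same identity check a deduplicating repository could use to keep one copy per unique document. The sample data is invented for the example.

```python
import hashlib

def duplicate_ratio(documents: list) -> float:
    """Estimate what fraction of stored documents are byte-for-byte
    duplicates, using a content hash as the identity key. Single-point
    storage keeps one copy per hash instead of one per location."""
    seen = set()
    duplicates = 0
    for content in documents:
        digest = hashlib.sha256(content).hexdigest()
        if digest in seen:
            duplicates += 1
        else:
            seen.add(digest)
    return duplicates / len(documents) if documents else 0.0

docs = [b"policy-v1", b"budget", b"policy-v1", b"policy-v1", b"memo"]
ratio = duplicate_ratio(docs)  # 2 of the 5 copies are repeats -> 0.4
```

Every duplicate eliminated is storage that no longer needs to be powered, cooled and backed up.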


Friday, November 21, 2008

ECM - 'One' Repository or 'Three'?

Quick question, which looks simpler to you in the following diagram....?
If you answered A then you are probably from another ECM vendor organisation, claiming efficiency, scalability and all the usual reasons quoted to your customers. Whilst it is true that during the 1980s and 1990s this was the ONLY architecture available that would work for an enterprise organisation, things have moved on in the world!

Oracle's 11g technology allows any information to be managed in a single database architecture. You really and truly don't have to deploy and manage a separate file store and index in order to manage your unstructured information separately from your structured data. And for the IT department: you can use the same management tools and methodologies you love to look after the 85% of information your organisation generates outside of the database. Think about a world where you don't have to worry about keeping separate backups in step in order to prevent data loss when recovering a system.

Doesn't the diagram to the right look simpler? We'll be covering more around the 11g architecture as time progresses, but for now - consider simplifying your lives and providing richer functionality to your users through a single ECM solution.


Information Security

Oracle has a great capability in this area: using 11g to store all three of the core data sets (text, metadata and objects) inside one instance makes tracing illegal access both viable and practical, which to us is the key to a good security policy. The single-repository approach promotes this and provides some obvious benefits.

Patching with Confidence!

I was reading this recent article regarding the latest security patches released by Oracle, and it had me thinking about how best to manage the patching cycle.

At the outset it addresses a frightening reality:
“In many cases, companies, especially large ones with many databases, are reluctant to bring down production databases to implement new patches. Many are also wary about deploying untested patches in live environments or need to wait for their packaged application vendors to test and certify the patches before they can be deployed"

“As a result, there usually is a considerable lag time between when a patch becomes available from Oracle and when it gets deployed. In some cases, the lag can be months. Other users simply skip entire patch cycles and choose to deploy the patches on a yearly or twice-yearly basis.”

Furthermore, the article said: “…of 305 Oracle database administrators from 14 Oracle user groups between August 2007 and January 2008 and found that two-thirds of Oracle DBAs apparently are not installing Oracle's security patches at all, no matter how critical the vulnerabilities are.”

This drew the author to make the following stark but obvious conclusion: “…such practices can leave companies dangerously exposed to attacks directed against database vulnerabilities.”

Patch management is not that difficult. So why are organisations still struggling? Many DBAs and Operations Managers I’ve spoken to mention three recurring themes.
  1. Application regression testing - often, because of 3rd-party application certification dependencies, there is a requirement to wait for the next application refresh cycle so that appropriate regression testing can be completed prior to upgrading.
  2. Confidence is often low because - regardless of the fact that an update has gone through certification - many DBAs have a story to tell about a tested patch that worked in dev but behaved differently in production as a result of environmental differences.
  3. The timing of, and time required for, applying an update. DBAs need to allocate time to apply and test the update in a test environment, then they need to find a slot in the upgrade schedule to apply it in production. All of this needs to be multiplied by the number of environments that need updating and the number of updates to be applied. Clearly, mechanisms are required to cut down cycle times and automate recurring tasks.
Patch management requires planning and change management to ensure that each update is prioritised and scheduled accordingly. The DBA should review each vulnerability patch and make a determination on its criticality and impact for their specific environment. Often, not all patches need to be applied, depending on the broader security setup and the criticality of the system.
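That review-and-prioritise step can be sketched as a simple filter and sort. The tuple layout and the CVSS-style scores below are invented for illustration and are not the schema of any Oracle tool:

```python
def prioritise_patches(patches, exposed_components):
    """Order patches by criticality, keeping only those that touch
    components actually exposed in this environment.

    `patches` is a list of (name, severity_score, component) tuples;
    the names, scores and components are hypothetical.
    """
    relevant = [p for p in patches if p[2] in exposed_components]
    return sorted(relevant, key=lambda p: p[1], reverse=True)

advisories = [
    ("CPU-2008-A", 9.0, "listener"),
    ("CPU-2008-B", 4.3, "xml-db"),
    ("CPU-2008-C", 6.8, "listener"),
]
plan = prioritise_patches(advisories, exposed_components={"listener"})
```

Patches touching components that are not exposed in this environment drop out, and what remains is ordered by criticality for scheduling.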

As it stands today, Oracle provides a robust set of tools under our System and Application Management (SAM) portfolio. Real Application Testing, an innovation in Oracle Database 11g, allows database workloads to be efficiently captured in production and replayed for testing purposes. The facility faithfully replays transactions with order and timing preserved. It then reports on the performance of the test and highlights areas of change. Using Oracle's testing capabilities, confidence is increased because tests are based on real production workloads rather than synthetic test scenarios. Testing cycles are therefore shortened, not only because the test is largely automated, but also because of the increased confidence associated with using real production workloads.

Furthermore, Oracle Enterprise Manager Provisioning Pack provides automated provisioning and patching facilities that make it easier to locate and apply patches. It also drastically reduces the effort required to patch multiple systems, which is great news if you’ve got scores of databases to manage.

Finally, when it comes to applying updates to production environments, 11g “hot patching” means it is possible to apply an update to the Oracle binary while the database continues to run, avoiding the need to find even a small downtime window.

The article goes on to say that of DBAs and IT Managers at 300 organisations surveyed by the Independent Oracle Users Group, “…20 percent said they expected their databases to be breached in the coming year”.

Fortunately, these and other innovations in 11g, if utilised, will take the pain out of patching for DBAs and application and system administrators, which in turn will make their lives easier.

Marc Caltabiano
Director, Enterprise Architecture

The Perfect ECM Architecture - Part 1

It's funny trying to explain the perfect ECM architecture to people representing a business unit, or even to IT on occasions. I often get asked 'What is the perfect ECM architecture?' and of course, the answer is normally based upon the perceived requirements. So, to make things easy to understand, I'm writing this blog in 2 parts. The first part, which you are reading now, will focus on the imperfect architecture, really to explain what you should be avoiding, and the second part will talk about the architecture that you should look to deploy.

I'm going to start with an analogy. You walk into a car dealership and talk to the sales person about your requirements for transport. You talk about the need to carry 4 people, in-car entertainment, performance and economy etc. etc. etc. The salesperson shows you a picture of a brand-spanking new vehicle that looks absolutely perfect for your requirements - it is the right colour, has 4-wheels, seats and a radio and the specifications for the vehicle show the right levels of performance and economy. You negotiate on price and come to an agreement, shake hands and set a delivery date. The day arrives when you are to have your new vehicle delivered. You arrive early at the dealership and get walked through the terms of the agreement, you drink some nice coffee and eat some chocolate biscuits. The time arrives and you are shown your new vehicle......

To your surprise, in the car park, is a pile of boxes from very different manufacturers - some of which you even recognise the names of. The engine has been shipped from Germany, the doors come from Geelong, the wheels from the UK, the chassis from the USA, the radio from Japan.... The salesperson explains that what you bought is the best of everything, meeting your precise requirements. What he failed to explain during the sales process is that you need to spend an inordinate amount of time wiring and bolting the car together, and that getting the vehicle serviced once complete will require visits to multiple providers. The salesperson offers the names of several mechanics and auto-electricians who can help you put together your new vehicle - at additional cost, of course.

What do you end up with....?

As far-fetched as this seems, it happens every day around the world with ECM solutions. You think you know what you are getting from the vendor but, more often than not, you actually get something different. The document management solution doesn't talk to the WCM solution. Records policies cannot be applied to collaborative content. The wiki and blog environment is separate and based upon unsupported open-source code. The user interface looks clunky and the solution delivers not one, but many silos of information that you need to manage.

Your organisation doesn't need this. If it happens you will end up spending more money than you ever imagined, and the promise of an enterprise content management environment may never materialise! Your users will face an interface that doesn't provide any value and that impacts their ability to perform their role on a daily basis. Of course, the inverse of this is a nice-looking interface that sits on top of a messy back-end that just causes headaches for the IT department to manage.

What you need is a nice-looking, functional user-interface based upon a complete, integrated and open-standards based back-end that provides the ECM capability that your organisation needs. This is simple, surely?

The perfect ECM architecture, I'll introduce you to that in the next article!


Wednesday, November 19, 2008

Oracle Fusion Middleware Forum

The recent OFM Forum was a great success in Sydney and Melbourne with around 150 people attending each event.  The event covered all facets of OFM with particular attention on the WebLogic Application Grid, Governance, SOA Best Practice and product roadmap information.  Present also were several key Oracle Fusion Middleware partners including Intelligent Pathways, Integral Technology Solutions and Renewtek.

Thanks to all who attended. Below are links to the various presentations and podcast recordings of the speakers.

First up was Matt Wright (ANZ Product Marketing). Matt spoke about the Application Grid and how the concept of pooling compute resources and the services and data layers can result in much greater efficiency and scalability for applications.  Below is Matt’s presentation.  The podcast for Matt's presentation is here.

Next, Ryan Close (CIO of Australian Vintage Limited) spoke about the use of SOA within AVL. This focused on the automation of Order Management and the use of BAM to gain greater business visibility.  Ryan’s presentation is below and the podcast is here.

The next presentation came from Alison Foster (Managing Consultant at Integral Technology Solutions), who spoke about how Business Process Management can enable a more dynamic business environment. Alison’s presentation is below and the podcast recording here.

We then shifted focus to governance with Aaron Blishen (Solution Architect at Intelligent Pathways), who talked about how to avoid becoming an SOA statistic by using the appropriate methodologies, tools and skills to ensure high-quality delivery of projects.  Aaron’s presentation is below and the podcast here.

After a great lunch we had Peter McTaggart (CTO of Renewtek) speak about the use of Agile Methodologies and the relationship of these new project approaches to SOA.  Peter’s presentation is below and the podcast is here.

Wrapping up the day was Saul Cunningham (SOA Business Development, Oracle) who spoke about SOA Adoption and Best Practice.  This presentation looked at the SOA Maturity Model and the typical progression and best practice at the various levels. Saul’s presentation can be found below and the podcast here.

Monday, November 17, 2008

Real Application Testing Certified with E-Business Suite

One of the significant new options of Oracle Database 11g, Real Application Testing, is now certified with Oracle E-Business Suite Releases 11i and 12. This means that E-Business Suite customers who wish to upgrade their database layer from 10gR2 to 11g in preparation for an apps R12 upgrade, or to migrate from a single, siloed, monolithic infrastructure to a RAC cluster as part of a consolidation strategy, now have a tool that will greatly assist with all that expensive, mundane and repetitive testing that needs to be done when migrating a business-critical application.

For more info take a look here:-

Once you have migrated your EBiz to 11g you can then start looking at the benefits of Advanced Compression.


Federating or Consolidation - A buyers guide to REAL ECM and what it means to your existing solutions

If you've recently purchased or are thinking of purchasing a REAL ECM solution from a software vendor - there is one challenge that you will face. What do you do with the information in your current systems?

Firstly, let me qualify 'REAL ECM'. There are not that many vendors in the market who offer a REAL ECM solution - that is, one that is Complete, Integrated and Open, and able to be deployed to the enterprise as a whole. I can count the number on the fingers of one hand, so you will gather that I'm excluding vendors who offer either a point solution (WCM alone, for example) or those who profess to offer the complete solution but don't actually provide such an architecture.

Now to the problem - if you're going down the path of ECM you have more than likely taken the specialist path historically, or have acquired an organisation that had its own solution. Either way, you're now looking at managing your unstructured information effectively and in a controlled manner, and you want to take advantage of the latest and greatest technology and capabilities on offer from the top vendors. If this is you, congratulations, as you've taken the first step towards successfully addressing one of the biggest challenges facing organisations today - the growth of data that lives outside of the database. Within an ECM project, and I've been involved in around 100 significant deployments in my time as an ECM consultant, you will look at two things....

1. The usability, functionality, capability, benefits etc. to be realised by the organisation through the deployment of ECM. If you're following a defined methodology, you will work through the business requirements, functional requirements and solution design stages for each and every project you run. The output from these projects will be new abilities for the business to manage their own information, freeing up valuable time and resource from the IT department. If done well, you'll also save a lot of money and streamline your business processes to the point where you'll sit back and wonder how you managed things before you started!

2. What are you going to do with all the information in your current systems? You will probably have a records-management solution that looks after the physical warehouse and all the paper stored in boxes, a document-management solution that you started to deploy before realising that the end-user change management was too hard, a web-content management solution that didn't meet the needs of the business and ended up costing you more in integration services than the software in the first place, and a collaborative environment where your users can talk to each other in small groups/teams but not as an organisation as a whole. Buying a new ECM solution isn't going to solve world hunger (although I've seen it sold this way in the past), but it should help resolve the issues listed above if deployed in the right way. I've spoken at length in the past about successful projects - find the article and read it if you're still not too sure about what approach you need to follow. From a functional and capability perspective, a modern and integrated ECM solution will provide you with the ability to manage your unstructured information as easily as you manage structured data in your relational-database solutions. From a historical-data perspective, however, there is a challenge.

You have legacy solutions that manage your unstructured information in a fragmented manner - lots of solutions and lots of silos of data, none of which talk to one another. Do you migrate the information to the new ECM solution, or leave it where it is and federate the information?

Migration presents its own challenges. Whilst you may end up with a single solution long-term, you will go through a lot of pain (and dollars) in the process. Migrating information to the new solution isn't particularly easy and certainly is not risk-free. You will have to replicate metadata, move the content, ensure consistency of information and guarantee compliance in the process. You will spend time and money on this approach and probably cannot guarantee success - and this is not the vendor's or SI's fault; sometimes things are just too hard to accomplish. So that leaves federating functionality from the new system to the old solutions....

Federating ECM capability provides two core capabilities....

1. Finding information - enabling a secure enterprise search capability from the new solution across the old repositories means that your users have a single search interface that can deliver results from a multitude of sources. Providing this capability within an ECM solution, rather than a pure-play enterprise search tool, means that information is delivered with context AND relevance to the user, through a system that understands, and has been configured to deliver, the business requirements for ECM.
2. Managing information - enabling you to set up and manage policies that govern the retention and disposition of information as well as storage locations, and applying these to information in existing legacy solutions, means that you have centralised control over your entire enterprise information set and use common and consistent management techniques.
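The retention and disposition side of point 2 can be sketched like this. The field names and retention periods are invented for the example; the point is that one central policy table drives disposition regardless of which repository holds the record:

```python
from datetime import date, timedelta

def due_for_disposition(records, policies, today):
    """Return record ids whose retention period has expired.

    `policies` maps a record type to a retention period in days; the
    same policy applies whether the record lives in the new repository
    or a federated legacy system.
    """
    due = []
    for record_id, record_type, created in records:
        keep_days = policies.get(record_type)
        if keep_days is not None and created + timedelta(days=keep_days) <= today:
            due.append(record_id)
    return due

records = [
    ("r1", "invoice", date(2001, 3, 1)),
    ("r2", "contract", date(2007, 6, 1)),
]
policies = {"invoice": 7 * 365, "contract": 10 * 365}
expired = due_for_disposition(records, policies, today=date(2008, 11, 21))
```

Changing a retention rule in one place then changes disposition behaviour across every federated silo at once.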

Effectively, with a federation approach you leave the data where it is and stop using the existing UIs for content consumption and contribution. You switch over to the new ECM solution for content management across its capabilities and manage/leverage your historical silos of information. Most of your legacy solutions will be database-based, and while you can switch off support for the package itself, you own the data and the database management solution and don't have to switch those off.

Of course, the federation approach won't work for all organisations - each company has its own requirements, and in some cases a migration of all information to the new ECM solution will be a mandatory requirement. Also, there may be some solutions that you won't want to switch off - drawing management, for example - as the actual requirements are pretty specialised and the health and safety of employees and customers may be affected by migration to a system that compromises on functionality.


IT Focus - Internal or External, you decide for your future!

I have just read a fascinating post by Bob Evans over at Information Week entitled Shoot The Mantra: STOP 'Aligning IT With The Business'. In his article, Bob talks about two core concepts...

1. Traditional approaches to IT projects result in solutions that address yesterday's needs for the business and not tomorrow's because of a primarily internally focused requirements-set
2. To be successful tomorrow, companies need to look outside of their business and address the requirements of customers (and partners).

This whole strategy lends itself well to Web2.0 technologies and an Enterprise 2.0 mindset. Enabling a collaborative environment where internal and external users can participate in the business operations and have information delivered in context will allow the company to get closer to what their customers (who pay the bills remember) actually want and help define the future-state strategy in a clearer manner than in the past.

Bob goes on to say that 'this is a subject to which we'll surely be returning in the months to come' - we look forward to reading what else he has to say on this topic. The future of companies today (based upon current economic pressures) can be directly linked to how these very companies view their internal and external operations and communication channels. A company that engages its customers will be successful, one that doesn't is likely to fail.


Friday, November 7, 2008

Don't Give Up

Recently there have been reports of a SOA drop-off in the face of the financial crisis. The stories here and reports here say that since the beginning of 2008 there has been a fall in the number of organisations that are planning to adopt SOA for the first time. These potential late adopters seem to be delaying SOA uptake or flat-out deciding not to adopt SOA. The two key reasons for not adopting SOA are, according to the reports, a lack of skills and a lack of a viable business case. Interestingly, the reports also show that European organisations have a near-universal uptake of SOA, the US a moderate uptake, while in Asia there is a distinct lag.

Does this make sense? Why does Europe find SOA compelling while Asia takes a much more cautious stance? Do the reasons for not up-taking SOA (lack of skills and lack of business case) have any relationship to the financial crisis? Or do they have more to do with culture and phase of economic development?

It is interesting to me that Europe leads the way in SOA adoption. This trend is also evident in other areas such as enterprise modelling. I think it reflects the more structured European approach that appreciates coherent design and long term planning. This is in contrast to the more pragmatic shorter term focus of the US.

Many parts of Asia are in a rapid process of ramping up their IT infrastructure. As such they have a choice – do they go for a quick project win that gives maximum short-term business return, or do they invest in an architected approach that costs more now but less over the longer term? Many organisations are now in the process of regretting the first of these approaches. It would be a major mistake for organisations to sacrifice the long-term effectiveness of their infrastructures for the sake of short-term gain, even though it may be understandable that they think along these lines because of the uncertain financial situation.

Below is a graph showing the long-term cost of different approaches. Over the longer term SOA is so much cheaper. If organisations had embarked on properly architected approaches earlier, they would now be reaping the benefits of lower long-term cost and so would be at a significant competitive advantage in these difficult economic times. This is because SOA is a very cost-effective way of developing new processes and composite applications without big spending on new off-the-shelf applications.

My question at the bottom of all this is simple – if you are not doing SOA, what are you doing instead? Are you just building systems as you need them, connecting them as you need to? I have always felt that asking for a business case for SOA is meaningless. You don't ask for a business case to use an architect, or to have an architecture, when designing a building. You ask for a business case to build the building. But once you decide that the building is viable you don't then demand a business case for actually designing it. It is assumed. Why is it that we treat IT in such a haphazard way?

--Saul Cunningham

Wednesday, November 5, 2008

The Seven Deadly Sins of Modern Information Management

I was doing some research recently into Web and Enterprise 2.0 initiatives and came across an interesting article written by David Galbraith on his blog concerning the 7 deadly sins of Web 2.0. In his article, David describes these as...

1. Obsession with rounded corners everywhere.
2. Pastel colors.
3. Linear blends.
4. Fonts bigger than 15 pixels.
5. Avoiding tables, when they are the best solution.
6. Stretchable text columns that are too wide to read comfortably.
7. Ajax use that makes things difficult to link to.

...which amused me somewhat as it focuses on the presentation aspects of Web 2.0 rather than the fundamental underpinnings of a technology which is supposed to make information access easier for all. I thought about my own area of expertise, information management, and started to jot down what I think the seven deadly sins of this area are (for modern information management systems, anyway).

1. Thinking that Information Management is a one-way process. In today's society, collaboration and participation are central to information management. It is a multi-way process, and every effort should be made to introduce this philosophy wherever possible.

2. Not implementing a truly integrated solution. Most organisations have multiple solutions that each partly address information management for the enterprise. A modern solution should be truly integrated, both internally and externally: within its own environment, the various aspects of functionality should work together seamlessly; beyond it, information from various sources should be integrated (or federated) into a common, single-looking solution.

3. Ignoring security, privacy and intellectual property. With collaboration/participation comes the challenge of security - both from a perimeter and from an information-protection perspective. Again, these areas should be seamlessly integrated (or made available) from an information management solution.

4. Forgetting that you have end-users. Yes, end-users are crucial to any successful deployment, as I've said a few times on this site. Addressing end-user requirements and involving these people in the project and deployment processes will ensure success when rolling out the solution. Catering for modern expectations of information access (particularly those of Gen-Y) is the way forward - as highlighted by Lindsay Tanner in his interview with the Australian IT today.

5. Focusing on navigation NOT search. We used to think that implementing an explorer-like interface for information management was the right thing to do. It isn't! End-users won't always know which folder information has been stored in or the metadata applied by the original author so search IS relevant for an information management solution. Automatically applying metadata and/or extracting relevant information from the content enables any user to find information quickly and efficiently.
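The point about automatic metadata can be made concrete with a minimal sketch (my own illustrative example, not any particular product's approach): extract keywords from each document automatically and build an inverted index, so users can search for content without knowing which folder it lives in or what metadata the author applied.

```python
import re
from collections import defaultdict

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "for", "is", "on"}

def extract_keywords(text):
    """Naive automatic metadata: lowercase terms minus stopwords."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if w not in STOPWORDS}

def build_index(docs):
    """Inverted index mapping each term to the set of documents containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in extract_keywords(text):
            index[term].add(doc_id)
    return index

# Hypothetical documents filed in different "folders" by different authors.
docs = {
    "hr-001": "HR policy on annual leave and public holidays",
    "fin-042": "Expense policy for travel and accommodation",
}
index = build_index(docs)
print(sorted(index["policy"]))  # ['fin-042', 'hr-001'] - found without navigating folders
```

A real system would extract far richer metadata (entities, categories, document type) and rank by relevance, but the principle is the same: the repository, not the end-user, does the classification work.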

6. Not addressing the paper issue. Twenty years ago, managing paper (or physical information of any kind) was inexpensive. Today it is costing many organisations a great deal of money on a daily basis. The cost of managing paper within a process can be huge, and the risk exposure unimaginable in many cases. Get physical information digitised as soon as it arrives in your organisation and use an electronic workflow to manage the review/approval process. Store the paper temporarily (30 days or so), then rely upon an enterprise-class archival solution that addresses your compliance and retention/disposition requirements.

7. Not addressing your compliance requirements. Around the world, government and legal agencies are regularly introducing newer and stricter requirements for managing information within an organisation. The processes for storage, migration, retention and disposition of information can be audited in many environments, and organisations can no longer plead ignorance after deleting vital detail from their information management solutions - and yes, I do include your email solution here as well!

Hope that helps...


Australian Federal Government and Web2.0

I read an interesting article this morning on the Australian IT website where Lindsay Tanner (Finance Minister) was quoted as saying "The rise of internet-enabled peer production as a social force necessitates a rethink about how policy and politics is done in Australia". He went on to say "In the longer term, governments will have to adapt to information's new online center of gravity."

The challenge for companies like Oracle, who provide Web and Enterprise 2.0 technologies to Federal and State Government departments, is changing the mind-set that was established some years ago when EDRMS solutions were first pitched and sold. Departments bought into the grandiose (post edit: meaning more complicated than was actually required at the time) idea that they would be able to effectively manage electronic and physical documents through a single solution - allowing their content-creation processes to be supplemented by formal record-keeping procedures within a single solution. However, in a lot of cases only the records management solution was actually implemented - the document creation environment was often deemed too hard or unnecessary for the department and placed additional constraints on the information workforce.

If Mr. Tanner's approach is to be successful, and we wish him every success in what he is trying to achieve, then these departments have to truly embrace the thought that their staff and their customers (you and I) will require an easy-to-use solution that provides information in context and enables participation within the process. Sounds like a load of marketing speak? Let me explain.....

Ease of access to information requires some radical rethinking of the way information is presented to the end-user. Simply deploying a solution that implements "Windows Explorer" on the Web isn't good enough, and nor is a solution that manifests as a raft of non-integrated silos of information. The solution needs to be enterprise-class and needs to embrace and implement the Web/Enterprise 2.0 way of accessing information. Search rather than navigation is the key here, and to make search work well, information categorisation and classification need to be automated and consistent. Users searching for information will retrieve relevant information based upon the terms that are entered or the keywords/tags selected. This is what we call information-context.

The context in which information is searched for should follow through to the way the retrieved information is presented to the user. If we take the tax office (ATO) as an example, searching for the term "Christmas Trees" could result in many fragments of information being retrieved from the repositories. By providing context to the search, the results can be narrowed, enabling the right information to be presented to the user, ordered by relevance. The user may be searching for the tax rules relating to the sale of Christmas Trees from a private residence - this is the information context, and the end-user facing interface needs to take it into account. Once the information is found, the user may want to start an online conversation about the tax rules - this is what we call participation.

Participation allows multiple people to collaborate on an initiative, which may be as simple as asking a question of a government tax advisor or as complex as the creation of a new tax rule within the tax office itself. In either event, security and privacy become a challenge if you are using some of the historic information management solutions, or even some of the latest offerings from information management vendors. Security needs to be tightly integrated into any environment where participation or collaboration is enabled, and unfortunately it isn't as simple as introducing a directory service with authentication.

When information passes beyond the traditional firewalls of an organisation - through email, or through a laptop being removed from site - there is a seemingly ever-increasing risk of it falling into the wrong hands. Take, for example, a government department working on a new policy affecting certain members of the population. The department is liaising with external organisations including lawyers, consultants and advisors. Information - and I'm referring primarily to documentation (Word, Excel, PPT in general) - can be leaked to the public or to the press, accidentally or otherwise. Implementing a solution where control of access to this information can be guaranteed is a highly desirable requirement when information management embraces Web/Enterprise 2.0. We call this Information Rights Management (IRM).

Mr. Tanner went on to say "that the Government not only had to adapt to a world moving online, but would have to do so at an ever-increasing pace. As a huge creator and manager of information with an obligation to be open and transparent, we have little choice."

We think that Mr. Tanner is onto something here: the government is the largest manager of information and needs to look to implement a more open approach to information access - both internally and externally.


Monday, November 3, 2008

The Future of Education in New South Wales

Earlier this month I attended an AIIA event here in Sydney about the future of ICT in the NSW Department of Education and Training. The speakers were Michael Coutts-Trotter, Director General of NSW DET; Stephen Wilson, CIO; and Pam Christie, Director of the Sydney Institute of TAFE NSW.

Michael Coutts-Trotter was up first. He opened by saying that he had good news for the mainly vendor audience - I was there representing Oracle, and I saw Microsoft, IBM, Telstra, Fujitsu and others. Anyway, the good news for software vendors was that the department spends a lot on ICT and is going to spend a lot more. A round of chuckles from the audience and we were off. First up? Social networks, of course!

He said that when he is in schools he talks to the kids about which social networks they use as a bit of an icebreaker, and the results? Bad news for MySpace. Facebook and Bebo are anecdotally more popular.

If there was one single thing I took from the session it was this, "The use of social networking software for teaching and learning is of profound importance to the department".

He went on to say that "professional development" for teachers is key in order to "equip our staff to take advantage of the technology." By the way, the "average age of teachers in NSW is 47". The Department is currently rolling out a Learning Management and Business Reform (LMBR) Project (powered by SAP) which will, among other things, deliver "principal dashboards" to enable a school principal to easily see and ask, "why is attendance in Teacher A's English class different to Teacher B's English class?". He was then at pains to point out that these dashboards would not just be about management by penalty.

As far as putting technology in the students' hands, he talked about 1:1 computing: "we want to put a laptop in the hands of every year 9-12 student. Estimated to cost $2400 per student over 4 years". NSW is currently seeking extra funding from the Commonwealth. Michael finished up by saying that "the job of leadership in a school - done well - is one of the most extraordinary things you can see".

Stephen Wilson was up next. Some of the highlights from Stephen's talk were that the Department "process 55M emails a month - 40M of which are spam". He talked about the roll-out of Gmail for student email, and said it would be complete by the end of the month. He also talked about rolling out an eBackpack in "early 2009", which would provide "4GB of online storage for students and teachers". He also said that, along with 1:1 computing, the Department is "wirelessly enabling all schools with students in years 9-12".

Pam Christie was the final speaker. Some of her highlights were around a change of focus for TAFE from "school leavers to job seekers and up-skilling existing workers". She also spoke of how TAFE is "moving to a Virtual Learning Environment... including access to Web 2.0 tools, virtual worlds, rich media, content management and learning management systems".

All in all I thought it was a really good event. As a parent with a child going through the NSW Public Education System and as someone who's worked in IT for 20 years or so I came away with a real optimism for the future of teaching and learning.