Thursday, July 31, 2008

IQ Visual Mapping Update

The excitement continues to mount as we move towards the October 2-3, 2008 IQ Visual Mapping Conference to be held in Morristown, NJ. I am very happy to share with you that Michael Scherotter, who is well known to the Mindjet Community, has agreed to speak. Michael was the force behind Mindjet Labs when he was working at Mindjet and was an active member of the mind mapping community. Michael is currently an Evangelist with Microsoft Corporation. As part of Microsoft's Communications Sector evangelism team, Michael focuses on enterprise development tools and technologies for telecommunications, web hosting, and media & entertainment companies. Educated as an architect, with both a Bachelor of Architecture and a Master of Architecture specializing in design tool development, Michael brings a sense of creativity and design to synergistic software development. Throughout his 13-year career in software development, Michael has specialized in combining software in innovative ways. You can read about what Michael is up to at his blog, the Synergist.

Also check out Michael's blog post about the conference; I know you will enjoy it. Hope to see you at the conference!

Service Providers - How Do You Find Good Ones

Within a few hours of each other, I received two requests for referrals to service providers. One request was for eLearning development providers, from a fairly large (5,000+ employee) company based in the US.
We've looked at a few eLearning vendors and haven't been thrilled. We found the three US vendors expensive and/or light on good ID, and the India-based vendors (Tata and Brainvisa) are priced really well, but would require more extensive project, quality and ID management. So the 50K question is: do you have any really high quality referrals – that might also be able to come in at reasonable prices and would be able to turn a 2 or 3-hour ILT into a simple yet sophisticated CBT all the way up to eventual simulations and online custom leadership content?
Likely they would want a provider that is fairly local, but they're not sure about that. They are fairly new to eLearning design and development.

The other request is for providers of new-hire orientation outsourcing companies:
My boss wants me to look up the three (or so) best-in-industry new-hire orientation outsourcing companies. Does anyone have experience in this area to help me out, or at least direct me where to search? That would be MUCH appreciated!
I have a few thoughts on where you might go to search for this, but I would be curious how people who read this blog would go about finding service providers for themselves, or how they would advise these two people to go about it?

For thoughts about this from the service provider's perspective (especially service professionals such as accountants, attorneys, consultants, etc.), take a look at Social Media for Service Professionals and Social Media to Build Reputation and Reach Prospects – More Ideas.

Leading Learning and New Skills

This month on the Learning Circuits Blog, I asked some very leading questions:
If we have responsibility for informal learning, social learning, eLearning 2.0, long tail learning, etc. then ...
  • Don't we have to conclude that learning professionals must be literate in these things?
  • If so, then what should learning professionals do to become literate?
  • Should workplace learning professionals be leading the charge around these new work literacies?
  • Shouldn't they be starting with themselves and then helping to develop these literacies throughout their organizations?
  • And then shouldn't the learning organization become a driver for the organization?
  • And like in the world of libraries don't we need to market ourselves in this capacity?
If we really care about improving performance, then we need to recognize the scope of our Learning Responsibility and to broaden ourselves to go from Learning Objectives to Performance Objectives and Business Needs.

Kimberly McCollum in The networked nature of information fairly calls me out for asking such leading questions -
They are more like a rhetorical rallying cry to the already converted. “Yes! We should!”
However, in defense, I would point out that there's a disconnect between saying "Yes" and the level of understanding and adoption among people in the profession. Go to an ASTD conference and ask about this stuff. You won't find many who are even aware of any of this. I'm personally out to change this, but we are a long way from being in a position to lead.

And let me back up the need with some thoughts from other bloggers on these questions...

The Learning Revolution: Where have all the leaders gone?
It's not necessary to use all the new online tools that are out there but it is necessary to know about them and understand them if for no other reason than it gives you options, and may improve personal and organisational performance.

I don't believe that a learning professional could call themselves as such without being aware of all the latest developments in learning methods/approaches.
Harold Jarche - Skills 2.0
Enabling learning is no longer about just disseminating good content, if it ever was. Enabling learning is about being a learner yourself, sharing your knowledge and enthusiasm and then taking a back seat. In a flattened learning system there are fewer experts and more fellow learners on paths that may cross. With practice, one can become a guide who has already walked a path. As fields of practice and bodies of knowledge expand, a challenge for learning professionals will be to change their tool sets from prescriptive to supportive.
Gina Minks: Adventures in Corporate Education - What Competencies do Knowledge Workers Need?
How can you design with these new tools if you don’t understand them? How can you apply them to your existing systematic learning system if you don’t know what the heck wiki even means? So, yes, learning professionals must learn and use these tools, and then apply the tools to their existing framework.
Clark Quinn - Learnlets: Lead the Charge?
The point being that to truly help an organization you have to move to a performance focus, moving people from novice, through practitioner, to expert, and giving them a coherent support environment. To do this, you need to know what's available. And, consequently, the learning organization has to experiment with new technologies for its own internal workings to determine how and when to deploy them to organizational benefit.
Stephen Lahanas - Welcome to The Revolution
We are indeed at a cross-roads in our perception of what learning can or should be. It is definitely a revolution, one that can be equally applied to both the personal and organizational level.

Those educators who truly believe that the learners come first and that learning is a continual process should not feel intimidated by whatever new technologies emerge that might be applied to education. This is not a threat - it is enhancement that enriches both learners and educators.
Shilpa Patwardhan: Would you trust a firefighter who did not know how to fight fire?
How in the world can we kid ourselves that not keeping up is okay? Would you trust a firefighter who did not know how to fight fire? Would you trust a lifeguard who did not know the latest life-saving techniques? Would you trust a surgeon who did not know the latest surgical procedures? Then why should anyone trust learning professionals who wonder whether they need to be familiar with the latest technology?
Catherine Lombardozzi - The short answer is yes
If a learning professional wants to be a thought leader in his or her organization around how to support learning in the workplace, he or she cannot be illiterate in these new technologies.

As learning professionals, not only do we have to come up to speed on the technologies, we have to develop a clearer understanding of how these 2.0 technologies can be used to support learning. Otherwise, our organizations will stumble, and we’ll wind up behind instead of ahead.

Kevin Shadix - There's no "I" in "We."
A big mistake made by way too many folks is to preach the good word without having gone through the transformation themselves. Web 2.0 represents a whole mind shift, not just a set of tools. It is the power of “we” not “I”. It is about people creating content together, not the lone, brave hero leading the pack. The only way to “get it” is to try it.
Deb Gallo - Lead the charge?
As L&D professionals it’s up to us to be innovative and introduce the business to tools and methods that will ultimately improve business performance.

We need to develop our competencies, skills and comfort levels with these new tools. Unless you try it you won’t really get your mind around the possibilities they bring or how you might use them in the workplace.

Taruna Goel - New Work Literacies - Leading the Way
I don’t want to be a learning professional who is sitting on the fence and talking about new tools and technologies and hasn’t used any!
Several people suggested that we should use caution when considering any leadership role ...

Kevin Shadix - There's no "I" in "We."
2.0 has implications beyond the learning function, and we need to let other groups discover and figure out for themselves how they want to use them.
Clive Shepherd
The application of web 2.0 to organisations is not exclusively a learning issue - it permeates all aspects of the way in which people network and collaborate. First of all, the web 2.0 concept must be appropriate to the organisation, and this is open to question when you're looking beyond knowledge workers. Assuming it is appropriate, champions can come from many quarters. If learning professionals have really bought into the idea and can demonstrate how they are applying it productively, then they are in a good position to lead the charge. If not, someone else from another business function will step in.
Jay Cross - No, no, no, no.
It’s presumptuous to assume learning professionals are going to be “leading the charge.” This is not some independent effort. Organizational stakeholders better be taking the lead. And we’d better be supporting their vision.

Of course we must use network technologies ourselves. Understanding how to apply social networks to improve organizational performance is a prerequisite for shaping learning and development from here on out. People who are illiterate in network technology need not apply.

Some good specific suggestions:

Kerry McGuire - Live and Learn: What's the real question?
  • Find two or three people with wide networks and help them solve a workplace issue using these tools.
  • Recruit other people that are passionate to start sending out the same message.
Christy Tucker - Experiencing E-Learning: Leading by Example
If I had to focus on one single skill, it would be lifelong learning. Perhaps this isn’t a skill so much as an attitude.
Peter Isackson - Phoning it in
I would put my effort into making it work from the bottom up and demonstrate how it can achieve other things than self-promotion.

Wednesday, July 30, 2008

Blogs, Social Networks and LinkedIn Answers

I received a great question from someone relative to my last post - Required Reading for Training Managers where I continue to suggest the benefits of blogging (see Blogging - I'm Pushing Harder Now and Top Ten Reasons To Blog and Top Ten Not to Blog for recaps of much of this). The question was:
What is your assessment of the relative benefits of pure blogging vs LinkedIn Answers and other social networking platform-based discussion venues?
I consider LinkedIn Answers to be quite a different animal. The way I use LinkedIn Answers is to ask specific questions (never open-ended or partially thought out the way I do in a blog) for which I want specific answers, or to find people with expertise whom I can't seem to find by searching LinkedIn normally. This question goes only to my network and then also out to the rest of the world who look at Answers. It's a great way to get help on specific inquiries. But it's just that.

For example, the question above would probably be okay in LinkedIn Answers, but you might want to change it a bit to be more specific.

LinkedIn Answers is limited in time and does not create any kind of sustained conversation. But it does work across both your own and your indirect network.

Social Network based discussions (for example a Ning network) act like communities. I've had several posts talking about the differences between conversation in communities and conversation via blogs (network-based discussion). I'd look at:
There's a fair amount to consider around the differences. To me blogging is a great engine for network-based conversation and continuous learning. You likely can use a community the same way, but most people don't.

Learn and Network with a Blog, ways to promote Blog Discussion, and Types of Blog Discussions are likely also good pieces to look at.

It still comes down to personal style and commitment level (to sustained learning).

But Wendy and Karyn will tell you it has high value for them. Of course, we all know they are both pretty weird.

Tuesday, July 29, 2008

Using aHa!Visual Web Export

I thought it would be a good idea to share with you information about the IQ Visual Mapping Conference by using a mind map. What better way to convey information about the speakers and their bios than to export it to a mind map! In the process, I had the opportunity to use aHa!Visual Web Export, which is a MindManager add-in from aHa!Coaching in the Netherlands. aHa!Visual Web Export allows you to publish the results of a brainstorm or strategy session in the same visual map format the participants saw on their computers, without the need for any other plug-ins or viewers. Using a standard browser you can view your map complete with attachments and notes.

I found using aHa!Visual Web Export to be very easy and straightforward. Once aHa!Visual Web Export is installed, you simply click on the Export tab and select Export as Web Pages. (View the process by clicking here.) At this point you will see that you have a new export template called aHa!Visual Web Template. You simply select the aHa!Visual Web Export template, select where you want the output files to be stored and click OK. Once the process is complete you can view your map in your default browser. The mind map is interactive, and you can click on the topics as well as the embedded documents and attachments to view the contents. To view it on the web, just use your favorite FTP program to upload the files to your server and you are ready to go.

After uploading the files I was able to view the map. I do wish that the mind map could appear bigger on the screen and that you could open and close the nodes as you would in the MindManager 7 application, but overall I was very satisfied with the ease of the process and how aHa!Visual Web Export worked. This really was very easy to do! You can view the mind map on the web by clicking here.

MediaWiki with IDS (courtesy of Jacques Roy)

This is a very quick post, just to reference a blog article by long-time IDS guru Jacques Roy.
Jacques Roy has years of experience with IDS, in particular in extensibility, and he writes a blog.

In one of his articles he tells us that he managed to run MediaWiki with IDS 11.50.
This setup takes advantage of one of the new features of IDS 11.50, the Basic Text Search DataBlade (BTS), which you get for free with it.

As you probably know, MediaWiki is the best known wiki software, and you've probably used it on Wikipedia.
It's very nice to see that it's possible to use IDS with this kind of software, and if you'd like to use it, I'm sure Jacques would be glad to help you.
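For the curious, here is a rough, hedged sketch of the kind of SQL the BTS DataBlade enables, assuming an sbspace (here called bts_sbspace) and a bts virtual processor are already configured; the table and column names are hypothetical illustrations, not MediaWiki's actual schema:

-- Hypothetical table holding page text
CREATE TABLE wiki_page (
  page_id SERIAL,
  body    LVARCHAR(2048)
);

-- Full-text index on the text column, using the BTS access method
CREATE INDEX wiki_page_body_bts ON wiki_page (body bts_lvarchar_ops)
  USING bts IN bts_sbspace;

-- Full-text query answered through the BTS index
SELECT page_id
  FROM wiki_page
 WHERE bts_contains(body, 'informix AND extensibility');
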
I took the opportunity to add Jacques's blog to the list of links.

Required Reading for Training Managers

Great post from Karyn Romeis - Either it matters or it doesn't. It discusses the all too common issue that learning professionals receive assignments to create learning solutions only to find out that the client doesn't really know what they want/need, and worse yet, really doesn't care, or the audience isn't willing to give time/attention to it, or you lack SME attention, etc.
There is often the perception on the part of the client that they should be able to say "We want an elearning about health and safety in the workplace" and we will go away for a few days, only to return with an all-singing, all-dancing piece of elearning that covers exactly what they wanted to cover and includes all sorts of sexy graphics and clever interactions.
Later in the post ... she raises a great question.
So when they say the same things of the management development programme they have asked me to develop, but getting access to SMEs and stakeholders is like pulling teeth, well then I have a hard time believing them. When words and actions are mismatched, the true message is in the actions.

Wouldn't you say?
I've discussed similar kinds of issues in What Clients Really Want : eLearning Technology.

But what really struck me as I read this and also took a look back at some of the discussion between Karyn and Wendy (see Please Stop Throwing Stuff at Me!!!!) ...

This is fantastic stuff that should be required reading for all Training Managers.

It's a great example of the value of blogging. It offers Wendy and Karyn fantastic opportunities to help each other out ... sometimes they are the only two leaving comments. But, it is also quite a treasure trove of information on what works or doesn't work in their worlds.

As I'm thinking about it - likely it's a treasure trove for anyone starting out as an eLearning developer/designer. This is what life is really like. This is what you face. This is how they handle it.

This represents some of the best of what blogging is all about!

Monday, July 28, 2008

Brain 2.0

I had a nice six-hour drive today with my wife, Margaret, who is a former high school counselor and teacher with all the normal credentials, master's degrees, etc. Part of our conversation was about my changing belief about the importance of learning a bunch of facts that someone can look up at a later time. Does a student really need to know all the state capitals? I argued that it was more important for my kids to know:

a. when they might care about a capital in their life (when they might want to know about a capital) and how to look up a capital (and possibly how to check its accuracy if they are just using Google).

than it was for them to know

b. the 50 capitals, states and the locations of the states.

Granted, I am sometimes amazed that people have no clue where a state is, and I'm certainly happy that my kids have done well on the capitals/states tests in their lives, but I'm still pretty adamant that we should be aiming at creativity, synthesis, composition, etc. more than memorization. We need to create students who are knowledge-able rather than knowledgeable.

I also had to vent on my poor wife about a question asked by a history professor in college. The class cost me my 4.0 GPA (and I only had to take it because of a weird rule that I couldn't count my AP US History units toward college world history, so those AP units did me no good ... the horror of the situation). In any case, the question was two-part multiple choice: (1) "What was the population in England in 1800?" (2) "What percentage worked in agriculture?" I actually knew the first part, because I believe that fully 10% of the population had moved to London, which had grown to 1M people. (Now these facts could be completely wrong some 25 years later, but that's beside the point.) I got part 1 correct. The second part I had to guess between 25% and 35% or some such thing, and I still don't remember.

I could have told the professor about the move towards more urban life and away from agriculture, but he didn't ask that. He never found out whether I knew the important concepts that he stressed in the class. No, he had to ask a ridiculous memorization question. I vowed never to ask such a thing, or at least to have all open-book tests so that such questions were useless. And I'm pretty sure I stuck to that pledge. But the rest of the world still asks these questions all the time.

And my wife certainly would. She feels it's still important to teach memorization, and I don't disagree. You still need memory to be able to pull up how to look up the capitals and when it might apply. But my guess is that your brain would be organized significantly differently if you were taught around concepts, and taught when and how to look things up, as opposed to all the little bits.

I get back from the trip and I see a post from Brent - There is no Brain2.0...so why Learning2.0? and I have to jump in and say that I'm not so sure that there's not something along the lines of a Brain 2.0 emerging. I'm not claiming that the brain itself has changed, but instead what's changing is:
  • metacognition
  • metamemory
  • access to information
  • access to other people
  • access to smart systems
all of this changes what the brain needs to do. I would guess that a look inside how the brain of someone born today processes problems when they reach 40, versus the brain of someone who is 40 today, is going to be quite different.

Friday, July 25, 2008

SnagIt 9 and MindManager 7

Some time ago, I did a screencast showing how I use SnagIt with MindManager to add images to your mind maps. Since TechSmith recently released SnagIt 9 along with the MindManager plug-in, I thought that it would be a good idea to show you just how you can effortlessly add images to your mind maps. In this screencast you can see just how easy it is to capture images and send them directly to MindManager 7. You can click on this link or on the image to see the screencast. Enjoy!

Sharing with Crossloop

A couple of days ago I mentioned that I came across a screen sharing application called Crossloop, and just yesterday evening I finally got a chance to work with it with my colleague Wallace Tait. We were talking over Skype and decided that we wanted to take a look at how JCVGantt worked with MindManager in preparation for the IQ Visual Mapping Conference that is planned for October 2-3, 2008. We both started up Crossloop and exchanged the access keys that we entered into the Crossloop widget. In a matter of seconds we were connected, and a dialog box popped up asking for permission to share each other's computer screens.

Once I clicked on the OK button I was able to see Wallace's screen, and I could control his computer as if I were looking over his shoulder. The screen was very clear and there was no noticeable delay when I was moving icons or double-clicking on applications to start them up. Since Wallace could also use the mouse, there were times when we would both try to do something and realize that we had to wait for the other person to relinquish control. Other than that, it was easy to go back and forth with sharing our screens by clicking on control icons that I found very easy and intuitive to use.

Crossloop is a free tool for Windows users that would be ideal for one-on-one webinars and for doing remote support. To use this technology, both computers need to have the Crossloop client installed. Once installed, it is very easy to get your session started. Download Crossloop and let me know how you successfully used it in the work that you do. If you are looking for training in the comfort of your home or office, over the web, in the area of assistive technology, visual mind mapping, or project planning, shoot me an email and I will let you know how we can get started.

Thursday, July 24, 2008

NASA Images

NASA Images was just launched (press release) and it uses software that was partly developed by my firm TechEmpower. It has a bunch of neat features that make it fun to play with, especially the ability to create different kinds of widgets from groups, images and workspaces. Warning: if you go there and like these kinds of things, prepare to spend a bit of time playing.



Jing Turns One!

Happy Birthday to the Jing Project, which celebrated its first year in production. I have used Jing in my courses and continue to use it for quick video tutorials. TechSmith, in honor of Jing's first birthday, has doubled the amount of storage and bandwidth on Screencast. So now you automatically get 2 GB of storage space for all of your Jing screencasts. What a great deal! It looks as if there will be some premium services built around Jing in the coming months, based on the email that I received. So go enjoy your newfound freedom to create more Jing projects!

Webinar Design / Training

In the post - Webinar Software - Adoption Advice - one of the comments I made was:
If you've done webinars, you know that designing and delivering them successfully is different from other kinds of presentations and training.
This is something I learned the hard way. My very first presentation was a large public presentation where they had muted the entire audience (dead silence) and there was no moderator. I was alone in my office. Holding a handset (ouch). So, five minutes into the presentation, I felt completely disconnected from the audience because I hadn't planned for ways to connect and make sure I was doing okay. It was an awful feeling.

Out of that comment, I received a question:
Your post from yesterday brought up an important point and one that I had not considered. That of appreciating the difference between Webinars and other online presentations. I would LOVE to learn more. Today, coincidentally, I am doing 4 webinars (I set aside one day a month for this as a primary outreach tool). Now that you have made this interesting observation, I need to get trained. Where would you suggest I look to understand the nuances? We use GoToWebinar for our events.
I immediately thought of Karen Hyder - who helped me prepare for a couple of sessions and is really great and really thorough. She definitely helped up my game. I also thought of Ann Kwinn and Ruth Clark's book - The New Virtual Classroom.

Knowing who asked the question, he's actually a pretty sophisticated presenter. He doesn't need the basics. And likely he's actually asking about information on differences in design between different types of online meetings (webinars, online presentations, online workshops, online classrooms). And then design specifics for these.

So, I'll definitely make an introduction to Karen, but I'm wondering what else you would recommend around this. Likely a lot of us can use help in this area. Any pointers are appreciated.

Wednesday, July 23, 2008

Webinar Software - Adoption Advice

Good question about adoption of webinar software/services - this time from someone who went to my workshop in Cincinnati - Revolution in Workplace Learning - and who, by the way, called it "awesome." Here's the question:
We are currently moving toward web-based training for an external audience and have been experimenting with a modest product called Ready Talk. It doesn't have any bells and whistles like web cam compatibility or video streaming, online polling, white boards or anything cool like that. We are considering moving to other products such as Adobe Connect.

We are looking into a contract where we have to purchase a minimum of five licenses for the platform. I think we'll feverishly use three of them, possibly four, but my office manager thinks I am nuts. I've made the case, or at least the statements, on the reduced costs for travel, lunch, and copies we can expect.

What I fear is that we are secretly not committed to the shift in the way we meet. Our webinars have been successful by our standards and for the most part, we've been early adopters of the technology. This tool would make our web-based training and consulting work worlds better with more opportunities for engagement and collaboration online.

I just want to be sure that I'm not committing too heavily.
To me there are a few additional questions embedded here (between the lines):
  • Do you need to commit to that particular tool? Can you change out technically fairly easily? Can you change out contractually? Any experience with being able to try these things out and possibly moving later?
  • Anyone have concern about adopting Adobe Connect vs. the myriad of other solutions out there? Has anyone had enough experience with Adobe Connect delivered to a wide audience that you can say what kinds of issues they might expect?
  • What about all the features mentioned? Are those important in practice?
A few thoughts.

First some findings from the eLearningGuild's recent research report on Synchronous Learning Systems:
  • Guild members are resoundingly positive in giving synchronous learning very high marks for its impact on their organizations. Specifically, 94.7% are convinced that a SLS is essential to their organization.
  • WebEx enjoys the largest market share with 42.6% of Guild members that use a SLS indicating that they use WebEx Training Center. This is followed by Microsoft with 29.3%, Adobe with 24.7%, and Citrix Online with 11.7%.
  • 30% of Guild organizations that use a SLS use more than one tool in their organizations.
  • Members that receive formal training on how to deliver synchronous learning report much better results than those that receive little or no training.
The last bullet is an excellent point. If you've done webinars, you know that designing and delivering them successfully is different from other kinds of presentations and training.

In addition to the four they list, I've personally had experience with GoToMeeting and GoToWebinar, Elluminate, Centra and Interwise. I've run into a few technical issues with Interwise before. I used to with Elluminate as well, but have not had many issues recently.

I personally often adopt whatever is most easily available. I tend not to use a lot of the different features, but certainly like having polling, recording, chat, and screen sharing. In fact, after you've had these features, it sometimes feels weird to be in a presentation hall and not be able to easily poll the audience (and not have a back-channel automatically).

Those are my quick thoughts, but my guess is that there's lots more thoughts out there on what to do around choosing webinar software.

Tuesday, July 22, 2008

PersonalBrain 4.5 Released

I have written a great deal during the past year about the PersonalBrain and I am glad to announce that PersonalBrain 4.5 was released today with a slew of new features. Here are the new features that you will find in PersonalBrain 4.5:

PersonalBrain 4.5 helps people mind map their Thoughts, ideas and information with new visualization, searching and editing features. New features include:

• Increased Performance - PersonalBrain's dynamic user interface enables one to visually navigate related information. All operations are now faster, from loading to creating information, and navigation is up to 30 times faster. This increased speed opens up the ability to easily create Brains with hundreds of thousands of Thoughts.

• Advanced Search and Indexing - PersonalBrain features a powerful search that lets users shift focus to anything that comes to mind in seconds. Live search results appear as users type, prioritized based on usage data. Version 4.5 intelligently interprets requests so the initials of a person can be used instead of their full name for example.

• Superior Distant Thought View (http://www.thebrain.com/site/video/pbViews.html) - PersonalBrain's exclusive display engine makes it a powerful visualization tool for concept and mind mapping. The engine continues to evolve in version 4.5 with an optimized view of second generation Thoughts for easier reading and an even more intuitive visual browser.

• Merging Brains and Copying Thoughts - Users can now easily merge the contents of one Brain into another and use sophisticated copying functions to leverage a set of Thoughts and relationships in different contexts.

• Excel and Word Relationship Import (http://www.youtube.com/watch?v=jV533xqxQl4) - Information structures from Word, Excel and other applications can now be imported into PersonalBrain. Outlines constructed outside of PersonalBrain are easy to import and are automatically visualized. The imported structures can be left as is or leveraged with the sophisticated tools available in PersonalBrain. Conversely, information can also be exported back out of PersonalBrain just as easily.

• Complete Undo and Redo Editing - PersonalBrain's intuitive drag-and-relate technology lets users create and link numerous ideas and Thoughts in seconds. Now with complete undo and redo capabilities, users can make connections even faster. If a change is made and then reconsidered, the multiple-level undo buffer makes going back a one-step process.

Pricing and Availability

PersonalBrain 4.5 is a complimentary upgrade for all registered PersonalBrain 4.0 users. PersonalBrain is offered in three editions: Free, Core, and Pro (the most powerful edition). PersonalBrain Free is offered at no charge and does not expire. Download your PersonalBrain at http://www.thebrain.com/download. Using the free edition anyone can link unlimited ideas, notes and Web information on their desktops. Pro and Core editions are priced as follows: PersonalBrain Pro $249.95, PersonalBrain Core $149.95.

PersonalBrain 4 is compatible with Windows XP, Vista, Linux, and Mac OS X.

Award-Winning Technical Support


If you have not experienced the PersonalBrain you really owe it to yourself! Here is a link to the Press Release.

Monday, July 21, 2008

eLearning Curriculum

I received a question today and thought that I should post it so that people can weigh in with resources and suggestions.
We are designing curriculum for a graduate program aimed at building skills to work in the eLearning and eContent design and development domain. What belongs in a curriculum that prepares students to work in the eLearning field?
Obviously, the starting point for this would be existing programs at other schools ...
Any suggestions on good programs that they should emulate?
Certainly look at some of the comments in: Masters Education Technology for some thoughts from people who have attended different programs.

Other related questions ...
What's missing today in these programs that they should consider adding?
Where else should they look for information to help design a good program?
Any help would be appreciated.

Sunday, July 20, 2008

Compliance: Role Separation and Audit (part II)

In the first article about auditing, I tried to show how to configure an Informix Dynamic Server instance to use role separation. The purpose was to prepare the instance for implementation of the various roles involved in the management and audit of the instance.
In this article I will focus on the audit process. We will see how to set up the audit facility and how to define what and who is audited.
Keep in mind the previous article's main point: the administration and auditing is split across several roles. Whether these roles are played by completely different sets of people is really a matter of why you are doing auditing. If you plan to set up a proper auditing environment, and want to be prepared for internal or external auditors, then this role separation is crucial. On the other hand, if you're planning to implement auditing in order to gain more knowledge about your instance (what is being done, how often, by whom, from where...), then you can consider having each person play more than one role.

In the following paragraphs I will try to answer the following questions:

* What can be audited?
* Who can be audited?
* How do we define who/what is audited?
* How do we control the auditing facility?
* Where and how does the engine store the audit information?
* What do the audit logs contain?
* How painful is the auditing?

After these questions I will give some examples and make a few considerations.
Let's start with the questions...


What can be audited?

The Informix auditing facility permits us to activate auditing per event. This means that we can audit actions. An action can be a command execution, a GRANT operation, execution of stored procedures, creation of tables, etc. I have to be clear about one point: currently, we cannot define auditing based on objects. We can only define auditing on actions (and users, as we will see next). This means you can activate auditing for GRANTs, but not for GRANTs on a specific table. You can activate auditing for execution of stored procedures, but not for a specific stored procedure. Some other RDBMSs do exactly the opposite: you can audit objects and not actions. In a perfect world we'd probably like to have both.

But if we think about it, we can understand Informix's perspective. First we should keep in mind that a user can only do what he is allowed to do. If we're trying to answer the question "who did this?", a vague but truthful answer would be "one of the persons who has the privileges to do that". This will not make auditors happy, but my point is that you can take two paths: define auditing for your sensitive objects, or define auditing for your sensitive users. Let's see some examples... In Informix you can audit actions like DROP CHUNK, DROP INDEX, DROP TABLE etc... Would you need to define the object in the auditing definition? Of course not... It's better to define the auditing on the action. The WHO and WHAT will be included in the audit log. You can also define auditing for CREATE TABLE, CREATE PROCEDURE etc. These also can't be established on objects, because they refer to new objects.
If you're thinking about using auditing to see who saw what in your database, you probably have to reconsider your strategy. As I wrote before, people can only do what they're allowed to do. A user can't do more than their privileges let them. If you want finer control over what people can access, you should probably implement Label Based Access Control (LBAC) to prevent the access, instead of trying to check whether people accessed what they should not.


Who can be audited?

In short: anyone. We activate auditing for users, and users here means anyone in any of the roles. In this context a user can be a member of the DBSA group, a member of the users allowed to connect to the instance, or a member of the DBSSO group. So everybody that can access the instance, that can manage the instance, or that can define what is audited, can be audited.
I would like to emphasize this: without any other product, just with the bare Informix software, you can establish a strong auditing facility that allows you to implement role separation, and you can effectively control your users' actions, especially those of your power users, like the DBAs, DBSAs and DBSSOs. Besides this, you also guarantee the integrity of the audit logs against your DBAs and DBSSOs, meaning that they can't change anything or do any cover-up action without being spotted by the AAOs. This means more value for your money. Other RDBMSs will require other products to implement this. Again, root is still a possible problem, while user informix is a problem which you can control.

How do we define who/what is audited?

I hope you're ready for the technical part! OK... In Informix, we have a long list of mnemonics that represent auditable actions. Theoretically, everything we can do within an instance is an action and has a mnemonic for it. To give you a few examples, let's consider INSERTs, DELETEs, UPDATEs, CREATE TABLE etc., and a few more esoteric ones like CREATE CAST, onaudit, onbar...
Each of these has a mnemonic: INRW, DLRW, UPRW, CRTB, CRCT, ONAU, ONBR.

So, as we saw before, we can audit actions and users. We specify the actions using a list of mnemonics, but how do we specify the users? By using the concept of masks. Informix has three standard masks, usually called global masks, and we can also define the ones we want. Each mask will be defined with a list of mnemonics by the DBSSOs. The three global masks are:
  1. _default
    This mask will be applied to any user that doesn't have a specific mask
  2. _required
    This mask will be added to the user mask (or the _default if there is not a specific user mask)
  3. _exclude
    This mask will contain actions that should not be audited
The masks are applied in the order above: first the user (or _default) mask, then _required and then _exclude. Any conflict will be resolved by the order of the masks, meaning for example that if you specify the same action (mnemonic) in both the _required and _exclude masks, that action will not be audited.

You probably noticed that these masks start with an underscore. All masks that start with an underscore are called template masks, and you can use them to create specific user masks.
Note that if you do this and later change the template, it will not change the user's mask.
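As a rough, hedged sketch of how a DBSSO might work with masks, using the flags from the onaudit usage shown below and the mnemonics introduced above (the template name _secure and the user jdoe are hypothetical):

# Create a template mask with a few table-related events (run as a DBSSO)
onaudit -a -u _secure -e +CRTB,GRTB,RVTB

# Create a user mask for jdoe based on that template, adding row deletes and updates
onaudit -a -u jdoe -r _secure -e +DLRW,UPRW

# Show the resulting mask
onaudit -o -u jdoe
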

How do we control the auditing facility?

The auditing facility is mainly controlled using the onaudit command. With it, DBSSOs can define and change the user masks, and AAOs can start, stop and reconfigure the audit facility.
Let's check the command usage:


Usage: onaudit [-f file] [-u name] [-r bmsk] [-e eset] [-y]
onaudit [-c] [-n] [-l lev] [-e err] [-p path] [-s size]
action: one of
-a -- add a mask
-d -- delete a mask
-m -- modify a mask
-o -- output a mask
-r bmsk -- name of basemask
-c -- print audit configuration
-n -- start new log file
-l lev -- set ADTMODE
-e err -- set ADTERR
-p path -- set ADTPATH
-s size -- set ADTSIZE
-f file -- include instruction file
-u mask -- name of target/mask
-e eset -- event set added to (+) or removed from (-) mask
-y -- respond yes to all prompts


So, we can create a mask (-a), drop a mask (-d), change a mask (-m) or show a mask (-o). These would be options for the DBSSOs.
For AAOs, we can change the audit mode (-l), change the behavior in case of error (-e), set the audit log files' location (-p) and each audit log file's size (-s). I will show some examples later, but it's important to explain some of the settings first. Let's take the audit level. We can define the following audit levels:
  • 0 - Means audit off
  • 1 - Audit is on. No DBSSOs or DBSAs actions are automatically audited
  • 3 - Audit is on and all the DBSSOs actions are automatically audited
  • 5 - Audit is on and all the DBSAs actions are automatically audited
  • 7 - Audit is on and all the DBSSOs and DBSAs actions are automatically audited
One very important option to note is that the AAOs can define a level that automatically audits the actions of the DBSAs and/or the DBSSOs. This prevents any DBSSO from manipulating the audit masks to remove himself from the audited users. It also shows the importance of implementing real role separation.
Another note about the audit levels: you may notice that the levels (besides 0, which means off) are all odd numbers. If you're using older versions of IDS you may see references to even numbers. These were equivalent to the ones specified above, but instead of writing to log files the audit facility would send the audit info to the underlying operating system's auditing facility. This was not standard and didn't work on all platforms. In the latest IDS versions the support for those levels was discontinued.
Another important configuration is the audit error mode (-e). This defines the behavior of the instance when it needs to audit something but can't write to the audit log:

  • 0 - Continues the normal processing, without writing the audit log
  • 1 - Suspends the thread/session that needs to be audited, and keeps trying to record the audit log
  • 3 - The instance shuts down
So, you have to choose how important the audit is for you... If you can live without it, set it to 0. If you think you can continue to work on the instance (some sessions may not be audited), set it to 1. If you consider auditing to be crucial, you can set it to 3, meaning the instance will shut down if it finds a problem when trying to write the audit log.

The other options all relate to audit mask creation and manipulation and are more or less self-explanatory.
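Putting the AAO options together, here is a hedged sketch of how the facility might be turned on (the path and the log size are arbitrary example values, not recommendations):

# Run as an AAO: auditing on with DBSSO actions audited automatically (-l 3),
# suspend the offending session and retry on audit-write errors (-e 1),
# write logs to /usr/informix/audit with roughly 1 MB per file (-s, in bytes)
onaudit -l 3 -e 1 -p /usr/informix/audit -s 1048576

# Print the resulting audit configuration
onaudit -c
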

Where and how does the engine store the audit information?

There are three types of audit information in the auditing facility:

  1. The audit masks definition
  2. The audit facility configuration (level, path, file size and error mode)
  3. The audit logs or the audited information
The audit mask definitions are stored in the sysmaster database and will survive any reboot. The sysmaster database has several tables containing audit configuration info: sysaudit, sysadtinfo, syscrtadt.

The audit facility configuration is stored in a file called adtcfg.<servernum> located in $INFORMIXDIR/aaodir. Please note that there is an annoying bug related to the usage of this file: whenever you change the audit facility configuration this file is updated, but when the server starts it looks at the contents of the file called adtcfg located in the same directory. So you must copy the adtcfg.<servernum> file to adtcfg. This is also one of the reasons why this directory should be unique for each $INFORMIXDIR you have on the same system. This bug did not happen in my testing environment with IDS 11.50.UC1; I hit it with previous versions (7.31.FD7 and 10.00.FC4).
This file contains the following configuration parameters:

  • ADTMODE
    The audit level
  • ADTPATH
    The path where the audit log files are created
  • ADTSIZE
    The size for each audit log file (in bytes)
  • ADTERR
    The error mode of the audit facility

Finally, the audit logs will be created in the path specified by the -p option of onaudit or by the ADTPATH configuration parameter of the adtcfg file.
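Purely as an illustration, after the onaudit configuration sketched earlier, the adtcfg.<servernum> file might contain something roughly like the following (these are the hypothetical values used above, not defaults):

ADTMODE 3
ADTERR  1
ADTPATH /usr/informix/audit
ADTSIZE 1048576

# On the older versions affected by the bug mentioned above, copy it over the
# file the engine reads at startup
cp $INFORMIXDIR/aaodir/adtcfg.<servernum> $INFORMIXDIR/aaodir/adtcfg
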

What do the audit logs contain?

The audit logs are text files that contain an entry for each audited action that is run.
Each line has the same structure, but some fields have different meanings for different mnemonics or actions.
IDS provides a utility called onshowaudit that reads these audit logs and generates pipe-separated output that you can load into a table with the following structure:


CREATE TABLE frag_logs (
adttag CHAR(4), -- The audited mnemonic
date_time DATETIME YEAR TO FRACTION(3), -- The action timestamp
hostname CHAR(18), -- The client hostname
pid INT, -- The client PID
server CHAR(18), -- The DB server name
username CHAR(8), -- The username
errno INT, -- The error number
code CHAR(4), -- The error code
dbname CHAR(18), -- The database name
tabid INT, -- The table or object ID
objname CHAR(18), -- The object name
extra_1 INT, -- Extra info specific for the mnemonic
partno INT, -- Partition number (useful for partitioned tables)
row_num INT, -- The row ID
login CHAR(8), -- The database login
flags INT, -- Flags specific for each mnemonic
extra_2 VARCHAR(160,1) -- Extra info specific for the mnemonic
);


The audit logs will have names with the following nomenclature: <servername>.<log_number>. The log_number is a sequential number: when the current file reaches the size of ADTSIZE, this number is incremented and a new file is created. Only the AAOs will be able to access these files, so once again, this prevents any change by the people who manage the instance (DBSAs) or by the people who define the audit masks (DBSSOs).
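A hedged sketch of how the extraction and load might look, assuming onshowaudit's -I option (read the Informix-format audit trail); the output file, the audit_db database name and the use of the frag_logs table from above are illustrative:

# Run as an AAO: extract the Informix audit trail to a pipe-delimited file
onshowaudit -I > /tmp/audit_extract.unl

# Load it into the frag_logs table defined above (audit_db is a hypothetical database)
dbaccess audit_db - <<'EOF'
LOAD FROM "/tmp/audit_extract.unl" DELIMITER "|" INSERT INTO frag_logs;
EOF
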


How painful is the auditing?

This is the million dollar question. Every customer that thinks about implementing the audit functionality asks how much impact it will have on their instances. And usually they want an answer specifying a percentage of performance impact... Unfortunately for me, I will never receive a million dollars by answering this question, simply because there is no answer!
Let's keep this topic on the technical level... How can we specify the impact if we don't know what (and who) will be audited? And even if we did, how could we specify the impact, if we don't know how many times the audited actions will happen?
Although I really think there is no answer, we can of course discuss this topic... For example, let's consider that we want to audit everybody, but we only want to audit session creation (STSN) and the GRANTs and REVOKEs on tables (GRTB and RVTB). In a normal system you can do this and probably won't even be able to measure a performance impact... I mean, how many CONNECTs and GRANT/REVOKEs do you do?
On the other hand, let's imagine you want to audit every table row that is read by anyone (RDRW). I'd bet your instance will suffer a very large impact.
So, regarding performance costs, what we need is to clearly define the needs of the auditing process, and to accept that some things just can't be audited (unless you reduce it to a few users for example). Establishing the RDRW mnemonic may be acceptable if you define it just for your DBAs... In short, the million dollar answer would be "it depends", but no one would pay so much money for such a poor answer...
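As a hedged illustration of that low-impact scenario (the dbauser account name is hypothetical):

# Audit only session creation and table-level GRANT/REVOKE for everybody
onaudit -a -u _default -e +STSN,GRTB,RVTB

# Additionally audit row reads, but only for a specific DBA account
onaudit -a -u dbauser -r _default -e +RDRW
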

Examples

I will try to show some very basic configuration of the audit facility, in order to demonstrate the concepts described above. I will implement role separation. Let's start with a fresh IDS 11.50 installation in /usr/informix:



[root@PacMan srvr1150uc1]# id
uid=0(root) gid=0(root) groups=0(root),1(bin),2(daemon),3(sys),4(adm),6(disk),10(wheel)
[root@PacMan srvr1150uc1]# pwd
/usr/informix/srvr1150uc1
[root@PacMan srvr1150uc1]# ls -lia
total 132
750742 drwxr-xr-x 25 informix informix 4096 Jul 13 17:25 .
456970 drwxr-xr-x 12 informix informix 4096 Jul 13 17:14 ..
750755 drwxrwxr-x 2 informix informix 4096 Jul 13 17:25 aaodir
750756 drwxr-xr-x 2 informix informix 4096 Jul 13 17:26 bin
750757 drwxrwxr-x 2 informix informix 4096 Jul 13 17:25 dbssodir
752279 drwxrwxr-x 4 root root 4096 Jul 13 17:25 demo
750767 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 doc
750760 drwxrwxr-x 4 informix informix 4096 Jul 13 17:26 etc
751451 drwxr-xr-x 12 informix informix 4096 Jul 13 17:25 extend
752645 drwxrwxr-x 2 root root 4096 Jul 13 17:25 forms
750774 drwxr-xr-x 7 informix informix 4096 Jul 13 17:25 gls
751483 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 gsk
751478 drwxr-xr-x 2 informix informix 4096 Jul 13 17:26 gskit
752401 drwxrwxr-x 4 root root 4096 Jul 13 17:25 help
750754 drwxr-xr-x 2 informix informix 4096 Jul 13 17:25 ids_license
750762 drwxr-xr-x 6 informix informix 4096 Jul 13 17:25 incl
752232 drwxrwxr-x 2 root root 4096 Jul 13 17:25 ism
750758 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 lib
750751 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 master_license
750771 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 msg
750764 drwxr-xr-x 3 informix informix 4096 Jul 13 17:25 release
752736 drwxrwxr-x 4 root root 4096 Jul 13 17:25 snmp
751482 drwxrwx--- 2 informix informix 4096 Jul 13 17:25 ssl
750749 drwxrwx--- 2 informix informix 4096 Jul 13 20:39 tmp
752765 drwxr-xr-x 2 informix informix 4096 Jul 13 17:25 uninstall_ids1150
751450 -rwxr-xr-x 1 root informix 29713 Apr 25 00:42 uninstallserver
[root@PacMan srvr1150uc1]#


Then, using the ixvirtdir script I'll create a new INFORMIXDIR called srvr1150uc1_cheetah2:


[root@PacMan srvr1150uc1]# ixvirtdir /usr/informix/srvr1150uc1/ /usr/informix/srvr1150uc1_cheetah2 11.5 ixdbsa ixaao ixdbsso


[root@PacMan srvr1150uc1]# ls -lia /usr/informix/srvr1150uc1_cheetah2/
total 24
456978 drwxr-xr-x 6 informix informix 4096 Jul 14 00:41 .
456970 drwxr-xr-x 13 informix informix 4096 Jul 14 00:41 ..
457097 drwxrwx--- 2 informix ixaao 4096 Jul 14 00:41 aaodir
457079 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 bin -> /usr/informix/srvr1150uc1/bin
457096 drwxrwx--- 2 informix ixdbsso 4096 Jul 14 00:41 dbssodir
457080 lrwxrwxrwx 1 informix informix 30 Jul 14 00:41 demo -> /usr/informix/srvr1150uc1/demo
457081 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 doc -> /usr/informix/srvr1150uc1/doc
457098 drwxrwxr-x 4 informix ixdbsa 4096 Jul 14 00:41 etc
457082 lrwxrwxrwx 1 informix informix 32 Jul 14 00:41 extend -> /usr/informix/srvr1150uc1/extend
457083 lrwxrwxrwx 1 informix informix 31 Jul 14 00:41 forms -> /usr/informix/srvr1150uc1/forms
457084 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 gls -> /usr/informix/srvr1150uc1/gls
457095 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 gsk -> /usr/informix/srvr1150uc1/gsk
457094 lrwxrwxrwx 1 informix informix 31 Jul 14 00:41 gskit -> /usr/informix/srvr1150uc1/gskit
457085 lrwxrwxrwx 1 informix informix 30 Jul 14 00:41 help -> /usr/informix/srvr1150uc1/help
457092 lrwxrwxrwx 1 informix informix 37 Jul 14 00:41 ids_license -> /usr/informix/srvr1150uc1/ids_license
457086 lrwxrwxrwx 1 informix informix 30 Jul 14 00:41 incl -> /usr/informix/srvr1150uc1/incl
457087 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 ism -> /usr/informix/srvr1150uc1/ism
457088 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 lib -> /usr/informix/srvr1150uc1/lib
457093 lrwxrwxrwx 1 informix informix 40 Jul 14 00:41 master_license -> /usr/informix/srvr1150uc1/master_license
457089 lrwxrwxrwx 1 informix informix 29 Jul 14 00:41 msg -> /usr/informix/srvr1150uc1/msg
457090 lrwxrwxrwx 1 informix informix 33 Jul 14 00:41 release -> /usr/informix/srvr1150uc1/release
457091 lrwxrwxrwx 1 informix informix 30 Jul 14 00:41 snmp -> /usr/informix/srvr1150uc1/snmp
457221 drwxr-xr-x 3 informix informix 4096 Jul 14 00:41 ssl




We need to make oninit executable by anyone:


cheetah2@PacMan.onlinedomus.net:informix-> ls -lia $INFORMIXDIR/bin/oninit
750793 -rwsr-sr-x 1 root informix 15274592 Jul 13 17:26 /usr/informix/srvr1150uc1_cheetah2/bin/oninit


Then I set up several users for the several roles:
  • dbsauser belongs to the ixdbsa group and will be the DBSA
  • ssouser belongs to the ixdbsso group and will be the DBSSO
  • aaouser belongs to the ixaao group and will be the AAO
I will not use user informix except for storage file creation.
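For reference, a minimal sketch of how these users and groups might be created on a Linux box (run as root; the names match the ones passed to ixvirtdir above):

# Create the role-separation groups used by ixvirtdir
groupadd ixdbsa
groupadd ixdbsso
groupadd ixaao

# Create one user per role, each with its role group as primary group
useradd -g ixdbsa dbsauser
useradd -g ixdbsso ssouser
useradd -g ixaao aaouser
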
So, as user dbsauser I will initialize the instance:


cheetah2@PacMan.onlinedomus.net:dbsauser-> id
uid=506(dbsauser) gid=506(ixdbsa) groups=506(ixdbsa)
cheetah2@PacMan.onlinedomus.net:dbsauser-> oninit -iwv

This action will initialize IBM Informix Dynamic Server;
any existing IBM Informix Dynamic Server databases will NOT be accessible -
Do you wish to continue (y/n)? y
Checking group membership to determine server run mode...succeeded
Reading configuration file '/usr/informix/srvr1150uc1_cheetah2/etc/onconfig.cheetah2'...succeeded
Creating /INFORMIXTMP/.infxdirs...succeeded
Creating infos file "/usr/informix/srvr1150uc1_cheetah2/etc/.infos.cheetah2"...succeeded
Linking conf file "/usr/informix/srvr1150uc1_cheetah2/etc/.conf.cheetah2"...succeeded
Checking config parameters...succeeded
Writing to infos file...succeeded
Allocating and attaching to shared memory...succeeded
Creating resident pool 4818 kbytes...succeeded
Allocating 50016 kbytes for buffer pool of 2K page size...succeeded
Initializing rhead structure...
succeeded
Initialization of Encryption...succeeded
Initializing ASF...succeeded
Initializing Dictionary Cache and SPL Routine Cache...succeeded
Bringing up ADM VP...succeeded
Creating VP classes...succeeded
Onlining 0 additional cpu vps...succeeded
Onlining 2 IO vps...succeeded
Forking main_loop thread...succeeded
Initializing DR structures...succeeded
Forking 1 'soctcp' listener threads...succeeded
Starting tracing...succeeded
Initializing 8 flushers...succeeded
Initializing log/checkpoint information...succeeded
Initializing dbspaces...succeeded
Opening primary chunks...succeeded
Opening mirror chunks...succeeded
Validating chunks...succeeded
Creating database partition...succeeded
Initialize Async Log Flusher...succeeded
Forking btree cleaner...succeeded
Initializing DBSPACETEMP list...succeeded
Checking database partition index...succeeded
Initializing dataskip structure...succeeded
Checking for temporary tables to drop...succeeded
Forking onmode_mon thread...succeeded
Starting scheduling system...succeeded
Verbose output complete: mode = 5
cheetah2@PacMan.onlinedomus.net:dbsauser->
cheetah2@PacMan.onlinedomus.net:dbsauser-> onstat -

IBM Informix Dynamic Server Version 11.50.UC1 -- On-Line -- Up 00:00:39 -- 88060 Kbytes

cheetah2@PacMan.onlinedomus.net:dbsauser-> onstat -m

IBM Informix Dynamic Server Version 11.50.UC1 -- On-Line -- Up 00:00:41 -- 88060 Kbytes

Message Log File: /usr/informix/logs/cheetah2.log
23:12:03 Checkpoint Statistics - Avg. Txn Block Time 0.000, # Txns blocked 0, Plog used 134, Llog used 1123

23:12:04 'sysmaster' database built successfully.
23:12:05 'sysutils' database built successfully.
23:12:05 'sysuser' database built successfully.
23:12:11 Building 'sysadmin' database ...
23:12:11 dynamically allocated 2000 locks
23:12:12 Loading Module
23:12:13 Unloading Module
23:12:13 Loading Module
23:12:13 'sysadmin' database built successfully.
23:12:13 Logical Log 3 Complete, timestamp: 0x20a1b.
23:12:13 SCHAPI: Started dbScheduler thread.
23:12:13 SCHAPI: Started 2 dbWorker threads.
23:12:13 Checkpoint Completed: duration was 1 seconds.
23:12:13 Wed Jul 16 - loguniq 4, logpos 0x206018, timestamp: 0x214ce Interval: 6

23:12:13 Maximum server connections 1
23:12:13 Checkpoint Statistics - Avg. Txn Block Time 0.000, # Txns blocked 1, Plog used 177, Llog used 2086


Perfect... Instance up and running, without user informix... OK, I cheated a little bit: I had previously touched/chmod'ed the rootdbs chunk file and given group permissions on the $INFORMIXDIR/etc/$ONCONFIG file...
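For reference, the "cheating" was roughly this, done as user informix before the oninit above (the exact permissions you give the ONCONFIG are a policy decision; I'm assuming group read/write is enough):


# as user informix, before initializing the instance as dbsauser:
touch /usr/informix/discos/cheetah2/root_dbs.c1      # pre-create the root chunk
chmod 660 /usr/informix/discos/cheetah2/root_dbs.c1  # usual owner/group informix permissions

# give group permissions on the ONCONFIG so the DBSA tools can work with it
# (assumption: group read/write is enough)
chmod g+rw $INFORMIXDIR/etc/$ONCONFIG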


Now I will create a normal dbspace and a temporary dbspace. Again, the file creation should be done as user informix. After that, let's create the stores_demo database...


cheetah2@PacMan.onlinedomus.net:informix-> id
uid=501(informix) gid=501(informix) groups=501(informix)
cheetah2@PacMan.onlinedomus.net:informix-> pwd
/usr/informix/discos/cheetah2
cheetah2@PacMan.onlinedomus.net:informix-> touch dbs1.c1 temp1.c1
cheetah2@PacMan.onlinedomus.net:informix-> chmod 660 dbs1.c1 temp1.c1
cheetah2@PacMan.onlinedomus.net:informix-> ls -l
total 150152
-rw-rw---- 1 informix informix 0 Jul 16 23:18 dbs1.c1
-rw-rw---- 1 informix informix 153600000 Jul 16 23:17 root_dbs.c1
-rw-rw---- 1 informix informix 0 Jul 16 23:18 temp1.c1
cheetah2@PacMan.onlinedomus.net:informix->


And again as user dbsauser:


cheetah2@PacMan.onlinedomus.net:dbsauser-> id
uid=506(dbsauser) gid=506(ixdbsa) groups=506(ixdbsa)
cheetah2@PacMan.onlinedomus.net:dbsauser-> onspaces -c -d dbs1 -p /usr/informix/discos/cheetah2/dbs1.c1 -s 50000 -o 0
Verifying physical disk space, please wait ...
Space successfully added.

** WARNING ** A level 0 archive of Root DBSpace will need to be done.
cheetah2@PacMan.onlinedomus.net:dbsauser-> onspaces -c -d temp1 -t -p /usr/informix/discos/cheetah2/temp1.c1 -s 10000 -o 0
Verifying physical disk space, please wait ...
Space successfully added.
cheetah2@PacMan.onlinedomus.net:dbsauser-> onstat -d

IBM Informix Dynamic Server Version 11.50.UC1 -- On-Line -- Up 00:09:00 -- 88060 Kbytes

Dbspaces
address number flags fchunk nchunks pgsize flags owner name
477787f8 1 0x60001 1 1 2048 N B informix rootdbs
4890fb40 2 0x60001 2 1 2048 N B informix dbs1
487c5e60 3 0x42001 3 1 2048 N TB informix temp1
3 active, 2047 maximum

Chunks
address chunk/dbs offset size free bpages flags pathname
47778958 1 1 0 75000 19181 PO-B /usr/informix/discos/cheetah2/root_dbs.c1
4890fca0 2 2 0 25000 24947 PO-B /usr/informix/discos/cheetah2/dbs1.c1
4889e778 3 3 0 5000 4947 PO-B /usr/informix/discos/cheetah2/temp1.c1
3 active, 32766 maximum

NOTE: The values in the "size" and "free" columns for DBspace chunks are
displayed in terms of "pgsize" of the DBspace to which they belong.

Expanded chunk capacity mode: always

cheetah2@PacMan.onlinedomus.net:dbsauser->


cheetah2@PacMan.onlinedomus.net:dbsauser-> dbaccessdemo stores_demo -log -dbspace dbs1

DBACCESS Demonstration Database Installation Script

Dropping existing stores_demo database ....

Creating stores_demo database ....


So, we have a functional instance with the usual stores_demo database.
Now we need to start the auditing facility... Let's try running onaudit as user dbsauser:


cheetah2@PacMan.onlinedomus.net:dbsauser-> id
uid=506(dbsauser) gid=506(ixdbsa) groups=506(ixdbsa)
cheetah2@PacMan.onlinedomus.net:dbsauser-> onaudit
Onaudit -- Audit Subsystem Configuration Utility


Must be an AAO or DBSSO to run this program.


So, as expected, the instance administrator, which in this simple case is also the database administrator for stores_demo, can't even run the onaudit utility... Let's change our identity to aaouser and activate the audit facility. But first, let's try to exceed our role... Let's try to stop the instance!


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> onmode -ky

Must be a DBSA to run this program
cheetah2@PacMan.onlinedomus.net:aaouser->


Sorry... we can't... We're not a DBSA... :)

But we should be able to configure auditing... Let's check it:



cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> ls -l $INFORMIXDIR/aaodir
total 8
-rw-rw-r-- 1 informix informix 1120 Apr 25 00:42 adtcfg
-rw-r--r-- 1 informix informix 812 Apr 25 00:42 adtcfg.std
cheetah2@PacMan.onlinedomus.net:aaouser-> ls -lia /usr/informix/audit_logs
total 8
457224 d--xrwx--- 2 informix ixaao 4096 Jul 17 23:25 .
456970 drwxr-xr-x 15 informix informix 4096 Jul 17 00:13 ..


There is no engine-specific ADTCFG file yet, and no files in the /usr/informix/audit_logs directory, which will be the place where the audit trail files are stored.
Let's activate the audit facility:


cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -l 7 -e 0 -p /usr/informix/audit_logs -s 50000
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:aaouser-> ls -l $INFORMIXDIR/aaodir
total 12
-rw-rw-r-- 1 informix informix 1120 Apr 25 00:42 adtcfg
-rw-rw-r-- 1 informix informix 1216 Jul 18 00:04 adtcfg.0
-rw-r--r-- 1 informix informix 812 Apr 25 00:42 adtcfg.std

So now, we have the adtcfg.0 (0 is the instance SERVERNUM).
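If you're not sure which suffix to expect, a quick way to confirm the SERVERNUM is to look at the active configuration (run as the DBSA):


# dump the active configuration and filter the SERVERNUM line
onstat -c | grep SERVERNUM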


cheetah2@PacMan.onlinedomus.net:aaouser-> ls -lia /usr/informix/audit_logs
total 8
457224 d--xrwx--- 2 informix ixaao 4096 Jul 18 00:04 .
456970 drwxr-xr-x 15 informix informix 4096 Jul 17 00:13 ..
459479 -rw-rw---- 1 informix ixaao 0 Jul 18 00:04 cheetah2.0


And we already have an audit trail file. The suffix is a number which is incremented whenever we switch to a new file (using onaudit -n) or when a file reaches the size we specified.


cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -c
Onaudit -- Audit Subsystem Configuration Utility

Current audit system configuration:
ADTMODE = 7
ADTERR = 0
ADTPATH = /usr/informix/audit_logs
ADTSIZE = 50000
Audit file = 0
cheetah2@PacMan.onlinedomus.net:aaouser->


And we can see the current audit configuration. Notice that because we used audit level 7, DBSA and DBSSO actions are automatically recorded in the audit log. Let's make a test with dbsauser: a simple connect to sysmaster:


cheetah2@PacMan.onlinedomus.net:dbsauser-> id
uid=506(dbsauser) gid=506(ixdbsa) groups=506(ixdbsa)
cheetah2@PacMan.onlinedomus.net:dbsauser-> dbaccess sysmaster <<EOF
select count(*) from sysdatabases;
> EOF

Database selected.



(count(*))

4

1 row(s) retrieved.



Database closed.

cheetah2@PacMan.onlinedomus.net:dbsauser->


And now, let's check the contents of the audit trail file:


cheetah2@PacMan.onlinedomus.net:dbsauser-> id
uid=506(dbsauser) gid=506(ixdbsa) groups=506(ixdbsa)
cheetah2@PacMan.onlinedomus.net:dbsauser-> ls -lia /usr/informix/audit_logs/
ls: /usr/informix/audit_logs/: Permission denied
cheetah2@PacMan.onlinedomus.net:dbsauser-> cat /usr/informix/audit_logs/cheetah2.0
cat: /usr/informix/audit_logs/cheetah2.0: Permission denied
cheetah2@PacMan.onlinedomus.net:dbsauser->


Hmmm... what is a DBSA doing, trying to check something only the AAO should see?! :)
So, let's now do it with the correct user:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> ls -lia /usr/informix/audit_logs/
total 12
457224 d--xrwx--- 2 informix ixaao 4096 Jul 18 00:04 .
456970 drwxr-xr-x 15 informix informix 4096 Jul 17 00:13 ..
459479 -rw-rw---- 1 informix ixaao 937 Jul 18 00:14 cheetah2.0
cheetah2@PacMan.onlinedomus.net:aaouser-> cat /usr/informix/audit_logs/cheetah2.0
ONLN|2008-07-18 00:14:49.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:STSN
ONLN|2008-07-18 00:14:49.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:OPDB:sysmaster:0:-
ONLN|2008-07-18 00:14:49.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:ACTB:sysmaster:informix:sysdatabases:247
ONLN|2008-07-18 00:14:49.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:ACTB:sysmaster:informix:sysdbspartn:100
ONLN|2008-07-18 00:14:50.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:RDRW:sysmaster:100:1048578:513
ONLN|2008-07-18 00:14:50.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:RDRW:sysmaster:100:1048578:514
ONLN|2008-07-18 00:14:50.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:RDRW:sysmaster:100:1048578:515
ONLN|2008-07-18 00:14:50.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:RDRW:sysmaster:100:1048578:516
ONLN|2008-07-18 00:14:50.000|PacMan.onlinedomus.net|4386|cheetah2|dbsauser|0:CLDB:sysmaster
cheetah2@PacMan.onlinedomus.net:aaouser->


Ah... OK, now we have the audited data... Let's go through it, mnemonic by mnemonic:
  1. STSN
    For STart SessioN, showing timestamp, instance, user, PID...
  2. OPDB
    For OPen DataBase, showing timestamp, instance, user, database, PID, return code...
  3. ACTB
    For ACcess TaBle, showing the table, the table owner/schema, the database, etc.
  4. RDRW
    For ReaD RoW, showing the timestamp, instance, database, table id and rowid
  5. CLDB
    For CLose DataBase, showing the timestamp, instance, PID and database...
This simple test shows you the potential that RDRW has to hurt your instance's performance... A full table scan on a big table is enough to generate thousands or even millions of audit trail entries.
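If you really need row-level auditing, one way to limit the exposure is to target a specific account with an individual mask instead of putting RDRW in _default (as far as I know, an individual mask replaces _default for that user). A minimal sketch, run as the DBSSO; "suspectuser" is just a hypothetical account:


# as the DBSSO (ssouser): audit the row-level events only for this hypothetical account
onaudit -a -u suspectuser -e RDRW,INRW,DLRW
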

Now, let's create the _default global mask. I'll use user informix to test it.
To start, let's audit the same events we saw in the previous example; but first, let's switch the audit log file:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> ls -ltr /usr/informix/audit_logs/
total 0
-rw-rw---- 1 informix ixaao 937 Jul 20 17:37 cheetah2.0
cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -n
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:aaouser-> ls -ltr /usr/informix/audit_logs/
total 0
-rw-rw---- 1 informix ixaao 937 Jul 20 17:37 cheetah2.0
-rw-rw---- 1 informix ixaao 0 Jul 20 17:40 cheetah2.1
cheetah2@PacMan.onlinedomus.net:aaouser->


And now, as user ssouser, let's create the _default mask:


cheetah2@PacMan.onlinedomus.net:ssouser-> id
uid=507(ssouser) gid=508(ixdbsso) groups=508(ixdbsso)
cheetah2@PacMan.onlinedomus.net:ssouser-> ls -ltr /usr/informix/audit_logs/
ls: /usr/informix/audit_logs/: Permission denied
cheetah2@PacMan.onlinedomus.net:ssouser-> id
uid=507(ssouser) gid=508(ixdbsso) groups=508(ixdbsso)
cheetah2@PacMan.onlinedomus.net:ssouser-> onaudit -a -u _default -e STSN,OPDB,CLDB,ACTB,RDRW
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:ssouser-> onaudit -o -y
Onaudit -- Audit Subsystem Configuration Utility

_default - ACTB,CLDB,OPDB,RDRW,STSN
cheetah2@PacMan.onlinedomus.net:ssouser->


And now, using another user, informix, let's do the same as we did before and check the audit trail log:


cheetah2@PacMan.onlinedomus.net:informix-> id
uid=501(informix) gid=501(informix) groups=501(informix)
cheetah2@PacMan.onlinedomus.net:informix-> dbaccess sysmaster <<EOF
select count(*) from sysdatabases;
> EOF

Database selected.



(count(*))

5

1 row(s) retrieved.



Database closed.

cheetah2@PacMan.onlinedomus.net:informix->


And now, the content of the audit trail:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> ls -ltr /usr/informix/audit_logs/
total 0
-rw-rw---- 1 informix ixaao 0 Jul 20 17:37 cheetah2.0
cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -n
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:aaouser-> ls -ltr /usr/informix/audit_logs/
total 0
-rw-rw---- 1 informix ixaao 0 Jul 20 17:37 cheetah2.0
-rw-rw---- 1 informix ixaao 0 Jul 20 17:40 cheetah2.1
cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> cat /usr/informix/audit_logs/cheetah2.1
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:STSN
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:OPDB:sysmaster:0:-
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:ACTB:sysmaster:informix:sysadtinfo:214
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:RDRW:sysmaster:214:1025:0
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:ACTB:sysmaster:informix:sysaudit:261
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:RDRW:sysmaster:261:1048836:513
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:ACTB:sysmaster:informix:syscrtadt:215
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:LSAM:_default
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:INRW:sysmaster:215:1026:6913
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:ACTB:sysmaster:informix:syscrtadt:215
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|100:LSAM:_default
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:INRW:sysmaster:215:1026:6913
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0:CLDB:sysmaster
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:STSN
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:OPDB:sysmaster:0:-
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:ACTB:sysmaster:informix:sysdatabases:247
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:ACTB:sysmaster:informix:sysdbspartn:100
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:RDRW:sysmaster:100:1048578:513
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:RDRW:sysmaster:100:1048578:514
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:RDRW:sysmaster:100:1048578:515
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:RDRW:sysmaster:100:1048578:516
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:RDRW:sysmaster:100:1048578:517
ONLN|2008-07-20 17:48:24.000|PacMan.onlinedomus.net|8255|cheetah2|informix|0:CLDB:sysmaster
cheetah2@PacMan.onlinedomus.net:aaouser->


You may notice that there are entries referring to two Process IDs. The first (8059) refers to the onaudit command we used to define the _default mask. The other (8255) refers to the dbaccess process and is similar to the first test we did. So, any change to the audit masks, which must be done by a DBSSO user, is automatically audited if we choose an appropriate audit level (as we did).

Again we see the RDRW mnemonic, which can cause serious performance issues.
This is a nice opportunity to show the effect of the _exclude mask. Let's include RDRW in this mask. But first, let's switch the audit trail log file again:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -n
Onaudit -- Audit Subsystem Configuration Utility

Now, with ssouser, let's add the _exclude audit mask:


cheetah2@PacMan.onlinedomus.net:ssouser-> id
uid=507(ssouser) gid=508(ixdbsso) groups=508(ixdbsso)
cheetah2@PacMan.onlinedomus.net:ssouser-> onaudit -a -u _exclude -e RDRW
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:ssouser->


Since we're using audit level 7, this action should have been logged. Let's see how:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> cat /usr/informix/audit_logs/cheetah2.2
ONLN|2008-07-20 18:21:38.000|PacMan.onlinedomus.net|8478|cheetah2|aaouser|0:ACTB:sysmaster:informix:syscrtadt:215
ONLN|2008-07-20 18:21:38.000|PacMan.onlinedomus.net|8478|cheetah2|aaouser|0:CLDB:sysmaster
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:STSN
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:OPDB:sysmaster:0:-
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:ACTB:sysmaster:informix:sysadtinfo:214
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:RDRW:sysmaster:214:1025:0
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:ACTB:sysmaster:informix:sysaudit:261
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:ACTB:sysmaster:informix:syscrtadt:215
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|100:LSAM:_exclude
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:INRW:sysmaster:215:1026:6913
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:ACTB:sysmaster:informix:sysaudit:261
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:INRW:sysmaster:261:1048836:514
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:ACTB:sysmaster:informix:syscrtadt:215
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:CRAM:_exclude
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:INRW:sysmaster:215:1026:6913
ONLN|2008-07-20 18:22:42.000|PacMan.onlinedomus.net|8484|cheetah2|ssouser|0:CLDB:sysmaster



So the DBSSO action was fully logged. Now let's switch to another audit log file and repeat the same dbaccess test we did previously with user informix:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> onaudit -n
Onaudit -- Audit Subsystem Configuration Utility

cheetah2@PacMan.onlinedomus.net:aaouser->



And let's see the file contents:


ONLN|2008-07-20 18:23:00.000|PacMan.onlinedomus.net|8486|cheetah2|informix|0:STSN
ONLN|2008-07-20 18:23:00.000|PacMan.onlinedomus.net|8486|cheetah2|informix|0:OPDB:sysmaster:0:-
ONLN|2008-07-20 18:23:00.000|PacMan.onlinedomus.net|8486|cheetah2|informix|0:ACTB:sysmaster:informix:sysdatabases:247
ONLN|2008-07-20 18:23:00.000|PacMan.onlinedomus.net|8486|cheetah2|informix|0:ACTB:sysmaster:informix:sysdbspartn:100
ONLN|2008-07-20 18:23:00.000|PacMan.onlinedomus.net|8486|cheetah2|informix|0:CLDB:sysmaster
cheetah2@PacMan.onlinedomus.net:aaouser->


So, it's similar to the previous examples, but without the RDRW events.

These examples should give you a basic feel for how to set up and configure the auditing facility. Next, I'll show how to process the audit files and how we can use SQL to do it.

Using SQL to analyze your auditing information

I have created an SQL file with the CREATE TABLE above, and I will obtain a file in unload format (pipe-separated) with the contents of the audit logs we generated. To do this I will use the onshowaudit utility:


cheetah2@PacMan.onlinedomus.net:aaouser-> id
uid=508(aaouser) gid=507(ixaao) groups=507(ixaao)
cheetah2@PacMan.onlinedomus.net:aaouser-> onshowaudit -l -I -n 0 >audit_logs.unl

Program Over.

cheetah2@PacMan.onlinedomus.net:aaouser-> head -30 audit_logs.unl

ONSHOWAUDIT Secure Audit Utility
INFORMIX-SQL Version 11.50.UC1
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|STSN||||||||||
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|OPDB|sysmaster|||||||0|-|
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|ACTB|sysmaster|214|||||informix||sysadtinfo|
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|RDRW|sysmaster|214|||1025|0||||
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|ACTB|sysmaster|261|||||informix||sysaudit|
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|RDRW|sysmaster|261|||1048836|513||||
ONLN|2008-07-20 17:44:15.000|PacMan.onlinedomus.net|8059|cheetah2|ssouser|0|ACTB|sysmaster|215|||||informix||syscrtadt|
cheetah2@PacMan.onlinedomus.net:aaouser->


As you can see there are 3 annoying header lines at the top. I erased them with vi.
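If you don't want to do that by hand every time, something like this does the same (assuming the banner is always just those first three lines):


# drop the first three lines (the blank line plus the two onshowaudit banner lines)
tail -n +4 audit_logs.unl > audit_logs.tmp && mv audit_logs.tmp audit_logs.unl

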
So now we're ready to load this into the audit table created with the SQL above:


cheetah2@PacMan.onlinedomus.net:aaouser-> dbaccess stores_demo frag_logs.sql

Database selected.


Table created.


Database closed.

cheetah2@PacMan.onlinedomus.net:aaouser-> dbaccess stores_demo -

Database selected.

> load from audit_logs.unl insert into frag_logs;

58 row(s) loaded.

> cheetah2@PacMan.onlinedomus.net:aaouser->



So now you have a table with all the info collected by the auditing facility. You can then explore it using simple SQL: search for specific entries or simply build reports on top of it.
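For instance, something along these lines; the column names are hypothetical, so adjust them to your own CREATE TABLE:


-- events per user and per mnemonic (column names are hypothetical; adjust them
-- to the audit table created above)
SELECT adt_user, adt_event, COUNT(*) AS total
FROM frag_logs
GROUP BY adt_user, adt_event
ORDER BY 3 DESC;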

Hopefully this will be enough to get you started with the auditing facility, but I'd like to point out several possible problems (with solutions) and a few other considerations.


Possible issues:
  1. Versions 9.x, 10.x and 11.x may raise an error like:

    Onaudit -- Audit Subsystem Configuration Utility

    Cannot set audit configuration parameters.
    Problem may be caused by lack of permission, bad
    parameter values, or bad parameter combinations.

    This error usually relates to the fact that the oninit processes are not running as user root.
    They did run as root in versions 7.x, but that changed in 9.x. If you are running 9.x or 10.x you can set an environment variable called NONROOT_OFF to 1:

    NONROOT_OFF=1
    export NONROOT_OFF

    before starting the instance. This forces the oninit processes to run as root. The need for root derives from the fact that the engine must create the audit log files with a specific user/group, and root privileges are needed for that.
    At least in IDS 11.50 this doesn't happen if you start your instance using a DBSA user.
  2. In some IDS versions you may need to explicitly grant privileges on the sysmaster tables to the DBSSO and/or AAO users. The tables are sysmaster:sysadtinfo and sysmaster:sysaudit (see the sketch after this list).
  3. In some IDS versions the audit facility will change/create the adtcfg.<servernum> file, but the instance only looks at the adtcfg file on startup. Confirm that your version doesn't have this issue; if it does, copy the file generated by the auditing facility over the one used at instance startup, and remember to do this after every change to the auditing configuration.
Note that on version 11.50, which I used to generate the examples in this article, I didn't notice any of the above issues. I could reproduce the first one only when I started the engine with user informix, which I hadn't included in the group of DBSAs.
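Regarding issue 2, the grants would look roughly like this (just a sketch, run while connected to sysmaster as a user that can grant on those tables, typically informix):


-- connected to sysmaster, as a user that can grant on these tables
GRANT SELECT ON sysadtinfo TO ssouser;
GRANT SELECT ON sysaudit TO ssouser;
GRANT SELECT ON sysadtinfo TO aaouser;
GRANT SELECT ON sysaudit TO aaouser;
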

Final recommendations
  • Test everything in a test environment using the same OS and IDS version you use in production. Some of the changes may cause your system to become unusable or unstable if something goes wrong
  • If you think about using the auditing facility for compliance reasons, don't do it without creating full role separation
  • If you implement full role separation, be sure that the different teams and/or users are able to work together, and that each one understands the impact of their activity on the overall use of your database instance(s)
  • Consider carefully the setting for ADTERR. Informix lets you choose what to do when an event cannot be written to the audit log, and this setting can determine whether the instance stays available
  • Think carefully about how you will use the audit log information. If you intend to use SQL to process it, make sure the information can only be managed by the AAO group users. Informix does its part to keep the audit log information secure, so you must preserve that security after you extract and process the info. Consider using a different instance, and possibly a different machine, to process this data
  • Understand that the auditing information can also be useful for database administration purposes. To give you an example, I help manage a reasonably sized system, and the audit facility was the best way to measure things like:


    • We have 11 connections per second
    • We execute more than 90 stored procedures per second
    • We create more than 6 temporary tables per second

  • Think very carefully about activating action mnemonics like RDRW, INRW and DLRW. They can have a dramatic impact on your instance's performance. It really depends on which users have masks with these mnemonics active, and how many DML operations they do.

References: