One of the challenges of IT security monitoring is figuring out what to do with the mountains of data that can easily be gathered. Once you have overcome the technical and procedural challenges of collecting that data at a central point, you then have to normalize it.
I have found the best way to do this is to organize your data by source (such as antivirus software logs) and then build a database table for each source. Use common column names such as source_host, dest_host, and date wherever possible. It will take some creative scripting in Perl, Python, or a similar language to slice the raw logs into a format your database can use, but it will be worth it in the long run.

Once the data has been properly sliced and diced, generating reports from your freshly normalized data should be fairly trivial. For example, you could display a list of all machines that have detected a malware threat in the past X days. Once the data is in there and it makes sense, the possibilities are endless.
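As a minimal sketch of the whole pipeline, here is one way it might look in Python with SQLite: a regex slices hypothetical antivirus log lines into the common columns, the rows go into a per-source table, and a single query produces the "machines with a detection since date X" report. The log format, table name, and field names here are invented for illustration; real antivirus logs vary by vendor, and your column set and database will differ.

```python
import re
import sqlite3

# Hypothetical raw antivirus log lines; real formats vary by vendor.
RAW_LOGS = [
    "2024-05-01 10:32:11 host=ws-041 threat=Trojan.Agent action=quarantined",
    "2024-05-02 14:05:09 host=ws-017 threat=EICAR-Test action=deleted",
    "2023-11-20 09:12:44 host=ws-008 threat=Adware.Foo action=quarantined",
]

# Slice each line into the common columns (date, source_host, threat).
LINE_RE = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) \d{2}:\d{2}:\d{2} "
    r"host=(?P<source_host>\S+) threat=(?P<threat>\S+)"
)

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE antivirus (
           source_host TEXT,
           threat      TEXT,
           date        TEXT   -- ISO dates sort correctly as text
       )"""
)

for line in RAW_LOGS:
    m = LINE_RE.match(line)
    if m:
        conn.execute(
            "INSERT INTO antivirus (source_host, threat, date) VALUES (?, ?, ?)",
            (m["source_host"], m["threat"], m["date"]),
        )

# Report: all machines with a malware detection on or after a cutoff date.
cutoff = "2024-01-01"  # fixed cutoff for the example; derive from "today - X days" in practice
rows = conn.execute(
    "SELECT DISTINCT source_host FROM antivirus WHERE date >= ? ORDER BY source_host",
    (cutoff,),
).fetchall()
print([r[0] for r in rows])  # ws-008's 2023 detection falls outside the window
```

Storing dates in ISO format (YYYY-MM-DD) is what makes the report query a simple string comparison; with one table per source sharing these column names, the same query pattern works against any of them.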