At a special panel at RSA Security’s gargantuan conference this year, Jerry Sto. Tomas, director of global information security at Allergan, said, “I don’t call it Big Data; I call it garbage data,” when discussing the challenges security professionals face in gathering useful information. I felt vindicated, given the work I’ve been doing that challenges the current trend toward log and event gluttony. News flash: information technology groups aren’t failing in their attempts to be proactive because they aren’t collecting enough events. Most organizations have enough SNMP, syslog, and NetFlow data to circle the planet a few times.

I believe they’re failing because most of the applications on the market for correlating that data are so complicated to use that they practically require a PhD in physics. Then there’s the near-impossible task of finding the performance “sweet spot” of these applications. I recently needed to perform some log correlation for an investigation, and after carefully constructing the query in our event reporting and analysis tool, I still hadn’t received the full results after four hours. As I yearned for the bygone days of using the grep/sed/awk triple threat on some text files, I thought, “Since when do you need a supercomputer for log and event correlation?”
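To make that triple threat concrete, here is a minimal sketch of the kind of correlation I have in mind, assuming a standard Linux auth log at /var/log/auth.log and OpenSSH-style “Failed password” messages (your file path and message format may differ):

    # Count failed SSH logins per source IP, busiest offenders first
    grep "Failed password" /var/log/auth.log \
      | sed 's/.*from \([0-9.]*\) port.*/\1/' \
      | sort | uniq -c | sort -rn | head

One grep to pull the matching events, one sed to strip each line down to the source address, and sort/uniq to tally them. No appliance, no query builder, and it finishes in seconds on a file that would take most correlation tools considerably longer to chew through.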