Nowadays it seems that everyone, from businesses to product vendors, is using Apache Hadoop to store critical data. Hadoop is regarded by many as the miracle solution to big data challenges, and it's not uncommon to find Hadoop clusters storing petabytes of information, making them a target for cyber criminals. The world is adopting and supporting Apache Hadoop – so why hasn't the field of forensics? The sheer volume of data and the distributed architecture of Hadoop frustrate traditional forensic approaches, and Hadoop is a challenge that investigators will face on an increasingly frequent basis. Get ready and learn how to tackle Apache Hadoop, the elephant in the room! This session will outline techniques and tools that can be used to investigate incidents on Apache Hadoop and reduce huge data sets into manageable artifacts that can be analyzed in support of a case.
Kevvie Fowler (@kevviefowler), National Leader, Cyber Response Services, KPMG Canada

Kevvie Fowler is a partner and national cyber response leader for KPMG Canada and has over 19 years of IT security and forensics experience. Kevvie assists clients in identifying and protecting critical data and in proactively preparing for, responding to, and recovering from incidents in a manner that minimizes impact and interruption to their business.