If you read the details of the recent Sony attack, one striking revelation is that over 100 terabytes of information was stolen – movies, scripts, data, email archives, and apparently anything else in sight. The lesson here is that if this can be reconstructed forensically after the fact, it likely could have been detected while it was happening. 100TB is not a blip on the monitoring screen, especially when the data is moving from inside the corporate network to someplace outside it. Does your organization have the tools, the people, the expertise, and the priority to make sure you are watching for these types of events? Are you segregating the important data into a more secured area – one that might even require manual approval before any transfer of the data is allowed?
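The detection idea above can be sketched in a few lines: aggregate outbound byte counts per internal host and flag any host whose cumulative egress crosses a threshold. This is a minimal illustration, not a production monitor – the threshold, the flow-record format, and the host addresses are all assumptions for the example.

```python
from collections import defaultdict

# Illustrative threshold: 500 GB of outbound traffic per monitoring window.
EGRESS_THRESHOLD_BYTES = 500 * 1024**3

def detect_bulk_egress(flow_records, threshold=EGRESS_THRESHOLD_BYTES):
    """flow_records: iterable of (src_host, dst_is_external, bytes_sent)."""
    totals = defaultdict(int)
    alerts = []
    for src, external, nbytes in flow_records:
        if not external:
            continue  # only count traffic leaving the corporate network
        totals[src] += nbytes
        if totals[src] >= threshold and src not in alerts:
            alerts.append(src)  # escalate this host for investigation
    return alerts

# Simulated flows: one host steadily pushing data out, another behaving normally.
flows = [
    ("10.0.0.5", True, 200 * 1024**3),
    ("10.0.0.9", True, 1 * 1024**3),
    ("10.0.0.5", False, 900 * 1024**3),  # internal copy, ignored
    ("10.0.0.5", True, 350 * 1024**3),   # pushes the host over threshold
]
print(detect_bulk_egress(flows))  # → ['10.0.0.5']
```

Real deployments would feed this from NetFlow or firewall logs and alert in near real time, but the point stands: 100TB leaving the network is loud if anyone is listening.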
As we’re all flooded with an arsenal of tools to help prevent and respond to attacks, it is important to have a clear set of priorities and a common-sense approach to how business is done. For example, if you have to allow outside partners to copy large chunks of information, create a download area in a cloud repository and stop allowing ad hoc access to your internal network. Why take the chance when a segmented approach leaves you less vulnerable and better controlled?
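The segmented approach can be expressed as a simple transfer policy: partner requests never touch internal shares directly; sensitive data is staged into the download area only after explicit human approval. Everything below – the share names, the staging location, and the approval mechanism – is a hypothetical sketch of the policy, not a real access-control system.

```python
# Assumed names for illustration only.
DOWNLOAD_AREA = "s3://partner-downloads"   # segregated cloud staging area
INTERNAL_SHARES = {r"\\fileserver\engineering", r"\\fileserver\finance"}

def route_partner_request(requested_path, approved_by=None):
    """Return (allowed, destination) for a partner transfer request."""
    if requested_path in INTERNAL_SHARES:
        # Partners never get direct internal access; data is staged into the
        # download area instead, and only after a named human approves it.
        if approved_by is None:
            return (False, "denied: manual approval required to stage data")
        return (True, DOWNLOAD_AREA)
    return (False, "denied: unknown source path")

print(route_partner_request(r"\\fileserver\finance"))
# → (False, 'denied: manual approval required to stage data')
print(route_partner_request(r"\\fileserver\finance", approved_by="security-team"))
# → (True, 's3://partner-downloads')
```

The design choice is the point: the default answer is "no," and the only "yes" path runs through a segregated area with a person in the loop.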