Apr 24, 2009 (08:04 PM EDT)
InformationWeek Analytics: Endpoint Security And DLP
Everyone's talking about the insider threat. But protecting data can't supersede the requirement to give users the access they need to do their jobs--otherwise, soon you'll have neither business data nor employees to worry about.
Striking a balance between access and protection isn't easy, however, judging by our InformationWeek Analytics/DarkReading.com Endpoint Security Survey of 384 business technology pros. In that poll, 43% classify their organizations as "trusting," allowing data to be copied to USB drives or other devices with no restrictions or protective measures.
Still, IT is aware of the need to move from a stance of securing endpoints to one that assumes laptops and smartphones will be lost, good employees will go bad, and virtual machines will be compromised. Instead of focusing on endpoints, let fortifications follow the data: Decide what must be protected, find out everywhere it lives, and lock it down against both inside and outside threats, whether via encryption, multi-tiered security suites, or new technologies like data loss prevention.
DLP suites combine network scanning and host-based tools to collect, categorize, and protect corporate intellectual property. These products can maintain an archive of data and documents, along with associated permissions by group, individual, and other policies. They then actively scan internal networks and external connections looking for anomalies. This takes data protection beyond perimeter or endpoint protection--DLP facilitates internal safety checks, allowing "eyes only" data to remain eyes only and minimizing the risk that sensitive data will be viewed by the wrong folks, even in-house.
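The core of that content-categorization step is pattern-based inspection. As a rough illustration only--commercial DLP suites layer keyword dictionaries, document fingerprints, and statistical analysis on top of this--a minimal classifier might scan text for regular-expression patterns that suggest sensitive data (the pattern names and examples below are hypothetical):

```python
import re

# Illustrative patterns only; real DLP products use far richer
# detection techniques than a pair of regular expressions.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text):
    """Return the set of sensitive-data categories found in text."""
    return {name for name, pat in PATTERNS.items() if pat.search(text)}

print(classify("Employee SSN on file: 078-05-1120"))  # {'ssn'}
```

A real deployment would run this kind of inspection against both data at rest (file shares, archives) and data in motion (e-mail, Web uploads), then apply the permissions and policies described above.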
A successful DLP implementation involves a number of steps:
• Identify known content risks. Secure the data you know about, then hunt down the forgotten, misplaced, or illicit stores--most organizations find a startling number of them.
• Analyze network ports and protocols to look for both expected and devious behavior on nonstandard ports.
• Create point-in-time content signatures and filter rules, and establish a baseline for sensitive data stores. These will be used to monitor traffic flow and find both complete files and snippets of sensitive data across file types and transmission methods.
• Perform content inspection of all traffic using signature and filter rule sets, and set a policy for notification, blocking, and enforcement.
• Enable root-cause and historical data analysis; policies and rule sets must be modified as new threats or risks are identified.
• Consider a phased approach. "Passive" network-based tools to analyze data traffic can be implemented without affecting endpoints. Agent-based or full client apps can then be incorporated into standard image builds as endpoint policies and rolled out later.
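The signature-and-inspection steps above often rest on content fingerprinting: hash overlapping chunks ("shingles") of a protected document to build a baseline, then check whether any chunk of outbound traffic matches. The sketch below is a simplified assumption about how such fingerprinting works, not any vendor's actual algorithm; function names and the sample text are invented for illustration:

```python
import hashlib

def shingles(text, k=8):
    """Overlapping k-word windows ("shingles") of a document."""
    words = text.split()
    return [" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))]

def fingerprint(text, k=8):
    """Hash each shingle; the set acts as a point-in-time content signature."""
    return {hashlib.sha256(s.encode()).hexdigest() for s in shingles(text, k)}

def contains_snippet(baseline, outbound, k=8):
    """True if any shingle of the outbound text matches the baseline."""
    return bool(baseline & fingerprint(outbound, k))

secret = ("the merger agreement values the target at twelve "
          "dollars per share effective next quarter")
baseline = fingerprint(secret)

# A partial leak still matches, because at least one 8-word shingle survives:
leak = "fyi the merger agreement values the target at twelve dollars per share"
print(contains_snippet(baseline, leak))  # True
```

Matching on shingles rather than whole-file hashes is what lets inspection catch snippets--a paragraph pasted into an e-mail--and not just verbatim copies of a protected document.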
Joe Hernick is an industry analyst and former Fortune 100 IT executive.