TORONTO – Security and compliance are almost impossible to achieve since the current security paradigm is inefficient and ineffective, said speakers at Info-Tech Live Wednesday.
Major security vendors have recognized that point security solutions are not working. Threats continue to mount, while compliance processes are time-consuming and arduous, analysts said.
Info-Tech Research Group has come up with an “adaptive security” model that stresses the use of tightly integrated security tools, not just within platforms but between platforms. This model includes processor-level encryption and authentication, the use of virtualization, and centrally managed intelligent software agents.
IT security is made up of a number of solutions that are not integrated with one another, and that creates weak points, said James Quin, senior research analyst with Info-Tech. Today’s “reactive” desktop is made up of the operating system, applications and hardware. “Because none of these are truly, fully integrated, we create gaps between them,” he said. And hackers force their way into those gaps.
We also push a lot of security management responsibility onto end-users, such as deciding whether to apply a patch or open an attachment. Training end-users is all well and good, he said, but there should be someone within an organization who has assigned responsibility for security management.
There are a lot of horror stories out there – a high number of security breaches among high-profile organizations. ID theft is of growing concern, said Ross Armstrong, senior research analyst with Info-Tech, and 70 per cent of breaches – that we know of – are internal.
So what can organizations do? The adaptive desktop includes secure platforms made up of hardware-assisted device virtualization and Trusted Platform Module (TPM) chips. Virtualization allows for the creation of a dedicated security appliance within each platform and for the detaching of the user environment from the platform hardware, according to Info-Tech. TPMs allow for the encrypted storage of platform and user environment specifications in a protected and tamper-proof space.
Virtualization software is available from the likes of Microsoft, VMware and Xen, and TPM chips are already shipping in several devices – however, they require appropriate OS support, such as Vista, to take advantage of their capabilities.
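Whether a given machine can participate in such a TPM-backed scheme depends first on the chip being present and enabled in firmware. As a minimal illustration (not something presented at the event), a script on a modern Linux system can check whether the kernel exposes a TPM device; the sysfs path is standard, but the helper function is hypothetical:

```python
import os

def tpm_present(sysfs_root="/sys/class/tpm"):
    """Return True if the kernel exposes at least one TPM device.

    On Linux, the TPM driver creates /sys/class/tpm/tpm0 (and so on)
    when a TPM chip is detected and enabled in firmware. On machines
    without a TPM, or with the chip disabled, the directory is empty
    or absent.
    """
    if not os.path.isdir(sysfs_root):
        return False
    return any(name.startswith("tpm") for name in os.listdir(sysfs_root))

if __name__ == "__main__":
    print("TPM detected:", tpm_present())
```

Presence alone is not sufficient – the OS must also ship a software stack that drives the chip, which is the point the analysts made about needing releases such as Vista.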
It will require legwork to build this infrastructure, said Quin, but technologies are coming down the pipe, and the end goal is a ubiquitous, holistic infrastructure. But it will take three to five years to get there.
Organizations should acknowledge that current security approaches aren’t good enough, said Quin, begin planning the move to an adaptive security solution, and ensure proper long-term alignment of security priorities with corporate goals and objectives. This applies to businesses of any size, large or small, public or private, he said, though larger enterprises will likely be early adopters.
Also during Info-Tech Live, Matt Brudzynski, senior research analyst with Info-Tech, discussed the business case for virtualization, based on in-depth interviews with 30 American, Canadian and UK-based firms.
Mid-market enterprises are in a sweet spot to achieve cost savings, he said. While virtualization is a technology that everyone can benefit from, he added, mid-market enterprises with 100-5,000 employees and 15-plus servers would get the best bang for their buck. Large companies tend to dabble in virtualization, while small and mid-size enterprises look at every server they have.
The business case for virtualization is rooted in hardware savings: from 40-75 per cent acquisition cost savings and 25-50 per cent recurring cost savings. Success, however, depends on company characteristics, IT department characteristics and server infrastructure. “There are server huggers out there,” he said.
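Those percentage ranges translate into a simple back-of-the-envelope model. In the sketch below, only the 40-75 per cent and 25-50 per cent figures come from the talk; the fleet size and dollar amounts are hypothetical, and the conservative ends of each range are used as defaults:

```python
def consolidation_savings(physical_servers, cost_per_server,
                          annual_opex_per_server,
                          acquisition_saving=0.40, recurring_saving=0.25):
    """Estimate savings from consolidating servers via virtualization.

    acquisition_saving and recurring_saving default to the conservative
    ends of the 40-75% acquisition and 25-50% recurring ranges cited
    in the talk.
    """
    capex = physical_servers * cost_per_server
    opex = physical_servers * annual_opex_per_server
    return {
        "capex_saved": capex * acquisition_saving,
        "annual_opex_saved": opex * recurring_saving,
    }

# Hypothetical mid-market fleet: 20 servers at $5,000 each,
# costing $1,200 a year apiece to run.
print(consolidation_savings(20, 5000, 1200))
# Conservative end: $40,000 in acquisition costs, $6,000 a year recurring.
```

Even at the low end of both ranges, the hardware case is straightforward – which is why the caveats that follow (company characteristics, “server huggers,” infrastructure) matter more than the arithmetic.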
You will most likely hit an I/O bottleneck, he said, and vendor support may be an issue – beware of vendor licensing arrangements and maintenance contracts.
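The I/O warning can be quantified with equally simple arithmetic: sum the storage demand of the guests being stacked onto one host and compare it to what that host’s disks can actually deliver. The IOPS figures below are hypothetical, used only to show the shape of the calculation:

```python
def io_headroom(guest_iops, host_iops_capacity):
    """Return the remaining IOPS headroom after stacking guests on one host.

    A negative result means the consolidated guests together demand
    more I/O than the host's storage subsystem can serve - the
    bottleneck the analyst warned about.
    """
    return host_iops_capacity - sum(guest_iops)

# Hypothetical: eight guests averaging 150 IOPS each, on a host whose
# local disks sustain roughly 1,000 IOPS.
print(io_headroom([150] * 8, 1000))  # -200: oversubscribed
```

CPU and memory consolidate easily; disk I/O is usually the first resource to run out, so it is the one worth modelling before, not after, migration.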