Securing the Future of e-Science
Network access control holds the key
The advent of Web 2.0 and its key principles, such as the Web as a platform, data as the driving force, network effects created by an architecture of participation, and extensive use of folksonomies (in the form of tags, or annotations, as they are known within the scientific community), is driving the scientific community to embrace a research establishment based on e-Science. The aim of the e-Science transformation is to use IT to enable collaborative research by virtual or geographically dispersed teams.
In addition, the Internet has ushered in a new age of data sharing and has spurred networking and remote computing capabilities in virtual instrumentation that are simply not possible with stand-alone, proprietary alternatives. Virtual instrumentation takes advantage of the Internet so that scientists can publish data to the Web directly from the measurement and control device, and read that data on a handheld personal digital assistant or even a mobile phone. It also lets scientists control instruments remotely and collaborate on projects with colleagues in other offices or countries.
Major security issue
Controlling network access by virtual instruments and other endpoint devices, however, constitutes a major security issue, and the very openness that allows all this collaborative progress creates additional security challenges. Advances ranging from innovative instrumentation to effective collaboration to the ability to move large data files are speeding the path to discovery and development in the sciences. As with all advances that use the Internet as the transmission medium, though, they come at the cost of increased complexity on the security front.
For example, universities that cooperate on scientific research are frequent targets for hackers because of the open nature of their mission. Large groups of individuals need to access systems from all over the world, so universities commonly run portions of their networks almost like the Internet itself, wide open to both invited and uninvited guests.
While access to high-quality information encourages openness and the exchange of material, no one wants unauthorized parties on their network; after all, intellectual property has real and significant monetary value. This, in turn, creates a responsibility to provide effective safeguards and to assure any scientists still resistant to e-Science that the desired level of security has been achieved.
Controlling access
Controlling user and device access to a network, and ensuring that authorized devices aren't creating security risks, is a process known as network access control (NAC). Threats originating and terminating within the same network comprise 80 percent of all network attacks, often using authorized endpoints to bypass perimeter defenses entirely. The services traditionally used to secure proprietary networks, such as antivirus programs, firewalls, virtual private networks (VPNs), and intrusion detection technologies, are designed to protect against threats entering through the perimeter and are largely ineffective against this type of attack. A more comprehensive strategy for network security is therefore required to fully enjoy the benefits of the collaborative world of Web 2.0.
Infonetics Research predicts that the NAC market will grow from an already impressive $323 million in 2005 to a whopping $3.9 billion in 2008. Meanwhile, the Gartner Group predicts that, by the end of 2007, 80 percent of enterprises will have implemented NAC policies. Make no mistake — NAC represents the most significant change in the way that networks are secured since the invention of the firewall.
‘Pre-admission’ NAC mimics the principle of the firewall, controlling access at the edge of the network and managing the levels of network access granted to different devices. One pre-admission approach integrates the dynamic host configuration protocol (DHCP), the protocol networks use to assign IP addresses to client systems. Because address acquisition is the first step for access over IP, network identity services such as DHCP are essential to any NAC implementation. Many solutions, however, take an agent-based approach that relies on agent software running on Windows, or on proprietary switches and other equipment. This can make NAC hard to deploy in large organizations, incomplete in the range of devices it monitors, and narrow in the authentication options it offers.
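To make the DHCP-based approach concrete, here is a minimal sketch, in Python, of the decision logic such a service might apply when a client requests an address. All names here (the Endpoint record, the subnets, the policy rules) are hypothetical illustrations, not any vendor's actual API: authenticated, compliant devices receive a lease on the production subnet, while unknown or non-compliant devices are steered to a quarantine subnet that reaches only remediation servers.

```python
# Minimal sketch of DHCP-based pre-admission NAC decision logic.
# All names and policy rules are hypothetical, not a vendor API.
from dataclasses import dataclass
from typing import Optional

PRODUCTION_SUBNET = "10.10.0.0/16"   # full network access
QUARANTINE_SUBNET = "10.99.0.0/24"   # remediation servers only

@dataclass
class Endpoint:
    mac: str                 # hardware address from the DHCP request
    user: Optional[str]      # identity, if the device authenticated
    patched: bool            # OS patch posture reported at check-in
    av_current: bool         # antivirus signatures up to date

def assign_scope(device: Endpoint) -> str:
    """Choose which DHCP scope (subnet) should serve this client."""
    if device.user is None:
        return QUARANTINE_SUBNET      # unauthenticated guest
    if not (device.patched and device.av_current):
        return QUARANTINE_SUBNET      # fails posture policy
    return PRODUCTION_SUBNET          # identity and posture both OK

# Example: an authenticated but unpatched laptop lands in quarantine.
laptop = Endpoint("00:11:22:33:44:55", "alice", False, True)
print(assign_scope(laptop))           # -> 10.99.0.0/24
```

The design point is that enforcement rides on the address itself: a quarantined device needs no agent software, because it simply never receives a lease on the production network.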
NAC market accelerating
Over the past year, the growth of the NAC market has accelerated dramatically as key vendors have clarified their plans. High-profile vendors such as Cisco Systems and Microsoft offer their own proprietary standards. Cisco's Network Admission Control (NAC) checks endpoints as they enter the network, using switches and routers to enforce compliance with security policies. Microsoft's Network Access Protection (NAP), yet to be released, uses the DHCP server as one of its primary enforcement mechanisms but faces significant challenges in remediating issues and rendering a device compliant. Cisco NAC functions best in a homogeneous Cisco network, and Microsoft NAP on networks whose devices run a Microsoft OS.
Cisco and Microsoft concern themselves with on-entry, or pre-admission, NAC. Pre-admission checks are critical, but devices can, of course, still become infected or hacked once on the network.
To keep a network secure, all network activity must be observed, both pre- and post-admission. Appliance-based NAC combined with a DHCP appliance offers a turnkey solution that can be appealing to the scientific community, enabling cost-effective authentication of users and control of their access to network resources.
Independent NAC vendor Mirage Networks has introduced patent-pending technology that analyzes network traffic patterns to discover infected or out-of-policy devices at all points on the network, throughout their connection lifecycle. Interior network defense using this type of appliance can be integrated with a compatible DHCP appliance, such as BlueCat Networks' Adonis. Adonis authenticates users through multiple methods and configures endpoints for the network, then shares this information with the NAC appliance to deliver a complete solution.
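Mirage's specific algorithms are proprietary, but the general behavioral approach can be illustrated. The sketch below is a generic heuristic, not the vendor's actual method, and its window length and threshold are invented for illustration: it flags any host that contacts an unusually large number of distinct peers within a short window, a traffic pattern typical of a scanning worm.

```python
# Generic post-admission behavioral heuristic: flag hosts that contact
# an unusually large number of distinct peers in a short time window.
# Not Mirage Networks' actual (patent-pending) algorithm; the window
# and threshold values are invented for illustration.
import time
from collections import defaultdict

WINDOW_SECONDS = 60
FANOUT_THRESHOLD = 50        # distinct peers per window before alerting

peers_seen = defaultdict(set)        # source IP -> set of peers seen
window_start = time.monotonic()

def observe_flow(src_ip, dst_ip):
    """Record one observed flow; return True if src looks infected."""
    global window_start
    if time.monotonic() - window_start > WINDOW_SECONDS:
        peers_seen.clear()           # roll over to a fresh window
        window_start = time.monotonic()
    peers_seen[src_ip].add(dst_ip)
    return len(peers_seen[src_ip]) > FANOUT_THRESHOLD

# Example: one host sweeping 60 internal addresses trips the alarm.
for i in range(60):
    flagged = observe_flow("10.10.3.7", f"10.10.4.{i}")
print(flagged)                       # -> True
```

A flagged host could then be quarantined through the DHCP appliance, for instance by shortening its lease and reassigning it to the quarantine scope on renewal.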
Conclusion
Because new threats, inadvertent as well as malicious, will continue to populate the landscape, implementing complete and forward-looking NAC strategies is key to continued scientific collaboration. For NAC to be viable in a science or laboratory environment, organizations need a NAC solution with a seamless approach for bringing a non-compliant device into compliance so it can gain access to the network. DHCP-based NAC provides an advantage through its ability to tie an identity to a network address and associate both with corporate policy. That identity information can then be used to make more intelligent quarantine decisions, as sketched below, and help secure the future of the e-Science transformation.
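As a final illustration of that last point, the hypothetical sketch below shows how lease data held by a DHCP appliance could tie a flagged address back to an identity, so that the quarantine action fits the device's role. The lease table and policy rules are invented for this example.

```python
# Hypothetical sketch: tying a flagged IP back to an identity via DHCP
# lease data, then applying an identity-aware quarantine policy. The
# lease table and rules are invented for illustration.
leases = {  # DHCP appliance's view: address -> (user, device role)
    "10.10.3.7":  ("alice", "student-laptop"),
    "10.10.5.12": ("spectrometer-01", "lab-instrument"),
}

def quarantine_decision(flagged_ip: str) -> str:
    user, role = leases.get(flagged_ip, ("unknown", "unknown"))
    if role == "lab-instrument":
        # Don't pull a running experiment offline; alert staff instead.
        return f"alert: review {user} at {flagged_ip} manually"
    return f"quarantine: move {user} at {flagged_ip} to remediation VLAN"

print(quarantine_decision("10.10.3.7"))    # laptop -> quarantined
print(quarantine_decision("10.10.5.12"))   # instrument -> alert only
```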
David Berg is Director of Product Management at BlueCat Networks. He may be reached at editor@ScientificComputing.com.