Articles by Christopher Camejo
Security vulnerabilities in any network can be found and exploited by hackers and others in no time. The only questions are when this will happen and how much damage an attacker could do once they’ve gained access to the network. Recognizing this reality, most organizations test their own networks for security weaknesses, whether to meet compliance requirements or simply as a best practice. Those that aren’t doing this now should start, and the sooner the better. There are a variety of methods that can be used for these tests, each with its own strengths and weaknesses. For example, some can be performed relatively quickly and easily, while others are more complex and exhaustive. Determining which method is right for a particular organization or situation can be overwhelming, to say the least, particularly for those lacking advanced IT skills. The overview below of the most common testing practices will help make sense of the often-confusing array of options so organizations can ensure the highest level of network security and protection.

Vulnerability Scans

Vulnerability scans rely on mostly automated tools to find potential vulnerabilities at either the network or application level. Of the two, network scans are the more basic, looking for known common vulnerabilities in widely used commercial and open source software and reporting any that are found with ratings that identify the level of severity. The advantages of network vulnerability scans lie in their speed, cost efficiency, and safety, which make them ideal for ensuring that the latest system patches and updates have been deployed and that security configurations are as stringent as possible.
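To make the signature-matching idea concrete, here is a minimal Python sketch of how a network scanner might compare a service banner against a database of known-vulnerable versions. The software names, version numbers, and severity ratings are invented for illustration; real scanners rely on maintained vulnerability feeds such as the NVD.

```python
import re

# Illustrative signature database: software name -> (first fixed version, severity).
# These names, versions, and ratings are made up for demonstration purposes.
SIGNATURES = {
    "ExampleHTTPd": ("2.4.2", "HIGH"),
    "DemoFTP": ("1.9.0", "MEDIUM"),
}

def parse_version(version):
    """Turn a dotted version string like '2.4.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def check_banner(banner):
    """Match a 'Name/x.y.z' service banner against the signature database
    and report a finding if the advertised version predates the fix."""
    match = re.match(r"([A-Za-z]+)/(\d+(?:\.\d+)*)$", banner)
    if not match:
        return None
    name, version = match.groups()
    if name not in SIGNATURES:
        return None
    fixed, severity = SIGNATURES[name]
    if parse_version(version) < parse_version(fixed):
        return {"software": name, "version": version,
                "fixed_in": fixed, "severity": severity}
    return None
```

Note how the check is entirely passive with respect to the flaw itself: the scanner never exploits anything, it only infers exposure from the advertised version, which is why such scans are fast and safe but limited to vulnerabilities that are already catalogued.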
When run on a regular basis, these scans can serve as an early warning that software is out of date or patches are missing or misconfigured. Many organizations only test their networks from the Internet. It’s true that Internet-facing vulnerabilities are the most well-known and well-publicized and may seem like the easiest for an attacker to exploit, but there’s much more to the story. By limiting scans to external threats, organizations remain unaware of exactly what an attacker could accomplish once the network has been breached, for example by tricking a user into installing a backdoor via a phishing email. What internal network vulnerabilities could an attacker exploit to move between systems once they’ve gained a foothold? Without testing internally, there’s no way to know the answer until it’s too late.

Internal Network Scans

Therefore, in addition to external network vulnerability scans, organizations must also test from inside the firewall. But it’s important to note that even internal network scans can leave blind spots since, by default, scanners only check services that listen for network communications. Unfortunately, many attacks are made possible by phishing, drive-by downloads, and other campaigns that target web browsers, PDF viewers, and other client software that a network scan will skip over. Using these tactics, attackers can then exploit vulnerabilities in the local operating system to gain administrator privileges. These blind spots can be eliminated by configuring scanning tools with authentication credentials that enable them to log in to their targets during internal scans, allowing them to check local software as well. This approach gives the most complete view of the status of an organization’s patches and configurations.
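A rough Python sketch of what a credentialed scan adds: after logging in to the target, the scanner can enumerate locally installed client software that never listens on the network. The package names, the listing format, and the "first patched" versions below are all invented for illustration.

```python
# Hypothetical table of the first version of each package that contains the fix.
PATCHED_IN = {
    "pdfviewer": (3, 2, 0),
    "webbrowser": (101, 0, 0),
}

def audit_packages(installed_listing):
    """Parse 'name version' lines (as a credentialed login might collect
    from a package manager) and flag packages still below the fixed version."""
    findings = []
    for line in installed_listing.strip().splitlines():
        name, version = line.split()
        numeric = tuple(int(part) for part in version.split("."))
        if name in PATCHED_IN and numeric < PATCHED_IN[name]:
            findings.append((name, version))
    return findings
```

An unauthenticated scan of the same host would report nothing here, since neither package accepts network connections; only the post-login package inventory exposes the outdated PDF viewer.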
The other main shortcoming of network vulnerability scanners is that they are only as good as their vulnerability signatures, which are based on existing databases of known vulnerabilities. This means they cannot identify flaws that haven’t yet been reported publicly, including those found in more obscure or custom applications. This can present significant risk, as attackers regularly target vulnerabilities in custom applications to access the data they contain or breach the underlying network. This is where application vulnerability scans come in.

Application Scanners

Application scanners are designed specifically to identify these previously undocumented vulnerabilities found in custom applications. Unlike network scanners, these tools exercise all of an application’s functionality to find common types of flaws, rather than looking for a list of known vulnerabilities. However, because of the amount of data these scanners send to an application, they must be used very carefully. No organization wants to become another entry on the long list of stories about application scanners dumping garbage data into a database or triggering thousands of emails. That said, regardless of how advanced application scanners may be, they still miss a number of vulnerabilities, especially those that are too subtle for the scanner to pick up on but would be obvious to a human observer. As with network scans, a clean report from an application scanner is a good start but no guarantee that there are no problems. Organizations should build on these scans with deeper, more thorough methods, such as penetration testing.
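To illustrate how an application scanner’s approach differs from signature matching, here is a minimal Python sketch: rather than looking up known versions, it submits crafted inputs and watches responses for tell-tale error patterns. The payloads, error strings, and the stand-in endpoint are all invented for illustration; real scanners use far larger payload corpora and, as noted above, must be run carefully because every payload is actually sent to the application.

```python
# A small, illustrative payload list and the response patterns that suggest
# the input was mishandled (SQL errors, stack traces, leaked file contents).
PAYLOADS = ["'", '"', "<script>alert(1)</script>", "../../etc/passwd"]
ERROR_PATTERNS = ["SQL syntax", "ODBC", "Traceback", "root:x:"]

def probe(send_request):
    """Feed each payload to the application via the caller-supplied
    send_request function and report payloads whose response leaks an error."""
    findings = []
    for payload in PAYLOADS:
        response = send_request(payload)
        for pattern in ERROR_PATTERNS:
            if pattern in response:
                findings.append((payload, pattern))
                break
    return findings

# Stand-in for a vulnerable endpoint, used here instead of a live application.
def fake_endpoint(value):
    if "'" in value:
        return "You have an error in your SQL syntax"
    return "OK"
```

The sketch also shows the method’s limit: a flaw that returns a perfectly normal-looking response, obvious though it might be to a human reviewer, produces no finding at all.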
Real-World Testing

Organizations often make the mistake of concentrating their network security efforts on fixing only those vulnerabilities identified by scans as critical or high-severity, which is a highly ineffective practice. Why? Because real-world breaches are rarely perpetrated on the basis of a single critical network vulnerability. Instead, attackers recognize the tendency to focus only on “serious” problems and often chain together multiple low- to medium-severity network vulnerabilities or combine them with “local” vulnerabilities that are invisible from the network. Building on network and application vulnerability scanning, penetration testing brings skilled, “white hat” hackers into the mix to simulate real-world attacks against an organization’s network services, applications, or both simultaneously. Like malicious attackers, these testers attempt to combine vulnerabilities uncovered by scanners while also looking for those that the scanners are incapable of detecting. While this process is more time-consuming and costly than deploying scanning tools alone, it provides a more realistic assessment of just how much effort an actual attacker would need to put forth to breach an organization’s network and data.

Potential Unintended Consequences

Each of these network vulnerability testing methods brings its own strengths and weaknesses to the overall security equation, underscoring the reality that no testing, regardless of how important or critical it may be, comes without risk.
For example, no matter how careful penetration testers are in their efforts to exploit flaws and vulnerabilities without causing damage, it is always possible that a host could be knocked offline temporarily or data in a database altered. Organizations need to be aware of these potential unintended consequences. It is also important to understand that the skill level of the testers will largely determine the success of testing, so organizations should seek out testers with strong experience and skillsets. One final note: regardless of how tempting it may be to cut costs by limiting the scope of testing, the potential long-term costs of network disruption, data theft, damage to reputation, and the like could be far greater than today’s savings. For this reason alone, the higher cost of having an established, experienced team perform exhaustive testing can actually turn out to be a tremendous bargain.
Cybersecurity is a fast-changing field and 2015 was no exception. The proliferation of cybersecurity issues continued to make headlines, including the very dramatic hacking of a vehicle to allow remote control over steering, brakes, the transmission, and other critical functions. There was also state-sponsored hacking that targeted government, defense, and other strategic sectors of the marketplace. Staying on top of these breaches remains a challenge.

Securing The Internet of Things

While we can’t discuss the details of our projects over the past year, we are proud to say that NTT Com Security has been successful in identifying new vulnerabilities in mobile devices, home security systems, and automotive telematics interfaces. These are all part of the important area of the “Internet of Things,” which will continue its climb into the information security headlines in 2016. For the New Year, we anticipate a broader focus beyond the servers, workstations, and communications infrastructure we are used to, growing to encompass appliances, vehicles, factories, utility infrastructure, medical devices, and the myriad other devices that will eventually all be connected to the Internet – and therein lies the ongoing task. The Internet in general is extremely vulnerable, and companies and individuals have a long way to go in learning how to protect themselves. We can expect continued attacks targeting payment card data wherever it can be found, along with attempts to commit financial fraud with stolen banking credentials or via social engineering tricks like spear phishing emails.

Security and risk management

Fortunately, these high-profile breaches, hack demonstrations (like the Jeep hack), and the Snowden leaks have all helped bring the topic of security directly into the public consciousness.
With consumers becoming aware of the value of their information and the importance of protecting their privacy, companies will be forced to design security into their products. We are already seeing a public demonstration of this with the new security and encryption controls Apple has implemented on the iPhone. Meanwhile, talking about risks and vulnerabilities can sometimes feel like shouting at a crowd that refuses to listen despite the obvious danger it is in. The good news is that we are taking active steps to guard against risk and combat it where it happens. NTT Com Security has developed a global cybersecurity and risk management portfolio of managed services, consulting, and technology solutions to protect critical and confidential data from attacks. See the full coverage of 2015/2016 Review and Forecast articles here.
The Internet of Things (IoT) raises many new issues for security professionals. Attendees who register for ‘The Pros and Cons of the Internet of Things’ seminar at ISC West 2017 will come away with insights on new developments in networked solutions for the security industry and how they are impacting today’s systems. The assembled panel of thought leaders, representing nearly every category within the industry, will discuss the ins and outs of networking technologies and cybersecurity, as well as the role of the IoT in professional security applications. Armed with the best practices these experts will provide, security professionals will be well equipped to provide their customers with even greater value from their security solutions.

Efficient Data Security

“New surveillance and security technologies are capturing volumes of information with greater detail and efficiency than ever before,” said Tom Cook, Senior Vice President of Sales, Hanwha Techwin America. “Video surveillance technology is a prime example, with new image capture devices delivering unprecedented levels of performance and intelligence that make them invaluable for business applications that transcend traditional security.”

The seminar will be moderated by Ron Hawkins, Manager of Special Projects and Partnerships, Security Industry Association (SIA). Panelists include Ronnie Pennington, National Sales Engineer, Altronix Corporation; Rick Caruthers, Executive Vice President, Galaxy Control Systems; Tom Cook, Senior Vice President, Sales, Hanwha Techwin America; Chris Camejo, Director of Product Management – Threat Intelligence, NTT Security; Ken LaMarca, Vice President, Sales and Marketing, OnSSI; Don Campbell, Vice President, Products, Quantum Secure; and Bud Broomhead, Chief Executive Officer, Viakoo.
NTT Com Security, the global information security and risk management organization, has issued a new Risk: Value research report which highlights the critical need for organizations to protect their data. The report shows that 65% of business decision makers surveyed expect to suffer an information security breach – at an average cost of almost one million dollars and a recovery time of two months.

Risk Of Data Breach

“This report makes it clearer than ever how critically important it is for organizations to implement a comprehensive solution to protect their data,” said Christopher Camejo, Director of Threat and Vulnerability Analysis, NTT Com Security. “Fortunately, business leaders are now recognizing the risk to their organizations’ revenues and reputations, and beginning to take action to protect critical and confidential data from attacks.”

Survey participants reported that they expect to suffer severe monetary and reputational consequences from an information security breach. Respondents estimate a breach would take nine weeks to recover from and would cost, on average, $907,053 – even before taking into consideration the cost of any reputational damage, brand erosion, and lost business. Additionally, decision makers estimate that related remediation costs would include legal fees, compensation to customers, third-party resources, and fines or compliance costs. Other expected remediation costs include PR and communications and compensation to suppliers and employees. Further, companies predict a 13% drop in revenue as a result of a breach.

Impact Of Stolen Data For Organizations

According to the report, almost all respondents say they would suffer external and internal impacts if data were stolen in a security breach, including loss of customer confidence (69%) and damage to reputation (60%).
One third of business decision makers also expect to resign, or expect another senior colleague to resign, as a result of a breach. While 54% of those surveyed say information security is ‘vital’ to their business and nearly a fifth (18%) agree that poor information security is the ‘single greatest risk’, two-thirds (65%) predict that their organization will suffer a data breach at some point in the future. “The devastating cost of a data breach cannot be overstated, and as this report shows, many organizations are simply vulnerable,” said Camejo. “Our objective at NTT Com Security is to provide a broad range of solutions to protect data and reduce the risk of a catastrophic breach.”