Organizational Complexity Poses Critical Cyberrisk

According to a recent survey on IT security infrastructure, 83% of businesses around the world believe they are most at risk because of organizational complexity.

“Employees are not following corporate security requirements because they are too difficult to be productive, plus policies hinder their ability to work in their preferred manner,” noted the Ponemon Institute’s “The Need for a New IT Security Architecture: Global Study,” sponsored by Citrix. “It is no surprise that shadow IT is on the rise because employees want easier ways to get their work done.”

Shadow IT, information technology systems built and used within an organization without explicit approval, has largely cropped up because employees find official tools too complex or otherwise difficult and inefficient. As a result, company data is being put on personal devices and official business is conducted on platforms that enterprise security teams cannot monitor or secure.

Nearly three-quarters of respondents said their business needs a new IT security infrastructure to reduce risk. With increasing amounts of sensitive data stored, new technology like the internet of things being adopted, and new cyber threats constantly emerging, addressing individual security challenges may be impossible, Citrix Chief Security Officer Stan Black told eWEEK. Rather, companies should focus on larger issues like controlling complexity, developing and maintaining strong incident response plans, and rigorously vetting vendors with access to systems or responsibility for storing data.

Check out more of the report’s findings in the infographic below:

[Infographic: organizational complexity cyberrisk]

65% of Businesses Unprepared For Email-Based Cyber Threats

In a recent threat report, cloud email management company Mimecast warned that it had seen a 55% increase in whaling attacks over the past three months. As we reported in this month’s Risk Management cover story “The Devil in the Details,” social engineering fraud schemes like whaling (phishing that targets higher-profile employees and executives) resulted in total losses of more than $1.2 billion worldwide between October 2013 and August 2015. According to the Mimecast Business Email Threat Report 2016, released yesterday, IT security professionals clearly recognize the risk, with 64% of respondents in the new survey saying they see email as a major cybersecurity threat to their business. Yet only 35% feel confident about their level of preparedness against data breaches, while 65% feel ill-equipped or too out of date to reasonably defend against the risk.

“Our cyber-security is under attack and we depend on technology, and email in particular, in all aspects of business. So it’s very disconcerting to see that while we might appreciate the danger, many companies are still taking too few measures to defend themselves against email-based threats in particular,” said Peter Bauer, chief executive officer of Mimecast. “As the cyber threat becomes more grave, email attacks will only become more common and more damaging. It’s essential that executives, the C-suite in particular, realize that they may not be as safe as they think and take action. Our research shows there is work still to be done to be safe and we can learn a lot from the experience of those that have learnt the hard way.”

Even companies that consider themselves the most secure feel at risk from these scams. Organizations in the top 20% for feeling secure are 250% more likely to see email as their biggest vulnerability. Those who feel most confident about guarding against the risk are 2.7 times more likely to have a C-suite that is extremely or very engaged in email security. Among the IT security managers who feel most prepared, five out of six say their C-suite is engaged with email security, Mimecast reports. Yet of all IT security managers polled, only 15% say their C-suite is extremely engaged in email security, while 44% say their C-suite is only somewhat engaged, not very engaged, or not engaged at all.

The firm also offered insight on how best to budget against the risk of phishing. Those who feel better prepared to handle email-based threats also allocate higher percentages of their IT security budgets toward email security, the firm found, with these IT security managers allocating 50% more of their budgets to email security than managers who were less confident in their readiness. Mimecast found that devoting 10.4% of the total IT budget to email security is the ideal intersection between email security confidence and spend.

To reduce the threat of whaling, Mimecast recommends that companies:

  • Educate your senior management, key staff members and finance teams on this specific type of attack. Don’t include whaling in a general spear-phishing awareness campaign—single out this style of attack for special attention to ensure key staff remain vigilant.
  • Carry out tests within your own business. Build your own whaling attack as an exercise to see how vulnerable your staff are.
  • Use technology where possible. Consider inbound email stationery that marks and alerts readers to emails that originated outside the corporate network.
  • Consider subscribing to domain name registration alerting services so you are alerted when domains are created that closely resemble your corporate domain. Consider registering all available TLDs for your domain, although with the emergence of generic TLDs (gTLD) this may not be scalable.
  • Review your finance team’s procedures; consider revising how payments to external third parties are authorized. Require more than single sign-off, or perhaps use voice or biometric approval only with the requestor to ensure validity of the request.
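The domain-alerting recommendation above can be approximated in-house. The sketch below, with an assumed corporate domain and an illustrative edit-distance threshold, flags senders whose domain closely resembles yours (a common whaling tactic, e.g. swapping a letter for a digit) and distinguishes them from ordinary external mail:

```python
# Sketch: flag external senders and lookalike domains, illustrating the
# whaling defenses above. The domain and threshold are assumptions.

CORPORATE_DOMAIN = "example.com"  # assumption: your real corporate domain

def levenshtein(a: str, b: str) -> int:
    """Classic edit distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def classify_sender(address: str) -> str:
    """Return 'internal', 'lookalike', or 'external' for an email address."""
    domain = address.rsplit("@", 1)[-1].lower()
    if domain == CORPORATE_DOMAIN:
        return "internal"
    # A small edit distance suggests a spoofed lookalike (e.g. examp1e.com).
    if levenshtein(domain, CORPORATE_DOMAIN) <= 2:
        return "lookalike"
    return "external"
```

A mail gateway could use such a check to inject a warning banner on lookalike messages; commercial domain-monitoring services perform a far more thorough version of this matching across newly registered domains.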

Check out the infographic below for more on business email threats:

[Infographic: Mimecast business email threats]

Automation: The Key to More Effective Cyberrisk Management

[Image: cybersecurity automation]

In a perfect cybersecurity world, people would have access only to the data they need, and only when they need it. However, IT budgets are tighter than ever and, in most organizations, manually updating new and existing employees’ access levels on a consistent basis is a time-consuming productivity-killer. As a result, there’s a good chance an employee may accidentally have access to files they should not. As one can imagine, security that is loosely managed across the enterprise is a breeding ground for malware.

The velocity of cyberattacks has accelerated as well. It is easier than ever for cyber criminals to access exploits, malware, phishing tools, and other resources to automate the creation and execution of an attack. Digitization, Internet connectivity, and smart device growth are creating more vectors for attackers to gain an entry point into an organization’s network, and this trend only gets worse as you think about the Internet of Things, which could have concrete impact on machines from production equipment to planes and cars.

One way IT departments can help mitigate the cyberrisk of employee access overload is through automating security policies and processes such as the monitoring, detection and remediation of threats. In the past, organizations have spent a lot on prevention technologies: disparate point solutions such as anti-virus software and firewalls that try to act before an attack occurs. Prevention is important but not 100% effective. And how could technology used for prevention stop a cyber-attacker who has already infiltrated the network? If prevention were the end-all, be-all in security tools, we wouldn’t be reading about cyberattacks on a daily basis. As more companies realize this, spending is shifting toward detection and response.

To help determine cyberrisk—or better yet, safely manage it—you must look at the threat (ever-growing as attackers multiply and their techniques advance), vulnerability (how open your data is to cyberattack), and consequence (the amount of time threats are doing damage in your network). Or, more simply put: risk = threat × vulnerability × consequence.

To manage your cyberrisk, you need to optimize at least one of the aforementioned variables. Unfortunately, threat is the one variable that cannot be optimized because hackers will never stop attacking and are creating malware at an escalating rate. In fact, a G DATA study showed that 6 million new malware strains were found by researchers in 2014—almost double the number of new strains found the previous year. Instead, what organizations can focus on is investing in the right solutions that target the remaining two variables: vulnerability and consequence.
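The multiplicative formula makes the leverage point concrete: since threat is outside your control, halving either vulnerability or consequence (e.g. dwell time) halves overall risk. A toy illustration, with made-up scales that are not from the article:

```python
# Illustrative only: a toy scoring of risk = threat x vulnerability x consequence.
# The numeric scales below are assumptions for demonstration, not real metrics.

def risk_score(threat: float, vulnerability: float, consequence: float) -> float:
    """Multiply the three factors; driving any one toward zero lowers risk."""
    return threat * vulnerability * consequence

# Threat stays fixed (attackers won't stop), but cutting dwell time in half
# via automated detection and response halves the overall risk score.
baseline = risk_score(threat=8, vulnerability=6, consequence=10)
improved = risk_score(threat=8, vulnerability=6, consequence=5)
```

This is why the steps that follow target only the two controllable variables, vulnerability and consequence.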

  • Step One: Organizations must make sure they know their environments well (such as endpoints, network, and access points) and know where their sensitive information lives. It’s always a good idea to rank systems and information in terms of criticality, value and importance to the business.
  • Step Two: Organizations must gain increased visibility into potential threat activity occurring in the environment. As is often said, there are two types of companies: those that have been attacked and those that have been attacked and don’t know it. A way to increase visibility is through the deployment of behavior-based technology on the network, like sandboxes. Organizations are now shifting their focus to the endpoint. Today’s attacks require endpoint and network visibility, including correlation of this activity. The challenge with visibility is that it can be overwhelming.
  • Step Three: There needs to be some process or mechanism to determine which alerts matter and which ones should be prioritized. In order to gain increased visibility into environments and detect today’s threats, organizations clearly need to deploy more contemporary detection solutions and advanced threat analytics.
  • Step Four: Invest more in response and shift the mindset to continuous response. If attacks are continuous and we are continuously monitoring, then the next logical step is to respond continuously. Historically, response has been episodic or event-driven (“I’ve been attacked – Do something!”). This mindset needs to shift to continuous response (“I’m getting attacked all the time – Do something!”).  A key ingredient to enable continuous incident response will be the increasing use of automation. Why? Automation is required to keep up with attackers that are leveraging automation to attack. It’s also required to address a key challenge that large and small companies face: the significant cybersecurity skills shortage.
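Steps one and three above work together: knowing which assets matter most lets automation rank the flood of alerts instead of overwhelming analysts. A minimal sketch, with hypothetical host names and weights, combining alert severity with asset criticality:

```python
# Sketch of steps one and three: rank alerts by alert severity weighted by
# the criticality of the affected asset. All names and weights are invented.

ASSET_CRITICALITY = {        # step one: know your environment and rank assets
    "payroll-db": 10,
    "build-server": 6,
    "guest-wifi-ap": 2,
}

def prioritize(alerts):
    """Step three: return (host, severity) alerts, highest priority first."""
    def score(alert):
        host, severity = alert
        return severity * ASSET_CRITICALITY.get(host, 1)  # unknown hosts: weight 1
    return sorted(alerts, key=score, reverse=True)

alerts = [("guest-wifi-ap", 9), ("payroll-db", 4), ("build-server", 5)]
# A lower-severity alert on the payroll database outranks a high-severity
# alert on the guest access point.
ranked = prioritize(alerts)
```

Real advanced threat analytics platforms fold in many more signals (user behavior, threat intelligence, kill-chain stage), but the principle of automated, criticality-weighted triage is the same.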

Advanced threat analytics should be important to any organization that takes its security posture seriously. The majority of threats being faced today are getting more advanced by the minute. If an organization relies solely on legacy, signature-based detection, its defenses will be easily breached. It’s important for teams to understand that the cyber defense and response capabilities of an organization must constantly evolve to match the evolving threat landscape. This includes both automatic detection and remediation. Automatic remediation dramatically reduces the time that malware can exist on a network and also reduces the amount of time spent investigating the issue at hand. With automated security defenses, IT teams get a forensic view of every packet that moves through the network, allowing them to spot anomalies and threats before they have a chance to wreak havoc. And since these tools are automated and work at machine speed, they can deal with a high volume of threats without human intervention, taking some of the load off overburdened security teams and ultimately freeing them to act decisively and quickly, before network damage is done.

Windows Server 2003 Expiration Brings Defense in Depth to Life

[Image: Windows Server 2003]

The termination of support for Windows Server 2003 (WS2003) is less than four months away, leaving many enterprises in a race against the clock before the system’s security patches cease. In fact, 61% of businesses have at least one instance of WS2003 running in their environment, which translates into millions of installations across physical and virtual infrastructures. While many of these businesses are well aware of the rapidly approaching July 14 deadline and the security implications of missing it, only 15% have fully migrated their environment. So why are so many enterprises slow to make the move?

Migration Déjà Vu

The looming support deadline, the burst of security anxiety, the mad rush to move off a retiring operating system… sound familiar? This scenario is something we’ve seen before, coming just 12 months after expiration of Windows XP support.

While there may be fewer physical 2003 servers in an organization than there were XP desktops, a server migration is more challenging and presents a higher degree of risk. From an endpoint perspective, replacing one desktop with the latest version of Windows affects only one user, while a server might connect to thousands of users and services. Having a critical server unavailable for any length of time could cause major disruption and pose a threat to business continuity.

Compared to the desktop, server upgrades are significantly more complex, especially when you then add hardware compatibility issues and the need to re-develop applications that were created for the now outdated WS2003. Clearly, embarking on a server migration can be a very daunting process – much more so than the XP migration – which seems to be holding many organizations back.

Cost of Upgrading versus Staying

Moving off WS2003 can be a drain on time and resources. While most IT administrators understand how to upgrade an XP operating system, the intricacy of server networks means many migrations will require external consultancy, especially if they are left to the last minute. It’s no wonder that companies this year are allocating an average of $60,000 for their server migration projects. Still, it’s a fair price to pay when you consider the cost of skipping an upgrade entirely. Legacy systems are expensive to maintain without regular fixes to bugs and performance issues, and without security support, organizations will be left exposed to new and sophisticated threats. Meanwhile, hackers will be looking to these migration stragglers as prime targets. For those who fall victim to exploits as a result, it’s not just financial losses they will have to deal with, but a blow to their reputation as well. Companies continuing to run WS2003 after support ends will also fall out of compliance, adding penalties that could further damage the business.

If they haven’t already, businesses still running the retiring system should be thinking now about upgrading to Windows Server 2012. It’s easier said than done, of course. A server migration can take as long as six months, so even if businesses start now, there could still be a two-month period during which servers run unsupported. This means organizations should be putting defenses in place to secure their datacenters for the duration of the migration and beyond.

Control Admin Rights

While sysadmins are notorious for demanding privileged access to applications, the reality is that allocating admin rights to sysadmins is extremely risky, since malware often seeks out privileged accounts to gain entry to a system and spread across the network. Plus, humans aren’t perfect, and the possibilities for accidental misconfiguration when logging onto a server are endless. In fact, research has shown that 80% of unplanned server outages are due to ill-planned configurations by administrators.

Admin rights in a server environment should be limited to the point where sysadmins are given only the privileges they need, for example to respond to urgent break-fix scenarios. Doing so can reduce exploit potential significantly: in an analysis of Patch Tuesday security bulletins issued by Microsoft throughout 2014, 98% of critical vulnerabilities affecting Windows operating systems could have been mitigated by removing admin rights.

Application Control

Application control (whitelisting) adds further control to a server environment, including servers that are remotely administered, by applying simple rules to manage trusted applications. Trusted applications run under configured policies, while unauthorized applications and interactions are blocked. This defense is particularly important for maintaining business continuity while development teams rewrite and refactor apps.
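At its core, whitelisting is a default-deny check against a known-good inventory. A minimal sketch of the idea, using a SHA-256 hash allowlist (real application-control products also evaluate publisher signatures, paths, and policies; the function names here are invented for illustration):

```python
# Toy hash-based application whitelisting, sketching the concept above.
# Default deny: anything not explicitly approved is blocked.

import hashlib

ALLOWLIST = set()  # SHA-256 digests of approved executables

def approve(path: str) -> None:
    """Record a trusted binary's hash (run against a known-good baseline)."""
    with open(path, "rb") as f:
        ALLOWLIST.add(hashlib.sha256(f.read()).hexdigest())

def is_allowed(path: str) -> bool:
    """Permit execution only if the file's hash is on the allowlist."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() in ALLOWLIST
```

Because the hash changes if even one byte of the binary changes, tampered or unknown executables fail the check automatically, which is exactly the property that protects a server mid-migration.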


Limiting privileges and controlling applications sets a solid foundation for securing a server migration, but even with these controls, the biggest window of opportunity for malware to enter the network – the Internet – remains exposed. Increasingly, damage is caused by web-borne malware, such as employees unwittingly opening untrusted PDF documents or clicking through to websites with unseen threats. Vulnerabilities in commonly used applications like Java and Adobe Reader can be exploited by an employee simply viewing a malicious website.

Sandboxing is the third line of defense that all organizations should have in place at all times. By isolating untrusted content, and by association any web-borne threats or malicious activity, in a separate secure container, sandboxing empowers individuals to browse the Internet freely without compromising the network. Having instant web access is expected in modern workplaces, so sandboxing is ideal for securing Internet activity without disrupting productivity or the user experience.

Windows Server 2003 Migration: A Window of Opportunity

It shouldn’t take an OS end of life to spur change – especially security change. Organizations and their IT teams need to be thinking about how they can adapt their defenses, ensuring that they are primed to handle the new and sophisticated threats we see emerging every day. A migration is often the perfect time to revitalize an organization’s security strategy. With the migration process as a catalyst for reinvention, IT can lean on solutions like privilege management, application control and sandboxing not only to lock down the migration, but to carry those protections beyond it as well, providing defense in depth on the next version of Windows.