Whenever a project is planned, risk management has to be part of the equation – things rarely go completely as expected, and some areas will always present more risk than others. Whether risks threaten the projected timeframes, budget or outcomes, it is the project manager's job to identify them and ensure that provisions are in place to limit their impact should they occur.

However, risk management failures happen every day – they helped trigger the 2008 financial crisis, demonstrating that even the world's biggest banks, which take financial and logistical risks daily, are not immune to risk mismanagement. With this in mind, it's understandable that smaller projects and processes also suffer from risk management errors.

Why aren’t we performing risk management well, then? With project management an ever-growing sector and more and more jobs being created every day, the next generation of risk managers needs to be able to identify issues in order to rectify them.

Unknown Unknowns

One of the most problematic aspects of risk management is the concept of “unknown unknowns” – the risks that we can’t predict and don’t even know could occur. As thorough as a risk management plan might be, there are some areas that it just can’t cover because they technically do not exist until the project has started and will arise as a result of the ongoing work.

There is little that can be done about unknown unknowns – the only way that they can be completely avoided is if the project is never started, which is not a viable option. Any project inherently contains risks, but they can be risks that work out positively for the project and the organization. There is every chance that unknown unknowns may turn out that way.

Lack of Data

A lot of project risks are identified using historical data, which isn't always a reliable guide – just as past performance in the stock market cannot predict future trends, past projects cannot fully predict future ones. Still, data can be utilized to an extent, which means the job becomes far more difficult when it isn't available.

A recent survey by the Economist Intelligence Unit found that more than half of risk executives at banks around the world have insufficient data to support a robust risk management strategy. If the situation is the same in other industries, there is no reason to believe those organizations would be any better equipped to produce a decent risk management strategy with the same data deficiencies.

Fear of Negativity

On a very basic level, it can be quite intimidating to think about the number of risks that a project might possess, and risk managers can be concerned about seeming overly negative, affecting people's opinions of the project and potentially of the methods and processes used to complete it. One might argue that anyone who lacks this kind of forthrightness should not be involved in project management, but it is a weakness that has to be accounted for.

To not perform risk management thoroughly, however, smacks of incompetence and costs the organization as a whole both time and money. The responsible thing is to highlight risks so that they can be planned for in the event that they occur. Don’t worry about telling stakeholders anything they don’t want to hear – it just might trigger a different, better way of doing things.


The composite rate for all U.S. property and casualty lines was up 1% in February 2015, compared to flat in January, MarketScout said today.

Pricing measurements by coverage showed no further price deterioration in any line, while commercial auto, professional liability and EPLI each increased from plus 1% to plus 2%. By account size, large accounts ($250,001 to $1,000,000 premium) moved from flat to plus 1%, while all other account sizes remained the same as in January, according to MarketScout.

“Could this mean underwriting executives are actually walking away from underpriced business?” asked Richard Kerr, MarketScout CEO.

“February is normally a low volume premium month so we would caution about putting too much credibility in these metrics; however, historically once the insurance market starts softening it normally accelerates rather than moderates or turns around,” he said in a statement. “We speculate insurers are not going to cut deep and long in this cycle. Big data, modeling software and improved underwriting acumen are resulting in insurers simply being too smart to fall for extended and deep price cuts.”

When measuring by industry classification, contracting, habitational, public entity and transportation all increased by 1% in February compared to January.

Summary of the February 2015 rates by coverage, industry class and account size:

By Coverage

Commercial Property        Up 1%
Business Interruption      Up 0%
BOP                        Up 1%
Inland Marine              Up 0%
General Liability          Up 1%
Umbrella/Excess            Up 1%
Commercial Auto            Up 2%
Workers Compensation       Up 0%
Professional Liability     Up 2%
D&O Liability              Up 1%
EPLI                       Up 2%
Fiduciary                  Up 0%
Crime                      Up 0%
Surety                     Up 0%
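To make the table concrete, here is a small sketch that applies MarketScout's February 2015 rate changes to a book of business. Only the percentage changes come from the report; the premium figures and the `renewal_premium` helper are invented for illustration.

```python
# Rate changes by coverage line, from MarketScout's February 2015 table above.
FEB_2015_RATE_CHANGES = {
    "Commercial Property": 0.01,
    "Business Interruption": 0.00,
    "BOP": 0.01,
    "Inland Marine": 0.00,
    "General Liability": 0.01,
    "Umbrella/Excess": 0.01,
    "Commercial Auto": 0.02,
    "Workers Compensation": 0.00,
    "Professional Liability": 0.02,
    "D&O Liability": 0.01,
    "EPLI": 0.02,
    "Fiduciary": 0.00,
    "Crime": 0.00,
    "Surety": 0.00,
}

def renewal_premium(expiring: dict) -> dict:
    """Apply each line's composite rate change to its expiring premium."""
    return {
        line: round(premium * (1 + FEB_2015_RATE_CHANGES[line]), 2)
        for line, premium in expiring.items()
    }

# Hypothetical expiring premiums by line (illustrative only).
book = {"Commercial Auto": 100_000, "General Liability": 250_000}
print(renewal_premium(book))
# -> {'Commercial Auto': 102000.0, 'General Liability': 252500.0}
```

The flat lines (Up 0%) simply renew at the expiring premium under this sketch; in practice, of course, individual accounts move with exposure and loss history, not just the composite rate.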

Despite extensive, persistent drought in the western United States, 2014 saw notably low wildfire numbers, in both total fires and acreage burned. According to CoreLogic, there were 63,345 wildfires in 2014, the second-lowest annual count of the past 20 years, behind only 2013. The 3,587,561 acres burned in 2014 also came in below 2013, which itself had the second-lowest annual total acreage burned of the past 10 years.

More intensive response to small fires and ignitions, increased overwinter snowpack and timely precipitation during wildfire season, and greater efforts to boost public awareness and homeowner mitigation have all contributed to more effective control of wildfires, the company pointed out. But responding agencies, homeowners and insurers should not let the decline lull them into a false sense of security.

“Even though we haven’t seen the type of wildfire activity over the last few years that seemed to be thematic in the 2000s, there have been record setting wildfire events even during the recent periods of overall reduced wildfire numbers,” the report said. “With continuing residential growth in the West, the opportunity for fires to find homes and businesses is going to increase as well. This is why it has never been more important to know where wildfire risk is located and understand the likelihood of it occurring.”

The highest-risk areas across the western states are shown in the report's Western US Wildfire Risk map.

Based on CoreLogic wildfire analysis, there are 897,102 residential properties in the region that are currently located in High or Very High wildfire-risk categories, with a reconstruction value of more than $237 billion. In the Very High risk category alone, there are just over 192,000 residences with a reconstruction value of more than $49 billion. “Taking into consideration the combination of risk factors both inside and outside the property boundary to assess numeric risk score, more than 1.1 million homes in the U.S. with a total reconstruction value of more than $268 billion fall into the highest wildfire risk segment of 81-100. This total is more than five times the number of homes that fall under the Very High risk category,” CoreLogic reported.
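CoreLogic's "more than five times" claim can be sanity-checked directly from the figures quoted above; the numbers below are taken from the report's text, rounded as stated there.

```python
# Figures as quoted in the CoreLogic report text.
very_high_homes = 192_000      # "just over 192,000 residences" (Very High category)
top_segment_homes = 1_100_000  # "more than 1.1 million homes" (risk score 81-100)

ratio = top_segment_homes / very_high_homes
print(f"{ratio:.1f}x")  # -> 5.7x, consistent with "more than five times"
```

Both inputs are lower bounds ("just over", "more than"), so the true multiple may differ slightly, but the quoted figures do support the report's comparison.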

The company also broke down statewide totals for potential exposure to wildfire damage, in reconstruction value per risk category; see the report's table, "Total Potential Exposure (Reconstruction Value) to Wildfire Damage by Risk Category."

Check out the full report for more details on the risks of wildfire damage.


While the Internet of Things (IoT) offers many benefits to businesses, such as keeping track of inventory, ordering products and having them delivered when needed, installing smart street lamps that monitor traffic, and detecting moisture levels in soil for optimal irrigation, most companies have yet to optimize the technology, according to a study by Accenture.

The study, "From Productivity to Outcomes: Using the Internet of Things to drive future business strategies," found that 87% of companies are aware of the benefits and the potential impact on their business, but only 38% believe their company's executives understand the technology.

“Is it caution or complacency that is hindering the C-suite from harnessing the Internet of Things? This study shows that senior leaders cite multiple reasons why they have not made inroads—from constrained access to capital, to insufficient access to technology or poor information and telecommunications infrastructure,” Bruno Berthon, managing director of Accenture Strategy, commented in the report.

Berthon continued:

“I believe the conditions are ripe for the widespread adoption of the Internet of Things; a proliferation of data-rich sensors and devices that open up connectivity and a universal demand for faster, more efficient ways to work and live…The Internet of Things is game changing. Leaders should seek out the best outcomes—to benefit their businesses, their countries and the worldwide economy.”