Organizational Complexity Poses Critical Cyberrisk

According to a recent survey on IT security infrastructure, 83% of businesses around the world believe they are most at risk because of organizational complexity.

“Employees are not following corporate security requirements because they are too difficult to be productive, plus policies hinder their ability to work in their preferred manner,” noted the Ponemon Institute’s “The Need for a New IT Security Architecture: Global Study,” sponsored by Citrix. “It is no surprise that shadow IT is on the rise because employees want easier ways to get their work done.”

Shadow IT, information technology built and used within an organization without explicit approval, has largely cropped up because employees find official tools too complex, difficult or inefficient. As a result, company data ends up on personal devices and official business is conducted on platforms that enterprise security teams cannot monitor or secure.

Nearly three-quarters of respondents said their business needs a new IT security infrastructure to reduce risk. With increasing amounts of sensitive data stored, new technology like the internet of things adopted, and new cyber threats constantly emerging, addressing individual security challenges may be impossible, Citrix Chief Security Officer Stan Black told eWEEK. Rather, companies should focus on larger issues like controlling complexity, developing and maintaining strong incident response plans, and rigorously vetting vendors with access to systems or responsibility for storing data.

Check out more of the report’s findings in the infographic below:

[Infographic: organizational complexity and cyberrisk]

2016 Ends with 1% Average Rate Reduction

The year ended with few surprises in commercial insurance pricing in the United States. After 2016 started out with a composite rate decrease of 4%, rates began to moderate in April and continued with reductions of 1% to 2% per month. The year closed with a composite rate reduction of 1%, according to MarketScout.

While the soft market has been underway for 16 months, that period seems longer because, for the first eight months of 2016, the composite rate was flat to up 1% before dropping into negative territory, MarketScout said.

“We have been tracking commercial property and casualty rates since 2001. Generally, the soft or hard market cycles last at least three years,” Richard Kerr, CEO of MarketScout, said in a statement. “We expect more moderate rate reductions for the coming year for all but a few lines of business.” An increase in interest rates could accelerate rate reductions, he added.

By coverage classification, commercial property moderated in December, from down 3% to down 2%. Workers’ compensation rates dropped from down 1% to down 2%. EPLI and crime were the only coverages to see rate increases, with both lines moving from up 1% to up 2%. The composite rate for all other coverages was unchanged.
[Chart: rates by coverage classification]

By account size, there were no changes from November to December 2016.
[Chart: rates by account size]

By industry classification, contractors adjusted from down 1% to flat. Transportation accounts saw ongoing rate increases across the board, jumping from up 3% in November to up 5% in December.
[Chart: rates by industry classification]

Annual Report Card Finds Vermont Has the Best Insurance Regulatory System

Examining a matrix of variables affecting a state’s insurance regulations, the R Street Institute determined that Vermont has the best regulatory system for insurance and that North Carolina has the worst, according to the Insurance Regulation Report Card.

The annual report grades each state across seven dimensions. The three fundamental questions the report seeks to answer are:

1. How free are consumers to choose the insurance products they want?

2. How free are insurers to provide the insurance products consumers want?

3. How effectively are states discharging their duties to monitor insurer solvency and foster competitive, private insurance markets?

“We believe states should regulate only those market activities where government is best-positioned to act; that they should do so competently and with measurable results; and that their activities should lay the minimum possible financial burden on policyholders, companies and, ultimately, taxpayers,” Senior Fellow R.J. Lehmann said in a statement.

According to the report:

The insurance market is both the largest and most significant portion of the financial services industry to be regulated almost entirely at the state level. While state banking and securities regulators largely have been preempted by federal law in recent decades, Congress reserved to the states the duty of overseeing the “business of insurance” as part of 1945’s McCarran-Ferguson Act. On balance, we believe states have done an effective job of encouraging competition and, at least since the broad adoption of risk-based capital requirements, of ensuring solvency. As a whole and in most individual states, U.S. personal lines markets are not overly concentrated. Insolvencies are relatively rare and, through the runoff process and guaranty fund protections enacted in nearly every state, generally quite manageable. However, there are certainly ways in which the thicket of state-by-state regulations leads to inefficiencies, as well as particular state policies that have the effect of discouraging capital formation, stifling competition and concentrating risk. Central among these are rate controls.

For the third straight year, the report found that Vermont had the best insurance regulatory environment in the United States, receiving the only A+ score. Other states receiving either an A or A- were Arizona, Idaho, Illinois, Kentucky, Maine, New Hampshire, Utah and Wisconsin.

Meanwhile, North Carolina had the worst score, receiving a failing grade for the third year in a row. States receiving a D include Alaska, Massachusetts, California, Hawaii, Louisiana, Mississippi, Delaware, Montana, North Dakota and New York.

R Street found the most significant shift to be the continued expansion of North Carolina’s two property insurance residual market entities, even as Florida’s Citizens Property Insurance Corp.—previously the nation’s largest residual market entity—has continued to shrink.

“Not coincidentally, when R Street issued its first regulation report card in 2012, Florida ranked dead last and North Carolina was somewhere in the middle. This year, North Carolina is dead last and Florida is somewhere in the middle,” Lehmann wrote.

Driver Data: Advances in Innovative Exchange

With an innovation worthy of the digital age, the field of vehicle telematics is bringing auto manufacturers and insurance companies into sharper alignment. Now, data recorded in an individual vehicle can be “crunched” to yield insights about driving behavior—insights that can shed light on a driver’s risk category. In a further innovation, 2016 brought the establishment of a telematics data exchange, enabling risk managers to make use of this data with the consent of drivers.

Telematics data can potentially benefit consumers, fleet owners and insurers. Instead of relying on general information about a driver—age or gender, for example—insurers can write policies that address specific levels of risk supported by actual driving data (speed, acceleration, braking and time of operation). The elements are thus falling into place to tap telematics-derived data, with the added potential for higher fuel efficiency and better fleet vehicle performance.
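The risk segmentation described above can be sketched in a few lines of code. The following is a minimal, hypothetical example: the field names, thresholds and scoring rule are all invented for illustration and are not drawn from any insurer’s actual model.

```python
from dataclasses import dataclass

@dataclass
class TelematicsSample:
    """One recorded driving observation; fields are illustrative."""
    speed_mph: float   # speed at time of sample
    accel_g: float     # longitudinal acceleration (negative = braking)
    hour_of_day: int   # 0-23, local time

def risk_flags(sample: TelematicsSample) -> list[str]:
    """Flag behaviors commonly cited as risk indicators.
    Thresholds are made up for illustration, not industry standards."""
    flags = []
    if sample.speed_mph > 80:
        flags.append("speeding")
    if sample.accel_g < -0.4:
        flags.append("hard_braking")
    if sample.hour_of_day < 5 or sample.hour_of_day >= 23:
        flags.append("late_night")
    return flags

def trip_risk_rate(samples: list[TelematicsSample]) -> float:
    """Fraction of samples carrying at least one risk flag."""
    if not samples:
        return 0.0
    flagged = sum(1 for s in samples if risk_flags(s))
    return flagged / len(samples)

trip = [
    TelematicsSample(62, -0.1, 14),
    TelematicsSample(85, -0.5, 23),  # speeding + hard braking + late night
    TelematicsSample(40, 0.0, 9),
]
print(round(trip_risk_rate(trip), 2))  # 0.33
```

A real program would aggregate such per-trip rates over months of driving before feeding them into rating, but the basic shape—raw samples in, behavioral signal out—is the same.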

How do consumers and fleet owners benefit?

  • Rewards: Discounted insurance for drivers who present lower risk or drive fewer annual miles
  • Ease: Greater convenience, flexibility, and portability when shopping for auto insurance
  • Safety: Promotion of good driving habits
  • Savings: Insurers’ enhanced ability to segment risk types, potentially lowering premium costs for commercial fleet owners and managers

History of an idea

The seed for telematics was planted in the early 1960s, during a period when tensions between the United States and the former Soviet Union were escalating. That is also when the U.S. government, intent on national security and concerned about a potential nuclear threat, funded development of Global Positioning System (GPS) technology. Initially, GPS was intended for military and intelligence applications. By the early 2000s, telematics technologies were used in web-based fleet management systems that featured real-time information updates to remote networks. At that time, slow tracking rates limited data transmissions to one or two instances per hour. It wasn’t long, however, before GPS-based vehicle navigation systems flooded the consumer market.

Aligning value

In recent years, telematics has brought auto manufacturers and insurers into alignment, with both industries recognizing the potential of telematics. Automakers have found value in using telematics data to communicate information to car owners about their vehicle’s maintenance needs and performance and to convey information to consumers about their driving behavior, which could lead to safer driving. In turn, safer driving—such as fewer sharp turns and hard-braking incidents—could positively affect vehicle performance and fuel efficiency. And insurers have found a means to help better define risks.

Automakers also recognized that better fuel efficiency and less wear and tear (requiring less maintenance) could potentially save money for consumers, thereby reducing the total cost of car ownership.

Many insurers, too, quickly saw the inherent value of telematics data. Traditionally, insurers rate consumers on factors that typically include proxy data to predict an individual’s risk level, which helps determine rates. Some consumers may complain that not enough insight goes into the rating process. Telematics data, applied through usage-based insurance (UBI) programs, allows insurers to consider the details of individual driving behavior—which might lead to more accurate and customized pricing focused on individual behavior and performance. Insurers also recognize that using telematics data in underwriting can foster the perception that carriers are operating with greater transparency, and can give consumers a better understanding of their auto insurance expenses.
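As a toy illustration of how a UBI program might translate observed behavior into pricing, here is a hypothetical discount rule. The percentages and mileage threshold are invented for this sketch, not any carrier’s actual formula.

```python
def ubi_premium(base_premium: float, risk_rate: float,
                annual_miles: float) -> float:
    """Adjust a base premium using telematics-derived inputs.

    Hypothetical pricing rule, for illustration only:
    - up to a 20% discount as the observed risk rate approaches zero
    - an extra 5% low-mileage discount under 7,500 miles/year
    """
    discount = 0.20 * (1.0 - min(risk_rate, 1.0))
    if annual_miles < 7500:
        discount += 0.05
    return base_premium * (1.0 - discount)

# A low-risk, low-mileage driver on a $1,200 base premium:
print(round(ubi_premium(1200.0, risk_rate=0.05, annual_miles=6000), 2))  # 912.0
```

The point is not the particular numbers but the mechanism: the premium becomes a function of what the vehicle actually reported, rather than of proxy characteristics alone.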

Consumers could now examine their own driving data—data that likely overlaps with what their insurance company reviewed when establishing their rate in the first place.

Great leap forward

For some time, we’ve said that a telematics data exchange might represent the future of usage-based insurance. That future isn’t far away. Consider this: It is estimated that by 2020, more than 90% of all new vehicles sold in the United States will be able to connect to the internet. Today, about 5% of vehicles are so equipped. That is a powerful leap forward in terms of the data that will be available from connected cars.

This gives automakers the potential to capitalize on vast amounts of data collected by the connected cars they sell. Insurers can benefit by potentially enhancing their efforts to acquire and retain safer drivers and monitor their policyholders’ driving behavior and vehicle mileage.

There can be corresponding challenges related to such connected vehicle data, however. The volume of data from connected cars is enormous and growing. The hardware, software, and carrying costs needed to store and manage that data can run into the millions of dollars—a cost many insurers may find onerous. Automakers face their own set of issues, chief among them being the “many-to-many” problem: how to connect with hundreds of insurers that might be interested in accessing their data. While those are just a few of the multiple hurdles to overcome when harvesting exponentially growing stores of data, these are challenges that a telematics data exchange can help address. That is why the launch of the first data exchange marks such a critical milestone in the history of telematics.
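The “many-to-many” problem described above is, at its core, a schema-translation problem, which is what an exchange sitting between automakers and insurers can solve. The sketch below is a minimal illustration with invented automaker formats and field names; a real exchange would also handle consent, identity and volume.

```python
# Each automaker reports telematics in its own shape; the exchange maps
# them all onto one common record that any insurer can consume.
# Source names, raw formats and the common schema are invented here.

OEM_ADAPTERS = {
    "oem_a": lambda raw: {"vin": raw["VIN"],
                          "odometer_mi": raw["odo"],
                          "hard_brakes": raw["hb_count"]},
    "oem_b": lambda raw: {"vin": raw["vehicle_id"],
                          "odometer_mi": raw["odometer_km"] * 0.621371,
                          "hard_brakes": raw["events"]["hard_braking"]},
}

def normalize(source: str, raw: dict) -> dict:
    """Translate one automaker's record into the exchange's common schema."""
    try:
        adapter = OEM_ADAPTERS[source]
    except KeyError:
        raise ValueError(f"no adapter registered for {source!r}")
    return adapter(raw)

record = normalize("oem_b", {"vehicle_id": "1HGCM82633A004352",
                             "odometer_km": 16093,
                             "events": {"hard_braking": 4}})
print(record["hard_brakes"])  # 4
```

With the exchange maintaining one adapter per automaker, each insurer integrates once against the common schema instead of hundreds of times against proprietary feeds—which is why the launch of the first data exchange marks such a milestone.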