Heart Pacemaker Devices Found To Have Major Technology Security Risks

Security professionals have been talking for months about the dangers of smart devices, most of which are almost comically (and tragically) lacking in even the most basic security protocols. More recently, the global WannaCry ransomware attack demonstrated that smart medical devices were vulnerable as well, with several of them being temporarily shut down by the malware. But exactly how bad is the problem?

Here’s an interesting comparison:

This past week, Google’s Project Zero found a total of eight critical security flaws in Microsoft’s Malware Protection Engine. Microsoft considered this to be such a serious issue that they took the unusual step of issuing a patch outside their normal schedule to address them.

Now, compare that with the findings of security researchers from WhiteScope, who identified more than 8,600 security flaws in a line of smart pacemakers, most of them originating in third-party libraries.

It should be noted that not all of these flaws are considered critical, and the number spans seven different manufacturers. However, the sheer number underscores the difference in scope and scale, and the point is further driven home by looking at the way smart device manufacturers are responding to the report.

We’ve known since at least 2013 that the vast majority of smart devices being marketed and sold today are highly insecure, and yet, almost none of the equipment manufacturers have done anything about it. This latest report generated a response that was more of the same – almost complete disinterest.

That’s dangerous, because it sets the conditions for what amounts to a perfect storm.
Right now, there are people living all over the world who rely on smart medical devices to keep them alive. The day’s coming when a hacking attack will kill someone.

Granted, even if smart device manufacturers started taking security more seriously, that would still almost certainly happen at some point. Taking no meaningful action at all only hastens the arrival of that day.

Used with permission from Article Aggregator

Ransomware Hits Medical Devices; Security Becomes Even More Important

As much attention as the recent, worldwide "WannaCry" ransomware attack got for bringing the UK's health system to its knees and idling factories around the globe, it had another, less noticed, but no less important and terrifying impact.

An unnamed source recently released a screenshot of a “smart” medical device that had been locked and rendered inaccessible, thanks to the malware.

The device, a Bayer Medrad unit used in MRI imaging, is one of two devices known to have been compromised. The company assured the public that both devices saw functionality restored within 24 hours, but the event raises a pair of important issues.

First, “smart” devices don’t really deserve the name. Yes, they’re internet capable, but smart, they are not.

Worse, almost none of the smart devices being made and sold today have any protection or security at all. The few that do boast some sort of security offer only basic, bare-bones protections that any teenage hacker with a limited toolset could circumvent.

That brings us to the much larger and more ominous second problem. An increasing number of peoples’ lives literally depend on the proper functioning of these devices.
We have now entered an era where a computer virus can kill a human being.

Imagine being hooked up to a machine, without which, you may die. Now imagine that machine being infected by malware, with the hackers demanding hundreds of dollars to restore its functionality.

It’s no longer a question of if that will eventually lead to a death, it’s a matter of when. The worst part is that we could be doing much more to make those kinds of attacks harder, and we’re not. Thus far, the makers of smart devices have been largely uninterested in bolstering security on the products they sell, and one day, probably in the not-too-distant future, someone is going to pay with their life for the lack of foresight.


HIPAA Fines Continue; New Focus On Signed Business Associate Agreements

Last year, the Department of Health and Human Services made headlines by issuing more than a dozen hefty fines to big companies that deal with Protected Health Information for noncompliance with HIPAA regulations.

That trend has continued into 2017, but with a new twist. The agency is expanding the scope and scale of their investigations. It is now targeting companies, including much smaller firms, that contract out their document storage and disposal if they don’t have a Business Associate Agreement on file for the third-party vendor.

Their most recent fine was levied against the Center for Children’s Digestive Health, which is a small pediatric specialty practice based in Illinois. The company was hit with a $31,000 fine for exactly that reason. According to David Holtzman, Vice President of Compliance at the consulting firm CynergisTek:

“Covered entities and business associates have an absolute obligation to have a BAA in place with contractors and vendors who handle Protected Health Information when performing an activity or function on their behalf.”

What this comes down to is a matter of documented assurance. If your company deals in any way with Protected Health Information, then any third-party vendor that handles the data on your behalf must have a BAA on file. If you can’t produce a copy should HHS ask to see your records, you could face a stiff fine.

Depending on the size of your company, you may be able to absorb a $31,000 hit to your bottom line without missing a beat, but that amounts to an extremely expensive piece of paper. Given that there’s such an easy fix for this issue, the answer seems clear: be sure you’ve got a BAA on file for every vendor that comes into contact with any PHI your firm handles.


Health Care Employees Have Big Problems With Data Security

Over the past two years, the hackers of the world have begun to shift focus. Previously, their preferred targets had been big credit card companies, and all over the dark web, interested parties could find as many credit card numbers (including all relevant account information) as they were interested in purchasing.

Times and tastes change, however, and starting two years ago, a new focus was selected: health care providers, insurance companies and the like. Protected health information turned out to be, in many cases, more valuable than mere credit card data.

The market boomed, and to this day, health-related companies are disproportionately represented among hacking targets.

Worse, a recent study conducted by MediaPro indicates that health care professionals are woefully unprepared to deal with such attacks. Based on their findings, just 28 percent of healthcare professionals in the United States have the privacy training and security skills necessary to provide any meaningful assistance when it comes to preventing leaks or minimizing their impact.

The findings only get worse from there. Nearly one in five healthcare professionals (18 percent) were classed as being active risks whose lack of security awareness could actually increase the chances of a breach.

Another 54 percent were rated as “novices,” meaning they had only rudimentary knowledge and understanding in the key areas of acceptable uses of social media, cloud computing, understanding malware warning signs, phishing prevention and access controls.

Even more disheartening is the fact that 69 percent of health care organizations reported feeling more at risk than companies in other sectors. They know the problem exists but have thus far struggled to do anything meaningful to change the equation, despite the fact that almost two thirds (61 percent) of those companies have adopted best-practice security frameworks like NIST’s.

This is a problem with no easy fixes, and it’s going to get worse before it gets any better, because 2017 stands to be another record-breaking year in terms of high-profile security breaches. Buckle up, especially if you work in the industry. It’s going to be a bumpy ride.


Attack On Health Firm Speaks To Real Threat For Everyone

On March 22, Urology Austin became the second medical company to suffer a major breach. It was a particularly nasty one, combining data theft with a ransomware attack that impacted more than a quarter million users.

To its credit, the company took immediate, decisive action. It ensured that no one’s care was impacted and notified affected users, offering them a year’s worth of free credit and identity monitoring. Unfortunately, the breach underscored a weakness that often goes unnoticed.

When the company announced the breach, it indicated that the hackers were able to get their hands on a large amount of legacy data. That is, data pertaining to patients who had not received care from the facility in years, but which was still stored on company servers.
This raises a delicate and troubling question.

Once a company has completed a given course of treatment for a patient, how long should it keep the digital records? A year? Seven years? Forever?

There are currently no standards in place, but given how cheap bulk storage has become, companies have simply defaulted to keeping every scrap of data forever, and that can have unfortunate consequences, as this latest breach demonstrates.

Some security experts have put forth the notion that one thing companies could do if they’re inclined to keep data for extended periods of time is to “de-identify” it. That means they should strip out any and all data that could conclusively pair it to a specific individual, but keep the treatment data itself for research purposes.

This is not widely done anywhere at this point, but it’s an idea that makes sense, if companies insist on keeping data long term.
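For companies that do keep records long term, the de-identification step can be as simple as dropping the fields that directly identify a person while retaining the clinical data. Here is a minimal sketch in Python; the field names are hypothetical, and real HIPAA Safe Harbor de-identification covers eighteen categories of identifiers, so treat this as illustrative only:

```python
# Hypothetical set of direct identifiers. The HIPAA Safe Harbor method
# actually enumerates eighteen categories (names, geographic data smaller
# than a state, dates, contact details, SSNs, and so on).
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "dob", "phone", "email"}

def de_identify(record):
    """Return a copy of a patient record with direct identifiers removed,
    keeping the treatment data intact for research purposes."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {"name": "Jane Doe", "ssn": "000-00-0000",
           "diagnosis": "GERD", "treatment": "omeprazole"}
print(de_identify(patient))  # {'diagnosis': 'GERD', 'treatment': 'omeprazole'}
```

A real implementation would also have to consider quasi-identifiers (rare diagnoses, exact dates of service) that can re-identify a patient even after the obvious fields are gone.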

There are no easy solutions here. On one hand, the day will surely come when a company deletes some information only to find it has a pressing need for it and no way to recover it. On the other hand, a company may suffer a catastrophic breach that impacts years’, or even decades’, worth of data, with ruinous financial consequences.

How long do you keep client data, and do you have a well-defined policy that covers the subject? If you don’t, now is the time to change that.


Medical Offices Under Attack For Patient Records

The dark web has trends, just like the conventional web does. In years past, the hottest data for sale on the black market was credit card info, but that is now changing, according to the latest Trend Micro survey. The new hot property? Medical records, and Trend’s survey indicates that North American hospitals are more exposed to digital threats than anywhere else on the planet.

Canada is the overall winner, with fully 53% of its hospitals exposed, compared with 36% of their US counterparts. The company found more than a thousand expired SSL certificates in the US alone, which is the digital equivalent of a large neon sign to a hacker.
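An expired certificate is also one of the easiest problems to catch before a hacker does. A small monitoring sketch using only Python's standard library (the hostname and alert threshold are up to the operator; note that with default verification, a certificate that has already expired causes the TLS handshake itself to fail, which is detection enough):

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after):
    """Parse a certificate's 'notAfter' field, e.g. 'Jun  1 12:00:00 2030 GMT',
    into a timezone-aware UTC datetime."""
    dt = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def cert_days_remaining(hostname, port=443, timeout=5):
    """Return the number of days until a host's TLS certificate expires.
    Raises ssl.SSLCertVerificationError if the certificate is already
    expired or otherwise invalid."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            expires = parse_not_after(tls.getpeercert()["notAfter"])
    return (expires - datetime.now(timezone.utc)).days

# Example usage: alert well before expiry, e.g.
# if cert_days_remaining("hospital.example.org") < 30: send_alert()
```

Running a check like this on a schedule against every public-facing host turns "expired certificate" from a neon sign for attackers into a routine renewal ticket.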

As grim as those numbers look for the US and Canada, the picture doesn’t improve much in other parts of the world. For instance, nearly half of all NHS Trusts in England suffered a ransomware attack in the last twelve months, and even though the UK’s exposure is much lower than the US’s (less than 1%), they still suffered nearly a thousand data breaches last year, which speaks to the massive scope and scale of the problem.

Just how lucrative is the market for medical data on the dark web? As an example, a complete hospital database can sell for as much as half a million dollars.

What makes medical data so enticing for hackers is the sheer breadth of what it contains. The hackers can get dates of birth, social security numbers, addresses, billing and banking information, insurance information and a plethora of other financial details.

Basically, anyone armed with the data from a medical database would be able to digitally reconstruct your life and steal your identity. Worse, many of these pieces of data are unique and can’t be changed or reset if stolen.

If you work with PHI, be warned. The hackers are coming, and they’re coming for whatever data you’ve got.


Stolen USB Drive Gets Insurance Company $2.2 Million HIPAA Fine

The Department of Health and Human Services is off to a busy start in 2017, having just levied their second hefty fine for HIPAA violations. In this instance, the company in question was Puerto Rican insurer MAPFRE.

Back in 2011, MAPFRE had reported an ePHI incident involving a flash drive that was stolen from the company’s IT department, where it had been left in the open and unsecured overnight. The drive contained the names, social security numbers and other protected health information for slightly more than two thousand MAPFRE customers.

After properly reporting the breach, the company conducted a risk analysis and crafted an action plan designed to prevent such incidents from occurring in the future. It presented the plan to OCR, the HHS Office for Civil Rights, which enforces HIPAA, but then failed to implement the proposed procedures.

When a follow-up investigation revealed that the company had not taken the actions it had committed to in order to improve the physical side of its data security, OCR levied a staggering $2.2 million fine against it.

Last year was a record-setting year for the Department of Health and Human Services with thirteen settlements issued, plus a civil monetary penalty case. If January is any indication, the department looks like it’s going to have a busy 2017 as well.

Once again, the size of the settlement underscores the importance of taking HIPAA regulations seriously, but this particular case also brings to light the importance of taking timely corrective action. In this instance, MAPFRE did everything right except for the last step, when they failed to actually implement the changes they had committed to making. That proved to be a costly mistake. If your business deals with protected health information in any capacity, be sure it’s not one you repeat. Doing so could be far more costly than you realize!


Patient Information On Social Media Shows Need For Better Security

A very strange and disturbing case of the theft of protected health information has come from the New Hampshire Department of Health and Human Services (DHHS). The department recently reported on an incident that occurred in October 2015, in which a former psychiatric patient was able to access non-confidential information from a computer located in the hospital’s library.

The fact that a psychiatric patient (current or former) was able to access the information at all is disturbing enough, but there’s more to the story. The incident was observed by a member of the staff, who notified his supervisor, and the supervisor, to his credit, took steps to restrict the library computer’s access and put such information off-limits.

Unfortunately, while steps were taken, the incident was not reported to upper management in either the New Hampshire Hospital or DHHS. Not long thereafter, that same former patient posted non-confidential information on social media, which was when the hospital became aware that he had not only accessed, but also copied the information.

At this point, law enforcement and DHHS officials got involved and an investigation launched.

Unfortunately, the deeper they dug, the worse it got.

As it turns out, the former patient had also been able to access protected health information, which also wound up on social media. In all, nearly fifteen thousand DHHS clients had their personal information exposed, including names, addresses and social security numbers.

The information was removed just hours after it was discovered, but there’s no way to tell if anyone made copies during the brief window of time it was widely visible.

The criminal investigation into the matter is ongoing, and the hospital’s IT department has identified and eliminated the flaw that allowed the breach in the first place. However, this incident underscores just how easy it is to miss one small detail and open the door to a breach which could have serious consequences.


Watch Out For Emails Asking To Do A HIPAA Audit

There’s a new phishing email making the rounds that your firm needs to be aware of if you deal with protected health information (PHI) in any way and are subject to HIPAA rules and regulations.

The email is quite good, appearing for all intents and purposes to be an official communication from the Department of Health and Human Services, signed by its Director, Jocelyn Samuels.

There’s a surprising twist to this story, though. The email was not sent by hackers, but by a private company.

The text of the email indicates that the recipient has (possibly) been included in a HIPAA privacy, security and breach notification audit program currently underway by the OCR. In other words, it looks legitimate, and sounds just dire enough to prompt a click.

If you click the link contained in the email, however, rather than being taken to a government website, you’re taken to a company website, where you’re prompted to do business with them to ensure your compliance with all applicable HIPAA regulations.
It’s an underhanded tactic, taken right out of the hackers’ playbook, and the Department of Health and Human Services is not amused.

Director Samuels has released a formal statement saying that the matter is currently under investigation and, because of that, has declined to name the company responsible for sending out the emails. She stressed that any official communication from her department regarding audits would be sent from the department’s official email address; if a message comes from any other address, it’s not an official communication.
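One quick sanity check any recipient can perform is whether the message's From: address even belongs to the agency's domain. A sketch in Python (the hhs.gov domain and the sample addresses are assumptions for illustration; and since From: headers can themselves be spoofed, passing this check is necessary, not sufficient):

```python
from email.utils import parseaddr

# Assumption: genuine communications come from an hhs.gov address.
OFFICIAL_DOMAIN = "hhs.gov"

def looks_official(from_header):
    """Crude first-pass check: does the From: address end in the official
    domain? Deliberately strict, so subdomains such as ocr.hhs.gov would
    need to be allowed explicitly."""
    _, addr = parseaddr(from_header)
    return addr.lower().endswith("@" + OFFICIAL_DOMAIN)

print(looks_official('"HHS OCR" <audit@hhs.gov>'))         # True
print(looks_official('"HHS OCR" <audit@hhs-audits.com>'))  # False
```

Note the second example: lookalike domains that merely contain the agency's name are a favorite trick, which is why the check anchors on the full domain after the @ rather than searching for a substring.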

Navigating the maze of HIPAA rules and regulations can be difficult enough without companies resorting to hacker tactics to try to get your business.

If you have any questions or concerns about your company’s compliance, we’d be happy to assist. Give us a call and a member of our knowledgeable team will work with you to assess your current compliance status and create a strategy that will ensure you don’t run afoul of Director Samuels or her department.


Exiting Employees Should Be A Concern For Confidential Data Theft

A recently fired Expedia IT Specialist used his position to access the emails of company executives and make more than $300,000 in an insider trading stock scheme, according to investigators.

While the facts of the case are both interesting and alarming, the real story, and the lesson for all business owners, is the danger that your employees, both current and former, pose.
The simple truth is that no matter how much you spend on your company’s digital security system, it can all be undone by one careless (or disgruntled) employee. They are your firm’s greatest asset, and its biggest weakness.

Unfortunately, too many firms consider employee security training to be an afterthought at best, and what training employees get tends to be nonspecific, and according to many employees, not very helpful.

Obviously, the even larger concern here is the issue of employee terminations and how they are handled.

In the Expedia case, while the company’s network administrators were quick to disable his accounts, the employee retained the company-issued laptop for several months, and using it, he was able to access a variety of non-public information. This was how he continued to pilfer, and ultimately profit from, sensitive information.

This is a simple case of improper asset tracking. Had a thorough equipment audit been conducted when the employee was let go, the missing piece of equipment would have been quickly identified and collected.
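At its core, such an audit is just a reconciliation of assets issued against assets returned. A trivial sketch of the idea (the asset-tag values are made up for illustration):

```python
def unreturned_equipment(issued, returned):
    """Given the asset tags issued to a departing employee and the tags
    checked back in at offboarding, return the set still outstanding."""
    return set(issued) - set(returned)

# The phone came back; the laptop never did:
print(unreturned_equipment(["LT-1042", "PH-2210"], ["PH-2210"]))  # {'LT-1042'}
```

The hard part in practice isn't the set difference; it's keeping the "issued" side of the ledger accurate in the first place, which is exactly what failed here.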

Unfortunately, with so many companies using a BYOD (Bring Your Own Device) policy, this gets even more complex, because an employee could easily download a variety of proprietary information onto his or her personal device in advance of leaving. Unless there are protocols in place to check and prevent such actions, your company could be at serious risk.

The question, then, is whether you have measures in place to prevent a disgruntled employee from causing your company serious financial harm. If you’re not sure, contact us today, and one of our talented team members will be happy to review the current state of your digital security and make recommendations to keep your business safe.


Medical Devices Found To Have Major Security Risks

It’s official. The almost complete lack of security in the Internet of Things can kill you.

Researchers operating out of the University of Birmingham, working in cooperation with fellow researchers in Belgium, have recently published the results of an intensive study that has so far revealed critical vulnerabilities in ten different “smart” medical devices, all of them potentially life-threatening.

Insulin pumps can be remotely ordered to deliver a fatal dose. Pacemakers can be shut off entirely, or set to shock the heart constantly until the patient dies.

In addition to killing the people who depend on these devices, of course, the same bugs can be used to intercept and steal all manner of personal health information to sell to the highest bidder.

None of this should come as a surprise to anyone. We’ve known for quite some time now that most of the “smart” devices that currently constitute the Internet of Things lack even the most basic security protocols, which makes them notoriously easy for hackers to seize control of.

So far, the hackers have been content with simply enslaving these devices to construct huge botnets, like the one recently used to cripple much of the US internet for the better part of a day. That, however, is just the tip of the iceberg, and with more and more devices being added to the IoT every day, the problem is bound to get much worse before it starts to improve.

One of the chief reasons the problem won’t be going away anytime soon is the simple fact that device manufacturers have shown almost no interest in beefing up security on the devices they make and sell.

In large part, this is because they don’t have any financial interest in doing so. Once a device is sold, their association with it ends, leaving tens of millions of devices, and the broader architecture of the internet, vulnerable.

Sadly, it will probably take a few deaths before we start getting serious about IoT security, but by then, we’ll be facing an uphill battle, given how many internet objects are already in use, and how rapidly that number continues to grow.


Another Organization Gets Hit With Massive HIPAA Fine

The University of Massachusetts’ Amherst campus just learned a hard and incredibly expensive lesson about how serious the Department of Health and Human Services is about cracking down on HIPAA noncompliance.

In the 13th high-profile fine of the year, the university was hit with a staggering $650,000 penalty. Based on agency comments, it could have been, and would have been, much higher, but “…the university operated at a financial loss in 2015.”

What makes that statement all the more terrifying are the key facts surrounding the case, which are:

• The University voluntarily reported that a satellite office and language center was the subject of a generic malware infection designed to collect data and send it back to the hackers who own the code.
• They could not verify or rule out whether any protected health information had been put at risk during the period when the language center was infected.
• The malware would have put just 1,700 health records at risk.

What the case comes down to is that although the language center was a satellite office, it was still possible to access PHI from that location. Even so, the university did not treat the satellite office as part of the network required to be in compliance with HIPAA rules.

In addition to the fine, the university has agreed to a corrective action plan to help ensure that a similar incident does not happen in the future. This once more underscores just how seriously the government has begun to take HIPAA noncompliance.

If your firm is in any way involved with or connected to protected health information, it’s extremely important that you conduct a comprehensive review to ensure that you are fully compliant in order to avoid a fine like this one.

If you’re not sure and you need extra help, then don’t hesitate to contact us. One of our talented team members would be happy to work with you to analyze your compliance risks and work out a plan of action to ensure that your firm isn’t the next headline.
