FDA Clears AI Applications to Assist Radiologists

Artificial intelligence (AI) applications in radiology are being developed by the established imaging companies and a host of new companies. Many of the recently FDA-cleared applications focus on the radiologist, aiding image analysis or improving workflow.

The major imaging companies both develop AI applications, often to improve scanner image acquisition, and incorporate third-party solutions. Some companies offer a platform to host AI applications from other developers once those applications have been cleared by the FDA, either via the 510(k) process or the De Novo process.

Applications to assist radiologists fall into three categories: workflow improvements, clinical decision support, and image interpretation. Workflow improvements include providing relevant patient summaries extracted from the EHR, ordering worklists to place the most critical preliminary findings first, and volume segmentation with automatic labeling. Image interpretation provides identification of suspect lesions and nodules.
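To make the worklist-ordering idea concrete, here is a minimal sketch, assuming a hypothetical AI-assigned urgency score per study; the field names and values are illustrative and not taken from any cleared product.

```python
from dataclasses import dataclass

@dataclass
class Study:
    accession: str        # hypothetical accession number
    exam: str             # exam description
    ai_urgency: float     # 0.0 (routine) to 1.0 (critical), assigned by an AI triage algorithm
    minutes_waiting: int  # time since the study arrived

def prioritize(worklist: list[Study]) -> list[Study]:
    """Place studies with the most critical AI findings first;
    break ties by how long the study has been waiting."""
    return sorted(worklist, key=lambda s: (-s.ai_urgency, -s.minutes_waiting))

worklist = [
    Study("A1001", "Chest X-ray", ai_urgency=0.15, minutes_waiting=40),
    Study("A1002", "Head CT",     ai_urgency=0.92, minutes_waiting=5),   # suspected hemorrhage
    Study("A1003", "Chest CT",    ai_urgency=0.60, minutes_waiting=25),  # possible nodule
]

for study in prioritize(worklist):
    print(study.accession, study.exam, study.ai_urgency)
```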

The accompanying table presents some AI algorithms cleared by the FDA as of October 2018, many of which received clearance in 2018. Please notify us of other FDA-cleared AI applications at info@medtechcon.com.

Some companies have developed AI platforms to host AI algorithms from third-party developers. Envoy AI offers a platform hosting algorithms from developers, who can use it to test and refine their applications. Once cleared by the FDA, the algorithms are available for providers to evaluate and incorporate into their practices.

Nuance has introduced its AI Marketplace to host machine learning and deep learning algorithms for image analysis, workflow optimization, clinical decision support, and other radiology applications and use cases. Vendors can host their applications and, once providers subscribe, receive feedback to improve them.

Philips has developed the HealthSuite Insights platform, which is available to healthcare providers for developing AI algorithms. The platform also hosts third-party and Philips AI algorithms for testing and further development.

The Siemens Healthineers Digital Ecosystem presents a variety of digital solutions, including AI algorithms, in a store where users can purchase and deploy the offerings either locally or in the cloud. The Ecosystem includes clinical, operational, and financial solutions in healthcare delivery.

Carestream's Vue Clinical Collaboration platform includes Carestream's own AI developments. One example is the use of AI to triage exams, identifying studies with urgent findings and moving them to the top of the worklist. In addition, the platform integrates algorithms from other vendors.

Agfa HealthCare has integrated AI into its Enterprise Imaging solution, embedding algorithms into clinical workflows. The algorithms include Agfa's own developments and those from other industry vendors and care providers.

Change Healthcare has been incorporating AI solutions from other vendors. GE incorporates other vendors' algorithms in addition to developing its own. Many of GE's AI developments have focused on improving the performance of its scanners, e.g., CT, ultrasound, and MRI.

Interventional Cardiology in Transition

Aging baby boomers, new clinical therapies, and evolving regulations are increasing the workload of interventional cardiologists. Do these three factors represent a perfect storm for cardiology?

If so, let’s hope that cardiology weathers its storm better than the crew of the Andrea Gail fishing trawler did in the “perfect storm” of 1991, as depicted in the 2000 movie. Instead of a combination of meteorological conditions, cardiology’s brewing storm results from changing demographics, added clinical applications, and evolving regulatory requirements.

As the baby boomers age into retirement, they are also entering their peak years for cardiovascular disease. Cardiologists are entering retirement as well, resulting in a shortage of specialists to deal with the increased influx of patients.

The last ten years have seen major advances in less invasive treatments for major types of cardiac disease. Most notably, heart valve replacement and ablation therapy for atrial fibrillation are now routinely performed by interventional cardiologists. The new therapies are further increasing the demand on interventional cardiologists and on cardiac catheterization laboratories.

As if the increase in demand driven by the aging population and new interventional therapies were not enough, the healthcare system is in a major sea change. The payment system starts shifting from fee-for-service to value-based care in 2017, a change that will play out over the next few years.

The rules for the new payment systems are evolving, resulting in increased demands on providers for additional documentation of clinical decisions and procedures. Providers performing better than average will receive increased Medicare reimbursements, and those performing below average will see decreased reimbursements.


Interventional cardiologists are just starting to deal with a larger patient base, more of whom will be treated in the catheterization laboratory, and will be working with an evolving payment system requiring more documentation.

Changing Population Demographics
The baby boomers are not only moving into their retirement years but are also moving into their peak cardiac disease years. The US Census Bureau projects that the 65-and-older population will grow from 43.1 million in 2012 to 72.8 million in 2030, a growth of nearly 70%.
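For reference, that growth figure follows directly from the two Census projections:

\[
\frac{72.8 - 43.1}{43.1} \approx 0.689 \approx 69\%
\]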

The American Heart Association’s 2015 “Heart Disease and Stroke Statistics” update shows a remarkable increase in cardiovascular disease in the over-60 population in the US.

The changing demographics of the population are reflected in the interventional cardiologist workforce as well. The 2014 MedAxiom survey showed 34% of the interventional cardiology workforce to be over 59 years old, with a median age of 54.

Estimates of the shortage of interventional cardiologists vary, but the rate of retirement has not been balanced by growth in fellowship positions. Reasons cited for the projected shortage include the changing demographics, the increased demand from new therapies, and the lack of growth in fellowship positions.

All estimates agree that in ten years there will be too few interventional cardiologists to meet the need.

Cardiovascular disease is increasingly being treated in cardiac catheterization laboratories. Minimally invasive techniques have been developed and proven for disease once treated surgically, if at all.

Clinical Therapy
For various types of heart disease, the preferred treatment has moved from open heart surgery to minimally invasive techniques, either minimally invasive surgery or therapy delivered via a catheter (transcatheter), similar to angioplasty or stent placement. In some cases, a combined minimally invasive surgical approach and transcatheter therapy provides the best results.

For example, in multi-vessel coronary artery disease, a minimally invasive surgical technique is most effective for a particular artery (the left anterior descending coronary artery), while stents inserted with a catheter by an interventional cardiologist are more effective in the other vessels. Both procedures are performed in a “hybrid” lab, a combined surgical and cardiac catheterization suite.

Similarly, ablation therapy used to treat atrial fibrillation may be a combined surgical and catheter-based procedure, with each approach better suited to accessing different areas of the heart.

More recently, heart valve replacement surgery is being replaced with transcatheter procedures performed by interventional cardiologists. This started with pulmonary valve replacement for pediatric patients, followed by aortic valve replacement, and now mitral valve replacement for adults. All of these new therapies create an increased demand on interventional cardiologists.

As the types of interventional procedures have increased, so have the reporting requirements. For example, the FDA requires that all transcatheter valve procedures be documented and submitted to a CMS (Centers for Medicare & Medicaid Services) approved registry that tracks procedures and outcomes.

The Transcatheter Valve Therapy (TVT) Registry is a joint collaboration between the American College of Cardiology (ACC) and the Society of Thoracic Surgeons (STS). It is the only registry approved by CMS for transcatheter valve replacement reporting.

In addition to the TVT Registry, the ACC maintains nine additional registries for various types of cardiovascular transcatheter interventions. These include the CMS-mandated ICD Registry for implantable cardioverter defibrillator (ICD) patients and the CMS-mandated LAAO Registry for left atrial appendage occlusion procedures.

Most facilities elect to participate in the non-mandated transcatheter registries as well as those mandated by CMS. The other registries are employed for quality control and provide outcomes data to insurance companies.

Regulatory
The healthcare system is moving from fee-for-service to value-based payment models. Much of this change is driven by the Medicare Access and CHIP Reauthorization Act (MACRA), which goes into effect starting in 2017.

MACRA’s Quality Payment Program implements two payment options: MIPS and APMs. The Merit-Based Incentive Payment System (MIPS) is a complex pay-for-performance system combining previous programs: the PQRS (Physician Quality Reporting System), the VBPM (Value-Based Payment Modifier), and MU (the Meaningful Use EHR Incentive Program). MIPS also adds an additional measure: Clinical Practice Improvement Activities. Medicare reimbursement is adjusted based on a weighted average of these four components.
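As a rough sketch of how such a weighted composite works, assuming illustrative category weights (the actual CMS weights are set by rule and changed year to year; they are not taken from the statute):

```python
# Illustrative sketch of a MIPS-style composite score.
# The weights below are assumptions for illustration only.
CATEGORY_WEIGHTS = {
    "quality": 0.60,                     # descended from PQRS
    "cost": 0.00,                        # descended from the Value-Based Payment Modifier
    "advancing_care_information": 0.25,  # descended from Meaningful Use
    "improvement_activities": 0.15,      # the new Clinical Practice Improvement Activities category
}

def composite_score(category_scores: dict[str, float]) -> float:
    """Weighted average of the four category scores (each 0-100)."""
    return sum(CATEGORY_WEIGHTS[c] * category_scores.get(c, 0.0)
               for c in CATEGORY_WEIGHTS)

scores = {"quality": 85, "cost": 0, "advancing_care_information": 70, "improvement_activities": 90}
print(f"Composite score: {composite_score(scores):.1f} / 100")
# A positive or negative Medicare payment adjustment is then derived from
# where this composite falls relative to a performance threshold.
```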

The Alternative Payment Models will encompass a variety of shared-risk programs such as Accountable Care Organizations (ACOs) and Medical Homes. These models are few to start with, and more are under development.

Most physicians will fall under the MIPS payment plan initially. However, the program’s intent is to move everyone to an APM over time.

Payments under MACRA start in 2019, based on data submitted in 2017 and 2018. Supposedly there is more flexibility in the program in 2017, when data submission starts, than in the following year. All of the measures under both payment plans must be carefully documented and reported.

Some of the documentation is taken care of by participation in the clinical registries, most notably the ACC’s NCDR (National Cardiovascular Data Registry) suite of cardiovascular registries. However, some of the measures require additional documentation and reporting.

One of these measures is the Appropriate Use Criteria (AUC) for imaging exams and associated therapies. The AUC include criteria for transcatheter procedures performed by interventional cardiologists. A deliberate assessment is difficult to make when a patient is coming from the emergency room with a serious cardiac event, where “time is muscle” and every minute counts. The ACC AUC definitions are careful to state that a score of “rarely appropriate care” for an angioplasty and stent does not mean that the procedure should not be undertaken in specific cases. However, “exceptions should have documentation of the clinical reasons” for proceeding.

An additional complicating factor is bundled payments, where the hospital is paid for an episode of care that includes not just the inpatient stay and associated interventional procedures for a cardiac event but also any related services for 90 days after discharge. The rationale is that higher quality of care results in fewer post-procedure complications.

As most cardiologists are now hospital employees, hospital administration will be watching these events very closely. All of these changes result in increased responsibilities for the interventional cardiologist and a measure of uncertainty as the policies evolve. This environment may also lead to earlier retirement of older cardiologists.

3D Printing Expanding in Medicine: Implications for Radiology

3D printing, or more accurately additive manufacturing, is quickly finding more and more medical applications. Several of these applications are in radiology. Radiologists are taking note, as evidenced by the sold-out “Fundamentals of 3D Printing” session on Sunday morning at RSNA 2014.

At this session, the team from the 3D Medical Applications Center at Walter Reed National Military Medical Center reported three areas of application for 3D printing.
1) Medical models for use in surgical planning, patient education and consent, and as a pre- and post-op record.
2) Virtual surgery to actually perform the technique on a model.
3) Device design to create custom implants and surgical tools.

Full-size models for use in surgical planning and reference during surgery have proved so successful that orthopedic surgeons at Walter Reed use them for all surgeries.

Frank J. Rybicki, M.D., radiologist and director of Brigham and Women’s Applied Imaging Science Laboratory, reported on the lab’s experience with 3D printed models to plan and perform face transplantation procedures. The models were so successful that they are now a mandatory step in surgical planning for these procedures. (RSNA Press Release)

Cleveland Clinic has also used 3D printed models in face transplantation. In addition, the clinic employs 3D printed models in other surgeries, including complex liver surgeries. No two livers are exactly alike, and the damaged areas of the liver need to be removed without damaging the inner blood vessels while keeping the healthy areas of the liver intact. Surgeons study the 3D models, which include the inner vessels, prior to surgery and have the models available in the operating room during the procedure. (Cleveland Clinic Makes Surgery More Personal)

Over the last two years, the use of 3D printed models has spread rapidly at Children’s Hospitals. These include the Children’s Hospital of Illinois, Boston Children’s Hospital, New York-Presbyterian Morgan Stanley Children’s Hospital, C.S. Mott Children’s Hospital, Miami Children’s Hospital, Phoenix Children’s Hospital, Texas Children’s Hospital, Lurie Children’s Hospital of Chicago, Children’s National Medical Center, Children’s Hospital of Philadelphia, and Kosair Children’s Hospital in Louisville among others.

A common application at children’s hospitals is the use of 3D printed hearts in surgical planning. Dr. Matthew Bramlet, a pediatric cardiologist at Children’s Hospital of Illinois in Peoria, is taking it a step further. He has started a “library” of 3D printed hearts as a teaching tool. (“Library of Hearts”)

Traditionally, physicians have used pathologic libraries, but the hearts in these libraries have started falling apart. Due to the cost and difficulty of acquiring replacements, the pathologic libraries are closing. Dr. Bramlet has put out a nationwide call for pre- and post-op MRI and CT scans of congenital heart disease in patients of all ages. The online library will be housed at the Jump Trading Simulation and Education Center, which is partnering with the NIH 3D Print Exchange.

Boston Children’s Hospital added 3D printed models to its simulation program in early 2014. They have been rapidly adopted for surgical planning in many different types of surgery, including cerebrovascular surgery. In the first year, over 100 3D models were printed. Following successes such as the intervention in a case of infantile spasms, 3D printed models are becoming the standard for every patient in the Cerebrovascular Surgery and Interventions Center. (Doctor turns to 3D Printers)

Currently, 3D printing activity is centered in academic centers and specialty institutions, especially children’s hospitals. Typical 3D printed models cost in the $100 – $1,000 range; sophisticated models can cost much more, and there is also the cost of the labor required to process the data prior to printing. The capital investment is usually not large compared to imaging equipment costs. However, with no reimbursement, 3D printed models are not expected to become a service offered by most radiology practices in the near future.

One of the few radiology practices offering 3D printing as a service is Spectrum Medical Imaging in Sydney, Australia. Data is being gathered to bolster the anecdotal claims of improved surgical outcomes and shortened OR times. Patient demand may also play a role as patient-centered care takes hold, helped by popular press stories such as “Man saves wife’s eyesight by 3D printing her brain tumor”.

Many practitioners currently employing 3D printing believe that it will become commonplace in radiology practices. Rajesh Krishnamurthy, MD, director for research and the cardiovascular imaging program at Texas Children’s Hospital’s EB Singleton Department of Pediatric Radiology says that in addition to the other advantages, it makes a huge difference to patient and care team comprehension of the procedure. Frank J. Rybicki, M.D., radiologist and director of Brigham and Women’s Applied Imaging Science Laboratory says that there is no doubt that 3D printing will be part of radiology practices. (“Ready to Hit Print?”)

Developing an Enterprise Imaging System Plan: First Steps

Looking beyond radiology and cardiology to build a system to collect, index, manage and communicate images of all types throughout the enterprise is a large and complex undertaking. Key steps in developing a plan are to size the project, establish a governance structure, and lay out a roadmap. Determining the magnitude of the Enterprise Imaging Project provides information that will aid in developing the strategy, the governance structure, and the roadmap.

To size the project, first identify the image-generating departments in the enterprise, such as respiratory care, otolaryngology, ophthalmology, pathology, dermatology, wound care, sleep labs, and many others. One institution identified 60 image-generating departments, some at the outset of the project and others during its course. Even now, years into the project and with half of the departments on the enterprise system, they report that new requests keep coming in.

For each of these departments, basic information about the images needs to be collected.
   • Image type: monochromatic/color; still; motion; motion with audio; motion with waveforms
   • Image sizes and frame rates (if motion)
   • Acquisition device: technology, manufacturer, model
   • Where images are acquired: location; mobile
   • Current storage technique and device: digital, paper, thermal printer, analog video, film, or none
   • Image formats: standards employed
   • Study sizes
   • Study volumes
   • Associated metadata: currently acquired; needed to be acquired
   • Archive requirements: medical and legal; all images or key images; how long
   • Viewing needs: identify the users; specialized processing needs; mobile needs

In addition, the image-sharing needs of each area should be identified, covering both import and export of images to and from the enterprise.
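One way to capture this inventory is as a simple record per department; the sketch below assumes hypothetical field names mirroring the checklist above rather than any standard schema.

```python
from dataclasses import dataclass

@dataclass
class DepartmentImagingProfile:
    """Hypothetical per-department inventory record for sizing an enterprise imaging project."""
    department: str
    image_types: list[str]            # e.g. "color still", "motion with audio"
    acquisition_devices: list[str]    # technology, manufacturer, model
    acquisition_locations: list[str]  # fixed locations or "mobile"
    current_storage: str              # digital, paper, thermal printer, analog video, film, none
    image_formats: list[str]          # standards employed, e.g. DICOM, JPEG, MP4
    avg_study_size_mb: float
    studies_per_year: int
    metadata_captured: list[str]      # metadata currently acquired
    metadata_needed: list[str]        # metadata still to be acquired
    retention_years: int              # medical/legal archive requirement
    keep_all_images: bool             # all images vs. key images only
    viewers: list[str]                # who views the images, incl. mobile/specialized needs
    sharing_needs: str = ""           # import/export of images to and from the enterprise

# Illustrative entry (all values are assumptions):
wound_care = DepartmentImagingProfile(
    department="Wound Care",
    image_types=["color still"],
    acquisition_devices=["handheld camera"],
    acquisition_locations=["clinic", "mobile"],
    current_storage="digital (local folders)",
    image_formats=["JPEG"],
    avg_study_size_mb=12.0,
    studies_per_year=4000,
    metadata_captured=["patient name", "date"],
    metadata_needed=["accession number", "body site"],
    retention_years=7,
    keep_all_images=True,
    viewers=["wound care nurses", "referring physicians"],
    sharing_needs="export to referring providers",
)
```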

Once the generators and consumers of the images have been identified, a governance structure can be proposed. This step can be delicate since it is moving imaging beyond cardiology and radiology to encompass the enterprise and may involve silos of image storage in other departments as well. To succeed, the governance structure needs to be broader than a single, image-generating department. Many organizations have established an Enterprise Imaging department and pulled in personnel from existing PACS support teams to staff it. A multi-disciplinary physician advisory group is important to provide guidance to the program and to help communicate the program to the enterprise before and during roll-outs.

Criteria for an implementation roadmap should be established and reviewed with the governing body. Consideration needs to be given to the enterprise strategic goals, the volume and nature of the requests for enterprise image access, and plans to replace existing PACS equipment. Often a first step in the roadmap is to bring in the established enterprise disciplines of radiology and cardiology.

The next departments to be added to the Enterprise Imaging system may be the low-hanging fruit, that is, those that are already digital, interfaced with ADT, and perhaps even DICOM-compatible. Some enterprises have also required a physician champion in a department before adding that department to the roadmap. The roadmap is not usually complete at the outset of the project and evolves as the project progresses. One large healthcare system that has been developing its enterprise imaging system for several years has reported that no one can say when it will be done; as the project progresses, new areas emerge with image storage and communication needs.

Once the strategy is set, the size of the project estimated, and a roadmap in place, workflow analyses can be performed, indexing strategies started, system architectures proposed and analyzed, and schedules and budgets developed for the initial phase. Communication and promotion of the Enterprise Imaging System can then proceed. Integration of the enterprise imaging system with the EMR broadens access to all images and will benefit providers and quality of care both inside and outside the enterprise.

Look at the Dark Side of the Cloud Before Using it for Archiving Images

Introduction
The economy of scale of cloud services has long drawn the attention of health system CIOs looking at medical image storage. Now that enterprise image archives are coming, CIO interest in the cloud has increased, as has the number of companies offering cloud services to healthcare. When considering the cloud, it is important to look at the associated risks and take measures to mitigate them.

Security concerns about the cloud have prevented many healthcare organizations from signing up, and the new HIPAA rules make security an even bigger issue. Moving to the cloud can also mean giving up control of the image data since it is on someone else’s hardware. Service outages are another issue to be aware of and retrieving the data upon termination of the service can be problematic as well. Current users of the cloud have run into all of these problems. Healthcare providers can take advantage of their experiences.

Service Outages
All cloud services experience outages, and Service Level Agreements (SLAs) are often carefully written to exclude specific portions of the hardware and software to limit the vendor's liability. In the first three months of 2013, Microsoft, Google, and Amazon, all of which offer major cloud storage services, had significant outages.

Microsoft’s Azure cloud storage service went down for 12 hours in February, 2013. The Google Drive cloud storage was down for 17 hours in March, 2013.  Amazon Web Services was down for almost an hour in January, 2013.  In December, 2012 the Amazon service was down for 24 hours.  In total, Amazon had 4 multi-hour outages in 2012.

In most cases, all the data affected in these outages was recovered. Although recovered, the data was often unavailable for some time after the outage.

Reasons for the outages vary.  Often the outage is due to an update of hardware or software in the network or servers that went awry.  Hardware failures also occur as data centers are pushed to ever increasing power densities.

Users of cloud data storage need to have contingency plans for outages. Cloud services offer redundant storage options, including storage in multiple data centers or availability zones. Not only do these options come at additional cost, they are not fail-safe either; sometimes the switchover to the other center takes time to occur or doesn’t happen at all.

Data Loss
Losing control of one’s data can lead to losing the data as well. Millions of users of Megaupload’s file sharing service found this out in January 2012 when the FBI shut down the Megaupload web site and seized the servers leased by Megaupload from a cloud hosting service in Virginia.

The servers were seized and the site shut down due to copyright violations involving music and movies stored on the servers. The fact that millions of files were legitimate did not matter, since they were commingled with the pirated files and could not be separated out.

As the case meandered through the justice system, the files remained frozen. The Dutch hosting service for Megaupload had never received a request to save the data. Thus, in February 2013, the Dutch hosting service decided to re-provision 630 servers and deleted all the Megaupload data.

In the United States, the Department of Justice established a process for users to regain their data. It was so onerous and lengthy that few users were able to recover their data. As of October 2013, the hosting service in the US was told that the files were no longer needed and could be destroyed. However, the data could not be returned to its legitimate owners, even though an independent analysis demonstrated that the majority of the files were not pirated.

The Megaupload users learned that putting data into the cloud means losing control of the data. They had no control or knowledge of where it was physically stored or what other data was on the same servers. In the end, they no longer even had access to the data.

Amazon, Microsoft, and other major cloud service companies develop and control their own data centers for their cloud services. In addition, to maintain growth and handle spikes in demand, Amazon, Microsoft, and other companies lease additional capacity from other hosting services. The ultimate owner of the hardware has the most control of the data and it is important to know who this is. The practice of leasing has implications for HIPAA compliance as well.

Security
Two of the motivating factors behind the development of the Internet by ARPA were to have a decentralized network and to enable resource sharing. Any two servers on the network could connect over multiple paths as opposed to a single, fixed point connection. Any attack that took out one path would not disrupt the communication.

As data and services move to large cloud services, the Internet is being re-centralized. One effect of this centralization is that there are fewer but larger points of failure, so one cloud service having an outage can bring down dozens of web services.

The cloud presents fewer and richer targets for hackers. In March 2013, Evernote was hacked, and the user names, email addresses, and encrypted passwords of all its users were accessed. In 2012, Dropbox, a file sharing and backup service, was similarly hacked. One of the more extreme examples was in 2011, when Sony’s PlayStation Network had 77 million accounts compromised.

For healthcare providers considering the cloud for medical image storage, the new HIPAA rules, enforced as of September 23, 2013, make security an even greater concern. The healthcare provider and the HIPAA Business Associate are both responsible if the Business Associate fails an audit or commits a breach. Over 20% of the reported data breaches since 2009 have been caused by Business Associates.

In addition, providers are responsible for ensuring that any subcontractors a HIPAA Business Associate uses are also compliant. Thus, the cloud vendor’s data center must have a risk assessment and be able to pass a HIPAA security audit, and so should any hosting service that the cloud vendor employs. As part of investigating cloud storage, healthcare providers need to know the locations of all data centers employed, the company owning the servers, and the company operating the servers, and must examine the security risk analysis done by each entity.

The security risk analysis must be kept current. This means that any change in the systems storing or transferring the images by the cloud vendor, or by its subcontractors and their subcontractors, requires an update to the security risk analysis to reflect any changes in risk.

Data Migration
There are data migration implications in the cloud just as anywhere else. Someday one may wish to change services or leave the cloud, or, as happened recently, the cloud could leave you.

Nirvanix was the cloud hosting company behind IBM’s SmartCloud Storage service, among other services. In mid-September 2013, Nirvanix told its customers that, due to a failed funding round, it would be closing by the end of the month and customers should migrate their data within two weeks. IBM was not commenting, and Aorta Cloud, another large company using the Nirvanix service, announced that it had contingency plans for its clients but could not help other large Nirvanix customers.

Nirvanix ended up staying open until October 15th. Nirvanix partnered with IBM, CoreSite, and HP to get the data out and offered customers the option of either returning their data or transitioning to another service such as Amazon, Microsoft, or Google. No official notice was given on how long the partners could keep the Nirvanix servers up and data transfer going.

The more common need for data migration is to change services or move to a different storage paradigm. A common practice is for the user to transfer all the data prior to terminating the service. Downloads are charged per gigabyte transferred. To speed up the process, some cloud services offer to bypass the Internet by either transferring the data to a portable storage device or providing a high-speed direct connection, at additional charge.

A cloud user’s data may be deleted immediately upon termination of the services. It is important to recover all data prior to termination, and it is equally essential to have the data format, transfer method, and time frame agreed upon in the SLA. There have been reports of data being returned encrypted on media that required special hardware to read.

Performance
With 1 Gbps and 10 Gbps networks common in healthcare systems, accessing images over the Internet will not be as fast as accessing images over the internal network. A PACS system getting a prior exam from the local cache may take a few seconds. Getting the same exam from the cloud depends on the provider’s connection to the Internet and how much of that bandwidth is available, that is, how many other applications are accessing the Internet at the time. It also depends on the Internet latency, which is a function of the path the data takes. The same exam could take minutes to transfer instead of seconds.

Amazon Web Services offers a direct connection to the cloud that bypasses the Internet. Options are available up to 10 Gbps, at a per-hour connection cost and a per-GB transfer cost. Even with the direct connection, the exam transfer will still be slower from the cloud than on site, depending on what format conversions are necessary on the cloud servers.
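A back-of-the-envelope comparison illustrates the gap; the study size, bandwidth shares, and overhead factor below are assumptions chosen for illustration.

```python
# Rough estimate of retrieval time for a prior exam; all inputs are illustrative assumptions.
def transfer_seconds(study_size_gb: float, link_mbps: float, share_of_link: float = 1.0,
                     overhead: float = 1.2) -> float:
    """Time to move a study over a link, allowing a protocol/latency overhead factor."""
    usable_mbps = link_mbps * share_of_link
    return (study_size_gb * 8 * 1000) / usable_mbps * overhead

study_gb = 0.5   # a 500 MB CT prior (assumption)

lan      = transfer_seconds(study_gb, link_mbps=1000)                     # local 1 Gbps PACS network
internet = transfer_seconds(study_gb, link_mbps=500, share_of_link=0.10)  # 10% of a shared 500 Mbps pipe
direct   = transfer_seconds(study_gb, link_mbps=1000, share_of_link=0.50) # dedicated direct connection

print(f"LAN:      ~{lan:.0f} s")        # a few seconds
print(f"Internet: ~{internet/60:.1f} min")  # minutes rather than seconds
print(f"Direct:   ~{direct:.0f} s")
```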

Conclusion
A business continuity plan should be in place to assure that images required as priors or for use in procedures are available during a cloud outage. This plan may include increasing the size of the onsite image cache. To determine the size of the cache, each discipline using the images must determine how long immediate image access is essential. For radiology it may be a period of years, while for wound care it may be a period of months. The size of the cache should also take performance into account: how old can the data be before each discipline can afford to wait for images to be retrieved from the cloud, and how long can they wait?
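A simple sizing model for the onsite cache follows from each discipline's study volume, study size, and how long images must stay immediately available; the figures below are illustrative assumptions, not recommendations.

```python
# Illustrative onsite-cache sizing: studies/day x study size x days of required immediate access.
disciplines = {
    # discipline: (studies per day, average study size in GB, days images must stay local)
    "radiology":  (400, 0.25, 3 * 365),   # priors needed for years (assumption)
    "cardiology": (60,  1.50, 2 * 365),
    "wound care": (30,  0.01, 90),        # months rather than years (assumption)
}

cache_gb = sum(per_day * size_gb * days for per_day, size_gb, days in disciplines.values())
print(f"Estimated onsite cache: {cache_gb / 1024:.1f} TB")
```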

To protect against data loss, a disaster recovery system should be in place. This could employ a different cloud vendor, but one should be careful to verify that the two cloud vendors are not sharing the same hosting service. A better approach may be to use a data center that is off site and under the control of the health system. The disaster recovery system may be planned to avoid the issues associated with data migration, should one decide to change cloud vendors or if the cloud leaves you suddenly.

Security issues will require the provider to carefully vet the cloud service and negotiate the SLA. The cloud service needs to be compliant with the HIPAA regulations as they are now, not as they used to be. Ask the cloud service how often it updates its security risk analysis; if the answer is based on the calendar, e.g., once a year, there is a major problem. At that point the healthcare provider needs to assess how much time it can spend educating the cloud service and whether it should be considering alternatives.

The process for notification of breaches needs to be carefully spelled out in the SLA. The healthcare provider is responsible for notifying patients within 60 days and needs time to do so. This may mean that the cloud service needs to notify the healthcare provider within 10 days.

Once all these measures are planned, the costs and risks associated with archiving medical images in the cloud may be reconsidered. It may not be as inexpensive as first thought. Contingency plans and well written SLAs are a must.

References

Outages
The worst cloud outages of 2013 (so far)
Cloud Computing – Outages: Analysis of Key Outages 2009-2012
The Most Recent Amazon Outage Exposes the Dark Side of the Cloud
Amazon Cloud Outage KOs Reddit, Foursquare & Others

Data Loss
More than 10 million legal files snared in Megaupload shutdown
LeaseWeb explains why it deleted Kim Dotcom’s MegaUpload data
The Dark Side of the Cloud: IBM Partner Gives Fold Two Weeks to Move Data
Nirvanix Shut-Down Sends Shockwaves through the Cloud Services Industry

Cloud Leasing Practices
Microsoft Accelerates Its Data Center Expansion
Cloud Builders Still Leasing Data Center Space

Migration
A Dark Side of the Cloud: Breaking Up is Hard to Do

Security
HIPAA Business Associate Myths & Facts
10 Myths about HIPAA’s Required Security Risk Analysis
New HIPAA rule could change BAA talks

Contracting
The Dark Side of the Cloud: How to Avoid the Pitfall of Cloud Computing Contracts for Your Business