Category Archives: ANALYST INSIGHTS

20% of North American PHM Market is “Captive”

Written by Alex Green

  • Signify Research preliminary analysis and commentary on the North American Population Health Management IT Market, which includes data for the quarter ending December 2016
  • 20% of North American PHM market in 2016 is accounted for by vendors supplying their own owners with solutions
  • Optum, Healthagen, Medicity, ActiveHealth Management, Transcend Insights, Conifer Health, Medecision – all have owners who are also customers

Analysis

According to preliminary data from Signify Research’s report “Population Health Management IT – North America – 2017”, 20% of the North American PHM market is effectively closed to competition, as it is served by vendors supplying their owners with PHM solutions – i.e. it is a captive market.

Many of the leading vendors of PHM software solutions are owned by major payer or provider organisations, who then purchase PHM products and services from their subsidiaries. In some circumstances these payers or providers were initially responsible for developing the PHM solutions offered and once successful, commercialised the products by establishing new operating companies or brands. There are also several examples of payers or providers acquiring vendors that had already developed successful PHM solutions and then ultimately becoming customers of these acquired companies.

The leading vendors that are driving this captive 20%, along with an overview of their businesses, are outlined below.

Optum

Optum, the market leader in terms of PHM IT revenues in North America, is owned by US payer UnitedHealthcare and is one of the companies that make up the largest share of the captive market.

United is Optum’s largest customer across its entire portfolio. In 2016 OptumInsight, Optum’s healthcare technology business segment, generated more than 50% of its $9B in sales via internal business. Specific to PHM, in 2016 United was a major customer for Optum’s payer-centric PHM solutions, such as Optum Impact Intelligence, Optum Impact Pro and Optum Symmetry.

Optum has also started to bolster its non-UnitedHealthcare business; in particular, it has increased the share of its business that is driven by providers as opposed to payers. Its acquisition of Humedica in 2013 and the subsequent launch of its provider-focused PHM solution Optum One have been the main enablers. As sales of this solution have ramped up, the share of the overall Optum PHM business generated by UnitedHealthcare has started to fall.

Conifer Health Solutions

From a leading payer-owned example to a leading provider-owned example – Conifer Health Solutions offers a broad range of provider-focused PHM solutions via its ConiferCore portfolio of products. It is estimated to have commanded a top five market share position in North America in 2016 (based on Signify Research preliminary market estimates).

Conifer Health Solutions is the principal operating subsidiary of Conifer, which is majority owned by the US health provider Tenet Healthcare. Further, US provider Catholic Health Initiatives (CHI) also holds a 23.8% ownership position in Conifer Health Solutions.

Tenet alone accounted for 37% of Conifer’s sales (all products) in 2016. Across both Tenet and CHI, the figure was between 75% and 80% in 2015 and 2016. These figures relate to business across all Conifer’s products, but its PHM business is also very much focused around Conifer Health’s two owners.

Healthagen

US payer Aetna has developed a broad PHM portfolio via several acquisitions over recent years. In 2005 Aetna acquired ActiveHealth Management, which at the time provided medical management and data analytics solutions, for a reported $400M. In 2011, it completed the acquisition of Medicity, a provider of Health Information Exchange (HIE) solutions, for $500M. Also in 2011, Aetna acquired Healthagen, then best known for developing the mobile symptom-checking app iTriage. In 2013 Aetna consolidated all the PHM products obtained via these acquisitions under the Healthagen brand.

Although Aetna is estimated to drive some internal business for Healthagen, unlike the previous two examples of Optum and Conifer, the internal business generated from Aetna is estimated to be relatively small.

Transcend Insights

A similar picture exists for Transcend Insights. As with the Healthagen brand, the Transcend Insights brand reflects the PHM offerings of another major US payer, namely Humana (until recently, itself in merger discussions with Aetna).

Transcend Insights was formed in March 2015 after Humana brought the businesses of its subsidiaries Certify Data Systems, Anvita Health and nLiven Systems together under one brand.

The combined entity is a top-10 vendor in terms of 2016 PHM market share and it addresses the PHM market via its HealthLogix portfolio. Humana is estimated to have been a significant customer for Transcend Insights’ PHM products in 2016.

Other Examples

Several other companies also contribute towards this 20% figure, such as Medecision which is owned by one of its largest clients, US payer Health Care Services Corporation (HCSC).

There are also examples of provider/payer-owned vendors where the owner relationship isn’t defined as customer/client. For example, Evolent Health, which addresses the PHM market via its Identifi portfolio, is part-owned by Pittsburgh-based provider UPMC. However, much of Evolent’s portfolio is based on IP developed by UPMC. In this set up, Evolent acts as a reseller of UPMC technology, rather than a supplier to UPMC.

That said, Evolent does contribute towards this captive market in other ways. Since February 2016, one of its top three customers, Passport Health Plan, has owned a sizable share in Evolent. Passport Health drove 20% of Evolent’s overall business in 2016.

Premier Inc. is also technically part-owned by its customers, but its business is very different to that of the other vendors discussed in this section. It has been a publicly traded company since its IPO in September 2013; however, at the end of 2016, 64% of the company’s equity was held by its members, which comprise its customers – some 3,750 hospitals and 130,000 providers – making the company technically a provider-owned business. However, unlike the others listed in this section, it is not owned by one provider but by many, and the vendor/client relationships are very different to the other examples. For this reason, Premier’s business has not been included in the 20% captive market figure.

Steady Decline in ‘Captive’ Share of the Market

The share of the market represented by companies selling to their owners (or, in some cases, to owned subsidiaries) is forecast to fall over the coming years. For the companies involved, although their internal business is forecast to grow, the share of business generated by external customers is forecast to grow faster. This, coupled with the growing PHM businesses of other vendors that don’t have internal customers, results in the captive share of the market falling to less than 10% by 2021.
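The dilution mechanism described above can be sketched numerically. The growth rates below are purely illustrative assumptions, not Signify Research figures; they simply show how a captive segment growing more slowly than the external market falls from 20% of revenues towards single digits over five years.

```python
def captive_share(captive, external, captive_cagr, external_cagr, years):
    """Project the captive share of total revenues after `years` of growth."""
    for _ in range(years):
        captive *= 1 + captive_cagr    # internal (owner-driven) business grows...
        external *= 1 + external_cagr  # ...but external business grows faster
    return captive / (captive + external)

# 2016 baseline: captive segment is 20% of the market (20 vs 80 units).
# Hypothetical growth rates chosen for illustration only.
share_2021 = captive_share(20, 80, captive_cagr=0.05, external_cagr=0.25, years=5)
print(f"Captive share in 2021: {share_2021:.1%}")
```

Even with the captive business itself growing, faster external growth is enough to take the captive share below the 10% mark.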

New Market Report from Signify Research Publishing Soon

The market data presented above are the preliminary estimates and forecasts from Signify Research’s upcoming report on the North American PHM market which will be published in April. The report is a component of the Signify Research “PHM & Telehealth Market Intelligence Service”. Vendors tracked include Aetna, Allscripts, AthenaHealth, AxisPoint Health, Caradigm, Cerner, Conifer Health, eClinicalWorks, Enli, Epic, Evolent, HealthCatalyst, Humana/Transcend Insights, IBM Watson Health, McKesson, Medecision, Meditech, NextGen, Optum, Orion Health, Premier Inc., Verscend, ZeOmega and others. The report provides quarterly market estimates for 2015 & 2016, and annual forecasts by vertical, function, service type, platform delivery and country to 2021.

For further details please click here or contact Alex.Green@signifyresearch.net.

5 Key Issues for Cloud Adoption in Clinical IT

Written by Steve Holloway

Hype surrounding cloud solutions for clinical IT has ramped up in the last two years, buoyed by user demand for more flexible data access and connectivity. However, global market adoption of cloud technology for clinical IT to date has been relatively slow, despite the increased marketing efforts of healthcare technology vendors and the growing presence of cloud technology platform vendors in healthcare, such as Microsoft (Azure) and Amazon (AWS).

So why is market penetration so low when cloud IT is big business in other sectors? And what are the key factors in cloud adoption for healthcare?

1. Each provider implementation is unique

No two clinical IT implementations are the same and no single software solution can address every provider’s needs. Scale, complexity of existing infrastructure, variety of user groups and interfaces and differing needs for mobility and connectivity all impact the effectiveness of clinical IT implementations. Therefore, the one-size-fits-all approach rarely works.

This complexity also impacts IT architecture selection: some provider organisations already own and maintain extensive IT data warehouses, so are unwilling to use third-party solutions when they can host their own private cloud. Others have complicated legacy networks of disparate clinical IT solutions across multiple locations, requiring flexible, multi-faceted cloud IT solutions. Smaller providers have limited resources for IT administration and so require fully managed third-party cloud solutions.

This variance of need makes it very difficult for providers and vendors alike. For providers, it is challenging to find case study examples of past implementations with similar profiles to learn from, especially as cloud IT is relatively new for healthcare. For vendors, it’s difficult to know which market segments and regions to target and which product lines to “cloud-enable”, without spending extensive time understanding the nuances of their customers’ unique needs, not to mention the dizzying amount of red-tape from local, regional, national and international regulations (see number 3).

2. Providers often assume cloud is less expensive

Whether fully hosted or hybrid architecture, it is rare for cloud IT implementations to be less expensive than on-premise solutions, though this is a common misconception amongst buyers.

Some cloud solutions are offered with a subscription-based managed service pricing model which can be misunderstood as less expensive relative to an up-front purchasing model. However, cloud solutions for clinical IT can be up to a third more expensive, depending of course on the unique needs of the provider. The complexity of most providers’ health networks and multi-faceted interfacing also adds significant financial risk to new implementations, for providers and vendors alike.

The relative infancy of cloud implementation also means there are few long-term case studies outlining the cost benefits of cloud for clinical IT. Vendors should be doing more to partner with early-adopters to better profile the wider benefits that cloud IT enables (mobile and remote access, workflow efficiency, reduction in IT administration). In doing so, providers will be able to better understand if a true return on investment (ROI) is possible.

3. Security and legislation are moving targets

Barely a day goes by without news headlines announcing the unsolicited release of sensitive patient health data, be it through malicious hacking or accidental release. Cybersecurity has therefore become a leading issue and challenge for healthcare providers, both to satisfy patients and to adhere to an increasingly complex array of cybersecurity and compliance legislation. For larger providers with regional, national or international footprints, this is even more challenging, as each jurisdiction has its own “flavour” of regulation and each is evolving as legislators catch up with new types of cyber threat.

This creates a challenging environment for selling cloud IT products, even if they are proven to be more secure than the provider’s current on-premise architecture. Large health providers are particularly sensitive to patient data security, as a major breach could be costly both from a financial and legal perspective, not to mention losing patient trust.

While many strategies exist to overcome these issues, vendors must fundamentally build customer confidence in their adherence to the most up-to-date legislation and security protocols, provide certified examples and statistics on their cybersecurity record, and be willing to work long-term with their customers on transitioning to cloud. Risk-averse providers are more willing to adopt cloud IT in a step-wise approach, such as off-loading second-copy data and disaster recovery back-up to a hybrid cloud architecture as a first-phase trial. Once the benefits from a financial, administration and security perspective have been proven, they will be more willing to expand cloud technology implementation.

4. Enterprise EMR is not an adoption precursor, but it helps

From a global perspective, adoption of cloud technology for clinical IT is relatively low compared to other industry sectors. It has, however, been closely linked to markets where enterprise EMR implementation has been significant, such as the USA, the Nordics, the Netherlands and Singapore. There is no technical reason for this trend – cloud technology can in theory be deployed in any market with the necessary base infrastructure.

Instead, it is more to do with the impact made by digitalising core patient information with enterprise EMR. The mere existence of a basic centralised EMR spurs greater administrative and clinical focus on improving the interoperability and connectivity of health data, both within and between networks. Moreover, EMR has commonly provided the initial interconnectivity of patients and data needed to drive momentum for the implementation of value-based care models. As many of these models exploit and demand patient-payer-provider interconnectivity across a variety of access nodes, cloud technology adoption consequently increases.

5. Health data is the new currency

The value of health data is also changing, especially with the recent market focus on predictive analytics and artificial intelligence. While the question of who should “own” patient data is a complex ethical one that far exceeds the remit of this piece, the increasing importance of patient data as a commodity to fuel new healthcare IT solutions, such as risk stratification analytics for PHM or new care management workflows, is quickly becoming evident to providers, vendors and patients alike.

Hybrid or hosted cloud technology solutions can be viewed by some providers as “losing control” or “ownership” of their data, despite the many contractual safeguards available. This view has also intensified with the advent of artificial intelligence, as providers also see the mid-term revenue potential of licensing use of their data to train machine learning algorithms.

While this is still a relatively new development, providers, healthtech vendors and cloud IT platform vendors are already acutely aware of the potential commercial gains to be made from pooling patient data, making adoption of cloud technology even more complicated.

New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in April 2017. For further details please click here or contact steve.holloway@signifyresearch.net.

Target Applications for Machine Learning in Medical Imaging

Written by Simon Harris

Rapid advancements in machine learning, most notably deep learning techniques, are fuelling renewed growth for medical image analysis software tools. We estimate that the global market1 for these products will be worth nearly $300 million this year and will more than double in size by 2021.  But which clinical applications are driving this growth?
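As a back-of-envelope check on the figures above: a market worth roughly $300 million in 2017 that more than doubles by 2021 implies a compound annual growth rate of at least about 19% over the four intervening years. The sketch below derives that rate; the exact doubling endpoint is an illustrative assumption based on the forecast in the text.

```python
def implied_cagr(start, end, years):
    """Compound annual growth rate taking `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# A $300M market in 2017 doubling to $600M by 2021 (four years of growth).
cagr = implied_cagr(300, 600, years=4)
print(f"Implied CAGR: {cagr:.1%}")  # roughly 19% per year
```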

Breast is Best

Breast imaging was the largest category in 2016, accounting for just over one-quarter of the total market. The breast imaging market mainly comprises computer-aided detection (CADe) solutions, such as iCAD’s SecondLook and Hologic’s ImageChecker, for the US breast cancer screening market, along with quantitative image analysis software for diagnosis applications, such as Invivo’s DynaCAD Breast and QLAB Suite from Philips Healthcare.

The market for image analysis tools in breast imaging is forecast to grow at a slower rate than the other applications, as the well-established CADe market in the US is now saturated. The main growth drivers will be:

  • CADe upgrades as imaging centres replace 2D mammography systems with digital breast tomosynthesis
  • Wider acceptance of CADe outside of the US
  • The increasing use of ultrasound (with CADe) in breast cancer screening
  • Uptake of new solutions such as breast density analysis software and decision support tools (e.g. MammoRisk from Statlife)

Cardiology Still Pumping

The cardiology market for image analysis software solely comprises quantitative imaging tools2, which are typically sold as applications for advanced visualisation platforms. These tools provide automatic calculation of various cardiovascular metrics, such as stroke volume, ejection fraction and arterial calcification. Growth will be driven by an accelerated pace of innovation from the use of deep learning algorithms and the resulting introduction of innovative solutions that address unmet market needs. In the mid-term, growth will be boosted by the introduction of decision support tools that provide predictive analytics for risk stratification and computer-aided diagnosis (CADx) systems that facilitate early detection of cardiovascular disease. For example, healthbit’s heartcare™ uses machine learning algorithms to predict congestive heart failure based on cardiac MRI scans.

Deep Breathing

Lung cancer is the leading cause of cancer-related death worldwide, and in response, many countries have introduced lung cancer screening programmes. This is driving demand for CADe solutions, although a lack of reimbursement prohibits more widespread uptake. Early generation CADe solutions based on shallow machine learning suffered from high false positive rates. Deep learning solutions promise improved detection accuracy, which should increase the usability of lung CADe and accelerate demand.

There is also a sizeable and growing market for quantitative image analysis tools for pulmonology applications that provide characteristics of abnormalities, such as size, texture, location and rate of growth. These imaging biomarkers may be useful for predicting prognosis and therapeutic response. As was the case for breast imaging and cardiology, there is also an emerging market for pulmonology decision support tools that combine quantitative imaging with other patient information to provide a data-rich, longitudinal history of the patient’s care. An example is the QIDS platform from HealthMyne.

A Head Start in Neurology

Brain scans are the most common type of MRI procedure and accounted for around one-quarter of the 34 million MRI exams performed in the US last year. There is already an established market for quantitative imaging software in neurology, primarily for tools that provide visualisation and quantification of blood perfusion in the brain. Additional growth will come from research into the use of imaging biomarkers for the diagnosis and management of neurological disorders, such as Alzheimer’s disease, multiple sclerosis and Parkinson’s disease.

Growth will be boosted by the introduction of CADe solutions to detect intracranial haemorrhage (ICH) on head CT scans. vRAD and MedyMatch have developed real-time ICH detection tools and both are expected to fully commercialise their solutions in 2017, pending regulatory approval. Teleradiology companies are expected to be the early adopters, particularly in the US, as most CT scans ordered by emergency departments are interpreted by teleradiologists.

Best of the Rest

The gastroenterology and urology markets for image analysis software were estimated to be similar in size and are forecast to grow at similar CAGRs over the coming years. The gastroenterology market comprises a mix of quantitative tools for analysis of the colon, pancreas and liver and CADe solutions for the detection of colorectal cancer, the third leading cause of cancer death in the US. Colonoscopy remains the gold standard in colon cancer screening, but CT colonography (CTC) is gaining acceptance. However, a lack of reimbursement (CMS does not pay for the use of CTC in colon screening) has hampered the uptake of CTC. The urology market solely comprises quantitative imaging tools, primarily for prostate analysis.

Footnotes

1 The image analysis software market comprises computer-aided detection (CADe) systems, quantitative image analysis tools, decision support tools and computer-aided diagnosis (CADx) systems. A full list of the products included is available on request.
2 The quantitative image analysis tools category includes all software products that provide automatic quantification of anatomical features, not just those that use machine learning. Products that use other image analysis techniques, such as a statistical model-based approach, are also included.

Related Reports

“Machine Learning in Medical Imaging – 2017 Edition” provides a data-centric and global outlook on the current and projected uptake of machine learning in medical imaging. The report blends primary data collected from in-depth interviews with healthcare professionals and technology vendors, to provide a balanced and objective view of the market. If you would like further information please contact Simon.Harris@signifyresearch.net.

How to Sell Machine Learning Algorithms to Healthcare Providers

Written by Simon Harris

One of the greatest commercial challenges for developers of medical image analysis algorithms is how to take their products to market. Most independent software vendors (ISVs) of image analysis solutions offer only a handful of algorithms for specific use-cases, e.g. coronary calcium scoring, bone age prediction or detection of lung nodules. However, most generalist radiologists require a comprehensive “analytical tool kit” with a broad portfolio of algorithms that can detect a wide range of conditions across multiple body sites and modalities. Locating, evaluating and sourcing image analysis algorithms on a piecemeal basis from multiple vendors will be a cumbersome and time-consuming process for healthcare providers, not to mention the challenges associated with integrating the algorithms with providers’ existing healthcare IT infrastructure. Whilst this may be a viable option for larger academic hospitals and IDNs, most providers will not have the necessary resources and will instead prefer to deal with a small number of vendors, ideally a single supplier.

There are several routes to market for image analysis ISVs, as follows:

  1. Develop an in-house image analysis workstation or platform (proprietary or open)
  2. Partner with established imaging IT companies, e.g. PACS, viewer and advanced visualisation companies
  3. Partner with modality companies
  4. Partner with healthcare ecosystem (open platform) providers
  5. Partner with companies who provide vendor agnostic image analysis platforms

The advantages and disadvantages of each, as viewed through the lens of algorithm developers, are presented below.

1. Develop an in-house image analysis platform

Examples: iCAD PowerLook Advanced Mammography Platform (AMP), RADLogics AlphaPoint™, HealthMyne QIDS
Advantages: A viable option for specific clinical applications, e.g. breast and lung cancer screening. Solutions can be highly customised for specific customer types, e.g. breast imaging specialists. Full control of the development roadmap.
Disadvantages: Limited choice of algorithms for general radiology. High product development, marketing and sales costs.

2. Partner with established imaging IT companies

Examples: Most of the major PACS and advanced visualisation companies offer clinical applications from third party vendors, alongside their in-house applications. For example, GE offers over 50 clinical applications for its AW advanced visualisation platform, some of which are licensed from third party developers.
Advantages: Access to an established customer base. Tight integration with partner’s imaging IT platform. Partnering with a well-known brand may add credibility by association. Leverage the partner’s sales and marketing efforts.
Disadvantages: The imaging IT market is fragmented – being tied to specific vendors gives access to only a fraction of the total available market. The imaging IT market is also evolving from departmental PACS to enterprise imaging solutions, creating uncertainty and complexity in the marketplace.

3. Partner with modality companies

Examples: Arterys has a non-exclusive, co-marketing agreement with GE Healthcare, whereby Arterys 4D Flow is available via the ViosWorks application for GE MRI scanners.
Advantages: Access to an established customer base. Credibility by association. Leverage partner’s sales and marketing efforts. Access to “raw data” direct from the modality may improve accuracy of algorithms.
Disadvantages: Doesn’t give access to the total market, although the modality markets are more consolidated than imaging IT. For example, the MRI market is largely controlled by an oligopoly of five companies – Siemens, GE, Philips, Toshiba and Hitachi. Long sales cycles. Modality companies are likely to embed a small number of algorithms rather than a full suite, which will limit the available market.

4. Partner with healthcare ecosystem (open platform) providers

Examples: GE Health Cloud (features applications from Arterys, Pie Medical Imaging and imbio, to name a few), IBM Watson Health Core (recently added an application from MedyMatch that detects intracranial bleeds on CT scans), NTT DATA Unified Clinical Archive (offers analytical solutions from imbio, Zebra Medical Vision and AnatomyWorks), Siemens Healthineers Digital Ecosystem (announced at HIMSS 2017 with Arterys, SyntheticMR and a handful of others having already agreed to provide applications).
Advantages: Widest choice of algorithms. Major focus of investment by the major healthcare technology vendors (GE plans to invest $500m over the next three years in its Health Cloud platform). Access to the platform developer’s installed base of customers. Credibility by association. Leverage the platform developer’s sales and marketing efforts.
Disadvantages: Some resistance from healthcare providers to cloud-based platforms, often due to data compliance requirements. Ecosystem platforms are a relatively new and unproven concept in medical imaging and currently there are relatively few healthcare providers using them.

5. Partner with companies who provide dedicated, vendor agnostic image analysis platforms

Examples: Medimsight offers a cloud-based computer-aided diagnosis marketplace for biomarker quantification. The platform features 39 applications, including algorithms from LAIMBIO, FMRIB (Oxford Centre for Functional MRI of the Brain) and the Martinos Center for Biomedical Imaging. Blackford Analysis offers a vendor-neutral pre-processing (VNP) platform that acts as a broker for pre-processing algorithmic solutions from third-party developers, enabling integration with existing clinician workflows. McCoy Medical is a distribution partner and sales channel for companies that make algorithms and analytics.
Advantages: Support with integration reduces the need for PACS back-end engineering. A highly focused marketplace for image analysis solutions.
Disadvantages: The developers of dedicated, vendor agnostic image analysis platforms are small companies with limited resources and few customers. Strong competition from healthcare ecosystem providers (see 4 above).

The Signify View

In the short-term we expect image analysis ISVs to focus on developing their own platforms and to seek partnerships with established imaging IT vendors. However, with the major healthcare technology vendors investing heavily in their healthcare ecosystem platforms, these new “clinical application marketplaces” look set to be an increasingly important sales channel in the coming years. The single platform model greatly simplifies purchasing and workflow integration for healthcare providers and gives radiologists access to the widest selection of algorithms to build their “analytical tool kits”.

Related Reports

“Machine Learning in Medical Imaging – 2017 Edition” provides a data-centric and global outlook on the current and projected uptake of machine learning in medical imaging. The report blends primary data collected from in-depth interviews with healthcare professionals and technology vendors, to provide a balanced and objective view of the market. If you would like further information please contact Simon.Harris@signifyresearch.net.

Is $500M Enough? The Signify View on GE Healthcare’s Investment in Digital Health

Written by Steve Holloway

  • Late last week, the CEO of GE Healthcare announced a plan to invest $500M in the division.
  • Investment will take place over the next 3 years.
  • It will be used to recruit 5,000 software engineers, data analysts and imaging analysts.
  • Some funding may also be used to fuel acquisition of data analytics firm(s).
  • Focus to develop Health Cloud platform (announced November 2015) and hundreds of clinical software applications.
  • Platform will be driven by GE’s Predix operating system.
  • Germany highlighted as a key target market for GE.

The recent announcement from GE Healthcare of plans to invest $500M in 5,000 software engineers and the potential acquisition of a software analytics firm is no great surprise, but excellent PR. The firm announced its “Health Cloud” platform in late 2015 to great fanfare, but little progress has been evident to date. However, GE has made it very clear publicly that it plans to transition the industrial conglomerate into a “digital” firm, fuelled by its Predix industrial platform.

Here’s the Signify View on the announcement:

Battle Lines Drawn

The size and scale of the announcement is significant for GE Healthcare, one of the leading global healthtech firms with $18.4B of revenue in 2016. However, it is by no means the biggest move in healthcare of late.

IBM’s entry into the healthcare field, investing over $4B in Merge Healthcare ($1B), Truven Health Analytics ($2.6B), and Explorys and Phytel (not disclosed, but estimated to exceed $400M), has been very aggressive, not to mention the serious investment IBM has made in recruiting healthcare leadership and in marketing spend for its IBM Watson Health business.

Other major competitors are also making big moves. Siemens plans to take its newly branded “Healthineers” division through an IPO later this year, while Philips has re-focused the company on health and wellbeing markets, having divested its lighting division, Philips Lighting, in 2016. Other global technology giants have also been eyeing the healthcare sector; Google (DeepMind), Amazon, Salesforce and NTT Data have all made a big push for healthcare market share. Therefore, GE was always going to need to invest heavily to compete.

Digital Deutschland?

The orchestrated announcement was made to a German media outlet (Handelsblatt) for good reason: GE has for some time been lining up the German market. While aiming for Europe’s largest market may seem a sensible move, it will not be straightforward, especially when it comes to healthcare IT:

  • Germany is one of the most price sensitive markets for healthcare technology in Europe; procurement prices are regularly 30-50% lower than the Western European average
  • The health IT market in Germany is highly fragmented. In Imaging IT alone there are approximately 30 local vendors and integrators working with a complex and fragmented provider network, especially in the private and administrative sectors
  • There has been little if any national or regional co-ordination of health IT in Germany to date. While the market is therefore under-penetrated for Health IT in relation to European peers, consolidation of the market to larger, more profitable enterprise networks will be challenging, bureaucratic and long-winded

In Germany, GE Healthcare does have considerable experience and past success in health IT with smaller ambulatory practice management software. This will certainly help in catering for the market, as will a sizeable installed base of imaging and clinical care hardware in Germany and surrounding markets. That said, attaining a commanding share of the multi-billion-dollar German healthtech market could be far from straightforward against a mix of large incumbent multinationals (Siemens, AGFA Healthcare) and a multitude of small private vendors. Add to this economic concerns and political uncertainty over the future of the European Union, and GE’s strategy looks increasingly risky.

ACE Platform or Bust

GE is also not alone in its bid to position itself as a central platform provider. Philips, Siemens, NTT Data and IBM are all making a similar play. The move from these vendors is hardly surprising – enterprise EHR vendors have done little to establish any real expertise in best-of-breed clinical IT or imaging IT software to date.

For GE Healthcare, Philips Healthcare and Siemens Healthineers, leveraging their clinical expertise and modality hardware footprint to expand the breadth of their clinical IT offerings, including analytics, dashboarding, integrated workflow and even population health and telehealth capability, is a natural progression. These new solutions, which Signify Research has termed Agnostic Clinical Enterprise (ACE) platforms, look set to be the foundation for future cross-discipline implementations. The ACE platform model offers many benefits for providers and vendors alike. The single-platform model allows the vendor to become embedded in the provider’s core clinical workflow and care management, while also putting itself in prime position to win long-term, managed service deals, including imaging hardware, clinical care device supply and lucrative professional services.

For providers, the ACE platform model offers a single vendor to deal with for clinical IT (“one throat to choke”) and a partner to share the risk of previously capital-intensive procurement. Moreover, over time the ACE platform model will position the core platform vendor as a prime contractor: if the provider wants to bring in a new technology or software for a specific clinical function, the ACE platform vendor will have responsibility for sub-contracting and integrating the new module into its platform. This will lead to greater choice for the provider in each clinical discipline.

With its competitors also making significant moves to establish ACE platforms, and aggressive investment from IT and analytics industry giants, GE’s recent announcement really leaves only one question: will $500M be enough?

New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in April 2017. For further details please click here or contact steve.holloway@signifyresearch.net

North American PHM Market Worth $1.01B in Q4 2016

North American PHM Market Worth $1.01B in Q4 2016

Written by Alex Green

  • Signify Research preliminary analysis and commentary on the North American Population Health Management IT Market (PHM Market), which includes data for the quarter ending December 2016
  • Q4 2016 population health management (PHM) market in North America remained buoyant despite uncertainty around US healthcare reform
  • Revenues of $1.01 billion were generated in Q4 2016, up 17% YoY and up 8% compared to previous quarter
  • PHM solution spending for FY2016 in North America was $3.7 billion, up 15% on 2015

Analysis

Signify Research’s preliminary market estimates for the Q4 2016 North American population health management (PHM) IT market (platforms and services) show that, despite the uncertainty caused by the US presidential election and its potential ramifications for US healthcare policy, the market remained buoyant. PHM revenues for Q4 2016 in North America stood at an estimated $1.01 billion, up 17% on the same period in 2015 and up 8% on the previous quarter.

The continued growth in the fourth quarter does need to be put in context. The fallout from the presidential election result is unlikely to have had time to substantially affect sentiment to the point of impacting immediate orders. The fourth quarter has also traditionally been seasonally strong for leading vendors of PHM and related solutions.

The results are still very encouraging though. It is Signify Research’s view that the longer-term trend towards value-based care, the move to accountable care organisations (ACO) and the need to better manage health care spending in general will ultimately drive continued growth for vendors offering PHM solutions, despite legislative uncertainty.

Supplier Base Remains Fragmented

For the full year 2016, the North American PHM market was estimated to have been worth $3.7 billion, compared to $3.2 billion in 2015. The list of companies that drive these numbers remains long, which is indicative of the fact that the market, despite reaching a certain level of maturity, is still highly fragmented. However, several companies, including Optum, IBM Watson Health, Cerner, Allscripts, Conifer Health and Evolent, have started to take market-leading positions in terms of share; between them they are estimated to have accounted for approximately 45% of the 2016 market. Even so, most vendors still commanded only a single-digit share in 2016, with a long list of vendors closely following.

Data Aggregation / Analytics / Stratification Driving Market to Date

Signify Research’s upcoming report segments the market into three main components: data aggregation/analytics/risk stratification solutions, care coordination/management solutions and patient engagement solutions. Of the three, the data aggregation/analytics/risk stratification segment represented the largest market in 2016 and is projected to remain the largest over the report’s forecast period (2017-2021). Similarly, the provider vertical, specifically the acute provider sector, is also projected to remain the largest market channel compared with the payer, employer and other verticals.

Full Impact of Legislative Uncertainty Yet to be Felt

As indicated above, the real test of the impact of uncertainty over potential changes in legislation will be seen in market performance during the first half of 2017. Results for Q1 2017 will be eagerly awaited, and Signify Research’s market update in the second half of 2017 will give an early indication of the extent to which this uncertainty hits revenues for PHM solution vendors in 2017. However, it’s our view that strong annual growth will continue in 2017, with the market projected to grow a further 16%.

New Market Report from Signify Research Publishing Soon

The market data presented above are the preliminary estimates and forecasts from Signify Research’s upcoming report on the North American PHM market which will be published in April. The report is a component of the Signify Research “PHM & Telehealth Market Intelligence Service”. Vendors tracked include Aetna, Allscripts, AthenaHealth, AxisPoint Health, Caradigm, Cerner, Conifer Health, eClinicalWorks, Enli, Epic, Evolent, HealthCatalyst, Humana/Transcend Insights, IBM Watson Health, McKesson, Medecision, Meditech, NextGen, Optum, Orion Health, Premier Inc., Verscend, ZeOmega and others. The report provides quarterly market estimates for 2015 & 2016, and annual forecasts by vertical, function, service type, platform delivery and country to 2021.

For further details please click here or contact Alex.Green@signifyresearch.net.

Market Impact of EMA decision on GBCAs

Walking a Gadolinium Tightrope of Perception & Bottom Line

The recommendation by the European Medicines Agency (EMA) on 10 March that market authorisation for four linear gadolinium-based contrast agents (GBCAs) be suspended shocked the radiology community, but very little has been said about this decision’s effect on businesses.

Supply of the agents in question is significant business for Bayer HealthCare Pharmaceuticals (Magnevist), GE Healthcare (Omniscan), Bracco (MultiHance) and Guerbet (Optimark), and Europe is one of their largest and most mature markets. So, what will be the market repercussions?

Fight or flight? It’s not all black and white

The information offered with the EMA announcement provides a few clues about how the market will react. However, it’s clear that a swift reversal of the recommendation and soon-expected EMA suspension is unlikely.

This is the opening extract of Steve’s regular monthly market column for AuntMinnie Europe.  

To read the full article, please click here.

(Access to the article may require free membership to AuntMinnie Europe – it’s full of great content and insight so well worth signing up!) 

Deep Learning in Ultrasound – Ready to be Embedded?

Applying Deep Learning to Ultrasound – Is the Technology Ready to be Embedded?

Written by Simon Harris

Much like at last year’s RSNA conference, deep learning was one of the key themes at ECR 2017. Several speakers at the scientific sessions presented promising research results for the application of deep learning in specific use-cases. In one of the professional challenges sessions, Dr. Angel Alberich-Bayarri from QUIBIM suggested that convolutional neural networks (CNNs) may already be old news, with generative adversarial nets (GANs), a new architecture for unsupervised neural networks, showing promise for medical imaging applications. GANs may be a solution to one of the major challenges with developing deep learning algorithms – the need for large training data sets.

On the exhibition floor, there were fewer companies showing machine learning solutions than at RSNA (at least 20 at RSNA versus fewer than 10 at ECR), and in our conversations with vendors it was evident that expectations were more measured, with less marketing hype. Several of the better-known deep learning start-ups were notable by their absence, including Enlitic and Zebra Medical, as was IBM Watson Health.

Samsung chose ECR to make a big push for its S-Detect™ deep learning feature, which is currently available as an option on its RS80A premium ultrasound system. S-Detect™ for Breast makes recommendations about whether a breast abnormality is benign or cancerous. It is commercially available in parts of Europe, the Middle East and Korea and is pending FDA approval in the US. S-Detect™ for Thyroid uses deep learning algorithms to detect and classify suspicious thyroid lesions semi-automatically, based on Thyroid Imaging Reporting and Data System (TI-RADS) scores. With both applications, S-Detect™ produces a report showing the characteristics of the lesion, including composition, echogenicity, orientation, shape, etc., along with the risk of malignancy, e.g. “high suspicion”.

ContextVision, the leading independent vendor of ultrasound image enhancement software, showcased its latest research in artificial intelligence at ECR.  Its prototype VEPiO (Virtual Expert Personal Image Optimizer), which is built on the company’s Virtual Expert artificial intelligence platform, can automatically optimize ultrasound images for individual patients. VEPiO aims to improve diagnostic accuracy and reduce scan times, particularly for more challenging patients, by making automated setting adjustments to obtain the optimal image quality. The company is also exploring the use of deep learning to optimise image quality, for organ-specific segmentation and for decision-support functionalities.

Ultrasound OEMs must decide whether deep learning technology is ready to be embedded into their systems, or to take a “wait and see” approach. Although many research papers have found that deep learning can produce good results in specific medical imaging applications, often at or near the performance of experienced radiologists, these are usually based on relatively small datasets and/or small reader studies. It remains to be seen if deep learning will perform as expected in routine clinical use. Although Samsung has taken an early lead and is the first of the major ultrasound vendors to embed deep learning, it carefully positions S-Detect™ for Breast as a decision support tool for “the beginner or non-breast radiologist”.

OEMs must also decide whether to establish an in-house deep learning capability or to partner with a specialist. Deep learning engineers are a scarce and expensive resource and most mid-tier ultrasound OEMs will struggle to attract and retain talent. Instead we expect they will partner with independent software vendors, such as ContextVision. For the major OEMs, we expect to see a combination of build, buy and partner strategies. Most of the major modality OEMs have, to varying extents, established in-house R&D efforts for machine learning and with over 50 start-ups developing artificial intelligence solutions for medical imaging, there’s certainly no shortage of options for acquisitions and partnerships.

Another limiting factor is the additional processing power, typically GPUs, required for embedded deep learning algorithms. Ultrasound is a fiercely contested and price-sensitive market, and OEMs will be reluctant to add hardware cost. Initially we expect deep learning to be an optional feature on premium systems only, as with the Samsung example, but as is often the case in ultrasound, features that start out on premium systems typically cascade down to less expensive high-end and mid-range systems over time.

With deep learning technology progressing at a rapid pace, and ultrasound OEMs constantly on the look-out for the next “big thing” to differentiate their products, it seems inevitable that deep learning will increasingly be embedded in ultrasound systems, both as workflow tools to help with productivity and decision support tools to improve clinical outcomes.  It’s no longer a question of will it happen, but when will it happen, and the OEMs that wait too long will get left behind in the AI race.


Related Reports

“Machine Learning in Medical Imaging – 2017 Edition” provides a data-centric and global outlook on the current and projected uptake of machine learning in medical imaging. The report blends primary data collected from in-depth interviews with healthcare professionals and technology vendors to provide a balanced and objective view of the market. If you would like further information please contact Simon.Harris@signifyresearch.net.

Key Observations from ECR 2017

Key Observations from ECR 2017:

  • Is Radiology Losing Grip on Imaging IT Decision-Making?
  • Applying Deep Learning to Ultrasound – Is the Technology Ready to be Embedded?
  • Canon + Toshiba + Vital + Olea = Serious Competitor, but do they have a missing link?

Is Radiology Losing Grip on Imaging IT Decision-Making?

Written by Steve Holloway

In stark contrast to the recent RSNA show in North America, imaging IT and vendor neutral archives (VNA) were far less evident at ECR. Of course, the traditional radiology PACS vendors were there alongside the well-known names in advanced visualisation, but you had to hunt hard for any independent VNA vendors. Even on the major vendors’ booths, imaging IT was far from prominent, hidden away behind the latest modality hardware systems.

Is this a result of limited space at the smaller exhibition, or a reflection of the state of imaging IT maturity in Europe?

For much of Europe today, radiology IT still means PACS and RIS, be it at the radiology department level or, increasingly, “super PACS” at the hospital-network or regional level (as in Spain, Ireland and Scotland). In some cases, VNA has been used to bring disparate PACS systems together between hospital clusters, but for the most part it has remained heavily driven by DICOM radiology and cardiology image management and archiving. If we consider Europe’s five largest markets (Germany, France, UK, Italy, Spain), only the UK and Spain are starting to show any real development towards integrating non-DICOM content into VNA. For the remainder, there are very few examples of collaboration outside of the core DICOM applications, with most limited to academic or university hospitals.

Perhaps more telling was the lack of attendance from two of the largest VNA vendors globally: IBM Merge and Lexmark Healthcare (recently acquired by Kofax). Both are predominantly active in the North American market, but also have customers in Europe. Their absence from the exhibition might suggest they don’t yet see enough enterprise VNA opportunity in Europe.

Alternatively, there could be another factor: enterprise IT adoption (including multi-application VNA) will be decided not by imaging specialists (such as radiologists and cardiologists) but by Chief Information Officers (CIOs), as we’ve seen more recently in North America. While this could have a positive impact in driving enterprise IT strategy and connecting disparate parts of health organisations, it could also have negative connotations for radiologists: less choice of radiology software, overarching clinical IT decision-making and Electronic Health Record (EHR) vendors with greater customer influence.

Today, radiology IT decision-making remains very much in the hands of radiology departments for most of Europe, but it might not stay there for too much longer.


Applying Deep Learning to Ultrasound – Is the Technology Ready to be Embedded?

Written by Simon Harris

Much like at last year’s RSNA conference, deep learning was one of the key themes at ECR 2017. Several speakers at the scientific sessions presented promising research results for the application of deep learning in specific use-cases. In one of the professional challenges sessions, Dr. Angel Alberich-Bayarri from QUIBIM suggested that convolutional neural networks (CNNs) may already be old news, with generative adversarial nets (GANs), a new architecture for unsupervised neural networks, showing promise for medical imaging applications. GANs may be a solution to one of the major challenges with developing deep learning algorithms – the need for large training data sets.

On the exhibition floor, there were fewer companies showing machine learning solutions than at RSNA (at least 20 at RSNA versus fewer than 10 at ECR), and in our conversations with vendors it was evident that expectations were more measured, with less marketing hype. Several of the better-known deep learning start-ups were notable by their absence, including Enlitic and Zebra Medical, as was IBM Watson Health.

Samsung chose ECR to make a big push for its S-Detect™ deep learning feature, which is currently available as an option on its RS80A premium ultrasound system. S-Detect™ for Breast makes recommendations about whether a breast abnormality is benign or cancerous. It is commercially available in parts of Europe, the Middle East and Korea and is pending FDA approval in the US. S-Detect™ for Thyroid uses deep learning algorithms to detect and classify suspicious thyroid lesions semi-automatically, based on Thyroid Imaging Reporting and Data System (TI-RADS) scores. With both applications, S-Detect™ produces a report showing the characteristics of the lesion, including composition, echogenicity, orientation, shape, etc., along with the risk of malignancy, e.g. “high suspicion”.

ContextVision, the leading independent vendor of ultrasound image enhancement software, showcased its latest research in artificial intelligence at ECR.  Its prototype VEPiO (Virtual Expert Personal Image Optimizer), which is built on the company’s Virtual Expert artificial intelligence platform, can automatically optimize ultrasound images for individual patients. VEPiO aims to improve diagnostic accuracy and reduce scan times, particularly for more challenging patients, by making automated setting adjustments to obtain the optimal image quality. The company is also exploring the use of deep learning to optimise image quality, for organ-specific segmentation and for decision-support functionalities.

Ultrasound OEMs must decide whether deep learning technology is ready to be embedded into their systems, or to take a “wait and see” approach. Although many research papers have found that deep learning can produce good results in specific medical imaging applications, often at or near the performance of experienced radiologists, these are usually based on relatively small datasets and/or small reader studies. It remains to be seen if deep learning will perform as expected in routine clinical use. Although Samsung has taken an early lead and is the first of the major ultrasound vendors to embed deep learning, it carefully positions S-Detect™ for Breast as a decision support tool for “the beginner or non-breast radiologist”.

OEMs must also decide whether to establish an in-house deep learning capability or to partner with a specialist. Deep learning engineers are a scarce and expensive resource and most mid-tier ultrasound OEMs will struggle to attract and retain talent. Instead we expect they will partner with independent software vendors, such as ContextVision. For the major OEMs, we expect to see a combination of build, buy and partner strategies. Most of the major modality OEMs have, to varying extents, established in-house R&D efforts for machine learning and with over 50 start-ups developing artificial intelligence solutions for medical imaging, there’s certainly no shortage of options for acquisitions and partnerships.

Another limiting factor is the additional processing power, typically GPUs, required for embedded deep learning algorithms. Ultrasound is a fiercely contested and price-sensitive market, and OEMs will be reluctant to add hardware cost. Initially we expect deep learning to be an optional feature on premium systems only, as with the Samsung example, but as is often the case in ultrasound, features that start out on premium systems typically cascade down to less expensive high-end and mid-range systems over time.

With deep learning technology progressing at a rapid pace, and ultrasound OEMs constantly on the look-out for the next “big thing” to differentiate their products, it seems inevitable that deep learning will increasingly be embedded in ultrasound systems, both as workflow tools to help with productivity and decision support tools to improve clinical outcomes.  It’s no longer a question of will it happen, but when will it happen, and the OEMs that wait too long will get left behind in the AI race.


Canon + Toshiba + Vital + Olea = Serious Competitor, but do they have a missing link?

Written by Steve Holloway

Following the announcement of the completed acquisition of Toshiba Medical Systems by Canon, co-branding for the new firm was proudly displayed at the exhibition. For other exhibitors at the show, it was an ominous sign. Here are a few reasons why:

Canon DR fills a hole in the Toshiba X-ray portfolio: Toshiba Medical Europe has a solid presence in the European X-ray market, but only in the interventional and fluoroscopy X-ray segments, two saturated and mature markets. Most growth in the European X-ray market in the last five years has come from Flat Panel Detector (FPD) digital radiography, for both fixed and mobile systems. This is a market where Canon has a strong reputation for FPD panel technology, as well as smaller equipment sales through its acquisition of Delft DI. Combining these with Toshiba’s strong CT, MRI and ultrasound offerings will allow the combined entity to target large imaging equipment bundle deals with a full complement of systems.

Strong focus on R&D and innovation: Both Canon and Toshiba Medical are well-known and respected for technically strong products, especially in their core application sectors. While it will take some time for the two firms to integrate R&D and manufacturing capability, the combined brand will no doubt continue to be viewed as a leading vendor for technical capability and image quality, putting them in a good position to challenge the “big three” (Philips Healthcare, Siemens Healthineers and GE Healthcare) for top spot in European imaging hardware.

Back on the acquisition trail: Perhaps the biggest challenge for the combined entity will not be imaging hardware-related, but software-related. While the Vital and Olea Medical products are highly regarded for advanced imaging and visualisation, the combined offering will be missing a central software platform for managing imaging content and workflow.

While not yet essential in Europe, the importance of clinical content management and enterprise imaging is increasing. What’s more, all the major competitors have established imaging IT platforms (Philips Healthcare’s IntelliSpace platform, Siemens’ syngo and Digital Ecosystem, GE Healthcare’s Centricity platform and Health Cloud). Even mid-size vendors such as AGFA Healthcare (Orbis) and Carestream (Vue) have a significant installed base in Europe.

Vital Images (a subsidiary of Toshiba Medical Systems) has more recently started to expand its capability to include workflow tools and VNA in its Vitrea product line, but does not yet have the scale of installed base or feature set to match other major competitors. Without such a platform, the new Canon-Toshiba venture may still find it hard to compete in large hospital networks and regional tenders requiring both hardware and software capability.

So, while the new vendor will increasingly be able to compete in Europe, it will need to make more acquisitions to boost its clinical software offerings if it is to challenge for top spot in Europe in the long term.


New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in February 2017. For further details please click here or contact simon.harris@signifyresearch.net

Signify Research Analyst Insights from ECR Today 2017

Our Daily Insights from 2017 ECR Show Paper

Written by Steve Holloway

As well as attending the ECR show in Vienna last week, Signify Research analysts also provided a daily column for the Congress Newspaper – ECR Today.

To read our insights on major themes from the show and how these relate to the European markets for ultrasound, MRI, CT, radiology IT and digital X-ray, please click on the links below for a digital version of each daily paper – we’re on page 18 in each.

A step closer to Doc McCoy’s favourite toy? (page 18)
01/03/2017 https://www.myesr.org/media/1384

Does breast MRI hold future for mammography market? (page 18)
02/03/2017 https://www.myesr.org/media/1389

Gap widening between CT innovation and installation (page 18)
03/03/2017 https://www.myesr.org/media/1499

The great enabler: artificial intelligence in radiology (page 18)
04/03/2017 https://www.myesr.org/media/1583

Where next for digital X-ray? (page 18)
05/03/2017 https://www.myesr.org/media/1653

These stories will also be run online at www.auntminnieeurope.com the week following the conference, starting Wednesday 8th March 2017.

New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in February 2017. For further details please click here or contact steve.holloway@signifyresearch.net

HIMSS 2017: PHM Market Observations

HIMSS 2017: Population Health Management Market Observations

Written by Alex Green

With Population Health Management (PHM) plastered across the majority of exhibitors’ stands at HIMSS last week, it’s hard not to be cynical about the subject and consign it to the list of many other transient themes. However, if you delve underneath the hype, there were several serious and well-defined solutions on show. Whilst no single vendor yet offers a complete solution, both in terms of technology and services, vendors now have a relatively clear and consistent definition of what needs to be offered to address PHM. One message I thought particularly hit the mark was from Health Catalyst: PHM is a verb, not a noun, and solutions should be designed with this in mind. This certainly resonated and is a message several other vendors would do well to take on board.

Here are our top five PHM market observations from the show:

EHR Vendors Catching Up Fast

EHR vendors have struggled to keep pace with some of the specialist vendors in the PHM market. This has led to companies such as Wellcentive (now part of Philips), IBM (via its ownership of Truven Health, Phytel and Explorys) and Optum taking market-leading positions, despite the EHR vendors having a significant installed base of customers that should have provided rich pickings for their PHM solutions. Providers and other customer groups have often reported that the PHM solutions offered by EHR vendors are limited in functionality compared with those offered by the specialists. In many cases the solutions are simple bolt-ons to the EHR offerings, developed largely to address Meaningful Use, rather than solutions that will drive change as health systems increasingly take on risk.

However, the tide does appear to be turning. Via acquisitions and product development, a number of the EHR vendors appear to have caught up and are starting to leverage the advantage that a large installed base of provider customers brings. This was evident from the solutions on show at HIMSS, and also from the 2016 financial results that a number of the EHR vendors announced in the run-up to HIMSS: Cerner reported $234 million of PHM business in 2016, and Allscripts reported $235 million for the same year. Both are good illustrations that PHM is starting to have positive financial outcomes for some (although not yet all) of the EHR vendors.

Analytics Moving Beyond Claims and Clinical Data

A key focus of many of the PHM and analytics vendors during the show was that PHM platforms require analytics solutions that go well beyond simple claims and clinical data when developing risk stratification. Pooling data from other sources, such as demographic data, social determinants of health, patient-generated health data, geographic information sources and other unstructured sources, is now viewed as essential by many (although there is still work to do to convince some physicians). Using this data to develop patient personas will enable providers and payers, when implementing stratification processes, to better target and communicate with different patient types.

Admittedly, the requirement to aggregate non-clinical/claims data has been the general message for some time, but I certainly witnessed an increased emphasis on this during my meetings last week. This is perhaps a sign that the companies that weren’t doing this well have started to get their act together. This is one part of the PHM market where specialist analytics solution providers still hold an advantage over some of the broader solution providers and some of the EHR vendors. Companies such as SCIO, LexisNexis and Health Catalyst certainly exhibited some particularly compelling solutions in this area.

All or Nothing

Rolling out one element of PHM does not mean you have a PHM solution. From vendor and provider discussions during HIMSS it was clear that in many cases solutions had been rolled out that utilised just one or two elements of PHM, such as a data analytics or data aggregation tool, or a patient engagement platform. Stand-alone, these components do not address the PHM need and will not give providers the outcomes they require as they make the transition to value-based care or to taking on risk. Only a complete solution that brings together data aggregation, analytics, risk stratification, care management, care coordination and patient engagement, with a results feedback loop, will allow providers to obtain the full benefits they’re looking for from PHM.

Furthermore, the technology is just the first step; organisational adjustments and infrastructure changes are essential in ensuring that the technology investment pays dividends. My discussions at HIMSS illustrated that in many cases providers and other customer groups that are dipping their toes in PHM are heading for failure because they are not embracing the whole-system approach that is required. This does not mean that a vendor needs to offer a complete solution; there is certainly room for best-of-breed specialists for certain elements of PHM. Rather, it is the providers who need to take a whole-system approach.

Consumerisation of Healthcare

The modern healthcare consumer in the US is used to having choices. In banking, retail, travel and most other areas, product or service information is abundant and decisions are made quickly, based on price, convenience, reputation and quality. Increasingly, consumers are approaching healthcare with a similar attitude. They want to be able to compare the quality of the service they’ll receive, view feedback from other service users, manage appointments online, understand the cost implications of medical procedures, contribute their health data to the decision-making process and be able to easily communicate electronically with providers. HIMSS demonstrated that technology vendors are now starting to take this on board and are developing patient engagement solutions that address the marketing needs of providers operating in an environment where their customers are increasingly fickle and demanding, and where brand loyalty carries a lot less weight. Influence Health has been a long-time player in this space with solutions that have typically been used for marketing purposes. The company is now integrating its solutions with clinical patient engagement functionality to meet this need. Salesforce continued its high-profile showing at HIMSS, adding weight to the argument that good CRM is increasingly essential in this more consumer-centric healthcare environment. EHR vendors are also developing solutions that better address the marketing needs of providers. For example, Allscripts has a compelling 2017 development plan for its FollowMyHealth solution that will go a long way towards addressing the needs of providers that want to execute sophisticated marketing strategies.

It’s Getting Harder to Differentiate

As I toured the booths last week, the constant theme I heard from vendors was how only they had the right data aggregation tools that blended claims and clinical data from multiple EHRs with social determinants of health. How only their analytics solutions gave in-depth, actionable insights for patient stratification. How only their portal went further than Meaningful Use to offer a truly compelling patient experience. Finally, how only their implementation team could offer the support providers need to really take advantage of PHM. Unfortunately, the reasons given as to why they were different were, more often than not, very similar. I am being slightly facetious here, as there were some that could provide good evidence of solid differentiators. However, most solution providers are now clear on what is needed to offer a comprehensive, well-constructed PHM solution, including support services. And although each vendor is at a different point in terms of how close they are to having a solution that addresses all facets well, they are all moving ever closer. This will ultimately make differentiation increasingly difficult.

New Market Report from Signify Research Publishing Soon

A full analysis of the population health market will be provided in Signify Research’s upcoming market reports ‘Population Health Management – North America Market Report 2017’, publishing in 1Q 2017, and ‘Population Health Management – EMEA, Asia & Latin America Market Report 2017’, publishing in 2Q 2017. For further details please click here or contact Alex.Green@signifyresearch.net.

The Signify Innovator Series: St. Joseph Health

The Signify Innovator Series: Technology Innovation within St. Joseph Health

Interview conducted by Alex Green

As part of our Innovator Series, Signify Research was able to meet up with Dr. Michael Marino, DO, MBA who is Chief, IS Operations/Clinical Systems, at St. Joseph Health, an Integrated Healthcare Delivery System. I spoke with Dr. Marino about how St. Joseph’s was pioneering the use of technology for patient engagement, population health management and telehealth.

Key takeaways

  • St. Joseph’s has rolled out a sophisticated patient engagement solution from Hart (www.hart.com) that goes well beyond Meaningful Use requirements
  • The provider is also using a risk stratification tool from Verscend (www.verscend.com) that helps maximize the benefits from its patient engagement outreach
  • It is also working with Jvion (www.jvion.com) and Clearsense (www.clearsense.com) to integrate data on social determinants of health into the stratification process
  • St. Joseph’s experience in using solutions from the large EHR vendors to address patient engagement, population health and telehealth has been disappointing to date
  • St. Joseph’s has used the Medtronic (www.medtronic.com) platform to pilot a number of remote patient monitoring telehealth initiatives, and has a partnership with MDLive (https://welcome.mdlive.com/) to enable the roll-out of telemedicine video consultation services

Can you tell me about how St. Joseph’s has been leading the way in terms of its use of innovative technology?

A good place to start is how we’ve been developing the use of patient portals as they relate to patient engagement. Initially, Meaningful Use ushered in patient portals, but the requirements were set so low that the major EHR vendors developed solutions that had very limited use. Providers only had to put in place a simple portal and sign people up; there were no requirements to ensure that the portal was useful and that people were actually using it. This will change with Meaningful Use Stage 3, but in the meantime we’ve been developing our portal so that it actually has benefits for patients and is used regularly.

A basic portal where a patient can only look at their discharge instructions, for example, isn’t going to be a portal that a patient will want to interact with regularly. In order to make portals stickier and of use to a greater share of the population, St. Joseph’s partnered with a development company that had a fair amount of experience in the more social elements of healthcare: a start-up called Hart (www.hart.com). What Hart offered was an app that allowed patients to aggregate their daily activities with their medical information.

What we’re seeing in the locations where we’ve rolled this solution out is that if you add the social components of wellness (such as step tracking, sleep tracking and challenges within friendship or other social groups) to the same portal where patients can book their annual cholesterol check, then overall use of the portal increases massively.

Once patients are used to using the portal for the wellness tracking functionality, they then start to use it for other things, such as online scheduling of appointments, reviewing discharge instructions, and booking and holding their place in a queue in the urgent care unit without having to physically turn up and wait in a room. The results have been pretty dramatic. In some of our employee-centred clinics where we’ve rolled out the Hart system, patients are now typically hitting the portal once a week, whereas before it may have been once a year. The Hart app functionality is integrated into our legacy EHR solution so the data from both can be aggregated.

Why didn’t you use your EHR provider’s portal solution?

We use Meditech’s EHR solution across all of our hospitals. At the time we made the assessment, Meditech had its standard portal that had been designed to hit Meaningful Use. However, it was three shades of blue. It didn’t offer much beyond the standard Meaningful Use requirements. For example, you could download a CCDA or see your discharge notes, but you couldn’t feed back into it. St. Joseph’s wants to embrace where the trends are going with wellness, and the Meditech solution just did not meet that need.

Is the data obtained from patients’ wellness monitoring used when you stratify how to manage that patient and the population as a whole?

This is where the big opportunity is. However, the difficulty we’re having is that there is no good evidence as to how to react to this data. So you have patients tracking their steps, but from a clinical point of view there is no evidence as to what the appropriate number of steps is. There are benchmarks that say 7,000 steps, 10,000 steps, etc., but in reality these are just arbitrary numbers that do not relate to a patient’s existing physical condition. If you’re 6’ 2”, weigh 200 lbs and consume 3,500 calories per day, how much should you actually be walking? There is a similar issue with sleep. A lot of people are really interested in tracking their sleep, but what can you do with the information? The science hasn’t caught up with the consumer yet on just what the right amounts to target are. This is where we could potentially be supported more by the solution vendors.

How will you expand how the portal will be used going forward?

The most important element is still how you encourage people to do the right thing. For example, it’s flu season: have you remembered to get your flu shot? For a diabetes patient, how can we use the portal to ensure they are having an A1C every six months? We have this kind of reminder service in place now, but have just not yet rolled it out. This is the kind of thing that’s really going to change healthcare. Historically, healthcare has been much more about me sitting in my doctor’s office and you coming to me with a problem. I’ll do a great job of interacting with you, but once you go away, that’s where the interaction ends. The portal and patient engagement will change this approach.

At St. Joseph’s we have a comprehensive set of disease registries which we use to reach out to people via a manual process. For example, doctors use the drug registry to monitor whether a patient has had their basic metabolic panel so their potassium can be checked. In the current process, a letter is sent out to remind the patient if this has not occurred. However, what we’re now starting to do is use the portal and patient engagement tools to transition this to a computer-driven process that reminds people via email, text, etc. With a computer-based system there is a lot more opportunity to keep nudging patients, ultimately driving better adherence and compliance, particularly if there is a simple call to action that can also then be followed electronically. Paper-based systems are a lot more arduous. We’re using Hart for this again. It’s ready to go and we’re just getting a critical mass of people signed up before we launch.
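The registry-to-reminder process described here can be sketched in a few lines of code. The record fields, the six-month A1C interval and the function name below are all invented for illustration; they are not taken from St. Joseph’s or Hart’s actual systems:

```python
from datetime import date, timedelta

# Hypothetical sketch of a registry-driven reminder rule: find
# diabetes patients whose last A1C test is overdue so an email
# or text nudge can be queued. Interval and records are invented.
A1C_INTERVAL = timedelta(days=182)  # roughly six months

def patients_due_for_a1c(registry, today):
    """Return IDs of patients with no A1C result within the interval."""
    due = []
    for patient in registry:
        last = patient.get("last_a1c")
        if last is None or today - last > A1C_INTERVAL:
            due.append(patient["id"])
    return due

registry = [
    {"id": "p1", "last_a1c": date(2016, 5, 1)},   # overdue
    {"id": "p2", "last_a1c": date(2016, 12, 1)},  # recent
    {"id": "p3"},                                 # never tested
]
print(patients_due_for_a1c(registry, date(2017, 2, 1)))
# → ['p1', 'p3']
```

The advantage over the letter-based process is that a rule like this can run daily and keep nudging until the call to action is completed.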

Do you have in place any solutions that build in risk stratification so that you know where to focus?

As well as the standard registries that allow us to put people into cohorts, we’re using a platform from Verisk Health, now Verscend (www.verscend.com). Their solution allows us to score patients, put them into cohorts and stratify how these cohorts are managed. We then have nurse navigator teams that actively manage the cohorts based on the information from Verisk’s platform. The fact that St. Joseph’s, and California in general, has been in the risk business for some time means that this isn’t that new. PHM is just an extension of this traditional function of managing risk.

However, the portal and patient engagement tools now mean patients can be better managed when discharged, and we’re no longer relying on doctors just stating, “Here’s your paperwork, have a nice day, see your doctor in two weeks.”

How important is non-clinical data in this risk stratification process, for example social determinants of health?

Very. On the hospital side, we have two pilots where this is key. For the two pilots we’re working with two different analytics companies that allow us to feed in data, such as social determinants of health, into the decision making and risk stratification process. The two analytics companies are Jvion (www.jvion.com) and Clearsense (www.clearsense.com).

Jvion is made up of a team of former Google engineers that have been collecting data for around a decade, and now have a huge database of population information such as the average income level on my street, how many people are in each household and the demographics of those individuals. St. Joseph’s is marrying that up with our EHR data. The example I like to use is if I have a knee replacement, I go back to a very nice household where my wife is a doctor. If a different person has a knee replacement, for example a mechanic who lives by himself, in a house with lots of stairs, doesn’t have a support network and doesn’t have easy access to transport, then he needs a different level of post-care support. Both of us could look the same clinically: 55-year-old males, 6’ 2”, a little over 200 lbs. However, based on clinical data alone you could end up driving interventions where they’re not needed. The patient that lives alone and dropped out of high school may not have understood his discharge instructions well and probably does need a home visit, whereas I may not. The Jvion platform allows us to feed other non-clinical information into the decision-making process. Information that fleshes out this picture can be of huge value as we try to maximise the use of our resources.

How do you address the issue that you don’t always have a complete longitudinal view of the patient’s healthcare interactions?

We have care management and coordination tools that we’ve used in the past, such as the solution from Allscripts. It’s not great and it doesn’t integrate well, even into Allscripts’ own platform. I don’t think anybody does this well, and that’s certainly a challenge for the vendors to improve their solutions in this area. Vendors will tell you their solution is great and up to the challenge, but they’re not there yet.

We were also an early adopter of Explorys’ platform, now part of IBM. We still have their tools but the difficulty we’d have is that once you get out of the IT conversation and into the operations discussion, the vendors always want you to go after their new shiny tool. Explorys was a perfect example. We’ve had Explorys for four or five years, we run the data, we have all these great registries that have been built within Explorys, but unless it perfectly matches our operations workflow it’s not helpful.

What programs do you have in place around remote patient monitoring?

We’ve been using remote patient monitoring platforms that support the use of blood pressure monitors, weight scales, pulse oximeters and blood glucose monitors for three and a half years. Initially we developed the solution with Hart; however, after a while they decided this wasn’t an area they specifically wanted to focus on, and so now we’re transitioning to the Medtronic (www.medtronic.com) platform.

To date we’ve rolled out our remote patient monitoring solution to a relatively small cohort of a couple of hundred patients. This significantly reduced readmissions in that group and was very successful. Information from the remote monitoring was rapidly getting out to nurse teams, and cases were escalated to doctors when needed so that action could be taken to stop readmissions before they occurred. The difficulty we’re having relates to the question of when to stop remote monitoring. Should we monitor for three months, for six months, or should the monitoring continue for the remainder of the patient’s life? We always try to do things that are evidence-based, but when you go to the literature for evidence on remote patient monitoring best practice, there is very little advice.

In terms of rolling out further, we are looking to expand this cohort. There are two things holding it back: the evidence on when to stop and then potentially restart, and the issue of how this is paid for and the return on investment.

So what’s your vision for how technology and innovation will be used going forward in St. Joseph’s?

One area where we’re planning on innovating relates to telemedicine. We currently have a partnership with MDLive (https://welcome.mdlive.com/) where we’re increasingly rolling out video visits to patients. We’ve already rolled it out at multiple sites and are working through all our ambulatory sites up to this Spring.

Some patients want old fashioned care with the same doctor and limited use of technology. Others, often with less complicated care requirements, don’t care who they see. If the issue is relatively uncomplicated many want a video visit that’s quick and convenient. At the same time if a patient needs to see a real person they want the tools to quickly figure out when and where to get treated. They want to do this electronically just like when booking a flight or a restaurant.

On the other end of the spectrum, where people truly have lots of health problems, it’s understanding their risk, helping them manage their way through the system, giving them tools so they can track their medicine online and easily access their paperwork. This is where a comprehensive patient portal is crucial.

The minute you tell someone they’re going home from the hospital, that’s the last thing they hear. Immediately the patient starts to think about the logistics of other parts of their life: picking up the dog, collecting groceries, visiting family, and so on. They miss the instructions the doctor is giving on changing the bandage, picking up prescriptions, planning a return appointment with the physician and seeing their regular doctor. With a good portal that the patient is used to using, you can send this information and build in reminders so that instructions are followed. This is really where we’re focusing our efforts going forward.

About St. Joseph Health

St. Joseph Health (SJH) is a value-based healthcare delivery system that serves residents throughout Northern and Southern California, West Texas and Eastern New Mexico. SJH provides a full range of care facilities including 16 acute care hospitals, home health agencies, hospice care, outpatient services, skilled nursing facilities, community clinics and physician groups. For more information visit www.stjhs.org.

For further details please click here or contact Alex.Green@signifyresearch.net.

HIMSS 2017: Clinical IT Show Report

HIMSS 2017: Clinical IT Show Report

Written by Steve Holloway

After a hectic week of meetings, booth tours and press briefings, here’s The Signify View on the key takeaways from the HIMSS 2017 meet for Imaging IT and Clinical Content Management (CCM) IT stakeholders:

Emergence of Agnostic Clinical Enterprise (ACE) Platforms

While the shift has been gradual, it’s clear that the world’s largest imaging IT vendors are making a bid to “lock in” their customer base to broader, enterprise clinical IT platforms. Siemens Healthineers announced its “Digital Ecosystem” model at the show, following similar recent announcements from Philips Healthcare (“Intellispace” platform) and GE Healthcare (“Health Cloud”). The move is hardly surprising – enterprise EHR vendors have done little to establish any real expertise in best-of-breed clinical IT or imaging IT software to date. Therefore “the big three” are leveraging their clinical expertise and modality hardware footprint to expand the breadth of their clinical IT offerings, including analytics, dashboarding, integrated workflow and even population health and telehealth capability. These new solutions, which Signify Research has termed Agnostic Clinical Enterprise (ACE) platforms, look set to be the foundation for future cross-discipline implementations.

In adopting the ACE platform model, there are many benefits for provider and vendor alike. The single ACE platform model allows the vendor to become embedded in the providers’ core clinical workflow and care management, while also putting themselves in prime position to win long-term, managed service deals, including imaging hardware, clinical care device supply and lucrative professional services.

For providers, the ACE platform model offers a single vendor to deal with for clinical IT (“one-throat-to-choke”) and a partner to share the risk of previously capital-intensive procurement. Moreover, over time the ACE platform model positions the core platform vendor as a prime contractor: if the provider wants to bring in a new technology or software for a specific clinical function, the ACE platform vendor has responsibility for sub-contracting and integrating the new module into its platform. This should lead to greater choice for the provider in each clinical discipline.

It is still early days for the ACE platform approach, as was evident from the solutions on show. The few ecosystems being touted are essentially proprietary, available only to current customers of the vendor’s software and integrated with a few select partners. The benefit of choice for providers therefore remains very limited. Little has also been said about how the new clinical ecosystems will interact with incumbent EHR platforms; at any sign of encroachment into acute EHR, the focus on interoperability will soon be lost. Furthermore, the clinical IT market is awash with mid-size and small vendors, each with a market role to play. Mid-size vendors with partial imaging modality businesses, such as AGFA Healthcare, Carestream Health and Fujifilm Medical, will need to make a strategic decision: build their own ACE platform ecosystem, or specialise in a clinical area and look for platform partners. Smaller vendors, such as those with best-of-breed clinical capability, will probably sub-contract into ACE platform ecosystems or ultimately be acquired. The smaller, more generalist vendors may well soon see their addressable market shrinking.

FHIR gains momentum but blockchain the missing link?

Interoperability was certainly on the agenda at HIMSS, albeit dwarfed by the big themes of cybersecurity, artificial intelligence and US health legislation uncertainty. Adoption of the new Fast Healthcare Interoperability Resources (FHIR) standard was evident in some new solutions on show, but was patchy and far from mainstream – unsurprising, considering FHIR is far from a well-defined and established standard today.
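For readers unfamiliar with the standard, FHIR models clinical data as small JSON (or XML) “resources” exchanged over a plain REST API. The sketch below shows a minimal Patient resource; the field names follow the published FHIR specification, while the identifier and demographic values are invented:

```python
import json

# A minimal FHIR Patient resource. Field names follow the FHIR
# specification; the values themselves are invented examples.
patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "name": [{"family": "Smith", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1962-04-01",
}

# A FHIR server returns documents like this in response to a
# RESTful request such as: GET [base]/Patient/example-123
print(json.dumps(patient, indent=2))
```

The appeal for interoperability is that any conformant system, EHR or clinical IT alike, can parse and produce the same resource shapes over standard web protocols.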

In contrast, industry hype around blockchain was more evident and widely discussed, especially as the major EHR vendors were keen to be seen visibly working on interoperability. It was clear, though, that beneath the hype little has happened so far. There are few working examples of blockchain in healthcare, and almost all are tied to the world of financial administration and payer-provider workflow.

From a clinical IT perspective, blockchain is a long way off. It is certainly intriguing in that its traceability could offer a true “longitudinal” record for clinical audit, especially when applied to tracking the patient through complex, multi-provider clinical pathways. In addition, some at the show saw blockchain as a smarter way to drive interoperability and the exchange of information between different health providers and vendor platforms.

That said, it is still very early days for blockchain in healthcare. Most major vendors are only starting to investigate its potential, and scepticism remains high despite the hype. One discussion we held led to a comparison of healthcare and finance in the adoption of new technology and standards, with healthcare estimated to be approximately 20 years behind finance. Blockchain in finance today is only just starting to be trialled, so for healthcare it’s reasonable to assume it’s a long way off.
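The traceability property mentioned above is easy to demonstrate in miniature. The toy below chains audit entries with hashes, so altering any historical entry invalidates the rest of the chain; this captures only the hash-chaining idea behind blockchain, with none of the distributed-consensus machinery a real ledger needs:

```python
import hashlib
import json

# Toy hash-chained audit log: each entry commits to the previous
# entry's hash, so tampering anywhere breaks verification.
def append_entry(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    chain.append({"event": event, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Re-derive every hash and check the links are intact."""
    for i, entry in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"event": entry["event"], "prev": prev},
                          sort_keys=True)
        if entry["prev"] != prev:
            return False
        if entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True

chain = []
append_entry(chain, {"provider": "Hospital A", "action": "admitted"})
append_entry(chain, {"provider": "Hospital B", "action": "imaging"})
print(verify(chain))  # True for an untampered chain
```

Changing any earlier entry (say, rewriting the first event’s action) makes `verify()` return False, which is the audit guarantee that makes a longitudinal, multi-provider clinical record an interesting use case.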

Artificial Intelligence Meets Workflow

There was a healthy dose of reality being espoused by most participants with regard to artificial intelligence (AI). It seems most now understand that AI will not be replacing physicians or fully diagnosing patients anytime soon, so they are instead focusing on the benefits it really can provide. Interestingly, AI has also driven far more vendor partnerships to date, with major clinical IT vendors increasingly looking to work with AI specialists in targeted clinical applications.

There was also an array of AI solutions on show, almost all focused on a few key areas: workflow, efficiency, analytics, clinical audit and, in some select cases, physician decision support. Most clinical focus was on workflow analytics, with examples such as customised pre-fetching of images or clinically relevant content to add context to a diagnosis. Clinical dashboards and audit were also a clear topic, with a range of solutions on show to enable providers to better monitor and predict adverse events or clinical compliance based on real-time clinical data. Automation of manual processes (quantification tools in particular) was also shown in a variety of clinical applications.

However, as was evident from the findings of the recent Signify Research report on Machine Learning in Medical Imaging (published January 2017), adoption of AI for decision support and Computer Aided Diagnosis (CADx) will be a gradual process, with only a few specific clinical applications commercially available in the next five years.

Cloud Gaining Traction

Unlike many of the trends discussed above, which are still in their infancy, the adoption of cloud technology for clinical IT appears at last to be gaining traction. Vendors were reporting far greater interest in and uptake of cloud solutions, suggesting providers are overcoming concerns about data security and want to take advantage of the greater flexibility enabled by cloud implementations. Notably, Microsoft Azure and Amazon Web Services (AWS) were regularly discussed as making significant inroads into cloud service provision for healthcare over the last year.

That said, penetration of full third-party hosted architectures remains relatively niche and is tied closely to the scale of deployment. For small private centres and physician offices, fully hosted and Software-as-a-Service (SaaS) solutions make sense as the burden and cost of managing IT hardware, middleware, storage and maintenance are removed. For small to mid-sized hospitals, the hybrid architecture model is often more appropriate; third-party hosting is used for long-term storage, back-up and disaster recovery off-site, while primary data remains on-site. For larger hospitals and multi-provider networks, most have already invested in long-term IT infrastructure to support their networks, along with sizeable IT administration teams. Therefore, “private cloud” is most common, in which the clinical IT software is hosted on the providers’ infrastructure with a variety of in-house or third-party maintenance options.

Consequently, vendors were keen to showcase cloud and mobile-access solutions at the show, especially with regards to viewers and data management solutions (be it image archive, vendor-neutral archive or independent clinical archive solutions).

Reality Bites

Despite the recent broad theme of interoperability in healthcare, the gulf between clinical IT vendors and EHR vendors was still palpable. Progress in some applications, such as risk stratification analytics and care management (as part of population health), telehealth and clinical archiving, was on show, but mostly the worlds of clinical IT and EHR remained very much separate. Diagnostic imaging especially appeared to remain estranged from wider health IT, and it was notable how few imaging-focused symposia sessions were part of the HIMSS schedule. So, while interoperability was much hyped and the largest clinical vendors are looking to expand their clinical capability, there appears to be little change yet in breaking down the barriers to clinical data interoperability in the complex mesh of vendor, provider and payer networks.

New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in February 2017. For further details please click here or contact simon.harris@signifyresearch.net.

User-Specificity Key for Cloud Adoption in Imaging IT?

A Tipping Point for Cloud Adoption in Imaging IT

Written by Steve Holloway

The quaint English phrase “Good things come to those who wait” is apt advice to industry stakeholders bullish on the adoption of cloud technology for imaging IT. After a decade of wholesale change to imaging IT, there is little evidence to show cloud adoption has progressed towards the mainstream. Yet, a growing array of cloud-based imaging IT solutions are now available and cloud is again being widely debated. Could this latest breed of products signal the start of mainstream adoption?

Different Users Have Very Different Needs

“One size fits all” has rarely worked well in imaging IT. In fact, almost every imaging IT deployment today is different, nuanced by the unique complexity of pre-existing infrastructure, legacy software, organisational complexity, scale and user needs. This is too often overlooked by vendors and providers, leading to long, complex deployments and expensive professional services bills. Past products have also fallen foul of this critical fact and failed to cater for specific user groups.

The driving factors for selecting a cloud solution for imaging IT vary significantly between user groups. A large academic hospital will in general look at cloud as a means to improve accessibility to imaging data, with the ability to rapidly scale up new deployments and upgrades while maintaining security and data ownership. Cloud adoption is not, as is regularly misunderstood, a cost-saving exercise for imaging IT in this scenario; frankly, the savings pale into insignificance compared with recent investment in electronic medical records (EMR) or the cost-saving potential of better care pathway management. In contrast, a small community hospital usually wants to better manage cost, limit exposure to on-site hardware downtime and improve flexibility of use.

Small Providers to Drive Adoption?

At last it appears the industry is coming to terms with this and new cloud imaging IT products are being targeted at specific user groups. For small clinics and hospitals, this is more commonly in the form of SaaS-based products, with more predictable cost, integrated maintenance and support, and full off-site storage. Moreover, the push towards modular imaging IT software will help to spur this change, leading to a common feature set of image ingestion, storage, workflow and viewing in a secure, thin-client, hosted environment. Adoption to date has been slow, reliant on the “trickle-down” of new technology from deployments at larger institutions. However, given the growing abundance of lower-cost solutions targeting this market, the adoption of hosted cloud imaging IT solutions should outstrip that of the larger provider segment.

For larger providers and networks, tied to large capital budgets and long-term infrastructure investment, the managed service approach is a harder sell. Here the onus is on workflow capability and, above all, speed. Cloud is viewed as an enabler, to help develop inter-disciplinary information sharing, improve data access and improve the roll-out and scaling up of new software and features.

However, data ownership and security are also of paramount importance to the provider, and willingness to allow data “off-site” is uncommon. Consequently, vendors are now offering imaging IT software that better fits the “private cloud” model, either in a managed model (third-party administration and operation) or a non-hosted model (software is located and managed by the health provider in its own data centre, but is accessible via a proprietary private cloud network). Cloud adoption will certainly ramp up in the larger provider market, but with more focus on managed or non-hosted private clouds and far less third-party hosting.

External Forces

So, while user demand varies significantly and the availability of user-specific cloud imaging IT solutions is improving, a few other factors are further pushing cloud technology towards a mainstream adoption tipping point. The focus on integrated care between disciplines and providers means data interoperability is top of many health providers’ agendas, while more common use of mobile technology is demanding data access from any networked location or device. Furthermore, a greater push for patient access to, and ownership of, data is focusing the industry on using cloud as a means of increasing the use of patient-provider data portals (enabling wider provider choice) and of improving security and data confidentiality. Add to this a global shortage of radiologists, stimulating growth in teleradiology and remote reading, and the digitalisation of pathology images, demanding more cost-effective storage options, and the case for cloud imaging IT becomes far stronger.

In the mid- to long-term, the influence of deep learning and artificial intelligence will also play an industry-defining role in driving cloud adoption. Decision Support Tools and Computer-Aided Diagnosis (CADx) software will be based on deep learning platforms that need widespread access to large volumes of imaging data in order to “learn” from it. Therefore, the common model today of on-premise imaging IT and data storage will be a major barrier to deep learning unless cloud technology is embraced.

Taken together, these factors point to a need for more widespread adoption of cloud technology for imaging IT. Of course, barriers such as security concerns and data ownership still exist, but the direction and demands of future healthcare appear set to erode them soon. So, while the tipping point for cloud-enabled imaging IT is probably still a few years off, those “good things” are just around the corner.

Initial Takeaways from PHM Vendor 2016 Financial Results


Written by Alex Green

Many of the larger EHR vendors that are active in population health management have recently released their Q4 2016 and full year results, with more to follow shortly. Signify Research examines what we can take away so far.

Cerner Corporation

  • Q4 2016: Revenue up 7% on Q4 2015
  • Full year 2016: Revenue grew 8% compared to 2015 to $4.8 billion
  • Population Health: Revenue grew 13% in 2016 compared to 2015

The top level: For a relatively mature company, 8% growth for the year would seem healthy. However, it is much less than the 30% revenue growth seen in 2015 and it was also down on the double-digit guidance that the company gave at the start of the financial year.

Much of this shortfall was a result of a decline in Cerner’s Systems Sales, which fell from $1.28 billion in 2015 to $1.27 billion in 2016, with licensed software and technology resale taking the brunt of the decline. However, this was partially offset by a 30% increase in SaaS software sales.

PHM driving SaaS Growth: The biggest driver of the increase in Cerner’s SaaS business was population health management (PHM). Overall, Cerner’s PHM business was up 13% on the 2015 figure of $214 million. Good news, but not without caveats.

The PHM business grew at a significantly faster rate in 2015 (21%) and, as a share of Cerner’s overall business, PHM remains at around 5%, broadly unchanged over the last three years and in fact slightly down on the 2014 figure.
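These reported percentages can be sanity-checked with a quick back-of-envelope calculation (illustrative only; the implied 2016 PHM revenue below is derived from the growth rates quoted above, not a company-reported figure):

```python
# Back-of-envelope check of Cerner's PHM figures quoted above.
phm_2015 = 214e6      # reported 2015 PHM revenue ($)
total_2016 = 4.8e9    # reported 2016 total revenue ($)

phm_2016 = phm_2015 * 1.13         # 13% growth -> implied 2016 PHM revenue
phm_share = phm_2016 / total_2016  # PHM share of the overall business

print(f"Implied 2016 PHM revenue: ${phm_2016 / 1e6:.0f}M")  # ~$242M
print(f"PHM share of total revenue: {phm_share:.1%}")       # ~5.0%
```

The implied share of roughly 5% is consistent with the flat three-year trend noted above.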

Time to deliver on long-term PHM strategy: In its investor communiqués during 2016, Cerner continued to push the message that it sees PHM as a significant growth driver for its overall business in the medium term, with a target of PHM driving approximately 20% of revenues by 2025. If this is to remain realistic, it needs to see an upturn in its PHM business growth soon.

The company does believe that the market is approaching an inflection point for population health, as the transition from fee-for-service to at-risk models accelerates, a view that Signify Research agrees strongly with. One ingredient that has been lacking in the Cerner PHM strategy so far has been a strong advisory portfolio to address client implementation and process reengineering demands that accompany a PHM rollout. Some of its competitors, particularly those not from an EHR background such as Health Catalyst, have done a better initial job of this. Therefore, the company’s recent commitment to expand this area of its solution should certainly assist with growth towards the longer-term target for PHM.

Athena Health

  • Q4 2016: Revenue up 12% on Q4 2015
  • Full year 2016: Revenue grew 17% compared to 2015 to $1.08 billion
  • Population Health: 2.2 million lives covered with population health solution
  • Patient Engagement: 57,861 providers using patient engagement solution

Core business: Athena Health achieved overall revenues of $1.08 billion in 2016, up from $925 million in 2015. This represents 17% growth, down a little on the previous year and at the bottom end of the guidance given out at the start of 2016.

Most of the company’s revenues are from its core ambulatory EHR and revenue cycle management businesses, both of which grew strongly in terms of provider customers. In 2016 its athenaCollector revenue cycle management user base grew 16% to 87,691 providers and its athenaClinicals EHR user base grew 26% to 41,340 providers. However, for both products the growth rate was lower than in 2015, reinforcing the fact that for Athena to continue to see growth at rates similar to previous years, it needs to move beyond its traditional customer base.
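For context, the growth rates above imply the following end-2015 provider bases (an illustrative derivation from the reported figures, not numbers Athena has published directly):

```python
# Implied end-2015 provider counts, derived from Athena's reported
# end-2016 figures and year-on-year growth rates quoted above.
collector_2016, collector_growth = 87_691, 0.16  # athenaCollector
clinicals_2016, clinicals_growth = 41_340, 0.26  # athenaClinicals

collector_2015 = collector_2016 / (1 + collector_growth)
clinicals_2015 = clinicals_2016 / (1 + clinicals_growth)

print(f"Implied 2015 athenaCollector base: ~{collector_2015:,.0f} providers")
print(f"Implied 2015 athenaClinicals base: ~{clinicals_2015:,.0f} providers")
```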

Addressing PHM through patient engagement: Athena Health is very aware of this need to expand beyond its traditional ambulatory markets and products and it sees PHM as an opportunity to aid execution of this strategy. However, to date what it labels as its PHM business is relatively small, representing less than 1% of the company’s revenues.

However, when you include Athena’s patient engagement product line, athenaCommunicator, as part of the PHM market, then it’s already a significant player, with 64,763 providers using its patient engagement solution at the end of 2016, up 23% on the previous year.

The right direction: On the surface, the Athena strategy towards PHM does seem somewhat disjointed. Patient engagement and care coordination are typically viewed as components of PHM. However, the Athena product offering keeps them separate and to some extent restricts their use by limiting implementation of certain PHM solutions, such as athenaCoordinator, to customers that also use athenaCollector or athenaClinicals. That said, its overall strategy of expanding its offering beyond its core ambulatory claims and EHR customer base is the right one for the company.

Others Yet to Report

These two companies represent an important but relatively small share of the overall population health management market. Many other key players in the PHM market are still to announce their full year results.

For example, Allscripts is due to announce its results on 16th February. As with Cerner and Athena Health, Allscripts’ results for the first three quarters of the year were mixed. It had seen growth in its PHM business of 4.4%, but in a similar vein to Cerner, the share that PHM was taking of the overall business had remained relatively stagnant. Tomorrow’s announcement will show whether the final quarter of the year has changed this.

Conifer Health Solutions is also due to announce its results (via its owner Tenet Health) later this month. Conifer had grown its business 13.6% during the first three quarters of the year. PHM still plays second fiddle to Conifer Health’s revenue cycle management business, in particular its captive business with Tenet. However, PHM is central to its growth strategy and the final quarter’s results should give some insight into how successful it has been in executing on this.

Analysis of All PHM Players’ 2016 Performance in New Market Report from Signify Research, Publishing Soon

A full analysis of all leading PHM players will be provided in Signify Research’s upcoming market reports ‘Population Health Management – North America Market Report 2017’, publishing in March 2017.

This will include analysis of the PHM businesses of Cerner, Allscripts, Medecision, GetWellNetwork, eClinicalWorks, AxisPointHealth, HealthCatalyst, Emmi Solutions, Meditech, AthenaHealth, Welltok, Verscend, Transcend Insights, Philips, Orion Health, Optum, NextGen Healthcare, YourCareUniverse, MedHost, McKesson, Lightbeam, InfluenceHealth, IBM, I2I Population Health, HealthDialog, GetRealHealth, GE Health, Forward Health, Evolent, Epic, Enli, Conifer Health, Caradigm, Aetna and The Advisory Group.

For further details please click here or contact Alex.Green@signifyresearch.net.

Machine learning in radiology targets efficiency


Written by Simon Harris for AuntMinnie Europe

Whilst artificial intelligence (AI) is unlikely to replace radiologists any time soon, a new breed of machine learning-based software applications is poised to take on many of their tedious, repetitive, and time-consuming tasks – improving productivity and freeing up more time to focus on value-added activities.

In most countries, radiologists are already operating at, or near, capacity; any further gains in efficiency are likely to be derived from the use of “intelligent” workflow software tools. Furthermore, radiology is evolving from a largely descriptive reporting model to a more quantitative discipline, placing added demands on radiologists. As a result, the need for workflow efficiency has never been greater. It’s time to cut through the AI hyperbole and take advantage of the many benefits that machine learning is bringing to radiology.

There is a growing array of intelligent image analysis products that automate various stages of the imaging diagnosis workflow. Whilst early generation computer-aided detection (CAD) products largely failed to meet expectations, the application of advanced machine-learning techniques such as deep learning will, in part, enable CAD products to evolve from purely detection systems to more advanced decision-support tools.

This is the opening extract of a feature article for AuntMinnie Europe.

To read the full article, please click here.

(Access to the article may require free membership to AuntMinnie Europe – it’s full of great content and insight so well worth signing up!)

Quantitative Imaging Market to Exceed $500M in 2021

Quantitative Imaging Software Market to Exceed $500 million in 2021

Written by Simon Harris

The market for Quantitative Imaging Software1 comprises a wide range of software tools that provide automated measurements of anatomical structures, for various body sites and across multiple modalities. These tools vary in complexity, from simple size measurements to more advanced metrics, such as perfusion analysis and texture analysis.

The market is being driven by the evolution of radiology from a largely descriptive field to a more quantitative discipline. Quantitative imaging, also called radiomics, is the use of algorithmic tools to provide objective and repeatable measurements of imaging biomarkers, such as size, texture, calcification, location in the organ and rate of growth. These biomarkers are indications of disease characteristics and may be useful for predicting prognosis and therapeutic response. Quantitative imaging decreases the subjectivity associated with radiologist interpretation of medical images, leading to increased diagnostic and prognostic accuracy.

Quantitative imaging tools have been available for many years and are typically sold as applications for advanced visualisation platforms. There is a growing trend to combine quantitative imaging data with other relevant information, such as pathology reports and patient information extracted from an EHR. This relatively new class of products, called Decision Support Tools, is forecast to take off in the coming years, with the first products now entering the market.

Several companies are developing Computer-Aided Diagnosis (CADx) systems that provide the functionality of Decision Support Tools and also provide interpretation of medical images, for example, a probability score for the presence of cancer. The first FDA-approved CADx systems are expected to enter the market later this year. These first-generation CADx systems will have narrow diagnostic capabilities and will be limited to specific parts of the body and specific modalities, e.g. diagnosis of breast cancer from MRI scans. The introduction of CADx systems with broad diagnostic capabilities, at an affordable price point, will be the trigger for more widespread uptake, but this is likely to be several years away.

This topic and other issues will be explored in full in Signify Research’s upcoming market report ‘Machine Learning in Radiology – World Market Report’, publishing in January 2017. For further details please click here or contact Simon.Harris@signifyresearch.net.

 

1 The Quantitative Imaging Software market comprises Quantitative Imaging Tools, Decision Support Tools and Computer-aided Diagnosis Systems.

Digging into GE & Philips Annual Results


Written by Steve Holloway

Two of the largest global health technology vendors released their Q4 2016 and full year 2016 results in the last week. We dig into the key takeaways for both GE Healthcare and Philips Health Tech divisions.

 

GE Healthcare

  • 4Q 2016: Health division posted revenue growth of 3%; operating profit up 10%
  • Full year 2016: Health division revenues posted growth of 4%; operating profit up 10%

The Signify View:

GE Healthcare posted solid fourth-quarter and full-year results for 2016. Here are some key highlights and clues as to future performance:

Lifesciences increasingly a strong growth driver: GE has been making significant inroads into the Lifesciences marketplace, showcased by 9% growth in Lifesciences revenues in Q4 2016. While no split is provided in the recent full year 2016 results, a 2015 company report shows Lifesciences accounted for close to one quarter of the $18 billion GE Healthcare business line (~23%). This push towards biotechnology and pharmaceuticals is clearly paying off and will also help position the Healthcare division for future growth markets, especially in areas such as predictive disease treatment, cell therapy, personalised medicine and genomics. Moreover, diversification into the more predictable Lifesciences sector has also helped the division smooth out swings in its core capital-intensive diagnostic imaging, clinical care and healthcare informatics business lines.
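A rough sizing of the Lifesciences business follows from the figures cited above (illustrative: the 2015 share applied to the ~$18 billion division revenue, so a 2015-vintage estimate rather than a reported 2016 figure):

```python
# Rough sizing of GE's Lifesciences business from the figures above.
ge_healthcare_revenue = 18e9  # ~$18B GE Healthcare division revenue
lifesciences_share = 0.23     # ~23%, per the 2015 company report

lifesciences_revenue = ge_healthcare_revenue * lifesciences_share
print(f"Implied Lifesciences revenue: ~${lifesciences_revenue / 1e9:.1f}B")  # ~$4.1B
```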

Long-term play in BRIC+ regions is starting to pay off: GE Healthcare has made it a strategic priority to push into emerging healthcare markets in the BRIC+ regions, at significant cost. While this has delivered growth in some regions (such as China) other emerging markets can be fickle, with large swings in procurement demand for high-end products, adding further risk to this strategy. However, Q4 2016 results suggest this strategy is starting to pay dividends, with Healthcare division revenue in China growing 19% and Latin America 16% (in comparison to 6% growth in Europe and -1% in the US). This may also reflect GE’s strategy for “in-region, for-region” manufacturing, enabling it to compete in lower-cost product markets against local competitors.

Core device business is steady but not stellar: Posting a slim 2% growth for non-Lifescience segments (such as Imaging, Clinical Care and Health Informatics) in Q4 2016 is a reasonable performance (in comparison to a revenue decline in Q4 2015), especially with operating profit increasing by 10%. However, fourth quarter results are often a useful marker as to the robustness and future potential of capital-intensive markets such as Diagnostic Imaging and Healthcare Informatics. Therefore, slim revenue growth in Q4 suggests GE Healthcare has more work to do in 2017 in its traditional healthcare markets.

 

Philips Healthcare

  • 4Q 2016: Health Tech division posted revenue growth of 5%; adjusted EBITA at 15.3%, up 1.9 percentage points on Q4 2015
  • Full year 2016: Health division revenues posted growth of 5% on full year 2015

The Signify View:

Philips Healthcare posted healthy Q4 2016 and full-year 2016 results. Here are some key highlights and clues as to future performance:

Large, long-term deal strategy paying off: Philips has been one of the pioneers in pushing a managed service approach for hardware and software to health providers. With several long-term deals (often 10 years or more) bundling imaging and clinical device hardware with health informatics software closed in 2016, Philips is becoming more entrenched with key large clients and protecting its installed base long-term, while ensuring more repeatable revenue generation. Furthermore, as the health informatics sector consolidates towards central health informatics platforms for healthcare providers, entrenched vendors with long-term bundled managed service deals will be in a prime position to capitalise.

Innovative health tech sectors yet to be realised: Philips has made a strategic move in recent years towards markets that combine technology with new care models. These newer products such as telehealth are part of Philips Healthcare’s Connected Care & Health Informatics business line. While it has built up a solid reputation as a leading vendor in this sector, this move has yet to provide strong growth. Accounting for 19% of healthcare revenues (when combined with Health Informatics), the Connected Care business posted comparable growth of 4% on 2015, the same rate as Philips Diagnosis and Treatment business (which accounts for 39% of revenues). Given that Connected Care is relatively small, yet covers an under-penetrated market with high-potential for growth, it would seem Philips has yet to fully capitalise here.

Mature markets positive; emerging markets steady; Western European orders take a dive: Unlike GE Healthcare (see above), Philips Healthcare posted solid comparable fourth-quarter revenue growth of 4% and 5% in Western Europe and North America respectively. However, emerging “growth geographies” were far weaker than for GE Healthcare, with Philips Healthcare comparable sales growth at only 5% for 4Q 2016. This is because Philips has invested less aggressively in emerging markets and has been less focused on rolling out “value” device products specifically for those markets, instead preferring to focus on higher-end devices in more established markets. Consequently, Philips’ portfolio is more exposed to swings in mature markets and less exposed to swings in emerging markets, such as the BRIC+ regions.

Given this, one concern is evident for Philips’ Health Tech division in 2017: the outlook for mature markets is increasingly uncertain, especially with US healthcare change imminent under President Donald Trump and a worsening economic picture for Europe. This is perhaps exemplified by the steep drop-off (15% in Q4 2016) in order intake in Western Europe for the Diagnostic & Treatment and Connected Care & Health Informatics segments. So, while 2016 revenues have been solid in mature markets, 2017 could be a more challenging year for Philips Health Tech’s core business.

Time to Move Beyond Meaningful Use & MACRA

The Signify View: Time to Move Beyond Meaningful Use & MACRA

Written by Alex Green

Meaningful Use targets set by the Centers for Medicare and Medicaid Services (CMS) have served a purpose in rapidly bringing patient engagement and population health management (PHM) to the forefront of the US healthcare industry. However, providers, payers, employers and consumers have a huge amount more to gain if, and when, the use of these platforms moves beyond the box-ticking exercise of hitting Meaningful Use and MACRA targets.

Here’s our take on why:

The Signify View

The modern healthcare consumer in the US is used to having choices. In banking, retail, travel, and most other walks of life, product or service information is abundant and decisions are made quickly, based on price, convenience, reputation and quality. Increasingly, consumers are approaching healthcare with a similar attitude. They want to be able to compare the quality of the service they’ll receive, view feedback from other service users, manage appointments online, understand the cost implications of medical procedures, contribute their health data to the decision-making process and be able to easily communicate electronically with providers.

At the same time, health providers are under pressure to adapt to this new, consumer-driven healthcare market environment. They need to be able to address the demands of this consumer-centric approach to healthcare and provide the improved cost transparency, greater convenience, better communication tools and wider set of service information that patients expect.

A Convergence of Demand

The solution, from both perspectives, can be provided by population health management platforms and patient engagement platforms, the latter often being implemented as a component of the former. Patient engagement platforms provide both the consumer and provider with a vehicle for improving communication channels. They provide consumers with the service access they need to obtain the information they’re demanding, while also giving providers a strategic tool to retain existing customers and roll out initiatives that result in much more comprehensive care management for the population they’re serving.

However, many implementations of PHM and patient engagement platforms are not yet addressing these requirements, instead they are just being used to hit CMS targets around reimbursement.

Following a purely MACRA-adherence strategy, a clinician can hit their CMS targets for patient access by ensuring that one of their patients interacts with their patient portal once during a three-month period over 2017. Similarly, they can hit their patient education target by providing one patient over the three-month period with targeted education material via their EHR. This will allow them to take a step towards receiving full reimbursement. But it doesn’t touch any of the issues outlined above and only scratches the surface of what they could otherwise have achieved from the investment made in the enabling platform. A big missed opportunity!

Reducing Healthcare Costs

As well as having to address the increasingly consumer-centric nature of the healthcare industry, providers are also grappling with the move to value-based care and an overall agenda of better managing costs.

The strategy often employed to address this is via Accountable Care Organizations (ACOs). These can be physician led, hospital led, and even payer-led, but at their core is the objective of better managing the health of a whole population in order to control costs.

Population Health Management platforms are again the central tool employed to achieve this. These platforms provide:

  • The risk stratification and analytics solutions that can be used to pinpoint which patients within the population are most likely to develop a long-term condition and, for those already managing long-term conditions, which are most likely to be admitted or re-admitted to hospital.
  • The care management/care coordination tools that pull data together from disparate sources and provide multi-disciplinary care teams with a central tool to coordinate care across these populations.
  • The patient engagement tools that allow providers and consumers to easily interact, share health data, provide reminders for screenings, allow for online appointment booking, provide remote health monitoring and allow for greater management of care costs.

The Meaningful Use, and future MACRA Advancing Care Information (ACI), targets around patient education, portal access, care transitions and patient-generated health data have started to encourage the use of population health management platforms as described above. But again, a strategy based purely on hitting Meaningful Use or MACRA targets will not address the drive to manage the cost of providing healthcare for a given population. ACOs and providers need instead to embrace a much broader strategy when deploying their PHM solution, one that goes well beyond these legislative targets.

Times Are Changing

Signify Research’s view is that ACOs, providers and payers do understand this, and they have now started to move on from their initial deployments. For many organisations the investment in technology has been made; it’s the re-engineering of processes that is now required in order to fulfil the technology’s potential. Over the coming years we’ll see greater use of the full range of features on offer that address the demand driven by the consumerisation of health and the need to better manage healthcare costs.

This is good news for all: for the consumers demanding a healthcare experience that provides the choice, information, control and transparency they’re used to seeing elsewhere; for the providers and ACOs implementing strategies to better manage unforeseen financial risk; and of course for the vendors of these platforms.

 

New Market Report from Signify Research Publishing Soon

A full analysis of these issues will be provided in Signify Research’s upcoming market reports ‘Population Health Management – North America Market Report 2017’, publishing in March 2017, and ‘Population Health Management – EMEA, Asia & Latin America Market Report 2017’, publishing in June 2017. For further details please click here or contact Alex.Green@signifyresearch.net.

Hard Brexit Signals Rocky Road for Radiology


After the strongest hint yet from Prime Minister Theresa May that the U.K. will look to extract itself from the European Union (EU) single market (a so-called hard Brexit), it seems right to review early speculation on the outlook for medical imaging and healthcare IT.

Let’s focus on three factors: the U.K. National Health Service (NHS), regulatory aspects, and the market impact.

Radiology in U.K. NHS already suffering

Hard Brexit has few positives for the U.K. NHS, a system that’s already in some difficulty. Not only are current resources in radiology already at a breaking point, but also stricter immigration controls and a worsening economic picture could see the NHS imaging service facing a clinical brain drain.

This is the opening extract of Steve’s regular monthly market column for AuntMinnie Europe.  

To read the full article, please click here.

(Access to the article may require free membership to AuntMinnie Europe – it’s full of great content and insight so well worth signing up!) 

Significant Barriers Still Exist for Risk-Sharing Contracting

The Signify View: Significant Barriers Still Exist for Risk-Sharing Contracting

Written by Steve Holloway

Despite the sudden surge of interest in risk-sharing contracts, a by-product of a recent wholesale legislative shift towards value-based care, multiple barriers remain that will limit widespread adoption, both for clinical content management and wider healthcare IT.

Here’s our take:

Providers’ ability to measure ROI is poor

Risk-sharing contracts take many forms, though the most common models rely on the ability of vendors and healthcare providers to define and measure agreed “key performance indicator” (KPI) targets. If a vendor’s technology or software solution fails to meet these agreed KPIs, financial penalties are incurred; exceed them, and bonuses are available. While this may seem attractive to health providers as a tangible way to hold vendors to account, there is a fundamental flaw: most health providers do not have robust systems in place for measuring KPIs, let alone return on investment (ROI). Without accurate and robust data, it may prove challenging for providers to hold their vendors to account when KPIs are missed.

It’s easy to forget that healthcare providers are relatively new to healthcare IT adoption. Ever changing legislation, tightening budgets and connecting a patchwork of disparate legacy IT systems have been primary concerns of late, leaving little focus and investment in quality performance metrics and analytics. Even radiology, an early adopter of IT and digitalised for almost two decades, is only beginning to implement analytics and dashboarding capability for KPI and ROI measurement.

If this remains the case, there are two potential outcomes in the short to mid-term: either providers will avoid risk-sharing contracts until they can improve their own KPI and ROI measurement, or professional services and consulting on KPI and ROI will be incorporated into risk-sharing contracts to improve reporting over the course of a contract term. The second option carries a further complication of vendor self-interest, in that the same vendor could be advising the provider on how best to meet KPIs while inherently also protecting itself from contractual penalties. One way around this is to bring in a third-party integrator, consultant or specialist vendor to offer the advisory and optimisation services, though to date there have been few examples of these services rendered at scale.

Healthcare remains heavily capital expenditure focused

Despite the increasing focus on risk sharing and managed services contracts, most of healthcare remains based on annualised capital expenditure models. In public healthcare, the root lies in the process of governmental budgeting and rigid procurement frameworks, limiting providers’ ability to be more innovative in contracting. Imaging IT software and services may also be part of larger procurement deals for imaging modality hardware.

Admittedly, private healthcare providers have more flexibility for contract innovation and are likely to be more open to sharing risk with technology vendors, especially if a case can be made for making future expenditure more predictable. That said, few have been willing so far to stray from the traditional business model. To put this in context, we should look at managed services adoption; the benefits of a managed services contract are generally clearer and simpler to implement than those of risk-sharing contracts, yet to date uptake of managed services has been very low. This suggests the more complex risk-sharing model has a long way to go before it becomes a common approach.

The interoperability chasm has only shrunk slightly

Healthcare providers are under increasing pressure to connect disparate health IT systems to drive the interoperability of patient and health data. Using imaging as an example, providers increasingly want to share standardised and unstructured image data within and between networks, spurring the advent of enterprise imaging platforms and clinical content management solutions. While some progress is being made, interoperability of images between common software such as EMR and imaging IT still has significant challenges and barriers.

Put in the context of risk-sharing contracts, this is a major concern for providers. Firstly, without full and centralised access to the correct data in a structured format, gaining confident and accurate insights for KPI measurement is very difficult. Furthermore, there is today a far greater focus on care quality and coordination between disciplines and providers. Without improvements in interoperability and in the interfacing of current provider systems for complex patient care, especially comorbid pathways, the value of risk-sharing contracts for providers will be significantly reduced.

Some way to go

It’s true that in some instances risk-sharing could be successfully implemented today, but these are relatively few. For imaging IT and clinical content management, scale is a major factor. Small private radiologist reading groups or imaging centres may look to adopt this model to make operations more predictable. These use-cases are relatively simple in purpose and do not have the complex issues associated with multi-department, multi-site hospital providers, not to mention the scale of risk.

When the above is considered alongside a healthcare environment of changing legislation and tightening budgets, the argument for risk-sharing contracts will not be enough to sway most healthcare providers in the short to mid-term. The rewards are simply not yet well enough understood to counter the severe cost of failure.

Machine Learning in Radiology – Vendors Must Prove The ROI

Written by Simon Harris

Machine learning was undoubtedly one of the hottest topics in radiology last year, with a steady stream of academic research papers highlighting how machine learning, particularly deep learning, can outperform traditional algorithms or manual processes in certain use-cases. Investment in machine learning start-ups also continued, with several companies attracting early-stage funding. To date, more than $100m has been invested in start-ups that are developing AI solutions for radiology. Furthermore, commercial activity gained pace, with at least 20 companies exhibiting AI-based products at the RSNA conference towards the end of the year, although most were prototypes and only a handful had regulatory clearance. 2017 should see commercial activity ramp up, FDA approvals permitting.

Whilst the enthusiasm for machine learning is certainly justified, it inevitably raises expectations, potentially to unrealistic levels. To counter this, machine learning companies must clearly articulate the value proposition of their solutions and demonstrate a clear return on investment (ROI) to healthcare providers. With healthcare budgets under pressure globally, vendors should demonstrate both improved clinical outcomes and a tangible ROI to stand the best chance of success. As a bare minimum, vendors must prove that any quality improvements from using their cognitive tools do not negatively impact clinician productivity.

Using cognitive tools for repetitive and time-consuming tasks enables radiologists to focus on value-added tasks or to perform extra reads. For example, machine learning can be used to extract relevant information from an EHR or lab report to automatically populate the radiologist’s report. The same cognitive tool can present the radiologist with treatment outcomes from similar cases, to aid diagnosis and treatment planning. In this example, the combination of increased radiologist productivity and improved clinical outcomes makes a compelling argument for healthcare providers to invest in machine learning.

A commercially available example of how automated tools can lead to productivity gains is 4D Flow from Arterys. 4D Flow uses cloud-based image processing technology to provide visualization and quantification of blood flow on cardiac MRI studies. With 4D Flow, cardiac MRI exam times can be significantly reduced from typically 60 to 90 minutes to around 10 minutes, which increases the efficiency and throughput of the hospital’s MRI service. Additionally, automated segmentation eliminates the need for radiologists to calculate measurements between areas of the heart. Cardiologists will need to switch from using echocardiograms, but the benefit of more accurate flow data that can be tracked over time should be a motivator. Arterys also recently received FDA 510(k) clearance and CE Mark approval for its Cardio DL™ product that provides automated, editable ventricle segmentations based on conventional cardiac MRI images, eliminating the need for manual segmentation. Cardio DL™ is also cloud-based and leverages deep learning technology.

Another example is iCAD’s PowerLook® Tomo Detection, a Computer Aided Detection (CADe) system for breast tomosynthesis that’s built on deep learning technology. Each image in a tomosynthesis data set is analysed to detect potential areas of interest and the system blends those areas onto a synthetic 2D image so that they are visible on a single image of the breast. Based on initial trials, the company claims that the additional reading time associated with breast tomosynthesis over 2D mammography is reduced by an average of 29.2%, with no change in radiologist performance.

In the above examples, the ROI from using cognitive analytical tools is largely derived from radiologist (and technician) productivity gains. However, bigger opportunities are to be found higher-up the healthcare value chain. For example, cognitive tools can review existing scans to identify incidental findings that have not been followed-up and that could represent missed billing opportunities for the healthcare provider. Moreover, predictive analytics can identify at-risk patients to enable early intervention and to avoid costly readmissions, both leading to a reduction in treatment costs.

Whether machine learning is positioned as a productivity play, a cost-saving play or a revenue-generating play, one thing is clear – vendors must prove the ROI. Unsubstantiated claims of how machine learning can outperform or replace radiologists may attract the news headlines but do little to support the efficacy of the technology. Vendors must adopt a more responsible, fact-based approach to marketing their solutions if machine learning in radiology is to avoid the “trough of disillusionment”.

 

New Market Report from Signify Research Publishing Soon
This and other issues will be explored in full in Signify Research’s upcoming market report ‘Machine Learning in Radiology – World Market Report’, publishing in January 2017. For further details please click here or contact simon.harris@signifyresearch.net

Back to the future: Top themes from RSNA 2016

Greeted with fiercely mild weather for late November/early December, the 102nd RSNA show already had the feel of being a bit different. Of course, there was the packed program of fascinating research and exhibition halls bursting with advanced imaging hardware and cutting-edge IT solutions. But there was also the growing feeling, perhaps spurred by the theme “Beyond Imaging,” that radiology is in a mode of transition.

This was perfectly exemplified near the entrance to exhibition hall B, where a radiology printed textbook stall was positioned next to an artificial intelligence software vendor. Amongst the glitz of RSNA, it’s easy to forget that much on show is a long way from having an immediate impact on the daily working practice of most radiologists, but it is a great chance to gauge where radiology is heading.

This is the opening extract of Steve’s regular monthly market column for AuntMinnie Europe.  

To read the full article, please click here. 

(Access to the article may require free membership to AuntMinnie Europe – it’s full of great content and insight so well worth signing up!) 

RSNA Imaging IT: Innovation Under The Surface

Written by Steve Holloway

 

Holy Smoke IT’s Everywhere
Probably the standout of the show was just how big imaging informatics and clinical data management have become at RSNA. Both technical exhibit halls were packed with imaging IT solutions, coming at the market from all angles.

While the big issue impacting almost every conversation was artificial intelligence (see our other RSNA round-up on this), it is becoming increasingly evident how many vendors currently have or want a piece of the imaging informatics pie. To add some structure to the plethora of solutions on show, we’ve categorised our takeaways into the three core functions of imaging informatics: view, manage and store.

View
Viewing technology has undergone a major change in recent years, most notably in merging with advanced visualisation and expanding its reach out to a larger volume of clinical disciplines. Moreover, as enterprise imaging and EMR systems have rolled out across health systems, a new array of users has emerged with unique needs, demanding more and more from viewing technology.

Viewing technology has therefore had to evolve, and this was evident on the RSNA technical exhibit show floor:

  • Thin-client, mobile-enabled, server-side rendering: is rapidly becoming the standard for viewing technology, allowing almost-instant web and mobile access for large imaging studies in any location within a given authorised network.
  • User and application interface morphing: another major play from multiple vendors was the adaptability of viewer interfaces, toolsets and even user-preferences, based on the type of image or information displayed and physician using the viewer. While this capability has been available for a while in some platforms, it is increasingly being pushed as a vital component, especially for high-volume reporting.
  • Pre-load of context information: improvements in the management and storage of historical prior studies and data in centralised, accessible repositories have also led to automated, intelligent pre-fetching of related data that may be relevant to the reading or reviewing physician. This enables the diagnostician to quickly access previous patient data that may add context to their diagnosis within the same viewer. While this capability has huge potential for diagnostic reading and clinical review, it remains dependent on the underlying management and workflow capability, an area that needs considerable work to get to true interoperability.

Manage
Management of imaging and associated clinical data has remained in flux, with vendors from viewing, archiving and storage, EMR and enterprise content management all vying to disrupt the traditional PACS market. There was also a notable presence of a range of middleware specialists in DICOM-routing, migration, worklist and decision support tools. This suggests “Deconstructed PACS” is not a marketing fad and is a viable alternative for providers that have the resources and capability to manage a migration to a multi-vendor, best-of-breed solution.

However, it was also abundantly clear that enterprise imaging remains the core focus for most vendors and providers, unsurprising perhaps considering many incumbent PACS vendors are taking this route. The appeal of expanding departmental PACS capability into a more enterprise platform approach is clear: it offers short- and mid-term benefits for most providers, allowing ingestion of non-imaging content, a single vendor relationship and avoidance of a complicated and lengthy PACS migration. That said, many of these solutions do little to enable true interoperability of imaging data, something that over time will be of growing importance with the advent of artificial intelligence and population health management.

One positive from the show was the high profile of the RSNA Image Sharing Validation programme, an accreditation scheme for vendors that pass a set of interoperability criteria for the sharing of images and imaging reports. While this is a great start in highlighting the issue that has plagued PACS for decades, there is clearly more that needs to be done on the issue of interoperability both for imaging content and all health data.

Store
Unsurprisingly, almost every vendor involved with image informatics now offers some form of Vendor Neutral Archive (VNA), though few solutions are capable of true agnosticism to data type, clinical application or interfacing vendor.

Consequently, it’s clear that a two-tier market is emerging:

  • VNA: solutions that are essentially storage and archiving solutions to replace the “A” of PACS, with limited capability for non-DICOM content and which are usually deployed as a means of connecting disparate PACS or supporting enterprise imaging platforms.
  • Independent Clinical Archives (ICA): Fully content- and vendor-agnostic storage and content management platforms based on rigorous, standards-based repositories that exhibit true interoperability of content and support not only imaging content, but all structured (EMR) and unstructured data across health systems.

This distinction, while not immediately clear from strolling the exhibition hall, will, over time, become increasingly important as the demands of ingesting new clinical specialities stretch the capability of these solutions. Further muddying the issue was the raft of vendors in both categories also offering an array of workflow, management, clinical and viewing modules with their storage solutions, thereby appealing to providers across deconstructed PACS and enterprise imaging models. While disruption of the incumbent PACS market has been vital in driving the market towards improvements in interoperability and removing proprietary data blocking, the scramble of marketing jargon and the merging of capabilities and features have made the proposition increasingly challenging and confusing for health providers.

Evolving Business Models Point to Cloud
Perhaps most notable was the increasing presence of new innovative business models across the imaging informatics sector. While today still a fundamentally capital-heavy IT sector, healthcare is tentatively moving to embrace managed services, a move that will undoubtedly also drive the market towards cloud IT technology.

Many vendors now offer some variation of their traditional capital expenditure model, though uptake to date has been minimal. Yet with major US and international initiatives putting greater emphasis on value-based care and a growing appetite for risk-sharing between providers and vendors, momentum for alternate business models is increasing substantially.

Therefore, expect adoption of hybrid and hosted imaging informatics to become increasingly common, despite the obvious concerns that exist around security and data ownership. Why so bullish?

Because the future direction of radiology, aptly captured in the show theme “Beyond Imaging”, will demand a more flexible, interoperable and affordable approach to image and health data management. The constant query and amalgamation of data for artificial intelligence will require another level of processing capability, and the flexibility of image and data exchange needed to maximise radiologist reading resources will also require it. Finally, the spiralling cost of managing and storing exponentially growing data volumes is unsupportable without it. Get ready for a whole lot more discussion on the role of cloud IT in imaging informatics.

New Service from Signify Research: Clinical Content Management IT – 2017
This and other issues will be explored in full in Signify Research’s upcoming intelligence service ‘Clinical Content Management IT – World’, with the first deliverable due in February 2017. For further details please click here or contact simon.harris@signifyresearch.net

RSNA: Deep Learning Takes Centre Stage, but Beware the Hype

Written by Simon Harris

Artificial Intelligence was undoubtedly one of the key themes at this year’s RSNA, featuring prominently on the exhibition floor and in several scientific sessions. At least 20 companies displayed products featuring AI technologies and a handful more used AI as a key part of their marketing messages, even if the use-case wasn’t entirely clear.

The use of artificial intelligence in medical imaging is not a new trend. The first generation of computer-assisted detection (CADe) products entered the market in the late 90s and used machine learning techniques such as shallow neural networks and support vector machines. What’s new is the increasing use of deep learning techniques, and in particular, convolutional neural networks.

With traditional machine learning, the algorithms are hand-crafted, meaning that the programmer essentially hard-codes the system to look for specific features. This is a time-intensive process that requires extensive clinical domain knowledge. Moreover, the performance of the algorithm is limited by the underlying rules and statistical modelling; hence the high number of false-positives generated by early CADe systems.

Deep learning techniques feature much larger neural networks (typically 10 layers or more) and the algorithms are trained using large sets of images. This requires considerably more computational power than traditional machine learning, a need met in recent years by the introduction of affordable GPU-accelerated computing, which allows the algorithms to run much faster than on CPUs alone. By feeding the algorithms with radiologist-annotated images and a “ground truth”, the system automatically learns the image features, rather than being programmed what to look for. As such, deep learning methods typically produce faster and more accurate results than traditional hand-coded classification techniques.
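To make the contrast concrete, the minimal sketch below (an illustration only, not vendor code) applies a hand-crafted Sobel kernel via 2D convolution, the kind of feature engineering traditional machine learning relies on. In a convolutional neural network, the kernel values themselves are learned parameters, fitted from annotated images rather than specified by the programmer.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image and
    sum the element-wise products at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Hand-crafted feature detector (traditional approach): a Sobel kernel
# that responds to vertical edges. In a CNN, these nine numbers would
# instead be learned from labelled training images.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Toy "image": dark left half, bright right half -> one vertical edge.
image = np.zeros((5, 5))
image[:, 3:] = 1.0

edges = conv2d(image, sobel_x)
# The response is zero in flat regions and peaks where the edge sits.
```

A deep network stacks many such convolutions (with non-linearities between them) and adjusts every kernel by gradient descent against the ground-truth labels, which is why it needs both large annotated datasets and GPU-class compute.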

So how is deep learning being applied in radiology? From walking the exhibition floor at RSNA there were two key themes, as discussed below.

Next Generation CADe

Deep learning has the potential to significantly enhance the performance of existing CADe systems, by offering improved sensitivity without burdening radiologists with a high rate of false-positives. Increasingly, CADe systems will supplement detection with automatic quantification of imaging biomarkers. Additionally, the results from computer-assisted detection (and quantification) can be presented alongside patient information extracted from an EHR, such as patient history, laboratory results and prior studies, to provide the clinician with an imaging decision support tool (see next section).

Moreover, deep learning offers improved support in the detection of co-morbidities and incidental findings. For example, research published earlier this year by researchers at Icahn School of Medicine at Mount Sinai in New York City found that existing mammograms can be used to detect calcified plaques in breast tissue, which can lead to heart attack or stroke. In commercial healthcare systems, such as the US, this may help to ensure that opportunities to bill for additional procedures are not missed. The combination of improved accuracy and enhanced functionality will make next generation CADe systems a far more compelling proposition than earlier systems.

iCAD made a big play on deep learning at RSNA, with a large part of its booth dedicated to PowerLook® Tomo Detection, a CADe system for breast tomosynthesis that’s built on deep learning technology. Each image in a tomosynthesis data set is analysed to detect potential areas of interest and the system blends those areas onto a synthetic 2D image so that they are visible on a single image of the breast. Based on initial trials, the company claims that the additional reading time associated with breast tomosynthesis over 2D mammography is significantly reduced by using its CADe software, by an average of 29.2%. iCAD received CE Mark certification for PowerLook Tomo Detection in April 2016 and is in active dialogue with the US FDA regarding pre-market approval.

Riverain Technologies, which is best known for its image analysis tools for nodule detection in chest x-rays, used RSNA 2016 for the commercial launch of its ClearRead CT Suite, comprising ClearRead CT Vessel Suppress and ClearRead CT Detect, which aids in the detection of nodules in chest CT scans. The vessel suppression tool features deep learning technology. Riverain received FDA 510(k) clearance for ClearRead CT in September.

From CADe to CADx and Imaging Decision Support

The theme of this year’s RSNA was Beyond Imaging, to reflect the broadening role that radiologists are playing in the larger medical community. The theme also reflects how radiologists will increasingly be able to leverage non-imaging data extracted from EHRs and other sources to assist in making diagnostic decisions. In addition to patient data, imaging decision support tools can provide radiologists with other supporting information, such as the treatment outcomes of patients who presented with similar conditions.  Beyond Imaging also captures how radiology is evolving from a largely qualitative to an increasingly quantitative discipline, with the increasing use of automated quantification tools to provide accurate and repeatable metrics of lesions and tumours, for example.

The first generation of imaging decision support and computer-assisted diagnosis (CADx) products are starting to enter the market and a handful were on show at RSNA. RADLogics presented its Virtual Resident™ decision support solution, based on its AlphaPoint™ cloud-based image analysis platform. The platform incorporates machine learning algorithmic tools for automatic analysis of X-ray and CT images. The results are combined with the patient’s medical record information into a preliminary report, in much the same way that a resident prepares information for a radiologist to review.

HealthMyne used RSNA to preview its QIDS software platform which provides radiologists with a quantitative imaging dashboard, including time-sequenced Epic EHR information. Laboratory results, treatment details, and the health status for each patient are viewable in a timeline-based longitudinal representation. As an example, a longitudinal representation could feature a plot of tumour size relative to the duration of a course of radiotherapy, with icons to denote the dates of follow-up CT scans from which tumour size was determined. Scans can then be examined by clicking on the icon and opening a viewer. QIDS retrieves prior studies, performs image registration and localization of previously identified lesions. The analytics software, which is not built on deep learning, also provides information such as tumour size, Lung-RADS categories for use in lung cancer screening, and other quantitative metrics. The product will be fully launched in January 2017.

Quantitative Insights (QI) Inc. showed its QuantX breast imaging workstation. Alongside a multi-modality viewer, QuantX provides automatic detection and quantification on MRI images for the characterization of breast lesions, to assist in breast cancer diagnosis. QuantX features a breast imaging decision support system with direct correlation to a database of lesions with known pathology, based on biopsy results. The system generates a QI Score™ to represent the probability of malignancy. QI has submitted a de novo 510(k) application to the FDA and believes that a decision is imminent. Should the company be successful, QuantX will be the first CADx product cleared by the FDA.

IBM gave demos of several Watson-powered initiatives, under both the Merge Healthcare and Watson Health Imaging brands. Examples included a solution for aggregating and filtering electronic health records and technology for automated analysis of cardiac ultrasounds and improved diagnosis of aortic stenosis. The most impressive demo was for a decision support tool code-named Avicenna, which automatically detects and quantifies anatomical features and abnormalities (the demos used a CT scan), and extracts relevant information from a patient’s electronic health record. Avicenna has a cognitive ‘reasoning’ capability that considers the imaging and non-imaging information to suggest possible diagnoses. Big Blue was tight-lipped about the release date for Avicenna, but it will likely need another year at least, and most likely two years, to complete clinical trials and obtain regulatory approval. IBM’s first cognitive solution for radiology to hit the market will be a Cognitive Peer Review Tool, intended to help healthcare professionals reconcile differences between a patient’s clinical evidence and the data in that patient’s EHR, which is due to be released in 1Q 2017.

Separate the Hype from Reality

In addition to the above examples, several start-ups, including Enlitic, Zebra Medical, Lunit and Vuno, used RSNA to showcase how they are applying deep learning to medical imaging. For example, Enlitic gave a demo of a chest x-ray triage product and a solution for lung cancer screening, both powered by deep learning. Enlitic is in the process of gathering clinical validation for its products and does not yet have regulatory clearance to sell.

However, some of the other start-ups were less forthcoming regarding their product development plans, with one company’s booth no more than a display carrying the company’s logo. Many radiologists remain sceptical of the capabilities of artificial intelligence and some see it as a threat. Moreover, many remember the limitations of early generation mammography CADe systems. Vendors need to complete and promote clinical studies to validate their claims, otherwise marketing soundbites may impede the acceptance of deep learning in radiology. More customer education is required so that the conversations at next year’s RSNA move on from “what’s deep learning?” to “tell me how deep learning can help me do my job better”.

 

New Market Report from Signify Research Publishing Soon
This and other issues will be explored in full in Signify Research’s upcoming market report ‘Machine Learning in Radiology – World Market Report’, publishing in January 2017. For further details please click here or contact Simon.Harris@signifyresearch.net

RSNA Ultrasound: Premium Market Heats Up

Written by Steve Holloway

At first glance at the RSNA 2016 exhibition floor, one may have observed that little has changed in ultrasound, apart from the re-emergence of live ultrasound scanning demonstrations after a 27-year hiatus. However, if you managed to get away from talking about artificial intelligence and the future role of radiology, innovation in ultrasound was clearly on show, albeit more subtly.

A new premium system baseline
The “flagship” premium category continues to evolve and break new ground for ultrasound clinical capability. While there were no market-defining features on show, it’s clear a new baseline has been established for system capability in the premium radiology category.

This was most evident from the array of premium systems now boasting a combination of shearwave and conventional elastography, real-time fusion and contrast-enhanced ultrasound (CEUS). It was particularly pertinent given the scrum of vendors showcasing liver CEUS, only approved by the US FDA earlier this year. Vendors clearly saw a big opportunity to demonstrate the combined diagnostic power of real-time fusion imaging, quantitative shearwave elastography and CEUS for liver lesion assessment.

Throw in an array of nuanced improvements in workflow, widescreen displays and various other features, and it is becoming clear that the minimum criteria to qualify for “flagship” radiology ultrasound has been raised again.

Breast & MSK Everywhere
As has been evident at the past few shows, breast ultrasound continued to expand its reach across the booths of ultrasound exhibitors, albeit in a variety of forms. Focus continued to revolve around the lengthy time for conventional breast ultrasound scanning, with both automated breast ultrasound (ABUS) and whole-breast ultrasound (WBUS) seen as potential time-saving solutions – most vendors in ABUS and WBUS claim scan times in the range of 1-5 minutes for similar results, versus approximately 30 minutes for a conventional scan. While this is important for workflow as ABUS/WBUS becomes more commonly used, an adaptive screening pathway based on breast-density pre-screening still appears to be some way off, suggesting large-scale use will be limited for the near future.

Also evident was the growing use of ultrasound for musculoskeletal (MSK) applications. Seen initially by many in the industry as the fourth true point-of-care (POC) ultrasound application after anaesthesiology, emergency medicine and critical care, MSK use is increasingly straddling the POC/traditional divide, highlighted by a variety of cart-based, pedestal and compact ultrasound systems. This suggests two distinct MSK markets are emerging: lower-capability, compact ultrasound for true POC imaging on the sports field or in the primary clinic, and more advanced diagnostic use by second- or third-generation users in specialist centres or hospitals, requiring higher-capability systems. Quick to profit from this maturing use-case, traditional cart-based system providers placed a far more pronounced focus on MSK than usual at the show.

Transducer innovation, app-based ultrasound & OB/GYN market heats up
Other notable trends from the show included:

  • Growing focus on innovation in transducer technology, especially around high-frequency capability and wide-bandwidth solutions, designed to offer greater flexibility of use or a reduction in the number of different transducer types required.
  • System-in-a-transducer ultrasound was also on show again, from both Philips Healthcare’s Lumify and Clarius Mobile Health. The use of tablets as Wi-Fi remote controls for premium cart-based ultrasound systems was also showcased, as a workflow benefit.
  • The OB/GYN market, traditionally dominated by GE Healthcare, Samsung Medison, Hitachi-Aloka and Mindray, has also been targeted by Toshiba Medical (traditionally a general imaging and cardiology ultrasound specialist), with the launch of a dedicated system for women’s health, Aplio i800WHC.

Advanced Visualization: Why so many viewers?

November 21, 2016 – As many of us gear up for the annual pilgrimage to the RSNA meeting in Chicago, it’s safe to expect artificial intelligence and big data will be front and center of the exhibition. Perhaps less expected will be the abundance of visualization software on display from a growing selection of exhibitors.

Once the staple diet of diagnostic and advanced visualization (AV) specialists, viewing software is now of great importance to clinicians and vendors alike. Today, many radiologists use integrated case-load worklists and diagnostic tools in their primary viewer, and the lucky ones have a suite of AV tools too. For most, the biggest change in recent years has been the integration of AV tools into a single viewing platform, saving some walking between workstations. No big surprises here.

So why the sudden expansion of viewing software?

This is the opening extract of Steve’s regular monthly market column for AuntMinnie Europe.  

To read the full article, please click here. 

(Access to the article may require free membership to AuntMinnie Europe – it’s full of great content and insight so well worth it!!) 

Defining the Opportunity: Machine Learning in Radiology

Defining the Opportunity for Machine Learning in Radiology

Read time: 4 minutes 30 seconds

The application of machine learning in radiology is evolving at a rapid pace and whilst fully automated diagnostic systems are still several years away, there is a growing number of machine learning tools available now that are helping radiologists to improve their efficiency and to make better diagnoses. This Analyst Insight from Signify Research describes the current applications of machine learning in radiology and points to how the technology will increasingly be applied in four distinct areas, as shown in the diagram below.

Computer-Aided Detection

Computer-aided detection (CADe) systems are intended to identify a variety of cancers such as breast cancer, prostate cancer, and lung lesions. They are most commonly used to detect microcalcifications and masses on screening mammograms. Despite concerns regarding the benefits of CADe and the high rate of false positives and false negatives, the market has grown steadily over the last two decades, most notably in the US where more than 90% of mammograms are interpreted using CADe. This has largely been driven by the availability of reimbursement for the use of CADe in breast screening. It is far less commonly used for detecting other cancers, where reimbursement for using CADe is currently not available.

Most CADe systems are rules-based and programmed to identify specific features. However, machine learning techniques, particularly deep learning using convolutional neural networks, are increasingly being applied. Deep learning has the potential to improve the accuracy of CADe, particularly for soft-tissue analysis. Moreover, as deep learning systems can be trained to identify features using large datasets, the algorithm development times are massively reduced compared with the traditional approach of ‘manually-crafting’ algorithms.
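The contrast the paragraph above draws can be made concrete with a toy sketch. The operation a convolutional neural network stacks and learns is the 2D convolution: a rules-based CADe system fixes such kernels by hand, whereas a deep learning system learns many layers of them from labelled training images. The kernel and miniature "scan" below are our own illustrative inventions, not from any vendor's product:

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation, as CNN frameworks implement it)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# Toy 7x7 "scan" with a single bright spot, a stand-in for a microcalcification.
image = np.zeros((7, 7))
image[3, 3] = 1.0

# A hand-crafted centre-surround kernel: a rules-based system fixes kernels
# like this in advance; a deep learning system learns them from data instead.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)

response = convolve2d(image, kernel)
peak = np.unravel_index(np.argmax(response), response.shape)
print(peak)  # strongest response where the kernel centre aligns with the spot
```

The same mechanics scale up: a trained CADe network applies thousands of learned kernels, with the final layers turning the resulting response maps into candidate detections.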

The benefits for radiologists are likely to be enhanced product performance (i.e. fewer false alerts), a faster pace of product innovation and a wider choice of products, particularly for the detection of lung, colon and prostate cancers, where there are currently only a handful of products on the market. For breast cancer detection, there will likely be a wider selection of CADe solutions for use with different modalities, e.g. breast MRI and breast ultrasound, in addition to digital mammography. In breast tomosynthesis, deep learning can be applied to reduce the additional reading time associated with 3D images compared with 2D mammograms. For example, in March this year iCAD released PowerLook Tomo Detection, which is built on deep learning technology. The system extracts areas of interest from the 3D planes and blends them onto a single 2D image. iCAD claims that its new system can reduce the interpretation time for breast tomosynthesis by an average of 29.2%.
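iCAD has not published the details of its blending method, but the general idea of collapsing features from a stack of 3D planes onto a single 2D image can be illustrated with a simple maximum-intensity projection. This is a toy sketch of the concept, not iCAD's algorithm:

```python
import numpy as np

# Toy tomosynthesis stack: 4 slices of a 5x5 field of view, with one bright
# feature at a different depth and position in each of two slices.
stack = np.zeros((4, 5, 5))
stack[0, 1, 1] = 0.9   # feature visible only in slice 0
stack[2, 3, 2] = 0.7   # feature visible only in slice 2

# Maximum-intensity projection: for every (row, col), keep the brightest value
# found anywhere along the slice axis, collapsing 3D -> 2D so that features
# from different depths appear together on one synthetic image.
synthetic_2d = stack.max(axis=0)

print(synthetic_2d[1, 1], synthetic_2d[3, 2])  # both features survive the projection
```

A reader can then review one synthetic image instead of paging through every plane, which is the source of the claimed reading-time saving.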

Lastly, the increasing use of machine learning will enable CADe to gradually evolve from purely detection systems, to more advanced decision support and computer-aided diagnosis, as described in the following sections.

Quantitative Analysis Tools

Quantitative analysis tools are essentially workflow tools that provide radiologists with automatic detection and measurements of imaging features (biomarkers) to assist with diagnosis, such as lung density, breast density, analysis of coronary and peripheral vessels, etc. Much like with CADe, machine learning is increasingly being applied and the benefits to radiologists are much the same – improved accuracy, enhanced functionality and an increasing choice of products due to the faster algorithm development times. Most vendors currently offer a small number of quantification tools for very specific tasks, but the faster time to market associated with machine learning algorithms will enable vendors to offer an expanded ‘toolkit’, with solutions for multiple applications across multiple modalities.

Zebra Medical Vision, an Israeli start-up applying machine learning to medical imaging, has announced several algorithms for its Imaging Analytics platform. Its algorithm for lung applications analyses chest CT scans to detect emphysematous regions in the lungs and quantifies the volume of emphysema relative to the overall lung volume. Another example is 4D Flow from Arterys, which uses cloud-based image processing technology to provide visualization and quantification of blood flow in cardiac MRI studies. 4D Flow utilizes machine learning analytics for automatic identification and segmentation, and Arterys is validating the use of deep learning algorithms.
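Zebra's exact method is proprietary, but the kind of output described above can be sketched with a widely published emphysema index: the fraction of lung voxels attenuating below -950 Hounsfield units (LAA-950). The synthetic volume and crude lung mask below are illustrative only:

```python
import numpy as np

# Synthetic "lung CT" volume in Hounsfield units (HU), for illustration only.
rng = np.random.default_rng(0)
volume_hu = rng.normal(loc=-820, scale=80, size=(20, 20, 20))

lung_mask = volume_hu < -500        # crude stand-in for a real lung segmentation
emphysema_mask = volume_hu < -950   # low-attenuation voxels (LAA-950 criterion)

# Report emphysema burden as a fraction of total segmented lung volume.
emphysema_fraction = emphysema_mask.sum() / lung_mask.sum()
print(f"emphysema volume: {emphysema_fraction:.1%} of lung volume")
```

In a real product the segmentation step is where machine learning does the heavy lifting; once the masks exist, the quantification itself is simple arithmetic.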

Decision Support Tools

Decision support tools provide detection and quantification, alongside supporting information extracted from an EHR, pathology reports and other patient records, to assist with diagnosis. Other features may include:

  • Registration algorithms, to pinpoint areas of interest in one study and have those areas linked to the same points in previous scans
  • Automatic population of radiology reports with quantitative data
  • Predictive analytics to identify high-risk patients and enable early treatment
  • Treatment planning to determine the best course of treatment for an individual, by reviewing the outcomes of previous treatment pathways for patients with similar conditions.
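The second feature in the list above, automatic population of radiology reports with quantitative data, is at its core string templating over structured measurement output. A minimal sketch, using hypothetical field names and a report template of our own invention:

```python
# Hypothetical structured output from a quantification step; the field names
# and report wording are illustrative, not from any vendor's product.
measurements = {
    "nodule_location": "right upper lobe",
    "nodule_diameter_mm": 8.4,
    "volume_mm3": 310.0,
    "prior_diameter_mm": 6.1,
}

# Template pre-fills the findings section so the radiologist reviews and
# edits rather than transcribing numbers by hand.
template = (
    "FINDINGS: {nodule_location} nodule measuring {nodule_diameter_mm:.1f} mm "
    "(volume {volume_mm3:.0f} mm3), previously {prior_diameter_mm:.1f} mm."
)

report_line = template.format(**measurements)
print(report_line)
```

The efficiency gain comes less from the templating itself than from the upstream algorithms producing the measurements automatically and consistently.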

Decision support tools do not provide automated diagnosis (see next section) and instead are intended to help radiologists improve their efficiency, while also improving accuracy and consistency. Early detection of high-risk patients and improved treatment planning can also lead to cost savings for health providers and improved quality of care.

Today there are relatively few imaging decision support tools on the market, but this is likely to be one of the main applications for machine learning in radiology in the coming years. At this year’s RSNA meeting, HealthMyne will release its Quantitative Imaging Decision Support (QIDS) platform which combines imaging data with electronic medical record, radiation therapy and other clinical information to provide clinical decision support in the primary read. Another example is AlphaPoint™ from RADLogics, which was launched in early 2016. AlphaPoint is a cloud-based platform that incorporates machine learning algorithmic tools for automatic analysis of images and merges the results with the patient’s medical record information in a preliminary report.

IBM Watson Health is currently developing a radiology assistant product, code-named Qibo, that reviews imaging studies and patient records to produce a summary of the most important information.

Computer-Aided Diagnosis

Computer aided diagnosis (CADx) systems are intended to provide information beyond detection and quantification by also providing interpretation of the scan, for example by providing a probability score for the presence of cancer. These systems are heavily regulated by the FDA and as far as Signify Research is aware, there are no commercially available CADx systems for clinical use.

Several companies are developing fully automated diagnostic systems, and probably the most ambitious is IBM Watson Health, whose project is code-named Avicenna, after the 11th-century philosopher who wrote an influential medical encyclopedia. The key difference between Avicenna and the existing decision support tools described in the previous section is that Avicenna has a "reasoning" system that makes use of multiple data sources, such as scans, patient records and data from similar cases, to suggest diagnoses and possible treatment paths. IBM has shown previews of Avicenna, but has not indicated when it will be commercialized. Other companies developing CADx solutions include Aidence, Enlitic and CureMetrix, to name a few.

To summarize, we expect 2017 to be a pivotal year for machine learning in medical imaging, as the FDA is expected to approve the first tranche of detection and quantification tools based on deep learning. Although the clinical benefits of these systems are somewhat unproven, the early results from validation trials are encouraging. We also expect several companies to release decision support tools, which will help educate the market on the benefits of these systems. In the US, the trend to value-based care will be a major driver for their uptake. Looking beyond 2017, the rapid advancements in artificial intelligence technology, primarily driven by technology giants such as Apple, Google, Facebook and IBM, suggest that the importance of machine learning in radiology will only increase over time. That said, there are still legal and ethical considerations to be addressed, not to mention proving the efficacy of the technology, before computer-aided diagnosis becomes mainstream.


New Market Report from Signify Research Publishing Soon
This and other issues will be explored in full in Signify Research’s upcoming market report ‘Diagnostic Analytics in Radiology – World Market Report’, publishing in January 2017. For further details please click here.

Analysis of Potential AGFA Takeover by CompuGroup

The Signify View: Analysis of Outcome as AGFA Mulls Takeover by CompuGroup

  • CompuGroup Medical has made a public non-binding indication of interest to take over the $2.6B AGFA business; no deal has been confirmed
  • The two companies have some crossover in medical Electronic Patient Record (EPR) software
  • A deal would allow CompuGroup Medical to tap into AGFA’s established customer base for imaging IT and X-ray equipment in Western Europe and US
  • Significant question mark remains over the future of the AGFA Graphics business line and strategic fit between the two firms

The Signify View
Here’s our key takeaways:

1. AGFA’s established customer base in Western Europe and the US is a big draw for CompuGroup Medical
The biggest positive for CompuGroup Medical is access to an established customer base in Western Europe and the US. CompuGroup’s core market focus and successes have been in Eastern and Central Europe to date, though recent acquisitions have expanded this reach.
In contrast, AGFA has spent decades building a strong reputation as a clinical specialist in medical imaging, mostly focused on digital radiography X-ray and imaging IT software. AGFA’s sizeable healthcare customer base would therefore make an attractive target for CompuGroup’s EPR products. In Western Europe, a market relatively immature in EPR adoption, this offers significant growth potential.
The US market for EPR in hospitals, by contrast, is intensely competitive and dominated by major players such as Epic Systems and Cerner, limiting the growth opportunity. However, the large and fragmented ambulatory and outpatient EPR market may be more attractive and could well suit CompuGroup’s products.

2. AGFA agreement would be an “all-in” move on healthcare
AGFA’s Healthcare unit accounted for only 41.5% of its $2.6 billion revenue in 2015, with Graphics and Speciality Products making up the remainder. AGFA may well be looking for some assurances on the future of the non-healthcare business lines. However, CompuGroup is a pure healthcare focused company, so the Graphics business is unlikely to be viewed as an essential asset, and could be targeted for potential sell-off.
If so, any deal acceptance from AGFA will essentially be an “all-in” move to focus on healthcare, mirroring the recent position taken by Philips, one of its major competitors north of the border. This will be viewed as a risk, given that the AGFA Healthcare business unit revenues have been stagnant in the last few years. However, when viewed in the wider market context, the healthcare sector is already a target for some of the largest technology and IT firms, such as Google, Apple, IBM and Microsoft. This confidence and investment therefore suggest it would be a risk worth taking.

3. Strategic mid- to long-term fit questionable
On balance the deal appears more favourable to CompuGroup than AGFA. It offers market access to more mature and profitable markets and will add a strong brand focused on clinical excellence in medical imaging and imaging IT software, providing further credibility to its growing market presence in Europe.
However, in context of the wider market, the deal would not solve the biggest strategic challenges for AGFA.

As a well-respected mid-sized firm entrenched in medical imaging hardware and software, AGFA’s focus should remain on driving clinical excellence in its core field of diagnostic imaging. Expansion to cover EPR and a new Eastern European customer base could drive some short-term revenue growth for both its software and hardware product lines. However, the combined EPR offering of an AGFA-CompuGroup deal would be unlikely to concern the major EPR or enterprise health IT vendors, so over time it would offer little long-term growth potential. Moreover, most market growth and disruption is in the enterprise health platform segment, especially population health management, analytics platforms and cognitive computing – segments in which a firm of AGFA’s size would struggle to compete.
Instead, AGFA should look to expand its reach within the clinical realm, focusing on applications where its expertise in clinical software and workflow can have most impact, especially applications relatively new to digitalisation, such as pathology. It should also look to partner in the provision of new business models, services, cloud technology, cyber security and cognitive computing (much like its recent association with the IBM Watson imaging collaborative).
So, unless CompuGroup is willing to pay a premium for AGFA, the likelihood of a deal is relatively low, given the challenge of the Graphics business and the poor long-term strategic fit of the healthcare assets.

Impact of Trump Presidency on Patient Engagement

The Signify View: Impact of Trump Presidency on Patient Engagement Market

One of the defining themes of the Trump presidential campaign was the pledge to “completely repeal Obamacare”. Since winning the election, the president-elect has given numerous signs that his definition of “completely” may not be quite the one we all expected or understood. Whether the Affordable Care Act (ACA) is fully repealed, or whether it is ultimately amended but maintained, there will be significant ramifications for the patient engagement platform market.

The Signify View
So why is the ACA important for the patient engagement market?

  • The ACA was designed to provide access to care for all Americans, improve population health outcomes, and decrease healthcare costs.
  • The ACA has been successful in significantly reducing the uninsured rate in the US.
  • That has brought additional demand on many healthcare services, such as primary care physicians and emergency departments – which works against the final objective of decreasing healthcare costs.
  • This is a clear indication of the importance of patient engagement, something that was not fully utilized in the initial ACA roll-out. Patient education is essential to better managing health costs, especially in ensuring patients appropriately use health services and self-manage chronic conditions.
  • This has brought patient engagement into focus as a central tool in addressing some of the cost and demand pressures driven by the ACA, leading ultimately to the reimbursement targets and measures put in place for meaningful use and now MACRA.

So if the ACA is completely repealed isn’t this bad news for patient engagement platform suppliers?

Trump’s Plans Will Need Patient Engagement
Post-election, all the signs are that parts of the ACA will remain, with Trump himself now toning down his position from “completely repeal Obamacare” to “Either Obamacare will be amended or repealed and replaced”. Whatever the final outcome, there are a number of reasons why Signify Research expects patient engagement platforms will still be key. These are explored below.

Reduce Healthcare Costs
One of the central goals in repealing or changing the ACA is to reduce costs – “We have to repeal it and replace it with something absolutely much less expensive” being the president-elect’s mantra. Doing this without completely pulling the rug out from under the 20 million people who now have health insurance coverage as a result of the ACA will be an impossible challenge without measures to encourage certain patient behaviours.
These behaviours, along with the drive to improve efficiency in patient management, include:

  • Improving patients’ knowledge, skills, ability and willingness to proactively manage their own health.
  • Providing interventions designed to increase activation and promote positive patient behaviour.
  • Reducing the burden on physicians and hospitals when managing a condition.
  • Supporting the move to value-based care, which Signify Research expects to remain a central theme going forward.

It’s therefore the Signify View that patient engagement platforms will be used as a supporting tool in the new administration’s drive to reduce the overall cost burden of healthcare in the US.

Enable Patient-centered Healthcare
The Trump administration has stated on its transition website that creating a patient-centered healthcare system is a key goal of its strategy. Patient-centered means patients being central to the decision-making processes governing their healthcare, and patient engagement platforms will be an essential tool in this process, as they allow patients to efficiently access the educational resources, personal medical data, financial tools and clinician support required for decision-making.

Health Savings Accounts
Finally, as was stated throughout the presidential campaign, the Trump solution will incorporate Health Savings Accounts (HSAs). HSAs should drive increased consumer demand for transparency in care costs and care quality – and patient engagement platforms can be a key tool in supporting this transparency.

For these reasons, the outlook for patient engagement platforms remains very positive. The current uncertainty around the future of the ACA does mean that the path forward is a little less clear, but it’s one we firmly believe will still have patient engagement at its core.

New Market Report from Signify Research Publishing Soon
This and other issues will be explored in full in Signify Research’s upcoming market report ‘Patient Engagement Platforms & Portals – World Market Report 2017’, publishing in February 2017. For further details please click here or contact Alex.Green@signifyresearch.net