What you need to know about data fluidity and federated AI

Sharecare is a digital health company that offers consumers an artificial intelligence-based mobile application. And the company has a strong take on AI and how it’s used.

Sharecare believes that while other companies are using augmented analytics and AI to understand data with business intelligence tools, they are missing out on the benefits of data fluidity and federated AI. Using federated AI and data fluidity, Sharecare says it digs deeper to find hidden similarities in data that business intelligence tools wouldn’t be able to detect in healthcare environments.

To better understand data fluidity and federated AI, Healthcare IT News met with Akshay Sharma, executive vice president of artificial intelligence at Sharecare, for an in-depth interview.

Q: What exactly is federated AI and how is it different from any other form of AI?

A: Federated AI, or federated learning, ensures that user data stays on the device. For example, applications that run specific programs at the edge of the network can still learn to process data and create better and more efficient models by sharing a mathematical representation of key clinical characteristics, not data.

Traditional machine learning requires centralizing data to train and build a model. However, with advanced artificial intelligence and federated learning combined with other privacy preservation techniques and zero trust infrastructure, it is possible to build models in a distributed data configuration while reducing the risk of a single point of attack.
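The mechanic Sharma describes — each device trains on its own data and shares only a mathematical representation (model parameters), never the data itself — can be sketched with federated averaging. The following is a generic, minimal illustration with synthetic data, not Sharecare’s implementation; the linear model, learning rate, and two-client setup are all assumptions made for the sketch.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on its private data.
    Only the resulting weight vector ever leaves the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient, linear model
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """One round of federated averaging: clients train locally, and the
    server averages the returned weights. Raw data is never centralized."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

# Two synthetic private datasets, standing in for data held on two devices
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=50)
    clients.append((X, y))

# The global model converges toward the shared signal in both datasets,
# even though the server never sees either dataset.
w = np.zeros(2)
for _ in range(30):
    w = federated_round(w, clients)
```

In a production system the averaged updates would typically be further protected with the privacy-preservation techniques mentioned above (for example secure aggregation or differential privacy), which this sketch omits.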

The application of federated learning is also applicable in cloud environments where data does not have to leave the systems on which it exists but can enable learning. We call this federated cloud learning, which organizations can use to collaborate, keeping data confidential.

Q: What is data fluidity and why is it important for AI?

A: Data fluidity is a framework and a set of tools for quickly unlocking the value of clinical data by involving every key player simultaneously in a collaborative environment. A machine learning environment with a data fluidity framework engages clinicians, actuaries, data engineers, data scientists, managers, infrastructure engineers, and all other business stakeholders to explore data, ask questions, quickly create analyses and even model the data.

This new approach to enterprise data analysis is designed specifically for healthcare to improve workflows, collaboration and rapid prototyping of ideas before spending time and money on model creation.

Q: How do data fluidity platforms make it easier and more efficient for analysts, engineers, data scientists and clinicians to collaborate?

A: Traditional healthcare systems are very siloed, and many organizations struggle to discover the value of their data and unlock actionable clinical trends and insights. Not only are data-authoring systems and teams isolated from data-transformation teams and systems, but engineers and data scientists use coding languages while clinicians and finance teams use Word or Excel.

This disconnect creates a situation where knowledge of the data is lost in translation outside of the programming environment. Transformations across system boundaries are lossy, with no feedback loops to improve an algorithm or code. Yet all stakeholders need early and iterative access to data to build health algorithms efficiently and with greater transparency.

The modern healthcare stack makes it easy for cross-functional teams to collaborate from a single, data-driven view in Python notebooks, with a user interface for non-engineering partners. AI models can be time-consuming and expensive to build, so hedging your bets by getting early prototype feedback from every area of expertise is essential.

Data fluidity provides an environment for critical stakeholders to discover value on top of data and information in a real-time, agile and iterative manner. Feedback from non-engineering teams is immediate and can instantly improve the model or the underlying code in the notebook.

Each domain expert can have multiple views of the data that facilitate deep collaboration and discovery of insights, enabling a continuous learning environment, from care to research and from research to care. Data fluidity works with cloud-native architectures, and many of its techniques can also extend automatically to edge computing, where the patient and their data reside.

Q: Why do you say the future of healthcare analytics is federated AI and data fluidity?

A: Traditional healthcare analytics relies on understanding a given set of data using business intelligence-driven tools. The employees who use these tools are generally not engineers but analysts, statisticians and business users.

The problem with traditional business data analysis is that you don’t learn from the data; you only understand what’s inside it. To learn from data, you need to bring machine learning into the equation, along with effective feedback loops from all relevant stakeholders.

Machine learning helps reveal hidden patterns in data, especially if there are nonlinear relationships that are not easily identifiable by humans. Proactive collaboration at the data layer provides transparency into how analysis models or metrics are built and makes it easier to disentangle biases or assumptions and correct them in real time.
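As a toy illustration of that point (not drawn from Sharecare’s tooling), here is a case where a purely linear summary statistic — the kind a correlation-driven BI view relies on — misses a relationship that even a simple nonlinear fit recovers. The quadratic synthetic data and the NumPy-only fit are assumptions made for the sketch.

```python
import numpy as np

# A symmetric nonlinear relationship: y depends strongly on x, but the
# linear (Pearson) correlation is near zero, so a correlation-based
# dashboard would report "no relationship."
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 2000)
y = x**2 + rng.normal(scale=0.02, size=2000)

pearson_r = np.corrcoef(x, y)[0, 1]  # close to 0 despite the dependence

# A simple nonlinear model (degree-2 polynomial fit) recovers the pattern
# almost perfectly, as measured by R-squared.
coeffs = np.polyfit(x, y, deg=2)
y_hat = np.polyval(coeffs, x)
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

Real clinical relationships are of course messier than a clean quadratic, but the same principle holds: models that can express nonlinearity surface structure that linear summaries hide.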

Federated AI and data fluidity also tackle barriers to data acquisition, which often aren’t technological, but instead include privacy, trust, regulatory compliance, and intellectual property. This is especially the case in healthcare, where patients and consumers alike expect privacy with respect to personal information, and where organizations want to protect the value of their data and are also required to follow regulations such as HIPAA in the United States and GDPR in the European Union.

Access to health data is extremely difficult and protected by walls of compliance. Usually, at best, access is granted to anonymized data under several security measures. Federated AI and the principles of data fluidity address these concerns by sharing a model without sharing the data used to train it. They will play a critical role in understanding information within distributed data silos while navigating compliance barriers.

This privacy-preserving approach to unlocking the value of healthcare data is crucial for the future of healthcare. It’s about improving the adoption and understanding of machine learning in healthcare to generate actionable insights and better health outcomes. Federated AI goes beyond traditional enterprise data analysis to create a machine learning environment for data fluidity and explainability that enables parallel model training from automated multi-omic pipelines.

Twitter: @SiwickiHealthIT
Email the author: [email protected]
Healthcare IT News is a publication of HIMSS Media.

