Friday, May 3, 2024

FDA’s CTO Discusses Data, Change Management and the FDA’s Technology Modernization Action Plan

FedHealthIT’s President, Susan Sharer, had the opportunity to sit down with Vid Desai, Chief Technology Officer with the Food and Drug Administration (FDA), to discuss the massive amounts of data the FDA touches; prioritization; the challenge of change management; and how industry can help solve some of these bigger problems.

Explain the Enormity of the Data the FDA Must Manage

Data sets are growing and changing at an accelerated pace. The information we used to receive was static in the sense that it reflected actions and results collected and submitted over a certain period of time. The science is changing; genomics and DNA are taking on a bigger role, and that is what is being submitted to us. You hear the term “farm to table,” but for us it is “from seed to germination to table,” and all of the inspection, transfer, and audit data points along the way from a safety perspective. The science is also more targeted, as are treatments and therapies. Drug companies are producing these targeted treatments, and that means they have to develop hundreds of specialized drug entities. The number of data sources, and the volume and speed at which useful data is generated, are growing exponentially. In such a dynamic environment, there are obvious benefits to obtaining structured data directly, rather than having to process it from unstructured PDF documents.
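
To make the structured-versus-unstructured point concrete, here is a minimal sketch (the field names, values, and text are invented for illustration, not an actual FDA submission format) contrasting a structured JSON submission with scraping the same fact out of free text such as PDF-extracted prose:

```python
import json
import re

# Hypothetical structured submission: the adverse-event count is a named field.
structured_submission = json.dumps({"product_id": "ABC-123", "adverse_events": 4})
record = json.loads(structured_submission)
events_structured = record["adverse_events"]  # direct, unambiguous access

# The same fact buried in free text, e.g. text extracted from a PDF report.
pdf_text = "During the reporting period, four (4) adverse events were observed for ABC-123."
match = re.search(r"\((\d+)\)\s+adverse events", pdf_text)
events_unstructured = int(match.group(1)) if match else None  # brittle, pattern-dependent

print(events_structured, events_unstructured)  # both print 4, but only one path is reliable
```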

The huge amount of data we are receiving is hard to visualize, and added to the sheer volume is the fact that the data is coming in constantly, in near real time. All of this means we want to be able to consume data as it becomes available and make real-time decisions so that, if something is happening, we can make those critical decisions faster and get them right. This is where technologies like AI come into play; they can solve the challenge of taking in that huge flood of information and processing it quickly.

Looking around at the FDA and across industry, we are in the infancy of using those technologies; we’re really still experimenting. Part of that is because we still have a lot of data quality issues to solve. If you consider all of the information inside an electronic health record, and how many hands have had a role in pulling it together (different physicians, perhaps different hospitals, each of whom may use different terminology to explain their understanding or interpretation of something), you start to see the challenge. For machine learning to be effective, there has to be conformity, and we aren’t there yet.
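
As a toy illustration of that conformity problem (the terms and mappings below are invented for the example, not a real clinical code system), normalizing free-text entries to a shared vocabulary is a typical first step before any machine learning:

```python
# Hypothetical synonym map standing in for a controlled clinical vocabulary.
SYNONYMS = {
    "heart attack": "myocardial infarction",
    "mi": "myocardial infarction",
    "acute mi": "myocardial infarction",
    "high blood pressure": "hypertension",
    "htn": "hypertension",
}

def normalize(term: str) -> str:
    """Map a clinician's free-text term to a canonical label, if one is known."""
    key = term.strip().lower()
    return SYNONYMS.get(key, key)  # fall back to the cleaned-up original

raw_entries = ["Heart Attack", "acute MI", "HTN", "hypertension"]
print([normalize(t) for t in raw_entries])
# ['myocardial infarction', 'myocardial infarction', 'hypertension', 'hypertension']
```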

The FDA recognizes the potential for these technologies, but we have to resolve our basic infrastructure and technologies first. We have much to modernize and transform but it must be done in a way that works within our dynamic environment.

The FDA’s Technology Modernization Action Plan

This forward movement that is required is detailed in our Technology Modernization Action Plan, which outlines the “important near-term actions that the FDA is taking to modernize use of technology—computer hardware, software, data, and analytics—to advance the FDA’s public health mission.”

There are three key elements to the plan:

Get the Fundamentals Right!

We need to fix the fundamentals first. We need to ensure we have caught up to where the technology environment is today in terms of cloud and the ability to be Agile. There are still some challenges and we know that our current structure is not conducive to where we need to go. We know we must be faster and more nimble, to be ready for real change, and that is a focus of ours.

Develop Use Cases to Align with Regulatory Decision Making

Part of moving forward will require a focus on use cases, framing both near-term issues and more aspirational problems we want to solve, so that people with experience and expertise can see where we are driving and bring forward the solutions we need. We need to ensure the work environment is efficient and that things are being processed in a modern way. There are some technology solutions that can be applied.

On the other side is the more dynamic, aspirational vision. Think, for instance, of a patient who goes to pick up a prescription and receives a customized label: instead of having to read through eight pages of information on the drug, they get a very focused, personalized label, based on their individual history, that states, “these are the interactions to be aware of.” Think of going to a store and being able to scan a bar code to see the path of the product you are looking at, so you can make an informed decision based on all of the available information.
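
As a rough sketch of what the bar code idea could look like behind the scenes (the product ID, events, and lookup function here are all hypothetical), the scan simply becomes a key into the product’s recorded history:

```python
from datetime import date

# Hypothetical supply-chain events keyed by the product's bar code.
PROVENANCE = {
    "0123456789012": [
        (date(2024, 1, 5), "seed lot certified", "Farm A"),
        (date(2024, 3, 2), "harvest inspected", "Farm A"),
        (date(2024, 3, 9), "processing audit passed", "Plant B"),
        (date(2024, 3, 15), "shipped to retailer", "Distributor C"),
    ],
}

def product_path(barcode: str) -> str:
    """Return a human-readable history for a scanned bar code, if one is on file."""
    events = PROVENANCE.get(barcode)
    if not events:
        return "No provenance data available for this product."
    return "\n".join(f"{d.isoformat()}: {what} ({where})" for d, what, where in events)

print(product_path("0123456789012"))
```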

If we can create use cases for both the mundane and the aspirational, we can think forward to what implementation might look like, to the day-to-day systems, to what we need to do, how we source the data we need, and then how we use that data to make decisions.

Legislation is also a big piece of this. None of this forward motion would have been possible without legislation like the 21st Century Cures Act. Legislation that makes us fundamentally change how we think about and consume data is critical. There are still some barriers to how we bring all of it together, but I believe legislation can play a role in encouraging that forward motion, in giving that nudge industry sometimes needs to make things happen. We don’t want to make anyone less competitive and that can be a primary disincentive to sharing for the industry, but there is a greater good and legislation can be a push there.

Industry Engagement

There are also problems we are seeing that do not yet have solutions. The data sets we are interested in are distributed from the source, so a big question is how to access all of the data, bring it together, and ensure it can be trusted. Blockchain is perhaps part of the solution, but how do we ensure the data is good? We need to find ways to get information from a single source of truth, but how do you decide what that is? There are elements technology can solve, but they aren’t yet properly put together to provide assurance that we are getting the information from the right source, that it is maintained in the right way, and that we are tuned in to any changes that may happen along the way. It’s a very complex problem, and we need smart people to help figure it out.

There is the huge problem mentioned before around how we access and consume data from the master data management side. There will never be a single source for everything we need, so things like security, access, and trust remain unsolved problems. Blockchain will play a role, but it has to be more than that; the solution has to be operationally supportable and implementable in a distributed way.
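
One way to picture the trust problem, independent of any particular blockchain product, is fingerprinting: each source publishes a hash of the record it holds to a shared registry, and consumers verify what they receive against that fingerprint. The registry, record format, and source names below are assumptions made for the sketch:

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding of the record."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A shared registry of expected fingerprints (a distributed ledger could anchor these).
registry = {}

# A distributed source registers the fingerprint of the record it publishes.
source_record = {"lot": "LOT-42", "result": "pass", "inspector": "Site 7"}
registry[("Site 7", "LOT-42")] = fingerprint(source_record)

# A consumer later receives a copy and checks it against the registry.
received = {"lot": "LOT-42", "result": "pass", "inspector": "Site 7"}
ok = registry[("Site 7", "LOT-42")] == fingerprint(received)
print("trusted copy" if ok else "record was altered or came from a different source")
```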

We’re thinking more about public forums, about taking the opportunity to present our challenges and use cases as a way to engage industry to help us solve those key issues.

How do you Manage Innovation and Scale with Flat Budgets?

We have to shift from a project-based environment to one of continuous improvement. One of my pet peeves is that people talk about modernization but most look at it as a project, and I think that is fundamentally flawed. There may be a modernization activity, to catch up, that is a project, but if you haven’t implemented continuous improvement in the environment, then you have wasted a lot of time and money and will need another modernization project a few years down the road.

This is one of the most challenging things we are embarking on. It isn’t about keeping a data center running but about keeping it modern. The contract to keep a data center modern is very different from the one that just keeps it running.

Truly, data centers are a business we need to be getting out of. In the world of IT, “I” comes before “T.” Our business is information, and the truth is, cloud providers are doing a lot better job at the technology part. They are doing it to scale, with efficiencies we can’t meet, so our focus needs to be more on the information side.

I view budgets as having a discretionary and a non-discretionary component. The non-discretionary side holds all of the costs you have very little control over, like maintenance costs. Unfortunately, a big part of most IT budgets is locked up by those non-discretionary costs. Discretionary costs are those you have control over and can make a choice on; they allow you to fund innovation as well as activities that are aligned to your business needs. That is where businesses get the most value from their IT investments. I have to figure out ways to shrink non-discretionary spending so that we can invest in projects where the agency gets the most value and that advance our agency mission.

I have identified a few priorities, including process improvement, so that we are able to do things faster and more efficiently. We also need to get our financial house in order with regard to the discretionary and non-discretionary spending I mentioned. Moving to a cloud-forward approach is a must. We also need to enhance the user experience related to the programs we deliver and how we deliver them, and then of course, people and culture. As technologists, we may not naturally be good at this last part, so we need experts to help us deal with those challenges.

Tell us More About the User Experience Need

We live in a tech-savvy world, and people expect their IT departments to deliver a modern user experience across all IT services. We plan to deliver a modern experience by ensuring the equipment our users use directly, or behind the walls, is current and well maintained. This speaks to the point of having an environment where continuous improvement is part of the IT culture and not just a modernization project. Running an old operating system on a new device really does not enhance the user experience. Embracing programs like Microsoft’s Windows-as-a-service gets us to a point where the operating system is updated every six months. The support windows for operating systems and platforms are shrinking from many years to a few months, and this forces applications to also adopt a continuous update cycle. The environment is moving to something more agile, so even if you don’t like the coming change, you have to do it. I don’t think many people have put the pieces together to understand the ripple effect of these changes and how to manage it all.

We’re having discussions across our centers around the impact of this continuous improvement environment. I think we’ll see that more of the things that used to be hard to do in Government will start happening. As we shift to this very dynamic and continuously improving environment, it requires change and support from umbrella organizations like HHS, and even across the Federal government. Solutions for typical IT problems are pretty common; some of the harder problems, around how we get information, share it, and use it, require collaboration across centers, agencies, federal departments, and key stakeholders across the healthcare and food ecosystems to move forward.



Heather Seftel-Kirk
A writer for more than a decade, Heather helps hone the voice of FedHealthIT, helping to shape the information we share, working with collaborators and stakeholders to ensure they are delivering the message they intend and that it is the information our readers want to hear. A firm believer that every person has a story to tell and that every story is worth sharing, if told right, she also believes the written word carries power – to inform, to educate, and also to bring people together.
