
FDA Releases Two Companion Reports on Medical Device Safety and Innovation

Today, the U.S. Food and Drug Administration’s (FDA) Center for Devices and Radiological Health (CDRH) is releasing two reports on medical device safety and innovation – the core pillars that help protect and promote public health for all. The “CDRH 2024 Safety...

Tri-Imaging, RTI Group Work Together

In a LinkedIn post, Tri-Imaging states, “We are honored to be the first ISO in North America to have the opportunity to try RTI’s new Mako X-Ray Testing Meter. The Mako meter is the most accurate and efficient testing meter that covers the broadest application range...

Detection Technology announces global availability of TFT flat panel detectors

Detection Technology, a global leader in X-ray detector solutions, announces the global availability of a comprehensive range of TFT (thin-film transistor) flat panel detectors. The portfolio includes IGZO (indium gallium zinc oxide) and a-Si (amorphous silicon)...

Bayer, Google Cloud Accelerate Development of AI-Powered Applications for Imaging

Bayer and Google Cloud have announced a collaboration on the development of artificial intelligence (AI) solutions to support radiologists and ultimately better serve patients. As part of the collaboration, Bayer will further develop its innovation platform to...

Empowered Management

By Matt Skoufalos

To say that hospital decision-makers are inundated with data is an understatement. Analytics are used to justify any change in thinking at every level of management, from purchasing to servicing to staffing and beyond. But any department head will also tell you that the value of that information only goes as far as the meaning you can extrapolate from it.

So, how are leaders to know which information is most useful, and under what circumstances? Do better data make a better manager? And, in the case of imaging services, which are both revenue drivers and cost centers, how does the applicability of that information empower those in the driver’s seat to best set the course for the winding road ahead?

Nicole Walton-Trujillo, ultrasound manager for Desert Radiology of Las Vegas, Nevada, believes that one of the best ways to determine which data are useful is to connect her departmental operations with the strategic plans of her enterprise. Walton-Trujillo describes “a line that goes between the C-suite and the strategic plan” as offering the clearest guidance for her teams.

“You should always look at your company’s three-to-five-year plan,” she said. “In the end, you as a leader are responsible for facilitating it. What’s the company plan, these are the benchmarks, and then what are you doing to meet that bullet point? What are you doing, what are you looking at, so it aligns?”

To that end, Walton-Trujillo has created an internal scorecard, updated monthly, that charts “anything I can access that I can quantify,” she said, from technology to patient volumes to staffing hours to service.

“I have a whole section for quality under biomed – every machine, every minute it was down, how many patients it affected in every modality every month,” she said.

Foremost, Walton-Trujillo said that creating internal metrics for downtime accounting not only “keeps my service provider and me speaking the same language,” but has helped her build the case for changes to staffing and resource allocation with her supervisors.

“We have a service provider, and our definition of downtime is not the same,” she said. “My definition is the second it went down to the second it’s back in patient use.”

Charting the amount and nature of the downtime her staff experiences allows Walton-Trujillo to let executives know which factors have adversely affected patient throughput, and consequently slowed revenue or monopolized human resources that could otherwise have been allocated elsewhere.
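The article doesn’t describe how Walton-Trujillo’s scorecard is actually built, but a minimal sketch of that kind of downtime tracking might look like the following Python. The DowntimeEvent record, the monthly_scorecard() roll-up, the field names and the sample figures are all hypothetical; the only thing taken from the article is her definition of the clock: from the second a unit leaves patient use to the second it is back in patient use.

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

# Hypothetical record of one downtime event. The clock runs from the moment the
# unit leaves patient use until it is back scanning patients, not until the
# service ticket is closed.
@dataclass
class DowntimeEvent:
    machine_id: str
    modality: str              # e.g. "CT", "US", "MR"
    out_of_patient_use: datetime
    back_in_patient_use: datetime
    patients_affected: int     # exams rescheduled or diverted

    @property
    def minutes_down(self) -> float:
        return (self.back_in_patient_use - self.out_of_patient_use).total_seconds() / 60


def monthly_scorecard(events: list[DowntimeEvent]) -> dict:
    """Roll events up by (year, month, modality): total minutes down and patients affected."""
    summary: dict = defaultdict(lambda: {"minutes_down": 0.0, "patients_affected": 0})
    for e in events:
        key = (e.out_of_patient_use.year, e.out_of_patient_use.month, e.modality)
        summary[key]["minutes_down"] += e.minutes_down
        summary[key]["patients_affected"] += e.patients_affected
    return dict(summary)


if __name__ == "__main__":
    # Illustrative events only.
    events = [
        DowntimeEvent("CT-1", "CT", datetime(2024, 3, 4, 8, 15), datetime(2024, 3, 4, 13, 45), 9),
        DowntimeEvent("US-2", "US", datetime(2024, 3, 18, 10, 0), datetime(2024, 3, 18, 11, 30), 3),
    ]
    for (year, month, modality), totals in monthly_scorecard(events).items():
        print(f"{year}-{month:02d} {modality}: {totals['minutes_down']:.0f} min down, "
              f"{totals['patients_affected']} patients affected")
```

A roll-up this simple still keeps the department’s own numbers on hand for the quarterly comparison with the service provider’s figures that she describes below.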

“I could now say to my COO, ‘I need a new scanner,’ and they’re going to say, ‘Show me why,’ ” she said. “I can say, ‘Here’s my staffing, here’s the amount of overtime I had to pay to accommodate patients.’ They can justify me being able to give somebody a job [because it means being] able to treat more patients a month.”

“It’s all about building that foundation of communication and managing it well,” Walton-Trujillo said. “More often than not, when I go to those quarterly meetings with my service provider, they expect it. Now they say, ‘Do our numbers match? Do I need to adjust anything?’ They know I know what I’m talking about. I have built that foundation of working with them.”

Despite tracking as much metric information as she can, Walton-Trujillo believes its only real value lies in whatever insight those data can provide into departmental operations. Her rule of thumb: “If they’re not doing you any good, don’t track them; or if you’re tracking them and you don’t understand why, you need to keep pushing to understand why.”

The data gathered in any department can also have a practical effect on staff management.

“You can’t educate and guide growth if you don’t understand why you’re tracking it,” Walton-Trujillo said.

She’s leveraged the data collected in her department to better inform staff via an interview style of communication, as opposed to making meetings into “a data dump.”

“How many staff meetings did you go to that they gave you a packet of information, and it went to your to-be-filed pile, and you never looked at it again?” she said. “If you don’t have that knowledge behind it, then you’re just parroting data, and I can do that in an email.”

Jim Fedele, senior program director of clinical engineering at the Williamsport, Pennsylvania-based BioTronics, is another firm believer that “those who have the data control their own destiny.” But like Walton-Trujillo, he believes that applying that data to make meaningful comparisons is everything.

“Looking at the process, you have to find out where to start,” Fedele said. “What’s broken that creates this situation?”

Determining performance indicators requires identifying the problem to be addressed, whether relevant data can be collected around it, the ways in which those data can be affected, and whether measuring those outcomes can influence change, Fedele said. But even some of the most commonly gathered metrics may not yield meaningful comparisons in every circumstance.

“At our level, a lot of people look at uptime as an indicator for imaging equipment,” Fedele said. “A lot of my hospitals have only one CT scanner, so when one CT scanner’s down, that’s a problem. But do we really have an ability to affect that? You can kind of gather that data, you know that the unit was down, but then it becomes these gray areas – was it completely down, or was there something that we couldn’t do? To me, gathering that as a performance indicator or as a data point is less useful because there’s so many different ways to interpret that.”

The most meaningful data that are collected require the least amount of distilling, he said. If a data set requires excessive manipulation to derive the desired results, there’s a fair chance that the metric it produces won’t offer a true reflection of conditions in the department. And when outcomes are tied to those data – like people’s employment, or reimbursements – there can be natural inclinations to massage the figures, which is counterproductive to performance improvement.

“Every day, we should learn,” Fedele said. “That’s what the data and the analytics should be for: to learn about what’s going on with your equipment, with your users, the environment. As an industry, we are afraid to collect certain data, to really disclose the things that are going on, because we’re afraid we’re going to get benchmarked against somebody else, and our jobs are on the line.”

Another concern arises when performance metrics are driven by outside pressures, like third-party consultancies, auditing agencies or leadership goals that are out of alignment with departmental practices. Over the course of a 30-year career, Fedele estimates he’s been through a consultant’s review every three years or so. He’s seen multiple variations on performance benchmarking, and experienced the impact of self-reported data on departments of various sizes.

“We went through those things to drive change internally,” he said. “Inevitably, what I found was gross variability in the data set. When I watched the outcomes from that, people generally got defensive. Right away, as soon as you have someone asking you about this data, you’re going to resist that because you’re worried about your own career.”

“I’m always uncomfortable with some of the consultants that come in and say, ‘You should have five guys but you have seven,’ and you know all seven of your guys are really busy,” Fedele said. “What is it that you have to be able to do to function with five guys? A lot of times that gets pushed back on the department manager to figure it out. That’s not really fair.”

Conversely, Fedele believes that if leadership embraces the intention of data collection as performance improvement and not as coded language for cost- (and job-) cutting, then that information can be much more useful for driving meaningful internal change. He also pointed to two significant institutional hurdles often overlooked in efficiency improvement: namely, that a significant percentage of health care is staffed by an aging workforce, and that frequent turnover in key executive positions makes it difficult to hit goals when institutional courses are constantly shifting.

“For a lot of people, this is their single career, and this is all they know,” Fedele said. “Hospitals have always been clunky because they used to be a captive audience. What we have now is consumerism and highly competitive markets; offsite centers and specialty clinics that do it better than anybody else because they think about throughput.”

“As we get younger leaders in key positions, I feel like some of this change will occur as needed,” he said. “It takes longer than five years, and most of these people last three.”

As decision-makers and their priorities can shift frequently, third-party resources can provide a place to seek guidance, said Bill Algee, director of imaging services at Columbus Regional Hospital in Columbus, Indiana. As president of the Association for Medical Imaging Management (AHRA), Algee believes that “having a place to go where you can discuss those dilemmas” can offer departmental leaders the perspective they need to recalibrate their own filters outside of the strictures of their institution.

“Getting the right data out there can be difficult,” he said. “Who’s making those decisions and the different metrics you can use can be somewhat cumbersome at times. There’s a lot of different ways to get it, but making sure you’re comparing apples to apples is the biggest challenge.”

General measurements like work-hours per procedure or costs per procedure are common denominators in many facilities, Algee said, but some of the most useful metrics in his own hospital are specialized details that have been derived from internal process improvements, which may not always be tied to quantitative evaluations.
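As a rough illustration of those common-denominator comparisons, a sketch like the one below normalizes totals by procedure volume and flags departures from a benchmark. The function names, the 20 percent tolerance and every figure here are assumptions for illustration, not numbers from Columbus Regional or any published norm.

```python
# Illustrative sketch of per-procedure benchmarking; thresholds and benchmark
# sources will differ by facility and by benchmarking service.

def per_procedure(total: float, procedures: int) -> float:
    """Normalize a total (dollars or worked hours) by procedure volume."""
    return total / procedures if procedures else 0.0


def flag_outliers(actual: dict[str, float], benchmark: dict[str, float],
                  tolerance: float = 0.20) -> dict[str, float]:
    """Return metrics that deviate from the benchmark by more than the tolerance,
    expressed as a fractional difference (positive = above benchmark)."""
    flags = {}
    for metric, bench in benchmark.items():
        if metric in actual and bench:
            diff = (actual[metric] - bench) / bench
            if abs(diff) > tolerance:
                flags[metric] = diff
    return flags


if __name__ == "__main__":
    # Hypothetical department totals and a hypothetical "national norm".
    actual = {
        "nuc_med_cost_per_procedure": per_procedure(412_000, 800),
        "nuc_med_hours_per_procedure": per_procedure(1_900, 800),
    }
    benchmark = {
        "nuc_med_cost_per_procedure": 310.0,
        "nuc_med_hours_per_procedure": 2.1,
    }
    for metric, diff in flag_outliers(actual, benchmark).items():
        print(f"{metric}: {diff:+.0%} vs. benchmark")
```

As Algee’s nuclear medicine example shows next, a flag like this is only a prompt for a closer look, not a verdict on the department.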

“There’s extraneous things going on that you can’t quantify by one procedure,” Algee said. “You have to look at the whole picture, and you have to listen to the story. Some of those soft skills include listening to the staff. How are we setting those expectations?”

Algee also believes that facilities can hamstring their operations by trying to attach too much meaning to data as a differentiator. Evaluating his monthly metrics for cost efficiency produced a report showing Columbus Regional was “way off the national norm” for per-procedure nuclear medicine costs. At the outset, it seemed to present a budgetary problem, but a deeper dive into the data revealed that was because the institution performs cardiac PET studies that use a pricey radiopharmaceutical.

“That’s a choice we’ve made because the patients are getting better studies [with the contrast agent],” Algee said. “We were about to stop doing it because our costs were so far out of whack. Could we give that up? Yes, but you’ll see that cardiac PET is getting more momentum. Part of my fear was, I can get rid of that, but if it’s a hotbed in two years, and I go back to that vendor, the costs might go up. You have to play that market.”

“We’ve decided that we’re going to really hone in on that because that data helped us to drive to that,” Algee said. “What made us really look at what our cost per procedure was helped us see [that] we do have a competitive edge, [and] we’re not using it to our advantage as much as we should be.”

“Those things all relate together,” Algee said. “I think sometimes we get caught looking at individual things instead of the whole spectrum. It’s trying to find that balance.”
