
KDnuggets Home » News » 2015 » Jul » Opinions, Interviews, Reports » Lund University Develops an Artificial Neural Network for Matching Heart Transplant Donors with Recipients ( 15:n22 )

Lund University Develops an Artificial Neural Network for Matching Heart Transplant Donors with Recipients

Finding the right donor for a transplant is a challenging and intensively researched use case in data science. Here you can see how MathWorks tools were used to address the problem.

By MathWorks.

A heart transplant recipient’s survival depends on dozens of variables, including the weight, gender, age, and blood type of both donor and recipient, and the ischemic time—or the time during a transplant when there is no blood flow to the organ.

To better understand transplant risk factors and improve patient outcomes, researchers at Lund University and Skåne University Hospital in Sweden use artificial neural networks (ANNs) to explore the complex nonlinear relationships among multiple variables. The ANN models are trained using donor and recipient data obtained from two global databases: the International Society for Heart and Lung Transplantation (ISHLT) registry and the Nordic Thoracic Transplantation Database (NTTD). The Lund researchers accelerated the training and simulation of their ANNs by using MATLAB®, Neural Network Toolbox™, and MathWorks parallel computing products.

“Many of the techniques we use are computer-intensive and time-consuming,” says Dr. Johan Nilsson, Associate Professor in the Division of Cardiothoracic Surgery at Lund University. “We used Parallel Computing Toolbox with MATLAB Distributed Computing Server to distribute the work on a 56-processor cluster. This enabled us to rapidly identify an optimal neural network configuration using MATLAB and Neural Network Toolbox, train the network using data from the transplantation databases, and then run simulations to analyze risk factors and survival rates.”


Understanding how various risk factors affect survival rates involved hundreds of thousands of computationally and data-intensive operations—for example, the team had to test hundreds of ANN configurations to identify the best one. An analysis of six variables requires the simulation of 30,000 different combinations. Simulating all these combinations for 50,000 patients took weeks using an open-source software package.

Nilsson and his colleagues encountered reliability problems with the software they were using, as well. “The software was unstable, which led to crashes during long, multiday simulations,” Nilsson explains. “In addition, some of the results it produced were not quite right. When we publish our findings, we need to be very sure we can trust the results.”


To address the speed and reliability challenges, Lund University researchers developed their initial ANN model using MATLAB and Neural Network Toolbox. To find the optimal network configuration, they wrote MATLAB scripts that varied the number of hidden nodes used in the network for a range of weight decay (or regularization) values.
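A grid search of this kind, varying the number of hidden nodes against a range of weight decay values, can be sketched in Python. This is a hypothetical stand-in, not the team's code: they worked in MATLAB with Neural Network Toolbox, whereas here scikit-learn's `MLPClassifier` plays the role of the ANN, its `alpha` parameter the role of weight decay, and the data are synthetic.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for donor/recipient variables and survival labels
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

best = None
for hidden in (2, 4, 8, 16):              # number of hidden nodes
    for alpha in (1e-4, 1e-3, 1e-2):      # weight decay (L2 regularization)
        net = MLPClassifier(hidden_layer_sizes=(hidden,), alpha=alpha,
                            max_iter=500, random_state=0)
        # Score each configuration by cross-validated accuracy
        score = cross_val_score(net, X, y, cv=3).mean()
        if best is None or score > best[0]:
            best = (score, hidden, alpha)

print(f"best CV accuracy {best[0]:.3f} with {best[1]} hidden nodes, alpha={best[2]}")
```

Each (hidden nodes, weight decay) pair is scored independently, which is what makes the search easy to distribute across workers.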


The team used Parallel Computing Toolbox™ and MATLAB Distributed Computing Server™ to accelerate the simulation of more than 200 ANN configurations. They then evaluated the results to find the best-performing configuration.
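The pattern of farming independent configurations out to workers and collecting the scores can be illustrated with Python's standard library. This is a minimal sketch, not the team's MATLAB Distributed Computing Server setup, and `evaluate_config` is a dummy placeholder for training and scoring one network.

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_config(config):
    # Dummy stand-in for training and scoring one ANN configuration;
    # in the real study each call would train a network on registry data.
    hidden, alpha = config
    return -abs(hidden - 8) - alpha, config   # fake validation score

# Candidate configurations: hidden nodes x weight decay values
configs = [(h, a) for h in (2, 4, 8, 16, 32) for a in (1e-4, 1e-3, 1e-2)]

# Evaluate configurations concurrently, then keep the best scorer
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(evaluate_config, configs))

best_score, best_config = max(results)
print("best configuration:", best_config)
```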

After training the ANN using donor and recipient information from the databases, they verified the model’s accuracy by simulating outcomes for 10,000 patients who had been omitted from the training set, then compared the results against actual survival rates.
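This hold-out validation step looks roughly like the following Python sketch, with synthetic data and scikit-learn standing in for the registry data and the MATLAB model:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Synthetic stand-in for registry records: 6 risk variables, binary survival
X = rng.normal(size=(2000, 6))
y = (X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0])
     + rng.normal(scale=0.7, size=2000) > 0).astype(int)

# Hold out patients the network never sees during training
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1)

net = MLPClassifier(hidden_layer_sizes=(8,), alpha=1e-3,
                    max_iter=500, random_state=1).fit(X_train, y_train)

# Compare the simulated (predicted) survival rate with the actual rate
predicted_rate = net.predict(X_test).mean()
actual_rate = y_test.mean()
print(f"predicted survival {predicted_rate:.1%}, actual {actual_rate:.1%}")
```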

In the next phase, the team conducted thousands of simulations in parallel to rank the 57 risk factors considered in the study for predicting long-term survival.


Using results from Monte Carlo simulations on the computer cluster and simulated annealing techniques, the researchers identified the best and worst possible donors for any particular recipient.
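The simulated annealing idea, searching donor characteristics for the best and worst match, can be sketched as a toy example. Here `predicted_risk` is a hypothetical stand-in for the trained ANN's risk prediction, and the cooling schedule and step size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def predicted_risk(donor, recipient):
    # Hypothetical stand-in for the trained ANN: risk grows with
    # donor-recipient mismatch on each variable
    return float(np.sum((donor - recipient) ** 2))

def anneal(recipient, minimize=True, steps=5000, temp0=1.0):
    """Simulated annealing over donor characteristics."""
    sign = 1.0 if minimize else -1.0
    donor = rng.normal(size=recipient.shape)          # random starting donor
    cost = sign * predicted_risk(donor, recipient)
    for step in range(steps):
        temp = temp0 * (1 - step / steps) + 1e-9      # linear cooling schedule
        candidate = donor + rng.normal(scale=0.1, size=donor.shape)
        c_cost = sign * predicted_risk(candidate, recipient)
        # Accept improvements always, worse moves with Boltzmann probability
        if c_cost < cost or rng.random() < np.exp((cost - c_cost) / temp):
            donor, cost = candidate, c_cost
    return donor, sign * cost

recipient = np.array([0.2, -1.0, 0.5, 0.0])
best_donor, best_risk = anneal(recipient, minimize=True)
worst_donor, worst_risk = anneal(recipient, minimize=False)
print(f"best-match risk {best_risk:.3f}, worst-match risk {worst_risk:.3f}")
```

Running the same annealer with the objective flipped yields both ends of the donor spectrum for a given recipient.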

As a final step, the team developed an automated process that ranks the recipient waiting list to identify the best candidates for a prospective donor.
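Conceptually, the ranking step amounts to scoring every waiting-list recipient against the prospective donor and sorting, as in this sketch (again with a hypothetical `predicted_risk` in place of the trained ANN):

```python
import numpy as np

rng = np.random.default_rng(3)

def predicted_risk(donor, recipient):
    # Hypothetical stand-in for the trained ANN: mismatch-based risk score
    return float(np.sum((donor - recipient) ** 2))

def rank_waiting_list(donor, waiting_list):
    """Return waiting-list indices ordered from best to worst predicted outcome."""
    scores = [predicted_risk(donor, r) for r in waiting_list]
    return sorted(range(len(waiting_list)), key=lambda i: scores[i])

donor = rng.normal(size=4)
waiting_list = [rng.normal(size=4) for _ in range(10)]
order = rank_waiting_list(donor, waiting_list)
print("candidates, best first:", order)
```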

In the next major phase of the project, Lund University researchers are using the ANN to investigate the use of Human Leukocyte Antigen (HLA) genetic profiles to match donors with recipients.


  • Prospective five-year survival rate raised by up to 10%. “In a simulated randomized trial, the preliminary results show that the ANN model we developed using MATLAB and Neural Network Toolbox would transplant approximately 20% more patients than would have been considered using traditional selection criteria,” says Nilsson. “The prospective five-year survival rate for the ANN-selected patients was 5–10% higher than those matched with the criteria physicians use today.” 1, 2
  • Network training time reduced by more than two-thirds. “Using Neural Network Toolbox and MATLAB, it took us 5 to 10 minutes to train our ANNs,” says Nilsson. “Training took 30 to 60 minutes using open-source software. That is a big difference, because we were training and evaluating hundreds of network configurations.”
  • Simulation time cut from weeks to days. “When we switched to MATLAB and MathWorks parallel computing technologies, we completed experiments that regularly took 3 to 4 weeks in about 5 days,” says Nilsson. “More importantly, the simulations were completed reliably, with no crashes.”

1 Nilsson, J., Ohlsson, M., Höglund, P., Ekmehag, B., Koul, B., and Andersson, B. (2015). “The International Heart Transplant Survival Algorithm (IHTSA): A New Model to Improve Organ Sharing and Survival.” PLOS ONE, 10(3), e0118644. doi:10.1371/journal.pone.0118644

2 Ansari, D., Andersson, B., Ohlsson, M., Höglund, P., Andersson, R., and Nilsson, J. (2013). “CODUSA–Customize Optimal Donor Using Simulated Annealing in Heart Transplantation.” Scientific Reports, 3, 1922. doi:10.1038/srep01922

MathWorks retains full copyright of this paper.
