Challenges surrounding the safe supply of blood


Lorna Rothery discussed the challenges associated with the safe transfusion and supply of blood with Dr Evan M. Bloch, focusing on where future efforts to reduce transfusion-transmitted infections should be directed

What are some key challenges regarding the sustainable and safe supply of blood and blood products, particularly in resource-limited settings?

Globally, there is a blood deficit. Studies have shown that much of sub-Saharan Africa, large parts of South Asia, and the Caribbean face a significant gap; approximately 40% of blood is collected in high-income countries, which comprise only about 18% of the world’s population. There are two aspects to consider here: the blood supply, which includes access and availability, and safety. While these elements are linked, they also have distinct differences.

If you examine the evolution of blood products, deficiencies can be found throughout the process, spanning national policy and oversight, blood collection and testing, and the clinical use of blood through to post-transfusion surveillance. A primary issue is the need for blood donors, which is complex and relies heavily on human factors. It is crucial to have individuals who understand the importance of blood donation and to recruit them successfully, which can be quite costly and complicated. The donor pool influences both safety and availability.

The WHO advocates exclusively for voluntary donors, as they have long been recognized as posing the lowest risk of transfusion-transmitted infections. The challenge lies in recruiting these voluntary donors; ironically, recruiting volunteers can sometimes be more expensive than paying individuals, since establishing the infrastructure needed to contact people and maintain engagement with donors is essential for sustaining the donor pool.

The other two categories that exist, to varying degrees, in low-income settings are replacement donors and paid donors. Replacement donors include family members and friends of the intended recipient, often recruited during emergencies within a hospital-based system. For instance, if a patient needs an operation, the physician or blood bank might approach the family. In one scenario, the hospital has some blood in inventory but needs it replenished, requiring family and friends to commit to donating a certain number of units. Alternatively, if blood is needed urgently, they may reach out to friends and family members asking, “Who can donate?” This creates a disincentive for anyone to admit to high-risk behavior.

Paid donors are also not a uniform group; in places like sub-Saharan Africa, the category is poorly defined. In studies to date, replacement and paid donors have consistently shown higher rates of infectious marker positivity than voluntary, non-remunerated blood donors. However, determining the extent of this issue is challenging. The prevailing belief is that, in Africa, paid donation is a last resort, an unfavorable option that poses significant risks.

In former Soviet Bloc countries, paid donation was a more common practice, and the associated risks have been more variable. Infectious risk by remuneration status can also be addressed, at least in part, by controlling for first-time versus repeat donation. First-time donors are disproportionately at higher risk because their reasons for donation are often unclear; in some cases, they may be test-seeking, given known high-risk behavior. Repeat donors, by contrast, consistently exhibit a much lower infectious risk, regardless of whether the donation is voluntary or paid. There is a strong case for recruiting voluntary donors, but an emphasis on donor retention and repeat donation is critical to sustaining the donor pool.

Another challenge involves laboratory infrastructure, which is not exclusive to blood banking but spans pathology and lab medicine. In some respects, it is a marketing issue in that laboratory medicine is largely hidden despite its importance to clinical care: if you can’t diagnose, you can’t effectively guide treatment. In blood banking, testing has historically been inadequate in low- and middle-income countries. The situation varies significantly by location; even within the same country, vastly different capabilities may exist. One can imagine low-resourced, remote islands or rural communities at one end of the spectrum and large cities resembling high-income settings at the other.

High-income countries have long enjoyed robust donor history screening using donor history questionnaires, along with parallel antibody and molecular testing for major transfusion-transmitted infections. The major advantage of molecular testing is its extraordinary sensitivity, which shortens the window period of detection. While many see HIV as the major risk, hepatitis C virus can have a window period of two to three months before antibodies become detectable; HIV, by contrast, typically has a window period of about two to three weeks. Molecular assays can reduce the window periods to six or seven days for both viruses.
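The practical impact of a shorter window period can be illustrated with the widely used incidence/window-period model, in which residual risk per donation is approximated as the donor incidence rate multiplied by the length of the window period. The sketch below uses the window periods quoted above, but the donor incidence figure is a purely hypothetical assumption for illustration, not data from this interview:

```python
# Incidence/window-period model of residual transfusion risk:
# risk per donation ≈ incidence (infections per person-year) × window period (in years).
# The incidence value used below is a hypothetical example, not a measured figure.

def residual_risk(incidence_per_100k_py: float, window_days: float) -> float:
    """Approximate per-donation risk of collecting an infectious window-period unit."""
    incidence_py = incidence_per_100k_py / 100_000  # infections per person-year
    return incidence_py * (window_days / 365.0)

# Hypothetical donor HCV incidence of 5 infections per 100,000 person-years:
antibody = residual_risk(5, 75)    # ~2.5-month antibody window period
molecular = residual_risk(5, 6.5)  # ~6-7-day molecular (NAT) window period
print(f"antibody screening:  ~1 in {1 / antibody:,.0f} donations")
print(f"molecular screening: ~1 in {1 / molecular:,.0f} donations")
```

Under these assumptions, shrinking the window period from roughly 75 days to roughly a week cuts the modeled residual risk by more than an order of magnitude, which is why molecular testing is so valuable where it can be afforded.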

Certain pathogens are known to be endemic in resource-limited settings, yet effective strategies to combat them are insufficient. Much of this inadequacy stems from challenges in the supply chain, including the cost of reagents and the need for technical expertise to operate the assays and equipment. Even when testing is available, it is often limited to basic screening, providing only positive or negative results; donors are managed accordingly, so false-negative and false-positive results can go uncorrected. In a high-income setting, multiple layers of testing are typically performed, including screening, repeat testing, confirmatory testing, and supplementary testing. Laboratory capacity poses a significant challenge for safe blood transfusion in many low- and middle-income countries.

Blood donors have long been an underappreciated resource for population-based epidemiology. Various investigators have leveraged this in certain areas, leading to incredible work. There are numerous examples of how donor populations have been used to answer questions of biology, such as those related to the epidemiology and pathogenesis of West Nile virus. Harvey Alter won the Nobel Prize for his research on transfusion-related hepatitis, which significantly advanced our understanding of infection dynamics.

Donor and transfusion-recipient cohorts remain a largely untapped resource, particularly for emerging infectious diseases. Although many discuss active surveillance, and there are select examples, there isn’t a well-coordinated and cohesive approach in place. In high-income countries, initiatives have been developed for surveillance, but less so in low-income countries, where such an infrastructure could have the greatest yield. Surveillance is crucial, presenting a massive opportunity. Some studies on emerging infectious diseases, particularly those related to SARS-CoV-2 and COVID-19, have been insightful. Utilizing donors themselves to provide information about ongoing trends is highly valuable, even if it is not framed as a transfusion problem.

How is the prevalence of infectious diseases impacted by factors like urbanization and climate change?

It’s fashionable to claim that everything is getting worse, although I have to concede that it probably is. However, this does not necessarily occur linearly. Vector-borne diseases, for instance, are a significant concern associated with climate change: increased rainfall creates more standing water, which means more mosquitoes and, consequently, more cases of dengue and malaria, among other diseases. Conversely, desertification can have the opposite effect by drying out environments, and rising temperatures can also affect the biology of disease vectors such as ticks. Urbanization and climate change undeniably alter the epidemiology of infectious diseases, but predicting the exact direction of these changes is difficult; in some regions, conditions may deteriorate, while in others, they could improve. Factors unrelated to climate also drive the spread of infections: urbanization influences the dynamics of both vector-borne pathogens and respiratory agents, as was seen in the SARS-CoV-2 outbreak, and international travel facilitates spread, as was observed with Zika.

How can innovations and new technologies help reduce the risks associated with infectious pathogens?

Pathogen reduction has gained traction, particularly with the availability of pathogen-reduced platelets, which markedly reduce the risk of bacterial contamination and transfusion-associated sepsis. We are still awaiting an optimized red cell or whole blood technology. Historically, the paradigm has been that a test is developed and then implemented when a new pathogen emerges; this process is slow, targeted, and expensive. The concept behind pathogen reduction is to apply a technology that treats the blood product globally. Because pathogen reduction is effective against different classes of pathogens, it can address both the pathogens we currently test for and emerging or reemerging agents, making it a proactive approach.

There have been discussions about adopting more multiplex and array-based testing, which could help screen for more agents. The challenge is interpreting the results: we don’t want to uncover information that is not clinically relevant, such as agents that are not transfusion-transmissible. Any findings have implications for donors and need to be managed appropriately.

A revival of cold-stored platelets is another area of interest that could be used to address bacterial risk.

What benefits could artificial blood products offer in the sustainable supply of safe blood?

Artificial blood was one of the biggest biotech failures of the 20th century. It garnered a lot of interest during the early HIV pandemic, and considerable interest remains, but we are still some way from a viable product. Artificial blood encompasses blood substitutes and bloodless medicine, and there are different iterations of blood substitutes. The perfluorocarbons, essentially solvents with an incredible capacity for dissolving oxygen, were licensed in the US in the late ’80s but proved too cumbersome to administer and were hugely complex and expensive; they were taken off the market in the early ’90s. The hemoglobin-based oxygen carriers have had a chequered history, without brilliant outcomes. It is not easy to determine their efficacy objectively, because one would rather give blood than a substitute.

When people start using substitutes, it is often because the situation is so desperate that outcomes are confounded by the severity of the underlying illness; the patients are so sick that it’s questionable whether anything could have helped at that point. However, there are selective uses for hemoglobin-based oxygen carriers, particularly for patients with multiple red blood cell antibodies and/or a rare blood type, for whom compatible blood may not be obtainable in a timely manner. There have been high-profile case reports suggesting that the administration of hemoglobin-based oxygen carriers could have been lifesaving. However, in the context of global health, low- and middle-income countries tend to be the last to benefit from these advancements.

There are concepts like blood pharming, where you can engineer red cells using stem cell substrates. The question is, who’s going to pay $6,000 or $7,000 for a unit of red cells? Just because something can be achieved as a proof of concept or academic exercise doesn’t mean it translates into policy. Frankly, it’s an engineering problem. You need massive capacity to even sustain a fraction of what the current donor pool provides for free.

Where do you feel that research and policy priorities should lie? How much global collaboration do we need in this space?

There are clear targets for intervention, and partnering on capacity building in low-resource settings can help. First, one needs to address the structural challenges in pathology and lab medicine more broadly, as blood banking falls under this umbrella; it cannot be solely about targeting transfusion infrastructure. As an example, I was fortunate to be involved in a project in the country of Georgia, which launched the world’s first national hepatitis C elimination program.

Georgia prioritized everything that could influence the risk of hepatitis C, which included blood transfusion. Consequently, all aspects of blood transfusion infrastructure are being actively addressed as part of the program, overhauling everything from policy to donor recruitment, testing, and beyond. As a result, the prevalence of infectious marker positivity has decreased among blood donors since the inception of the program. In the background, they are tackling hepatitis C in the population, with fewer infected individuals entering the system. To me, that’s the direction to take.

You can’t address the elements in isolation if you genuinely want to make a meaningful impact; otherwise, you merely engage in something superficial. It is crucial to invest heavily in comprehensive solutions. Actions like that are essential. Occasionally, someone feels inspired for various reasons to step in and invest to address a problem substantively.
