Biopharma cannot fix what it cannot see, according to cloud-based process optimization developer Bigfinite, which wants the industry to make better use of analytical technology to understand its processes. Data is core to the Industry 4.0 idea: in theory, the ability to pass information from one operation to the next will allow firms to optimize biopharmaceutical manufacturing processes, ultimately saving them time and money.

Yet industry has been slow to adopt a data-centric approach, says Pep Gubau, CEO of Bigfinite, who suggests greater investment in technology and know-how is needed.

“The industry needs to begin adopting new technologies that help them to understand, measure, and control the variations of their processes,” he notes. “The amount of data and analytics that pharma must apply in order to achieve Pharma 4.0 cannot be accomplished with the old technologies. During the implementation of this digital transformation, they will have to write the new good practices with the support of their specialized partners.”

Bigfinite’s business is based on a software as a service (SaaS) platform that uses IoT, big data, AI, and cloud technologies to help biopharma optimize manufacturing processes. Real-time analysis is key, Gubau says, citing corrective action and preventive action (CAPA) as an area of potential application.

“You can’t fix what you can’t measure,” he explains. “Unifying data enables a holistic understanding of how processes are running and what is causing variations at any given time. Being able to determine the main cause of abnormal events, provide immediate feedback, resolve problems, and adjust operations can save tremendous time and effort in clearing CAPAs.”

He adds, “Once root causes can be determined, that can help with predicting when a deviation might occur again, allowing for proactive preventive measures. Of course, all of this should be done without sacrificing GxP compliance or risking data integrity issues.”
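As an illustration of the kind of real-time variation detection Gubau describes, a minimal sketch might flag out-of-control sensor readings against an in-control baseline using a simple three-sigma control-chart rule over unified batch data. This is not Bigfinite's actual method; the data, names, and thresholds below are hypothetical.

```python
# Minimal sketch: flag process deviations with a 3-sigma control-chart rule.
# Hypothetical data and thresholds -- not Bigfinite's actual implementation.
from statistics import mean, stdev

def find_deviations(readings, baseline):
    """Return indices of readings outside +/-3 sigma of the baseline run."""
    mu, sigma = mean(baseline), stdev(baseline)
    lower, upper = mu - 3 * sigma, mu + 3 * sigma
    return [i for i, x in enumerate(readings) if not (lower <= x <= upper)]

# Baseline: in-control bioreactor temperature readings (degrees C).
baseline = [37.0, 37.1, 36.9, 37.0, 37.2, 36.8, 37.1, 36.9]
# Current batch: one reading drifts well outside the control limits.
current = [37.0, 37.1, 39.5, 37.0]

print(find_deviations(current, baseline))  # index 2 is out of control
```

Flagged indices could then feed a CAPA workflow, and the same baseline statistics could trigger alerts before a drifting parameter breaches its control limits.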

Full integration is critical to successful digital biomanufacturing. It is also a potential challenge for companies interested in the approach, according to Gubau, who says preventing the development of data “silos” is vital. “Data segmented in disparate systems delays responses to non-conformance events, slows time-to-market, and lengthens batch release times,” he points out.

Fortunately, for biopharmaceutical companies interested in data-centric manufacturing, help is available. “Regulators are open and delighted to support this transition because they know ultimately it’s the patients that will benefit,” according to Gubau, who cites the FDA’s Emerging Technology Team (ETT)—which aims to further regulatory and industry understanding of manufacturing innovations—as evidence of support for greater use of data.

“Regulators are requesting deeper knowledge about processes (critical process parameters, CPPs) and products (critical quality attributes, CQAs) in order to have more quality control over the final drug,” Gubau says. “The more robust the process, the better the quality control in the final products. Quality, safety, and efficacy are the main attributes that matter in drugs, and all of them are well controlled in efficient processes. If pharma can find ways to be more efficient in their manufacturing processes, this will increase speed and batch yields, assure compliance, and get more drugs to patients who need them.”

Gubau also points to various collaborative efforts as a reason for optimism, arguing that biopharma has a culture of sharing non-competitive information.

“Compared to other industries, in our experience the biopharma industry is accustomed to frequently sharing details of best practices at various industry events and through industry associations such as BioPhorum, the PDA (including its CPV of the Future initiative), and Xavier Health at Xavier University, which is doing great work with its AI Core Team initiative,” he says.
