The Challenges of Biopharma Scale-up
New strategies may make moving products from lab to plant easier
Biotechnology is an explosive field with a deep pool of researchers focused intensely on discovering the next generation of drugs. But this strength can become a weakness when it's time to move a biopharmaceutical drug from the laboratory to the pilot plant, a change that requires a shift in focus from science to process engineering. In the field of biopharmaceuticals, pilot plant scale-up is in its early days, with scientists not only figuring out how to scale up the material but also figuring out how to figure it out.
Typically, the story of a bioprocess scale-up begins with a new molecule that someone wants to take into the next phase of development, usually toxicology, because that is the first point in the process at which larger quantities are required. By the time a molecule reaches the market, it may have gone through seven or eight stages of scale-up.
The scale-up process serves two purposes. First, it provides the materials needed for clinical studies, and second, it informs the bioprocess engineering team about possible methods for large-scale manufacture. These purposes are sometimes at odds with each other. The intense research focus of the biotechnology industry, especially at small start-up companies, brings with it forward momentum and a desire to move as quickly as possible into clinical studies. At the same time, engineers like to take the time to design a solid, scalable process to move to the manufacturing plant later. The ultimate resolution is quite often in the middle: a hybrid of methods that have already been effective in the laboratory, plus scale-up techniques and equipment for larger quantities. Because it's in the best interests of both teams to make an efficient and successful transition, it's worth paying attention to some of the pitfalls of biological scale-up, as well as some potentially helpful new ideas and technologies coming on line.
In many ways, a biological process scale-up is simpler than one that is performed for a chemical process. The equipment of choice is a fermenter, and, typically, the goal is to produce the molecule (often a protein) in a reliable cell line, such as suspension-adapted Chinese hamster ovary cells grown in serum-free media.
The absence of serum is important, because bovine serum varies quite a bit from lot to lot. In addition to the problems with reproducibility that this causes, it is necessary, for human drug manufacture, to certify that the herd the serum came from is disease free. The Food and Drug Administration has been pressuring the industry to abandon serum altogether, and alternatives now exist that should make bovine serum unnecessary.
One bonus resulting from a shift to serum-free media is that it tends to work better with suspension-adapted cell lines. "Once serum is removed, they become more readily suspension adapted," says Mike Ultee, PhD, senior director of biopharmaceutical development and operations at Laureate Pharma, Inc. (Princeton, N.J.).
The cells used in biological fermentation, and the molecules they produce, are shear sensitive, whether they are proteins, nucleic acids, polysaccharides, or other biomolecules. In a mammalian culture, the product is usually secreted by the cells into the media in which they grow. This necessitates separating the cells from the media containing the target molecule. The traditional tool of choice for this type of application is the disk stack centrifuge. This decades-old tool was originally used in the dairy industry; it has also been used to separate oil from water in ships' bilges.
When it was first used on mammalian cells, the forces exerted by the fluid shear caused the cells to rupture instantly. The cell contents then contaminated the supernatant, creating a more complex separation problem. Low-shear versions of the disk stack centrifuge, introduced in the early 1980s, can be used for bioprocessing of shear-sensitive samples, including mammalian cells.
Strategies for purifying the target molecule after separation depend on the nature of that molecule. Laboratory methods include column chromatography, filtration, and centrifugation. Some of these methods, however, are difficult to carry out at the pilot scale. Chromatography, for example, does not scale up well. For this reason, some scientists are investigating integrated bioprocesses that combine the separation and purification steps. A team of researchers from the Genetic Engineering and Biotechnology Research Institute in Alexandria, Egypt, and from Universität Bielefeld in Germany, recently reported a successful integrated separation and purification process for a recombinant protein from Escherichia coli. (Beshay U, Miksch G, Friehs K, et al. Integrated bioprocess for the production and purification of recombinant proteins by affinity chromatography in Escherichia coli [published online ahead of print May 15, 2008]. Bioprocess Biosyst Eng.)
To create a combined process, the investigators cloned the gene for β-glucanase (from Bacillus) into E. coli, combined with one of two constitutive promoters, along with a His6-tag (an affinity tag for later capture of the protein) and a T7 terminator. They then experimented with incorporating the purification into the fermentation process in a number of ways, including adding the adsorbent material directly to the fermentation, along with various internal and external methods. The most successful integrated process in their study was a clever setup that cycled the media through an external resin receptacle at the beginning of the stationary phase.
CRYSTALLIZATION AND PRECIPITATION
A common area of difficulty occurs in downstream processing, when it is time to remove the molecule from solution, either by precipitation or crystallization. This is a delicate task even in the laboratory, and it does not scale up neatly. "Both of those processes, crystallization or precipitation, are mixing processes," says Steve Kessler, principal of Impact Technology Consultants in Lincoln, Mass. "If someone was carrying that out at the scale of a few drops, or in a test tube, that is not a useful scale for predicting how it would behave at a commercial scale in a mixing vessel."
At this stage, it is useful to apply a bit of pilot-plant zen by scaling the process down so that it can be scaled up. This means using industrial-type mixing vessels on a laboratory bench scale to analyze how the mixing process works so that it can be replicated successfully at the larger scale. Once the process is perfected in small engineering equipment, it can more easily be scaled back up again.
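When the bench process is eventually scaled back up, mixing conditions are typically translated between geometrically similar vessels by holding some quantity constant. A minimal sketch of two common rules of thumb, constant impeller tip speed and constant power per unit volume; the function name and the numbers are illustrative, not taken from the article:

```python
def scale_up_speed(n_small, d_small, d_large, rule="tip_speed"):
    """Impeller speed at the large scale for geometrically similar vessels.

    tip_speed:        hold N*D constant (same maximum shear at the blade tip)
    power_per_volume: hold N^3 * D^2 constant (same specific power input)
    Speeds and diameters may be in any consistent units (e.g. rpm, meters).
    """
    if rule == "tip_speed":
        return n_small * d_small / d_large
    if rule == "power_per_volume":
        return n_small * (d_small / d_large) ** (2 / 3)
    raise ValueError(f"unknown scale-up rule: {rule}")

# bench vessel at 300 rpm with a 5 cm impeller, scaled to a 50 cm impeller
print(scale_up_speed(300, 0.05, 0.50, "tip_speed"))              # 30.0 rpm
print(round(scale_up_speed(300, 0.05, 0.50, "power_per_volume"), 1))  # 64.6 rpm
```

The two rules give very different answers at the larger scale, which is exactly why a process perfected only in a test tube, with no defined mixing geometry, gives the engineer nothing to hold constant.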
But scaling down brings another dimension to bioprocess development. Because there are many variables in a process, it is often helpful to perform numerous small fermentations in order to optimize production of the target molecule. The number of experiments required to isolate all variables can be daunting, however.
Gary Lye, PhD, from University College London, and colleagues from Astex Therapeutics Ltd. (Cambridge, U.K.) and Biogénesis-Bagó (Buenos Aires, Argentina) used scaled-down protein expression in E. coli to develop a generalized method for bioprocess optimization. (Islam RS, Tisi D, Levy MS, et al. Framework for the rapid optimization of soluble protein expression in Escherichia coli combining microscale experiments and statistical experimental design. Biotechnol Prog. 2007;23(4):785-793.)
They screened ten variables in the first phase and six variables at three levels in the optimization phase, which would otherwise have required 3,033 factor combinations. Using a mathematical method called statistical design of experiments (DoE), the researchers reduced the number of necessary experiments to just 80. DoE creates a model that predicts the optimal factor settings for the greatest protein production. Crucial variables included the type of broth, the temperature at various phases in the process, and the speed of shaking during fermentation. They also evaluated the type of microwell plate and found that a plate with large square wells yielded the greatest quantity of protein. This study demonstrated how statistical methods can be used to quickly optimize a process at the microscale.
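The core idea behind this kind of reduction is to run a small, balanced subset of the full factorial design rather than every combination. The sketch below illustrates that idea with a simple Latin-hypercube-style sample over six three-level factors; it is a generic illustration, not the authors' actual design:

```python
import itertools
import random

# six hypothetical factors, each at three coded levels (low / mid / high)
factors = {f"X{i}": [-1, 0, 1] for i in range(1, 7)}

# a full factorial would test every combination: 3^6 = 729 runs
full = list(itertools.product(*factors.values()))
print(len(full))  # 729

def latin_hypercube(factors, n_runs, seed=0):
    """Pick n_runs settings so each level of each factor appears
    roughly equally often, then pair the columns at random."""
    rng = random.Random(seed)
    columns = []
    for levels in factors.values():
        # repeat the level list to cover n_runs, then shuffle the column
        col = (levels * (n_runs // len(levels) + 1))[:n_runs]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))

design = latin_hypercube(factors, 80)
print(len(design))  # 80 runs instead of 729
```

A response-surface model fitted to the 80 measured yields can then predict the optimum over the whole 729-point space, which is the essential trade DoE makes: a few well-spread experiments plus a model in place of exhaustive enumeration.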
In subsequent experiments, Dr. Lye and colleagues tackled the problem of scaling up from the microscale to the laboratory scale. (Islam RS, Tisi D, Levy MS, et al. Scale-up of Escherichia coli growth and recombinant protein expression conditions from microwell to laboratory and pilot scale based on matched kLa. Biotechnol Bioeng. 2008;99(5):1128-1139.) The crucial variable in this transition was the size and shape of the vessel; each of the other variables could be controlled and matched to the microscale system. Vessel size affected mixing parameters, so it was necessary to find a factor to adjust for this effect. That factor turned out to be the oxygen mass transfer coefficient (kLa). By matching the kLa of the 7.5-liter and 75-liter vessels to that of the microwell plate, the kinetics and protein yield of the reaction were made comparable.
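The kLa-matching idea can be sketched with a Van't Riet-type correlation of the form kLa = a·(P/V)^α·(v_s)^β: measure the target kLa at the small scale, then solve for the power input per volume that reproduces it in the larger vessel. The constants and operating points below are illustrative placeholders, not values from the study:

```python
def kla(p_per_v, vs, a=0.026, alpha=0.4, beta=0.5):
    """Van't Riet-style correlation: kLa [1/s] from power per volume
    P/V [W/m^3] and superficial gas velocity vs [m/s]. The constants
    are placeholders; real values are fitted per vessel and medium."""
    return a * p_per_v ** alpha * vs ** beta

# kLa at the small-scale operating point becomes the scale-up target
target = kla(p_per_v=100.0, vs=0.005)

# the larger vessel runs at a lower gas velocity; back out the
# power per volume that restores the same kLa
vs_large = 0.004
p_per_v_large = (target / (0.026 * vs_large ** 0.5)) ** (1 / 0.4)

assert abs(kla(p_per_v_large, vs_large) - target) < 1e-9
```

The point of the exercise mirrors the study's finding: geometry and agitation can differ wildly across scales, but if the oxygen transfer coefficient is held constant, culture kinetics and yield track it.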
The use of microwell plates and microscale bioprocess development is highly amenable to automation, and it's easy to imagine the potential in a biotechnology industrial setting. If it were possible to optimize the production of a biopharmaceutical in a simple automated microwell fermentation and then take that information directly through the scale-up, many of the tensions discussed above would be alleviated, allowing bioprocess engineers to provide clinical quantities of material quickly while at the same time optimizing the process for manufacture. For this ideal to be realized, more technology development, particularly in the area of microfluidics, is needed.
According to Drs. Lye and Micheletti, "Further research is now required on the engineering fundamentals of both microwell and microfluidic systems and the range of bioprocess unit operations that can be performed at the microscale. Until there is progress in these areas, the full potential of automation and its linkage to experimental design and process modeling will not be realized." (Micheletti M, Lye GJ. Microscale bioprocess optimisation. Curr Opin Biotechnol. 2006;17(6): 611-618.)
SYSTEMS BIOLOGY AND BIOPROCESS SCALE-UP
The use of statistical DoE highlights another area of emergent thought and innovation in bioprocess scale-up: systems biology. Traditional bioprocess engineering is based on a direct cause-and-effect model of biological interactions-the classical mechanics of biology. But with the advent of increasing knowledge and database construction in the omics fields, along with powerful computational technologies for analyzing this data, a systems approach to bioprocess scale-up is well within reach. The benefits are myriad and include leveraging computational resources to predict the most successful experimental parameters and building models for processes before trying them in the pilot plant, a sort of abstraction of the concept of scaling down to scale-up.
There are a number of recent examples of bioprocess development using a systems approach. For example, Korean researchers recently reported the development of an integrated bioprocess for bacterial production of succinic acid using microarray-based genomic, transcriptomic, and proteomic studies to optimize production in the host organism, Mannheimia succiniciproducens. (Lee SY, Kim JM, Song H, et al. From genome sequence to integrated bioprocess for succinic acid production by Mannheimia succiniciproducens. Appl Microbiol Biotechnol. 2008;79(1):11-22.)
In the Journal of Biotechnology, Oliveira and colleagues call for a bridging of systems biology and bioprocess engineering. "With the emerging demand for production of increasingly complex molecules, bioprocess engineering is increasingly dependent upon mathematical models to handle tasks like optimization, monitoring, and control of the bioreaction process," they wrote. "The main problem of classical models has always been the lack of predictive capacity, due to the oversimplifications done during the establishment of the link between the underlying biological system and its environment."

Clearly, easing the awkward transition of a biopharmaceutical from the laboratory to the pilot plant and beyond requires drawing on both the knowledge base of classical bioprocess engineering and emerging tools and technologies such as systems biology. While each process is unique, it's apparent that there will never be a one-size-fits-all solution for bioprocess scale-up.