Session Topics of the Recovery XVIII Conference
Poster session: Scaling New Heights in Bioprocessing
Chairs: Abraham Lenhoff, Michelle Butler, and Sanchayita Ghose
Innovative and diverse bioprocessing methods and concepts are vital to develop processes and manufacture medicines now and into the future. Contemporary technologies and platforms that can support a complex and ever-evolving manufacturing environment are important to bring forward for review and integration of ideas. We invite submissions that span the spectrum of bioprocessing topics: comprehensive fundamental research or implementation case studies incorporating experimental and/or theoretical/computational elements, or moonshot concepts, even at the alpha stage, that can revolutionize how we work, design and control processes. The poster sessions continue to be a critical focus for technical discussions at the Recovery conferences, and insightful and creative submissions can maintain this as a vigorous component of the program. In addition to abstracts submitted directly to the poster session, including on topics not covered in the oral sessions, all other abstracts not selected for oral presentations will also be considered for inclusion in the poster program.
Session: Highland Games - in silico Prediction of mAb Purification and Developability
Chairs: Bruno Marques and Richard Willson
Activity: What are the best in silico methods in process development? How far can we develop a mAb manufacturing process with just sequence data? Join a team to predict mAb characteristics (viscosity, aggregation, ProA and IEX elution, etc.) based only on the protein sequences. Experimental data for the games are generated by volunteers, who each measure the common set of developability characteristics of antibodies they possess (expected to be mainly from terminated mAb candidates or biosimilars).
Join what is expected to be the largest head-to-head in silico mAb assessment in history.
Benefits: Build a benchmarking data set to improve industry-wide capabilities in this pre-competitive space; share best practices, to the extent teams are willing to disclose how they achieved their results; enjoy a fun and useful networking opportunity across industry and academia. No material is exchanged between labs.
Roles: Competitors make predictions; Volunteers perform a specific set of experiments on inactive mAbs (no longer in development) in their own labs. There can be overlap between the two groups. No group will have access to the data for all the mAbs in the competition.
- January: Team formation, recruiting Volunteers with mAbs and analytical capacity
- February: Populate teams, agree on analytical methods and mAbs to be used
- March: DNA sequences provided
- March - August: mAb generation and characterization
- September: Compare predictions to actual data
- October: Team Presentations at Recovery XVIII
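To make the sequence-only starting point concrete, here is a minimal, illustrative sketch (not part of the competition rules, and the fragment shown is made up) of the simplest kind of descriptor a team might compute from sequence alone: the Kyte-Doolittle grand average of hydropathy (GRAVY), a crude single-number indicator of hydrophobicity.

```python
# Illustrative sequence-only descriptor: Kyte-Doolittle GRAVY score.
# The sequence used below is a toy fragment, not a real competition mAb.

KYTE_DOOLITTLE = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
    "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
    "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
    "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2,
}

def gravy(sequence: str) -> float:
    """Grand average of hydropathy: mean Kyte-Doolittle value per residue."""
    scores = [KYTE_DOOLITTLE[aa] for aa in sequence.upper()]
    return sum(scores) / len(scores)

if __name__ == "__main__":
    fragment = "GAVLI"  # hypothetical toy fragment
    print(f"GRAVY({fragment}) = {gravy(fragment):.2f}")
```

Serious entries would of course combine many such descriptors with structural models or machine-learned predictors; this only illustrates what "working from sequence data alone" means at its most basic.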
Oral session 1: Revolutionary Products and Process Development
Chairs: Brian Kelley and Nina Bauer
The biopharmaceutical industry has delivered considerable therapeutic benefits, yet there remain no treatment options for the vast majority of the more than 7,000 known disorders, while approved biologic therapeutics are currently available to only a small fraction of society. Fortunately, there is growing potential to change this situation. Modular scalable platforms such as liposomes and virus-like particles offer the potential to greatly expand our ability to generate safe vaccines more rapidly and at a low cost. Engineered dendritic and CAR-T cells, as well as gene editing and next-generation gene therapy technologies, are providing powerful new treatment options for many complex disorders. With the advent of these breakthroughs, we must consider the future state of bioprocessing, for both established and emerging therapies.
- How will recovery science respond to this opportunity?
- The impact of process design decisions could be critical. In addition to potentially controlling cost, quality and development speed, there may be cases where novel technology would be uniquely enabling. What are they?
- Which products may benefit significantly from integration of the downstream process with upstream or drug-product unit operations?
- Which may require decentralized manufacturing?
Oral session 2: Bioprocessing of New mAb Modalities
Chairs: Nihal Tugcu and Marcel Ottens
While monoclonal antibodies (mAbs) remain the dominant class of biologic therapeutics, their molecular diversity is rapidly increasing to include a range of novel and hybrid constructs. Complex mAb-like molecules with dual/triple targeting activity, nanobodies, conjugated mAbs, ADCs (antibody drug conjugates) and other non-natural modalities are being developed to treat a wide range of disorders. Along with this broadening portfolio, alternatives to the classical CHO expression system are emerging and have proven fit for expression of some of these novel constructs. In this session, we invite papers that explore the scientific and business decisions that lead to the development of novel mAb-like structures, as well as the impact of these new modalities on bioprocessing and bioprocess development timelines.
Questions of relevance to this session include:
- What are the scientific drivers behind the choice of a new modality?
- Are platform processes fit to adapt to these new modalities, or do we need completely new process concepts, expression systems and development approaches?
- What new technologies are required to support this effort?
- How must we interface with the QbD paradigm to assure success?
Oral session 3: Emerging Biological Classes and the Multi-Scale, Multi-Facility Problem
Chairs: Anne Kantardjieff and Kent Goklen
Manufacturing within biopharma companies is increasingly challenged by an expanding portfolio of biological therapeutic products that creates a need to effectively allocate production and resources across several multi-product facilities, including CMOs. This problem is intensified by the emergence of preventive and curative therapies whose manufacture cannot be achieved through straightforward adaptation of existing platform processes. These include the more than 600 cell- and gene-based therapies currently in clinical trials, new modalities of vaccines poised to provide immunization to a greatly expanded number of patients, particularly in less developed areas of the world, and a variety of novel approaches that serve to improve oral availability of biologics, activity against intracellular targets, or stimulation of a desired immune response.
While they are poised to transform healthcare, emerging biotherapeutic classes pose numerous manufacturing and capacity-planning challenges. This session aims to address the question of how biopharmaceutical manufacturers can make process design and capacity-planning decisions in the face of significant uncertainty in dosage requirements and market acceptance of the product. We invite papers describing case studies, as well as more fundamental approaches, in planning manufacturing capacity in a manner that balances the risks of supply management with the optimization of capital utilization. Papers addressing how manufacturing is changing to enable cost-effective production of these modalities are particularly welcome, as are those addressing the benefits and challenges of implementing new unit operations required to lower manufacturing costs for these emerging therapies.
Questions to be addressed in this session might include:
- How do we balance the risks of providing adequate drug supply against the need for efficient capital utilization for novel therapies made by processes requiring new facilities?
- How do we factor in the uncertainty of dose requirements and market acceptance in planning such facilities?
- How do we deal with uncertainties in planning for the production of a portfolio of products?
- What is limiting in the production of patient-specific treatments, and what technologies can lower the barrier to implementation?
- What are the benefits and challenges of developing platform processes for individualized therapies?
- What are the strategies to define, measure, and control process- and product-related impurities?
- What are effective strategies for reducing costs in scaled-out systems?
Oral session 4: Integrated and Continuous Bioprocessing
Chairs: Suzanne Farid and Konstantin Konstantinov
With recent advancements, end-to-end continuous bioprocess facilities are coming closer to reality. Yet significant effort may still be required to realize implementation for clinical and commercial manufacturing in terms of comprehensive analytics for monitoring and control, novel technologies to close any gaps, and establishing the business case. This session will debate the scenarios where continuous platforms will better serve our future needs and how progress with data-mining and analytics might enable predictive process control and ultimately real-time release with continuous processes. We invite talks that present business cases for integrated and continuous bioprocessing, as well as case studies for the implementation of continuous platforms for clinical products with details of scale-down mimics, PAT, validation, chemometrics, modeling and control strategies.
For this session, we encourage contributions that address the following themes:
- Developing a business case for integrated and continuous bioprocesses
- In view of the track record of batch processes, is there any need for continuous bioprocesses?
- Besides labile molecules, what new modalities best lend themselves to continuous facilities?
- Does continuous bioprocessing have a role to play with enabling precision medicine?
- What is needed for these integrated and continuous bioprocesses to “beat” current batch processes?
- Can continuous bioprocessing increase speed to clinic and speed to market?
- What strategies can be used to overcome the hurdles to implementing continuous bioprocessing?
- What PAT and control strategies are required for successful operation of continuous facilities for commercial manufacture?
- How might one fill gaps with scale-down mimics and analytics?
- How can we use multivariate data analysis (MVDA/data-mining) tools and mechanistic models to enhance process understanding of the linkage between continuous upstream and downstream steps?
- How can QbD tools be used to enable real-time release with continuous processes? What time-frame is required to reach this vision?
- Are there alternatives to the current approach of 3 PPQ/PV batches for continuous bioprocesses?
Oral session 5: Quantifying Manufacturability
Chairs: Ranga Godavarti and Juergen Hubbuch
Picking the ‘right molecule’ to bring to the clinic often requires a holistic approach towards understanding not only a molecule’s pharmacology, immunogenicity and toxicology but also its biopharmaceutical properties. Understanding a molecule’s structural properties can enable an assessment of its ‘manufacturability’ or ‘developability’, and facilitate development of molecules with improved properties such as reduced viscosity or aggregation, or resistance to unwanted chemical modifications such as deamidation, clipping or oxidation. Robust strategies for assessing manufacturability that are based in part on fundamental understanding of molecular structure/function relationships can assure efficient design of manufacturing processes. In the era of ‘mAb platform processes’ such assessments have proven critical to proper evaluation of fit to platform and to achieving ever-shortening development timelines.
Manufacturability assessments not only rely on a correlation between analytical data and process performance, but also on underlying knowledge of molecular properties and model-based characterization of operations. In this session, we welcome talks on modelling of both the biomolecule and the envisaged bioprocess. Topics of interest include strategies to identify and potentially eliminate hot spots in sequences, understanding of sequence/structural features leading to aggregation and/or high viscosity in protein solutions and related mitigation strategies.
We are specifically interested in case studies that can address the following questions:
- How can we use molecular understanding to better predict process performance?
- What interdependencies do we find between biomolecule design and structural properties?
- What can we learn from in silico analysis, and how can modeling be used to improve protein properties such as manufacturability and stability?
- Will there ever be experimental methodologies, coupled with deep theoretical underpinnings, that accurately predict long-term stability from short-term experiments? Could expiry even be set based on such predictions?
- How far can we design a biomolecule to fit the process and mitigate aggregation or immunogenicity?
- Can we integrate HTS platforms and advanced modeling into the screening effort to improve process development timelines and/or product quality?
- Manufacturability assessments have classically been applied to mAbs; what about Fc-fusion proteins? Bispecifics? Other modalities?
Oral session 6: Big Data and the Fully Realized QbD Process – Case Studies
Chairs: Jayme Franklin and Karol Lacki
Drug development is undergoing a business transformation that comes with the expectation that future success will be reliant on flexible manufacturing, integrated real-time quality assurance, and the ability to exploit big science, artificial intelligence (AI) and information technology (IT) to more effectively respond to new practices and processes with speed and agility. This session will explore how ‘omics and related big science, as well as bioinformatics and IT advances enabled by ultra-high-speed computing, will alter the practice of downstream bioprocessing.
We welcome both fundamental investigation and case studies that address the issues defined above as well as the following questions:
- How might PAT employ advanced measurement and monitoring techniques (e.g., vibrational spectroscopy methods) in concert with real-time data analysis to support more robust process control?
- How might old and emerging PAT technologies support big science and AI to drive process understanding to a heightened level?
- What are the approaches and challenges to implementing data management systems, either commercial or developed in-house, for knowledge management in support of end-to-end product life cycle?
Oral session 7: The Need for Speed: Automated Bioprocess Design and Validation
Chairs: Jennifer Pollard and Benjamin Tran
Fewer proven therapeutic targets, increasingly complex molecular formats to address these targets, and added competition have intensified the race to get to market. As we rev up the bioprocess paradigm, using automation merely as a simple screening tool to increase productivity is no longer sufficient. The industry continues to apply more sophisticated automation and computational methods and workflows to traditional bioprocess development in order to more rapidly design and validate commercializable processes. High-throughput technologies, computational and mechanistic modeling, and machine learning have proven their ability to solve complex problems. Examples abound in the literature of how each of these methods alone can increase productivity and generate data with minimal resources and material. However, key to meeting the challenges of increasingly complex bioprocess design on warp-speed timelines will be realizing the full potential of integrating these techniques and applying artificial intelligence to interpret the big datasets these tools generate. As we transition from using robots to perform experiments to having robots propose and execute new experiments to design and validate our processes, we must retain the ability to make scientific sense of the solutions our tools devise and to communicate this effectively to regulatory agencies. This session seeks contributions from both industry and academia on the applications of these automation and computational tools and workflows to bioprocess design and validation.
Key questions to be addressed in this session include:
- What are the limits of high-throughput bioprocess development and what are the factors that define that limit? Is comprehensive in silico bioprocess design possible?
- Can high-throughput bioprocess design provide primary process data in addition to supplementary data? Can high-throughput concepts be applied in the manufacturing space?
- What new tools and theoretical/modeling advances are required to account for differences in scale? For deviations and variabilities? For troubleshooting?
Oral session 8: Disruptive Downstream Processing Tools and Technologies (Case Studies)
Chairs: Nooshafarin Sanaie and Jorg Thommes
A cursory look at what one might describe as a platform process to purify a secreted recombinant protein, say a monoclonal antibody, could create the impression that nothing has changed over the past 30 years. The field of purification process development, however, has seen its fair share of disruptive approaches. High-throughput development tools have enabled generation of large amounts of data in a short time, enabling statistical design-of-experiment approaches early in process development as well as thorough interrogation of the design space in preparation for process licensure. We have also seen substantial improvements in fundamental understanding of protein-protein and protein-surface interactions. This has allowed much more sophisticated and theoretically supported choices of separation conditions. Finally, miniaturization of assays and tighter integration of process development into discovery research has introduced the concept of “developability”, i.e., supporting candidate selection with information on how to transform a candidate into an actual physical product. In this session, we will explore the next generation of disruptive tools in downstream processing, considering the entire spectrum from early phase process development to later process characterization, technology transfer, and regulatory filing.
We will ask:
- Can mechanistic modeling further disrupt the course of process development?
- Will our theoretical understanding of proteins and their interaction with separation media further evolve, such that at some point we can rely on in silico process design for early phase manufacturing?
- Will there be new experimental methodologies allowing more predictive development with fewer experiments rather than generating more and more data with possibly little additional information gain?
- Are analytical methods in place comprehensive and sensitive enough for rapid and robust process development as new modalities and impurities come into play?
- How about lean validation?
- Can late stage characterization and validation efforts get further streamlined to reduce the need for elaborate experimental work?
- How will the smart manufacturing principles of Bioprocessing 3.0, the conference theme, impact our industry, particularly process development?
Oral session 9: Lightning Round: Next Generation Downstream Process Development
Chairs: Hanne Bak and Glen Bolton
Bioprocessing 3.0, the conference theme, embodies the need to transform manufacturing from an enabler to a true value driver. This session will highlight disruptive innovations in downstream processing that serve that goal. Papers describing innovations that showcase relevant and viable ways of carrying out substantially faster, smarter, more economical, or more robust downstream processing are invited. These may describe processing technologies or methods from a broad range of traditional and new biological therapeutic modalities.
Talks should touch on the technical, operational, economic, business and supply chain advantages and disadvantages that their new process-development method or technique offers in one or more of the following areas:
- Footprint, portability, product and personnel flow, labor/staffing, automation, dynamic real-time statistical process control (SPC)
- Process Analytical Technology (PAT), real-time non-conformance or process or equipment fault detection/recovery
- Intensification enablers, including for ultra-high concentration processes, blue-sky techniques, and tools enabling in silico process development, process automation, and improved viral clearance, process robustness, and product stability and other quality attributes
Oral session 10: Evolution of Disruptive Membrane and Chromatography Processes via First Principles (Application and Theory)
Chairs: Massimo Morbidelli and Andrew Zydney
Existing recovery and purification technologies are under siege from the clinical success of new therapeutic modalities (e.g., fusion proteins, bispecifics, conjugates, nucleic acids, and derivatized nanoparticles) along with interest in lower-cost, higher-productivity manufacturing processes.
Empirical and opportunistic approaches to new unit operations may have reached a practical limit. Hence, the solution of future downstream challenges necessitates a renaissance in the application of first principles of separation science combined with advances in manufacturing technologies. This session will examine novel approaches and the theoretical framework required for the development of new membrane and chromatographic processes specifically targeted to bridge the chasm between existing capabilities and emerging challenges. Presentations examining new theoretical approaches to develop innovative membrane and chromatographic separations are also of interest.
The session will explore the following topics in detail:
- Can 3D printing enable the development of novel membranes/chromatographic media/modules with improved flow distribution or ligand distribution?
- Can 3D printing combined with other disruptive technologies support the development of designer surfaces with tailor-made selectivity (analogous to molecular machines)?
- What theoretical gaps still exist in the development of in silico models for more accurate simulation (e.g. allosteric interactions), monitoring, and/or design of separation processes?
- What novel approaches are required for application of membrane and chromatographic systems in continuous processing? What limitations need to be overcome with existing technologies and what are the theoretical limits (e.g. mass transfer effects, ligand density saturation)?
- What unit operations will be required to purify high titer products (e.g. 30 g/L bioreactor titer) and to support primary recovery of high cell density cell culture?
Oral session 11: Next-Generation Bioprocessing Case Studies – Smart Integrated Processing Facilities
Chairs: Raquel Orozco and Jeffrey Salm
As the biologics industry continues to mature and the market for products becomes more competitive, companies are looking for ways to reduce costs and increase flexibility. Many have taken a renewed interest in intensified, integrated and fully automated, or closed processing approaches as potentially enabling manufacturing strategies. Continuous, intensified, or integrated downstream processing presents a unique challenge, as many of the standard approaches to purification are traditionally designed as discrete unit operations. Recent improvements to chromatography and filtration technologies such as PCC/SMB and SPTFF have made implementation of next-generation technologies easier. However, these improvements have only resulted in evolutionary changes to existing approaches.
So what will success look like in the future? One barrier to implementation of a revolutionary next-generation process is the lack of a clear vision of success, which limits investment by companies to develop and realize transformational manufacturing approaches. In silico tools exist that allow us to model potential strategies and show the creation of value, existing gaps and the associated risk. In this session, we hope to look beyond traditional purification approaches and consider what a future next-generation process might look like. Areas of focus could include how linkages between steps are created and the resulting benefits, end-to-end approaches that look past individual unit operations, and strategies toward making such processes commercial, including control strategies and testing. Finally, we challenge submitters to present an assessment, potentially using in silico modeling, that addresses the limitations of current next-generation processes.
Additional questions to consider include:
- Can you quantify what the specific facility/process design enables?
- What is the viral clearance strategy for emerging unit operations? Have they received feedback from FDA’s Emerging Technology Team? If so, what feedback was received?
- What are the considerations for release testing? Are there any examples for real time release? What does it enable?
- What are the key lessons learned from failures?
Oral session 12: Process of Things
Chairs: Gisela Ferreira and Bernt Nilsson
Bioprocessing 3.0 is about connecting tasks, services and people into a manufacturing system or system-of-systems that enables rapid decisions, flexible production and robust operation. A vital part in this vision is the automation and on-line control schemes that connect the processing units and machines into an autonomous manufacturing system. Feedback and interpretation of on-line measurements for disturbance rejection and on-line analytics for near real-time release are critical to the reliability and safety of highly automated processes. The fast human interactions and decision making needed for flexible production and reconfiguration within such systems may also require modularization of autonomous elements and control schemes.
This session will focus on advances in automation and on-line control and how those improvements can enable either integrated processing steps or complete integrated downstream processes. Issues and questions surrounding the development and implementation of these advanced systems will be explored, including:
- How can advanced control strategies be integrated into Good Manufacturing Practice?
- Are there enabling analytics and device technologies that will enhance on-line process control or product quality assessment?
- How can mechanistic modelling support process control and manufacturing?
- How might one develop on-line control laws? Is it possible to automate process development and validation of the region of operation?