Speech at PharmaProcess, 27 October 2015
Professionals need a forum where disruptive technologies (such as big data, IoT and cloud computing) and the impact they will have on current and future manufacturing and supply-chain strategies can be discussed, in order to improve and maximize industry processes.
What are the challenges in adopting new technologies within regulated environments? How can real-time models be created that allow a process to be verified before its outputs are obtained? Can Continuous Process Validation be considered realistic? How can batch manufacturing be treated as a continuous process rather than a discrete one?
You can download the presentation from this link.
QbD, PAT and other industrial technologies applied to pharma manufacturing through cloud and big data tools: how to optimize pharmaceutical manufacturing processes using the power of data.
Wednesday, October 28, 2015
Wednesday, October 21, 2015
Speech at IL3: Continuous verification and protocols for process monitoring
Big data: The end of the utopia of continuous process validation
Conference in the IL3 (University of Barcelona)
20 October 2015
A process is considered adequately known when:
- All critical sources of variability are identified and explained.
- Variability is managed by the process.
- The quality of the product and its attributes can be predicted correctly within the design space established by the materials, the environment and the other conditions involved in its manufacture.
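As an illustration, the three criteria above can be sketched as a simple design-space check for continuous process verification. The parameter names, ranges and the toy linear model below are hypothetical placeholders, not taken from any real process:

```python
# Minimal sketch of a design-space check. CPP names, ranges and the
# linear model are hypothetical, for illustration only.

# Established design space: acceptable ranges for critical process
# parameters (CPPs) and the critical quality attribute (CQA).
DESIGN_SPACE = {
    "temperature_c": (20.0, 25.0),    # hypothetical CPP range
    "stir_rate_rpm": (200.0, 400.0),  # hypothetical CPP range
}
CQA_SPEC = (95.0, 105.0)  # hypothetical assay range, % of label claim

def predict_cqa(temperature_c, stir_rate_rpm):
    """Toy linear model standing in for a real process model."""
    return 100.0 + 0.8 * (temperature_c - 22.5) - 0.01 * (stir_rate_rpm - 300.0)

def process_adequately_known(readings):
    """A run is acceptable when every CPP stays inside the design
    space and the predicted CQA stays inside its specification."""
    for name, (low, high) in DESIGN_SPACE.items():
        if not low <= readings[name] <= high:
            return False
    cqa = predict_cqa(readings["temperature_c"], readings["stir_rate_rpm"])
    return CQA_SPEC[0] <= cqa <= CQA_SPEC[1]

print(process_adequately_known({"temperature_c": 22.0, "stir_rate_rpm": 310.0}))  # True
```

Once such a model is validated, quality can be predicted from the inputs rather than confirmed only by end-product testing, which is the point of the criteria above.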
Focusing production on knowledge of the processes involved leads to a drastic reduction in the cost and burden of validation, since different options become available to justify and qualify the various production processes.
Big data and regulated industry: the new paradigm
- A cloud environment cannot be compared with traditional systems.
- Access security is managed in a different way.
- The owner of the applications becomes the end user.
- Pricing is by use and service, not by licensing.
- The validation concept should evolve to match this new reality (GAMP X?).
- Process science is not based only on pre-defined parameters (CPPs and CQAs); all inputs and data can participate.
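To illustrate the last point, here is a minimal sketch of monitoring every available signal, not only the pre-defined CPPs, against its historical behaviour. The sensor names, values and the 3-sigma threshold are illustrative assumptions:

```python
# Sketch: score every sensor stream against its history, rather than
# watching only pre-defined CPPs. Data and threshold are illustrative.
import statistics

# Historical readings per sensor (would come from the plant data lake).
history = {
    "temperature_c":  [22.1, 22.4, 22.0, 22.3, 22.2, 21.9],
    "humidity_pct":   [41.0, 40.5, 42.0, 41.2, 40.8, 41.5],
    "vibration_mm_s": [0.30, 0.28, 0.33, 0.31, 0.29, 0.32],
}

def anomalous_signals(current, history, z_limit=3.0):
    """Return every signal whose current value sits more than
    z_limit standard deviations from its historical mean."""
    flagged = []
    for name, values in history.items():
        mean = statistics.mean(values)
        stdev = statistics.stdev(values)
        if stdev and abs(current[name] - mean) / stdev > z_limit:
            flagged.append(name)
    return flagged

current = {"temperature_c": 22.2, "humidity_pct": 41.0, "vibration_mm_s": 0.95}
print(anomalous_signals(current, history))  # ['vibration_mm_s']
```

Here a signal that is not a declared CPP (vibration) still flags a potential problem, which is the argument for letting all inputs and data participate in process science.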
Nowadays, big data is accepted in pharma R&D when computing power is required.
Thursday, April 16, 2015
Quality by Design (QbD) & Paperless Lab Consultancy
Big Data: New data structures for a new concept of regulated manufacturing
Representing ISPE-Spain, Pep and I presented a seminar at the PaperLess Academy event on 14-15 April 2015 in Barcelona.
Seminar content
Big data refers to systems that manage large data sets in the context of information and communication technologies. Nowadays vast amounts of information are continuously generated and stored in many different systems: external and internal hard drives, virtual disks, network storage, pen drives, e-drives, etc. Right now some 1,400,000 GB (1,400 TB) per minute are transferred over the Internet.
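As a sanity check, the quoted figure converts directly (using decimal units, where 1 TB = 1,000 GB):

```python
# Convert the quoted Internet transfer rate from GB per minute
# to TB per minute and GB per second (decimal units).
gb_per_minute = 1_400_000
tb_per_minute = gb_per_minute / 1000        # 1400.0 TB/min
gb_per_second = gb_per_minute / 60          # ~23,333 GB/s
print(tb_per_minute, round(gb_per_second))  # 1400.0 23333
```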
The huge amount of electronic information generated is directly related to the ease with which it can be stored. Without the multitude of storage elements and devices that allow quick access at low cost, it would not make sense to create so much data. This factor, coupled with the high degree of penetration of ICT into society, has favored the avalanche of information produced by the different areas of the reality that surrounds us.
Today many professional areas and activities base a fundamental part of their work on managing large amounts of data. Academic simulation environments (engineering, meteorology, astrophysics, physics of matter, environmental sciences), scientific institutions devoted to healthcare (genomics, biomedical research, epidemiology) and many non-scientific sectors (police investigation, social networks, smart cities, financial systems, marketing) need to mine gigabytes of information in a short time to extract knowledge in real time. The concept of big data is inherent in all these scenarios. New sectors requiring tools to analyse large numbers of bytes per second appear all the time.
What is the challenge for the pharmaceutical industry? How should regulated industries adapt their data structures to this new concept of data?