
Are Changes in Regimes and Regulations Good for Formulation Technology?

By David Calvert, January 2017

The political turmoil of last year, including Brexit and the election of Donald Trump as the 45th President of the United States, prompted a great deal of speculation about the impact of these dramatic changes and, of course, provided satire shows with material for at least the next four years.

As 2017 begins, though, we can start to see some of the realities of these decisions, and this made me wonder how they could affect the work of formulators. The regulatory landscape is a key driver of many new formulations, whether that means lowering volatile organic compound (VOC) emissions, restricting the use of certain chemicals under the REACH regulation or developing formulations with a lower carbon footprint.

The latter is not necessarily driven by specific regulations but is more a response to initiatives such as the various climate change agreements, including Kyoto and Paris. In some cases the move towards lowering carbon footprint, in its various guises, is driven by a desire to demonstrate Corporate Social Responsibility (CSR). Some of the major FMCG companies, such as RB, Procter & Gamble and Unilever, have placed sustainability high on their agendas.

In their 2015 Sustainability Report, RB set a target to reduce the CO2 footprint of their products by one third by 2020 and to cut the CO2 emissions from their manufacturing by 40% over the same period.

P&G have a similar commitment and, in their 2015 sustainability objectives, state that they are focusing their efforts in the following areas:

“(1) Reducing the intensity of greenhouse gas emissions (GHG) from our own operations through:

  • Driving energy efficiency measures throughout our facilities
  • Transitioning fuel sources toward cleaner alternatives
  • Driving more energy-efficient modes of transporting finished products to our customers

(2) Helping consumers to reduce their own GHG emissions through the use of our products via:

  • Product and packaging innovations that enable more efficient consumer product use and energy consumption
  • Consumer education to reduce GHG emissions such as the benefits of using cold water for machine washing

(3) Partnering with external stakeholders to reduce greenhouse gas emissions in our supply chain, including:

  • Ensuring our sourcing of renewable commodities does not contribute to deforestation
  • Developing renewable material replacements for petroleum derived raw materials”

Unilever launched their Sustainable Living Plan in 2010 as their blueprint for sustainable business and have stated that they wish to decouple their growth from their environmental impact. As part of this, they are aiming to reduce their environmental impact by 50% by 2030, and this includes a target to be carbon positive in manufacturing by 2030 by saving 1 million tonnes of CO2.

So how will all of this be changed by the potential appointment of Scott Pruitt as the new head of the Environmental Protection Agency (EPA) in the USA? For those of you who are not aware of Scott Pruitt, he has served as Oklahoma’s attorney general since 2011 and is presently representing the state in a lawsuit against the EPA to halt the Clean Power Plan. Pruitt is a well-known climate change sceptic who has cast doubt on the evidence that human activity is causing the planet to warm. At the time of writing, the confirmation hearing for Scott Pruitt has just taken place and, barring some startling revelation, it is likely he will become the new EPA head. We will have to see whether he does introduce measures to reduce restrictions on fossil fuels in the US and whether some of the companies change their approach to sustainability. My own view is that the approach of the companies mentioned will not change, but perhaps the pace of change will slow somewhat and the focus will shift outside the US.

If the EPA then introduces measures which “relax” regulation of other emissions, such as those to water, will this mean that agrochemical formulations will be easier to develop? Is this good news for formulators? A new approach to some of the pressure groups who cast doubt on any scientific findings may well be refreshing if the pressure to remove some agrochemical actives from the market is reduced, but we will have to wait and see. In a similar vein, the former Chief Executive of the Crop Protection Association in the UK stated in a letter to the Financial Times in December 2016 that he hoped the UK would “lead the way in striking a sensible balance between protecting and enhancing the environment and at the same time, supporting UK farmers to provide a healthy, safe, reliable and affordable food supply”.

Staying with Brexit, I feel that change will be slower and any impact is unlikely to be seen until at least two years after Article 50 is triggered. There remains the possibility that the UK’s exit could lead to yet more regulations for formulators to comply with, although it is hoped that common sense will prevail and the majority of the European regulations will simply be “ported over”. Putting the case for the UK to change the regulatory landscape, the President of the British Crop Protection Council argued this month for the UK, post-Brexit, to move closer to the US EPA’s risk-based approach. He argued for the removal of the “EU’s unscientific hazard based assessments and the associated Candidates for Substitution and Comparative assessment processes”.

Whether the UK can afford to implement its own new regulatory approach is another question, and the whole issue may be decided more by politics than by practical considerations.

I guess my conclusion from compiling this article is that formulators, like everyone else, will be affected by the political turmoil of 2016, but how, and whether the effect is positive, cannot yet be determined. All we can do, as they say, is “wait and see” and be quick to adapt once clarity arrives, if it ever does!

Posted by iformula

Sustainable Reformulation using Hansen Solubility Parameters

With Daniel Schmidt of the University of Massachusetts Lowell

Formulators in all industries are increasingly being charged with finding new ways to reduce the environmental impact of formulated products as well as to eliminate hazardous components. The formulator is faced with a variety of conflicting factors when carrying out such reformulation, so the ability to use practical tools with a sound scientific basis can be decisive.

Among the talks to be given at the upcoming HSP50 conference (York, UK, 5-7 April 2017), Prof. Daniel F. Schmidt of the Department of Plastics Engineering at the University of Massachusetts Lowell (UML) will present an overview of work being carried out by multiple researchers at his institution involving the use of Hansen Solubility Parameters (HSPs) and provide examples of a number of successes as well as opportunities for further development.

Most generally, the efforts at UML and the affiliated Massachusetts Toxics Use Reduction Institute (TURI) fall into two broad categories: searching for alternatives to existing formulations and predicting the compositions of new formulations. In the former case, many efforts have been driven by a desire to identify greener, more sustainable solutions to existing problems and to address increasing regulatory pressure on volatile organic compounds (VOCs) and hazardous air pollutants (HAPs). In the latter case, the motivation is to save time, effort, and materials and reduce the number of experiments needed to identify new, useful formulations.

Daniel Schmidt takes up the story:

“One example of work to identify alternative formulations is the replacement of methylene chloride in a gel-based paint stripper formulation. Due to increasing regulatory pressure, such formulations are less and less acceptable, but existing alternatives have major performance issues compared with methylene chloride-based strippers. The use of the HSP approach made it possible to identify a blend providing excellent performance while addressing regulatory concerns. A similar story will be presented concerning the replacement of aromatic hydrocarbon solvents in a contact adhesive formulation. Concerns over HAPs drove our industrial partners to search for alternatives; the HSP approach, coupled with optimization software enabling the evaluation of more complex blends and the use of cost as a selection parameter, yielded formulations meeting environmental and cost targets. In both cases, our research team found that the existing formulations in use by industry were themselves suboptimal. This result emphasizes that even in those cases where solvent replacement is not necessary, opportunities exist to enhance performance via a thorough assessment of materials already in use.

In addition to solvent replacement, researchers at UML have also used the HSP approach in service of a project concerning styrene replacement in vinyl ester resins, with concerns over VOCs and associated regulatory pressure once more driving the work. Also related to concerns over sustainability, HSP calculations were used to identify greener solvents for two stubbornly insoluble polymers: poly(3-hexylthiophene) (P3HT), an active material in organic optoelectronics, and poly(butylene succinate) (PBS), a biodegradable replacement for materials like low-density polyethylene. In the latter case, it was further shown that copolymerization broadens the range of solubility for a family of PBS-based renewable co-polyesters. Attempts to identify bio-derived solvents for polystyrene foam recycling have been made with support from the HSP model, and TURI has compiled a large database of safer molecules that is now included with the Hansen Solubility Parameters in Practice software package to further their goal of seeing toxic substances phased out in favor of more sustainable and practical alternatives. TURI’s Cleaning Laboratory is making use of this approach as well, given that demand for this type of support is only increasing over time. One recently identified problem in which the HSP approach promises to help, for example, is guiding formulators towards a replacement for methanol in windshield washer fluid.

In addition to the aforementioned efforts to identify alternative formulations, several attempts at compatibility prediction in new compositions of matter have been carried out. In one example, predictions were made concerning the compatibility of a new class of small-molecule biofilm inhibitors with various grades of plastics used in medical device applications, as well as with solvents used to process these small molecules in this context. The ability to perform such predictions was critical to the success of the work, given that the small molecules were custom-synthesized and available only in very small quantities. In a second project, the HSP approach was used to guide solvent selection for a series of never-before-studied high-impact co-polyesters. While the level of success here was more limited, this effort nevertheless highlights one of the opportunities for improving the predictive power of this approach. Related to such efforts, ongoing discussions involving how best to define HSP values for heterogeneous entities such as surface-modified clay nanoparticles promise to yield results of significance to both science and industry moving forward.

In sum, a broad range of exciting work has been and is being enabled thanks to the use of the HSP approach by researchers at UML and TURI. These efforts range from improving the processing of novel materials in support of fundamental research concerning their behavior to the solution of pressing industrial problems involving solvent replacement, VOC and HAP reduction, and improvements in safety and sustainability more generally. While challenges remain and not all efforts result in success, these efforts demonstrate that there are clear opportunities to utilize the HSP model to make progress on problems of real importance to researchers in academia and industry, and to further extend this approach so that today’s difficulties become tomorrow’s successes.”
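For formulators who like to see the arithmetic behind this kind of solvent screening, the sketch below ranks candidate solvents against a solute’s Hansen sphere using the distance Ra and the relative energy difference (RED = Ra/R0). The polymer parameters, interaction radius and candidate solvents are illustrative placeholders, not values from the UML work.

```python
# A minimal sketch of HSP-based solvent screening.
# The polymer HSP values, interaction radius and candidate solvents below
# are illustrative placeholders, not measured data from the talk.
from math import sqrt

def hsp_distance(a, b):
    """Hansen distance Ra between two (dD, dP, dH) triplets, in MPa^0.5."""
    dD1, dP1, dH1 = a
    dD2, dP2, dH2 = b
    return sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

# Hypothetical polymer sphere: centre (dD, dP, dH) and interaction radius R0
polymer_hsp = (18.5, 6.0, 7.0)
polymer_R0 = 8.0

# Candidate replacement solvents with illustrative HSP values
candidates = {
    "ethyl lactate":      (16.0, 7.6, 12.5),
    "d-limonene":         (17.2, 1.8, 4.3),
    "dimethyl carbonate": (15.5, 3.9, 9.7),
}

for name, hsp in candidates.items():
    ra = hsp_distance(polymer_hsp, hsp)
    red = ra / polymer_R0          # RED < 1 suggests a likely good solvent
    verdict = "inside sphere" if red < 1 else "outside sphere"
    print(f"{name:20s} Ra = {ra:5.2f}  RED = {red:4.2f}  ({verdict})")
```

In practice the same ranking can be applied to blends and extended with cost or hazard scores, which is essentially what the optimization software mentioned above does at scale.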

We hope that this item has given you an insight into how the HSP approach can be used to help reformulate industrial products and we look forward to seeing you in York in April 2017.

Posted by iformula

Understanding the stability, behaviour and surface properties of nanoparticles and quantum dots using Hansen Solubility Parameters

With Dietmar Lerche of LUM GmbH

Devices based on quantum dots and smart nanoparticles are currently getting a lot of attention. For instance, the colours, contrast and brightness of displays based on quantum dots provide benefits for users of TVs and phones. In addition, such particles can create value when incorporated into novel materials: smart particles can be designed to respond to light or magnetic fields, or can be used as carriers to deliver medicines in a targeted manner.

Numerous methods have been developed to design nanoparticles, and new particles are appearing all the time. However, it’s especially important to ensure that the particles produced are stable in their intended use. Unfortunately, it’s not always obvious how the particles are going to interact with other materials, such as the polymers, dispersants or solvents which form part of the product formulation.

Dietmar Lerche of LUM GmbH is a speaker at the HSP50 conference in April 2017. He takes up the story:

“In order to tackle the challenges of compatibility and stability of particles, we took a look at the approach developed by Charles Hansen fifty years ago. Put simply, Hansen’s approach turns the chemist’s well-known rule of thumb, “like dissolves like” into a quantitative and predictive system that allows solvents and solutes to be matched to each other. We had seen that nanoparticles (or in effect their surfaces) could also be assigned their own set of Hansen Solubility Parameters (HSP). Then the stability of particles (estimated by sedimentation velocity) could be predicted by looking at the extent of compatibility with the solvent medium, as described by the HSP of that medium. In other words “like is compatible with like”.

However, observing sedimentation with the naked eye is a tedious process, because under normal gravity it can be very slow and it can take weeks to obtain a ranking of sedimentation velocity. So when I read an article about the characterisation of carbon black by HSP, something clicked: why not combine HSP characterisation with multi-sample analytical photocentrifugation using our STEP-Technology®? This system effectively uses centrifugation to accelerate the sedimentation process. Furthermore, because our LUMiSizer® has a multi-sample capability, twelve different solvents can be tested at the same time, and a full set of 48 test solvents, as originally proposed by Charles Hansen, can be compared in about half a day.

We started by looking at industrial pigment particles in our application lab and we devised a classification scheme for the relative sedimentation time of the particles in different solvents, which took into account the density and viscosity of the solvents. Once we had entered the sedimentation scores into the HSPiP software we obtained some promising HSP values for the particles. [D. Lerche, S. Horvath, T. Sobisch, “Efficient instrument-based determination of the Hansen Solubility Parameters for talc-based pigment particles by multisample analytical centrifugation: Zero to One Scoring”, Dispersion Letters 6, 2015, 5]

Following the award of an R&D grant from the German Ministry of Economics and Energy, we began some work with the Institute of Particle Technology at the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) in Germany (headed by Prof. W. Peukert). This allowed us to access the necessary nanoparticle know-how and technology so we could look at carbon black more closely as well as fine tune the sedimentation scoring scheme. We found that we could reproducibly determine the HSP of the particles (independently in Erlangen and in Berlin) as well as differentiate between different industrial grades of carbon black. We were also able to show that the dispersing process influences the carbon black particle surface and that this can be quantified experimentally by using HSP.

Then we took a look at quantum dots made from zinc oxide (ZnO) particles with a mean size of just a few nanometres. Particles such as these have potential to be applied in new optoelectronic devices, in sensors or as UV-blockers. A knowledge of how particles interact in different solvents is essential when optimising the conditions in specific applications.

Based on the dispersion process and measurement procedures we had developed, and again using analytical centrifugation with the LUMiSizer®, we derived HSP values for synthesised ZnO nanoparticles for the first time. In addition, the function and surface properties of ZnO particles can be tailored by the adsorption of different organic ligands at the particle surface. We modified the surface chemistry of ZnO quantum dots in this way and found that polar ligands made an increased polar contribution to the measured HSP values.

Because HSP values are very sensitive to the different ligands bound at the surface, HSP determination is a highly effective method to quantify the surface properties of colloidal particles. Thus an approach using HSP data can be used to help custom-build specific functionalities of nanoparticles and nanomaterials.

We are looking forward to participating actively in the HSP50 conference and hope to learn more about the developing science of Hansen Solubility Parameters as well as about growing applications all over the world.”

Prof. Dr. Dietmar Lerche

CEO, LUM GmbH, Berlin, Germany
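As a rough illustration of the scoring idea Prof. Lerche describes, the sketch below normalises measured sedimentation velocities for solvent density and viscosity (the Stokes-law factor) before assigning simple “good”/“bad” scores for subsequent HSP fitting. This is not LUM’s actual algorithm, and all of the numbers are invented for demonstration.

```python
# A rough illustration (not LUM's actual algorithm) of normalising
# sedimentation data for solvent density and viscosity before scoring
# solvents as "good" (1) or "bad" (0) for HSP fitting.
# All numbers below are made up for demonstration.

PARTICLE_DENSITY = 4.0   # g/cm^3, hypothetical

# solvent: (measured sedimentation velocity um/s, solvent density g/cm^3, viscosity mPa.s)
measurements = {
    "ethanol": (0.8, 0.789, 1.07),
    "toluene": (6.5, 0.867, 0.56),
    "acetone": (5.9, 0.791, 0.31),
    "water":   (1.2, 0.998, 0.89),
}

def corrected_velocity(v, rho_s, eta):
    """Divide out the Stokes-law solvent factor (rho_p - rho_s)/eta so the
    remaining differences reflect the dispersion state, not solvent physics."""
    return v * eta / (PARTICLE_DENSITY - rho_s)

corrected = {name: corrected_velocity(*vals) for name, vals in measurements.items()}
threshold = 2.0 * min(corrected.values())   # arbitrary cut-off for this sketch

scores = {name: 1 if vc <= threshold else 0 for name, vc in corrected.items()}
for name in measurements:
    print(f"{name:8s} corrected v = {corrected[name]:5.2f}  score = {scores[name]}")
# The 0/1 scores would then be fed into HSP fitting software (e.g. HSPiP)
# to locate the particles' Hansen sphere.
```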

Posted by iformula

Stefan Langner on using HSP for smart formulations of organic electronics

We’re always impressed at how established methods like Hansen Solubility Parameters (HSP) can be not only used but also improved to solve new problems in industry. Below, in the latest item in our series focusing on the upcoming HSP50 conference, Stefan Langner of the University of Erlangen-Nuremberg tells us how HSP can be used to help produce formulations of advanced electronic materials.

If you’d like to keep up to date with the latest developments in solubility, solvation and the uses of HSP in formulation then you’ll benefit from attending HSP50, to be held at the University of York from April 5th-7th 2017. Registration is now open; just follow the links from our webpage.

We’re pleased to say that Stefan Langner is one of the keynote speakers at HSP50 and he’s looking forward to meeting fellow members of the community during the many opportunities for networking discussions during the conference.

“When you are working on advanced organic electronics using novel materials such as conducting polymers there are many challenges along the way. One of these is to solubilise these complex molecules. So if you find that, say, chlorobenzene does a good job, why not carry on using it during development?

That was the case for organic photovoltaics (OPV) and it took a while before the issue of production scale-up started to be significant for us. It was only then that we discovered that you can’t use chlorobenzene in most practical coating machines.

Another challenge for OPV was to get controlled phase separation of some of the key components. This could easily be achieved by a few minutes’ annealing in an oven. But again, this simply could not scale up for production. What was needed was a solvent blend that would “force” phase separation within the production drying oven.

And finally for OPV, coatings are invariably multi-layer so a solvent which is perfect for one layer might, during coating, destroy a previous layer.

In all three cases, the problem could perhaps be solved with gut feel or trial and error. But these aren’t effective or efficient approaches when dealing with expensive materials and complex multi-layer products. So, instead, we needed a rational approach to controlling solubility. There are many fine solubility theories out there, but they are often difficult to apply to the complex materials used in OPV. Therefore, as part of one of the leading teams in OPV formulations working under Professor Brabec in Erlangen-Nuremberg, we decided to adopt a powerful but pragmatic approach using Hansen Solubility Parameters (HSP). We discovered that a key technique for measuring HSP, although useful, was rather too difficult to apply in our systems as it involved handling 15 to 20 sometimes unpleasant solvents. Instead, working with one of the authors of the HSPiP software, Prof. Steven Abbott, we developed a “grid” technique which created a rational range of solvents by mixing a few carefully-selected solvents in controlled proportions.

This “grid” technique has now spread to other areas of formulation and is a key feature of the HSPiP software. With our measured values of the HSP values for the key components, we could then start to successfully create rational solvent blends to replace chlorobenzene for production, to control phase separation (by ensuring that one “good” solvent for a key component evaporated quickly, forcing that component to separate out of solution), and to arrange for solvent blends that provided good enough solubility for one layer without disturbing a previous layer.”

Stefan Langner, 2016

Reference:

“Determination of the P3HT:PCBM solubility parameters via a binary solvent gradient method: Impact of solubility on the photovoltaic performance”, Florian Machui, Stefan Langner, Xiangdong Zhu, Steven Abbott, Christoph J. Brabec, Solar Energy Materials & Solar Cells 100 (2012) 138–146
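For readers curious about the blend logic behind the “grid” technique, the sketch below generates a binary solvent gradient and estimates the HSP of each blend as the volume-weighted average of its components. The solvent names and parameter values are illustrative and are not taken from the paper above.

```python
# A minimal sketch of the "grid"/binary-gradient idea: blend two solvents in
# controlled volume ratios and treat each blend's HSP as the volume-weighted
# average of the components. Solvent names and HSP values are illustrative.

def blend_hsp(hsp_a, hsp_b, phi_a):
    """HSP of a binary blend: volume-fraction-weighted average of each term."""
    return tuple(phi_a * a + (1 - phi_a) * b for a, b in zip(hsp_a, hsp_b))

good_solvent = ("o-xylene",      (17.8, 1.0, 3.1))
poor_solvent = ("ethyl acetate", (15.8, 5.3, 7.2))

# Eleven blends from 0% to 100% good solvent in 10% steps
for i in range(11):
    phi = i / 10
    d, p, h = blend_hsp(good_solvent[1], poor_solvent[1], phi)
    print(f"{phi:4.0%} {good_solvent[0]}: dD={d:5.2f}  dP={p:4.2f}  dH={h:4.2f}")

# In the lab, the polymer is tested in each blend; the composition at which it
# stops dissolving marks a point on the boundary of its Hansen sphere.
```

Each blend behaves, to a first approximation, like a single “virtual” solvent, which is why a handful of carefully chosen components can sweep a wide region of HSP space without handling 15 to 20 individual solvents.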

Posted by iformula

A Matter of Time?

“Programming temporal shapeshifting”, Xiaobo Hu, Jing Zhou, Mohammad Vatankhah-Varnosfaderani, William F. M. Daniel, Qiaoxi Li, Aleksandr P. Zhushma, Andrey V. Dobrynin & Sergei S. Sheiko, Nature Communications 7, Article number: 12919, doi:10.1038/ncomms12919

This is the answer to a number of everyday questions. For example:

  • When will England win the football world cup?
  • When will Boris Johnson make a fool of himself?
  • When will this article get to the point?

Well, one of the holy grails of formulation technology is the ability to deliver an active ingredient to a specific target in a specific place at a specific time. For example, many pharmaceutical products need to survive the passage through the acidic environment of the human gut; seed coatings may need to release an active ingredient when germination starts; and materials such as cement may need to flow for a certain period of time and then set.

Time is, however, not the primary controlling factor in the formulation technologies used at present. Instead there are often external triggers such as pH, light or temperature which are intended to act at a certain time. However, I recently came across an article which could change this and make time a direct controlling factor. In a paper in Nature Communications, researchers from the Universities of North Carolina and Akron have published work entitled “Programming Temporal Shapeshifting”.

Behind this rather convoluted title, they outline how they can make materials in the form of dual-network hydrogels which will change shape after a specific period of time, from seconds up to hours. They do this by having covalent crosslinks in the first network, which provide elastic energy storage, and temporary hydrogen bonds in the second network, which regulate the energy release rate. These hydrogen bonds can then be reversed, which means the rate and pathway of a shape transformation can be encoded in the material without any external stimulus or trigger.

Obviously this work is in its early stages, but the potential to truly time the release of an active, or even to make multiple deliveries, brings tremendous opportunities, and not just in the pharmaceutical sector. So those days of searching for a pH, heat or light trigger may become a thing of the past and it will really be just a matter of time…

David Calvert, October 2016

At iFormulate our clients frequently ask us to find and evaluate new technologies of potential interest and use to them. We can do this by keeping abreast of a wide variety of technical developments. If you think we might be able to help you then contact us on info@iformulate.biz or take a look at our website www.iformulate.biz.

Posted by iformula

The bees or not the bees? That is the question

The world’s population is projected to grow to nearly 10 billion by 2050, according to the UN. Agrochemicals help to achieve the quantity of crops needed to support that population. But of course agrochemicals are always under scrutiny. There are few more fraught questions in the world of agrochemicals at the moment than whether the use of neonicotinoid insecticides should be restricted or banned because of their possible negative effect on bee colonies. During an excellent short conference “Are Neonicotinoids Killing Bees” held by the SCI recently I was struck by the role that evidence has to play in regulatory decisions as well as by that good old favourite, the “law of unintended consequences”.

It’s a complex subject, so I’ll try to keep it brief. Bees, whether domestic honey bees or wild bumblebees, are vital pollinators without which we would be very short of food. Colonies of bees collapse every now and again, and there are multiple reasons for this. Anyway, effectively, the EU has placed a temporary ban on the use of neonicotinoids, applying the precautionary principle. Whether or not this is justified is not really the subject of this piece. For what it’s worth, on the basis of the evidence very thoroughly presented at the conference, the situation seems to be that:

– in much higher doses than those seen in the field, neonicotinoids can kill bees (well, they are insecticides, so no surprise here);
– in the lab at lower “field” doses they do not kill, but they may affect the behaviour of individual bees;
– under realistic field trial conditions they do not appear to have any effect on bee colonies;
– residues are barely measurable under field conditions.

Anyway, this is a fairly familiar story for chemicals; what’s interesting is what has happened in the approximately two years since the ban. At the SCI conference, agronomist and entomologist Dr Alan Dewar explained that he has studied insecticide use on the oilseed rape crop in eastern England since the ban. The unintended consequences have been that more insecticide has had to be used, the incidence of pyrethroid-resistant pests has increased and therefore the area and volume of useful crop have decreased significantly. Not only is this a loss to farmers’ income, but honey bees depend on oilseed rape to thrive – so ironically the ban could increase pesticide use, increase pest populations and probably serve to reduce the bee population. Be careful what you wish for!

In a pessimistic frame of mind, I thought back to the Informa Agrow Forum I had attended on the previous day, to maybe find a glimmer of hope. Maybe not hope just yet, but some opportunities for sure. It was clear from the Forum discussion that huge opportunities are emerging for precision and “smart” agriculture, combining sensors, “big data” gathering, and precision robotic application of pesticides. So pesticides would be applied only exactly where and when you wanted them, minimising the environmental impact and reducing dramatically the quantities of pesticides used. It is also clear that responsible use of pesticides within integrated pest management (IPM) programmes (which ensure the maintenance of habitat, crop rotation and pesticide strategies to minimise resistance) also has a great deal of exciting potential.

At the moment our regulations are pretty crude: they look at things in isolation and conclude “good pesticide/bad pesticide” or “ban/no-ban”. Is there a chance that a new generation of “smart” regulation, which looks at the whole picture, could evolve to steer us towards this more appealing future? A smart regulatory system would approve a product for use provided that it was used as part of specified practices (e.g. precision agriculture and IPM), with monitoring and corrective actions all playing a part. It’ll be challenging to achieve, but whether we are industrialists, consumers, environmental campaigners or politicians, I suspect we all want the same outcomes: a healthy, well-nourished and prosperous population, thriving industries, together with a high level of biodiversity and sustainable agricultural practices with low environmental impact. That’s my glimmer of hope.

Jim Bullock, October 2016

At iFormulate we mainly concentrate on product formulation and technology challenges, whether in agrochemicals or other chemical-using industries. However we know it’s important for us to understand the bigger picture as well. If you think we might be able to help you then contact us on info@iformulate.biz or take a look at our website www.iformulate.biz.

Thank you to the SCI and Informa Agribusiness Intelligence for hosting these two events.

Posted by iformula

Fifty Years of Hansen Solubility Parameters – Charles Hansen Reflects

At iFormulate we’re delighted to be involved with HSP50, a conference to celebrate fifty years of Hansen Solubility Parameters (HSP), a vital tool for practical formulators in all industries. We are also very pleased that Dr Charles Hansen, the originator of HSP, has provided us with his insights and reflections on the subject. Having worked with HSP ourselves, we’d agree with his conclusion that he sees “the Golden Age of HSP not as something of the past but something that is developing all around us right now!”

“50 years ago, the practical scientist had very few options for formulating rationally in terms of solubility. The Hildebrand solubility parameter worked very well for non-polar systems but was inherently unsuited to more general systems that included “hydrogen bonding”. Adding an extra “polar” parameter was not sufficient to capture the complexity of the chemistry, but I thought that including a hydrogen bonding parameter could provide a compact set of three parameters (now called HSP) that should be useful. The problem was to assign values to the three parameters. A significant requirement was that the measurable energy of vaporisation must necessarily be the sum of the cohesive energy that derives simultaneously from the non-polar, polar and hydrogen bonding interactions. Via a “bootstrap” technique involving thermodynamic theory and many experiments, a self-consistent set of parameters emerged for a wide range of solvents that gave good solubility understanding for a range of commercial polymers. The key was to use a “Distance” based on the sum of the squares of the individual differences in the three terms, where a small Distance means a closer solubility match. A mix of two bad solvents whose combined values gave a small Distance could therefore create a better, or perhaps even a good, solvent mixture. This turned out to be a good test for the theory and, happily, reliable predictions were found for taking bad (or poor) solvents and creating good solvent blends; indeed this has been a key practical aspect of HSP ever since.

My Doctoral dissertation at the Technical University of Denmark was successfully defended in 1967.  Shortly thereafter I was able to apply this knowledge to the real world of paints and pigments at PPG Industries. The management were far-sighted and allowed me to publish, including a 1969 paper called “The Universality of the Solubility Parameter”, showing that the three-parameter approach applied far outside the original field of solvents and polymers.

Today, looking back over 50 years of what came to be called Hansen Solubility Parameters, I am not surprised, but I am certainly delighted, that this prediction of Universality has been proven correct. HSP are used all around the world, in industry and academia, helping to solve tricky formulation problems. But although I am happy to look back at the past 50 years, the 50th Anniversary Conference is very much focused on the future. Thanks to the power of modern technology in packages such as HSPiP, I see the Golden Age of HSP not as something of the past but something that is developing all around us right now!”

Dr Charles Hansen, 2016.
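As a small worked example of the Distance calculation and the “two bad solvents” effect Dr Hansen describes, the sketch below compares two poor solvents and their 50:50 blend against a hypothetical solute. The parameter values are illustrative textbook-style numbers, not data from any particular study.

```python
# A small worked example of the Hansen Distance and the "two bad solvents
# can make a good blend" effect. The solute and solvent values below are
# illustrative only.
from math import sqrt

def distance(a, b):
    """Hansen Distance: square root of the weighted sum of squared differences
    in (dD, dP, dH); the factor 4 on dD is the conventional weighting."""
    return sqrt(4 * (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2)

solute = (17.0, 9.0, 8.0)          # hypothetical polymer (dD, dP, dH)
solvent_1 = (14.9, 0.0, 0.0)       # a non-polar "bad" solvent (hexane-like)
solvent_2 = (15.8, 8.8, 19.4)      # a hydrogen-bonding "bad" solvent (ethanol-like)

# 50:50 blend: volume-weighted average of the two sets of parameters
blend = tuple(0.5 * x + 0.5 * y for x, y in zip(solvent_1, solvent_2))

for label, s in [("solvent 1", solvent_1), ("solvent 2", solvent_2), ("50:50 blend", blend)]:
    print(f"{label:12s} Distance to solute = {distance(solute, s):5.2f}")
# The blend sits much closer to the solute than either pure solvent,
# which is exactly the effect described above.
```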

Posted by iformula

Spray Drying: New Tricks From Established Technologies

By Jim Bullock, May 2016

As a humble scientist in a new job (too) many years ago, one of my earliest introductions to the daunting world of chemical engineering was my first encounter with a spray dryer. As I was shown around a scale-up lab I was introduced to the technician running the pilot scale spray dryer – a fairly big piece of kit about the same height as (but a lot wider than) an adult human. At that moment I was surprised to see the technician apparently assault the spray dryer violently with a crude weapon of his own devising – a short length of flexible rubber hose wrapped in duct tape. Mildly alarmed, I learned a little later that this wasn’t his way of venting his frustration, but actually a fairly practical way of removing build-up of material from the sides of the dryer. All of which taught me that there is a difference between the laboratory world of delicate glassware and the practical world of large chemical processing equipment – and that scaling up processes brings with it a new set of challenges and considerations.

At that time I saw spray drying as simply a standard processing step, a way of taking a liquid product and making a dry powder from it. Useful, but not much more. Then one day our lab in the speciality chemicals division of the company was visited by a young scientist from the (much more profitable, cutting edge and richer) pharmaceutical division. She had come to use our small scale lab spray dryer (which was made of glass and definitely not something to be attacked) for some experiments. As we were a bit surprised that our expensive pharma colleagues didn’t have their own spray dryer, we asked her why she had to use ours and got the reply “because we don’t really know anything about spray drying”.

The latter experience was an important indication to me that technologies which are completely established in one industry can be completely novel to another, and that this disparity has nothing to do with how well the industries are funded or how good their scientists and engineers are. So when we fast-forward to the present day and the recently held University of Leeds course “Spray Drying and Atomisation of Formulations” (supported by iFormulate), it was rewarding to see plenty of innovation on show. Over time, the pharmaceutical industry has well and truly adopted spray drying as a technique for precisely designing and engineering small particles, especially for delivery of active ingredients in inhaled dosage products. So it turns out that all those years ago I didn’t “really know anything about spray drying” either. At the course it was also good to see delegates from industries as diverse as food, catalysts, agrochemicals, explosives and pharmaceuticals benefiting from an understanding of the common science and engineering principles and practicalities – and exchanging ideas that crossed those industry boundaries.

If you missed our recent short webinar on Spray Drying (featuring Professor David York from the University of Leeds) then you can still view the recording via our website. Finally, if you are interested in finding out more about the benefits of cross-industry innovation and exchange of ideas, take a look at our new White Paper “Open Innovation Across Sectors”.

Posted by iformula

Premier Design for Formulations – Is it Really Worth it?

By David Calvert, May 2016

As the football Premier League in England came to a conclusion, I asked myself the question: what system would be considered the “Leicester City” of formulation design? To most of those in the pharmaceutical industry, Quality by Design (QbD) is considered best practice. One of the main drivers for QbD was to try to minimise the time taken to get a dossier through the regulatory process, and I was therefore interested in a presentation at the recent QbD symposium organised by De Montfort University (based, by chance, in that very same “Premier City”).

In his presentation, Colm Reddington of the MHRA (the UK medicines regulator) examined 69 applications made to the Chemistry, Pharmacy and Standards (CPS) expert advisory group, which advises the Commission on Human Medicines (CHM) on matters relating to a range of quality issues. He classified the applications by the extent to which they demonstrated enhanced Quality by Design approaches. The applications were ranked, with those at the lower end having a mainly empirical approach and those at the top having established a design space and used process analytical technology (PAT) amongst other QbD techniques. Interestingly, around half of the applications were considered to still use the traditional empirical approach, but around 20% were towards the higher end of the spectrum. During the examination process, the majority of the major objections came where the empirical approach was taken, with only one major objection in the top two categories.

Whilst the sample size was small, it does offer some support for the logical conclusion that a good design will lead to a better-defined product. Even so, it is worth considering whether a full-blown QbD approach is always the best one. In our training course on “Design for Formulation”, which we ran last year, Ian Jolliffe and our other speakers demonstrated that a structured approach to the formulation design process is the best one, but that this must be tailored to each specific industry – and that a risk management approach is key. You can view a webinar recording of Ian discussing what you need to do before you go into the lab on our website, and in Ian’s forthcoming webinar, “iFormulate introduces Powderology – An Introduction to the Mysteries of Powders”, you will also hear how this approach can be key when working with powders in all types of formulations. You can register for this on the webinar page.

Posted by iformula

Making Cosmetics – A “Clean” and Differentiated Industry?

By David Calvert, March 2016

Having recently attended the Making Cosmetics Exhibition at the Ricoh Arena in Coventry, I thought I would share my observations on what seem to be the key issues in this industry and what makes it different, or not, from other formulating industries.

As you would expect from this consumer-retail-driven market, there is a great emphasis on the brand and how to develop it. As well as the exhibition, there was an extensive seminar programme (too many talks to attend in my short time there) with a number of presentations looking at realising ideas, developing and enhancing brands and how to identify what retailers are looking for in a successful brand. Branding is of course a key element in a number of other formulating markets such as FMCG, coatings, healthcare and even pharma to some degree, but I have the impression that in cosmetics it is of the utmost importance.

Of course, a good brand is based on good claims, and a number of presentations looked at testing and substantiating these claims. This topic is a key element of the forthcoming SCS Annual Conference “Science in a Bottle” in May. What particularly impressed me at the exhibition was the large number of portable devices being demonstrated, which allow, in particular, measurements of skin properties to be taken quickly and non-invasively on the consumer. Maybe this “off-site” measurement is a trend that we will see spreading into other industries.

Clearly, product claims are based on the ingredients themselves, and not surprisingly there were a number of new ingredients which were natural or bio-based and could claim to be clean ingredients. Away from the exhibition, I was intrigued by a recent e-mail I received from DuPont Tate and Lyle promoting their new Zemea® propanediol for cosmetics as an alternative to petroleum-based glycols. It looks interesting and I wonder whether it will be taken up by formulators.

Of course, ingredients need to be combined or formulated into products and, as you would infer from the name of the exhibition, there were a number of processing and contract manufacturing companies exhibiting. The processing companies were familiar ones from other markets, with Silverson, Netzsch, IKA and the ever-expanding Ytron Group exhibiting. I must admit that I did not see that the equipment differed much from that used in other industries, with mixing, suspending, dissolving and aerating being common unit operations for all formulators.

Cleaning is also a common operation, but I was struck by both Doronwell and Ecolab exhibiting and making presentations on their cleaning solutions. This focus has spread from the food industry and the need for clean-in-place (CIP) systems in the pharma industry, but the increased emphasis could be down to the expanding number of ingredients in cosmetic formulations and also some extra scrutiny on sweating assets and looking for less downtime due to cleaning. I recently listened in on a webinar from Croda on industrial cleaning in which they outlined the use of HLB (hydrophilic-lipophilic balance) to match the cleaning solution to the product needing to be cleaned. Although this may seem old science to some, it appears to have been effective, and the offerings on show at Making Cosmetics were also clearly built around good science.
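For anyone who wants to try the underlying arithmetic, the sketch below shows the classic HLB blending calculation on which this kind of matching rests. The surfactant names, HLB values and the target “required HLB” are illustrative placeholders, not data from the webinar.

```python
# A minimal sketch of the classic HLB blending arithmetic.
# Surfactant names, HLB values and the required HLB of the soil are
# illustrative placeholders, not real formulation data.

def blend_hlb(components):
    """HLB of a surfactant blend: weight-fraction-weighted average."""
    total = sum(w for _, w, _ in components)
    return sum(w * hlb for _, w, hlb in components) / total

# (name, parts by weight, HLB of the individual surfactant)
blend = [
    ("ethoxylated alcohol (low EO)",  30, 8.0),
    ("ethoxylated alcohol (high EO)", 70, 13.0),
]

required_hlb_of_soil = 11.0   # hypothetical value for the residue to be cleaned

hlb = blend_hlb(blend)
print(f"Blend HLB = {hlb:.1f} (target {required_hlb_of_soil})")
# Adjusting the ratio of the two surfactants shifts the blend HLB towards the
# required HLB of the soil, which is the matching idea described above.
```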

Two final observations: it was good to see the National Formulation Centre talking about their programme, and interesting that Chemspeed were exhibiting and talking about the automation of processes for faster product development in cosmetics. Again, this is a trend we have seen in other formulating industries.

The next “sister” exhibition is “Making Pharmaceuticals” on 26-27 April in Birmingham. We will be there on the 26th running a workshop on Open Innovation with our Associate Partner, Malcolm McKechnie. Please come along if you are going to the exhibition. If you are unable to make it but are interested in Open Innovation, then you can download our White Paper on Open Innovation from our website.

Posted by iformula