
Seamless innovation through improved laboratory informatics

OSTHUS Webinar

Life science data is a major asset for the biopharmaceutical and chemical industries. Its availability via electronic systems is a prerequisite for collaborative work and successful innovation. Currently, most laboratories have to deal with a multitude of data sources originating from different instruments, systems, sites, and external resources, each with its own data format. As a consequence, scientists often spend considerable manual effort gaining access to the data they need, and IT teams struggle to maintain the large number of different IT solutions. Data analytics becomes an inefficient process with a high integration overhead.

Integrating life science data is typically very time-consuming. Why? Because data integration projects combine data from multiple sources, and these often differ in data standards, formats, semantics, and quality. Data integration is therefore characterized by a high degree of exception handling: typographical errors introduced at data entry, fuzzy definitions of concepts, and inconsistent interpretation of data by different informatics systems are typical root causes. To integrate data efficiently, you need in-depth knowledge of both IT and the science – a combination that is quite rare. So what can you do if you don’t have these people readily available?
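
The kind of exception handling described here can be sketched in a few lines. The records, field names, and units below are invented purely for illustration; the point is that each source needs its own normalization rules, and records that fail them go to a curation queue rather than silently corrupting the integrated set.

```python
from datetime import datetime

# Hypothetical records from two instruments, illustrating the mismatches
# described above: different field names, date formats, and units.
source_a = [{"sample": "S-001", "measured": "2014-03-07", "conc_mg_l": "12.5"}]
source_b = [{"SampleID": "S 001", "Date": "07/03/2014", "Concentration (g/L)": "0.0125"}]

def normalize_a(rec):
    return {
        "sample_id": rec["sample"].replace("-", "").upper(),
        "date": datetime.strptime(rec["measured"], "%Y-%m-%d").date(),
        "conc_mg_per_l": float(rec["conc_mg_l"]),
    }

def normalize_b(rec):
    return {
        "sample_id": rec["SampleID"].replace(" ", "").upper(),
        "date": datetime.strptime(rec["Date"], "%d/%m/%Y").date(),
        # Convert g/L to mg/L so both sources share one unit.
        "conc_mg_per_l": float(rec["Concentration (g/L)"]) * 1000,
    }

clean, rejected = [], []
for normalize, records in ((normalize_a, source_a), (normalize_b, source_b)):
    for rec in records:
        try:
            clean.append(normalize(rec))
        except (KeyError, ValueError) as err:
            # Typos and malformed entries are routed to a curation queue
            # instead of being dropped or merged incorrectly.
            rejected.append((rec, str(err)))

print(clean)
```

After normalization, both records describe the same sample in the same units – the step that otherwise consumes so much manual effort.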

One way to make these integration projects go more smoothly is to use reference architectures and data standards. We will show several use cases to illustrate how reference architectures can support you in data integration, data curation, and data migration. One of these reference architectures addresses the migration of biopharmaceutical data using an integration layer on top of a data warehouse as part of a discovery data integration process. This architecture was used in a data integration project combining multiple heterogeneous data sources, all with different data formats and standards.

Central to this architecture is a data curation platform – the operational data store – which provides an easy-to-use entry point to scientific data management and can be used by non-IT experts, giving your scientific data experts control of the data integration. During the webinar we will show you additional reference architectures and highlight how data standards can help you tackle these complicated projects.

So do you recognize these problems? Is your company dealing with a myriad of data sources or are you losing too much time connecting data? Join us for this webinar in which we will show you how data integration projects can be made easy.

Register here for OSTHUS Webinar.

For more information on our company and services visit www.osthus.com or contact us at [email protected]

Baumer and the “Bourdon Original” – a successful combination of innovation and reliability

By: Stefan Diepenbrock, Manager Public Relations, Baumer

Who would have thought it? While the wheel of innovation keeps turning faster and faster in the process industry, a more than 160-year-old technology is still used for pressure measurement today. When Eugène Bourdon filed a patent application for the “Bourdon tube” named after him in Paris in 1849, not even he expected that his technology would still be the most common method for mechanical pressure measurement, for example, in the oil and gas industry in the 21st century. Eugène Bourdon’s legacy is continued today by the Swiss sensor manufacturer Baumer, which is currently the sole owner of the “Bourdon Original” label thanks to the takeover of the Bourdon-Haenni company in 2005 and its integration into the Baumer Group.

Today Baumer Bourdon Haenni is the competence center for mechanical measuring instruments within the Baumer Group, where Eugène Bourdon’s pioneering spirit is still ever-present. The Bourdon tube principle is combined with Baumer’s innovation know-how to create latest-generation mechanical pressure gauges as part of an extensive product portfolio that Baumer has reliably supplied to the petrochemical industry, power stations, water treatment plants, aviation and marine applications, and the food industry for many years.

Bourdon fits very well with the rest of the Baumer family

Reliability through innovation – both formed the basis of the Bourdon tube invented by Eugène Bourdon, and this principle is still valid today. It is not for nothing that many demanding customers rely on this pressure measurement method. Reliability through innovation is equally important to Baumer in product development. “That’s why Bourdon fits so well with the rest of the Baumer family,” explains Sascha Engel, Marketing & Sales Manager for Process Instrumentation at Baumer.

Thanks to its many benefits, the Bourdon tube in Baumer products continues to be a popular solution even after more than 160 years. During the construction of a steam engine, Bourdon noticed that the spirally wound tube used for compressing steam had been flattened during production. To correct this, the tube was sealed at one end and pressurized at the other. As a result, the spiral began to unwind while the tube tended to regain its round cross-section. Impressed by this observation, Bourdon immediately carried out several tests and ultimately invented a pressure gauge based on the path change of the end of a bent Bourdon tube with an elliptical cross-section.

This simplicity makes these Bourdon-tube gauges easy to handle and maintain. In addition, they are suitable for a wide pressure range and offer a high level of accuracy (up to 0.1 % of the end value). “Because they do not require an external power source, they are not prone to voltage fluctuations or power outages,” Sascha Engel explains.
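
As a quick illustration of what an accuracy class means in practice, the sketch below converts a class stated as a percentage of the end (full-scale) value into an absolute error. The 100 bar range used here is an assumed example, not a Baumer specification.

```python
def max_error(full_scale, accuracy_class_percent):
    """Absolute error implied by an accuracy class given as a
    percentage of the full-scale (end) value."""
    return full_scale * accuracy_class_percent / 100.0

# A 0.1 % class gauge with an assumed 100 bar range may deviate by
# at most 0.1 bar anywhere on the scale.
print(max_error(100.0, 0.1))
```

Note that because the class refers to the end value, the same absolute error applies across the whole scale, so the relative error grows at low readings.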

Proven Bourdon principle is updated by leading-edge quality methods

Baumer continues to rely on this proven Bourdon principle for its mechanical pressure gauges. At its production site in Vendôme (France), however, this traditional principle is updated with the latest quality methods, which offer decisive benefits in demanding oil and gas applications.

Bourdon tubes, whose dimensions depend on the nominal size of the measuring device and on the measuring range, are manufactured from long stainless steel tubes in a bending machine, which gives them their typical C shape and flattened cross-section. Each tube is welded to the connection piece of the measuring device at one end and to the tube end piece at the other. The welded measuring systems are checked for leaks and undergo an overpressure cycle to relieve tensions in the material. They are then welded to the pressure gauge housing in an automated laser welding plant.

After assembly of the pointer mechanism, the pressure gauge is adjusted and checked for compliance with its measuring range and accuracy class. This is done in a patented, semi-automatic process which virtually eliminates adjustment errors. The pressure gauge is then completely assembled, sealed, and packed for shipment. On request, every pressure gauge comes with a calibration certificate issued by a computer-based system with optical scanning of the pointer position. The certificates are archived and can be allocated to the measuring device at any time.

BTrace – Baumer Traceability System

At Baumer, this completely documented and traceable process is called BTrace, which stands for “Baumer Traceability System”. It describes a unique method in order fulfilment. Combined with lean philosophy, it provides the customer with measurable added value in cost efficiency, reliability, and traceability. BTrace comprises several elements:

B – as in Baumer Business System: Effective production and management processes are combined in the Baumer Business System to meet customer requirements. Strictly oriented towards lean philosophy, it creates substantial customer benefits: the right product at the right place at the right time.

T – as in traceability: Baumer offers complete transparency throughout the entire order process. The result is a fully traceable product compliant to the highest safety standards.

R – as in reliability: Baumer pursues a “zero-defects philosophy”. To this end, core components are continuously monitored and undergo regular stress tests. All test results are checked and archived.

A – as in automation: BTrace offers a progressive and patented calibration and certification system. The automated and EDP-based system guarantees complete traceability starting from material acquisition to the finished product by using 3.1 certificates.

C – as in consistency: Based on lean philosophy, MUDA, KAIZEN, and optimized sourcing methods, Baumer continuously works on improvements to provide the highest quality and safety standards. BTrace ensures consistency by transparent material and product codes based on universal configurations for an easy and flexible order process.

E – as in excellence: Highest quality for the customer is the core of the BTrace philosophy – at all levels.

Register here for the Baumer Webinar ‘Innovative Sensor Technologies for the Oil & Gas Industry’.

More information at: www.baumer.com/bourdon

Baumer Group
The Baumer Group is an internationally leading manufacturer and developer of sensors, encoders, measuring instruments, and components for automated image processing. Baumer combines innovative technology and customer-oriented service into intelligent solutions for factory and process automation and offers a uniquely wide range of related products and technologies. The family company has around 2,300 employees, with manufacturing facilities, sales offices, and agents at 37 locations in 19 countries, always close to the customer. With consistently high quality standards worldwide and a huge potential for innovation, Baumer brings its customers critical advantages and measurable added value across many industries. For further information, visit www.baumer.com.


Trials fail due to poor design. Don’t let yours be next!

Flawed methods in clinical trials have played a large part in the poor track record of neuroscience drug development, with consequences for both patients and the pharmaceutical industry. Nowhere are the consequences for patients more acutely felt than in Alzheimer’s drug development, where the industry counts just three Alzheimer’s drug wins against 101 losses in 13 years, according to Pharmaceutical Research and Manufacturers of America (PhRMA). Failed trials also have big consequences for companies, often costing organizations billions in capitalization. In many cases the losses could be avoided through better trial design[1].

Research has shown that clinical programs have a high failure rate even when drugs are known to be effective. For example, Khan et al. (2002)[2] reviewed the data from nine antidepressants approved by the United States Food and Drug Administration between 1985 and 2000. Of the 92 treatment arms reviewed, fewer than half showed statistically significant separation from placebo.
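
One reason effective drugs can fail to separate from placebo is simple statistical power. The sketch below uses a normal approximation with invented effect-size numbers (not taken from the Khan data) to show how power climbs with the number of patients per arm.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_arm(effect, sd, n_per_arm):
    """Approximate power of a two-arm trial to detect a mean
    drug-placebo difference `effect` with common standard deviation
    `sd`, using a two-sided test at the 5 % level."""
    z_crit = 1.96
    z = effect / (sd * math.sqrt(2 / n_per_arm))
    return norm_cdf(z - z_crit)

# Assumed: a 3-point drug-placebo difference on a rating scale with SD 12.
for n in (50, 100, 200):
    print(n, round(power_two_arm(3, 12, n), 2))
```

Under these assumed numbers, a 50-patient arm has well under 50 % power, so a genuinely effective treatment would fail to separate from placebo more often than not – consistent with the pattern of non-significant arms described above.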

Surely we can do more at the trial design stage to avoid failed trials. Newer methodologies, such as adaptive designs, patient enrichment, and risk-based monitoring, have been discussed extensively, but adoption by the industry has been slow.

In the upcoming webinar sponsored by Covance Inc., leading industry experts share their insights and experiences gathered in the design and execution of novel trial designs in psychiatry – designs focused on speeding availability of effective treatment to patients, selecting patient subgroups most likely to respond and minimizing placebo response rates.

Scott Berry, PhD, President and Senior Statistical Scientist at Berry Consultants, shares his thoughts on how adaptive designs can reduce length and increase flexibility in exploratory trials. Prospective subgroup analysis and wider dose ranges can be studied without increasing sample size allowing for more efficient use of trial budgets.

Sanjeev Pathak, MD, Senior Medical Director at Alkermes Inc., will review how his group has adopted methodologies that reduce the impact of placebo response while lowering sample size by 20-40%. He outlines his group’s reasons for adopting novel trial designs and lessons learned from discussions with regulators.

By attending the webinar you will:

  • Explore how novel trial designs can benefit your Neuroscience development program
  • Examine the pros and cons of implementing adaptive and enrichment designs
  • Identify barriers to adoption and how to overcome them, including organizational, regulatory and resource related barriers

If you’d like to hear more about how trial design can help improve your chances of success, then join me on November 17th for the Covance-sponsored webinar “Innovation in psychiatry trial design – How to improve the probability of success”.

Webinar Details and Registration
Monday, November 17, 2014
10 am New York, US/15:00 London, UK
60 minutes including Q&A

[1]Becker RE, Greig NH. Why So Few Drugs for Alzheimer’s Disease? Are Methods Failing Drugs? Curr Alzheimer Res. 2010 November 1; 7(7): 642–651

[2]Khan A, Leventhal RM, Khan SR, Brown WA. Severity of Depression and Response to Antidepressants and Placebo: An Analysis of the Food and Drug Administration Database. Journal of Clinical Psychopharmacology. February 2002 – Volume 22 – Issue 1 – pp 40-45

We’ve done it this way. Why change?

By Kevin Kolmetz, Product Manager, Moog, and Richard Kim, Engineering Manager, Moog

In some parts of the oil and gas industry, you may hear people say, “We’ve done things a certain way. It works. Why change what works?” While an organization might have always used, for example, a hydraulic solution for a topside drilling application, we would like to challenge them to look at improving on what works, perhaps by using an electro-mechanical solution. Our approach is to connect with you or your director of technology and talk about what’s best for your motion control situation.

Take an additional example from the automotive test industry. We have developed a tire-coupled simulation system with actuators that follow a prescribed drive file replicating the motion of a real-world road test. We’re moving each of the car’s wheels at an almost frenetic frequency. Many people assume electric actuation is always faster than hydraulic actuation in that situation, but hydraulics actually store more energy. With electric actuation, you have a motor that has to spool up, and that takes time. In such cases, hydraulics can actually be faster.

Moog became involved in this webinar because designing reliability and performance into your motion control applications matters. Speed, force, reliability and even profits hinge on motion control. You want to select the correct technology. We know that a lot is, literally, riding on your choice.

Here’s a three-step process we use to help people make their design choices.

First, we look at the architecture underpinning your application. Older components in an architecture tend to run off a centralized control scheme, whereas newer, smart components have their own smart controls. For instance, you could have a master controller in a control room on an oil platform going out to every component, each of which also has a feedback device. That smart component can tell you that you need to check something related to, say, one of the components on a pipe racking system. So, architecturally speaking, we dig into whether someone is working with a centralized or a distributed control system. Knowing this leads to the best motion control solution.

Second, we examine device-level form factor. Simply put, that means we look at what a specific component does and at its working environment – the physical conditions and needs of your application. Your topside application may sit right over the wellhead, which can give you a lot of room for a solution that requires a really rapid response time. Device-level form factor also covers energy density. For example, you may need to move a valve requiring 30,000 pounds of force for a cutter assembly that is part of a blow-out preventer. All these situations need to be considered to arrive at the best motion control solution.

And, third, we study device-level dynamics. In other words, how rapidly does your application move? Is it moving 18 times every tenth of a second? Alternatively, some devices move very slowly, and some are needed for secondary and tertiary processes. Regardless of how quickly or slowly it moves, we design devices that provide control at the component level and then fit into the entire motion solution.
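
To get a feel for what “18 times every tenth of a second” implies, the sketch below treats the motion as sinusoidal and computes the cycle frequency and peak velocity. The ±1 mm amplitude is an assumption for illustration only, and we treat each move as one full cycle.

```python
import math

def peak_velocity(amplitude_m, frequency_hz):
    """Peak velocity of sinusoidal motion x(t) = A * sin(2*pi*f*t),
    which is 2*pi*f*A."""
    return 2 * math.pi * frequency_hz * amplitude_m

cycles, window_s = 18, 0.1
f = cycles / window_s            # 180 cycles per second
v = peak_velocity(0.001, f)      # assumed +/- 1 mm stroke
print(f, v)
```

Even at a millimeter-scale stroke, an actuator cycling at this rate must sustain peak velocities above a meter per second, which is why stored energy and spool-up time matter so much in the hydraulic-versus-electric choice.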

Every application has some unique aspects. We’ve seen almost everything. We have even come up against customers who take their “white space” and make a design more complicated than it needs to be. For instance, one customer was trying to design a huge, precision winch to control a block and tackle for moving a drill pipe up and down, requiring 3,000 hp. The customer designed a complex array of cables and pulleys, as that kind of hardware was what they knew. We certainly listened, but ultimately gave them a solution with a much lower potential for problems.

We’ll discuss this in more detail in the webinar on 16 October. In the meantime, what’s your design process? What works for you? What might be worth changing?

See you at the webinar! – ‘Hydraulic or Electric? Engineering the Best Control Solution’