Friday, 13 September 2013

The Impact of Operations Research on People, Business and Society


Last week the international conference on Operations Research, OR2013, took place in Rotterdam. The conference was the result of close cooperation between the German OR Society (GOR), the Erasmus University Rotterdam and the Dutch OR Society. The conference started with a keynote from Alexander Rinnooy Kan, an expert in our field who has several times been ranked as the most influential Dutch person, on “How to educate Operations Research practitioners”. The key point of his lecture was that although Operations Research practitioners have been successful in applying their knowledge, continued education remains of great importance. The obvious reason is keeping pace with new developments in research; in addition, trends like Big Data give rise to new problems and applications, where research and practice can go hand in hand. The OR2013 conference programme included a special plenary session, led by the Chair of the Amsterdam Business School, Marc Salomon, in which special attention was given to the impact of Operations Research on people, business and society. Over 50 C-suite business representatives attended the plenary as special guests.

Wim Fabries of Dutch Railways
The session started with Wim Fabries, Director of Transport and board member of the passenger division of Dutch Railways. Fabries illustrated that providing reliable rail transportation is complex, requiring many interrelated decisions concerning the timetable, rolling stock and crew. To support these decisions Dutch Railways has a dedicated department, Process Quality and Innovation, which applies Operations Research. The Operations Research practitioners of Dutch Railways had their finest hour when a new and robust timetable had to be constructed to accommodate the growing passenger and freight transport on the already highly utilized railway network. The new schedule was successfully launched in December 2006 and covers about 5,500 daily trains. This tour de force was rewarded with the Franz Edelman Award in 2008. Fabries indicated that Operations Research continues to be of great importance, for example in effectively managing disruptions or in constructing a reliable winter schedule, so that people can continue to use the trains with minimal impact on their travel plans.

Pieter Bootsma of Air France KLM
Pieter Bootsma, Executive Vice-President Marketing, Revenue Management and Network at Air France KLM, explained that successfully running an airline without the use of Operations Research would be impossible. Operations Research is involved in nearly every process within the company: strategic planning, crew management, flight operations, planning of ground services and maintenance scheduling would all be nearly impossible to manage without it. Given the narrow margins in the airline industry, slight improvements in efficiency can make the difference between profit and loss. In his talk, Bootsma highlighted the use of OR in revenue management. The essence of revenue management is to use the price or availability of seats to influence customer demand. By analysing the booking behaviour of passengers, Air France KLM is able to estimate the willingness of each passenger category to pay a certain price. For example, business people are willing to pay more for a seat than leisure passengers, and they tend to book their flights closer to the actual departure date. With this knowledge Air France KLM can use the availability of seats and set the right price to maximize revenue. As a consequence, the availability and/or price of a passenger seat will vary over time. Revenue management has led to a paradigm shift in the airline business, as it focuses on maximizing revenue rather than the number of seats occupied. Many have called it the single most important technical development in transportation management, showing that Operations Research can be a disruptive technology.
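To make the seat-protection idea concrete, here is a minimal sketch in Python of Littlewood's rule for two fare classes, a textbook revenue management result. This is an illustration only, not Air France KLM's actual system; the fares, demand distribution and aircraft capacity below are hypothetical.

from statistics import NormalDist

def littlewood_protection_level(high_fare, low_fare, mean_high, std_high):
    # Littlewood's rule for two fare classes: protect y* seats for the
    # high fare such that P(high-fare demand > y*) = low_fare / high_fare,
    # assuming normally distributed high-fare demand.
    critical_ratio = low_fare / high_fare
    return NormalDist(mean_high, std_high).inv_cdf(1 - critical_ratio)

# Hypothetical figures: business fare 600, leisure fare 200,
# business demand roughly N(40, 12) on a 150-seat aircraft.
capacity = 150
protect = littlewood_protection_level(600.0, 200.0, 40.0, 12.0)
leisure_booking_limit = max(0, capacity - round(protect))
print(f"Protect about {protect:.0f} seats for business passengers;")
print(f"accept at most {leisure_booking_limit} leisure bookings.")

Because leisure passengers book earlier, the airline stops selling the low fare once the booking limit is reached and keeps the protected seats for late-booking business demand; real systems extend this logic to many fare classes, overbooking and network effects.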

Luke Disney of North Star Alliance
The impact of Operations Research on humanitarian assistance was illustrated by North Star Alliance Executive Director Luke Disney. The North Star Alliance started as a practical industry response to an urgent health problem: the spread of HIV/AIDS among truck drivers in sub-Saharan Africa, which negatively impacted the distribution of relief food to hungry communities. By establishing a network of drop-in health clinics, called Roadside Wellness Centres (RWCs), at truck stops, ports, rail junctions and border crossings, North Star Alliance offers mobile populations like truck drivers essential healthcare and information. This access to healthcare allows truck drivers to get treatment when necessary while at work, securing the distribution of relief food and road transportation in Africa. Disney highlighted the impact of Operations Research with Polaris, a model that is used to optimise the placement of new RWCs and the repositioning of existing ones, including the optimisation of staffing and inventory levels. Key in building the network of RWCs is improving the continuity of care along the trade lanes within Africa. Continuity of care ensures that truck drivers have access to healthcare everywhere they go, and that medical help is nearby when assistance is suddenly required, for example when a truck driver gets malaria or loses his pills while being treated for tuberculosis or HIV. Since financial resources are limited, the Polaris model helps the North Star Alliance to gradually build a network, achieving the biggest increase in continuity of care for each dollar invested.
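To give a flavour of that "most care per dollar" trade-off, here is a minimal greedy sketch, not the actual Polaris model (which also covers staffing and inventory): under a fixed budget it repeatedly opens the candidate site with the largest gain in a continuity-of-care score per dollar. All site names, costs and scores are invented for illustration.

def plan_clinics(candidate_sites, budget):
    # Greedy sketch: open the candidate site with the best marginal gain
    # in continuity-of-care score per dollar, until the budget runs out.
    # candidate_sites maps site name -> (cost, score_gain).
    opened = []
    remaining = dict(candidate_sites)
    while remaining:
        affordable = {s: (c, g) for s, (c, g) in remaining.items() if c <= budget}
        if not affordable:
            break
        best = max(affordable, key=lambda s: affordable[s][1] / affordable[s][0])
        cost, _ = remaining.pop(best)
        opened.append(best)
        budget -= cost
    return opened

# Hypothetical candidate sites: (cost in dollars, continuity-of-care gain).
sites = {
    "border_crossing_A": (120_000, 9.0),
    "truck_stop_B": (80_000, 5.5),
    "port_C": (150_000, 8.0),
    "rail_junction_D": (60_000, 4.5),
}
print(plan_clinics(sites, budget=250_000))

A production model would formulate this as an integer program over the whole network of trade lanes rather than a greedy pass, but the sketch captures the budget-versus-coverage logic Disney described.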


Operations Research has proved many times in the past to be the best answer to handling complexity. It came into existence during the Second World War, when mathematicians revolutionized the way wars are waged and won by applying mathematics to almost any challenge in warfare. Today Operations Research has found its way into many applications that impact business, people and society. The stories above make this very clear. You could say it is the least known, most influential scientific practice of our time.



Sunday, 1 September 2013

Will Big Data end Operations Research?

Until a few years ago the term Big Data was only known to a limited group of experts. Now it is nearly impossible to visit a website or read an article without stumbling across Big Data. A Google query returns over 1.5 billion hits in less than half a second; two years ago this was only one fifth of that number. The hits link to web pages on, for example, the increased number of visitors to a museum or the improvement of Tesco's supply chain performance thanks to Big Data. You get the impression that Big Data is everywhere. Many times Big Data is positioned as the answer to everything. It is the end of theory, as former Wired editor Chris Anderson wants us to believe. The promise of Big Data seems to imply that in the sheer size of data sets there lurks some kind of magic: when the size of the data set passes some critical threshold, the answer to all questions will come forward, as if Apollo no longer lives in Delphi but in large data sets. Has Deep Thought become reality with Big Data, and will it answer all our questions, including the ultimate question of Life, The Universe and Everything? Or a slightly simpler question, whether P = NP or not? Will Big Data end Operations Research?

(Figure: red = Big Data, blue = ORMS)
The introduction of enterprise-wide information and planning systems like ERP, together with the Internet, has led to a vast increase in the data being collected and stored. IBM estimates that 2.5 quintillion bytes of data are generated each day, and this rate keeps growing: so fast that 90% of the data available today was created in the past two years. The ability to use this data can have enormous benefits; the success of companies like Google and Amazon proves that. When sales of two items correlate at Amazon, they end up in each other's “Customers Who Bought This Item Also Bought” lists, boosting sales. The same principle is used by Google in its PageRank algorithm. Using similar techniques, Target was able to identify a correlation between the sales of a set of products and pregnancy. Using point-of-sale data and data from customer loyalty cards, this correlation was used to personalise the ads and offers sent to Target customers, upsetting a father when his teenage daughter started to get ads for diapers and baby oil. Quite a story, but is it proof of the success of Big Data? We must be cautious not to be fooled by our observation bias, as we don't know how many Target customers incorrectly received the pregnancy-related ads.
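The mechanics behind such an “also bought” list can be surprisingly simple. The sketch below is my own illustration, not Amazon's algorithm: it counts how often items appear together in the same purchase basket and recommends the most frequent companions of a given item. Real systems add normalisation and significance tests precisely to guard against the spurious correlations discussed next.

from collections import Counter
from itertools import combinations

def also_bought(baskets, item, top_n=3):
    # Count how often each pair of items appears in the same basket and
    # return the items most frequently bought together with `item`.
    pair_counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pair_counts[(a, b)] += 1
    related = Counter()
    for (a, b), n in pair_counts.items():
        if item == a:
            related[b] = n
        elif item == b:
            related[a] = n
    return related.most_common(top_n)

# Hypothetical purchase baskets.
baskets = [
    {"lotion", "vitamins", "cotton balls"},
    {"lotion", "vitamins"},
    {"lotion", "shampoo"},
    {"vitamins", "cotton balls"},
]
print(also_bought(baskets, "lotion"))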

As data is not objective, and correlation most of the time doesn't imply a causal link, we must be cautious in blindly following what the data seems to tell us. Data is created as we gather it and acquires meaning only when we analyse it; every analyst should know the many pitfalls in each of these steps. For example, the analysis of Twitter and Foursquare data from the New York area during hurricane Sandy showed some interesting results. The number of tweets coming from Manhattan suggested that Manhattan was worse off than Coney Island. However, the reverse was true, as smartphone ownership is much higher in Manhattan. Due to blackouts, recharging smartphones became impossible, lowering the number of tweets and check-ins from Coney Island even more. A similar thing happened at the beginning of the year when Google estimated the number of people infected with flu. Google estimated that 11% of the US population was infected, twice the level the CDC estimated. The overestimation was probably caused by the media hype boosting the number of Google queries on the subject. So a lot of data is not a panacea after all?

Making decisions based on correlations found in data alone can bring you benefits when you are a Target customer, but can keep you from rescue when you're living in Coney Island. Quality decision making doesn't result from data alone, let alone from a random quest for correlations in large data sets. Data is, however, an important ingredient of quality decision making. As Ron Howard suggests, quality decision making starts with framing the problem. The decision is supported by what you know (data), what you want (decision criteria, objectives) and what you can do (alternatives, requirements and restrictions). Collectively, these represent the decision basis, the specification of the decision. Logic (the mathematical model) operates on the decision basis to produce the best alternative. Note that if any one of these basic elements is missing, there is no decision to be made. What is there to decide when there are no alternatives? How to decide between alternatives when it's unclear how to rank them? If you do not have any data linking what you decide to what will happen, then all alternatives serve equally well. The reverse is also true: gathering data that doesn't help to judge or rank the alternative decisions is pointless. Data is said to be the new oil. My take is that organisations shouldn't be fixated on gathering and mining data, but should combine it with a structured approach to decision making. Then data will become the new soil for improvements, new insights and innovations.
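As a toy illustration of how the decision basis fits together (my own example, not Howard's), the sketch below encodes the three elements (alternatives as what you can do, scenario probabilities as what you know, and a payoff function as what you want) and lets the logic pick the alternative with the highest expected payoff for a simple order-quantity decision. All numbers are hypothetical.

def best_alternative(alternatives, scenarios, payoff):
    # Decision-basis sketch: alternatives are what you can do, the scenario
    # probabilities encode what you know, and the payoff function encodes
    # what you want. The "logic" maximises expected payoff.
    def expected_payoff(alt):
        return sum(p * payoff(alt, s) for s, p in scenarios.items())
    return max(alternatives, key=expected_payoff)

# Hypothetical order-quantity decision under demand uncertainty.
alternatives = [100, 150, 200]                        # what you can do
scenarios = {"low": 0.3, "medium": 0.5, "high": 0.2}  # what you know
demand = {"low": 90, "medium": 140, "high": 210}

def payoff(order, scenario, price=10.0, cost=4.0):    # what you want
    sold = min(order, demand[scenario])
    return price * sold - cost * order

print(best_alternative(alternatives, scenarios, payoff))  # -> 150

Remove any one element, the alternatives, the probabilities or the payoff, and the function has nothing left to decide on, which is exactly the point: data only earns its keep inside a complete decision basis.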



It took Deep Thought 7.5 million years to answer the ultimate question. As nobody knew what the ultimate question of Life, The Universe and Everything actually was, nobody knew what to make of the answer (42). To find the question belonging to the answer (some kind of intergalactic Jeopardy!), a new computer is constructed (not Watson), which needs 10 million years to find the question. Unfortunately it is destroyed 5 minutes before it finishes. An Operations Researcher (or Certified Analytics Professional) would probably have done a better job, starting with framing the question, gathering and validating the relevant data, constructing and calibrating a model, and finally providing the best possible answer. I already know that it isn't 42 or Big Data, but Operations Research.