
Valuing scientific research

We’ve had a few interesting discussions during the classes at Kellogg. Two provocative ones from last week stood out:

Generating return on academic investments and scientific research. One thing we were taught was how to value commercial projects by calculating their NPV (net present value). The problem with applying this approach to scientific research projects is that, on the face of it, the majority of basic research is financially a one-way street. You get grants, you spend the grants on materials and on paying students/post-docs to do the work, and then you publish. Publishing doesn’t generate income. It might help you win future grants, but that isn’t something you can easily feed into an NPV calculation. If you do more applied science, you can probably patent some of your inventions and try to license them out to recoup costs (which is basically what Tech Transfer Offices do). However, for a lot of scientific research this isn’t possible. So if you can’t put a value on your project’s worth, should you still do it? Absolutely, yes – if it is well thought out. While I do believe that scientists could do more to apply risk management and project management principles to the process of conducting scientific research, applying numerical valuation principles is clearly in conflict with the nature of academic research.
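To make the point concrete, here is a minimal sketch in Python (all numbers invented for illustration) of why the NPV arithmetic is a dead end for grant-funded basic research: every cash flow is an outflow, so the NPV comes out negative whatever discount rate you choose.

    def npv(rate, cashflows):
        """Net present value of cashflows[t] arriving at the end of year t."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    # A commercial project: an upfront cost followed by revenues.
    print(npv(0.10, [-100_000, 40_000, 50_000, 60_000]))

    # A (hypothetical) basic-research project: the grant is spent on
    # materials and on students/post-docs, and the publications at the
    # end bring in no cash at all, so the NPV is negative by construction.
    print(npv(0.10, [-60_000, -30_000, -30_000, 0]))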

Fundamental scientific research which may not have a calculable NPV, and whose worth cannot be foreseen, is still necessary, since this research can drive innovation. The beauty of the university system is that scientists and engineers have the freedom to follow far-out ideas (although some may argue that this is becoming increasingly impossible with the squeeze on funding and tenure opportunities). Companies, on the other hand, are driven by profit generation, which means that they won’t take on projects with a negative NPV (such as basic science research). What they can do is build on the ideas generated at universities and turn them into tangible and financially viable projects. Some great examples of this co-dependency and its effect on innovation are described in Nathan Rosenberg’s book (check out the recommended reading tab).

Another interesting point made the other day was: “Today, there is no shortage of capital in the world, merely a shortage of good ideas that deserve that capital” – Prof. Effi Benmelech (paraphrased). To some extent I would agree with that comment. Many of the technological advances in the world over the last few decades have been incremental. From the point of view of scientific research funding, though, I would actually disagree and suggest instead that perhaps the distribution of capital and the metrics for assessing good ideas are skewed, rather than there being a lack of good ideas.

Kellogg – week 4

The last two days of classes on the Kellogg course (today and one day last week) have probably been the toughest so far. We have had 7.5 hrs of Finance, 4.5 hrs of Operations Management and 3 hrs of Risk Management and Competitive Strategy.

Finance with Efraim Benmelech has been the hardest of these to grasp, despite my having taken 10 credits of courses in Investment Planning and Financial Planning at Åbo Akademi a few years ago. However, I am really grateful I took those classes at ÅA. They gave me the chance to practice the process of actually valuing projects and allowed me to become familiar with the concepts; I would have been so much more lost without that background. For the Kellogg course we were given a crash course in how to value projects and companies using Net Present Value (NPV), the Capital Asset Pricing Model (CAPM) and the Weighted Average Cost of Capital (WACC). In particular, we were shown the pitfalls of including the wrong items in a valuation and of allowing outside influences to cloud our decisions. The case studies in the Kellogg course were more complex than those I had used at ÅA, with many more variables added in to consider (and also to confuse!) when thinking about project valuation. Something that resonated with me was the concept of ignoring what are called sunk costs. Sunk costs are investments, such as equipment or buildings, that have already been made before the decision to do a project is taken. The mistake many people make is to take these prior investments into account when deciding whether to do a project. From a financial perspective this is the wrong thing to do: we should only consider the future cash flows of a project, not what has already been spent.
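To pin down the mechanics, here is a minimal sketch in Python of that chain, with every number invented for illustration: CAPM gives the cost of equity, WACC blends it with the after-tax cost of debt to give a discount rate, and the NPV is then computed over future cash flows only, with the sunk cost deliberately left out.

    # CAPM: cost of equity = risk-free rate + beta * market risk premium
    r_f, beta, r_m = 0.03, 1.2, 0.08
    r_e = r_f + beta * (r_m - r_f)                   # 0.09

    # WACC: weight the cost of equity and the after-tax cost of debt
    E, D = 600_000, 400_000                          # market values of equity and debt
    r_d, tax_rate = 0.05, 0.30
    wacc = E / (E + D) * r_e + D / (E + D) * r_d * (1 - tax_rate)

    # NPV: discount *future* cash flows at the WACC. The 50,000 already
    # spent on equipment is a sunk cost and is deliberately excluded.
    sunk_cost = 50_000                               # ignored in the decision
    future_cashflows = [-120_000, 60_000, 60_000, 60_000]   # years 0-3
    npv = sum(cf / (1 + wacc) ** t for t, cf in enumerate(future_cashflows))
    print(f"WACC = {wacc:.1%}, NPV = {npv:,.0f}")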

Operations Management with Gad Allon was really interesting. We were introduced to “Lean Operations”, a concept similar to the ideas described in the bestselling book “The Lean Startup” by Eric Ries, widely considered a must-read for anyone working with startups (something I should get around to reading, after I finish “Studies on Science and the Innovation Process” by Nathan Rosenberg – a book that is getting sandy since I started taking it to the beach with me!). The idea behind Lean Operations is that you map the process of how you are doing something, such as manufacturing a laptop. You then look at where waste is occurring in your process and take steps to eliminate it, e.g. moving your suppliers closer to your main factory so you don’t waste time and money on transportation – something that Dell embraced. Another example would be to consider where you have quality control in your process. Do you only have a quality control step at the end, or is your quality control at strategic points in your process, so that you can catch mistakes along the way and rectify them rather than only discovering them at the end? We actually did a practical exercise to illustrate this concept. Thinking back to my PhD, this concept will be familiar to anyone undertaking scientific experiments – for example, when cloning a plasmid. Do you perform all the steps of the cloning at once and only check at the end whether it worked? Or do you spend a bit of time between steps to check that you are actually getting the cloning products you expect along the way? While the latter way is slower, at least you have the chance to catch where in the process a technique has failed.
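As a rough illustration of that quality-control trade-off, here is a small Python simulation (the number of steps, the failure rate and the notion of “wasted work” are all invented for illustration) comparing a process checked only at the end with one checked after every step.

    import random

    def wasted_work(steps=5, p_fail=0.1, check_each_step=True):
        """Units of work spent on or after the first silently failing step."""
        for step in range(1, steps + 1):
            if random.random() < p_fail:
                # With per-step checks the failure is caught immediately, so
                # only the failed step is wasted. With an end-only check,
                # every remaining step is also done blindly and wasted.
                return 1 if check_each_step else steps - step + 1
        return 0  # every step succeeded

    random.seed(42)
    trials = 100_000
    for mode in (True, False):
        avg = sum(wasted_work(check_each_step=mode) for _ in range(trials)) / trials
        print(f"check each step = {mode}: average wasted work = {avg:.2f} steps")

The per-step checks cost extra time, of course – the point is simply that the expected rework shrinks when failures are caught where they happen.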

The last set of sessions today were in Risk Management with Nabil Al-Najjar. This was a very engaging set of lectures. It touched on some of the principles we learnt in the Finance lectures, but then introduced us to some of the tools used in the process of Risk Management. One of the most useful was the Pre-mortem. Often companies are unable to evaluate areas of risk, perhaps because they are clouded by hubris, or because the organisational structure means that not all voices get heard during planning meetings. An example would be a junior team member who feels unable to voice a dissenting opinion about a project because they are afraid of looking unsupportive. It is also quite difficult to look forward and consider all the different things that could go wrong with a project. One way to resolve this is to perform a pre-mortem, which is based on the idea that hindsight is 20/20. The team is given a scenario in which their project has failed spectacularly at some point in the future. They then put themselves at that future point and look back, thinking of all the ways the project could have failed – from the mundane to the quite extreme but still plausible. We actually practiced this, and it was surprisingly easy to look back rather than forward.

What I am really enjoying about the Kellogg courses is the wide variety of subjects we are being exposed to. It’s a shame we do not get to practice everything as thoroughly as you would in a full MBA program. However, I think this course gives us, as future managers and project leaders, a solid grounding in the various tools and processes used in the corporate environment.