Research Software Engineering conference 2018 #RSE18

On 3rd September the peaceful campus of the University of Birmingham came alive with bubbling groups of research software engineers, talking in excited tones about their latest optimisation tools and favourite Python libraries, as the third annual conference of Research Software Engineers got underway!

A real global affair, #RSE18 had 314 delegates from 12 countries: nearly a 50% increase over last year's attendance, and a 7% increase in women attending compared to 2017.

Big News!

The UK RSE Association is becoming the Society of Research Software Engineering: a legal, independent, professional organisation!

The UK RSE Association has seen significant growth since its inception in 2013, to over 1000 members. The community's growth has made the informal, volunteer-run format unsustainable. The move will enable the society to hold funds, employ staff, and operate as an independent organisation representing the interests of the RSE community. Visit the RSE website for more information and to sign up for updates.


Eleanor Robson (Professor of Ancient Middle Eastern History and Head of the History Department at University College London) kicked the conference off with the first keynote, about Nammu and Oracc, two digital humanities software packages. Nammu is an easy-to-use text editor for cuneiform inscriptions (cuneiform is a writing system dating from c. 3500-3000 BCE). Oracc is a collaborative online publishing platform for cuneiform transcriptions. Eleanor and a team of RSEs from UCL are developing these tools to aid the protection, conservation and documentation of some of the world's oldest writing. It was exciting to hear how computational approaches can be developed in all subject areas, even ancient history!

The second keynote of day 1 was given by Andrew Fitzgibbon from Microsoft Research on “Building computer vision systems that really work”. He spoke about the technology and software behind Microsoft’s HoloLens, and how it grew from 2D point mapping to real-world camera tracking and in-situ generation of 3D geometry. Andrew talked passionately about how ‘research’ and ‘engineering’ should always be done with a real-life application as the end goal; it’s how our generation can positively affect the future through the science that we do.

Later in the day I got a chance to try the Microsoft HoloLens. It was incredibly interesting to see how it mapped the environment around me in real time. It was relatively robust with flat surfaces such as floors, stairs, and walls, but had trouble picking individual people out in a busy environment. It took a few minutes to get used to the ‘pinch’ controls, but once I got to grips with them, manipulating the augmented environment was quite intuitive. Other than the field of view being incredibly small, I can see lots of applications for this kind of tech. Excited to try the next iteration!

Day 2 was kicked off by James Howison from the University of Texas at Austin talking on sustainability in scientific software. As software has become more important to the practice of research, policy makers and researchers have become concerned about a perceived lack of sustained benefit from software produced by grant-funded projects. James noted that only ~20% of the cost of a code’s lifetime is in development; the rest is maintenance. The software ecosystem of a development project is complex, and sustainability should be considered at every step, from proposal to publication. Academic software has a unique issue: standalone branches developed for a particular area of research are often never merged back to the master branch, leaving unfinished and unsupported strands of a software package floating free.

Andreas Fidjeland from Google’s DeepMind finished the conference’s keynotes with a talk on how machine learning and AI are transforming research. DeepMind’s scientific mission is to push the boundaries of AI, developing programs that can learn to solve complex problems without needing to be taught how. Andreas talked about how AI was trained to play Go, a game with very simple rules but an incredibly complex space of possible moves. World-leading Go players play with intuition, so it is not a trivial task to convert their logic into quantities that can be evaluated.

Talks & Workshops

Between the keynotes (and networking over snacks and coffee, of course!) various parallel sessions were held. These consisted of both workshops and talks and ran over the two days. The general theme of the sessions that I attended was enabling better software through guidelines and tools; the conference had a big emphasis on knowledge and skill sharing to better equip RSEs in their roles. This was echoed in the talks and especially the workshops, which focused on best practices for programming languages such as C++ and Python, and discussed technologies such as containerisation and machine learning.

Another of the conference’s aims was to help foster a welcoming and connected community with RSEs regardless of location. This was done through a series of talks that focused on community and how to build a thriving group. Not just a community between fellow RSEs but with funders and wider research fields.

Diversity was another theme, and a workshop was held to discuss best practices for inclusive hiring, making us think about how to reach the best staff for the job and give everyone an equal chance throughout the interview and selection process.

Among the many discussions that took place, the issue of recognition and reputation was brought up many times. The general consensus was that RSEs are under-recognised for their contribution to academic research. It is not commonplace to be acknowledged for work completed on software used by an academic, so it can be difficult to build an external reputation and measure the impact of your work. One of the goals of the RSE Society is to put in place a recognition scheme on a par with publication.

High Performance Computing Workshop

Aiman Shaikh and I stayed an extra day in Birmingham to attend the one-day RSE in High Performance Computing workshop. Approximately 10% of conference delegates stayed for this extra workshop. Its aim was to bring together RSEs working in the HPC space.

The day was split in two, with talks from tier-1 and tier-2 HPC representatives in the morning and a discussion on HPC Champions in the afternoon. The talks covered a wide range of HPC topics, from open source benchmarking tools to developing collaborations through hackathons. After some interesting talks (and more coffee and pastries!) we were introduced to HPC Champions.

HPC Champions is the next step on from the ARCHER Champions programme. An ARCHER Champion is an individual within a regional or national centre who can support and advise users on how to access the HPC resources available to them in the UK. They can also be an RSE who works with a group of HPC users to provide support and skill sharing within that group. HPC Champions will have a similar remit, but with more emphasis on championing excellent HPC software practices. The programme is not just for RSEs within the HPC space; its aim is to be inclusive of all who work with HPC: RSEs, supporters, researchers, and industry. HPC Champions will also be linked to HPC UK.

Final Thoughts

Overall, it was a great experience attending RSE18! Being surrounded by like-minded people who had gathered to openly discuss and share resources and skills was invigorating. There is a real community developing for RSEs, and it is exciting to be here at the start of the RSE Society. Topics ranged from best practices, to better user interfaces, to programming tools for easier and faster coding, to cutting-edge science. I particularly enjoyed discovering how what I thought was a purely scientific role extends to a much larger range of subjects.

Tim Powell




Work experience at the Hartree Centre

In this post we talk about developing activities for the Hartree Centre work experience programme and what happened when we challenged 6 students to work together to build a 20-node mini supercomputer.

STFC runs a work experience programme every year with applicants expressing an interest in placements within the centre. Initially, we had a view of taking just one student to join our Future Technologies team but after hearing about other placements, we wanted to move away from the ‘lone student’ experience and offer a group-based opportunity. We hoped that this would show students how we operate here in multi-disciplinary teams working together to solve challenges. As a result, 6 students from local colleges joined us for 2 weeks to find out more about life here at the Hartree Centre.

Student experience

We were certain that we wanted the group to gain an insight into the breadth of projects undertaken by the Hartree Centre, to receive an introduction to parallel computing and data science, and to have a shared, enjoyable and valuable learning experience. We also wanted to make sure students were able to work towards something tangible: an ‘end product’ with a lifespan beyond the work experience programme.

Our idea was to use the students’ experience of building a 20-node Raspberry Pi cluster to build upon existing learning modules we had started to create for a local school, culminating in a set of resources available to our local community. The guidance notes on Wee Archie from our colleagues at EPCC helped direct our thoughts on how we could structure the programme for students from a variety of academic backgrounds.

Continue reading “Work experience at the Hartree Centre”


Shaping IT Service Management at The Hartree Centre: Service Design

The third in a series of blog posts from Dave Cable, Head of Service Operations here at The Hartree Centre, gives us an introduction to service design, transition, configuration management and change management.

In my previous post, I described the key aspects of the ITIL Service Operation area that we have implemented at the Hartree Centre.  In this post, I’ll move on to Service Design and Service Transition.

What is Service Design?

The ITIL area of Service Strategy considers all the business requirements for IT services, and from them constructs a high-level view of the range of services to be offered.  Service Design turns this high-level portfolio into a set of service specifications for inclusion in the organisation’s Service Catalogue.  It takes account of the requirements for information security, availability and capacity.  Service catalogue entries also include details of standard service levels (SLA metrics) and provide, where appropriate, pricing information.  Note that non-standard service levels may be negotiated with individual customers.
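To make the idea concrete, here is a minimal sketch of what a service catalogue entry might capture. The service name, fields and figures are entirely hypothetical, not actual Hartree Centre values or an ITIL-mandated schema:

```python
# Hypothetical service catalogue entry: every field name and value here is
# illustrative only, chosen to mirror the aspects mentioned above
# (security, availability, capacity, SLA metrics, pricing).
catalogue_entry = {
    "service": "Batch compute (HPC)",
    "description": "Scheduled access to shared HPC resources",
    "security": {"data_classification": "official"},
    "availability_target": 0.995,         # fraction of time the service is up
    "capacity": {"max_cores_per_job": 1024},
    "sla_metrics": {
        "incident_response_hours": 4,     # time to first response
        "incident_resolution_hours": 48,  # time to resolution
    },
    "pricing": "on request",              # non-standard levels negotiable per customer
}

def meets_availability(uptime_fraction, entry):
    """Check a measured uptime against the catalogue's availability target."""
    return uptime_fraction >= entry["availability_target"]

print(meets_availability(0.997, catalogue_entry))  # True: above the 99.5% target
```

The point of holding entries in a structured form like this is that SLA reporting can then be automated against the same targets the customer signed up to.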

Continue reading “Shaping IT Service Management at The Hartree Centre: Service Design”


Diversity & Inclusion in HPC

High Performance Computing (HPC) and High Performance Data Analytics (HPDA) – the province of the Hartree Centre – are rapidly expanding areas of importance to academia and industry, with myriad new employment opportunities arising. It is predicted that the gap between supply and demand of skilled staff will continue to grow. Despite the fact that women make up 51% of the population, on average only around 15% of people working in IT are women, and the proportion working in HPC and HPDA is even lower. When taken in conjunction with recent evidence that diverse teams and organisations outperform less diverse competitors, there are sound business reasons why diversity and inclusion are a priority, as well as moral and social imperatives.

I am one of the founders of Women in HPC, which was formed in the UK by a small group of women who were interested in exploring the reasons why so few women were working in all areas of High Performance Computing. From small beginnings, it has grown into an organisation and network with global reach, holding programmes of events at the major international supercomputing and IT conferences.

Continue reading “Diversity & Inclusion in HPC”


EuroEXA: working together to build an exascale demonstrator

As proud members of the European HPC community, I think it’s safe to say our efforts to achieve a world-class extreme scale, power-efficient and resilient HPC platform are ambitious. We’re working towards a machine that can scale to 100 petaflops.

This three and a half year, 20 million euro Horizon2020 funded project has been designed to answer these challenges:

  • How do we build an exascale machine within a sensible energy budget?
  • How do we design something so that we’re not moving huge amounts of data around?
  • How do we achieve our ambitions cost-effectively?
  • How do we deal with all of the complexity associated with running applications on a machine of that size?

First of all, it’s important to note here that we’re not going to be starting from scratch. EuroEXA will build on previous projects that have demonstrated smaller elements of our community’s ambitions. This learning has directed the approach to EuroEXA. Professor John Goodacre, based at The University of Manchester, is leading the project and has pulled together a consortium of 40 industry and academic partners from across Europe. Each project partner will play a fundamental role in bringing together key components of this undertaking. We’ll explain the specific role we’ll have here at the Hartree Centre later on.

Continue reading “EuroEXA: working together to build an exascale demonstrator”


Shaping IT Service Management at The Hartree Centre: Service Operation

The second in a series of blog posts from Dave Cable, Head of Service Operations here at The Hartree Centre, gives us an introduction to Service Operation, the primary interface for service delivery with customers.

In the first post of this series, I gave a brief description of IT Service Management and the specific implementation we have adopted, known as ITIL.  In this post, I describe how we have implemented one function and three key processes from the ITIL area of Service Operation.

What is Service Operation?

Service Operation is the collection of processes and functions that describe how to deliver services to customers at agreed levels.

Why is it important?

Service Operation represents the primary interface for service delivery with customers.  As such, it can win or lose business.  It also helps the service provider, by providing clear mechanisms for prioritising customer requests for assistance, and tools to identify deep-rooted issues that require additional effort to resolve.

Continue reading “Shaping IT Service Management at The Hartree Centre: Service Operation”


Inspiration, ideas and innovation: Girls in Tech outreach event

In this post, Katharina Reusch, a Software Engineer from IBM Research takes us through their second annual ‘Girls in Tech’ event held on Ada Lovelace Day.

It was that time of year for the second annual “Girls in Tech” outreach event, organised by Katharina Reusch from IBM Research in collaboration with the Science and Technology Facilities Council (STFC). The event was sponsored and initiated by IBM UK Foundation (our Early Professionals Programme for Graduates, Apprentices, Interns and Futures) and the IBM Girls Who Can team. Girls Who Can is a support network within IBM UK Foundation, with the aim of providing a healthy and positive environment where not just women, but the whole workforce, can prosper and fulfil their potential. After a successful trial event with 80 girls back in October 2016, we decided to go even bigger this year and run a joint event at STFC’s Daresbury Laboratory (DL) and Rutherford Appleton Laboratory (RAL) campuses, with 90 girls at each site, aged 12-13.


We had a busy day, packed with activities to introduce the girls to our cutting-edge technologies and where our products fit in everyday life along with our aspirations for where future technologies can make an impact. This was illustrated with demonstrations of IBM and STFC projects currently underway in the UK.

The girls also had a chance to quiz us in a career Q&A session (the most popular session on the day!), to understand how to get into a technology career with all the different avenues available to them, from work experience, apprenticeships, graduate schemes and professional career development.

But a day learning about technology is nothing without a bit of hands-on experience: in the Arduino coding challenge, the girls had to code and wire up a temperature sensor for the Ada Lovelace Earth Observation Satellite. Again, this proved to be a very popular session, with great feedback from volunteers, teachers and pupils alike.

“Science and innovation wouldn’t be possible without inspired minds, great ideas and grand challenges.”

Science and innovation wouldn’t be possible without inspired minds, great ideas and grand challenges, so for the third activity we set the girls a 60-minute innovation challenge: come up with an innovative idea, outline a prototype and give a 1-minute elevator pitch to everyone in the big lecture theatre at the end of the day. We were all amazed by the creativity, imagination and truly innovative ideas the girls came up with – we even noted some down for our own work! They covered a wide spectrum, from robots organising your daily schedule at home and medical robots for the elderly, to smart microwaves and self-learning hair salons.

The winning team at Daresbury invented “Reflect and Select”, a smart mirror in which you can try on online shopping items virtually in the mirror and purchase with one click – who would not buy into that idea? The winning team at RAL introduced a hovering wheel chair to allow disabled people a new found freedom in movement, a wonderful example for “out-of-the-box” thinking!

Throughout the day, the positive spirit and excitement caught everyone: volunteers, teachers and girls. Our IBM staff “had a blast working with the girls, such an inspiring crowd!” and said “the RAL event was excellent and even I felt inspired by all the science and technology on-site.” Teachers confirmed that “it was a great day and the girls enjoyed it; they were clearly talking more about the subject on the way home than going” and Dianne Kennedy from St. John Plessington High wrote to us after the event: “Thank you for the really enjoyable day. The pupils really enjoyed the experience, hopefully this will encourage them to think about choosing a STEM subject” and Ruth Harrison from Lowton High School thought:

“the balance was right, it was wonderful to see young, vibrant, bright women inspiring our girls to think about a career in STEM and raise their aspirations –  whatever their academic ability.”

This feedback was also borne out by the numbers: 77% (DL) / 80% (RAL) of girls said they now want to find out more about STEM when they get home. We also asked whether the event made them more likely to consider a science or technology degree at university or an apprenticeship, with 53% (DL) / 63% (RAL) saying they now felt more encouraged and 32% (DL) / 19% (RAL) already considering this as a career choice.

We were so pleased with the feedback received from teachers and girls and are keen to plan the next event to inspire even more young pupils to join us in a truly rewarding career choice!

Last but not least, a big shout out to the IBMers Houda Achouri, Kashif Taj, Georgia Prosser and Jenni Marr, and STFC’s Sophy Palmer, Phill Day and Wendy Cotterill, who helped make the event possible, and to the helpers on the day: Georgia Clarkson, Malgorzata Zimon, Blair Edwards, Martyn Spink, Lan Hoang, Flaviu Cipcigan, Anna Paola Carrieri, Dave Cable, Navatha Tirungari, Rob Allan, Roger Downing, Laura Johnston, Holly Halford, Gemma Reed, Julia Game, Shannon Wilson, Olivia O’Sullivan, Lisa Whimperley, Peter Kane, Greg Corbett, Tom Dack, Jeremy Spencer, Louise Davies, Tom Byrne, Chris Oliver, Jacob Ward, Mostafa Nabi, Sarah James, Rosie Davies, Kate Winfield, Eilidh Southren, Kyle Birtwell, Lauren Mowberry, Vicky Stowell, Dave Wilsher, Manny Olaiya, Preeti Kaur, and Ffion Argent.

Continue reading “Inspiration, ideas and innovation: Girls in Tech outreach event”


Shaping IT Service Management at The Hartree Centre: Introduction

The first in a series of blog posts from Dave Cable, Head of Service Operations here at The Hartree Centre, gives us a gentle introduction to the world of IT Service Management. Look out for future posts covering service operation, service design, and continual service improvement.

What is IT Service Management?

IT Service Management (ITSM) is the proper design, governance and operation of IT-related services to meet agreed customer needs within predictable cost and efficiency bounds.  It brings together policies, processes and people with the common goal of service delivery and continuous improvement.

Why is it important?

Any IT service provider needs a clear idea of what it is they are trying to deliver and to whom.  The provider also needs to understand the costs of providing services alongside any financial returns.  ITSM provides a mechanism for businesses to be able to meet these requirements.

Continue reading “Shaping IT Service Management at The Hartree Centre: Introduction”


From a Computing GCSE to being Deputy Director

“Life is like a large pond, you are surrounded by lilypads and depending on your capabilities and circumstances you have to pick the next one to step onto.”

When I was younger, growing up in Wigan I was mainly interested in three things: football, computers and radio control cars. At school, I decided to study A Levels in maths, physics and chemistry and then went off to study chemistry at the University of Leeds with no fixed idea of what I wanted to do or where I was going afterwards.

After a period of unemployment, I was lucky enough to get a job as a Research Chemist with Crosfield, a Unilever company at the time. This involved working with Crosfield silica to remove protein from beer, essentially increasing the shelf-life of the product. To me, this was great, I was a beer scientist at the age of 21! I enjoyed the challenge of working on new formulations and eventually discovered a way of improving the shelf-life of beer using 50-70% less material than previous methods. At first, the brewers we worked with did not seem to buy in to the idea so the sales staff invited me out with them to explain the process to our customers. That was my first taste of sales and I really enjoyed it so I started to try to go out with the sales team as much as I could.

My next ‘career leap’ was into telesales, and this turned out to be a terrible idea as it really did not suit the way I liked to work and how I liked to develop customer relationships and insight. From there, I went to work for Dionex in a regional sales role with a remit for selling chromatography columns that separate chemical components. It was this position that helped me to recognise that I was actually quite good at sales, and it taught me an important point:

“people do not just buy kit, they buy answers to the problems they want to solve.”

This led me back to my interest in computing where I taught myself how to use a macro-based scripting process that increased the efficiency of the sales process, helping me to match solutions to customer problems.

After several years in London, I re-located back to the North West and found a job with the Science and Technology Facilities Council (STFC) project managing an e-Science programme that looked at the application of web-based computing technology for solving real-life industrial problems. It was at this point that I decided it would be useful to return to education and studied part-time at Manchester Metropolitan University towards an MBA. It was great to be able to examine problems I encountered at work in an academic context. I focused on methods for lowering the barriers to innovation for industrial engagement which meant that, naturally, my job role changed from project management to industrial liaison where I helped understand the user requirements for a new data catalogue system at the Diamond Light Source.

My MBA also helped me to understand the essential technical elements of my role such as licensing and commercial requirements. These skills were utilised heavily during my next position as a Business Development Manager, where I worked to develop strategic industry partnerships for computational modelling research.

Four years ago, I joined the newly established STFC Hartree Centre as Head of Business Development with responsibility for a team that encouraged collaboration between industrial and research partners to extract value from big data, HPC and cognitive technologies for societal and economic benefits.

For me, the best thing about the Hartree Centre is working with clients to understand how to integrate technology into their organisations, ensuring that it works in the best way for them.

Clearly they thought I was doing something right because 18 months ago I was promoted to the Hartree Centre’s Deputy Director, which means I am more involved in strategic decision-making, stakeholder and project management, and still get to dabble in some of the business development projects and partnerships I cultivated in the early days. It’s been a real pleasure to watch the centre grow from a small team of people in its infancy to a 50-strong department that has become one of the jewels in STFC’s crown and a key asset to the Sci-Tech Daresbury campus and the North West. My experience as a deputy director has also provided me with exciting new challenges and opportunities for development in a rewarding position at the very forefront of digital transformation.

My advice to anyone would be to make the most of the opportunities available to you, recognise what you enjoy and what you’re good at and never stop learning or challenging yourself throughout your career.

Good luck!

Michael Gleaves
Deputy Director, STFC Hartree Centre


Michael was invited to participate in a panel discussion on academic career development at the Business of Science Conference which was held in Manchester on 18 May 2017. Other panellists included:


Shaping the Northern Powerhouse

Delegates attending APM Project Management Conference 2016.
Image credit: APM

The Association for Project Management (APM) recently held their first Manchester based conference, and the Northern Powerhouse initiative by UK Government was their key theme. Claire Trinder and Lisa Booth from our Programme Management Office attended the event, and it got them thinking about where the Hartree Centre fits in.

“If the Northern Powerhouse were a country, it would be amongst the biggest economies in Europe. If we can make this region an economic powerhouse, the whole of the UK will benefit.”

Philip Hammond, Chancellor of the Exchequer

It sounds simple enough when you put it like that, but as we discovered at the APM conference, there’s a lot more to unlocking the benefits of the Northern Powerhouse than meets the eye.

The event, held in early December 2016, zeroed in on the developments in infrastructure, communication and technology projects that are being designed to re-balance the UK economy in line with the government’s Northern Powerhouse vision laid out in its strategy document. In summary, the Northern Powerhouse is a vision for a more joined up region in which northern towns and cities work collaboratively, sharing skills and resources to unlock the economic potential of the area.

The APM Conference placed a strong emphasis on the importance of organisations, particularly those from the public sector, providing an infrastructure for businesses of all shapes and sizes to build upon. This was suggested with a view to building an environment that encourages and fosters innovation across three key sectors: manufacturing, energy and health – with the integration of the digital technologies sector spanning across them all.

The day got us thinking about our own work at the Hartree Centre and the role we are playing in supporting the Northern Powerhouse strategy, both in terms of our technologies and capabilities as well as the way we manage and deliver our projects.

Advanced manufacturing

Over a quarter of the UK’s total manufacturing output comes from the North and innovation within manufacturing is going to be critical to the success of the next industrial revolution. This is why we are proud to be partners in LCR4.0, a new business support programme that will help manufacturing SMEs in Liverpool City Region to take advantage of the opportunities afforded by digital technologies. This project really emphasises the importance of collaboration and synergising regions to encourage technological innovation.

Healthcare innovation

Another prime capability for the North that is on the Northern Powerhouse agenda is health. This encouraged us to think about the ways in which the Hartree Centre are working to support innovation in healthcare. In particular, as part of our Alder Hey cognitive hospital project, we are working in collaboration with Alder Hey Children’s Hospital to develop a technology using the IBM Watson cognitive computing system to help revolutionise personalised medicine, putting patients at the centre of their care. This work links in with the Northern Powerhouse strategy by sharing capabilities, helping organisations to process large amounts of data, extracting the most relevant parts and transforming the information to create useful and relevant personal insights.


Energy efficient computing

As the capability, complexity and sheer number of devices increase, supporting growth in digital technologies, demands on energy are growing too. At the Hartree Centre, we are proud to be at the forefront of energy efficient computing research, exploring power use in computing and devising ways to achieve step changes in energy efficiency without compromising on performance. This work will hopefully feed into the Northern Powerhouse by supporting the digital revolution, both environmentally and financially, enabling us to do more.

Final thoughts

The APM conference not only prompted us to think about how our work might support the Northern Powerhouse strategy, but also the diverse programme of speakers afforded us the opportunity to understand how other organisations are working towards the same agenda.

“The event provided a great insight into how the profession of project management is changing and how having skilled project professionals is critical for the success of any large project or programme. It is clear that the Hartree Centre will have an important role to play in supporting the Northern Powerhouse strategy and central to that will be our skills not only in the technologies but also in project management.”

Claire Trinder, Head of Programme Management

“I was really pleased to listen to the talks and realise how well the work that we are doing at Hartree is aligned with the Northern Powerhouse strategy.  Being a relatively new Programme Management Office, I think we also heard some valuable lessons learnt and took away ideas for how to improve upon our own project delivery. Hopefully the Hartree Centre can be part of the programme at next year’s conference!”

Lisa Booth, Partner Services Officer


Bringing big data to life | TechUK’s Big Data in Action Roadshow comes to Manchester


Last week, the Hartree Centre sponsored TechUK’s Big Data in Action Roadshow in Manchester, held as part of a series of events across the UK to demonstrate the tools and technologies available for businesses to use, explore and get value from their data. Read on to find out how the day unfolded. Continue reading “Bringing big data to life | TechUK’s Big Data in Action Roadshow comes to Manchester”


An experiment, of sorts… improving my health and well-being in the workplace

Dawn Geatches

Dawn Geatches, Project Scientist at the Hartree Centre, has been actively trying to improve her health in the workplace. Here, she shares her experience:

Recently I have been carrying out an experiment. Given that I work in science at STFC and have the pleasure of working on some very exciting Hartree Centre projects, you might rightly say “So what? You’re a scientist and they do that!” Continue reading “An experiment, of sorts… improving my health and well-being in the workplace”


Through the gears: boosting car industry competitiveness


The visualisation facilities at the Hartree Centre have been used to help car manufacturers cut time and cost from their innovation processes

Now that the summer break is pretty much over (what was that I hear some of you shout?), I thought it was time for us to publish another post on here. In this post I touch a little on the automotive industry.

The automotive industry is one of those sectors that countries tend to use as a barometer of their overall industrial and economic performance. In the UK, the sector enjoyed a pretty buoyant 2015 all things considered. Continue reading “Through the gears: boosting car industry competitiveness”

Introducing HPiC, the Hartree Centre’s Raspberry Pi Cluster

The Hartree Centre has a new pocket-sized addition to our data centre! One of our Research Software Engineers, Tim Powell tells us all about it…

HPiC has been created as a host for software demonstrations and for outreach events. It simulates a supercomputer by networking together 20 Raspberry Pi 3 Model Bs, allowing them to communicate and execute parallel programs.

The Raspberry Pi is a low-cost, low-power, single-board computer designed to make computer science more accessible to amateur developers, schools, and developing countries. First released in 2012, Raspberry Pis can be used for a wide range of applications – from robotics, to music streaming, to smart mirrors! The incredibly versatile Raspberry Pi 3 has a quad-core 1.2 GHz ARM processor at its heart, 1 GB of RAM, WiFi and Bluetooth capabilities, and a whole host of device connectivity via a GPIO connector.

HPiC replicates high performance computing (HPC) techniques and can perform over 1,000 million instructions per second. HPiC has 19 ‘worker’ nodes (1 node = 1 Raspberry Pi), each with a quad-core ARM processor, giving 76 cores to utilise for parallel computing. The remaining node is the ‘head node’, which lets us interact with the cluster and submit jobs to the workers.
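The head/worker pattern is easy to sketch in Python. This is an illustrative stand-in only (the names are made up, and real HPiC jobs would be launched with MPI or a scheduler); here, worker threads play the role of the 19 Pi nodes:

```python
# Illustrative sketch of the head/worker pattern: a 'head node' farms
# jobs out to a pool of workers and gathers the results in order.
from concurrent.futures import ThreadPoolExecutor

def run_job(job):
    """A trivial stand-in for the work a node would actually do."""
    return job * job

def head_node_submit(jobs, n_workers=19):
    """Distribute jobs across the workers and collect the results."""
    with ThreadPoolExecutor(max_workers=n_workers) as workers:
        return list(workers.map(run_job, jobs))
```

Swapping the thread pool for MPI ranks or a batch scheduler changes the machinery but not the shape: the head node distributes work and collects results.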

HPiC’s case is built to mimic the Hartree Centre’s machine room. Currently, there are two demos available on HPiC: a Smoothed Particle Hydrodynamics (SPH) Simulation and a Mandelbrot Set Race (PiBrot). Both of these show key supercomputing techniques and we’re sourcing more demos at the moment.

The SPH simulation in action

The SPH simulation shows how water interacts in a variety of environmental conditions by changing gravity, viscosity and density, etc. The simulation runs in parallel on several nodes at the same time by utilising domain decomposition. This means that each processor is assigned a different part of the simulation space. A dynamic load balancing algorithm adjusts the domains to ensure each processor has approximately the same volume of fluid (or number of SPH particles), maximising the performance of the simulation.
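The balancing idea can be sketched in a few lines of Python. This is an illustrative 1-D version with invented names, not the actual demo code: sort the particles along one axis, then place domain boundaries so each worker owns roughly the same number of particles.

```python
# Illustrative 1-D domain decomposition with load balancing: boundaries
# are placed so each of n_workers owns a near-equal particle count.

def balance_domains(particle_x, n_workers):
    """Split sorted particle positions into n_workers contiguous domains
    with (near-)equal counts; return the boundary positions between
    neighbouring domains."""
    xs = sorted(particle_x)
    per_worker = len(xs) / n_workers
    boundaries = []
    for w in range(1, n_workers):
        # place the boundary midway between the last particle of worker
        # w-1 and the first particle of worker w
        idx = round(w * per_worker)
        boundaries.append((xs[idx - 1] + xs[idx]) / 2)
    return boundaries
```

In the real simulation this rebalancing happens repeatedly as the fluid moves, so a worker whose domain empties of particles hands space to its busier neighbours.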

Mandelbrot set calculation race underway. Left: single node, right: 18 nodes

PiBrot is a race… a race to calculate a Mandelbrot set. One side has an advantage, however: it uses 18 nodes to calculate the set, whereas the other side uses only a single node. This demonstrates how, with proper parallelisation of code, mathematical calculations can be spread across several nodes to speed up the process.
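The trick behind the race is easy to mimic. Here is a hedged sketch (invented names and grid, not the PiBrot source): each worker computes only its share of the image rows, and the strips are merged at the end. Splitting across 18 workers produces exactly the same image as one worker – just, on real hardware, much faster.

```python
# Illustrative row-wise parallel Mandelbrot: rows are dealt out
# round-robin to workers, computed independently, then merged.

def escape_time(c, max_iter=50):
    """Iterations before z = z**2 + c escapes |z| > 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

def mandelbrot_rows(rows, width=40, height=20):
    """Compute escape times for the given image rows only."""
    out = {}
    for j in rows:
        out[j] = [escape_time(complex(-2 + 3 * i / width,
                                      -1.2 + 2.4 * j / height))
                  for i in range(width)]
    return out

def render(n_workers, width=40, height=20):
    """Split rows round-robin across n_workers and merge the strips."""
    image = {}
    for w in range(n_workers):
        image.update(mandelbrot_rows(range(w, height, n_workers),
                                     width, height))
    return [image[j] for j in range(height)]
```

Because every pixel is independent, the Mandelbrot set is an "embarrassingly parallel" problem, which is exactly why it makes such a clean demonstration of speed-up.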

Building a prototype foam core case

Designing and building HPiC was an interesting and fun opportunity. When designing the case, I had to think about how best to portray the Hartree Centre and supercomputing, whilst making it as accessible and friendly to the public as possible. Once the idea to mimic the machine room had been established, I mocked up a temporary case out of foam core, mainly to check that all the hardware would fit in! I then took my physical mock-up and converted it into a digital 3D model.

3D model of the HPiC case

Finally, the plans were sent off to an external company who built the case. After a late night at the office, too many cable ties, and copious amounts of electrical tape the Raspberry Pi cluster was assembled!

Finished! HPiC complete with case

By this point there was quite a lot of discussion about what to actually call our cluster so I held a competition to decide on the name with a raspberry flavoured cake as the prize! We settled on HPiC as it can stand for both High Performance Computing and Hartree Pi Cluster!

Last month my fellow Research Software Engineer, Aiman Shaikh, attended the annual EuroScience Open Forum in Toulouse, France, helping to inspire attendees with our amazing science and technology at the United Kingdom stand. In pride of place was the new mini supercomputer: it was HPiC’s first outing. Aiman did an amazing job showcasing HPiC and encouraging delegates to interact with the SPH simulation and the PiBrot race. Among the visitors from across Europe keen to see a demonstration of HPiC were Sharon Cosgrove, STFC Executive Director of Strategy, Planning and Communications, and Rebecca Endean, UKRI Strategy Director.

Aiman demonstrating HPiC to Rebecca Endean, UKRI Strategy Director.

What’s next? We’re currently developing more demos to run on our little HPiC, as well as looking to get the case engraved with its name and our logo. We can’t wait to take it out to more events and get more people excited about the world of HPC!

International Women’s Day 2018 | Janet Lane-Claypon

To mark International Women’s Day, Hartree Centre Data Scientist Simon Goodchild writes a blog post celebrating the work of Janet Lane-Claypon, a pioneering epidemiologist and doctor. At the time of writing, Simon was studying medical statistics for the first time as part of a statistical society diploma, and was surprised that he had never previously heard of a woman who invented two of the key techniques he was learning about!

Janet Lane-Claypon

How do you know that your treatment actually works?

How do you know whether something in the environment may impact upon your health?

These are some of the most basic and most important questions in medicine and epidemiology. Getting good answers is vital, and nowadays there are established procedures for finding sensible answers. Several of these can be traced back to the under-recognised work of Janet Lane-Claypon in the early part of the 20th century.

In 1907, Lane-Claypon was working at the Lister Institute of Preventive Medicine in London, investigating the growth of babies. She was ideally placed for work of this nature, having been a brilliant student at the London School of Medicine for Women (the last bit is rather a sign of the times), starting in 1898 and earning academic distinction that included both M.D. and Ph.D. degrees. She looked at whether it was better for a baby’s growth to be fed cows’ milk or human milk. Lane-Claypon approached this by finding comparable groups of infants, some who had been fed cows’ milk and some who had been breast-fed, and studying the differences in their weights.

Group portrait: Lister Institute of Preventive Medicine in 1907
Credit: Wellcome Library, London. Wellcome Images

This may seem obvious to us now, but at the time it was a new way of solving problems. Lane-Claypon’s study is one of the very first examples of a cohort study – comparing reasonably sized groups of similar people to try and determine the size of an effect. In this case, she travelled to Berlin, where a charitable fund was paying for consultations for newborn babies. She was able to obtain data about the weights of 300 babies who had been breast-fed and 204 who had been fed cows’ milk, and analysed this to determine whether either was more effective.

Her final report was a careful, detailed examination of the data which was pioneering in a number of ways. From a simple plot of the mean weight of the two groups over time, it appeared that breast-fed babies gained weight more quickly, but she was careful to investigate all the possible causes for this observation. First, she looked at whether the result was simply due to chance, something she describes as sampling error. To do this she used what is now known as a two-sample z-test, which compares the difference between the two means to the expected variation, which can be measured from the standard deviations of the two samples. If the difference is larger than expected due to chance variation, then it is likely to be significant.
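In modern notation the calculation is short; here is a sketch of the two-sample z statistic described above (the summary statistics used below are illustrative, not Lane-Claypon's Berlin data):

```python
# Two-sample z-test: difference in sample means divided by its
# standard error, estimated from the two sample standard deviations.
from math import sqrt

def two_sample_z(mean1, sd1, n1, mean2, sd2, n2):
    """z statistic for the difference between two sample means."""
    standard_error = sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / standard_error
```

For reasonably large samples, |z| greater than about 1.96 corresponds to significance at the conventional 5% level.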

Looking at a particular small part of the data, the general conclusion didn’t seem to work for one group; for babies in the first eight days, cows’ milk seemed more effective. Lane-Claypon analysed this using the t-test which had only been published a few years previously – at the time it was the statistical state-of-the-art and only used by experts – and concluded that this result probably wasn’t significant.

Finally, and of equal importance, she investigated whether the effect was due not to the different type of milk, but to other causes. These are called confounding factors, and it is critical to work out whether they have an effect if you’re trying to decide whether your data really show what they seem to. Lane-Claypon was concerned that the effects she saw might be due to social class, so she calculated the correlation between the babies’ weights and their fathers’ wages while controlling for the method of feeding. At the time, this was another piece of cutting-edge statistics, as she used a method published by Pearson in 1909. The correlation turned out to be 0.026±0.036, effectively zero within experimental error.
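"Correlation while controlling for" a third variable is what we would now call a partial correlation. A minimal sketch, using the standard formula built from pairwise Pearson correlations (the arrays in any example would be made-up illustrative numbers, not her wage data):

```python
# Partial correlation of x and y controlling for z, built from the
# three pairwise Pearson correlations.
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(va * vb)

def partial_corr(x, y, z):
    """Correlation of x and y with the linear effect of z removed."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz**2) * (1 - ryz**2))
```

If x and y appear correlated only because both track z, the raw correlation is positive but the partial correlation collapses towards zero, which is exactly the pattern Lane-Claypon found for weight and wages once feeding method was controlled for.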

Having analysed the data so carefully and ruled out likely confounding factors, Lane-Claypon could be confident in her conclusion that “the evidence dealt with throughout this report emphasises very forcibly the importance of breast-feeding for the young of all species.” Almost all the features of a modern study are already here – data collection, good statistical analysis to see if the conclusions are not just down to chance, and an investigation of whether other factors might be causing the result. Nowadays it would just be proper procedure, but in 1912 this was incredibly innovative. Not only had Lane-Claypon created a new form of study, but she had also carried it out rigorously, and used the latest methods in statistics to analyse her data.

Later in her career, she moved to the Ministry of Health and began studying breast cancer. In 1926 she extended her cohort study by publishing one of the first case-control studies, looking for the causes of breast cancer. In this study, she compared 500 women who had breast cancer with 500 controls, women free of breast cancer, and used a detailed 50-question survey to collect as much information as she could about their life histories. Using this, she identified that women who had more children, who started having children earlier and who breastfed more were less likely to develop cancer; conclusions which were confirmed by a 2010 re-analysis of her data using the full power of modern statistical methods.

Lane-Claypon’s career was brought to a premature end in 1929 when she got married, as the Civil Service didn’t allow married women to work there. She retired to the countryside and lived until she was 90. In her career she had pioneered two of the most important methods for modern epidemiology, and it is hard not to agree with Katherine Nightingale in the MRC’s Insight when she says:

  “Who knows what she could have achieved if she’d carried on?”

Shaping IT Service Management at The Hartree Centre: Continual Service Improvement

The last in a series of blog posts from Dave Cable, Head of Service Operations here at the Hartree Centre, summarises the steps we have taken to implement IT Service Management.

In previous posts, I described three key components of the ITIL framework which we have implemented at the Hartree Centre – Service Operations, Service Design and Service Transition.  These are all inter-dependent and equal in stature.  However, there is one further area of ITIL which is slightly different because it underpins all of the above – Continual Service Improvement (CSI).  Continual improvement is vital because it ensures that processes and functions do not remain static: they develop and improve in response to operational lessons learnt, leading to overall improvements in service quality.  CSI provides a feedback mechanism and the tools to incorporate that feedback, and it can also work alongside quality management tools.

ITIL provides two complementary tools to implement CSI – the Deming Cycle, and the Seven-step Improvement Process.

Continue reading “Shaping IT Service Management at The Hartree Centre: Continual Service Improvement”