Quantum Meets HPC: Quantum computing is the future, but HPC is here to stay.

Stefano Mensa is an HPC Applications Specialist at the Hartree Centre and a quantum computing enthusiast. We caught up with him after the International Supercomputing Conference (ISC) 2022 to talk about his perspective on the role of quantum computing and what this means for the world of high performance computing (HPC).

ISC 2022 focused on critical developments in HPC, machine learning and high-performance data analytics. This year the conference saw a noticeable shift towards the exploration of quantum computing, with an emphasis on successful applications in science, commerce, and engineering.

Hello Stefano, welcome back from ISC! This year there were 15 sessions dedicated to the opportunities and challenges associated with quantum computing. What do you think this means for the future of HPC? 

First, the good news: HPC is alive and well and it’s here to stay. So, none of us is going to be out of the game anytime soon. A lot is going on in the field, with respect to new hardware, software and applications.

In general, there is a commitment to reducing the environmental impact of HPC, and new solutions are being developed to reduce the power consumption of chips, cool HPC systems more efficiently and schedule workloads in an energy-efficient way. Also gaining momentum in HPC is digital twinning, which virtually represents a physical asset along with the real-time data related to it. This is nice to see, as the Hartree Centre has been working on this for a while now, so we are well equipped to take on challenges in this area.

We are home to experts in this field, with a visualisation team and state-of-the-art visualisation facilities to model digital twin assets and related data. As an example of digital twinning, our team has developed a virtual wind tunnel that companies can use to explore the fluid dynamics and aerodynamics of digital assets such as cars. For more about how digital twinning works, you can contact our visualisation team.

For those who are still getting to grips with quantum computing, can you define it in an accessible way for us? And can you explain how it differs from HPC? 

Quantum computing is a rapidly emerging technology that exploits the laws of quantum mechanics to solve problems that are too complex for classical computing. HPC facilities are the state of the art in classical computing and are used to solve very large real-world scientific problems. They use thousands of computers connected together to work towards the solution of a single problem. In some cases, the complexity of the problem is so great that sheer classical computational power is not sufficient, as some problems would still take too long to solve. Quantum computing, by contrast, leverages the laws of quantum mechanics and allows computational scientists to explore these complex problems from a different perspective.
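To give a feel for why sheer classical power runs out, here is a back-of-the-envelope sketch (our own illustration, not from the interview): an n-qubit quantum state is described by 2^n complex amplitudes, so merely storing the state on a classical machine doubles in cost with every extra qubit.

```python
import numpy as np

def state_memory_bytes(n_qubits: int) -> int:
    """Bytes needed to hold a full n-qubit statevector in complex128."""
    return (2 ** n_qubits) * np.dtype(np.complex128).itemsize

# 30 qubits already need 16 GiB; by 50 qubits we are in the petabyte range.
for n in (10, 30, 50):
    print(f"{n} qubits: {state_memory_bytes(n) / 2**30:.6g} GiB")
```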

A table comparing some of the differences between quantum computing and supercomputing.

The consensus seems to be that quantum is the future of computing, but you are saying that HPC is “alive and well and here to stay”. How do you see these two areas working in conjunction moving forwards?

Quantum needs HPC and vice versa. Basically, with the current state of play in the field of quantum computing, it is impossible to solve a task entirely on a quantum computer. This will probably be true for a long time.

There is widespread acknowledgement in the community that quantum processors must be considered as “accelerators”, with the rest of a hardcore simulation still performed on classical HPC.

This is great. However, it opens a whole new can of worms, with issues like how to couple the quantum processing unit to an HPC platform, and the requirements of the data transfer process and runtime.
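To make the accelerator pattern concrete, here is a schematic sketch (illustrative only: `qpu_expectation` is a hypothetical stand-in for a call out to quantum hardware or a simulator, not any vendor’s API). The classical outer loop, which would run on the HPC side, optimises parameters using expectation values returned by the “QPU”:

```python
import math

def qpu_expectation(theta: float) -> float:
    """Pretend QPU call: expectation <Z> of one qubit after an Ry(theta) rotation."""
    return math.cos(theta)

def classical_minimise(steps: int = 200, lr: float = 0.1) -> float:
    """Classical outer loop: finite-difference gradient descent on QPU results."""
    theta, eps = 0.5, 1e-4
    for _ in range(steps):
        grad = (qpu_expectation(theta + eps) - qpu_expectation(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return qpu_expectation(theta)

# The minimum of <Z> = cos(theta) is -1, reached at theta = pi.
print(round(classical_minimise(), 3))  # -> -1.0
```

This is the shape of a variational hybrid algorithm: the expensive classical optimisation stays on HPC, and only the quantum evaluation is offloaded.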

Currently, HPC centres across the world are securing real quantum hardware and quantum simulators. In Germany, the Leibniz Supercomputing Centre (LRZ) has secured a 20-qubit system from IQM. Meanwhile in Japan, the University of Tokyo has secured a quantum machine from IBM and scaled up to 53 qubits. Both computers use superconducting chips. Furthermore, according to an Atos study, 76% of HPC centres plan to get into quantum in the near future, and 71% will invest in on-premise infrastructure.

You were part of the workshop on Quantum and Hybrid Quantum-Classical Computing Approaches that the Hartree Centre co-organised at ISC. From the workshop and the talks across the conference what would you say are some of the challenges facing quantum moving forwards? 

The first big hurdle for widespread adoption of quantum computing in industry is funding. Quantum hardware costs tens of millions of pounds. It is already difficult for government-funded or academic supercomputing centres to acquire funding for HPC procurement; quantum hardware presents an even bigger challenge, especially for industry. To justify the expenditure, you need to demonstrate the impact and benefits it will bring. You can do that by working with organisations like the Hartree Centre to develop proof-of-concept applications that solve real-world challenges and test them out on real quantum hardware.

Another hurdle to widespread adoption of quantum is access to skilled staff, especially quantum software engineers. At the Hartree Centre we have staff exploring quantum technology and industry applications to help organisations to access it, and navigate its possibilities, to discover the next step for their businesses. 

Finally, and most important in my opinion – as it is my field of work! – how do you integrate a quantum processing unit into an HPC facility? Since ISC it feels clear to me that this is where the big effort from HPC centres is going to be placed. This is an ambitious technical end-point for the scientific computing community. Currently, scientific communities mainly access quantum hardware via cloud interfaces, and only a handful of facilities in the world have access to actual quantum computers on premise. The aim is seamless integration of quantum hardware inside classical resources, such as an HPC compute node, to increase computing power and efficiency.

From what you are describing there are still some steps until we reach the widespread use of quantum computing. What would you say are some of the priorities for organisations to address right now? 

Obviously, we are in the infancy of an emerging technology. There are no standards yet for best practice in quantum computing, and each vendor is developing its own application programming interface (API) and software development kit (SDK), so there are no fixed rules. However, it looks like some SDKs are going to be long-lived. Given that useful large quantum computing architectures are still far in the future, reliable quantum simulators such as the Atos Quantum Learning Machine are more important than ever. Ultimately, the challenge is to develop emulators that accurately simulate physical systems – a sort of digital twin of a quantum computer.
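To illustrate what a quantum emulator does under the hood, here is a minimal statevector sketch in plain NumPy (our own toy example; real emulators such as the Atos Quantum Learning Machine are vastly more sophisticated). The simulator stores all 2^n amplitudes and applies each gate as a unitary matrix:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def apply_gate(state: np.ndarray, gate: np.ndarray, qubit: int, n: int) -> np.ndarray:
    """Apply a single-qubit gate to `qubit` of an n-qubit statevector
    by building the full 2**n x 2**n unitary with Kronecker products."""
    ops = [gate if q == qubit else np.eye(2) for q in range(n)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

n = 2
state = np.zeros(2**n)
state[0] = 1.0                              # start in |00>
state = apply_gate(state, H, qubit=0, n=n)  # Hadamard on the first qubit
probs = np.abs(state) ** 2
print(np.round(probs, 3))                   # equal weight on |00> and |10>
```

Building the full unitary is exponentially wasteful, which is exactly why production emulators use cleverer tensor contractions, but the principle is the same.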

These points and challenges were discussed across talks at ISC22 and our workshop. Adopting quantum is not going to be an easy road, but if you are as excited about quantum as I am, then I am confident we can tackle these challenges with enthusiasm and progress this exciting emerging technology.

If you would like to learn more about quantum computing and its applications, please visit our website. If you are interested in collaborating with the Hartree Centre on a project, please contact us.

Staying curious: Reflections on six years as the Hartree Centre Director

As the longest serving Director of the Hartree Centre, Alison Kennedy has taken the Hartree Centre from strength to strength during her six years of leadership. In 2021, she led the team to successfully secure over £200 million of government funding to run the Hartree National Centre for Digital Innovation (HNCDI) programme, applying advanced digital technologies such as high performance computing, data analytics and AI to enhance productivity in UK industry.

On Alison’s departure, we asked her to share some of her thoughts and experiences advancing the industry application of digital technologies and look forward to what the future might hold.

Where are we now?

This year, 2022, marks the ten year anniversary of the Hartree Centre. It’s an interesting year for us, seeing how far we’ve come, and for me particularly it marks a transition in my career as I move on from my Directorship of the Hartree Centre.

The Hartree Centre had 12 staff when I started, and now we have a team of over 110 people and growing. We’re in a strong position as a department of STFC’s National Laboratories but we’ve had to work hard to transition from being dependent on funding from a series of fixed-term projects to becoming a sustainable entity with a distinct role in the UKRI landscape. Our current status reflects a recognition and appreciation of the value of our work at the intersection of HPC research, business networks and national and regional government infrastructure. Being at that intersection is what makes us unique and what makes us strong.

“The practical application of science is fundamental to its value.”

The importance of the Hartree Centre, a department specifically dedicated to supporting businesses and public sector organisations to adopt and apply new digital technologies, cannot be overstated. We allow organisations to experiment and learn in a safe environment to ensure they know what technology works for them before they fully invest in it – de-risking that process of exploration. We’re flexible in our approach to emerging technologies – even beginning to investigate the potential of quantum computing for industry in collaboration with the National Quantum Computing Centre (NQCC). But what makes us unique and what really matters to us is that we work very closely with businesses, technology partners and the public sector to ensure that the solutions and applications we develop are useful and usable.

Embracing change

One of the things that’s difficult to immediately grasp is that when the Hartree Centre was founded, the notion that high performance computing could be adopted and used by industry in a whole range of areas was really novel and exciting. Supercomputers were primarily a tool for scientific research, and they were portrayed as very expensive, very difficult to use, and suitable only for adoption by a small minority of scientists with long experience in difficult simulation and modelling problems.

So for the Hartree Centre, one of the key motivators has been the challenge of “democratisation” of high-end novel technology. How can we make it much more accessible to a much wider range of people? How can we understand what some of these industrial challenges are so we can apply it effectively?

Alison Kennedy taking part in a panel session at STFC’s Digital Tech Cluster launch event.

In the last decade, the world has moved forward in immeasurable ways. We’ve seen profound changes in both the technologies and the language we use to describe them. When I started working at the Hartree Centre, we talked about cognitive computing – now the world is more comfortable with terms like artificial intelligence (AI) being used in the workplace. These technologies are no longer the preserve of science fiction, and people have learned and begun to accept that these technologies don’t mean fewer jobs – just that everyone’s job spec will change.

“I think that the best piece of advice I’d give to anyone at the start of their career is to stay curious and be adaptable.”

When I look back at the changes in technologies and opportunities over the past 40 years, I’d say it’s very, very unlikely that if you work in technology, you’ll be doing a similar job in five, ten, fifteen or twenty years’ time. Many of the jobs that we are now recruiting for at the Hartree Centre really didn’t exist in their current form five or ten years ago. If you can stay adaptable, and think about where the technology is going and how you can apply it in other areas, you’ll be set to succeed. Think about what you are interested in, think about what skills you can develop, be enthusiastic, be open to learning new ideas and you will then definitely be part of the solution.

Making digital technologies work for businesses

I think one of the first things we realised early on working with businesses is that it is easy for people who are excited by technology to engage with people in industry who are excited by technology.

However, if you want that technology to be adopted and to be used, then you need to engage the hearts and minds of a whole range of people who are working in industry, from their funders and executives to their customers and their supply chains.

“It’s not just about having a good technology solution. It’s about ensuring that the people you work with understand the power of digital transformation and how adopting digital solutions will benefit their businesses.”

Our projects are not about somebody coming in and doing something for a company using our “super technology powers”. We build multi-skilled teams with professional project management that work collaboratively with our partners so that we can get the best results for them. By talking to our teams and answering their questions, the company is part of the project development not just the “end user”.

Hear more from Alison Kennedy in her recent interview with Cambium LLP.

Acknowledging the power of diverse, multidisciplinary teams

I’ve always thought it’s important that our technology teams reflect society at large. If we’re going to effectively tackle a whole range of challenges, from environmental to societal to economic, then we need to have people who understand what these challenges are who come from a variety of backgrounds and who reflect our society.

Also, from a practical point of view, the areas that we are working in – Science, Technology, Engineering and Maths (STEM) – have a shortage of applicants. There’s not enough people in the UK with these skills to meet the demand in research and industry sectors. We need to be as open as possible to say: “What’s absolutely essential for this job vs. what can we teach people when they get here?” I’m really pleased that over the years I’ve been at the Hartree Centre, we’ve developed into a more diverse team of people working on our projects – I think this has really benefited us in terms of being able to understand and contribute to some of the challenges that we’re working with – but there’s still a way to go.

Alison Kennedy with Hartree Centre staff at Supercomputing 2019 in Denver, Colorado.

I’m also interested in the way we are moving beyond thinking just about STEM skills in the UK – I’ve noticed more of a focus on adding the arts into that mix. It’s important that we are able to illustrate to people in industry and government what the possibilities are. We need to be creative to make it as easy as possible for them to understand what the results of a particular project might be. So, when we look to recruit new team members at the Hartree Centre, we are not just looking for people who have good technical skills. We are also looking for people who are good communicators, who can manage projects to high standards, who have an interest in challenges and who understand the impact of solving them. Our people don’t just want to develop very deep expertise in one area.

To this end, we also have people on our team in the Hartree Centre who have an understanding of digital communications and social media who can interpret our work in a more creative way to engage with different audiences, as well as people who work on the data visualisation side of things. One of our big investments at the Hartree Centre has been a visualisation suite, where we can bring the results of many of our projects to life in a really visual way. We know that for the vast majority of people, this makes it much easier to understand than looking at statistics and formulae.

We use everything from infographics to advanced visualisation to films about our work, anything which helps to convey the power of these new technologies and to spark interest in people to inspire them.

“We want people to think: Wow, that’s really great! I wonder if these technologies could be applied to my particular problem.”

Looking forward, I think there’s a huge amount to be excited about. We’re seeing new applications of existing technologies in an increasing number of areas, alongside the advent of emerging technologies like quantum computing, which is radically different from traditional computing and will enable us to look at problems that cannot be solved on current mainstream computers.

From developing more personalised medicine and healthcare treatments to applying AI in conjunction with simulation and modelling to speed up the refinement of an aeroplane wing design, we have truly only touched the tip of the iceberg. And I for one am excited to see what the Hartree Centre does next!

Alison will be succeeded by Professor Katherine Royse in April 2022.

Meet the team | Training and Events Manager

We spoke to Nia Alexandrova about her role at the STFC Hartree Centre, what keeps her coming into work every day and how the shift towards remote-working has changed the way events are run.

Can you introduce yourself and tell us about your role at the Hartree Centre?  
I am the Training and Events Manager, so my responsibilities involve building and designing the Hartree Centre’s training strategy and programme. This means working with the researchers in their area of applied research to design specific courses and learning materials for different audiences and entry levels. It can also involve managing the process of organising and delivering an event. Because of my background and own research, I am able to support people in finding better ways to teach depending on different audiences. Specifically, my research is in collaborative training and collaborative learning in technology-rich environments. 

So your background is in research? 
My education was in engineering but that was a very long time ago! I started as a research assistant and was involved with some programmes that were being overhauled and transitioned from Liverpool University to the University of Reading. I then went to work at Barcelona Supercomputing Centre to help define and develop their training programme. We built up a team, starting with myself and later Maria-Ribera Sancho (former Dean of FIB – Barcelona School of Informatics), from the ground up to create a coordinated approach. When I came back to the UK, I joined the STFC Talent Pool and this opportunity in training and events came up which was well suited to my skills! I also knew Alison Kennedy (who had just become the Director of the Hartree Centre at the time) through Women in HPC, and she confirmed to me that this was probably a place where I would want to work!

A woman with mid length brown hair and a mustard coloured shirt presenting in front of a large curved screen showing an image of a supercomputer.

So she was right then! What keeps you coming to work every day?  
I hate doing the same thing over and over again and working at the Hartree Centre is very exciting and quite challenging in that way – every day is different! For example with HNCDI’s EXPLAIN training programme, we are working on the challenge of enticing people to engage with training they may never have thought they needed. It’s easy to present supercomputing and AI to an academic audience but more difficult to engage with individuals or private companies and their leadership, who may not be aware of the benefits of upskilling their teams in digital transformation, computing or AI. At the moment we are in an interesting time, people are becoming aware of the need for digital transformation. 

For the Hartree Centre too, the entire time I’ve been working here we have been growing and evolving. We are finding different ways to develop things, finding the best way to support people and exploring how to teach in the best possible way. The challenge is ensuring that when you’re training individuals, you’re giving them the skills they need not just in their own job, but to go and change behaviours and attitudes to digital transformation in their own company. 

What would you say has been your biggest challenge recently? 
Until the COVID-19 pandemic, all our training had been hands-on and face-to-face in the physical Hartree Centre building. So the recent – and very sudden – transition to virtual events was initially very disruptive for us, and a time for fast problem-solving!

However now we can recognise that it was an inevitable step forward that was just accelerated by the global circumstances, and the Hartree Centre Training, Events and Communications teams worked together really well to find a way to support everyone digitally in a short timeframe. It became an ultimately positive experience that enabled us to enhance our training offering and we are continuing to explore the use of hybrid, virtual and face-to-face events and refine our approach. 

Woman wearing mustard colour shirt pointing at a computer screen and smiling.

So as an events manager, what kind of events do you like to attend? 
Big international conferences like Supercomputing or ISC are always interesting. When you are physically attending these exhibitions, they feel enormous – we are talking about thousands and thousands of people – and that is an exciting atmosphere. I can share best practice with the global HPC training community, be part of meetings and get involved with communities that I wouldn’t encounter locally. It helps you to see the bigger picture and also gain some exposure for your organisation. I have seen some really interesting keynotes, and in recent years there has been a trend of inviting not only people from the high performance computing (HPC) industry but also people who are slightly outside of it. That is a really interesting way to see how someone’s work in industry intersects with HPC outside of the HPC research community.

When you’re not at work, what do you most enjoy doing?  
I love drawing so I go to a life drawing group every Monday. I don’t like to sit and watch TV, I have to have my hands busy so I do knitting and crochet a bit. I love the Daresbury Laboratory book club, it’s a lot of fun, and I’m glad we continued it on Zoom during the pandemic. I’m grateful for living near Daresbury because it is a very beautiful area. I always knew this but during lockdown, I started to appreciate it even more because it allows you to do 5 or 10 minute walks very close to home and you get to go around and see ducks, flowers and woodlands and all the time I’m thinking if I was living in a big city I would miss this. 

You can catch up with Nia’s work by exploring our upcoming training events on the Hartree Centre website, register your interest for fully booked events, or sign up for future updates by subscribing to the Hartree Centre newsletter.

Data science and AI help for SMEs in Cheshire and Warrington

Hi! I am Tim Powell, a Business Development Manager at the Hartree Centre. In this blog post I am going to be talking about a relatively new funding opportunity for SMEs that I’m working on at the moment, Cheshire & Warrington 4.0. 

Tim Powell, Business Development Manager, STFC Hartree Centre

So, what is CW4.0? 

Cheshire and Warrington 4.0 (CW4.0) is a fully ERDF-funded programme of hands-on support for businesses in Cheshire and Warrington, focused on the exploration and adoption of digital technologies. The programme is built on the success of LCR 4.0, which supported over 300 companies in the Liverpool City Region to develop new products, decrease time to market, and accelerate productivity and turnover – all while creating 125 new jobs!

Through the CW4.0 programme SMEs in Cheshire and Warrington can access technical expertise from our team of experts here at the Hartree Centre. Our data scientists and software engineers have a strong track record of working on collaborative projects to solve industry challenges. To give you an idea, here are some examples of the areas we work in:

  • Artificial Intelligence applications, including machine learning and natural language processing 
  • Predictive maintenance and data analytics 
  • Modelling and simulation 
  • Software development and optimisation 
  • Cloud migration and applications 
  • IoT (Internet of Things) integration 

Our first CW4.0 engagement has already kicked off with G2O Water Technologies. Tristan Phillips, the VP of Engineering, had this to say about his hopes for the outcome of the project:

“Being able to do Computational Fluid Dynamics at Hartree is essential to model and design enhanced membranes that are able to filter almost unfilterable waters, extract precious materials from water streams and decarbonise the water industry.”

Tristan Phillips, VP Engineering
G2O Water Technologies

We have also just kicked off a project with Chester-based Circles Health & Wellbeing, who are looking to develop an AI chatbot for assistance in mental health services, and we have more projects in the pipeline covering areas such as predictive maintenance, using machine learning to improve routing algorithms, and building data warehouses.

“We are excited to be working with STFC on this hugely important healthcare project. Mental health patient numbers are ever-growing and placing a huge strain on healthcare services which are buckling under the pressure. Working with the Hartree Centre – a respected AI development partner – will enable us to build a dedicated healthcare assistant solution that will set a benchmark for similar future conversational AI assistants, delivering cost-efficient, patient-centric support services that enhance a client’s healthcare experience, build confidence in more human/tech blended healthcare solutions and deliver positive, measurable outcomes. The pressure to get this right is colossal and we are delighted to have such a talented and knowledgeable partner to work alongside us.”

Tom Mackarel, Director and Co-founder
Circles Health & Wellbeing

How does it work? 

CW4.0 projects can vary from creating a brand new proof of concept (PoC) or minimum viable product (MVP) to help accelerate a start-up to market, to adding value to an existing product through digitisation. The process of engaging with us on a CW4.0 project is simpler than many other grant applications.

After an initial discussion with me to define the challenge statement, followed by an eligibility check, I engage with our technical staff to write a project scope for a custom solution to the company’s specific industry challenge. The project scope is presented back to the company for fine-tuning before we go ahead and submit the final application. Each CW4.0 technical project typically lasts 2–4 months.

The process works really well for companies who already know how and what they want to innovate on. But if your company is interested in digital innovation and not sure which direction to take or the options available to you, don’t worry – we can help with that too.

CW4.0 is also designed to help signpost companies in the right direction by offering a fully funded, risk-free feasibility study or digital innovation report. Our experience working across a wide range of industries – from engineering, manufacturing and life sciences to energy, professional services and transport – will be used alongside our technical expertise to benefit you. The feasibility study or digital innovation report will be created working alongside your company as domain experts, to discover what will work best for you.

Manufacturing your digital future | CW4.0

Not just digital innovation – from virtual to physical 

Here at STFC, alongside the Hartree Centre there is another department who are delivering support as part of CW4.0 so I would like to take some time to showcase how the Campus Technology Hub (CTH) can also benefit SMEs across Cheshire and Warrington. 

Companies can access a range of 3D printing capabilities and explore how 3D printing could aid product development and streamline manufacturing processes to reduce time and costs, with rapid prototyping of complex designs on a project-by-project basis. The 3D printers range from desktop-sized fused deposition modelling printers that can print in a variety of plastics, through to industrial metal 3D printers, with materials varying from plastics like PLA or ABS, to materials reinforced with fibreglass or carbon fibre, resin polymers and 316 stainless steel – the possibilities are endless!

To find out more about accessing support from the Campus Technology Hub specifically, you can contact my colleague Michaela at michaela.kiernan@stfc.ac.uk

Am I eligible? 

The main eligibility criteria for CW4.0 are that the company is classed as an SME, has not used its allocated state aid, and has a registered premises in the postcode catchment area below:

Cheshire: CW1, CW2, CW3, CW4, CW5, CW6, CW7, CW8, CW9, CW10, CW11, CW12
Warrington: WA1, WA2, WA4, WA5, WA6, WA7, WA8, WA13, WA16
Chester: CH1, CH2, CH3, CH4, CH64, CH65, CH66

Who can help me? 

To discuss how the Hartree Centre can provide innovation support to your business, help increase productivity, access new markets, kickstart new products and job creation and enable growth through CW4.0, please get in touch at info@candw4.uk.


Part-funded by the European Regional Development Fund (ERDF), CW4.0 brings together the combined expertise and capabilities of the Virtual Engineering Centre (University of Liverpool), Liverpool John Moores University, the Science and Technology Facilities Council (STFC) and the Northern Automotive Alliance. 

HPC is Now | Supercomputing 2019

In November 2019, the Science and Technology Facilities Council (STFC) Hartree Centre and Scientific Computing Department exhibited at international conference Supercomputing 2019 (SC19) in Denver, USA. In this blog post, Research Software Engineer Tim Powell shares some thoughts and insights from the Hartree Centre team.

Hartree Centre team members attending Supercomputing 2019.

The variety of experiences one can have at Supercomputing is vast, and I think this reflects the direction high performance computing (HPC) is going: the number of disciplines adopting HPC and the different techniques available for acquiring computing power are growing ever more diverse. When discussing the themes of SC19 with a colleague (in the stationery room of all places) I accidentally summed it up quite well: “Supercomputing 2019 was tall and broad.”

So let’s look at each aspect of this assessment – first up: “tall”. The next phase of supercomputing is exa-scale, and there were a significant number of talks, birds-of-a-feather sessions and panels discussing exa-scale computing and its applications, software and hardware.

Our Chief Research Officer, Vassil Alexandrov, gives his account of Supercomputing 2019 and the current exa-scale landscape here:

“Supercomputing 2019 was a busy time for me, as always! In the discussions and talks I attended, I felt that this year’s content was of an even higher quality than previous years, and I noted that there were more precise presentations delivered by researchers.

One area which I paid particular attention to was the discussion around exa-scale. The US National Labs are making big moves with their Exa-Scale Computing Project. They are investing $1.8 billion in hardware and a similar amount in the development of software. The current US roadmap is to have their first machine, Frontier, in place in Q3 of 2021, costing an estimated $400 million, with another two machines to be delivered in 2022, each costing $600 million. All three machines are expected to be exa-scale and are rumoured to be a combination of AMD, Intel, Cray, and NVIDIA.

Europe is also heading towards exa-scale computing – eight centres across Europe are going to host large peta-scale and pre-exa-scale machines as part of its programme to develop exa-scale capabilities, with machines expected to reach 150-200 peta-flops. Japan is about to install its Post-K supercomputer, which is based on ARM processors and is likely to be a very efficient machine. The expectation is for it to be operational in early 2020, so I am excited to see the results when it is up and running. China is also a player, but its plans are behind closed doors at the moment. It will be interesting to see what they reveal.

Throughout SC19, it was clear that the software challenges are going to be harder than the hardware challenges. My opinion is that we are still a few years off from having true exa-scale machines.”

Vassil Alexandrov chairs the 10th Workshop on Latest Advances in Scalable Algorithms for Large-Scale Systems for academia and industry alongside Prof. Jack Dongarra (UTK & ORNL), Al Geist (ORNL) and Dr Christian Engelmann at Supercomputing 2019.

Now, let’s talk about how SC19 was “broad”.

The different applications of HPC were more obvious this year than in previous years. Multitudes of national laboratories and research institutes from around the globe displayed use cases on their stands in the exhibition hall, and a large variety of topics was discussed in talks and panels. There was, quite literally, something for everyone – assuming you have an interest or involvement in computation, that is!

I think this is largely due to the growth in access to data, and to new techniques such as machine learning and artificial intelligence (AI) that push disciplines which traditionally don’t use HPC to seek more computing resource. Additionally, with the rapidly growing offering of cloud computing resource, the barrier to entry has been significantly lowered and it is easier than ever to provision a cluster-in-the-cloud.

So “tall” is more powerful computing, and “broad” is more computing applications. This all culminates in a bigger impact for high performance computing, which again was echoed at SC19 with a series of talks in the 1st HPC Impact Showcase.

My personal highlight this year at SC19 was participating in the Building the Future panel at the 6th SC Workshop on Best Practices for HPC Training and Education. The all-day workshop focused on common challenges for enhancing HPC training and education, and allowed the global community to share experiences and resources to address them. The Building the Future panel centred the discussion on how we as trainers and educators can best prepare for the future of HPC and the training and education needs it will bring. The key take-away from my talk was that HPC has a diverse future of applications, and we need to make the power of HPC accessible to non-experts who are only just finding uses for it.

Tim Powell speaks at the Building The Future panel during the 6th SC Workshop on Best Practices for HPC Training and Education.

On the following day I was fortunate enough to attend the Early Careers Program, aimed at people in the first few years of their career in HPC, which delivered a variety of activities, talks, and panels. It was great to see STFC represented by Catherine Jones and Alison Kennedy. As a Research Software Engineer (RSE) I particularly enjoyed panels and talks involving RSEs and members of RSE societies from around the globe. It’s great to see that managing research software properly is being put on the international stage at conferences as big as SC! I also noted that in a series of talks on cloud computing, a lot of time was given over to discussing the advantages (and rarely the disadvantages) of tailor-made HPC in the cloud.

As a team, we had great fun facilitating a very popular build-your-own Lego supercomputer activity, in the form of our very own Scafell Pike! Needless to say, our limited supplies disappeared more quickly each morning as word spread. Our HPiC Raspberry Pi cluster was also present, boasting some new and updated demos developed by our recent summer placement students James and Lizzie!

The Hartree Centre takes its supercomputer Scafell Pike to Supercomputing 2019… in Lego form!

I also spoke to some of my colleagues to get their own perspectives on SC19. Aiman Shaikh, Research Software Engineer, discussed her first time at the conference:

“I really enjoyed being part of the Women in HPC workshop, and attending technical talks around containers in HPC and LLVM compilers. The networking events held by different vendors were also a great opportunity to meet people. There was so much going on everywhere that it was difficult to keep pace with everything!

HPC and Cloud Operations at CERN was a very interesting talk by Maria Girone, who covered the technologies used at CERN, software and architecture issues, and how they are investigating machine learning (ML) for object detection and reconstruction.

The Women in HPC workshop was really good, especially the keynote from Bev Crair, Lenovo, on “the butterfly effect of inclusive leadership”. Bev said that diverse teams lift performance by inviting in creativity, which I completely agree with. Another inspiring and motivating talk, by Hai Ah Nam from Los Alamos National Lab, covered surviving difficult events and minimising their impact on your career. Hai explained that we cannot stop unforeseen events in life, but we can focus on how to tackle them. The Women in HPC networking events, often joined by many diverse groups of people, provided a great chance to network with attendees from all different backgrounds.

The journey of exploration did not end after SC, as afterwards I went to the Rockies with some colleagues – a fun-filled few days of walking, and with so little light pollution we could see the Milky Way at night!”

Aiman Shaikh gets involved in the Women in HPC workshop at Supercomputing 2019.

SC19 was a new experience for Research Software Engineer Drew Silcock too:

“Attending SC19 for the first time really exposed me to the wider scientific computing community. I gained an understanding of the various technologies used by the scientists and engineers and for what purposes they were used. Many are scaling their applications with standard MPI + OpenMP stacks, but I attended several interesting workshops and talks about alternative technologies and approaches. Of particular interest to me are all topics relating to the development of programming languages and compilers, so I very much enjoyed hearing from people working on and with the LLVM compiler toolchain, additions to the C++ standard, and the development of domain-specific languages for scientific computing.

In terms of trends, it’s exciting to see how many people are starting and continuing to use Python for scientific computing. Cloud services are also becoming increasingly relevant, especially for new companies without on-premises capabilities. As machine learning models get bigger and bigger, there is more effort being put into bridging the gaps between the HPC and ML communities to ensure that they can benefit each other.”

Jony Castagna, an NVIDIA Deep Learning Ambassador with 10 years’ experience in HPC and several years’ experience in deep learning, shared his thoughts:

“We’re seeing fast-growing applications of deep learning for science. Three different approaches have been identified: supporting or accelerating current algorithms, for example using neural networks (NNs) as preconditioners or matrix solvers; solving partial differential equations with NNs while enforcing physical constraints (physics-informed neural networks, PINNs); and fully replacing physical equations with NNs trained on numerical simulation data. This last approach seems particularly attractive, as it shows the capability of NNs to learn the physics from data and extrapolate further at much higher speed. For example, in the work of Kadupitiya, Fox and Jadhao, a simple NN was used to predict the contact density of ions in nanoconfinement, trained on data from molecular dynamics (MD) simulations. A strong match between the predictions and the MD simulations was presented.

Increasing use of the C++17 standard library has emerged for performance portability. Many paradigms, like Kokkos, RAJA and HPX, have been presented as possible solutions for targeting different architectures. However, NVIDIA does not seem to be standardising heterogeneous programming; instead, they expect the hardware to become more homogeneous between CPU and GPU. We’d like to test NNs with DL_MESO to see how well they perform in reproducing coarse-grained simulations. We have also applied for an ECAM2 project to port DL_MESO to C++17 and use Kokkos for performance portability. This will allow us to compare performance with the current CUDA version and understand how well Kokkos can perform.”
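The surrogate approach Jony describes – training a network on simulation output and then evaluating the network in place of the physics – can be sketched in a few lines. This is a minimal illustration only: the tiny NumPy network, the sine-function stand-in for an expensive simulation, and all hyperparameters are invented for the example and are not taken from DL_MESO or the cited nanoconfinement work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these samples came from an expensive numerical simulation;
# here the "simulation" is just a smooth 1-D function.
x = rng.uniform(-3.0, 3.0, size=(256, 1))
y = np.sin(x)

# A tiny one-hidden-layer network (16 tanh units) as the surrogate model.
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(inp):
    h = np.tanh(inp @ W1 + b1)
    return h, h @ W2 + b2

# Full-batch gradient descent on mean squared error,
# with backpropagation written out by hand.
lr = 0.05
for step in range(2000):
    h, pred = forward(x)
    err = pred - y
    loss = np.mean(err ** 2)
    dpred = 2.0 * err / len(x)
    dW2 = h.T @ dpred;  db2 = dpred.sum(0)
    dh = (dpred @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    dW1 = x.T @ dh;     db1 = dh.sum(0)
    W2 -= lr * dW2;  b2 -= lr * db2
    W1 -= lr * dW1;  b1 -= lr * db1

# Once trained, the surrogate evaluates far faster than re-running
# the simulation for each new input.
_, test_pred = forward(np.array([[1.0]]))
```

The appeal, as described above, is exactly this last step: once the network has learned the mapping from inputs to simulation output, new predictions cost a handful of matrix multiplications rather than a full MD run.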

James Clark and Aiman Shaikh attend talks by Mellanox Technologies at Supercomputing 2019.

High Performance Software Engineer James Clark concluded:

“On Sunday I presented at the Atos Quantum Workshop. This was a showcase of how the Hartree Centre is using our Quantum Learning Machine, such as our joint training and access programme with Atos and our ongoing project work with Rolls-Royce.

I also talked about our future plans to develop quantum software that can take advantage of both quantum computing and HPC.

One of the most interesting developments in HPC this year was how far ARM CPUs have come. RIKEN and Fujitsu’s Fugaku is one of the major success stories, with the first deployment of the new SVE (Scalable Vector Extension) instructions. Fujitsu announced that Cray will be bringing their ARM CPUs to the rest of the world. NVIDIA also announced that their GPGPUs will be supported on ARM platforms, with a number of ARM CPUs listed as supported at release. I am looking forward to seeing how the increased competition in the hardware space turns out, especially with AMD’s Rome CPUs and Intel’s Xe GPUs. The future of HPC looks to be very interesting and it’s an exciting time to be involved.”

I couldn’t have said it better myself!