Why big data and cognitive computing are perfect partners

Welcome to our new Hartree Centre blog.

Our first few posts aim to shine a light on what’s happening here at Hartree and how this relates both to wider trends in cutting-edge computing and to the key challenges and opportunities facing UK industry today. It’s a chance for us to get to know each other a little better… so please do let us know what you think and what you’d like to hear from us about.

Let’s begin. If you’re fairly new to the world of big data, high performance computing and cognitive technologies then we hope that this is the place to start. So where better to kick off than with that ‘fashionable’ concept of big data…

Whatever your line of business, big data is a big deal. It may be a cliché, but the ability to squeeze meaning and value from vast, fast-growing data mountains inside and outside your organisation is increasingly crucial to bottom-line success and staying competitive. As Hartree’s Big Data Technical Lead Roger Downing says:

“It’s about being able to make data-driven decisions, creating actionable business outcomes from valuable data inside and outside your organisation.”

For industry, it can be seen as a three-step process. First comes awareness that leveraging big data is business-critical (really, it is!). Second, it’s a question of making it happen. Third, it’s about maximising the impact and effectiveness of these big data operations – especially as more organisations commit to step two and the key to competitive advantage increasingly lies in achieving step three. That, in turn, increases the demand for technologies that push the boundaries of big data analytics, quarrying useful data buried far, wide and deep and pinpointing patterns and connections that would otherwise remain invisible in diverse, often unstructured and disparate datasets.


The irony is that many of the things you need computers to do to extract full value from big data are actually quite ‘human’ – we do these things ourselves all the time. To consider and contrast; to identify links and remember small details; to apply ‘critical capacity’; to learn from experience. That’s why cognitive computing is such a game-changer. And while it would be an exaggeration to say that this emerging field of advanced computing still represents an unknown frontier, it’s certainly right at the tip of the arrowhead of computing innovation.

Roger explains:

“Cognitive computing can be split into knowledge-driven machine learning (text, rules and knowledge) and data-driven decisions (statistics-based), and developments in this capability are being driven by technological advances in neural networks running on GPUs. The aim of cognitive computing is to mimic how the human mind works. Fundamentally, that means developing new computing paradigms that behave like our own thought and learning processes but far exceed our capabilities, analysing data and gaining a holistic view of data holdings much faster, more thoroughly and more reliably than humans could possibly do. So cognitive computing is about ‘reasoning with purpose’ and functioning like a person would if they could combine the wisdom of experience with a perfect memory, instant recall and the ability to respond very rapidly.”
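
If you’re curious what that knowledge-driven versus data-driven split looks like in practice, here’s a deliberately tiny Python sketch – not Hartree code, and nothing to do with neural networks or GPUs – contrasting a rule-based classifier built from a hand-written keyword rule with a data-driven one that learns simple word statistics from a couple of labelled examples. The rule set, training data and function names are made up purely for illustration.

```python
# Illustrative sketch only: a knowledge-driven (rule-based) labeller versus
# a data-driven one that learns word counts from labelled examples.
from collections import Counter, defaultdict

# --- Knowledge-driven: explicit rules encode human expertise ---------------
NEGATIVE_WORDS = {"faulty", "late", "broken"}   # hypothetical rule set

def rule_based_label(text: str) -> str:
    """Flag a customer comment as a complaint if it matches a known rule."""
    words = set(text.lower().split())
    return "complaint" if words & NEGATIVE_WORDS else "ok"

# --- Data-driven: statistics learned from labelled examples ----------------
def train_word_counts(examples):
    """Count how often each word appears under each label."""
    counts = defaultdict(Counter)
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def statistical_label(text: str, counts) -> str:
    """Pick the label whose training examples best match the words seen."""
    scores = {
        label: sum(c[w] for w in text.lower().split())
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

if __name__ == "__main__":
    training = [
        ("delivery was late and the part arrived broken", "complaint"),
        ("great service and quick turnaround", "ok"),
    ]
    counts = train_word_counts(training)
    comment = "the replacement part was broken again"
    print(rule_based_label(comment))           # complaint (a rule fired)
    print(statistical_label(comment, counts))  # complaint (learned from data)
```

Real cognitive systems combine both approaches at vastly greater scale, but the contrast is the same: one encodes what experts already know, the other learns patterns directly from the data.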

It’s clear that big data and cognitive computing can, will and must work closely together if big data’s phenomenal potential is to be realised. The Hartree Centre is already hard at work enabling UK businesses to benefit from this fledgling but fast-developing relationship and so take another big step forward in putting data right at the heart of their operations.

If you would like to find out more about us or have any questions at all, please do leave a comment below.