Graph Analytics: A Foundational Building Block for the Data Analytics World

Turning data into useful information is fundamentally about sussing out the relationships between phenomena, items, or things. One way to do it is via Graph Analytics—representing datasets as graphs, setting algorithms loose on them, and getting impressively actionable results.

Examples, you ask? Frankly they’re innumerable, but here’s a short list offered by Tim Mattson, Senior Principal Engineer in Intel’s Parallel Computing Laboratory and the subject of this interview:

  • Recommendation engines
  • Web searches
  • Power grid optimization and troubleshooting
  • Road networks/maps (How do driving apps know where the traffic jams are? Ta-da!)
  • Global police/security surveillance of phones, cameras, the Internet
  • DNA/gene regulation

All driven and informed by the power of Graph Analytics.
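To make the road-networks example above concrete, here's a minimal sketch (the road network and function names are hypothetical, not from the interview): intersections become vertices, road segments become edges weighted by travel time, and a classic shortest-path algorithm—Dijkstra's—picks the fastest route, which is how a driving app can steer you around a traffic jam.

```python
import heapq

def fastest_route_time(roads, start, end):
    """Dijkstra's algorithm over a dict-of-dicts weighted graph;
    returns the minimal travel time (in minutes) from start to end."""
    best = {start: 0}
    heap = [(0, start)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == end:
            return t
        if t > best.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, w in roads.get(node, {}).items():
            nt = t + w
            if nt < best.get(nxt, float("inf")):
                best[nxt] = nt
                heapq.heappush(heap, (nt, nxt))
    return float("inf")  # unreachable

# Hypothetical travel times: a jam on the direct A->C road (30 min)
# makes the detour through B (4 + 5 = 9 min) the fastest route.
roads = {
    "A": {"B": 4, "C": 30},
    "B": {"C": 5},
}
print(fastest_route_time(roads, "A", "C"))  # prints 9
```

Real routing engines add far more (live traffic feeds, turn penalties, A*-style heuristics), but the underlying object is the same weighted graph.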

Intrigued?

Then watch this Tech.Decoded chat with Tim and get a front row seat to this technology, including how it’s evolving, Intel’s approach working with the software/open source ecosystem to optimize it, and where it’s going in this new decade.

Do it. [13:17 mins]

Get the software



Tim Mattson, Senior Principal Engineer, Intel Corporation

Tim Mattson is a parallel programmer whose 24/7 obsession is … science! With Intel since 1993, Tim is a Senior Principal Engineer whose contributions span a brilliant array of globe-changing efforts, including (and this is the short list) the first TFLOP computer, the OpenMP and OpenCL programming languages, Intel’s first TFLOP chip and the 48-core SCC, Polystore data management systems (in collaboration with MIT), and the GraphBLAS API for expressing graph algorithms as sparse linear algebra.
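The core idea behind GraphBLAS—graph algorithms as sparse linear algebra—can be illustrated in a few lines. Below is a plain-Python sketch (not actual GraphBLAS code; the graph and names are illustrative): advancing a breadth-first-search frontier by one level is just a Boolean matrix–vector product over the (OR, AND) semiring.

```python
# Directed graph with edges 0->1, 0->2, 2->3, stored as a Boolean
# adjacency matrix A, where A[i][j] is True iff there is an edge i->j.
A = [
    [False, True,  True,  False],
    [False, False, False, False],
    [False, False, False, True ],
    [False, False, False, False],
]

def bfs_step(A, frontier):
    """One BFS level as a matrix-vector product over the Boolean
    (OR, AND) semiring: next[j] = OR_i (A[i][j] AND frontier[i])."""
    n = len(A)
    return [any(A[i][j] and frontier[i] for i in range(n)) for j in range(n)]

frontier = [True, False, False, False]  # start BFS at vertex 0
print(bfs_step(A, frontier))            # reaches vertices 1 and 2
```

Swapping in other semirings (e.g., min-plus instead of OR-AND) turns the same matrix–vector pattern into shortest paths—which is exactly the generality, plus sparse storage and tuned kernels, that the real GraphBLAS API provides.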

Currently Tim leads a programming systems research group and collaborates with researchers at MIT on the intersection of AI and data systems (dsail.csail.mit.edu).

Tim earned a B.S. in chemistry from the University of California, Riverside, and an M.S. in chemistry and a Ph.D. in quantum scattering theory from the University of California, Santa Cruz.

Henry Gabb, PhD, Sr. Principal Engineer, Intel Corporation

Henry is a senior principal engineer in the Intel Software and Services Group, Developer Products Division, and is the editor of The Parallel Universe, Intel’s quarterly magazine for software innovation. He first joined Intel in 2000 to help drive parallel computing inside and outside the company. He transferred to Intel Labs in 2010 to become the program manager for various research programs in academia, including the Universal Parallel Computing Research Centers at the University of California at Berkeley and the University of Illinois at Urbana-Champaign. Prior to joining Intel, Henry was Director of Scientific Computing at the U.S. Army Engineer Research and Development Center MSRC, a Department of Defense high-performance computing facility. Henry holds a B.S. in biochemistry from Louisiana State University, an M.S. in medical informatics from the Northwestern Feinberg School of Medicine, and a Ph.D. in molecular genetics from the University of Alabama at Birmingham School of Medicine. He has published extensively in computational life science and high-performance computing. Henry recently rejoined Intel after spending four years working on a second Ph.D., in information science, at the University of Illinois at Urbana-Champaign, where he developed expertise in applied informatics and machine learning for problems in healthcare and chemical exposure.

For more complete information about compiler optimizations, see our Optimization Notice.