
Latency Analogies (part 2)

Apr 08, 2013

In a prior blog post, I talked about latency analogies.  I compared levels of latencies to your home, your neighborhood, a far-away neighborhood, and another city.  I talked about these localities in terms of communication.

Let's extend that analogy to talk about data locality.

Say that you're out of pineapples.  And you really, really need a pineapple to make that upside-down cake that you love.

So you go to the fruit stand out in front of your subdivision to buy a pineapple.

Sadly, however, they're out of pineapples.  So you have to go to the local supermarket downtown.

Unbelievably, they're out of pineapples, too.  So you have to drive for a few hours out into the countryside to the pineapple farm to get your pineapple.

This is pretty much exactly what the multiple levels of cache do in a server:

  • Check for pineapples (data) in the nearest cache (L1).
  • If the data is not available in the L1 cache, go check the next cache (e.g., L2) - which is a little farther away and takes longer to check.
  • If the data is not available in the caches, go all the way out to RAM and get the data from there.

The farther out you have to go, the longer it takes before you can start making that upside-down cake (i.e., computing with that data).

So when writing your application, think about how far you have to go to get the data: a minute or two to the local fruit stand, 10-20 minutes to the local grocery store, or hours to get out to the farm itself.
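You can actually see these distances on your own machine.  Here's a minimal sketch (my own illustration, not part of the original post) that times strided reads over working sets of increasing size: once the working set outgrows L1, then L2/L3, the nanoseconds-per-access number jumps.  The specific sizes, stride, and timings are assumptions that will vary from machine to machine.

```c
/* Sketch: time strided reads over working sets of increasing size.
 * As the working set outgrows each cache level, the average access
 * time jumps -- the "drive to the farm" gets longer. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Walk the buffer with a 64-byte stride (one cache line = 16 ints)
   so every access touches a different cache line. */
static double ns_per_access(size_t num_ints, int passes)
{
    volatile int *buf = malloc(num_ints * sizeof(int));
    for (size_t i = 0; i < num_ints; ++i)
        buf[i] = (int) i;

    clock_t start = clock();
    for (int p = 0; p < passes; ++p)
        for (size_t i = 0; i < num_ints; i += 16)
            (void) buf[i];          /* volatile: the load really happens */
    clock_t stop = clock();

    free((void *) buf);
    double secs = (double) (stop - start) / CLOCKS_PER_SEC;
    return secs * 1e9 / ((double) passes * (num_ints / 16));
}

int main(void)
{
    /* Working sets from 16 KB (fits in L1) up to 64 MB (spills to RAM) */
    for (size_t kb = 16; kb <= 64 * 1024; kb *= 4) {
        size_t n = kb * 1024 / sizeof(int);
        printf("%8zu KB working set: %.2f ns per access\n",
               kb, ns_per_access(n, 8));
    }
    return 0;
}
```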

But there's a twist here: once you finally get the pineapples from the farm, the local grocery store and fruit stands will remember that you've asked for pineapples recently and will stock up on pineapples.

Specifically: the first time you get pineapples, it's expensive.  But the next time you want pineapples, it might be significantly faster because the fruit stands / stores will have them in stock.

...until the local fruit stands / grocery stores run out of pineapples.  Specifically, if you don't keep ordering pineapples, the fruit stands / grocery stores won't stock them, and you'll have to drive all the way out into the country to get them again.

The moral of the story is that dealing with data is all about location and time:

  1. The first time you ask for a piece of data, it'll take a while to retrieve it from RAM.
  2. If you ask for the same data frequently, it'll likely be available from a nearby cache.

The takeaway here (skipping lots and lots of detail, because this blog entry is already too long) is that you should organize your code to group all your accesses of the same data together to maximize cache re-use.
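As a concrete (hypothetical) example of grouping accesses, consider summing a large matrix in C.  C stores matrices row-major, so walking it row by row uses every byte of each cache line that gets fetched, while walking it column by column touches a new cache line on almost every access and throws most of each line away -- like driving to the farm for every single pineapple.  The matrix size and function names below are just for illustration.

```c
/* Sketch: same arithmetic, very different memory behavior. */
#include <stdio.h>
#include <stdlib.h>

#define N 4096

/* Cache-friendly: consecutive accesses hit consecutive addresses. */
double sum_row_major(double (*a)[N])
{
    double sum = 0.0;
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            sum += a[i][j];
    return sum;
}

/* Cache-hostile: each access jumps N * sizeof(double) bytes ahead. */
double sum_col_major(double (*a)[N])
{
    double sum = 0.0;
    for (int j = 0; j < N; ++j)
        for (int i = 0; i < N; ++i)
            sum += a[i][j];
    return sum;
}

int main(void)
{
    double (*a)[N] = malloc(sizeof(double[N][N]));
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            a[i][j] = 1.0;

    printf("row-major sum: %f\n", sum_row_major(a));
    printf("col-major sum: %f\n", sum_col_major(a));
    free(a);
    return 0;
}
```

Time the two functions and the row-major version will typically finish several times faster, even though both do exactly N * N additions.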


Tags: HPC, MPI, NUMA
