Redis sorted sets store scores as floating-point numbers

Today I was playing a little with our statistics. I was writing some Go (golang) code that requests the top 20 customers from Redis, using a sorted set and the ZREVRANGEBYSCORE command. Then I found out that the score of a sorted set is actually stored as a double-precision floating-point number. Normally this would not bother me and I would happily use the float storage for integer values.
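
As a side note, here is a rough sketch of what such a query can look like in Go with the go-redis client (the key name "traffic:monthly" and the connection details are just placeholders, not our actual code):

package main

import (
	"context"
	"fmt"

	"github.com/redis/go-redis/v9"
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// ZREVRANGEBYSCORE with LIMIT 0 20: the 20 highest scores first.
	top, err := rdb.ZRevRangeByScoreWithScores(ctx, "traffic:monthly", &redis.ZRangeBy{
		Min:    "-inf",
		Max:    "+inf",
		Offset: 0,
		Count:  20,
	}).Result()
	if err != nil {
		panic(err)
	}

	for i, z := range top {
		// Note: z.Score is a float64, which is exactly what this post is about.
		fmt.Printf("%2d. %v: %.0f bytes\n", i+1, z.Member, z.Score)
	}
}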

But this time I wanted to make a top 20 of the real-time monthly data traffic of our entire CDN platform. A little background: hits on the Nginx edges are measured in bytes and the logs are streamed to our statistics cluster. Therefore the real-time statistics counters for traffic are in bytes. Normally we use 64-bit integers (in the worst case they are signed and you lose 1 bit).

2^63 - 1 = 9,223,372,036,854,775,807
           EB, PB, TB, GB, MB, kB,  B

If you Google for “9,223,372,036,854,775,807 bytes per month in gigabit per second” you will find that this is about 26 Tbps on average. We do not have customers that big yet, so an int64 will do for now. But how about the double-precision float? Since it is a floating-point format, theory says it cannot count reliably once the numbers get too large. But how large is too large? I quickly implemented a small script in Go to find out:

package main

import (
	"fmt"
)

func main() {
	bits := 1           // bits needed to represent the current value
	float := float64(1) // start at 1 = 2^0
	// Keep doubling for as long as adding 1 still changes the value.
	for float+1 != float {
		float *= 2
		bits++
	}
	// The first power of two where an added 1 is lost to rounding.
	fmt.Printf("%.0f = %d bits\n", float, bits)
}

Every step the script doubles the number and checks whether adding 1 still changes it. This is the output, showing where the counting goes wrong:

9007199254740992 = 54 bits
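
That matches the float64 format: the significand holds 53 bits (52 stored plus 1 implicit), so whole numbers are exact only up to 2^53, and the first value where an added byte gets lost is 2^53 itself, a 54-bit number. A quick check:

package main

import "fmt"

func main() {
	// float64 has a 53-bit significand, so integers stay exact only up to 2^53.
	n := float64(1 << 53) // 9007199254740992
	fmt.Println(n+1 == n) // true: one extra byte is silently rounded away
	fmt.Println(n-1 == n) // false: just below 2^53 every byte still counts
}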

So from 54 bits onwards the counting is no longer precise (to the byte). What does that mean for our CDN statistics? Let’s do the same calculation we did before:

2^53 = 9,007,199,254,740,992
       PB, TB, GB, MB, kB,  B

If you Google for “9,007,199,254,740,992 bytes per month in gigabits per second” you will find that this is about 25 Gbps averaged over the month. We definitely have customers that do much more than that.
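
The Google calculation is easy to sanity-check by hand; the exact figure shifts a little with the assumed month length and with decimal versus binary prefixes, but it lands in the same ballpark. A quick sketch, assuming a 31-day month:

package main

import "fmt"

func main() {
	const bytesPerMonth = 1 << 53          // 9,007,199,254,740,992 bytes
	const secondsPerMonth = 31 * 24 * 3600 // assuming a 31-day month

	bitsPerSecond := float64(bytesPerMonth) * 8 / secondsPerMonth
	fmt.Printf("%.1f Gbps (decimal prefixes)\n", bitsPerSecond/1e9)    // ~26.9
	fmt.Printf("%.1f Gbps (binary prefixes)\n", bitsPerSecond/(1<<30)) // ~25.1
}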

I quickly calculated that the deviation would be less than 0.000000000000001%. But then I realized I was wrong: at a 26 Tbps monthly average the counter sits close to 2^63, and at that magnitude the deviation might well be as big as 1 kB (the bottom 10 bits of the counter are simply gone). Imagine that the customer is mainly serving images and JavaScript from the CDN and has an average file size of 10 kB. In that case the statistics could be off by up to 10% during the last days of the month!
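
To make that concrete: just below the int64 limit, two adjacent float64 values are exactly 1,024 bytes apart, so a byte counter at that magnitude only moves in 1 kB steps and every hit gets rounded to the nearest step. A small sketch (the counter value of roughly 9.2e18 bytes and the 9,999-byte hit are made-up examples):

package main

import (
	"fmt"
	"math"
)

func main() {
	// Hypothetical counter near the end of a ~26 Tbps month (made-up value).
	counter := 9.2e18 // bytes, just below the int64 limit

	// Gap between this value and the next representable float64:
	step := math.Nextafter(counter, math.Inf(1)) - counter
	fmt.Printf("counter granularity: %.0f bytes\n", step) // 1024

	// A 9,999-byte hit gets rounded to the nearest 1 kB step:
	hit := 9999.0
	fmt.Printf("added %.0f bytes, the counter moved by %.0f bytes\n",
		hit, (counter+hit)-counter) // 10240
}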

Okay, this may be the worst-case scenario, but I still would not sleep well ignoring it. I feel that when it comes to CDN statistics, accuracy is very important. You are dealing with large numbers and lots of calculations, and as you can see that may have some unexpected side effects. That is why these kinds of seemingly small things keep me busy.
