SDCH (Shared Dictionary Compression for HTTP) Support


#1

Google has created a compression format as an alternative to gzip: SDCH (Shared Dictionary Compression for HTTP). It is supported by Chrome and the Android browser. Google uses it on its own sites, as does LinkedIn, and both have seen significant performance improvements. Browser support is advertised via the Accept-Encoding HTTP header.
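For reference, my rough understanding of how the negotiation works (the header names come from the SDCH draft spec; the URL, dictionary path, and hash below are made up for illustration):

```python
# Rough sketch of the SDCH negotiation as I understand it from the draft spec;
# the URL, dictionary path, and hash are hypothetical.
import requests  # third-party package, assumed installed

# 1. The browser advertises SDCH support alongside gzip.
resp = requests.get("https://www.example.com/",
                    headers={"Accept-Encoding": "gzip, deflate, sdch"})

# 2. A server that wants to use SDCH points the client at a dictionary to fetch.
dictionary_path = resp.headers.get("Get-Dictionary")  # e.g. "/dicts/abc123.dict"

# 3. On later requests the client lists the dictionaries it holds (by hash),
#    and the server may then respond with Content-Encoding: sdch.
if dictionary_path:
    requests.get("https://www.example.com/",
                 headers={"Accept-Encoding": "gzip, deflate, sdch",
                          "Avail-Dictionary": "beQ1xPpc"})  # hypothetical hash
```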

Is this something that Fastly can add support for? I think it could have huge benefits for us.

Thanks!


#2

Hi,

I didn’t notice this post earlier, and I’m a little surprised no one else has responded yet.

First off, I would like to ask what exactly you mean by “add support for”? If you mean “have Fastly do the compression”, I don’t think that’s very feasible. SDCH is most effective when a lot of content is shared between multiple objects, so it requires analysis not only of multiple pieces of content, but also of traffic patterns, to figure out which pieces of content are frequently served together. That’s currently beyond our scope, and is better done in coordination with other FEO (Front End Optimization) work.
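As a toy illustration of why the shared content matters (this uses zlib’s preset-dictionary feature rather than SDCH’s VCDIFF encoding, but it demonstrates the same idea): if the dictionary already contains the boilerplate that many pages share, each page compresses to little more than its unique part.

```python
# Toy demo of shared-dictionary compression using zlib's preset dictionary
# (not SDCH's VCDIFF, but the same underlying idea). The HTML is made up.
import zlib

shared_boilerplate = (b"<html><head><title>Example Site</title></head>"
                      b"<body><nav>Home | About | Contact</nav>")
page = shared_boilerplate + b"<p>Only this sentence is unique.</p></body></html>"

def deflate(data, zdict=None):
    comp = zlib.compressobj(9) if zdict is None else zlib.compressobj(9, zdict=zdict)
    return comp.compress(data) + comp.flush()

print("without dictionary:", len(deflate(page)), "bytes")
print("with shared dictionary:", len(deflate(page, zdict=shared_boilerplate)), "bytes")
```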

However, even if you handle SDCH entirely at the origin, there is a catch: we currently normalize the Accept-Encoding header to the most common case (gzip). We’re in the process of adding a header that exposes the original value of Accept-Encoding, so customers will be able to do their own normalization that takes other forms of compression into account.
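To make the normalization idea concrete, here’s a rough sketch in Python; the bucketing logic and the brotli case are my own illustration, not Fastly’s actual implementation:

```python
# Hypothetical sketch of Accept-Encoding normalization: collapse the many
# variants browsers send into a few canonical values, so the cache keeps one
# copy per encoding rather than one per header spelling. Not Fastly's code.
def normalize_accept_encoding(header):
    encodings = {part.split(";")[0].strip().lower()
                 for part in (header or "").split(",")}
    if "br" in encodings:    # brotli-capable client
        return "br"
    if "gzip" in encodings:  # the common case normalized to today
        return "gzip"
    return ""                # fall back to uncompressed

# All of these collapse to "gzip", so they can share one cached object:
for value in ["gzip", "gzip, deflate", "GZIP,deflate,sdch", "deflate, gzip;q=0.9"]:
    assert normalize_accept_encoding(value) == "gzip"
```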

I personally doubt SDCH will get much traction, since it really is a PITA to get the most efficient use out of it. Brotli, however, is the new kid on the block and shows great promise, with compression and decompression speeds equal to or better than gzip’s, and better compression ratios. That’s why we’re adding support for customers to do their own Accept-Encoding normalization.
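If you want a feel for the ratio difference yourself, a quick comparison like this works (it assumes the third-party brotli bindings, installed with pip install brotli; gzip is in the standard library, and the payload is made up):

```python
# Quick size comparison of gzip vs. brotli on a repetitive HTML-ish payload.
import gzip
import brotli

payload = b"<html><body>" + b"<p>hello, compression!</p>" * 1000 + b"</body></html>"

gz = gzip.compress(payload, compresslevel=9)
br = brotli.compress(payload, quality=11)

print(f"original: {len(payload)} bytes, gzip: {len(gz)} bytes, brotli: {len(br)} bytes")
```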


#3

Thank you for responding!

By support, I did mean “have Fastly do the compression”. I figured that Fastly would be well positioned to analyze content usage patterns and figure out how best to build the dictionary. Is FEO something that Fastly is not interested in doing?

Can you explain the current gzip “normalization”? Are you not passing along the client’s Accept-Encoding request header at this time, and instead always sending gzip?

Can you explain the change? How will we be able to use non-gzip compression in the future? Will Fastly do the compression on its end, or will it just allow the origin server to use other types of compression and handle the caching appropriately?

What happens today? Does Fastly do gzip compression at the edge servers, or does it just pass along the origin server’s compressed content?

I am also very excited about brotli, and I look forward to using your support for it.