Handling Traffic Spikes for Live Streaming Portals

Hello! I run a site that provides tools for TikTok Live creators. When a big creator goes live, our traffic spikes from 0 to 10,000+ users in seconds.

We are currently looking at moving our asset delivery to Fastly to handle these “bursty” traffic patterns. On our current site, https://tocwave.com, we are struggling with origin overload during these peaks.

Does the Fastly community recommend using Edge Rate Limiting to protect the origin, or should we focus strictly on high TTLs for the streaming chunks?

Appreciate any insights from the experts!


Hi @Mark, welcome to the forum :slight_smile:

For “bursty” traffic events, I’d recommend that you check out our documentation on Shielding.

Setting this up would mean that your origin only receives one request for each chunk, which will then be distributed throughout our network. This, in combination with a reasonable TTL, should alleviate the pressure on your origin significantly.
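As a starting point, here’s a minimal VCL sketch of the TTL side of that advice. The `/chunks/` path and the specific durations are assumptions for illustration, not values from your setup; tune the TTL to how quickly your chunks rotate:

```vcl
sub vcl_fetch {
  # Hypothetical path — adjust to wherever your streaming chunks live.
  if (req.url ~ "^/chunks/") {
    # TTL should roughly match your chunk rotation interval.
    set beresp.ttl = 10s;
    # Serve a slightly stale chunk while revalidating in the background,
    # so viewers aren't blocked on the origin during a spike.
    set beresp.stale_while_revalidate = 5s;
    return(deliver);
  }
}
```

With Shielding enabled on the origin, cache misses across our POPs funnel through the shield POP, which combines with request collapsing so the origin sees roughly one fetch per chunk per TTL window.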

Edge Rate Limiting is only really helpful if you’re seeing bad actors sending a huge number of requests for chunks. If you set it up to shed traffic during legitimate spikes, you’d end up blocking real viewers from being served.
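If you do later confirm abuse, a targeted ERL rule can sit alongside caching without touching normal viewers. A sketch with illustrative thresholds (the 200 req/s limit and 2-minute penalty are placeholders — pick a ceiling far above what a real player ever requests):

```vcl
penaltybox pb {}
ratecounter rc {}

sub vcl_recv {
  # Count requests per client IP over a 10-second window; clients
  # exceeding ~200 req/s land in the penalty box for 2 minutes.
  # These numbers are examples only — size them well above
  # legitimate player behavior.
  if (ratelimit.check_rate(client.ip, rc, 1, 10, 200, pb, 2m)) {
    error 429 "Too Many Requests";
  }
}
```

Because the threshold only trips on clearly abusive request rates, a viral spike of many distinct viewers each fetching chunks at a normal cadence sails through untouched.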

Hope this helps. Best of luck!


Hi, thanks for the warm welcome! :slightly_smiling_face:
That makes a lot of sense: using Shielding to collapse requests so the origin only serves one request per chunk is exactly the kind of origin offload we need during those spikes. We’ll prioritize setting up Shielding and tune the TTL based on how often the chunks rotate.
Good point on Edge Rate Limiting too — we don’t want to throttle legitimate viewers during a real live event. We’ll keep ERL as a targeted mitigation only if we confirm abusive/request-flood patterns.
Appreciate the help!