Fastly's Improvements to Large File Delivery

2014-10-29 05:04:32 +08:00
 aveline
http://www.fastly.com/blog/improving-the-delivery-of-large-files-with-stream-on-miss-and-large-file-support

http://docs.fastly.com/guides/caching/what-support-does-fastly-have-for-large-files

Took a look; it seems Varnish 3.x already has this feature:
https://www.varnish-cache.org/docs/3.0/reference/vcl.html
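
For context, a minimal sketch of how this is enabled in Varnish 3.x VCL (my own illustration, not taken from the linked docs; the ~10 MB threshold is an arbitrary example):

    # Varnish 3.x: stream the body to the client while it is still being
    # fetched from the backend, instead of buffering the whole object first.
    sub vcl_fetch {
        # A Content-Length of 8+ digits means the body is at least
        # 10,000,000 bytes (~10 MB); the threshold is only illustrative.
        if (beresp.http.Content-Length ~ "^[0-9]{8,}$") {
            set beresp.do_stream = true;
        }
    }

Note that in 3.x a streamed object is marked busy while it is being delivered, so only the client that triggered the fetch can read it (see the docs quoted in the reply below).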
2982 clicks
Node: CDN
4 replies
aveline
2014-10-29 05:08:55 +08:00
Read it more carefully; the 3.x docs put it this way:

Deliver the object to the client directly without fetching the whole object into varnish. If this request is pass'ed it will not be stored in memory. As of Varnish Cache 3.0 the object will marked as busy as it is delivered so only client can access the object.

whereas in 4.x it reads:

Deliver the object to the client directly without fetching the whole object into varnish. If this request is pass'ed it will not be stored in memory.
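
So the busy/single-client restriction from 3.x is no longer mentioned in 4.x. The knob itself also moved from vcl_fetch to vcl_backend_response, and streaming is on by default in Varnish 4.0; a minimal sketch (again my own illustration):

    sub vcl_backend_response {
        # Varnish 4.x: do_stream defaults to true, so setting it explicitly
        # is only needed if streaming was turned off elsewhere in the VCL.
        set beresp.do_stream = true;
    }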
Livid
2014-10-29 05:29:15 +08:00
I had no idea it used to work like this:

To illustrate this, let’s look at a download example. If a 25MB application is being served from an origin over a connection that’s giving each client ~375kb/s, the download will take about 70 seconds. If that application was cached on Fastly without the Stream-on-Miss feature, then the first client to get a miss would have to wait 70 seconds while Fastly fetched it from the origin, and only then would they start downloading from our edge server.
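
Checking the arithmetic in that quote (assuming "25MB" means 25,000 kB and "~375kb/s" is really ~375 kB per second, which is what makes the numbers work):

\[
t_{\text{fetch}} \approx \frac{25{,}000\ \text{kB}}{375\ \text{kB/s}} \approx 67\ \text{s} \approx 70\ \text{s}
\]

So without Stream-on-Miss, the first client on a miss waits that ~70 s fetch and then spends roughly another ~70 s downloading from the edge (their own link being the bottleneck), about twice the time they would see with streaming.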
aveline
2014-10-29 05:32:59 +08:00
@Livid I didn't realize it either. The 100 MB limit is also a bit awkward; maybe it's because early Varnish only cached objects in memory?
sNullp
2014-10-29 05:43:15 +08:00
The reason I tested all those reverse proxies earlier was precisely to observe this behavior:
/t/141198
