                              Metalink

  Try not to download the same file twice.  Improve cache efficiency
  and speed up downloads.

  Take standard headers and knowledge about objects in the cache and
  potentially rewrite those headers so that a client will use a URL
  that's already cached instead of one that isn't.  The headers are
  specified in [RFC 6249] (Metalink/HTTP: Mirrors and Hashes) and
  [RFC 3230] (Instance Digests in HTTP) and are sent by various
  download redirectors or content distribution networks.
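
  For example, a download redirector might answer a request with a
  redirect along these lines (the mirror URL and digest value here are
  made-up placeholders, not taken from the plugin or the RFCs):

  <pre>
  HTTP/1.1 302 Found
  Location: http://mirror.example.com/pub/example-1.0.iso
  Digest: SHA-256=3iK9lZJ0Xa0tE1s6q8cVxQn2bYH5m7dR4wU1pCkTf8g=
  </pre>

  The Digest header carries the base64-encoded SHA-256 hash of the
  file, so the same file can be recognized no matter which mirror URL
  it is downloaded from.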


1.  Who Cares?

  More important than saving a little bandwidth, this saves users
  from frustration.

  A lot of download sites distribute the same files from many
  different mirrors, and users don't know which mirrors are already
  cached.  These sites often present users with a simple download
  button, but the button doesn't predictably access the same mirror,
  or a mirror that's already cached.  To users it seems as though the
  download sometimes works (it takes seconds) and sometimes doesn't
  (it takes hours), which is frustrating.

  An extreme example of this happens when users share a limited,
  possibly unreliable internet connection, as is common in parts of
  Africa.

  [How to cache openSUSE repositories with Squid] is another example
  of a use case where picking a URL that's already cached is
  valuable.


2.  What it Does

  When the plugin sees a response with a "Location: ..." header and a
  "Digest: SHA-256=..." header, it checks whether the URL in the
  Location header is already cached.  If it isn't, it tries to find a
  URL that is cached to use instead.  It looks in the cache for an
  object that matches the digest in the Digest header and, if it
  finds one, it rewrites the Location header with that object's URL.

  This way a client should get sent to a URL that's already cached
  and won't download the file again.
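
  The sketch below shows roughly how a plugin can inspect these two
  headers with the Traffic Server C API.  It is only an outline of the
  first step: the cache lookup and the Location rewrite that the real
  metalink.cc performs are left as a comment, plugin registration is
  omitted, and the "metalink-sketch" debug tag is made up for this
  example.

  <pre>
  /* A sketch of the header check only, not the full plugin logic. */
  #include <string.h>
  #include <ts/ts.h>

  static int
  check_response(TSCont /* contp */, TSEvent /* event */, void *edata)
  {
    TSHttpTxn txnp = static_cast<TSHttpTxn>(edata);
    TSMBuffer bufp;
    TSMLoc hdr_loc;

    if (TSHttpTxnClientRespGet(txnp, &bufp, &hdr_loc) == TS_SUCCESS) {
      /* Look for the "Location: ..." and "Digest: ..." headers. */
      TSMLoc location = TSMimeHdrFieldFind(bufp, hdr_loc, TS_MIME_FIELD_LOCATION, TS_MIME_LEN_LOCATION);
      TSMLoc digest   = TSMimeHdrFieldFind(bufp, hdr_loc, "Digest", 6);

      if (location != TS_NULL_MLOC && digest != TS_NULL_MLOC) {
        int length = 0;
        const char *value = TSMimeHdrFieldValueStringGet(bufp, hdr_loc, digest, -1, &length);

        /* The real plugin parses the SHA-256 digest from this value, checks
         * whether the Location URL is already cached and, if it isn't, looks
         * in the cache for another URL with the same digest.  When it finds
         * one it rewrites the Location value, e.g. with
         * TSMimeHdrFieldValueStringSet(). */
        TSDebug("metalink-sketch", "Digest: %.*s", length, value);
      }

      if (digest != TS_NULL_MLOC)
        TSHandleMLocRelease(bufp, hdr_loc, digest);
      if (location != TS_NULL_MLOC)
        TSHandleMLocRelease(bufp, hdr_loc, location);
      TSHandleMLocRelease(bufp, TS_NULL_MLOC, hdr_loc);
    }

    TSHttpTxnReenable(txnp, TS_EVENT_HTTP_CONTINUE);
    return 0;
  }

  void
  TSPluginInit(int /* argc */, const char * /* argv */ [])
  {
    /* Look at response headers just before they are sent to the client. */
    TSHttpHookAdd(TS_HTTP_SEND_RESPONSE_HDR_HOOK, TSContCreate(check_response, NULL));
  }
  </pre>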


3.  How to Use it

  Just build the plugin and add it to your plugin.config file.

  The code is distributed along with recent versions of Traffic
  Server, in the plugins/experimental/metalink directory.  To build
  it, pass the --enable-experimental-plugins option to the configure
  script when you build Traffic Server:

  <pre>$ ./configure --enable-experimental-plugins</pre>

  When you're done building Traffic Server, add "metalink.so" to your
  plugin.config file to start using the plugin.
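
  For example, a plugin.config that loads only this plugin would
  contain the single line:

  <pre>metalink.so</pre>

  Then restart Traffic Server so that the plugin is loaded.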


4.  Read More

  More details are on the [wiki page] in the Traffic Server wiki.


  [RFC 6249]    http://tools.ietf.org/html/rfc6249

  [RFC 3230]    http://tools.ietf.org/html/rfc3230

  [How to cache openSUSE repositories with Squid]
                http://wiki.jessen.ch/index/How_to_cache_openSUSE_repositories_with_Squid

  [wiki page]   https://cwiki.apache.org/confluence/display/TS/Metalink