| Summary: | Ports should fetch distfiles in parallel | | |
|---|---|---|---|
| Product: | Ports & Packages | Reporter: | Tobias Kortkamp <tobik> |
| Component: | Ports Framework | Assignee: | Port Management Team <portmgr> |
| Status: | Closed Not Accepted | | |
| Severity: | Affects Only Me | CC: | diizzy, ports-bugs |
| Priority: | --- | | |
| Version: | Latest | | |
| Hardware: | Any | | |
| OS: | Any | | |
Description

Tobias Kortkamp 2021-04-14 08:55:36 UTC

I don't think ports SHOULD fetch distribution files in parallel: not everyone has a DSL or faster internet connection, and not everyone needs the fetch to be fast. We could add an option for it, but you can already fetch in parallel yourself. `make fetch-list` returns the list of commands it will run to fetch the distfiles, one per line. So all one needs to do is append an `&` to the end of each line and run the result in a shell, for example:

```
make fetch-list | sed -e 's/$/ \&/' | sh
```

If you want the command to return only once all the fetches have finished, this can be expanded to something like:

```
{ make fetch-list | sed -e 's/$/ \&/'; echo wait; } | sh
```

I guess this is mainly a concern for ports using Go, Rust, etc., although hammering a site isn't ideal either. Not sure how much time you'd save, but perhaps first try to reuse the same HTTP/HTTPS session (keep-alive) instead of creating a new connection for each file?

I agree with all the comments here.
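A middle ground between a fully serial fetch and backgrounding every command at once is to cap the number of concurrent jobs. A minimal sketch, assuming FreeBSD xargs(1) with `-0`, `-n`, and `-P` support; the job count of 4 is an arbitrary choice, not something from this report:

```
# Each fetch-list line is a complete shell command, so NUL-terminate the
# lines and hand each one to sh -c, running at most 4 fetches at a time.
make fetch-list | tr '\n' '\0' | xargs -0 -n 1 -P 4 sh -c
```

Because at most four fetches run at any moment, this also softens the "hammering a site" concern, and xargs itself only returns once every job has exited, so no explicit `wait` is needed.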
I don't think ports SHOULD fetch distribution files in parallel, not everyone has DSL or faster internet connections, and not everyone needs to be fast. We could add an option to do it, but you already can do fetch in parallel yourself. `make fetch-list` returns the list of commands it will run to fetch distfiles, one per line. So, all one needs is to add an & at the end of each lines, and run it in a shell. For example by doing: make fetch-list | sed -e 's/$/ \&/' | sh If you only want the command to return when it has finished, this can be expanded to something like this: {make fetch-list | sed -e 's/$/ \&/'; echo wait} | sh I guess this mainly a concern for ports using go, rust etc although hammering a site isn't ideal either. Not sure how much time you'd save but perhaps first try to reuse the same http/https session (keep-alive) instead creating a new connection for each file? I agree with all the comments here. |