Bug 255046 - Ports should fetch distfiles in parallel
Summary: Ports should fetch distfiles in parallel
Status: New
Alias: None
Product: Ports & Packages
Classification: Unclassified
Component: Ports Framework
Version: Latest
Hardware: Any
OS: Any
Importance: --- Affects Only Me
Assignee: Port Management Team
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2021-04-14 08:55 UTC by Tobias Kortkamp
Modified: 2021-04-14 10:48 UTC (History)
2 users

See Also:


Attachments

Description Tobias Kortkamp freebsd_committer 2021-04-14 08:55:36 UTC
Ports should fetch distfiles in parallel.  There are more and more
ports that have >20 distfiles and it is painful to wait for the
framework to fetch each one in sequence.
Comment 1 Mathieu Arnold freebsd_committer 2021-04-14 09:49:30 UTC
I don't think ports SHOULD fetch distribution files in parallel: not everyone has a DSL or faster internet connection, and not everyone needs fetching to be fast.

We could add an option to do it, but you can already fetch in parallel yourself. `make fetch-list` prints the list of commands it will run to fetch distfiles, one per line.

So, all one needs to do is append an & to the end of each line and run the result in a shell.

For example by doing:

make fetch-list | sed -e 's/$/ \&/' | sh

If you want the pipeline to return only once all the fetches have finished, this can be expanded to something like this:

{ make fetch-list | sed -e 's/$/ \&/'; echo wait; } | sh
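A bounded variant of the same idea (not from the thread; the xargs -P invocation and the echo placeholders are illustrative) caps how many fetches run at once instead of backgrounding everything simultaneously:

```shell
# Sketch only: in a real ports tree the printf below would be
# `make fetch-list`, which prints one fetch command per line.
# xargs -I CMD reads one line at a time and -P 4 runs at most
# four of those commands concurrently; xargs itself waits for
# all of them to finish before returning.
printf '%s\n' \
    'echo fetched distfile-a' \
    'echo fetched distfile-b' \
    'echo fetched distfile-c' \
| xargs -I CMD -P 4 sh -c CMD
```

With real output this would be `make fetch-list | xargs -I CMD -P 4 sh -c CMD`; because xargs waits for its children, no explicit `wait` is needed, and the concurrency limit avoids hammering a single distfile site.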
Comment 2 Daniel Engberg freebsd_committer 2021-04-14 10:48:59 UTC
I guess this is mainly a concern for ports using go, rust, etc., although hammering a site isn't ideal either. Not sure how much time you'd save, but perhaps first try reusing the same http/https session (keep-alive) instead of creating a new connection for each file?