
Fetch repository data in parallel

If I run "zypper up" or "apt-get update", the repository data is fetched sequentially. The big question: why? Why can't we speed up the update process by starting all the repository data downloads in parallel? I am not talking about package updates, just repo metadata.

In my view it is because it is not necessary.

Currently, the typical update processes (apt, yum, etc.) are generally not bandwidth-limited. The fraction of the update process spent downloading repository files or packages is either insignificant (a few seconds) or would not be significantly improved by parallelisation; indeed, if bandwidth is the bottleneck, parallelisation may make things worse.

There are other limitations. Apt, for example, does not even support two simultaneous operations, whereas yum and emerge do. Such limitations may exist to keep complexity down, or simply because sequential fetching is not really a problem for everyday users and sysadmins.

Complexity and performance require effort, which means the effort needs to be justified to some extent before it is spent.

However, if you really want to pursue this, it is generally possible:

* <
* <
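To illustrate why this is technically straightforward, here is a minimal sketch of parallel metadata fetching using Python's `ThreadPoolExecutor`. The URLs and the `fetch` function are hypothetical stand-ins (a real client would read the repo list from the package manager's configuration and perform actual HTTP requests, e.g. with `urllib.request.urlopen`); the point is only that each repository's metadata download is independent and can run concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical repository index URLs; a real tool would read these from
# /etc/apt/sources.list or the zypper repo configuration.
REPO_URLS = [
    "http://example.org/repo1/InRelease",
    "http://example.org/repo2/InRelease",
    "http://example.org/repo3/InRelease",
]

def fetch(url):
    # Stand-in for an HTTP GET of the repository metadata.
    # A real implementation would do: urllib.request.urlopen(url).read()
    return (url, "metadata for " + url)

def fetch_all(urls, workers=4):
    # Each download is independent, so a thread pool can run them
    # concurrently; the result maps each URL to its fetched metadata.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(fetch, urls))
```

Since the work is I/O-bound, threads (rather than processes) are the natural choice here; the thread pool also caps concurrency so a long repo list does not open hundreds of connections at once.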
