This continues to be an intermittent but persistent issue. Upon investigation, the latest pip (18.1) sets Cache-Control: max-age=0 on index requests. This means our caching proxies are not caching the indexes pip requests. If you hit the indexes with a browser, the max-age comes back as 600 seconds, so we would be caching things for 10 minutes if pip did not explicitly disable this.
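To make the effect concrete, here is a minimal sketch (not pip's or the proxy's actual code) of the RFC 7234 rule a caching proxy applies: a request carrying max-age=0 can never be satisfied by a stored copy, regardless of the 600-second lifetime the origin allows.

```python
def may_serve_from_cache(request_cache_control: str, age_seconds: int) -> bool:
    """Simplified RFC 7234 check: a cache may serve a stored response
    only if its age does not exceed the request's max-age directive."""
    max_age = None
    for directive in request_cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
    if max_age is None:
        return True  # client imposed no freshness constraint
    return age_seconds <= max_age

# A browser request (no max-age directive) can be served a 5-minute-old copy:
print(may_serve_from_cache("", 300))           # True
# pip 18.1 sends Cache-Control: max-age=0, forcing a fresh fetch every time:
print(may_serve_from_cache("max-age=0", 300))  # False
```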
PyPA suggests that pip isn't the right tool for downloading and installing Python packages from PyPI in a CI system, but I don't think they suggest an alternative.
Another suggestion from PyPA is to go back to a bandersnatch mirror. We stopped doing this because PyPI has been growing at an unsustainable rate. Others appear to have noticed this as well, and PyPI is now publishing rough stats on package sizes at https://pypi.org/stats/. Total size as of today is 2.6TB, and excluding the top 100 packages by size would get us down to about 1TB total.
We can't exclude all of the top packages by size, since we depend on some of them (like grpcio and numpy), but we may be able to run a reasonably sized PyPI mirror now if we spin it up with a large blacklist of packages we know we don't want (like tf-nightly-gpu).
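If we go that route, the blacklist approach above could look roughly like the following bandersnatch config fragment. This is a sketch assuming the blacklist plugin introduced in bandersnatch 3.x; the exact section and plugin names should be checked against the version we deploy, and the package list here is illustrative.

```ini
[blacklist]
# Enable package-name filtering (bandersnatch 3.x blacklist plugin).
plugins =
    blacklist_package
# Packages we know we don't want mirrored.
packages =
    tf-nightly-gpu
```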