Comment 0 for bug 1809041

Jamon Camisso (jamon) wrote : Unable to push to registry with swift storage backend

Somehow the swift backend can time out and stop an image layer from being uploaded. Debug-level errors from the registry look like this:

{"err.code":"unknown","err.detail":"swift: Timeout expired while waiting for segments of /docker/registry/v2/blobs/sha256/03/0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd/data to show up","err.message":"unknown error","go.version":"go1.10.4","http.request.host":"registry.jujucharms.com","http.request.id":"cdc35629-693b-452d-b6a6-60e214c4d9ca","http.request.method":"PUT","http.request.remoteaddr":"....","http.request.uri":"/v2/jamon/kubeflow-tf-hub/jupyterhub-image/blobs/uploads/92fe2adb-00f8-4515-8743-88bb23e76450?_state=....digest=sha256%3A0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd","http.request.useragent":"docker/18.06.1-ce go/go1.10.4 git-commit/e68fc7a kernel/4.15.0-42-generic os/linux arch/amd64 UpstreamClient(Go-http-client/1.1)","http.response.contenttype":"application/json; charset=utf-8","http.response.duration":"13.171153577s","http.response.status":500,"http.response.written":104,"level":"error","msg":"response completed with error","time":"2018-12-18T20:06:53.271720479Z","vars.name":"jamon/kubeflow-tf-hub/jupyterhub-image","vars.uuid":"92fe2adb-00f8-4515-8743-88bb23e76450"}

The " Timeout expired while waiting for segments" message, and 0 length data file make me think that the issue is this upstream bug: https://github.com/docker/distribution/issues/1013

For example, in that failed upload, the files in question look like this in swift:

$ swift stat --lh docker-registry-blobs files/docker/registry/v2/repositories/juju/kubeflow-tf-hub/jupyterhub-image/_layers/sha256/0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd/link
       Account: AUTH_18fdda09da1747f4885b940cadff4cc0
     Container: docker-registry-blobs
        Object: files/docker/registry/v2/repositories/juju/kubeflow-tf-hub/jupyterhub-image/_layers/sha256/0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd/link
  Content Type: application/octet-stream
Content Length: 71
 Last Modified: Tue, 11 Dec 2018 05:39:19 GMT
          ETag: 239c76eecd4fd38d448173f554cb4a36
 Accept-Ranges: bytes
   X-Timestamp: 1544506758.53727
    X-Trans-Id: txae919e1f3a5a4b0ea0622-005c1955de

$ swift stat --lh docker-registry-blobs files/docker/registry/v2/blobs/sha256/03/0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd/data
       Account: AUTH_18fdda09da1747f4885b940cadff4cc0
     Container: docker-registry-blobs
        Object: files/docker/registry/v2/blobs/sha256/03/0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd/data
  Content Type: application/octet-stream
Content Length: 0
 Last Modified: Tue, 11 Dec 2018 05:39:15 GMT
          ETag: "d41d8cd98f00b204e9800998ecf8427e"
      Manifest: docker-registry-blobs/segments/2f6/46f636b65722f72656769737472792f76322f7265706f7369746f726965732f6a756a752f6b756265666c6f772d74662d6875622f6a7570797465726875622d696d6167652f5f75706c6f6164732f33353635356461622d346234652d343263382d613532312d6536316433336366613434632f64617461425d87ada26ee6929920326a4d60f263d0cd2c939385f948139e4791eb2d69a6da39a3ee5e6b4b0d3255bfef95601890afd80709
 Accept-Ranges: bytes
   X-Timestamp: 1544506754.17360
    X-Trans-Id: tx2f2d17d03c0248a8a7444-005c1955fb

Anyone pushing an image that contains the affected layer will run into the same error, since the files in swift are named after the layer's sha256 digest. The upstream issue doesn't offer much in the way of fixes.

The workaround for now is to delete the affected files from swift and then re-push the image in question.
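A sketch of that cleanup, assuming the container name and object layout shown in the swift stat output above (each repository that references the layer has its own _layers link, so only the two paths shown above are covered here; the delete commands are echoed first as a dry run, drop the echo to actually run them):

```shell
# Digest taken from the registry error log; container and repo path
# taken from the swift stat output above.
DIGEST=0330ca45a200e1fcef05ae97f434366d262a1c50b3dc053e7928b58dd37211dd
CONTAINER=docker-registry-blobs
REPO=juju/kubeflow-tf-hub/jupyterhub-image

# The blob data object lives under a two-character prefix of the digest.
PREFIX=$(printf '%s' "$DIGEST" | cut -c1-2)
BLOB="files/docker/registry/v2/blobs/sha256/${PREFIX}/${DIGEST}/data"
LINK="files/docker/registry/v2/repositories/${REPO}/_layers/sha256/${DIGEST}/link"

# Dry run: print the delete commands. swift delete also removes the
# segments referenced by a manifest object.
echo swift delete "$CONTAINER" "$BLOB"
echo swift delete "$CONTAINER" "$LINK"
```

After deleting both objects, a `docker push` of the image re-uploads the layer and recreates the blob and link.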