_translate_from_glance fails with "AttributeError: id" in grenade

Bug #1476770 reported by Matt Riedemann
Affects             Status         Importance  Assigned to
Glance              Invalid        High        Unassigned
Glance Client       Fix Released   High        Flavio Percoco  (nominated for Kilo by Matt Riedemann)
OpenStack-Gate      Fix Committed  Undecided   Matt Riedemann
keystonemiddleware  Fix Released   Undecided   Unassigned
openstack-ansible   Fix Released   High        Jesse Pretorius
  Kilo              Fix Released   High        Jesse Pretorius
  Liberty           Fix Released   High        Jesse Pretorius
  Trunk             Fix Released   High        Jesse Pretorius
oslo.vmware         Fix Released   High        Davanum Srinivas (DIMS)

Bug Description

http://logs.openstack.org/28/204128/2/check/gate-grenade-dsvm/80607dc/logs/old/screen-n-api.txt.gz?level=TRACE

2015-07-21 17:05:37.447 ERROR nova.api.openstack [req-9854210d-b9fc-47ff-9f00-1a0270266e2a tempest-ServersTestJSON-34270062 tempest-ServersTestJSON-745803609] Caught error: id
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack Traceback (most recent call last):
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/__init__.py", line 125, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return req.get_response(self.application)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/request.py", line 1317, in send
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack application, catch_exc_info=False)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/request.py", line 1281, in call_application
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack app_iter = application(self.environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 144, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return resp(environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/keystonemiddleware/auth_token/__init__.py", line 634, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return self._call_app(env, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/keystonemiddleware/auth_token/__init__.py", line 554, in _call_app
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return self._app(env, _fake_start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 144, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return resp(environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 144, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return resp(environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/routes/middleware.py", line 136, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack response = self.app(environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 144, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return resp(environ, start_response)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 130, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack resp = self.call_func(req, *args, **self.kwargs)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/webob/dec.py", line 195, in call_func
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return self.func(req, *args, **kwargs)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/wsgi.py", line 756, in __call__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack content_type, body, accept)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/wsgi.py", line 821, in _process_stack
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack action_result = self.dispatch(meth, request, action_args)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/wsgi.py", line 911, in dispatch
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack return method(req=request, **action_args)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/compute/servers.py", line 636, in create
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack self._handle_create_exception(*sys.exc_info())
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/compute/servers.py", line 465, in _handle_create_exception
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack six.reraise(*exc_info)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/api/openstack/compute/servers.py", line 621, in create
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack check_server_group_quota=check_server_group_quota)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/hooks.py", line 149, in inner
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack rv = f(*args, **kwargs)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/compute/api.py", line 1481, in create
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack check_server_group_quota=check_server_group_quota)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/compute/api.py", line 1077, in _create_instance
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack image_id, boot_meta = self._get_image(context, image_href)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/compute/api.py", line 765, in _get_image
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack image = self.image_api.get(context, image_href)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/image/api.py", line 93, in get
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack show_deleted=show_deleted)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/image/glance.py", line 310, in show
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack include_locations=include_locations)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/image/glance.py", line 484, in _translate_from_glance
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack include_locations=include_locations)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/opt/stack/old/nova/nova/image/glance.py", line 546, in _extract_attributes
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack queued = getattr(image, 'status') == 'queued'
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/glanceclient/openstack/common/apiclient/base.py", line 491, in __getattr__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack self.get()
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/glanceclient/openstack/common/apiclient/base.py", line 509, in get
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack new = self.manager.get(self.id)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack File "/usr/local/lib/python2.7/dist-packages/glanceclient/openstack/common/apiclient/base.py", line 494, in __getattr__
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack raise AttributeError(k)
2015-07-21 17:05:37.447 21251 TRACE nova.api.openstack AttributeError: id

Whatever this is, it's new:

http://logstash.openstack.org/#eyJzZWFyY2giOiJtZXNzYWdlOlwiX3RyYW5zbGF0ZV9mcm9tX2dsYW5jZVwiIEFORCBtZXNzYWdlOlwiQXR0cmlidXRlRXJyb3I6IGlkXCIgQU5EIHRhZ3M6XCJzY3JlZW4tbi1hcGkudHh0XCIiLCJmaWVsZHMiOltdLCJvZmZzZXQiOjAsInRpbWVmcmFtZSI6IjYwNDgwMCIsImdyYXBobW9kZSI6ImNvdW50IiwidGltZSI6eyJ1c2VyX2ludGVydmFsIjowfSwic3RhbXAiOjE0Mzc0OTk4MzcwNTl9
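
For context on why this surfaces as "AttributeError: id": glanceclient's apiclient Resource lazily re-fetches itself when an attribute lookup misses, and that re-fetch needs self.id. A minimal sketch of the pattern (a simplified, hypothetical reduction of the base.py code in the traceback, not the exact implementation):

    class Resource(object):
        # Simplified sketch of glanceclient's apiclient base.Resource;
        # names and structure are a reduction, not the real code.
        def __init__(self, manager, info):
            self.manager = manager
            self._loaded = False
            self.__dict__.update(info)  # attributes parsed from the response

        def __getattr__(self, k):
            # Called only when a normal attribute lookup fails.
            if not self._loaded:
                self._loaded = True
                self.get()              # lazily load the full resource ...
                return getattr(self, k)
            raise AttributeError(k)     # ... or give up on the second miss

        def get(self):
            # Re-fetching requires self.id. If no attributes were parsed
            # out of the response (as happens when header handling breaks,
            # see the comments below), 'id' is missing too, so this line
            # re-enters __getattr__ and raises AttributeError: id.
            new = self.manager.get(self.id)
            if new:
                self.__dict__.update(new.__dict__)

So once the response headers stop being parsed into image attributes, the very first attribute access cascades into AttributeError: id.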

Matt Riedemann (mriedem)
Changed in glance:
status: New → Confirmed
importance: Undecided → High
Revision history for this message
Matt Riedemann (mriedem) wrote :

I assume it's urllib3 1.11, which was released today: requests uses urllib3, and glanceclient uses requests, so that must be what's causing the problems.

Revision history for this message
Matt Riedemann (mriedem) wrote :

stable/kilo cap for urllib3 proposed: https://review.openstack.org/#/c/204193/

We'll use that to test the fix to grenade.

Revision history for this message
Matt Riedemann (mriedem) wrote :
Revision history for this message
Davanum Srinivas (DIMS) (dims-v) wrote :

Was the working version 1.10.4?

Revision history for this message
Ian Cordasco (icordasc) wrote :

requests vendors urllib3, so whatever requests version is installed at the gate (e.g., 2.7.0) is using a known working copy of urllib3. Unless we're installing requests from system packages and then installing urllib3 via pip, a new urllib3 release should have absolutely no effect on requests. Further, neither requests nor urllib3 appears in any of the tracebacks.

Revision history for this message
Matt Riedemann (mriedem) wrote :

Confirmed:

(1:34:41 PM) mriedem: could there be a thing where urllib3 1.11 is overwriting what requests uses?
(1:35:41 PM) sigmavirus24: mriedem: unless we're using system packages (apt-get install requests ; pip install urllib3) no
(1:38:00 PM) sigmavirus24: I'm all for being wrong, but I'm failing to see how urllib3 would be causing this beyond a correlation in time (which wouldn't imply causation)
(1:39:35 PM) sigmavirus24: What about backports around images in Nova
(1:39:41 PM) mriedem: sigmavirus24: that's exactly what's happening
(1:39:57 PM) mriedem: python-requests is apt-get installed (it's in the image)
(1:39:58 PM) mriedem: ii python-requests 2.2.1-1ubuntu0.3 all elegant and simple HTTP library for Python, built for human beings
(1:40:21 PM) kragniz: ah, and the debian packages strip out the vendoring, right?
(1:40:21 PM) mriedem: urllib3 is pip installed via oslo.vmware dependency:
(1:40:22 PM) mriedem: http://logs.openstack.org/28/204128/2/check/gate-grenade-dsvm/80607dc/logs/grenade.sh.txt.gz#_2015-07-21_16_56_35_728
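
The mixed install described in that log is easy to confirm on a node by comparing where each library imports from; a distro path for requests next to a pip path for urllib3 is the failure signature (a quick diagnostic sketch, not gate tooling):

    # Print version and install location for requests and urllib3.
    # A dist-packages path (apt) for requests alongside a
    # /usr/local/... path (pip) for urllib3 matches the broken setup.
    import requests
    import urllib3

    print("requests", requests.__version__, requests.__file__)
    print("urllib3 ", urllib3.__version__, urllib3.__file__)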

Changed in openstack-gate:
status: New → In Progress
assignee: nobody → Matt Riedemann (mriedem)
Revision history for this message
Matt Riedemann (mriedem) wrote :
Changed in glance:
status: Confirmed → Invalid
Changed in oslo.vmware:
status: New → In Progress
Revision history for this message
Matt Riedemann (mriedem) wrote :

The urllib3 cap in oslo.vmware on stable/kilo: https://review.openstack.org/#/c/204269/

Changed in oslo.vmware:
assignee: nobody → Davanum Srinivas (DIMS) (dims-v)
importance: Undecided → High
status: In Progress → Fix Released
Revision history for this message
Matt Riedemann (mriedem) wrote :

So the fix in oslo.vmware was released in 0.11.2. However, this is spiking again:

http://logstash.openstack.org/#eyJmaWVsZHMiOiBbXSwgInNlYXJjaCI6ICJtZXNzYWdlOlwiX3RyYW5zbGF0ZV9mcm9tX2dsYW5jZVwiIEFORCBtZXNzYWdlOlwiQXR0cmlidXRlRXJyb3I6IGlkXCIgQU5EIHRhZ3M6XCJzY3JlZW4tbi1hcGkudHh0XCIiLCAidGltZWZyYW1lIjogIjg2NDAwMCIsICJncmFwaG1vZGUiOiAiY291bnQiLCAib2Zmc2V0IjogMH0=

Looks like that's only failing on the gate-tempest-dsvm-f21 job for devstack and devstack-gate, and that f21 job is a non-voting job.

Sure enough, urllib3==1.11 is in that job run.

Changed in openstack-gate:
status: In Progress → Fix Committed
Revision history for this message
Ian Cordasco (icordasc) wrote :

I suspect the same root cause is to blame here. The system python-requests is installed and is older. When pip installs urllib3 1.11.0 over the system version, it causes problems because requests cannot anticipate changes in future urllib3 versions, given how requests releases relative to urllib3. This also points to the fact that we're not using requests from pip even though (according to everyone I talk to) we should. We should fix whatever is preventing us from installing requests from pip instead of making a short-term workaround here.

Revision history for this message
Kenneth Burger (burgerk) wrote :

I saw this issue as well after the urllib3 update, in nova image-show.

In nova/image/glance.py's show() method, the following call comes back without the image properties:

image = self._client.call(context, version, 'get', image_id)

The Glance API returns the image properties as mixed-case headers, e.g.:

• X-Image-Meta-Checksum → d41d8cd98f00b204e9800998ecf8427e
• X-Image-Meta-Container_format → bare
• X-Image-Meta-Created_at → 2015-08-13T19:12:26.235339
• X-Image-Meta-Deleted → False
• X-Image-Meta-Disk_format → raw
• X-Image-Meta-Id → 03dc7a38-6b1b-4464-b2b5-7e517a8bbbcc
• X-Image-Meta-Is_public → True, etc.

When glanceclient calls the API, build_response() in requests/adapters.py should wrap these headers in a case-insensitive structure so that lookups like x-image-meta-checksum still succeed. With the new urllib3 response object that conversion is not happening, so the headers are not matched the way glanceclient expects and it concludes that no properties were retrieved for the image. The Glance API is in fact returning a proper response; it is this conversion failure that leads to the AttributeError.
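
The mechanics are easy to demonstrate with requests' own CaseInsensitiveDict: a lowercase lookup only succeeds once the headers are wrapped, while a plain dict built from a raw response stays case-sensitive (a sketch of the failure mode, not the actual adapter code):

    from requests.structures import CaseInsensitiveDict

    # Headers as the Glance v1 API returns them (mixed case).
    raw = {"X-Image-Meta-Id": "03dc7a38-6b1b-4464-b2b5-7e517a8bbbcc",
           "X-Image-Meta-Status": "active"}

    wrapped = CaseInsensitiveDict(raw)
    print(wrapped["x-image-meta-id"])  # found: lookup is case-insensitive

    print(raw.get("x-image-meta-id"))  # None: a plain dict is case-sensitive,
                                       # so the image attributes (including
                                       # 'id') are never extracted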

Revision history for this message
Matthew Edmonds (edmondsw) wrote :

We saw this with requests 2.7.0 and urllib3 1.11, where requests was using a symlink to point at the system urllib3 rather than providing its own copy (thanks, Red Hat). So this is not just a matter of using the latest versions of both: requests has to be tied to the specific urllib3 version it was released with. PyPI says requests 2.7.0 "Updated urllib3 to 1.10.4". See also https://github.com/kennethreitz/requests/commit/ee7389da98092f6e63f68a490e7f71651c9ec047

Revision history for this message
Matthew Edmonds (edmondsw) wrote :

Opened a Fedora bug about the python-requests RPM spec not requiring a specific python-urllib3 version: https://bugzilla.redhat.com/show_bug.cgi?id=1253823

Revision history for this message
Attila Fazekas (afazekas) wrote :

Ubuntu and Debian do the same thing as Fedora: to save some storage space they do not duplicate urllib3 and chardet.

openSUSE bundles chardet and urllib3 at the same (old) git version as its requests package; AFAIK this is the only way to prevent this kind of issue.

Gentoo only unbundles chardet; urllib3 is not stripped from the ebuild. That half-and-half approach would be OK in this case.

The issue is not visible on the Debian family because they do not package the latest requests. BTW, python-requests is installed on the cloud images because of cloud-init. You do not see the issue on other distros because devstack pip-installs a newer version of requests, and the pip-installed version contains its own (duplicated?) copy of urllib3 and chardet. Fedora already had the latest requests packaged, so no update happens.

Adding anything to the RPM spec will not prevent pip from replacing urllib3 with some other version.

BTW, why does devstack bump the python-urllib3 version?
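
Whether a given install is bundled or unbundled can be checked directly: in requests 2.x the vendored copy is importable as requests.packages.urllib3, and its file path shows whether it is requests' own copy or a distro symlink to the system module (a quick check, assuming a requests 2.x install):

    # Show which urllib3 copy requests actually imports.
    # A path under .../requests/packages/ means a real bundled copy;
    # a path under the system urllib3 means the distro unbundled it.
    import requests.packages.urllib3 as vendored

    print(vendored.__version__)
    print(vendored.__file__)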

Revision history for this message
Attila Fazekas (afazekas) wrote :
Changed in python-glanceclient:
status: New → Fix Released
Revision history for this message
Amitabha Biswas (azbiswas) wrote :

I clicked "Fix Released" by mistake; it's still throwing an error in stable/kilo. Please revert the state to "New".

Thanks

Changed in python-glanceclient:
status: Fix Released → New
Revision history for this message
Jesse Keating (jesse-keating) wrote :

We are seeing this in stable/kilo as well, and I believe I know the problem.

First, we isolate each service into its own virtualenv, so the only requirements that come in are those of the project itself (nova).

Nova requires python-glanceclient, which in turn requires requests, and that requirement is uncapped. The requests library had a new release recently (http://docs.python-requests.org/en/latest/community/updates/#id1) which bundles urllib3 1.12. python-glanceclient now tries to use this new requests and urllib3 and has a bad time. I was able to resolve this simply by downgrading the requests library to 2.7.0, the previous release.

I believe that global-requirements for stable/kilo should cap requests to <=2.7.0 and after that's done, python-glanceclient should also be capped.

Meanwhile we can downgrade requests in our virtualenv as part of the deployment and not be blocked by this bug any longer.
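
Expressed as commands, that interim workaround looks like this (run inside the affected virtualenv; 2.7.0 bundles urllib3 1.10.4 per the PyPI note earlier in this thread):

    # Downgrade requests to the last known-good release in the service venv.
    pip install 'requests==2.7.0'
    pip show requests   # confirm the pinned version took effect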

Revision history for this message
Matt Riedemann (mriedem) wrote :

Per comment 17, here is the stable/kilo g-r cap on requests: https://review.openstack.org/#/c/232250/

Once that's merged and synced to python-glanceclient on stable/kilo we can release that as a patch version.
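
For reference, the cap itself is a one-line specifier in global-requirements (shown with the <=2.7.0 bound suggested in comment 17; the exact bound is whatever the review above merges):

    requests<=2.7.0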

Atsuko Ito (yottatsa)
Changed in fuel:
milestone: none → 8.0
assignee: nobody → MOS Packaging Team (mos-packaging)
Revision history for this message
Atsuko Ito (yottatsa) wrote :

Same for MOS 8.0.
It fails with python-urllib3 1.11-1~u14.04+mos1; I tried 1.10.4 from pip and it works. Repackaging is needed.

I deployed OpenStack Liberty from https://product-ci.infra.mirantis.net/view/8.0/job/8.0.all/7/ with the fuel-main/virtualbox/launch_8GB.sh defaults. The deployment has two nodes, one controller and one compute, without any special options.

VERSION:
  feature_groups:
    - mirantis
  production: "docker"
  release: "8.0"
  openstack_version: "2015.1.0-8.0"
  api: "1.0"
  build_number: "129"
  build_id: "129"
  fuel-nailgun_sha: "a95a1c14595c4ed0dd32a491009cf7bb9641b4e0"
  python-fuelclient_sha: "8cc852ffe19d393f4b529cf8bad5b70a68014a66"
  fuel-agent_sha: "e881f0dabd09af4be4f3e22768b02fe76278e20e"
  fuel-nailgun-agent_sha: "d66f188a1832a9c23b04884a14ef00fc5605ec6d"
  astute_sha: "0f753467a3f16e4d46e7e9f1979905fb178e4d5b"
  fuel-library_sha: "e3d2905b9dd2cc7b4d46201ca9816dd320868917"
  fuel-ostf_sha: "41aa5059243cbb25d7a80b97f8e1060a502b99dd"
  fuel-createmirror_sha: "df6a93f7e2819d3dfa600052b0f901d9594eb0db"
  fuelmain_sha: "f208d8963624ea9cd7810a20258fc6f5a44a33c3"

Atsuko Ito (yottatsa)
no longer affects: fuel
Revision history for this message
Jesse Pretorius (jesse-pretorius) wrote :
Changed in openstack-ansible:
assignee: nobody → Jesse Pretorius (jesse-pretorius)
importance: Undecided → High
status: New → In Progress
milestone: none → 12.0.0
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Related fix merged to openstack-ansible (master)

Reviewed: https://review.openstack.org/233756
Committed: https://git.openstack.org/cgit/openstack/openstack-ansible/commit/?id=81a750da5ee5cd2b63eae5f13d37372dce6f0cf3
Submitter: Jenkins
Branch: master

commit 81a750da5ee5cd2b63eae5f13d37372dce6f0cf3
Author: Jesse Pretorius <email address hidden>
Date: Mon Oct 12 19:09:58 2015 +0100

    Block/cap incompatible libraries

    This updates the global requirements to block requests 2.8.0 due to:
      https://launchpad.net/bugs/1476770 and
      https://launchpad.net/bugs/1503768 and
      https://launchpad.net/bugs/1505326

    And also blocks oslo.messaging 2.6.0 temporarily due to:
      https://launchpad.net/bugs/1505295

    And also blocks oslo.versionedobjects 0.11.0 temporarily due to:
      https://launchpad.net/bugs/1505677

    And also blocks WebOb<1.5.0 temporarily due to:
      https://launchpad.net/bugs/1505153

    Related-Bug: #1476770
    Related-Bug: #1503768
    Related-Bug: #1505326
    Related-Bug: #1505295
    Related-Bug: #1505153
    Related-Bug: #1505677
    Change-Id: I3aabbf717ef21a41c7bb9d21957df838642926f0

Changed in openstack-ansible:
milestone: 12.0.0 → 12.1.0
Revision history for this message
Dean Meehan (d3an-meehan) wrote :

Seeing the same problem while using stable/kilo magnum (bay-create). It comes down to glance image-show <id> hitting this same "AttributeError: id" failure when using --os-image-api-version 1.

Using --os-image-api-version 2 fixes the issue.
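
As a CLI-level workaround, per the comment above (the image ID is a placeholder):

    glance --os-image-api-version 2 image-show <image-id>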

no longer affects: python-magnumclient (Suse)
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to python-glanceclient (stable/kilo)

Fix proposed to branch: stable/kilo
Review: https://review.openstack.org/244899

Sean Dague (sdague)
Changed in python-glanceclient:
importance: Undecided → High
Changed in python-glanceclient:
assignee: nobody → Flavio Percoco (flaper87)
status: New → In Progress
assignee: Flavio Percoco (flaper87) → Steve Lewis (steve-lewis)
Changed in python-glanceclient:
assignee: Steve Lewis (steve-lewis) → nobody
assignee: nobody → Flavio Percoco (flaper87)
Revision history for this message
Matt Riedemann (mriedem) wrote :

For Kilo, this should be fixed in glanceclient 0.17.3 which will contain a requests<2.8.0 cap:

https://review.openstack.org/#/c/246996/

Revision history for this message
Jordan Pittier (jordan-pittier) wrote :

I think keystonemiddleware needs to issue a 1.5.3 release that includes this commit and publish it on PyPI: https://github.com/openstack/keystonemiddleware/commit/d56d96c8d33556e35ca2abffed689753ee0be740

Revision history for this message
Steve Martinelli (stevemar) wrote :

Talked with Jordan on IRC; keystonemiddleware 1.5.3 is tagged for release for Kilo: https://review.openstack.org/247553

Revision history for this message
Steve Martinelli (stevemar) wrote :

keystonemiddleware is released for Kilo; marking this as Fix Released for keystonemiddleware.

Changed in keystonemiddleware:
status: New → Fix Released
Revision history for this message
Jesse Pretorius (jesse-pretorius) wrote :

With keystonemiddleware 1.5.3 tagged, this will be included automatically with the next tagged releases of OpenStack-Ansible.

Verified in Kilo with a recent build result:
http://logs.openstack.org/57/248557/2/gate/gate-openstack-ansible-dsvm-commit/de13bfd/console.html#_2015-11-26_15_30_45_573

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Change abandoned on python-glanceclient (stable/kilo)

Change abandoned by Ian Cordasco (<email address hidden>) on branch: stable/kilo
Review: https://review.openstack.org/244899

Revision history for this message
Kairat Kushaev (kkushaev) wrote :

Fix released in glanceclient 0.17.3 as per Matt's comment: https://review.openstack.org/#/c/246996/

Changed in python-glanceclient:
status: In Progress → Fix Released