Python gzip lib says "MemoryError: Can't allocate memory for compression object"
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
pyax | Triaged | Undecided | Unassigned |
Bug Description
Traceback:
  File "Z:\python projects\
    query_result = self._callApex(
  File "Z:\python projects\
    apex_result = apex_method(*args, **kw)
  File "Z:\python projects\
    return QueryRequest(
  File "Z:\python projects\
    data=
  File "Z:\python projects\
    ("s", _envNs), ("p", _partnerNs), ("o", _sobjectNs))
  File "Z:\python projects\
    XmlWriter.
  File "Z:\python projects\
    self.__gzip = gzip.GzipFile(
  File "C:\Python25\
    0)
MemoryError: Can't allocate memory for compression object
I'm making the assumption that your query is returning a very large data set: many fields, some quite large, and perhaps many records.
Run the query again, but instead of returning fields, return count() to see how many rows you should expect back.
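For instance, if your query looks something like the first string below (object, field names, and filter are made up for illustration), the count() variant returns only a row count, so the response stays small and shouldn't hit the gzip allocation while you size the problem:

    # Illustrative SOQL only -- substitute your actual object, fields, and filter.
    full_query  = "SELECT Id, Name, Description, Body__c FROM Case WHERE CreatedDate = THIS_YEAR"

    # Same filter, but count() returns just the number of matching rows,
    # so the response is tiny no matter how many records match.
    count_query = "SELECT count() FROM Case WHERE CreatedDate = THIS_YEAR"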
How many fields are you requesting in your query and how big are they?
Do you have a UNIX instance on which you can try this query in pyax? This may be exacerbated by differences in Windows' memory management.
The most immediate thing you can try is to reduce the number of fields you're returning - exclude any that are not necessary (especially large text fields).
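Roughly along these lines (the field names are placeholders for your own schema, not anything pyax-specific):

    # Before: every field, including long text areas that inflate the response.
    wide_query   = "SELECT Id, Name, Description, Notes__c, History_Log__c FROM Account"

    # After: only the fields this job actually needs; dropping the long text
    # areas shrinks the SOAP response that has to be gzipped and parsed.
    narrow_query = "SELECT Id, Name FROM Account"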
Also, if the query would return more than 200 rows, we can try reducing the query batch size to the minimum of 200 rows at a time. This will require releasing some code that is already checked in but not yet on the release branch.
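For reference, batching is controlled by the Salesforce SOAP QueryOptions header sent with the query call, and 200 is the smallest batch size the server honors. This is only a sketch of the wire-level concept, not pyax code, since the pyax hook for it hasn't shipped yet:

    # Conceptual sketch of the SOAP header that controls query result paging.
    # pyax would emit this itself once the checked-in batch-size change is
    # released; the namespace prefixes are the conventional ones from examples
    # of the partner WSDL, not literal values.
    QUERY_OPTIONS = """\
    <soapenv:Header>
      <urn:QueryOptions>
        <urn:batchSize>200</urn:batchSize>
      </urn:QueryOptions>
    </soapenv:Header>
    """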