duplicate entries in s3 bucket listing
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
goamz | New | Undecided | Unassigned |
Bug Description
I wrote a simple program that just writes all keys in a bucket to a file.
The golang-goamz output starts with:
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
As you can see, these are pretty much all duplicates.
A similar script in Python using the boto library gets this list:
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
.com/451/
No duplicates here; this is also in line with the S3 web dashboard.
My code is basically just this (edited for brevity):
package main

import (
	"fmt"

	"launchpad.net/goamz/aws"
	"launchpad.net/goamz/s3"
)

func main() {
	auth := aws.Auth{AccessKey: "...", SecretKey: "..."} // credentials removed
	s3 := s3.New(auth, aws.USEast)
	bucket := s3.Bucket("...") // bucket name removed
	get_aws := GetFilesAws(bucket)
	for in_aws := range get_aws {
		fmt.Println(in_aws.Key) // the real program writes these to a file
	}
}

// GetFilesAws streams every key in the bucket over a channel,
// paging through the listing with a marker.
func GetFilesAws(bucket *s3.Bucket) (in_aws chan *s3.Key) {
	in_aws = make(chan *s3.Key)
	go func() {
		defer close(in_aws)
		marker := ""
		for {
			listresp, err := bucket.List("", "", marker, 1000)
			if err != nil {
				return
			}
			if len(listresp.Contents) == 0 {
				return
			}
			for _, key := range listresp.Contents {
				key := key // copy before taking the address
				in_aws <- &key
				marker = key.Key
			}
		}
	}()
	return in_aws
}
Maybe my code is wrong, but I believe I implemented it correctly according to the spec @ http://
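For reference, this is my reading of that paging contract as a minimal sketch (the helper name listAllKeys is mine and it assumes the same goamz imports as the snippet above): pass the last key of the previous page as the marker for the next request, and stop once IsTruncated comes back false.

// listAllKeys is a sketch of how I understand marker-based paging:
// the marker for the next request is the last key of the previous page,
// and listing ends when IsTruncated is false.
func listAllKeys(bucket *s3.Bucket) ([]string, error) {
	var keys []string
	marker := ""
	for {
		listresp, err := bucket.List("", "", marker, 1000)
		if err != nil {
			return keys, err
		}
		for _, key := range listresp.Contents {
			keys = append(keys, key.Key)
		}
		if !listresp.IsTruncated || len(listresp.Contents) == 0 {
			return keys, nil
		}
		marker = listresp.Contents[len(listresp.Contents)-1].Key
	}
}

If the pages came back according to that contract, each key should appear exactly once in the output, which is what boto and the dashboard show.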