How to use a boto3 DynamoDB client with Python multiprocessing
Hi, I am trying to make a table-to-table copy in DynamoDB using Python. The copy has to run across multiple processes.
import multiprocessing
import os
import sys

import boto3

localDynamoHost = 'http://127.0.0.1:8000'

def copy_items(src_table: str, dst_table: str, client, segment, total_segments):
    # copy over items from one scan segment of the source table
    print('copy_items')
    print(client)
    item_count = 0
    paginator = client.get_paginator('scan')
    for page in paginator.paginate(
            TableName=src_table,
            Select='ALL_ATTRIBUTES',
            ReturnConsumedCapacity='NONE',
            ConsistentRead=True,
            Segment=segment,
            TotalSegments=total_segments,
            PaginationConfig={"PageSize": 25}):
        batch = []
        for item in page['Items']:
            item_count += 1
            batch.append({
                'PutRequest': {
                    'Item': item
                }
            })
        print("Process {0} put {1} items".format(segment, item_count))
        client.batch_write_item(
            RequestItems={
                dst_table: batch
            }
        )
if __name__ == "__main__":
    if len(sys.argv) != 4:
        print("Usage: {0} <source_table_name> <destination_table_name> <isLocal>".format(sys.argv[0]))
        sys.exit(1)
    table_1 = sys.argv[1]
    table_2 = sys.argv[2]
    isLocal = sys.argv[3].lower() in ('true', '1')
    # defaults to us-west-2
    region = os.getenv('AWS_DEFAULT_REGION', os.getenv('AWS_REGION', 'us-west-2'))
    if not isLocal:
        iam_role = boto3.session.Session(profile_name='default')
        db_client = iam_role.client('dynamodb')
    else:
        db_client = boto3.client('dynamodb', endpoint_url=localDynamoHost)
    worker = multiprocessing.Process(
        target=copy_items,
        kwargs={
            'src_table': table_1,
            'dst_table': table_2,
            'client': db_client,
            'segment': 0,
            'total_segments': 1
        }
    )
    worker.start()
    worker.join()
    print("*** All Jobs Done. Exiting... ***")
When I run it, I get the following error:
_pickle.PicklingError: Can't pickle <class 'botocore.client.DynamoDB'>: attribute lookup DynamoDB on botocore.client failed
I took inspiration from this thread: How to use boto3 client with Python multiprocessing? but it does not seem to apply to my case.
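For reference, the pattern that thread suggests is to hand the worker only picklable arguments (strings, ints) and construct the boto3 client inside the child process, since botocore clients hold live SSL/socket state that cannot be pickled. A minimal sketch under that assumption (the scan/write body is elided; `endpoint_url` mirrors the local host above):

```python
import pickle

def copy_items(src_table, dst_table, segment, total_segments, endpoint_url=None):
    # Build the DynamoDB client inside the child process; the parent
    # never ships the client object across the process boundary.
    import boto3
    client = (boto3.client('dynamodb', endpoint_url=endpoint_url)
              if endpoint_url else boto3.client('dynamodb'))
    # ... scan this segment of src_table and batch_write_item into dst_table ...

# Everything handed to multiprocessing.Process must survive pickling;
# plain strings and ints do, while a client object does not.
kwargs = {'src_table': 'src', 'dst_table': 'dst',
          'segment': 0, 'total_segments': 4,
          'endpoint_url': 'http://127.0.0.1:8000'}
print(len(pickle.dumps(kwargs)) > 0)  # → True
```

Whether this is the right fit for a segmented parallel scan like the one above, I am not sure, but it avoids the pickling step that fails here.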