python - Azure Function runs out of memory
Published: 2022-03-01 17:43:41
I wrote an Azure Function in Python that unzips a file and uploads the contents back to blob storage. It works for small files, but fails when I try it on multi-GB files.
I understand the problem is that the file is too large to fit in memory and needs to be streamed. Any suggestions on how to unzip and upload this file?
My code:
import azure.functions as func
import os
from zipfile import ZipFile
from azure.storage.blob import BlobServiceClient

def main(mytimer: func.TimerRequest) -> None:
    # Step 1. Connect to the source blob
    source_conn_str = xxx
    source_container = xxx
    blob_service_client_origin = BlobServiceClient.from_connection_string(source_conn_str)
    source_fileName = xxx
    blob_to_copy = blob_service_client_origin.get_blob_client(container=source_container, blob=source_fileName)

    # Step 2. Download the zip file to the local tmp directory
    os.chdir('/tmp/')
    print("Downloading file")
    blob_data = blob_to_copy.download_blob()
    data = blob_data.readall()  # reads the entire blob into memory
    print("Download complete")

    # Step 3. Save the zip file to the tmp directory
    local_filepath = xxx
    with open(local_filepath, "wb") as file:
        file.write(data)

    # Step 4. Unzip the file into the local tmp directory
    with ZipFile(local_filepath, 'r') as zipObj:
        zipObj.extractall()

    # Step 5. Upload the file to the destination storage account
    dest_conn_str = xxx
    blob_service_client = BlobServiceClient.from_connection_string(dest_conn_str)
    container_name = xxx
    # Set the local file name
    local_file_name = xxx
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=local_file_name)

    # Upload the file to blob storage
    print('Uploading file')
    with open(local_file_name, "rb") as data:
        blob_client.upload_blob(data, overwrite=True)
    print('File Upload complete')
When I run the Azure function, it returns: Exception message: python exited with code 137. This means it ran out of memory. Any suggestions are much appreciated.
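For context, the steps that hold the whole payload in memory are `readall()` and `extractall()`. In azure-storage-blob v12, `download_blob().readinto(file_obj)` writes the blob to disk in chunks, and `ZipFile.open()` exposes each archive member as a file-like stream that `upload_blob()` can consume directly, with no full extraction. The following is only a sketch of the streaming-unzip part, using just the standard library (the sample file names are invented; the Azure calls appear only in comments):

```python
import os
import tempfile
from zipfile import ZipFile, ZIP_DEFLATED

CHUNK = 4 * 1024 * 1024  # process 4 MiB at a time instead of the whole file

def stream_members(zip_path):
    """Yield (name, file-like stream) for each archive member, without extracting to disk."""
    with ZipFile(zip_path, "r") as zf:
        for name in zf.namelist():
            with zf.open(name) as member:  # member is a readable stream, not bytes
                yield name, member

# Demo: build a sample zip, then re-read each member in bounded chunks.
with tempfile.TemporaryDirectory() as tmp:
    zip_path = os.path.join(tmp, "sample.zip")  # hypothetical file name
    with ZipFile(zip_path, "w", ZIP_DEFLATED) as zf:
        zf.writestr("big.txt", "x" * (10 * 1024 * 1024))  # 10 MiB member

    for name, member in stream_members(zip_path):
        total = 0
        while True:
            chunk = member.read(CHUNK)  # only CHUNK bytes in memory at once
            if not chunk:
                break
            total += len(chunk)
        # In the Azure function, blob_client.upload_blob(member, overwrite=True)
        # could consume the stream here instead of the manual read loop;
        # similarly, blob_to_copy.download_blob().readinto(open(path, "wb"))
        # would replace readall() on the download side.
        print(name, total)
```

The key point is that at no step does more than one chunk of a member live in memory, which is what the `readall()`/`extractall()` version cannot avoid.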