Loading big files from Google Cloud Storage into Google Cloud Functions?
Is there a way to load big files (>100MB) from Google Cloud Storage into Google Cloud Functions? I read in their quotas that the "Max event size for background functions" is limited to 10MB. Can I read it chunk-wise or something like that?
Thanks a lot.
Cloud Functions for Storage are triggered with the metadata for the file, which is relatively small and won't hit the max-event-size limit.
To access the actual contents of the file, you'll use the Node.js client library for Cloud Storage, which is not affected by the 10MB limit.
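For illustration, here is a minimal sketch of a background function that streams the object in chunks with the `@google-cloud/storage` library; the function name and the per-chunk handling are placeholders, not anything from your setup:

```js
// Minimal sketch, assuming the @google-cloud/storage package is installed.
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Background function triggered by a Cloud Storage event.
// `event` carries only the file's metadata (bucket, name, size, ...),
// so the 10MB event-size limit never applies to the file contents.
exports.processLargeFile = (event, context) => {
  const file = storage.bucket(event.bucket).file(event.name);

  // Stream the object so the whole >100MB file never has to be held
  // in one read; each 'data' event delivers a single chunk.
  return new Promise((resolve, reject) => {
    file.createReadStream()
      .on('data', (chunk) => {
        // Placeholder: process each chunk here.
        console.log(`Received ${chunk.length} bytes`);
      })
      .on('end', resolve)
      .on('error', reject);
  });
};
```

Streaming like this also keeps memory usage low; if you instead buffer the entire file (for example with `file.download()`), make sure the function's memory allocation is large enough to hold it.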