I am trying to upload files as byte[] to an Azure Storage Account:
public void uploadFilesInBatch(Map<Document, byte[]> documents) throws InterruptedException, IOException {
    // int numThreads = Runtime.getRuntime().availableProcessors();
    int numThreads = 6;
    ExecutorService executorService = Executors.newFixedThreadPool(numThreads);
    List<Runnable> tasks = new ArrayList<>();
    TikaConfig config = TikaConfig.getDefaultConfig();
    for (Map.Entry<Document, byte[]> entry : documents.entrySet()) {
        MediaType mediaType = config.getMimeRepository().detect(new ByteArrayInputStream(entry.getValue()), new Metadata());
        tasks.add(() -> {
            String fileName = entry.getKey().getFileName();
            BlobClient blobClient = blobContainerClient.getBlobClient(fileName);
            // synchronized (blobClient) {
            blobClient.setHttpHeaders(new BlobHttpHeaders().setContentType(mediaType.getType()));
            // blobClient.setHttpHeaders(new BlobHttpHeaders().setContentType(mediaType.toString()));
            blobClient.upload(new ByteArrayInputStream(entry.getValue()));
            // }
        });
    }
    for (Runnable task : tasks) {
        executorService.submit(task);
    }
    executorService.shutdown();
    executorService.awaitTermination(Long.MAX_VALUE, TimeUnit.SECONDS);
}
Setting the HTTP headers appears to work fine, but the upload statement below it is never reached, and I don't understand why. If I set breakpoints on both the blobClient.setHttpHeaders() line and the blobClient.upload() line, IntelliJ only stops on the blobClient.setHttpHeaders() line; execution then continues with the next task being added to the tasks ArrayList.
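One thing that can hide the real failure here: an exception thrown inside a Runnable handed to ExecutorService.submit() is captured in the returned Future and is never printed anywhere, so a throwing call (for example, setHttpHeaders() failing with a 404 because the blob does not exist yet) can look as if execution simply stopped before the next line. A minimal, self-contained sketch (plain JDK, no Azure types) showing how keeping the Futures and calling get() surfaces such swallowed exceptions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SwallowedExceptionDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newFixedThreadPool(2);
        List<Future<?>> futures = new ArrayList<>();

        // The exception thrown here is NOT printed by the executor;
        // it is stored inside the Future returned by submit().
        futures.add(executor.submit((Runnable) () -> {
            throw new IllegalStateException("upload failed");
        }));

        for (Future<?> f : futures) {
            try {
                f.get(); // rethrows the task's exception wrapped in ExecutionException
            } catch (ExecutionException e) {
                System.out.println("task failed: " + e.getCause().getMessage());
            }
        }
        executor.shutdown();
    }
}
```

Collecting the Futures from submit() and calling get() on each (instead of fire-and-forget) would reveal whether the upload line is being skipped or an exception is terminating the task early.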
I am using Java 11 and these Azure dependencies:
<azure-core.version>1.36.0</azure-core.version>
<spring-cloud-azure-starter-keyvault-secrets.version>4.3.0</spring-cloud-azure-starter-keyvault-secrets.version> <!-- Version 5.0.0 requires Java major version 61 instead of 55 -->
<azure-storage-blob.version>12.21.0</azure-storage-blob.version>
<azure-storage-blob-batch.version>12.18.0</azure-storage-blob-batch.version>
Azurite is running in a Docker container: "www.example.com"
Perhaps also of interest: the uploadFilesInBatch() method mentioned above is called from a method that returns a CompletableFuture:
public myMethod() {
    return CompletableFuture.supplyAsync(() -> { ... });
}
1 Answer
I solved the problem by using BlobParallelUploadOptions together with BlobClient.uploadWithResponse().
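A minimal sketch of that fix (assuming the same byte[] payloads as in the question; it needs a live storage endpoint such as Azurite to actually run). The key change is that the Content-Type header is sent together with the upload request via BlobParallelUploadOptions.setHeaders(), instead of calling setHttpHeaders() on a blob that does not exist yet:

```java
import com.azure.core.util.BinaryData;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.models.BlobHttpHeaders;
import com.azure.storage.blob.options.BlobParallelUploadOptions;

public class UploadSketch {
    // Uploads one byte[] payload and sets its Content-Type in the same request.
    // blobClient and contentType would come from the surrounding code in the question.
    static void uploadOne(BlobClient blobClient, byte[] bytes, String contentType) {
        BlobParallelUploadOptions options =
                new BlobParallelUploadOptions(BinaryData.fromBytes(bytes))
                        .setHeaders(new BlobHttpHeaders().setContentType(contentType));
        // timeout and context are optional and left as null here
        blobClient.uploadWithResponse(options, null, null);
    }
}
```

Because the headers travel with the upload itself, there is no window where the blob exists without its Content-Type, and no separate setHttpHeaders() call that can fail on a missing blob.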