@azure/storage-blob

Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that does not adhere to a particular data model or definition, such as text or binary data.

This project provides a client library in JavaScript that makes it easy to consume the Microsoft Azure Blob Storage service.

Version: 12.0.0-preview.5

Key concepts

Features

  • Blob Storage
    • Get/Set Blob Service Properties
    • Create/List/Delete Containers
    • Create/Read/List/Update/Delete Block Blobs
    • Create/Read/List/Update/Delete Page Blobs
    • Create/Read/List/Update/Delete Append Blobs
  • New features
    • Asynchronous I/O for all operations using the async methods
    • HttpPipeline which enables a high degree of per-request configurability (see the pipeline sketch after this list)
    • 1-to-1 correlation with the Storage REST API for clarity and simplicity
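
For instance, a customized HttpPipeline can be built with newPipeline() and passed to a client constructor. The following is a minimal sketch, assuming this preview exposes newPipeline(), AnonymousCredential, and retryOptions; adjust to the pipeline options your version actually supports:

const { AnonymousCredential, BlobServiceClient, newPipeline } = require("@azure/storage-blob");

// Build a pipeline with a customized retry policy and hand it to the client.
const pipeline = newPipeline(new AnonymousCredential(), {
  retryOptions: { maxTries: 4 }
});
const blobServiceClient = new BlobServiceClient(
  "https://<account>.blob.core.windows.net",
  pipeline
);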

Compatibility

This library is compatible with Node.js and browsers, and validated against LTS Node.js versions (>=8.16.0) and latest versions of Chrome, Firefox and Edge.

Compatible with IE11

You need polyfills to make this library work with IE11. The easiest way is to use @babel/polyfill or a polyfill service; a sketch of loading the polyfill appears after the list below.

You can also load separate polyfills for the missing ES feature(s). This library depends on the following ES features, which need external polyfills loaded:

  • Promise
  • String.prototype.startsWith
  • String.prototype.endsWith
  • String.prototype.repeat
  • String.prototype.includes
  • Array.prototype.includes
  • Object.assign
  • Object.keys (override IE11's Object.keys with the ES6 polyfill to force ES6 behavior)
  • Symbol
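
For example, a minimal IE11 entry point could load @babel/polyfill before this library. A sketch; the bundler setup is up to you:

// Load polyfills first so Promise, Symbol, Object.assign, etc. exist in IE11.
require("@babel/polyfill");
// Then load the Azure Storage Blob client library.
const { BlobServiceClient } = require("@azure/storage-blob");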

Differences between Node.js and browsers

There are differences between the Node.js and browser runtimes. When getting started with this library, pay attention to APIs or classes marked with "ONLY AVAILABLE IN NODE.JS RUNTIME" or "ONLY AVAILABLE IN BROWSERS".

The following features, interfaces, classes, or functions are only available in Node.js:
  • Shared Key Authorization based on account name and account key
    • SharedKeyCredential
  • Shared Access Signature(SAS) generation
    • generateAccountSASQueryParameters()
    • generateBlobSASQueryParameters()
  • Parallel uploading and downloading (see the Node.js sketch after this list)
    • BlockBlobClient.uploadFile()
    • BlockBlobClient.uploadStream()
    • BlobClient.downloadToBuffer()
    • BlobClient.downloadToFile()
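
For instance, in Node.js the parallel helpers can move a local file to and from a block blob. A minimal sketch, assuming a blockBlobClient and blobClient created as in the examples further below; ./data.txt is a hypothetical path used only for illustration:

// Upload a local file to a block blob using parallel block uploads (Node.js only).
await blockBlobClient.uploadFile("./data.txt");

// Download the blob back into a local file in parallel (Node.js only).
await blobClient.downloadToFile("./data-copy.txt");
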
The following features, interfaces, classes, or functions are only available in browsers:
  • Parallel uploading and downloading
    • BlockBlobClient.uploadBrowserData()

Getting started

NPM

The preferred way to install the Azure Blob Storage client library for JavaScript is to use the npm package manager. Simply type the following into a terminal window:

npm install @azure/storage-blob@12.0.0-preview.5

In your TypeScript or JavaScript file, import it as follows:

import * as AzureStorageBlob from "@azure/storage-blob";

Or

const AzureStorageBlob = require("@azure/storage-blob");

JavaScript bundle

To use the library with a JS bundle in browsers, simply add a script tag to your HTML pages pointing to the downloaded JS bundle file(s):

<script src="https://mydomain/azure-storage-blob.min.js"></script>

The bundled JS file is compatible with the UMD standard; if no module system is found, a global variable is exported.

Download

Download the latest released JS bundles from the links on the GitHub release page.

CORS

You need to set up Cross-Origin Resource Sharing (CORS) rules for your storage account if you develop for browsers. In the Azure portal or Azure Storage Explorer, find your storage account and create new CORS rules for the blob/queue/file/table service(s).

For example, you can create the following CORS settings for debugging. In a production environment, customize the settings carefully according to your requirements. (A programmatic sketch follows the list.)

  • Allowed origins: *
  • Allowed verbs: DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
  • Allowed headers: *
  • Exposed headers: *
  • Maximum age (seconds): 86400
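
The same debug rules can also be applied programmatically. A sketch, assuming a blobServiceClient authorized to set service properties (for example with the SharedKeyCredential shown later) and the comma-separated string shape of CORS rules in this version:

// Apply the debug CORS rules above via the Blob service properties.
await blobServiceClient.setProperties({
  cors: [
    {
      allowedOrigins: "*",
      allowedMethods: "DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT",
      allowedHeaders: "*",
      exposedHeaders: "*",
      maxAgeInSeconds: 86400
    }
  ]
});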

Examples

Samples can be found at JavaScript samples and TypeScript samples.

Below are the snippets to get started with the Azure Storage blob client library.

Import types

You can use const Azure = require("@azure/storage-blob"); as shown above and then use types and functions from Azure. Or you can selectively import certain types:

const { BlobServiceClient, SharedKeyCredential } = require("@azure/storage-blob");

Create the blob service client

Use the constructor to create an instance of BlobServiceClient.

  • Recommended way to instantiate a BlobServiceClient - with DefaultAzureCredential from @azure/identity package

    Setup: Reference - Authorize access to blobs and queues with Azure Active Directory from a client application - https://docs.microsoft.com/azure/storage/common/storage-auth-aad-app

    • Register a new AAD application and give permissions to access Azure Storage on behalf of the signed-in user

      • Register a new application in Azure Active Directory (in the Azure portal) - https://docs.microsoft.com/azure/active-directory/develop/quickstart-register-app
      • In the API permissions section, select Add a permission and choose Microsoft APIs.
      • Pick Azure Storage and select the checkbox next to user_impersonation and then click Add permissions. This would allow the application to access Azure Storage on behalf of the signed-in user.
    • Grant access to Azure Blob data with RBAC in the Azure Portal

    • Environment setup for the sample

      • From the overview page of your AAD Application, note down the CLIENT ID and TENANT ID. In the "Certificates & Secrets" tab, create a secret and note that down.
      • Make sure you have AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET set as environment variables to successfully execute the sample (they can be read via process.env).
    // DefaultAzureCredential is provided by the @azure/identity package.
    const { DefaultAzureCredential } = require("@azure/identity");

    const account = "";
    const defaultAzureCredential = new DefaultAzureCredential();
    const blobServiceClient = new BlobServiceClient(
      `https://${account}.blob.core.windows.net`,
      defaultAzureCredential
    );

    [Note: the above steps are only for Node.js]

  • Alternatively, you can instantiate a BlobServiceClient with a SharedKeyCredential by passing an account name and account key as arguments. (The account name and account key can be obtained from the Azure portal.) [ONLY AVAILABLE IN NODE.JS RUNTIME]

    const account = "account";
    const accountKey = "accountkey";
    const sharedKeyCredential = new SharedKeyCredential(account, accountKey);
    const blobServiceClient = new BlobServiceClient(
      `https://${account}.blob.core.windows.net`,
      sharedKeyCredential
    );

Create a new container

Use BlobServiceClient.getContainerClient() to get a container client instance, then create a new container resource:

const containerName = `newcontainer${new Date().getTime()}`;
const containerClient = blobServiceClient.getContainerClient(containerName);
const createContainerResponse = await containerClient.create();
console.log(`Create container ${containerName} successfully`, createContainerResponse.requestId);

List the containers

Use the BlobServiceClient.listContainers() function to iterate the containers, with the new for-await-of syntax:

let i = 1;
let iter = blobServiceClient.listContainers();
for await (const container of iter) {
  console.log(`Container ${i++}: ${container.name}`);
}

Alternatively without using for-await-of:

let i = 1;
let iter = blobServiceClient.listContainers();
let containerItem = await iter.next();
while (!containerItem.done) {
  console.log(`Container ${i++}: ${containerItem.value.name}`);
  containerItem = await iter.next();
}

In addition, pagination is supported for listing too via byPage():

let i = 1;
for await (const response of blobServiceClient.listContainers().byPage({ maxPageSize: 20 })) {
  if (response.containerItems) {
    for (const container of response.containerItems) {
      console.log(`Container ${i++}: ${container.name}`);
    }
  }
}

For a complete sample on iterating containers please see samples/iterators-containers.ts.

Create a blob by uploading data

const content = "hello";
const blobName = "newblob" + new Date().getTime();
const blobClient = containerClient.getBlobClient(blobName);
const blockBlobClient = blobClient.getBlockBlobClient();
const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
console.log(`Upload block blob ${blobName} successfully`, uploadBlobResponse.requestId);

List blobs inside a container

Similar to listing containers.

let i = 1;
let iter = containerClient.listBlobsFlat();
for await (const blob of iter) {
  console.log(`Blob ${i++}: ${blob.name}`);
}
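
Paging works here too. A sketch using byPage(), assuming flat-listing pages expose their blobs under segment.blobItems:

// List blobs page by page, 20 blobs per page.
let page = 1;
for await (const response of containerClient.listBlobsFlat().byPage({ maxPageSize: 20 })) {
  console.log(`Page ${page++}:`);
  for (const blob of response.segment.blobItems) {
    console.log(`- ${blob.name}`);
  }
}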

For a complete sample on iterating blobs please see samples/iterators-blobs.ts.

Download a blob and convert it to a string (Node.js)

const downloadBlockBlobResponse = await blobClient.download();
const downloaded = await streamToString(downloadBlockBlobResponse.readableStreamBody);
console.log("Downloaded blob content:", downloaded);

// A helper method used to read a Node.js readable stream into a string.
async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}

Download a blob and convert it to a string (Browsers)

const downloadBlockBlobResponse = await blobClient.download();
const downloaded = await blobToString(await downloadBlockBlobResponse.blobBody);
console.log("Downloaded blob content", downloaded);

// A helper method used to convert a browser Blob into a string.
async function blobToString(blob: Blob): Promise<string> {
  const fileReader = new FileReader();
  return new Promise<string>((resolve, reject) => {
    fileReader.onloadend = (ev: any) => {
      resolve(ev.target!.result);
    };
    fileReader.onerror = reject;
    fileReader.readAsText(blob);
  });
}

A complete example of basic scenarios is at samples/basic.ts.

Troubleshooting

Enabling logging may help uncover useful information about failures. In order to see a log of HTTP requests and responses, set the AZURE_LOG_LEVEL environment variable to info. Alternatively, logging can be enabled at runtime by calling setLogLevel from @azure/logger:

import { setLogLevel } from "@azure/logger";

setLogLevel("info");

Authenticating with Azure Active Directory

If you have registered an application with an Azure Active Directory tenant, you can assign it to an RBAC role in your Azure Storage account. This enables you to use the @azure/identity library to authenticate with Azure Storage, as shown in the azureAdAuth.ts sample.

Next steps

More code examples

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
