If you've ever built an application where users need to upload files—profile pictures, documents, videos—you've faced a choice: proxy the upload through your own server, or let the user upload directly to cloud storage. Proxying adds latency and burns your server's bandwidth. Direct uploads are the professional standard, but how do you do them securely?
In the AWS world, the answer is S3 Presigned URLs. In Azure, the equivalent—and incredibly powerful—answer is the SAS Token.
This guide will show you exactly how to generate a temporary, secure URL that allows a user to upload a file directly to your private Azure Blob Storage container.
The Problem: How to Accept Files Without Exposing Your Secrets
You can't just give your frontend application the secret keys to your storage account. That would be like leaving the keys to your entire warehouse under the doormat. Anyone could view, delete, and upload anything they want.
The solution is a "valet key" system. The user's browser asks your trusted backend server for a temporary key that only works for uploading one specific file to one specific location for a limited time. This is what a SAS (Shared Access Signature) token does.
The Workflow: A High-Level Look
The process is a simple three-step dance between the user, your server, and Azure:
- The Ask (Client → Your Server): The user's browser makes an API call to an endpoint on your server, like GET /presignedurl.
- The Generation (Your Server → Azure): Your server, which holds the real secrets, communicates with Azure to generate a special, one-time SAS URL with specific permissions (e.g., "write-only," "expires in 1 hour").
- The Upload (Client → Azure): Your server sends this unique URL back to the client. The client then uses this URL to upload the file directly to Azure Blob Storage, completely bypassing your server.
The Backend: Generating the SAS Upload URL
Here is a complete Express.js route that generates a SAS token. This code assumes you have the necessary Azure credentials (tenantId, clientId, clientSecret, etc.) set up as environment variables.
```js
// This is our backend endpoint, e.g., GET /presignedurl
router.get("/presignedurl", authMiddleware, async (req, res) => {
  try {
    const accountName = process.env.accountName;
    const containerName = process.env.containerName;
    const blobName = "image.jpg"; // You can make this dynamic (e.g., using a UUID)
    const accountUrl = process.env.accountUrl;

    const { ClientSecretCredential } = await import("@azure/identity");
    const {
      BlobServiceClient,
      BlobSASPermissions,
      generateBlobSASQueryParameters,
      SASProtocol,
    } = await import("@azure/storage-blob");

    // Authenticate our backend with Azure
    const credential = new ClientSecretCredential(
      process.env.tenantId ?? "",
      process.env.clientId ?? "",
      process.env.clientSecret ?? ""
    );
    const blobServiceClient = new BlobServiceClient(accountUrl ?? "", credential);

    // Set the token's validity period (e.g., valid for the next hour)
    const startsOn = new Date(Date.now() - 5 * 60 * 1000); // Start 5 mins ago to avoid clock skew
    const expiresOn = new Date(Date.now() + 60 * 60 * 1000); // Expires in 1 hour

    const userDelegationKey = await blobServiceClient.getUserDelegationKey(startsOn, expiresOn);

    // Generate the SAS token with specific permissions
    const sasToken = generateBlobSASQueryParameters(
      {
        containerName,
        blobName,
        permissions: BlobSASPermissions.parse("cw"), // "c" for create, "w" for write
        startsOn,
        expiresOn,
        protocol: SASProtocol.Https, // Enforce HTTPS
      },
      userDelegationKey,
      accountName
    ).toString();

    // Combine everything into the final upload URL
    const blobSasUrl = `https://${accountName}.blob.core.windows.net/${containerName}/${blobName}?${sasToken}`;

    // Send the URL back to the client
    res.json({ uploadUrl: blobSasUrl, blobPath: blobName });
  } catch (error) {
    console.error("Error generating presigned URL:", error);
    res.status(500).json({ error: "Failed to generate presigned URL" });
  }
});
```
When the client calls this endpoint, they will get a JSON response like this:
```json
{
  "uploadUrl": "https://youraccount.blob.core.windows.net/yourcontainer/image.jpg?se=...&sp=cw...",
  "blobPath": "image.jpg"
}
```
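That long query string is the SAS token itself; each parameter encodes one of the claims the backend set. You can pick it apart with the standard URL API (the token values below are made up for illustration, not a real signature):

```js
// Inspecting the claims inside a (made-up) SAS upload URL
const url = new URL(
  "https://youraccount.blob.core.windows.net/yourcontainer/image.jpg" +
    "?sp=cw&se=2030-01-01T12%3A00%3A00Z&spr=https&sv=2023-11-03&sig=abc123"
);
const sas = Object.fromEntries(url.searchParams);

console.log(sas.sp);  // "cw"    -> create + write permissions
console.log(sas.se);  // expiry timestamp
console.log(sas.spr); // "https" -> HTTPS only, from SASProtocol.Https
```

Azure validates the sig parameter against the others on every request, so tampering with the permissions or expiry client-side simply invalidates the token.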
The Frontend: Using the SAS URL to Upload a File
Now that the client has the special uploadUrl, they can upload the file directly. This is a two-step process on the frontend.
Step 1: Get the Upload URL from Your Backend
```js
// Assuming 'file' is a File object from an <input type="file">
const file = document.getElementById('myFileInput').files[0];

// First, ask our server for the secure URL
const response = await fetch('https://your-api.com/presignedurl');
const { uploadUrl, blobPath } = await response.json();
```
Step 2: Use the URL to Upload the File with a PUT Request
```js
// Now, use that URL to upload the file directly to Azure
const uploadResponse = await fetch(uploadUrl, {
  method: 'PUT',
  body: file,
  headers: {
    // THIS HEADER IS MANDATORY FOR AZURE!
    'x-ms-blob-type': 'BlockBlob',
    // The content-type should match your file type
    'Content-Type': file.type
  }
});

if (uploadResponse.ok) {
  console.log('Upload successful!');
} else {
  console.error('Upload failed.');
}
```
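The two steps above can be wrapped into a single helper. This is a sketch, not a drop-in implementation: the apiBase parameter and the { uploadUrl, blobPath } response shape are assumptions matching the backend route sketched earlier, and fetchImpl is injectable purely so the flow can be exercised without a real network:

```js
// Hypothetical helper combining both steps: ask the backend for a SAS
// URL, then PUT the file straight to Azure Blob Storage.
async function uploadViaSas(file, apiBase, fetchImpl = fetch) {
  // Step 1: get the presigned upload URL from our backend
  const res = await fetchImpl(`${apiBase}/presignedurl`);
  if (!res.ok) throw new Error(`Failed to get upload URL: ${res.status}`);
  const { uploadUrl, blobPath } = await res.json();

  // Step 2: upload the file directly to Azure, bypassing our server
  const uploadRes = await fetchImpl(uploadUrl, {
    method: "PUT",
    body: file,
    headers: {
      "x-ms-blob-type": "BlockBlob", // required by Azure (see below)
      "Content-Type": file.type || "application/octet-stream",
    },
  });
  if (!uploadRes.ok) throw new Error(`Upload failed: ${uploadRes.status}`);

  return blobPath; // the path your backend can use to reference the blob
}
```

Returning blobPath lets the caller tell your backend where the finished upload lives, e.g. to save it against a user record.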
🚨 Critical Gotcha: The x-ms-blob-type Header
When you make the PUT request to the SAS URL, you must include the header x-ms-blob-type: BlockBlob. Without this header, Azure will reject your request with a cryptic error. BlockBlob is the standard blob type for files like images, videos, and documents.
And that's it! You've successfully implemented a secure, scalable, and efficient file upload system in Azure, offloading the heavy lifting from your server and providing a better experience for your users.