Develop for Azure Blob Storage and Amazon S3 without a Subscription

Azure Storage - Amazon S3 Storage

Writing cloud software has never been easier or more convenient than it is now. But sometimes there are barriers that prevent us, developers, from trying new technologies. For example, not everybody has access to an Azure subscription or an Amazon account to try those cool cloudy things. But as always, there is a way around it. In this article we will look at two locally deployable storage solutions that you can use both for developing cloud storage solutions and for your on-premise customers.

On-premise Storage Solutions:

  1. Azurite, Azure Storage API Compatible (Local/Dev Testing)
  2. Zenko CloudServer, Amazon S3 Compatible (Production Ready)


Azurite is a Node.js solution that is compatible with the Azure Storage API and acts as an emulator of real Azure Storage. You can easily run it on your machine to develop and test your storage solution. Once you are happy with your code and need to run it against real Azure Storage, just change the configuration to an Azure Storage connection string and you're done.
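The point above — swapping only the configuration — can be sketched like this (a minimal sketch, assuming Azurite is already running on its default ports):

```csharp
using Azure.Storage.Blobs;

// "UseDevelopmentStorage=true" is the well-known shorthand connection string
// that resolves to the local emulator account (devstoreaccount1).
// Swap it for a real Azure Storage connection string when you go live.
var serviceClient = new BlobServiceClient("UseDevelopmentStorage=true");

// List the containers in the emulated account to verify connectivity.
await foreach (var container in serviceClient.GetBlobContainersAsync())
{
    Console.WriteLine(container.Name);
}
```

Nothing else in your code has to change between the emulator and the real service; that is the whole appeal of an API-compatible emulator.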

You can run Azurite in Docker with the following command, replacing c:/azurite with a path on your machine to store the actual blob files:

docker run -p 10000:10000 -p 10001:10001 -v c:/azurite:/data mcr.microsoft.com/azure-storage/azurite

Once you have Azurite running, you can use the default development connection string (or the shorthand UseDevelopmentStorage=true) to connect to it via the Azure Storage Explorer app or the Azure Storage SDK.
To explore your Azurite instance, install Azure Storage Explorer, select "Local Storage Emulator" in the Add Account section, and go ahead with the default port numbers.

To develop your storage solution, first install the Azure Blob Storage SDK NuGet package into your project:

dotnet add package Azure.Storage.Blobs

And with your Azurite connection string you can now perform the following operations:

Create an Azure Storage Container

You can think of a container like a folder in your file system.

BlobContainerClient container = new BlobContainerClient(connectionString, containerName);
await container.CreateIfNotExistsAsync();


Upload a file to the container

BlobContainerClient container = new BlobContainerClient(connectionString, containerName);

BlobClient blob = container.GetBlobClient(blobName);
await blob.UploadAsync(filePath);


Download a file from the container

BlobContainerClient container = new BlobContainerClient(connectionString, containerName);

BlobClient blob = container.GetBlobClient(blobName);

using (FileStream downStream = new FileStream(downloadFilePath, FileMode.Create))
{
    await blob.DownloadToAsync(downStream);
}
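Putting the pieces together, a complete round trip against Azurite might look like the sketch below (the container, blob, and file names are placeholders for illustration):

```csharp
using Azure.Storage.Blobs;

// Assumes Azurite is running locally on the default ports.
var connectionString = "UseDevelopmentStorage=true";

// Create the container if it does not exist yet.
var container = new BlobContainerClient(connectionString, "demo-container");
await container.CreateIfNotExistsAsync();

// Upload a local file as a blob.
await File.WriteAllTextAsync("hello.txt", "Hello, Azurite!");
BlobClient blob = container.GetBlobClient("hello.txt");
await blob.UploadAsync("hello.txt", overwrite: true);

// Download it back into a new file.
using (FileStream downStream = new FileStream("hello-downloaded.txt", FileMode.Create))
{
    await blob.DownloadToAsync(downStream);
}
```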

Zenko CloudServer

There are lots of S3-compatible storage solutions on the market, but not all of them can be used as an on-premise solution in commercial products; especially if your software is closed source, you need to think about the licenses. Fortunately, Zenko CloudServer is under the Apache 2.0 license, so it can be used in commercial products.

To run Zenko CloudServer on your machine, use the following Docker command, replacing the volume paths with valid folders on your machine to store the storage data:

docker run -p 8000:8000 -v c:/cloudserver/data:/usr/src/app/localData -v c:/cloudserver/metadata:/usr/src/app/localMetadata zenko/cloudserver

To test whether your storage is up and running, you can use an S3-compatible client like S3 Browser (a Windows app). After installing it, create a new account in the application with the following settings:

  • Storage Type: S3 Compatible Storage
  • REST Endpoint: localhost:8000
  • Access Key Id: accessKey1 (default key, you can change it)
  • Secret Access Key: verySecretKey1
  • Use secure transfer (SSL/TLS): Unchecked.

Now you can browse your Zenko CloudServer, create buckets, and perform upload and download operations.
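If you prefer the command line, the same smoke test can be done with the AWS CLI pointed at the local endpoint (a sketch, assuming the default CloudServer credentials and Docker port mapping above; the bucket name is just an example):

```shell
# CloudServer's default credentials (override via env vars or a profile).
export AWS_ACCESS_KEY_ID=accessKey1
export AWS_SECRET_ACCESS_KEY=verySecretKey1

# Create a bucket and list buckets against the local endpoint.
aws --endpoint-url http://localhost:8000 s3 mb s3://demo-bucket
aws --endpoint-url http://localhost:8000 s3 ls
```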

With your S3 storage ready, head over to your .NET Core application and install the AWSSDK.S3 NuGet package:

dotnet add package AWSSDK.S3

Now you can perform the following operations on your S3-compatible storage:

Create an S3 client

    var accessKey = "accessKey1";
    var secretKey = "verySecretKey1";

    var config = new AmazonS3Config()
    {
        // Point the SDK at the local CloudServer instance.
        ServiceURL = "http://localhost:8000",
        // Use path-style addressing (bucket name in the path, not the host name).
        ForcePathStyle = true
    };

    // Create an S3 client object.
    var s3Client = new AmazonS3Client(accessKey, secretKey, config);

Create an S3 bucket

await s3Client.PutBucketAsync(bucketName);

Upload a file to S3 storage

    // key is the name of the object that will be created in storage
    var key = Path.GetFileName(filePath);

    // PutObjectRequest handles a simple single-shot upload;
    // UploadPartRequest is only for multipart uploads.
    var request = new PutObjectRequest()
    {
        BucketName = bucketName,
        FilePath = filePath,
        Key = key
    };
    await s3Client.PutObjectAsync(request);

Download a file from S3 storage

    var downloadRequest = new GetObjectRequest()
    {
        BucketName = bucketName,
        Key = key // the name of the object in the bucket
    };

    var downloadResponse = await s3Client.GetObjectAsync(downloadRequest);

    using (FileStream downStream = new FileStream(downloadPath, FileMode.Create))
    {
        await downloadResponse.ResponseStream.CopyToAsync(downStream);
    }
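As with Azurite, the pieces can be combined into one end-to-end sketch (the endpoint, credentials, bucket, and file names are assumptions based on the setup above):

```csharp
using Amazon.S3;
using Amazon.S3.Model;

// Default CloudServer credentials; the endpoint assumes the Docker mapping above.
var config = new AmazonS3Config()
{
    ServiceURL = "http://localhost:8000",
    ForcePathStyle = true
};
var s3Client = new AmazonS3Client("accessKey1", "verySecretKey1", config);

// Create a bucket, upload a file, then download it back.
await s3Client.PutBucketAsync("demo-bucket");

await File.WriteAllTextAsync("hello.txt", "Hello, CloudServer!");
await s3Client.PutObjectAsync(new PutObjectRequest
{
    BucketName = "demo-bucket",
    FilePath = "hello.txt",
    Key = "hello.txt"
});

var response = await s3Client.GetObjectAsync(new GetObjectRequest
{
    BucketName = "demo-bucket",
    Key = "hello.txt"
});
using (FileStream downStream = new FileStream("hello-downloaded.txt", FileMode.Create))
{
    await response.ResponseStream.CopyToAsync(downStream);
}
```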

In this tutorial we learned how to use some cool open-source projects to simulate Azure Blob Storage and Amazon S3 for developing the storage-dependent parts of our projects. We also performed three main storage operations: creating a container or bucket, uploading a file, and downloading a file. More advanced features, like resumable multipart uploads and downloads with cancellation tokens, can easily be built with both the Azure Storage and AWS SDKs.

Enjoy your data transfers with cloud storage ;-)