Hot questions on using Azure Blob storage

Question:

I have a queue in Azure Storage (viewed in Microsoft Azure Storage Explorer) into which I'm passing a URI. I don't want my server to ping the queue every time to check whether something has arrived; instead, I want to be notified that a message has arrived in the queue so I can take appropriate action based on it.

I couldn't find any Java source where an event-based example has been provided by Microsoft Azure.

Any working sample code or reference in Java will do. Thanks in advance.


Answer:

Azure Queues by themselves don't support this mechanism. Your server would need to poll the queue periodically to see if there are any messages in a queue.

If you don't want to write polling code, an option would be to use a service like Azure WebJobs or Azure Functions. Both of them have Storage Queue triggers and can run some functionality when a message arrives in a queue. So you could have either a WebJob or a Function ping your server (assuming it is a web server) when a message arrives in the queue. Please note that these also poll the queue constantly under the hood.
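If you do poll from your own server, it helps to back off when the queue is empty so you don't hammer the service. A minimal sketch of such a loop, using a plain in-memory queue as a stand-in for the Azure queue (with the real SDK, the poll would be a CloudQueue.retrieveMessage() call):

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class QueuePoller {
    // Stand-in for the Azure queue; with the real SDK this would be a CloudQueue.
    private final Queue<String> queue;

    public QueuePoller(Queue<String> queue) {
        this.queue = queue;
    }

    /** Polls until a message arrives or maxAttempts empty polls pass, doubling the wait each miss. */
    public String pollWithBackoff(int maxAttempts, long initialDelayMs) throws InterruptedException {
        long delay = initialDelayMs;
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            String msg = queue.poll();           // real code: queue.retrieveMessage()
            if (msg != null) {
                return msg;                      // act on the message, then delete it from the queue
            }
            Thread.sleep(delay);                 // back off before the next poll
            delay = Math.min(delay * 2, 30_000); // cap the delay at 30 seconds
        }
        return null;                             // nothing arrived within the attempt budget
    }

    public static void main(String[] args) throws InterruptedException {
        Queue<String> q = new ArrayDeque<>();
        q.add("hello");
        QueuePoller poller = new QueuePoller(q);
        System.out.println(poller.pollWithBackoff(3, 1)); // hello
    }
}
```

The backoff keeps the request rate (and with Azure queues, the per-transaction cost) low during idle periods while still reacting quickly after activity.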

Question:

Here I am using the Microsoft Azure Storage library, but I am not able to upload my file to Azure Storage.

Code:

public class UploadFile {

    public static void uploadFile(String sasURL, String filePath, String submissionGuid)
            throws MalformedURLException, URISyntaxException {
        URI sasUrl = new URI(sasURL);

        try {
            CloudBlobContainer container = new CloudBlobContainer(sasUrl);
            CloudBlockBlob blob = container.getBlockBlobReference(sasUrl.getPath());
            File source = new File(filePath);
            blob.upload(new FileInputStream(source), source.length());
        } catch (Exception e) {
            // Output the stack trace.
            e.printStackTrace();
        }
    }
}

The generated SAS URI is:

 https://assetservice.blob.core.windows.net/org66/7594787459-5373-4485-a5ad-8b8a9b5af62c/Input/834793kfhreh-ee2a-4c80-a766-146fc139f2c4.hlkx?sv=2013-08-15&sr=b&sig=jkdhfueiwhdjscnkljshchcvdhcdsnc&se=2016-07-13T18%3A18%3A09Z&sp=w

I am getting this error message:

java.io.IOException
at com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:569)
at com.microsoft.azure.storage.blob.BlobOutputStream.writeBlock(BlobOutputStream.java:444)
at com.microsoft.azure.storage.blob.BlobOutputStream.access$000(BlobOutputStream.java:53)
at com.microsoft.azure.storage.blob.BlobOutputStream$1.call(BlobOutputStream.java:388)
at com.microsoft.azure.storage.blob.BlobOutputStream$1.call(BlobOutputStream.java:385)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)

and an authorization error as well, even though the SAS URI being passed has all the required parameters in it:

at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:305)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:175)
at com.microsoft.azure.storage.blob.CloudBlockBlob.uploadBlockInternal(CloudBlockBlob.java:904)
at com.microsoft.azure.storage.blob.CloudBlockBlob.uploadBlock(CloudBlockBlob.java:876)
at com.microsoft.azure.storage.blob.BlobOutputStream.writeBlock(BlobOutputStream.java:438)
... 9 more

I think I don't understand the part about passing the blob container name; I even tried passing org66, but it didn't work for me.

I am also confused about what my container name actually is.


Answer:

Per the "&sr=b" in your SAS, it's a blob-level SAS rather than a container-level SAS, so you should use it directly to construct your CloudBlockBlob object, not go through a CloudBlobContainer:

CloudBlockBlob blob = new CloudBlockBlob(new URI(sasURL));
File source = new File(filePath);
blob.upload(new FileInputStream(source), source.length());

You can find the details of how to correctly use a container SAS and a blob SAS in the official documentation. Though it's based on C#, the code is generally similar.
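You can read the scope straight out of the SAS token itself: sr=b marks a blob-level SAS and sp=w a write-only permission. A small stand-alone sketch (plain Java, no SDK, with a made-up signature) that pulls those fields out of a SAS URL:

```java
import java.net.URI;
import java.net.URISyntaxException;
import java.util.HashMap;
import java.util.Map;

public class SasInspector {
    /** Splits the query string of a SAS URL into a parameter map. */
    public static Map<String, String> sasParams(String sasUrl) throws URISyntaxException {
        Map<String, String> params = new HashMap<>();
        String query = new URI(sasUrl).getQuery();
        for (String pair : query.split("&")) {
            int eq = pair.indexOf('=');
            params.put(pair.substring(0, eq), pair.substring(eq + 1));
        }
        return params;
    }

    public static void main(String[] args) throws URISyntaxException {
        // Shape matches the question's SAS; the sig value here is made up.
        String sas = "https://example.blob.core.windows.net/container/blob.txt"
                   + "?sv=2013-08-15&sr=b&sig=xxx&se=2016-07-13T18:18:09Z&sp=w";
        Map<String, String> p = sasParams(sas);
        System.out.println("blob-level SAS: " + "b".equals(p.get("sr")));   // true
        System.out.println("write permission: " + "w".equals(p.get("sp"))); // true
    }
}
```

Checking sr and sp like this before constructing the client object makes the "use CloudBlockBlob, not CloudBlobContainer" decision mechanical.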

Question:

I'm downloading a text file from my blob storage, and I'd like to preserve the contents of the file (encoding and such).

Is there a difference between downloadText() and downloadText("UTF-8", null, null, null)? Or is there a better way?

Some test code:

CloudStorageAccount storageAccount = CloudStorageAccount.parse(Blob.storageConnectionString);
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
CloudBlobContainer container = blobClient.getContainerReference("myblob");

CloudBlockBlob blob = container.getBlockBlobReference("mydir/myfile.txt");

String txt1 = blob.downloadText();
String txt2 = blob.downloadText("UTF-8", null, null, null);

Answer:

The difference between downloadText() with and without parameters is that the no-argument version uses the platform's default encoding, while the other overload uses the encoding you specify; see the javadoc for the CloudBlockBlob class at http://azure.github.io/azure-sdk-for-java/.

In my experience, you can also download the blob's content as a stream and convert it into a UTF-8 string, as in the code below.

InputStream input =  blob.openInputStream();
InputStreamReader inr = new InputStreamReader(input, "UTF-8");
String utf8str = org.apache.commons.io.IOUtils.toString(inr);
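Whether the two overloads behave differently in practice depends on the JVM's default charset; when it isn't UTF-8, non-ASCII text decodes differently. A small stand-alone illustration (no SDK) of decoding the same bytes with two charsets:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingDemo {
    /** Decodes the given bytes with an explicit charset, like downloadText("UTF-8", ...). */
    public static String decode(byte[] bytes, Charset charset) {
        return new String(bytes, charset);
    }

    public static void main(String[] args) {
        byte[] utf8Bytes = "café".getBytes(StandardCharsets.UTF_8); // 'é' is two bytes in UTF-8
        // Explicit UTF-8 round-trips correctly...
        System.out.println(decode(utf8Bytes, StandardCharsets.UTF_8));      // café
        // ...while a mismatched charset (ISO-8859-1, a common platform default) mangles it.
        System.out.println(decode(utf8Bytes, StandardCharsets.ISO_8859_1)); // cafÃ©
    }
}
```

So to preserve the file's contents reliably, always pass the blob's actual encoding explicitly rather than relying on the platform default.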


Question:

I'm using the Azure SDK, avro-parquet, and Hadoop libraries to read a Parquet file from a blob container. Currently, I download the file to a temp file and then create a ParquetReader.

try (InputStream input = blob.openInputStream()) {
    Path tmp = Files.createTempFile("tempFile", ".parquet");

    Files.copy(input, tmp, StandardCopyOption.REPLACE_EXISTING);
    IOUtils.closeQuietly(input);
    InputFile file = HadoopInputFile.fromPath(
            new org.apache.hadoop.fs.Path(tmp.toFile().getPath()), new Configuration());
    ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(file).build();

    GenericRecord record;
    while ((record = reader.read()) != null) {
        recordList.add(record);
    }
} catch (IOException | StorageException e) {
    log.error(e.getMessage(), e);
}

I want to read this file using an InputStream from the Azure blob item, without downloading it to my machine. There's a way to do this for S3 (Read parquet data from AWS s3 bucket), but does the same possibility exist for Azure?


Answer:

I found out how to do it.

StorageCredentials credentials = new StorageCredentialsAccountAndKey(accountName, accountKey);
CloudStorageAccount connection = new CloudStorageAccount(credentials, true);
CloudBlobClient blobClient = connection.createCloudBlobClient();
CloudBlobContainer container = blobClient.getContainerReference(containerName);

CloudBlob blob = container.getBlockBlobReference(fileName);

Configuration config = new Configuration();
config.set("fs.azure", "org.apache.hadoop.fs.azure.NativeAzureFileSystem");
config.set("fs.azure.sas.<containerName>.<accountName>.blob.core.windows.net", token);
URI uri = new URI("wasbs://<containerName>@<accountName>.blob.core.windows.net/" + blob.getName());
InputFile file = HadoopInputFile.fromPath(new org.apache.hadoop.fs.Path(uri), config);
ParquetReader<GenericRecord> reader = AvroParquetReader.<GenericRecord>builder(file).build();

GenericRecord record;
while ((record = reader.read()) != null) {
    System.out.println(record);
}
reader.close();

Question:

In our Azure portal I have created a storage account, inside that a blob container, and inside that a blob which is just a simple text file. I have also set some random metadata fields on the blob, seen here.

In my Java code, when I access the blob via the Azure SDK, I can print the content of the blob, access blob properties like the ETag, and access the container metadata. But I cannot print the blob metadata fields seen above. Specifically, this code taken from the samples page doesn't print anything, since the HashMap returned by blob.getMetadata() is empty.

System.out.println("Get blob metadata:");
HashMap<String, String> metadata = blob.getMetadata();
Iterator it = metadata.entrySet().iterator();
while (it.hasNext()) {
    Map.Entry pair = (Map.Entry) it.next();
    System.out.printf(" %s = %s%n", pair.getKey(), pair.getValue());
    it.remove();
}

If I instead make a REST API call to the blob and ask for the metadata fields I do get them back as HTTP headers. However I would like to access them via the SDK if possible.


Answer:

Before calling blob.getMetadata(), call blob.downloadAttributes().

This method populates the blob's system properties and user-defined metadata. Before reading or modifying a blob's properties or metadata, call this method or its overload to retrieve the latest values for the blob's properties and metadata from the Microsoft Azure storage service.
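Once downloadAttributes() has populated the map, the iteration itself can be a plain entrySet for-each rather than the explicit Iterator in the question. Sketched here with an ordinary HashMap standing in for the map that getMetadata() returns:

```java
import java.util.HashMap;
import java.util.Map;

public class MetadataDemo {
    /** Formats each metadata entry as a " key = value" line, like the question's loop prints. */
    public static String format(Map<String, String> metadata) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> entry : metadata.entrySet()) {
            sb.append(String.format(" %s = %s%n", entry.getKey(), entry.getValue()));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Stand-in for blob.getMetadata(); with the SDK, call blob.downloadAttributes() first.
        Map<String, String> metadata = new HashMap<>();
        metadata.put("owner", "teamA");
        System.out.print(format(metadata));
    }
}
```

Note the for-each also avoids the it.remove() in the original snippet, which mutates the metadata map as a side effect of printing it.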

Question:

Getting the below error while making a call to Create Container.

Response Code : 411 Response Message : Length Required

String stringToSign = "PUT\n\n\n\n0\n\n\n\n\n\n\n\nx-ms-date:" + date + "\nx-ms-version:" + "2014-02-14\n" + "/" + storageAccount + "/"+ "container-create-test"+"\nrestype:container"+"\ntimeout:60";

Java code snippet.

HttpURLConnection connection = (HttpURLConnection)new URL(url).openConnection();
connection.setRequestMethod(vMethod);
connection.addRequestProperty("Authorization", authHeader);
connection.addRequestProperty("x-ms-date", date);
connection.addRequestProperty("x-ms-version", "2014-02-14");
connection.addRequestProperty("Content-Length", "0");

Answer:

There's nothing wrong with the format of your StringToSign.

411 Response Message : Length Required

This error means you didn't add a Content-Length: 0 header to your HTTP request.

Update

When working with HttpURLConnection in Java, the Content-Length header can't be set manually by default; see this thread.

In case of other trouble, here's a complete sample for you to refer to.

public static void putContainer() throws Exception {
    // Account info
    String accountName = "accountName";
    String accountKey = "accountKey";

    // Request Uri and Method
    String containerName = "containerName";
    String requestUri = "https://"+accountName+".blob.core.windows.net/"+containerName+"?restype=container&timeout=60";
    HttpURLConnection connection = (HttpURLConnection) (new URL(requestUri)).openConnection();
    connection.setRequestMethod("PUT");

    // Request Headers
    // 1. x-ms-version, recommend to use the latest version if possible
    String serviceVersion = "2018-03-28";
    // 2. x-ms-date
    SimpleDateFormat fmt = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss");
    fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
    String date = fmt.format(Calendar.getInstance().getTime()) + " GMT";
    // 3. Authorization
    String authKeyFormat = "SharedKey";
    String caHeader = "x-ms-date:"+date+"\nx-ms-version:"+serviceVersion+"\n";
    String caResource = "/"+accountName+"/"+containerName+"\nrestype:container\ntimeout:60";
    String signStr = "PUT\n\n\n\n\n\n\n\n\n\n\n\n"+caHeader+caResource;
    String authorization = getAuthorization(accountName, authKeyFormat, signStr, accountKey);

    // Send request
    connection.setRequestProperty("x-ms-version", serviceVersion);
    connection.setRequestProperty("x-ms-date",date);
    connection.setRequestProperty("Authorization", authorization);
    // Send 0 byte, code sets Content-Length:0 automatically
    connection.setDoOutput(true);
    connection.setFixedLengthStreamingMode(0);

    System.out.println("Response message : " + connection.getResponseMessage());
    System.out.println("Response code : " + connection.getResponseCode());
}

private static String getAuthorization(String accountName, String authKeyFormat, String signStr, String accountKey) throws NoSuchAlgorithmException, UnsupportedEncodingException, InvalidKeyException {

    SecretKeySpec secretKey = new SecretKeySpec(Base64.getDecoder().decode(accountKey), "HmacSHA256");
    Mac sha256HMAC = Mac.getInstance("HmacSHA256");
    sha256HMAC.init(secretKey);
    String signature = Base64.getEncoder().encodeToString(sha256HMAC.doFinal(signStr.getBytes("UTF-8")));

    return authKeyFormat+" "+accountName+":"+signature;
}
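The getAuthorization helper above can be exercised on its own: whatever the string-to-sign, the embedded HMAC-SHA256 signature should always decode to 32 bytes of digest. A quick stand-alone check using the same signing scheme (the key below is made up, not a real account key):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.util.Base64;

public class SignCheck {
    /** Same scheme as the sample: HMAC-SHA256 over the string-to-sign, Base64-encoded. */
    public static String sign(String signStr, String base64Key) throws Exception {
        SecretKeySpec secretKey = new SecretKeySpec(Base64.getDecoder().decode(base64Key), "HmacSHA256");
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(secretKey);
        return Base64.getEncoder().encodeToString(mac.doFinal(signStr.getBytes("UTF-8")));
    }

    public static void main(String[] args) throws Exception {
        // A made-up 32-byte key, Base64-encoded; a real account key comes from the portal.
        String fakeKey = Base64.getEncoder().encodeToString(new byte[32]);
        String sig = sign("PUT\n\n\n\n\n\n\n\n\n\n\n\nx-ms-date:test\n/account/container", fakeKey);
        // HMAC-SHA256 output is always 32 bytes, whatever the input.
        System.out.println(Base64.getDecoder().decode(sig).length); // 32
    }
}
```

This kind of local check helps separate signing bugs (wrong string-to-sign, wrong key decoding) from transport bugs like the missing Content-Length header.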

Question:

When I try to create a container for my storage, I get a StorageException.

  1. I created an Azure account.
  2. I created Azure storage for blobs.
  3. I wrote the simple test below.
  4. I ran this code on my local machine and got the exception.

What is wrong?

public class Test {
    public static final String storageConnectionString =
            "DefaultEndpointsProtocol=https;" +
            "AccountName=my_account;" +
            "AccountKey=my_account_key";

    public static void main(String[] args) throws StorageException, InvalidKeyException, URISyntaxException {
        pushControll();
    }

    public static void pushControll() throws URISyntaxException, StorageException, InvalidKeyException {
        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("observer");
        container.create();
    }
}

I get a StorageException:

Exception in thread "main" com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:307)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:182)
at com.microsoft.azure.storage.blob.CloudBlobContainer.create(CloudBlobContainer.java:279)
at com.microsoft.azure.storage.blob.CloudBlobContainer.create(CloudBlobContainer.java:252)
at ru.marketirs.model.Test.pushControll(Test.java:40)
at ru.marketirs.model.Test.main(Test.java:25)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)

Process finished with exit code 1

What have I done wrong?


Answer:

Your code looks OK to me. Please check two things: 1) make sure the account name/key is correct, and 2) check whether the clock on your computer is running slow. Either of these could cause the error you're getting.

Question:

(For Azure SDK v10) I'm able to download a file to memory, but I'd like to download it to a local file or other local object.

  • There appears to be a download function on BlockBlobURL, but it returns a Single<> object: is there a more direct way to just get the blob contents?

  • This link describes downloading to file.

  • I am looking for the Java equivalent of this.


Answer:

BlobURL has a low-level interface from which I could extract the byte stream. This was my workaround:

ByteBuffer byteBuffer = blobURL.download(null, null, false, null)
                               .blockingGet()   // DownloadResponse
                               .body(null)      // Flowable<ByteBuffer>
                               .firstOrError()  
                               .blockingGet();  

Question:

I'm calling the downloadRange function from the Azure Storage SDK for Java (v4.0.0) to download parts of a page blob, e.g. downloadRange(0, 1000, os, null, null, null). Another process, a single writer, writes to the end of the page blob. If a write is concurrent with the downloadRange call AND downloadRange does a retry internally (HTTP GET), it results in a StorageException with the following text: "The condition specified using HTTP conditional header(s) is not met."

Is it possible to perform the downloadRange read operation without this happening? In terms of the application, it is safe to access the bytes up until the last page.

The pseudo-code is as follows (in Scala):

val blob = container.getPageBlobReference(blobName)
val baos = new ByteArrayOutputStream()
blob.downloadRange(0, totalSize, baos, null, null, null)

Update

Clarification based on the comments below. The use case is a bit special since the read byte range is known to be safe, i.e. it only reads the range of bytes in the blob which are not written to concurrently. The write is only appending to the end of the blob. The question is how to use downloadRange or any other part of the Azure Storage SDK to access the blob with concurrent writes, even in the case of network issues (packet loss, slow transfer, etc.).


Answer:

This answer is based on the comment thread you can read above.

In this particular case, the error occurs on a retry, not on the first call. When the storage library retries a download, the If-Match header gets set, because on retry we have to guarantee that the blob has not changed in order to have consistency; otherwise, if a new blob had been written between the calls, you'd get half of the old and half of the new, for example. From the library's perspective, since we can't know that subsequent reads will be safe if the blob has changed, we have to enforce this. There is no way to disable that.

This particular case is a very unusual combination of frequent connection failures, concurrent writes, AND foreknowledge of read safety; something I'd expect to be generally very rare. The connection failures would be the rarest part, so that probably needs more investigation (perhaps a topic for another question).

In this special case, I'd recommend doing whatever you can to reduce the amount of time the connection is open. Reducing this time means hitting fewer network failures in the first place, and it reduces the likelihood that the blob has changed if a retry does happen, since a small download simply takes less time. Breaking your reads down into smaller chunks is probably the best route to accomplish this. Similarly, you may want to manually retry just that small portion of the download in the catch block for this error.
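The chunking-plus-per-chunk-retry idea above can be sketched independently of the SDK. Below, an in-memory byte array stands in for the page blob and readRange stands in for downloadRange; because each small chunk gets its own retry, one transient failure only repeats that chunk, not the whole download:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Arrays;

public class ChunkedReader {
    interface RangeSource {
        // Stand-in for CloudPageBlob.downloadRange(offset, length, ...).
        byte[] readRange(long offset, int length) throws IOException;
    }

    /** Reads [0, totalSize) in small chunks, retrying each chunk up to maxRetries times. */
    public static byte[] readAll(RangeSource source, long totalSize, int chunkSize, int maxRetries)
            throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (long offset = 0; offset < totalSize; offset += chunkSize) {
            int len = (int) Math.min(chunkSize, totalSize - offset);
            IOException last = null;
            for (int attempt = 0; attempt <= maxRetries; attempt++) {
                try {
                    out.write(source.readRange(offset, len));
                    last = null;
                    break;                // this chunk succeeded; move on
                } catch (IOException e) {
                    last = e;             // transient failure; retry just this chunk
                }
            }
            if (last != null) throw last; // chunk kept failing; give up
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100];
        Arrays.fill(data, (byte) 7);
        RangeSource mem = (off, len) -> Arrays.copyOfRange(data, (int) off, (int) off + len);
        System.out.println(readAll(mem, data.length, 32, 2).length); // 100
    }
}
```

A fresh, short call per chunk also means the library's internal If-Match retry window shrinks to the duration of that one small transfer, which is exactly the effect the answer recommends.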

Question:

I am trying to upload a file into an Azure blob through the Upload component in the Vaadin framework. Vaadin version: 6.7.8.

I was able to develop code for uploading a file into an Azure blob.

My problem is as follows:

  • I have written a class, UploadToBlob.java, to upload a file into an Azure blob.
  • If I run UploadToBlob.java individually (run from Eclipse as a Java application), I am able to upload the file into the Azure blob.
  • If I create an object of the UploadToBlob class in my other class [ModifyComplaintComponent.java], storageAccount = CloudStorageAccount.parse(storageConnectionString); does not get executed.

Below is the UploadToBlob.java code:

package com.---.trs.scms.ui.components;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.StorageCredentials;
import com.microsoft.azure.storage.blob.CloudBlobContainer;

public class UploadToBlob {

    public static void main(String[] args) {

        try {

            final String storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=abcd;AccountKey=bmiA7+****==;EndpointSuffix=core.windows.net";
            System.out.println("---I am getting called Main-1 ");

            CloudStorageAccount storageAccount;

            storageAccount = CloudStorageAccount.parse(storageConnectionString);

            com.microsoft.azure.storage.blob.CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

            CloudBlobContainer container = blobClient.getContainerReference("container2");

            container.createIfNotExists();

            String filePath = "C:\\Users\\----\\Desktop\\Timesheet - 19th Aug,2019.pdf";

            com.microsoft.azure.storage.blob.CloudBlockBlob blob = container.getBlockBlobReference("Timesheet.pdf");

            java.io.File source = new java.io.File(filePath);

            java.io.FileInputStream fileInputStream = new java.io.FileInputStream(source);

            blob.upload(fileInputStream, source.length());

        } catch (Exception e) {
            e.printStackTrace();
        }

    }

}

For now, I am passing a hard-coded file path, as above, to upload to the Azure blob. As I said above, this class only gets called up to the line System.out.println("---I am getting called Main-1 ");

Here is the ModifyComplaintComponent code from where I am calling UploadToBlob.java:

import com.vaadin.ui.HorizontalLayout;
import com.vaadin.ui.Upload;

public class ModifyComplaintComponent extends CustomComponent {


//other component  code which I haven't pasted here
    private Upload uploadnew;

    try {
            System.out.println("------Inside try block-----------");
            UploadToBlob fileReceiver= new UploadToBlob ();

            uploadnew = new Upload("Upload a file", fileReceiver);

            uploadnew.setReceiver(fileReceiver);
            uploadnew.addListener(fileReceiver);

            System.out.println("------end of try block-----------");
        }  catch (Exception e) {
            System.out.println("------catch block-----------");
            e.printStackTrace();

        } 

        HorizontalLayout hlayout = new HorizontalLayout();
        hlayout.setSpacing(true);
        hlayout.addComponent(uploadnew);

}

The reason I have given a manual file path in my UploadToBlob code is that I first wanted to make sure this code gets called from the ModifyComplaintComponent class.

Secondly, when I try to browse for a file, the file gets selected, but when I click on upload I get a NullPointerException in the Vaadin upload UI, and even after selecting a file the UI says "no file chosen".

The challenge I am facing: if I run the UploadToBlob class individually, I am able to upload a static file into the Azure blob, but I want to browse for and upload a file through the Vaadin framework into Azure Blob storage.


Answer:

Firstly, Upload is a component of Vaadin; you should not create your own Upload class.

Secondly, the public static main method is the entry point where your program starts. If you want to use a method of a class, you need to invoke it explicitly:

TheClassName.MethodName(...) // For static method
new TheClassName(...).MethodName(...) //For non-static method

Thirdly, I did some tests; the following is a successful sample. Two classes will be created:


Class UploadReceiver

This class implements the Receiver interface and some listeners.

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.StorageException;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import com.vaadin.ui.Upload;
import org.springframework.stereotype.Component;

import java.io.OutputStream;
import java.net.URISyntaxException;
import java.security.InvalidKeyException;

@Component
public class UploadReceiver implements Upload.Receiver, Upload.StartedListener, Upload.SucceededListener, Upload.ProgressListener {
    // Storage account connection string.
    public static String conn = "DefaultEndpointsProtocol=https;AccountName=stora***789;AccountKey=G3***w==;EndpointSuffix=core.windows.net";

    @Override
    public OutputStream receiveUpload(String filename, String mimeType) {
        System.out.println("Uploading -> " + mimeType + " ; File name -> " + filename);
        return GetOutputStream("vaadin",filename);
    }

    @Override
    public void uploadStarted(Upload.StartedEvent startedEvent) {
        System.out.println("Upload started!");
    }

    @Override
    public void uploadSucceeded(Upload.SucceededEvent succeededEvent) {
        System.out.println("Upload succeeded!");
    }


    public OutputStream GetOutputStream(String container, String blob){
        OutputStream outputStream = null;
        try{
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(conn);
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.getContainerReference(container);
            CloudBlockBlob cloudBlockBlob = blobContainer.getBlockBlobReference(blob);
            outputStream = cloudBlockBlob.openOutputStream();
        } catch (StorageException e) {
            e.printStackTrace();
        } catch (InvalidKeyException e) {
            e.printStackTrace();
        } catch (URISyntaxException e) {
            e.printStackTrace();
        }
        return outputStream;
    }

    @Override
    public void updateProgress(long readBytes, long contentLength) {
        System.out.println("Progress: readBytes -> " + readBytes + " ; contentLength -> " + contentLength);
    }
}

Class MainUI

This is the UI page. I just add an upload component.

import com.vaadin.server.VaadinRequest;
import com.vaadin.spring.annotation.SpringUI;
import com.vaadin.ui.Alignment;
import com.vaadin.ui.UI;
import com.vaadin.ui.Upload;
import com.vaadin.ui.VerticalLayout;
import org.springframework.beans.factory.annotation.Autowired;

@SpringUI
public class MainUI extends UI {

    private VerticalLayout layout;
    private Upload upload;

    private UploadReceiver uploadReceiver;

    @Autowired
    public MainUI(UploadReceiver uploadReceiver){
        this.uploadReceiver = uploadReceiver;
    }

    @Override
    protected void init(VaadinRequest vaadinRequest) {

        // Set layout
        layout = new VerticalLayout();
        layout.setDefaultComponentAlignment(Alignment.MIDDLE_CENTER);
        setContent(layout);


        // Add upload
        upload = new Upload("Upload a file", uploadReceiver);
        upload.addStartedListener(uploadReceiver);
        upload.addSucceededListener(uploadReceiver);
        upload.addProgressListener(uploadReceiver);
        layout.addComponent(upload);
    }
}

Result: after I clicked the upload button and chose a file to upload, I could see the upload progress output in the console.

And by checking the storage account with Storage Explorer, I could see that the file was successfully uploaded.


Update:

This is how the upload works:

I do not know how your code passed compilation. To construct an Upload object, you need to pass a caption string and a receiver which implements the Upload.Receiver interface.

public Upload(String caption, Receiver uploadReceiver)

And to implement the Upload.Receiver interface, you have to override the receiveUpload method.

OutputStream receiveUpload(String filename, String mimeType)

The receiveUpload method returns an output stream, to which Vaadin will write the uploaded content.

That's all: give Vaadin an output stream, and it will write all the content to that stream.

The uploaded file is sent from your browser and handled by Vaadin. I did not find a way to manually set the input content in Vaadin, sorry.

Question:

I'm trying to grab a virtual directory in a container by iterating through a list of "folder" blobs.

folder(prefix) 
 |
 |-->somefile.ext

I noticed that it will only return that folder (blob prefix) if there is a file within it.

I need to be able to grab the virtual folder even if it has no files in it so I can upload to it.

folder(prefix) 


for (ListBlobItem c : container.listBlobs("prefix")) {
    if (c.getUri().toString().endsWith("/")) {
        // print blob
    }
}

Answer:

There is no such thing as a folder in Azure Blob storage, so there can only be empty containers, not empty folders.

Folders are virtual and only exist because of the blobs in them. A blob's path defines the virtual folders, which is why you cannot get a virtual folder without blobs in it.

You can "create" a new folder by setting the path of a new blob, for example by uploading a blob named "my_not_yet_existing_folder/myimage.jpg".

For example (modified example from the docs):

try
{
    // Retrieve storage account from connection-string.
    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

    // Create the blob client.
    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

   // Retrieve reference to a previously created container.
    CloudBlobContainer container = blobClient.getContainerReference("mycontainer");

    final String filePath = "C:\\myimages\\myimage.jpg";

    // Create or overwrite the "myimage.jpg" blob with contents from a local file.
    CloudBlockBlob blob = container.getBlockBlobReference("my_not_yet_existing_folder/myimage.jpg");
    File source = new File(filePath);
    blob.upload(new FileInputStream(source), source.length());
}
catch (Exception e)
{
    // Output the stack trace.
    e.printStackTrace();
}
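Since folders are just name prefixes, the "folders" under a prefix can be derived from blob names alone. A stand-alone sketch (plain Java, no SDK, hypothetical blob names) of that grouping, which mirrors what listing with a delimiter does:

```java
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

public class VirtualFolders {
    /** Returns the immediate virtual folders under the given prefix, derived from blob names. */
    public static Set<String> foldersUnder(List<String> blobNames, String prefix) {
        Set<String> folders = new TreeSet<>();
        for (String name : blobNames) {
            if (!name.startsWith(prefix)) continue;
            String rest = name.substring(prefix.length());
            int slash = rest.indexOf('/');
            if (slash >= 0) {
                // Everything up to the first '/' after the prefix is a virtual folder.
                folders.add(prefix + rest.substring(0, slash + 1));
            }
        }
        return folders;
    }

    public static void main(String[] args) {
        List<String> blobs = List.of(
                "photos/2020/a.jpg",
                "photos/2021/b.jpg",
                "photos/readme.txt");
        System.out.println(foldersUnder(blobs, "photos/")); // [photos/2020/, photos/2021/]
    }
}
```

This also shows why an "empty folder" cannot appear in a listing: with no blob name carrying the prefix, there is nothing to derive the folder from.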

Question:

I have created an Azure VM, installed my Java application on it, and connected it directly to WASB storage.

When I upload a file in my application, I see the file uploaded successfully in the WASB storage account. But when I try to retrieve the file, it throws the following error:

2017-01-03 07:34:23.817 GMT+0000 WARN  [admin-cccd8bdeefad4099b483404727701269-49-43d241427b20490fbee434a9ef31a2f5-libraryService.previewLibraryData] LibraryAPI - Failed to convert from view to data window
java.lang.RuntimeException: Failed to iterate data file
    at com.myapp.library.stacks.DataFileIterator.computeNext(DataFileIterator.java:50)
    at com.myapp.library.stacks.DataFileIterator.computeNext(DataFileIterator.java:17)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
    at com.myapp.frontend.server.LibraryAPI.toDataWindow(LibraryAPI.java:1453)
    at com.myapp.frontend.server.LibraryAPI.previewLibraryData(LibraryAPI.java:1092)
    at com.myapp.frontend.server.LibraryWebSocketDelegate.previewLibraryData(LibraryWebSocketDelegate.java:278)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
    at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
    at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
    at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Could not read footer: java.lang.NoSuchMethodError: com.microsoft.azure.storage.core.StorageCredentialsHelper.signBlobAndQueueRequest(Lcom/microsoft/azure/storage/StorageCredentials;Ljava/net/HttpURLConnection;JLcom/microsoft/azure/storage/OperationContext;)V
    at parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:190)
    at parquet.hadoop.ParquetFileReader.readAllFootersInParallelUsingSummaryFiles(ParquetFileReader.java:146)
    at com.myapp.hadoop.common.PxParquetReader.<init>(PxParquetReader.java:90)
    at com.myapp.hadoop.common.PaxParquetReaderImpl.doRead(PaxParquetReaderImpl.java:50)
    at com.myapp.hadoop.common.PaxParquetReaderImpl.access$000(PaxParquetReaderImpl.java:17)
    at com.myapp.hadoop.common.PaxParquetReaderImpl$1.run(PaxParquetReaderImpl.java:41)
    at com.myapp.hadoop.common.PaxParquetReaderImpl$1.run(PaxParquetReaderImpl.java:38)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at com.myapp.hadoop.common.PaxParquetReaderImpl.nextRow(PaxParquetReaderImpl.java:38)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.myapp.hadoop.core.DistributionManager$$anon$9.invoke(DistributionManager.scala:296)
    at com.sun.proxy.$Proxy60.nextRow(Unknown Source)
    at com.myapp.library.stacks.ParquetPartFileReader.readRaw(ParquetPartFileReader.java:57)
    at com.myapp.library.stacks.ParquetPartFileReader.readRow(ParquetPartFileReader.java:38)
    at com.myapp.library.stacks.DataFilePagingReader.readRow(DataFilePagingReader.java:63)
    at com.myapp.library.stacks.DataFileIterator.computeNext(DataFileIterator.java:45)
    ... 17 more
Caused by: java.lang.NoSuchMethodError: com.microsoft.azure.storage.core.StorageCredentialsHelper.signBlobAndQueueRequest(Lcom/microsoft/azure/storage/StorageCredentials;Ljava/net/HttpURLConnection;JLcom/microsoft/azure/storage/OperationContext;)V
    at org.apache.hadoop.fs.azure.SendRequestIntercept.eventOccurred(SendRequestIntercept.java:150)
    at org.apache.hadoop.fs.azure.SendRequestIntercept.eventOccurred(SendRequestIntercept.java:40)
    at com.microsoft.azure.storage.StorageEventMultiCaster.fireEvent(StorageEventMultiCaster.java:52)
    at com.microsoft.azure.storage.core.ExecutionEngine.fireSendingRequestEvent(ExecutionEngine.java:360)
    at com.microsoft.azure.storage.core.ExecutionEngine.setupStorageRequest(ExecutionEngine.java:316)
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:95)
    at com.microsoft.azure.storage.blob.CloudBlob.downloadRangeInternal(CloudBlob.java:1629)
    at com.microsoft.azure.storage.blob.BlobInputStream.dispatchRead(BlobInputStream.java:255)
    at com.microsoft.azure.storage.blob.BlobInputStream.readInternal(BlobInputStream.java:448)
    at com.microsoft.azure.storage.blob.BlobInputStream.read(BlobInputStream.java:420)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    at java.io.DataInputStream.read(DataInputStream.java:149)
    at org.apache.hadoop.fs.azure.NativeAzureFileSystem$NativeAzureFsInputStream.read(NativeAzureFileSystem.java:735)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.FilterInputStream.read(FilterInputStream.java:83)
    at parquet.bytes.BytesUtils.readIntLittleEndian(BytesUtils.java:63)
    at parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:284)
    at parquet.hadoop.ParquetFileReader$2.call(ParquetFileReader.java:180)
    at parquet.hadoop.ParquetFileReader$2.call(ParquetFileReader.java:176)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    ... 3 more

How to solve this issue?


Answer:

The java.lang.NoSuchMethodError indicates a version conflict among your dependency jars: the azure-storage jar on your classpath does not provide the StorageCredentialsHelper.signBlobAndQueueRequest signature that hadoop-azure's SendRequestIntercept was compiled against. Please check whether your runtime pulls in conflicting versions of these dependencies and align them.

Please refer to the SO thread How do I fix a NoSuchMethodError? and the official Oracle explanation of the exception. There is also a blog post about debugging this exception which may help you resolve it.
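One quick way to confirm such a conflict is to print which jar a suspect class was actually loaded from. This is a small diagnostic sketch (not part of any SDK); the class name in main is the one from the stack trace:

```java
public class JarLocator {
    // Reports which jar (or directory) a class was loaded from, which makes
    // classpath version conflicts behind a NoSuchMethodError visible.
    public static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        // On the affected machine this should print the path of the
        // azure-storage jar that is actually being loaded at runtime.
        System.out.println(locate("com.microsoft.azure.storage.core.StorageCredentialsHelper"));
    }
}
```

If the printed jar is not the version hadoop-azure expects, exclude or pin the conflicting dependency in your build.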

Question:

I'm using the SDK for Java to create a SAS to access a blob. This is the code:

SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
policy.setPermissionsFromString("r");
Calendar date = Calendar.getInstance();
Date expire = new Date(date.getTimeInMillis() + (expirationMinutes * 60000));
Date start = new Date(date.getTimeInMillis());
policy.setSharedAccessExpiryTime(expire);
policy.setSharedAccessStartTime(start);
return blob.getUri().toString()+"?"+blob.generateSharedAccessSignature(policy, externalFileName);

But when I try to use the url to access the blob I get this error:

<Error>
<Code>AuthenticationFailed</Code>
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:f1f169d2-0001-003f-115a-3be1d6000000 Time:2016-11-10T13:57:14.6192554Z
</Message>
<AuthenticationErrorDetail>
SAS identifier cannot be found for specified signed identifier
</AuthenticationErrorDetail>
</Error>

I'm doing the same thing in .NET for the same blob, and the resulting url (which works) is different from the one I get here:

Doesn't work (java):

/mycontainer/privadoPrueba/cat1.jpg?sig=FFLVk%2FPViHBZhH1JIW6wBbWiJ0%2Bgz0U8wjFzgRoytNo%3D&st=2016-11-10T13%3A55%3A06Z&se=2016-11-10T14%3A06%3A06Z&sv=2015-07-08&si=privadoPrueba%2Fcat1.jpg&sp=r&sr=b

Works (NET):

/mycontainer/privadoPrueba/cat1.jpg?sv=2015-07-08&sr=b&sig=WyiJWltZFj1AkkzST6mo2NjBF1tRSXxrkMP5LEAGJNk%3D&st=2016-11-10T14%3A05%3A41Z&se=2016-11-10T14%3A16%3A41Z&sp=r

How could I fix this?


Answer:

Just looking at the SAS token, you are specifying a stored access policy whose identifier is the filename (the si parameter in the Java URL). That's probably not what you wanted to do, and it is not present in the .NET SAS token.

The problem is here I guess:

blob.generateSharedAccessSignature(policy, externalFileName);

The second parameter is probably the policy name, if the API is similar to .NET.

Try this instead:

blob.generateSharedAccessSignature(policy, null);
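To see the difference between the two URLs, it can help to split a SAS query string into its parameters and check for the unintended si (stored access policy identifier) entry. A minimal illustrative helper, not an SDK method:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SasInspector {
    // Splits a SAS query string into its parameters so you can spot an
    // unintended "si" (stored access policy identifier) entry.
    public static Map<String, String> params(String query) {
        Map<String, String> map = new LinkedHashMap<>();
        for (String pair : query.split("&")) {
            int eq = pair.indexOf('=');
            map.put(eq < 0 ? pair : pair.substring(0, eq),
                    eq < 0 ? "" : pair.substring(eq + 1));
        }
        return map;
    }
}
```

Running it over the failing Java URL's query string shows an si parameter that the working .NET URL does not carry.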

Question:

I'm trying to use the Azure Storage SDK for Java to copy the page blob of an Azure VM (that is Stopped and Deallocated) from one Azure subscription to another.

Here's the code I'm using:

public class BlobCopyExampleClean {

    public static final String sourceStorageConnectionString =
            "DefaultEndpointsProtocol=https;"
            + "AccountName=sourceStorageAccount;"
            + "AccountKey=key123";

    public static final String destinationStorageConnectionString =
            "DefaultEndpointsProtocol=https;"
            + "AccountName=destinationStorageAccount;"
            + "AccountKey=key321";

    public static void main(String[] args) {

        try {
            CloudStorageAccount srcAccount = CloudStorageAccount.parse(sourceStorageConnectionString);
            CloudBlobClient srcSrvClient = srcAccount.createCloudBlobClient();
            CloudBlobContainer srcContainer = srcSrvClient.getContainerReference("vhds");

            CloudStorageAccount destAccount = CloudStorageAccount.parse(destinationStorageConnectionString);
            CloudBlobClient destSrvClient = destAccount.createCloudBlobClient();
            CloudBlobContainer destContainer = destSrvClient.getContainerReference("vhds");

            boolean result = destContainer.createIfNotExists();

            CloudBlob srcBlob = srcContainer.getPageBlobReference("testvm-2015-11-06.vhd");
            if (srcBlob.exists()) {
                CloudBlob destBlob = destContainer.getPageBlobReference("testvm-2015-11-06-copied.vhd");
                System.out.println("Starting blob copy...");
                String copyJobId = destBlob.startCopyFromBlob(srcBlob);

                CopyState copyState = destBlob.getCopyState();

                while (copyState.getStatus().equals(CopyStatus.PENDING)) {
                    System.out.println("... copying ...");
                    Thread.sleep(30000);
                }

                System.out.println("Copy complete, status was: " + copyState.getStatus() + "!");
            } else {
                System.out.println("Source blob does not exist!");
            }
        } catch (InvalidKeyException e) {
            e.printStackTrace();
        } catch (URISyntaxException e) {
            e.printStackTrace();
        } catch (StorageException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

No matter what I try I always get the following error returned to me:

Starting blob copy...
com.microsoft.azure.storage.StorageException: The specified resource does not exist.
    at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
    at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:305)
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:175)
    at com.microsoft.azure.storage.blob.CloudBlob.startCopy(CloudBlob.java:883)
    at com.microsoft.azure.storage.blob.CloudBlob.startCopyFromBlob(CloudBlob.java:788)
    at com.company.azure.storage.BlobCopyExampleClean.main(BlobCopyExampleClean.java:44)

I've tried using v1.3.1, v3.1.0 & v4.0.0 of the SDK library and get the same error using both blob.startCopyFromBlob() (v1.3.1 & v3.1.0) and blob.startCopy() (v4.0.0).

The page blob in question has no lease (the Azure Management Portal shows its lease status as "Unlocked") and it does exist; the Azure API confirms this, since execution enters the srcBlob.exists() branch at line 42.

I've tried copying the blob to another storage account within the same subscription and that gives the same error too.

Looking at the exception in more detail the error code is "CannotVerifyCopySource".


Answer:

A duplicate was posted as azure-storage-java library issue 59. Copying my and mirobers' answers from there:

To copy a blob across accounts, you need to use a SAS token for the source or mark the source container for public access. See the "Authorization" section of the following page: https://msdn.microsoft.com/en-us/library/azure/dd894037.aspx

So, for example, your code could generate a SAS token granting read access to the source blob, append it to the source blob URL's query string, and then start the copy from that URL. If you're looking for the specific APIs, check out generateSharedAccessSignature on the blob object. You can use the returned token either to build a new CloudStorageAccount and follow the code flow above to get a blob reference, or append it to the blob URL and use the CloudBlockBlob(URL) constructor to get a blob reference directly.

Your code has one additional problem: getCopyState does not make a service call; it only returns the copy state previously set by startCopy. Inside your while loop you should call downloadAttributes instead, which actually makes a service call to fetch the updated copy information from the blob.
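The polling fix can be sketched as a loop that re-fetches the status on every iteration. In the real code the supplier would call destBlob.downloadAttributes() (a service call) and then return destBlob.getCopyState().getStatus().toString(); the helper and its status strings here are a simplified illustration:

```java
import java.util.function.Supplier;

public class CopyPoller {
    // Polls a status supplier until the copy leaves the PENDING state.
    // The key point: the status is re-fetched on every iteration instead of
    // being read once before the loop.
    public static String waitForCopy(Supplier<String> fetchStatus, long pollMillis) {
        String status = fetchStatus.get();
        while ("PENDING".equals(status)) {
            try {
                Thread.sleep(pollMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return status; // stop polling if interrupted
            }
            status = fetchStatus.get(); // re-fetch fresh state each iteration
        }
        return status;
    }
}
```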

Question:

I can connect to Azure Blob Storage using a proxy. Now I want to read all images from Azure Blob Storage.

            // ConnectionString
        String storageConnectionString =
                "DefaultEndpointsProtocol=https;" +
                "AccountName=xxxxxxx;" +
                "AccountKey=xxxxxxddfcfdcddrc==";

        //Authetication
        Authenticator.setDefault(new Authenticator() {
              protected PasswordAuthentication getPasswordAuthentication() {
                return new
                   PasswordAuthentication(proxyName,passowrd.toCharArray());
            }});

        //Set Proxy Host name and Port
        Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("xxxxxxxxx", 8080));
        OperationContext op = new OperationContext();
        op.setProxy(proxy);

        // Retrieve storage account from connection-string.
        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

        // Create the blob client.
       CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

       // Get a reference to a container.
       // The container name must be lower case
       CloudBlobContainer container = blobClient.getContainerReference("test");

       // Create the container if it does not exist with public access.
       System.out.println("Creating container: " + container.getName());


       // Create the container if it does not exist.
       //container.createIfNotExists(BlobContainerPublicAccessType.CONTAINER, new BlobRequestOptions(), op);

       // Delete the blob.
       //container.deleteIfExists(null, null, op);
        LinkedList<String> blobNames = new LinkedList<>();
        Iterable<ListBlobItem> blobs = container.listBlobs();
        blobNames = new LinkedList<>();

        // the line that hits an error
        for(ListBlobItem blob: blobs) { 
            blobNames.add(((CloudBlockBlob) blob).getName());
        }

        System.out.println(blobNames.size());

        System.out.println("********Success*********");

When I run the above code I get the following error:

java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.
    at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:113)
Caused by: com.microsoft.azure.storage.StorageException: An unknown failure occurred : Connection refused: connect
    at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:66)
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:209)
    at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:109)
    ... 1 more
Caused by: java.net.ConnectException: Connection refused: connect

I don't know why this error occurs; the code throws the above exception with "Connection refused".


Answer:

You need to pass your OperationContext to the container.listBlobs() call via this overload:

public Iterable<ListBlobItem> listBlobs(final String prefix, final boolean useFlatBlobListing, final EnumSet<BlobListingDetails> listingDetails, BlobRequestOptions options, OperationContext opContext)

In your case that would mean

Iterable<ListBlobItem> blobs = container.listBlobs(null, false, EnumSet.noneOf(BlobListingDetails.class), null, op);

Question:

I'm trying to use a SAS key to retrieve a container from Azure WASB in Java code. This is being done in the HDFS code, but for whatever reason I can't seem to get it to work. I've managed to simplify it down to the application below, which also does not work. I think it is either an issue with how we are generating the SAS token, or permissions on the Azure account. Can someone look at this and point me in the right direction as to what the issue might be? Thanks!

public static void main(String[] arguments)
{
    try {
        String storage_account = "wasbvalidation";
        String container = "demoengagement1";
        CloudBlobClient blobClient = getBlobClient(storage_account);

        CloudBlobContainer blobContainer = blobClient.getContainerReference(container);

        blobContainer.downloadAttributes(); // This call succeeds

        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
        policy.setPermissions(EnumSet.allOf(SharedAccessBlobPermissions.class));
        policy.setSharedAccessStartTime(Date.valueOf(LocalDate.now().minusYears(2)));
        policy.setSharedAccessExpiryTime(Date.valueOf(LocalDate.now().plusYears(2)));

        String sas = blobContainer.getUri().toString() + "?" + blobContainer.generateSharedAccessSignature(policy, null, null, SharedAccessProtocols.HTTPS_ONLY);

        // Code after this point is emulating what HDFS is doing, so I'd rather not change it.
        URI blobUri = new URI(blobContainer.getUri().toString());
        StorageCredentials credentials = new StorageCredentialsSharedAccessSignature(sas);
        CloudBlobContainer sasContainer = new CloudBlobContainer(blobUri, credentials);
        sasContainer.downloadAttributes(); // This call fails, however.
    } catch (Exception e) {
        e.printStackTrace();
    }
}

private static CloudBlobClient getBlobClient(String storageAccount) throws NullPointerException {
    String storageConnectionString = "DefaultEndpointsProtocol=https;" + "AccountName=" + storageAccount + ";" + "AccountKey=" + accountKey;
    CloudStorageAccount csa = null;
    try {
        csa = CloudStorageAccount.parse(storageConnectionString);
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    CloudBlobClient blobClient = csa.createCloudBlobClient();
    return blobClient;
}

Answer:

According to your code, you want to get the properties and metadata of a blob container by building a url with a SAS for the container. However, the SAS string generated by SharedAccessBlobPolicy here, like sig=1G7tiQnLEtbjk2RSNuUSKH7gLNVZjqhuLQL%2Fci%2FXS50%3D&st=2017-01-30T16%3A00%3A00Z&se=2021-01-30T16%3A00%3A00Z&sv=2018-03-28&sp=racwdl&sr=b, is scoped to a blob (sr=b), not to a container (sr=c, such as st=2019-01-31T08%3A38%3A46Z&se=2019-02-01T08%3A38%3A46Z&sp=rl&sv=2018-03-28&sr=c&sig=KnynNYBUtzNSYtBEcYakMrhAXPRIk60wztB3BFv5b%2Bs%3D copied from Azure Storage Explorer).

I tried to use CloudStorageAccount with SharedAccessAccountPolicy to generate an account SAS for the blob via the code below, but it still does not work.

Account SAS. The account SAS delegates access to resources in one or more of the storage services. All of the operations available via a service SAS are also available via an account SAS. Additionally, with the account SAS, you can delegate access to operations that apply to a given service, such as Get/Set Service Properties and Get Service Stats. You can also delegate access to read, write, and delete operations on blob containers, tables, queues, and file shares that are not permitted with a service SAS. See Constructing an Account SAS for in-depth information about constructing the account SAS token.

SharedAccessAccountPolicy accountPolicy = new SharedAccessAccountPolicy();
accountPolicy.setPermissions(EnumSet.allOf(SharedAccessAccountPermissions.class));
accountPolicy.setSharedAccessStartTime(Date.valueOf(LocalDate.now().minusYears(2)));
accountPolicy.setSharedAccessExpiryTime(Date.valueOf(LocalDate.now().plusYears(2)));

String sas = csa.generateSharedAccessSignature(accountPolicy);

I tested the code below,

StorageCredentials credentials = new StorageCredentialsSharedAccessSignature(sas);
CloudBlobContainer sasContainer = new CloudBlobContainer(new URI(container2.getUri().toString()+"?"+sas), credentials);
sasContainer.downloadAttributes();

and got this exception:

Exception in thread "main" java.lang.IllegalArgumentException: Cannot provide credentials as part of the address and as constructor parameter. Either pass in the address or use a different constructor.

Or, testing the code CloudBlobContainer sasContainer = new CloudBlobContainer(new URI(container2.getUri().toString()+"?"+sas)) on its own, I got this exception:

Exception in thread "main" com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

Based on my reading of the SDK source code, this seems to be caused by the implementation of Azure Java Storage SDK v8.0.0. You may want to report this issue to Microsoft.

I tried to generate a container url with a SAS using Azure Java Storage SDK v10 with the code below, and it works fine.

Maven dependency for v10:

<!-- https://mvnrepository.com/artifact/com.microsoft.azure/azure-storage-blob -->
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>azure-storage-blob</artifactId>
    <version>10.4.0</version>
</dependency>

Code for generating a container url with SAS:

String accountName = "<your account name>";
String accountKey = "<your account key>";
SharedKeyCredentials credentials = new SharedKeyCredentials(accountName, accountKey);
final ServiceURL serviceURL = new ServiceURL(new URL("https://" + accountName + ".blob.core.windows.net"), StorageURL.createPipeline(credentials, new PipelineOptions()));
String containerName = "<container name>";
ServiceSASSignatureValues values = new ServiceSASSignatureValues()
                .withProtocol(SASProtocol.HTTPS_ONLY) // Users MUST use HTTPS (not HTTP).
                .withExpiryTime(OffsetDateTime.now().plusDays(2)) // Expires in 2 days.
                .withContainerName(containerName); // No blob name: this is a container-level SAS.
ContainerSASPermission permission = new ContainerSASPermission()
                .withRead(true)
                .withAdd(true)
                .withWrite(true);
values.withPermissions(permission.toString());
SASQueryParameters serviceParams = values.generateSASQueryParameters(credentials);
String sas = serviceParams.encode();

String containerUrlWithSAS = String.format(Locale.ROOT, "https://%s.blob.core.windows.net/%s%s",
                accountName, containerName, sas);
HttpPipeline pipeline = StorageURL.createPipeline(new AnonymousCredentials(), new PipelineOptions());
ContainerURL sasContainer = new ContainerURL(new URL(containerUrlWithSAS), pipeline);
sasContainer.getProperties().blockingGet(); // v10 is reactive: subscribe, or nothing is sent

Note: the getProperties function of ContainerURL in SDK v10 is similar to downloadAttributes of CloudBlobContainer in SDK v8; it also returns the container's metadata and system properties.

Question:

I have the following code to upload single blob to azure storage using azure-storage-blob 12.5.0.

Is there any way to pass a collection of byte arrays and do it in some kind of batch upload?

    public void store(final String blobPath, final String originalFileName, final byte[] bytes) {
        final BlobClient blobClient = containerClient.getBlobClient(blobPath);
        final BlockBlobClient blockBlobClient = blobClient.getBlockBlobClient();
        try (ByteArrayInputStream inputStream = new ByteArrayInputStream(bytes)) {
            blockBlobClient.upload(inputStream, bytes.length, true);
        } catch (BlobStorageException | IOException exc) {
            throw new StorageException(exc);
        }
    }

Answer:

Is there any way to pass a collection of byte arrays and do it in some kind of batch upload?

In the V8 SDK, I found that the uploadFromByteArray method supports a byte[] parameter.

CloudBlockBlob blob = container.getBlockBlobReference("helloV8.txt");

String str1 = "132";
String str2 = "asd";
ByteArrayOutputStream os = new ByteArrayOutputStream();
os.write(str1.getBytes());
os.write(str2.getBytes());
byte[] byteArray = os.toByteArray();
blob.uploadFromByteArray(byteArray, 0, byteArray.length);

No such method exists in the V12 SDK; only the upload method you used in your question is available. In fact, uploadFromByteArray internally uses upload as well.

If you mean uploading multiple blobs in one batch, I'm afraid that is not supported in the official SDK except by using a for loop. For bulk writing, you could refer to the Azure CLI and AzCopy scenarios mentioned in this document.
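If the goal is simply to write several byte arrays into one blob, a small stdlib helper can merge them before the single upload call. This is an illustrative sketch, not an SDK feature:

```java
import java.io.ByteArrayOutputStream;
import java.util.List;

public class ByteArrays {
    // Merges several byte arrays into one, so the result can be handed to a
    // single blockBlobClient.upload(...) call (via the store method above).
    public static byte[] concat(List<byte[]> chunks) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            out.write(chunk, 0, chunk.length); // this overload throws no IOException
        }
        return out.toByteArray();
    }
}
```

Note that this buffers everything in memory, so it only suits payloads that comfortably fit in the heap.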

Question:

I'm writing an application using Spring Boot and Java that will be writing files to Azure Blob Storage. How can I use a Service Principal to authenticate? The details of the SP should ideally be read in via some properties or an external file.

I've been wading through the reams of documentation and examples, all of which don't seem to be quite what I'm looking for. Most examples that I've seen use the Storage Account Key which I don't want to do.

Some example code would be really appreciated. As I said, I'm struggling to find a decent example (both of how to use an SP but also generally how to write to Azure BLOB Storage in Java) as there seems to be so many different ways of accessing storage scattered around in the microsoft docs.


Answer:

You can use ADAL4J to acquire a token, and then use the token to write to blobs.

  1. Add role assignment to your principal.

  2. Get token.

    public static String getToken() throws Exception {
        String TENANT_ID = "your tenant id or name, e4c9*-*-*-*-*57fb";
        String AUTHORITY = "https://login.microsoftonline.com/" + TENANT_ID;
        String CLIENT_ID = "your application id, dc17*-*-*-*a5e7";
        String CLIENT_SECRET = "the secret, /pG*32";
        String RESOURCE = "https://storage.azure.com/";
        String ACCESS_TOKEN = null;
        ExecutorService service = Executors.newFixedThreadPool(1);
        AuthenticationContext context = null;
        try {
            context = new AuthenticationContext(AUTHORITY, false, service);
            ClientCredential credential = new ClientCredential(CLIENT_ID, CLIENT_SECRET);
            Future<AuthenticationResult> future = context.acquireToken(RESOURCE, credential, null);
            ACCESS_TOKEN = future.get().getAccessToken();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } finally {
            service.shutdown();
        }
        return ACCESS_TOKEN;
    }
    
  3. Access blob.

    public static void main(String[] args) throws Exception {
        String token = getToken();
        StorageCredentialsToken credentialsToken = new StorageCredentialsToken("storagetest789", token);
        CloudBlobClient blobClient = new CloudBlobClient(new URI("https://storagetest789.blob.core.windows.net/"), credentialsToken);
        CloudBlobContainer blobContainer = blobClient.getContainerReference("pub");
        CloudBlockBlob blockBlob = blobContainer.getBlockBlobReference("test.txt");
        blockBlob.uploadText("Test!");
    }
    

Hope it helps.

Question:

I need to get an image from an Azure Blob Storage container using a proxy and save the image to a BufferedImage.

             System.out.println("********Initiated******");

            //Set Proxy Host name and Port
            Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("xx-xx-xxxxx", 8080));
            OperationContext op = new OperationContext();
            op.setProxy(proxy);

            // Retrieve storage account from connection-string.
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

            // Create the blob client.
           CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

           // Get a reference to a container.
           // The container name must be lower case
           CloudBlobContainer container = blobClient.getContainerReference("images");

            //call via this overload
            Iterable<ListBlobItem> blobs = container.listBlobs(null, false, EnumSet.noneOf(BlobListingDetails.class), new BlobRequestOptions(), op);

            URL urlOfImage = null; 
            //Listing contents of container
            for(ListBlobItem blob: blobs) { 
                /* Process the image. Sample URL from Azure: https://someWebsite.blob.core.windows.net/images/00001.png */
                if(((CloudBlockBlob) blob).getName().toLowerCase().contains(".png")) {
                    urlOfImage = blob.getUri().toURL();
                    BufferedImage buffimage = ImageIO.read(urlOfImage);
                }
            }

            System.out.println("********Success*********");

Using the URI, I can open the image in a browser (separately).

Question: I want to process the blob content directly or via the URI. When I run the above code and save the image to a BufferedImage, I get the following error.

Exception in thread "main" javax.imageio.IIOException: Can't get input stream from URL!
at javax.imageio.ImageIO.read(Unknown Source)

Thanks in advance.


Answer:

Per my experience, your issue was caused by the blob url lacking a SAS token; such a url cannot be accessed directly.

Here is my sample code to generate a blob url with SAS token.

String connectionString = "<your storage connection string>"
String containerName = "<your container name>";
String blobName = "<your blob name>";
CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
CloudBlobClient client = account.createCloudBlobClient();
CloudBlobContainer container = client.getContainerReference(containerName);
CloudBlockBlob blob = container.getBlockBlobReference(blobName);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
policy.setPermissions(EnumSet.allOf(SharedAccessBlobPermissions.class));
policy.setSharedAccessStartTime(Date.valueOf(LocalDate.now().minusYears(2)));
policy.setSharedAccessExpiryTime(Date.valueOf(LocalDate.now().plusYears(2)));
String sas = blob.generateSharedAccessSignature(policy, null);
String urlWithSas = String.format("%s?%s", blob.getUri(), sas);

Then, you can pass the urlWithSas value to the method ImageIO.read without proxy to get its BufferedImage object, as below.

URL urlOfImage = new URL(urlWithSas);
BufferedImage buffimage = ImageIO.read(urlOfImage);
System.out.println(buffimage.getHeight());

It works for me.

For using a proxy, you just need to follow the official JDK document Java Networking and Proxies and use the System.setProperty method to enable proxied networking for the JVM first.

System.setProperty("http.proxyHost", "<your proxy host>");
System.setProperty("http.proxyPort", "<your proxy port>");
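A slightly fuller sketch of that approach, covering both HTTP and HTTPS (the host and port values are placeholders):

```java
public class ProxyConfig {
    // Sets JVM-wide proxy properties; anything built on HttpURLConnection
    // (including the storage SDK and ImageIO.read(URL)) will pick them up.
    public static void enable(String host, int port) {
        System.setProperty("http.proxyHost", host);
        System.setProperty("http.proxyPort", String.valueOf(port));
        System.setProperty("https.proxyHost", host);
        System.setProperty("https.proxyPort", String.valueOf(port));
    }
}
```

Call ProxyConfig.enable("<your proxy host>", 8080) once at startup, before any connections are opened.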

Update:

The result of the code below is the same as the above.

HttpURLConnection conn = (HttpURLConnection) urlOfImage.openConnection();
conn.connect();
InputStream input = conn.getInputStream();
BufferedImage buffimage = ImageIO.read(input);

Question:

I use this code to upload files to Azure Blob Storage, but when I try to load a directory with sub-directories I get the error "FileNotFoundException encountered: C:\upload\bin (Access is denied)". Is there any solution to load both the files and the directories in the source directory?

try {
        CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient serviceClient = account.createCloudBlobClient();

        // Container name must be lower case.
        CloudBlobContainer container = serviceClient.getContainerReference(containerName);
        container.createIfNotExists();
        File source = new File(path);
        if (source.list().length > 0) {
            for (File file : source.listFiles()) {
                CloudBlockBlob blob = container.getBlockBlobReference(file.getName());
                if (blob.exists() == false) {
                    File sourceFile = new File(source + "\\" + file.getName());
                    blob.upload(new FileInputStream(sourceFile), sourceFile.length());
                    System.out.println("File " + source + "\\" + file.getName() + " Load to blob storage");
                } else System.out.println("File " + source + "\\" + file.getName() + " Already exist in storage");
            }
        } else System.out.println("In folder " + path + " are no files ");
    } catch (FileNotFoundException fileNotFoundException) {
        System.out.print("FileNotFoundException encountered: ");
        System.out.println(fileNotFoundException.getMessage());
        System.exit(-1);

    } catch (StorageException storageException) {
        System.out.print("StorageException encountered: ");
        System.out.println(storageException.getMessage());
        System.exit(-1);
    } catch (Exception e) {
        System.out.print("Exception encountered: ");
        System.out.println(e.getMessage());
        System.exit(-1);
    }

Answer:

As @ZhaoxingLu-Microsoft said, the File object generated by source.listFiles() is enough for getting the absolute file path via file.getAbsolutePath(), so you can write your code as below.

if (blob.exists() == false) {
    blob.uploadFromFile(file.getAbsolutePath());
} else System.out.println("File " + file.getAbsolutePath() + " Already exist in storage");

I tested your code in my environment, and it also works. However, in my experience, your issue FileNotFoundException encountered: C:\upload\bin (Access is denied) was caused by lacking permission to access files under C: or C:\upload\bin. So you need to run your code as administrator on your current Windows environment, as shown in the figures below.

Fig 1. Run your code as administrator if using IntelliJ

Fig 2. Run your code as administrator if using Eclipse

Fig 3. Run your code as administrator via Command Prompt


Update: On Azure Blob Storage, the file and directory structure depends on the blob name. So if you want to see a file structure like the one in the figures below, you can use the code String blobName = file.getAbsolutePath().replace(path, ""); to get the blob name.

Fig 4. The file and directory structure built on my local machine

Fig 5. The same above on Azure Blob Storage via Azure Storage Explorer

Here is my complete code.

private static final String path = "D:\\upload\\";
private static final String storageConnectionString = "<your storage connection string>";
private static final String containerName = "<your container for uploading>";

private static CloudBlobClient serviceClient;

public static void upload(File file) throws InvalidKeyException, URISyntaxException, StorageException, IOException {
    // Container name must be lower case.
    CloudBlobContainer container = serviceClient.getContainerReference(containerName);
    container.createIfNotExists();
    String blobName = file.getAbsolutePath().replace(path, "");
    CloudBlockBlob blob = container.getBlockBlobReference(blobName);
    if (blob.exists() == false) {
        blob.uploadFromFile(file.getAbsolutePath());
    } else {
        System.out.println("File " + file.getAbsolutePath() + " Already exist in storage");
    }
}

public static void main(String[] args)
        throws URISyntaxException, StorageException, InvalidKeyException, IOException {
    CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
    serviceClient = account.createCloudBlobClient();
    File source = new File(path);
    for (File fileOrDir : source.listFiles()) {
        boolean isFile = fileOrDir.isFile();
        if(isFile) {
            upload(fileOrDir);
        } else {
            for(File file: fileOrDir.listFiles()) {
                upload(file);
            }
        }

    }
}
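Note that the main method above descends only one level of sub-directories, so a deeper tree would still be missed. Below is a plain-Java sketch of a fully recursive walk (the Azure upload call is deliberately left out so the sketch stays self-contained, and the file names in main are made up); each blob name is the file's path relative to the root, with forward slashes, which preserves the local directory structure as virtual directories in the container:

```java
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class BlobNameWalk {

    // Recursively collect a blob name for every file under root. The name is
    // the file's path relative to root, with forward slashes, so the local
    // directory tree is preserved as virtual directories in the container.
    static void collect(File root, File current, List<String> names) {
        File[] children = current.listFiles();
        if (children == null) {
            return; // not a directory, or not readable (e.g. Access is denied)
        }
        for (File child : children) {
            if (child.isFile()) {
                names.add(root.toPath().relativize(child.toPath())
                        .toString().replace(File.separatorChar, '/'));
            } else {
                collect(root, child, names); // recurse to any depth
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a small sample tree in a temp directory and walk it.
        Path root = Files.createTempDirectory("upload");
        Files.createDirectories(root.resolve("bin/sub"));
        Files.createFile(root.resolve("a.txt"));
        Files.createFile(root.resolve("bin/sub/b.txt"));
        List<String> names = new ArrayList<>();
        collect(root.toFile(), root.toFile(), names);
        Collections.sort(names);
        System.out.println(names); // [a.txt, bin/sub/b.txt]
    }
}
```

Each collected name can then be passed to container.getBlockBlobReference(blobName) and uploaded with uploadFromFile, as in the upload method above.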

Question:

I have a task to make it possible to download simple .txt files from the application using Azure Blob Storage. The code is supposed to work; I didn't write it, but it looks OK to me and, as I'll show later in this post, it really connects to Azure. What's more important, it works only when I'm testing the app on localhost, not on the publicly available site.

These are the steps I made:

  1. uploaded files to the storage (the underlined is one of them):
  2. added proper link to the button that should download the attachment via REST API
  3. of course, I've also added reference to the attachment in the database (its ID, name etc.)
  4. here's how it looks on frontend:
  5. And this is what I get:

I've seen somewhere that it might be caused by Azure CORS settings that don't allow the app to access the storage. Here's what I've done so far:

  1. went to portal.azure.com and changed CORS settings like this:
  2. found something about putting some code into the app under this Microsoft link, but it's not Java. I guess there are some analogous ways in Java: https://blogs.msdn.microsoft.com/windowsazurestorage/2014/02/03/windows-azure-storage-introducing-cors/ . Is it necessary after the CORS rules have been added in the Azure Portal?

Also, I've found information that it may be caused by the storage access permissions. The Public Access Level is set to Container:

Not sure if it gives anything, but these are the container's properties:

What else could be causing the BlobNotFound error I receive? I hope I've put enough information here, but if more is needed, say so in a comment and I'll provide it.

This is the code that's supposed to download the attachment of this method, contained in 3 classes:

Controller class part:

@GetMapping("/download/{id}")
    @ResponseStatus(HttpStatus.OK)
    public void downloadAttachment(@PathVariable long id, HttpServletResponse response) throws IOException {
       dataUploadRequestAttachmentService.downloadStaticAttachment(response, id);
    }

Controller service class part:

public void downloadStaticAttachment(HttpServletResponse response, long id) throws IOException {
        ArticleAttachment articleAttachment = this.findAttachment(id);
        String mimeType = URLConnection.guessContentTypeFromName(articleAttachment.getName());

        if (mimeType == null){
            mimeType = "application/octet-stream";
        }

        response.setContentType(mimeType);
        response.setHeader("Content-Disposition", String.format("attachment; filename=\"%s\"", articleAttachment.getName()));

        azureBlobStorageArticleAttachmentService.downloadArticleAttachment(
                articleAttachment.getName(),
                articleAttachment.getId(),
                response.getOutputStream()
        );
    }

And the AzureBlobStorageArticleAttachmentService class:

public void downloadArticleAttachment(String attachmentName, Long articleId, OutputStream outputStream) {
        try {
            CloudBlockBlob blob = container.getBlockBlobReference(String.format("%s_%s", articleId, attachmentName));
            blob.download(outputStream);
        } catch (URISyntaxException | StorageException e) {
            e.printStackTrace();
            log.error(String.format("Download article attachment %s error", attachmentName));
        }
    }

Answer:

According to your description, please debug to check whether you get the correct blob name in the line: CloudBlockBlob blob = container.getBlockBlobReference(String.format("%s_%s", articleId, attachmentName));
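As a quick sanity check (plain Java, no SDK needed; 42 and the file name are hypothetical values), you can rebuild the lookup name locally. A BlobNotFound error means this exact string does not match the name the blob was stored under:

```java
public class BlobNameCheck {

    // Rebuild the name the download code looks up. Blob names are
    // case-sensitive, and a space is stored literally even though the
    // browser URL shows it as %20, so compare the exact string against
    // what you see in Azure Storage Explorer.
    static String blobName(long articleId, String attachmentName) {
        return String.format("%s_%s", articleId, attachmentName);
    }

    public static void main(String[] args) {
        System.out.println(blobName(42L, "My Notes.txt")); // 42_My Notes.txt
    }
}
```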

Here is a demo of how to download blobs using the Java SDK, for your reference:

/**
 * Download a blob to memory.
 *
 * @param containerName blob container name
 * @param blobName      blob name
 */
public static ByteArrayOutputStream downloadBlobToMemory(String containerName, String blobName) {
    CloudStorageAccount account = null;
    CloudBlobContainer container = null;
    ByteArrayOutputStream byteArrayOutputStream = null;
    try {
        account = CloudStorageAccount.parse(ConnString);
        CloudBlobClient client = account.createCloudBlobClient();
        container = client.getContainerReference(containerName);
        container.createIfNotExists();
        CloudBlockBlob cloudBlockBlob = container.getBlockBlobReference(blobName);         

        byteArrayOutputStream=new ByteArrayOutputStream();
        cloudBlockBlob.download(byteArrayOutputStream);         

    }catch(Exception ex) {
        ex.printStackTrace();
    }

    return byteArrayOutputStream;
}


/**
 * Download a blob to local disk.
 *
 * @param containerName blob container name
 * @param blobName      blob name
 * @param filePath      for example: C:\Test\test.txt
 */
public static void downloadBlobToDisk(String containerName, String blobName, String filePath) {
    CloudStorageAccount account = null;
    CloudBlobContainer container = null;
    try {
        account = CloudStorageAccount.parse(ConnString);
        CloudBlobClient client = account.createCloudBlobClient();
        container = client.getContainerReference(containerName);
        container.createIfNotExists();
        CloudBlockBlob cloudBlockBlob = container.getBlockBlobReference(blobName);
        FileOutputStream fileOutputStream=new FileOutputStream(filePath);
        cloudBlockBlob.download(fileOutputStream);
    }catch(Exception ex) {
        ex.printStackTrace();
    }
}

Question:

I have an Eclipse Maven project based on Azure services. I want to run it on a client's machine in another country, which has a different time zone.

When I install Eclipse on their machine and run all the services, they work fine. But when I deployed the WAR to the Apache folder and ran it, it gives an error like:

Make sure the value of Authorization header is formed correctly including the signature.
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:)

I came across this link and I think I am having almost the same type of error:

StorageException for azure blob with java

Please tell me how to solve this and how to make the clock 'slow', as mentioned in the answer at the provided link.

Here is the code (see the blob-related code in the else branch of the first method):

@Override
    public JSONObject syncFiles(JSONObject jsonInput) throws InvalidKeyException, URISyntaxException {
        if (jsonInput.containsKey("accountName")) {
            CloudFileClient fileClient = null;
            String storageConnectionString = "DefaultEndpointsProtocol=https;AccountName="
                    + jsonInput.get("accountName") + ";" + "AccountKey=" + jsonInput.get("accountKey");
            System.out.println(storageConnectionString);
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
            JSONObject jsonOutput = new JSONObject();
            ArrayList fileList = new ArrayList<>();
            try {
                // fileClient =
                // FileClientProvider.getFileClientReference(jsonOutput);
                fileClient = storageAccount.createCloudFileClient();
                String directoryName = jsonInput.get("directoryStructure").toString();

                String[] directoryNameArray = directoryName.split("\\s*/\\s*");
                System.out.println(directoryNameArray.length);

                CloudFileShare share = fileClient.getShareReference(directoryNameArray[0].toLowerCase()
                        .replaceAll("[-+.^:,!@#$%&*()_~`]", "").replaceAll("\\s+", ""));
                if (share.createIfNotExists()) {
                    System.out.println("New share created named as " + directoryNameArray[0].toLowerCase()
                            .replaceAll("[-+.^:,!@#$%&*()_~`]", "").replaceAll("\\s+", ""));
                }
                CloudFileDirectory rootDir = share.getRootDirectoryReference();
                for (int i = 0; i < directoryNameArray.length; i++) {
                    String directoryToCreate = directoryNameArray[i];
                    CloudFileDirectory directory = rootDir.getDirectoryReference(directoryToCreate);

                    String directoryNameToListFiles = directory.getName();
                    if (i == directoryNameArray.length - 1) {
                        for (ListFileItem fileItem : directory.listFilesAndDirectories()) {
                            boolean isDirectory;
                            if (isDirectory = fileItem.getClass() == CloudFileDirectory.class) {
                                System.out.println("Directory Exists Here");
                            } else {
                                System.out.println("Name with files :" + fileItem.getUri().toString());
                                String downloadLocation = "/home/zcon/AzureDownloadedFiles";
                                String fileName[] = fileItem.getUri().toString().split("\\s*/\\s*");
                                for (int j = 0; j < fileName.length; j++) {
                                    if (j == fileName.length - 1) {
                                        String fileNameWithExtension = fileName[j];
                                        File f = new File(downloadLocation + "/" + fileNameWithExtension);
                                        String DownloadTo = f.toString();
                                        f.createNewFile();
                                        CloudFile cloudFile = directory
                                                .getFileReference(fileNameWithExtension.replaceAll("%20", " "));
                                        System.out.println("fileName===========" + fileNameWithExtension);
                                        String tokenKey = testFileSAS(share, cloudFile);
                                        cloudFile.downloadToFile(DownloadTo);
                                        fileList.add(fileItem.getUri().toString() + "?" + tokenKey);
                                        f.delete();
                                    }
                                }
                            }
                        }
                    }
                    rootDir = directory;
                }
                ArrayList fileNamesList = new ArrayList<>();
                for (int i = 0; i < fileList.size(); i++) {
                    String fileName[] = fileList.get(i).toString().split("\\s*/\\s*");
                    for (int j = 0; j < fileName.length; j++) {
                        if (j == fileName.length - 1) {
                            String fileNameReturn = fileName[j];
                            String[] fileNameReturnArray = fileNameReturn.split("\\.");
                            fileNamesList.add(fileNameReturnArray[0].replace("%20", " "));
                        }
                    }
                }
                jsonOutput.put("fileNamesList", fileNamesList);
                jsonOutput.put("fileList", fileList);
                jsonOutput.put("status", "successful");
            } catch (Exception e) {
                System.out.println("Exception is " + e.toString());
                jsonOutput.put("status", "unsuccessful");
                jsonOutput.put("exception", e.toString());
                e.printStackTrace();
            }
            return jsonOutput;
        } else {

            CloudBlobClient blobClient = null;
            String storageConnectionString = "DefaultEndpointsProtocol=https;AccountName="
                    + jsonInput.get("blobAccountName") + ";" + "AccountKey=" + jsonInput.get("blobAccountKey");
            System.out.println(storageConnectionString);
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
            JSONObject jsonOutput = new JSONObject();
            ArrayList fileList = new ArrayList<>();
            ArrayList fileNamesList = new ArrayList<>();
            ArrayList blobItemList = new ArrayList<>();

            try {
                blobClient = storageAccount.createCloudBlobClient();
                String directoryName = jsonInput.get("directoryStructure").toString();
                String[] directoryNameArray = directoryName.split("\\s*/\\s*");
                CloudBlobContainer container = blobClient.getContainerReference(directoryNameArray[0].toLowerCase()
                        .replaceAll("[-+.^:,!@#$%&*()_~`]", "").replaceAll("\\s+", ""));
                if (container.createIfNotExists()) {
                    System.out.println("New share created named as " + directoryNameArray[0].toLowerCase()
                            .replaceAll("[-+.^:,!@#$%&*()_~`]", "").replaceAll("\\s+", ""));
                }
                // CloudBlockBlob blob =
                // container.getBlockBlobReference(jsonInput.get("directoryStructure")+"/"+jsonInput.get("fileToCopy"));
                CloudBlobDirectory directoryOfFile = container
                        .getDirectoryReference(jsonInput.get("directoryStructure").toString());
                for (ListBlobItem blobItem : directoryOfFile.listBlobs()) {
                    // System.out.println(blobItem.getUri());
                    // fileList.add(blobItem.getUri());
                    blobItemList.add(blobItem);

                }

                for(int  q= 0; q<blobItemList.size(); q++){
                    if(blobItemList.get(q).getClass()==CloudBlobDirectory.class)
                    {
                        blobItemList.remove(q);
                    }
                }
                System.out.println(blobItemList);
                for (int l = 0; l < blobItemList.size(); l++) {
                    CloudBlob blob = (CloudBlob) blobItemList.get(l);
                    if (blob.getUri().toString().contains("Temp.txt")) {
                        System.out.println("Temp file was skipped");
                    } else {
                        String tokenKey = testBlobSaS(blob, container);
                        fileList.add(blob.getUri().toString() + "?" + tokenKey);
                    }
                }
                System.out.println("size of blobItemList is=============" + blobItemList.size());

                for (int k = 0; k < fileList.size(); k++) {
                    String fileItem = fileList.get(k).toString();
                    String fileName[] = fileItem.split("\\s*/\\s*");

                    for (int j = 0; j < fileName.length; j++) {
                        if (j == fileName.length - 1) {
                            String fileNameWithExtension = fileName[j];
                            String[] parts = fileNameWithExtension.split("\\?");
                            System.out.println("fileName===========" + fileNameWithExtension);
                            fileNamesList.add(parts[0].replace("%20", " "));
                        }
                    }
                }
                jsonOutput.put("fileList", fileList);
                jsonOutput.put("fileNamesList", fileNamesList);
                jsonOutput.put("status", "successful");
                System.out.println(fileList);
                return jsonOutput;
            } catch (Exception e) {
                System.out.println("Exception is " + e.toString());
                jsonOutput.put("status", "unsuccessful");
                jsonOutput.put("exception", e.toString());
                e.printStackTrace();
            }
            return jsonOutput;
        }

    }

Method to create BlobSAS :

@Test
    // @Category(SlowTests.class)
    public String testBlobSaS(CloudBlob blob, CloudBlobContainer container) throws InvalidKeyException,
            IllegalArgumentException, StorageException, URISyntaxException, InterruptedException {
        SharedAccessBlobPolicy sp = createSharedAccessBlobPolicy(
                EnumSet.of(SharedAccessBlobPermissions.READ, SharedAccessBlobPermissions.LIST), 100);
        BlobContainerPermissions perms = new BlobContainerPermissions();

        perms.getSharedAccessPolicies().put("readperm", sp);
        container.uploadPermissions(perms);
        // Thread.sleep(30000);
        String sas = blob.generateSharedAccessSignature(sp, null);

        CloudBlockBlob sasBlob = new CloudBlockBlob(
                new URI(blob.getUri().toString() + "?" + blob.generateSharedAccessSignature(null, "readperm")));
        sasBlob.download(new ByteArrayOutputStream());

        // do not give the client and check that the new blob's client has the
        // correct perms
        CloudBlob blobFromUri = new CloudBlockBlob(
                PathUtility.addToQuery(blob.getStorageUri(), blob.generateSharedAccessSignature(null, "readperm")));
        assertEquals(StorageCredentialsSharedAccessSignature.class.toString(),
                blobFromUri.getServiceClient().getCredentials().getClass().toString());

        // create credentials from sas
        StorageCredentials creds = new StorageCredentialsSharedAccessSignature(
                blob.generateSharedAccessSignature(null, "readperm"));
        CloudBlobClient bClient = new CloudBlobClient(sasBlob.getServiceClient().getStorageUri(), creds);

        CloudBlockBlob blobFromClient = bClient.getContainerReference(blob.getContainer().getName())
                .getBlockBlobReference(blob.getName());
        assertEquals(StorageCredentialsSharedAccessSignature.class.toString(),
                blobFromClient.getServiceClient().getCredentials().getClass().toString());
        assertEquals(bClient, blobFromClient.getServiceClient());
        return sas;

    }

Method to create shared access blob policy :

private final static SharedAccessBlobPolicy createSharedAccessBlobPolicy(EnumSet<SharedAccessBlobPermissions> sap,
            int expireTimeInSeconds) {

        Calendar cal = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
        cal.setTime(new Date());
        cal.add(Calendar.YEAR, expireTimeInSeconds);
        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
        policy.setPermissions(sap);
        policy.setSharedAccessExpiryTime(cal.getTime());
        return policy;

    }

What changes I should make here?


Answer:

I have faced a similar problem before. What you need to do is make your code work in all time zones, like GMT, IST, EST, etc., because when you upload the WAR to a server in another country, your code must be 'clever' enough to understand that country's time zone!

So here is what you can do :

Step 1:

In the third part of your code, try replacing "UTC" with your client machine's time zone, like GMT, EST, etc.

If that works (and I am pretty sure it will work fine), go for

Step 2:

First of all, we will write something that gives you the current time zone as a String like "Indian Standard Time" or "European Standard Time".

Then, we will pick only the first letter of each word to make a String like "IST" or "EST".

Finally, we will pass this String in place of "UTC" in the third part of your code.

So, here is the code :

private final static SharedAccessBlobPolicy createSharedAccessBlobPolicy(EnumSet<SharedAccessBlobPermissions> sap,
            int expireTimeInSeconds) {
        Calendar now = Calendar.getInstance();
        TimeZone timeZone = now.getTimeZone();
        System.out.println("Current TimeZone is : " + timeZone.getDisplayName());
        String x = timeZone.getDisplayName();
        String[] myName = x.split(" ");
        String s = "";
        ArrayList zoneArray = new ArrayList<>();
        char zone = 0;
        for (int i = 0; i < myName.length; i++) {
            s = myName[i];
            System.out.print(s.charAt(0));
            zone = s.charAt(0);
            zoneArray.add(zone);
        }
        String timeZoneDynamic = zoneArray.toString().replace(",", "").replace(" ", "").replace("[", "").replace("]",
                "");
        System.out.println("Time zone abbreviation: " + timeZoneDynamic);
        Calendar cal = new GregorianCalendar(TimeZone.getTimeZone(timeZoneDynamic));
        cal.setTime(new Date());
        cal.add(Calendar.SECOND, expireTimeInSeconds); // SECOND, not YEAR: the parameter is in seconds
        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
        policy.setPermissions(sap);
        policy.setSharedAccessExpiryTime(cal.getTime());
        return policy;
    } 

In this, timeZoneDynamic will have values like "IST", "GMT", etc. This logic should work. If you get any error, post it in an edit. Hope that helps.
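One thing worth knowing here (a plain-Java sketch, no Azure SDK involved, and not part of the original answer): java.util.Date holds an absolute instant in epoch milliseconds, so the expiry a SharedAccessBlobPolicy carries is the same instant no matter which time zone the Calendar uses. What actually matters for the Authorization-header error is that the machine clock agrees with Azure's servers:

```java
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class ExpiryInstant {

    // Build an expiry Date a given number of seconds from now. A Date is an
    // absolute instant (epoch millis), so the calendar's time zone does not
    // change the result -- only the machine clock does.
    static Date expiry(String zoneId, int seconds) {
        Calendar cal = new GregorianCalendar(TimeZone.getTimeZone(zoneId));
        cal.setTime(new Date());
        cal.add(Calendar.SECOND, seconds);
        return cal.getTime();
    }

    public static void main(String[] args) {
        long diff = Math.abs(expiry("UTC", 3600).getTime()
                           - expiry("IST", 3600).getTime());
        System.out.println(diff < 1000); // same instant either way
    }
}
```

So if the SAS is rejected only on the client's server, the first thing to verify is that the server's clock is synchronized (e.g. via NTP), since a clock that runs ahead makes the signature's validity window appear to start in the future.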

Question:

Background

Java Azure Functions 2, using Blob storage with an Event Grid subscription for blob-created events, to which the function (see below) is bound via the event trigger.

Problem

It is not clear how to bind a blob (see the @BlobInput Java annotation) as an input blob binding in Azure Functions, which the documentation alludes to; I am not sure this is possible in the Java API, unlike its C# counterpart.

When the function is invoked, nothing is bound to the content variable via the @BlobInput annotation, so once the line where the content variable is used is reached, it results in a null pointer.

The path = "{data.url}", based on the documentation, allows you to access event data passed to the function. The data passed from the event is also all bound to the EventSchema event POJO (see below for an example of an event).

The @StorageAccount("AzureWebJobsStorage") links to the properties stored and set up by default via the function's configuration, which is correct.

Tried

Deployed azure function:

@StorageAccount("AzureWebJobsStorage")
@FunctionName("myfunc")
public void run(@EventGridTrigger(name = "blobeventgrid") EventSchema event,
                @BlobInput(name = "zipfile",dataType = "binary",path = "{data.url}") byte[] content,
                final ExecutionContext context) {
    context.getLogger().info("Java Event Grid trigger function executed.");
    context.getLogger().info("Id: " + event.id);
    context.getLogger().info("Data: " + event.data);
    context.getLogger().info("zip file: " + content.length);
}

Example Event Grid Event

{
  "topic": "/subscriptions/<omitted>/resourceGroups/java-functions-group/providers/Microsoft.Storage/storageAccounts/<omitted storageaccount>",
  "subject": "/blobServices/default/containers/mycontainer/blobs/compressed.zip",
  "eventType": "Microsoft.Storage.BlobCreated",
  "eventTime": "2019-10-02T12:46:33.2915427Z",
  "id": "<omitted>",
  "data": {
  "api": "PutBlob",
  "clientRequestId": "<omitted>",
  "requestId": "<omitted>",
  "eTag": "<omitted>",
  "contentType": "application/zip",
  "contentLength": 32460,
  "blobType": "BlockBlob",
  "url": "https://<omitted storageaccount>.blob.core.windows.net/mycontainer/compressed.zip",
  "sequencer": "<omitted>",
  "storageDiagnostics": {
  "batchId": "<omitted>"
    }
  },
  "dataVersion": "",
  "metadataVersion": "1"
}

Log from running function locally (remote is same)

[10/05/2019 18:48:16] Executing HTTP request: {
[10/05/2019 18:48:16]   "requestId": "299a3870-98cf-41cf-b418-7cdb33c1f1c7",
[10/05/2019 18:48:16]   "method": "POST",
[10/05/2019 18:48:16]   "uri": "/runtime/webhooks/EventGrid"
[10/05/2019 18:48:16] }
[10/05/2019 18:48:17] Executing 'Functions.myFunc' (Reason='EventGrid trigger fired at 2019-10-05T19:48:17.4343990+01:00', Id=82a2f47b-34bc-492f-8b60-12601beb45ee)
[10/05/2019 18:48:18] Java Event Grid trigger function executed.
[10/05/2019 18:48:18] Event content 
[10/05/2019 18:48:18] Subject: /blobServices/default/containers/mycontainer/blobs/zip/compressed.zip
[10/05/2019 18:48:18] Time: Mon Sep 30 20:46:33 BST 2019
[10/05/2019 18:48:18] Id: 7de5edc4-c01e-0107-1bc7-77755f061e49
[10/05/2019 18:48:18] Data: {api=PutBlob, clientRequestId=007dd554-e3bb-11e9-80b4-dca90473b192, requestId=7de5edc4-c01e-0107-1bc7-77755f000000, eTag=0x8D745DEE5936EE3, contentType=application/zip, contentLength=32460.0, blobType=BlockBlob, url=https://<ommitted storage account>.blob.core.windows.net/mycontainer/zip/compressed.zip, sequencer=000000000000000000000000000007E200000000002ab872, storageDiagnostics={batchId=1c15a3b6-2006-0046-00c7-771b19000000}}
[10/05/2019 18:48:18] Executed 'Functions.myFunc' (Failed, Id=82a2f47b-34bc-492f-8b60-12601beb45ee)
[10/05/2019 18:48:18] System.Private.CoreLib: Exception while executing function: Functions.myFunc. System.Private.CoreLib: Result: Failure
[10/05/2019 18:48:18] Exception: NullPointerException: 

It falls over because nothing is bound to the content byte[]...

Alternative

Using the Azure Java SDK directly would work, but I am trying to stay within the semantics of Azure Functions bindings.


Answer:

Your function is almost correct. Based on my test, the value of {data.url} will be an HTTP URL like the following:

https://storagetest789.blob.core.windows.net/test/test.txt

And if you set the correct storage connection string, the binding will work and you will get the content.

Here is my verification:

1. Code
public class Function {

    @FunctionName("StroageEventGrid")
    @StorageAccount("AzureWebJobsStorage")
    public void run(@EventGridTrigger(name = "blobeventgrid") EventSchema event, 
                    @BlobInput(name = "blob",dataType = "binary",path = "{data.url}") byte[] content,
                    final ExecutionContext context) 
    {
        context.getLogger().info((String)event.data.get("url"));
        if(content != null)
            context.getLogger().info("Length: " + content.length);
        else
            context.getLogger().info("Content is null");
    }
}

public class EventSchema {

  public String topic;
  public String subject;
  public String eventType;
  public Date eventTime;
  public String id;
  public String dataVersion;
  public String metadataVersion;
  public Map<String, Object> data;

}
2. Check the AzureWebJobsStorage connection string in the application settings

Make sure that it is the correct connection string of your target storage account.

3. Upload a new blob

You can see that I get the right result.

So, I suggest you check your connection string setting. By default, your local test settings are not uploaded to the cloud, which may cause this issue.
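As a rough illustration of that check (a sketch of my own, not the Functions runtime's actual parser, with a made-up account name and key), a usable AzureWebJobsStorage value must at least carry the AccountName and AccountKey of the target storage account, otherwise the {data.url} binding cannot authenticate and content stays null:

```java
import java.util.Arrays;

public class ConnStringCheck {

    // Minimal shape check: the connection string is a ;-separated list of
    // key=value pairs and must name the account and carry its key.
    static boolean looksValid(String connString) {
        boolean hasName = Arrays.stream(connString.split(";"))
                .map(part -> part.split("=", 2)[0])
                .anyMatch("AccountName"::equals);
        return hasName && connString.contains("AccountKey=");
    }

    public static void main(String[] args) {
        System.out.println(looksValid(
            "DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=abc123==")); // true
    }
}
```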

Question:

I am developing an Azure Function using Java. I need to iterate over all the files in the following folder:

aDirectory/aSubdirectoryWithManyFiles/

There are many files in that path:

aDirectory/aSubdirectoryWithManyFiles/file1
aDirectory/aSubdirectoryWithManyFiles/file2
aDirectory/aSubdirectoryWithManyFiles/file3
aDirectory/aSubdirectoryWithManyFiles/file4
aDirectory/aSubdirectoryWithManyFiles/file5

so I wrote the following code in order to get them all:

// myCloudBlobContainer is a CloudBlobContainer
// I expected to get all files thanks to the next row
Iterable<ListBlobItem> blobs = myCloudBlobContainer.listBlobs();
// The only blob found in the container is the directory itself
for (ListBlobItem blob : blobs) {
    //log the current blob URI
    if (blob instanceof CloudBlob) {  // this never happens
        CloudBlob cloudBlob = (CloudBlob) blob;
        //make nice things with every found file
    }
}

The only blob iterated in the for loop is the directory itself, none of the expected files, so in the logs I get only the following URI:

https://blablablabla.blob.core.windows.net/aDirectory/aSubdirectoryWithManyFiles/

What should I do in order to access every file?

And what if I had more than one subdirectory, as in the following example?

aDirectory/aSubdirectoryWithManyFiles/files(1-5)
aDirectory/anotherSubdirectoryWithManyFiles/files(6-10)

Thanks in advance


Edit

In order to make the methods testable, the project uses wrappers and interfaces instead of using a CloudBlobContainer directly; basically, the CloudBlobContainer is given by CloudBlobClient.getContainerReference("containername").

After the answer to this question, I changed the code to use listBlobs with parameters, myCloudBlobContainer.listBlobs("aDirectory", true), and I wrote the following in order to get them all:

// myCloudBlobClient is a CloudBlobClient
CloudBlobContainer myCloudBlobContainer = myCloudBlobClient.getContainerReference("containername");
// I expected to get all files thanks to the next row
Iterable<ListBlobItem> blobs = myCloudBlobContainer.listBlobs("aDirectory", true); // HERE THE CHANGE
// No blob found this time
for (ListBlobItem blob : blobs) { // NEVER IN THE FOR
    //log the current blob URI
    if (blob instanceof CloudBlob) {
        CloudBlob cloudBlob = (CloudBlob) blob;
        //make nice things with every found file
    }
}

But this time, it doesn't enter the for loop at all...


Answer:

Try using the following overload of the listBlobs method:

listBlobs(String prefix, boolean useFlatBlobListing)

So your code would be:

Iterable<ListBlobItem> blobs = myCloudBlobContainer.listBlobs("aDirectory", true);

This will list all blobs inside the "aDirectory" virtual folder in your blob container.
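To illustrate the difference (a plain-Java simulation with made-up blob names, not the SDK itself): with useFlatBlobListing set to true, the service enumerates every blob whose name starts with the prefix, at any depth, instead of stopping at the first / delimiter and returning a CloudBlobDirectory:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlatListingDemo {

    // Simulate listBlobs(prefix, useFlatBlobListing = true): return every
    // blob name under the prefix, regardless of virtual-directory depth.
    static List<String> flatList(List<String> allBlobNames, String prefix) {
        return allBlobNames.stream()
                .filter(name -> name.startsWith(prefix))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList(
                "aDirectory/aSubdirectoryWithManyFiles/file1",
                "aDirectory/aSubdirectoryWithManyFiles/file2",
                "aDirectory/anotherSubdirectoryWithManyFiles/file6",
                "otherDirectory/file9");
        // Prints the three names under aDirectory, from both subdirectories.
        System.out.println(flatList(names, "aDirectory"));
    }
}
```

This also answers the multi-subdirectory case in the question: files under both aSubdirectoryWithManyFiles and anotherSubdirectoryWithManyFiles come back in a single flat pass.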

Question:

I want to download a virtual directory from Azure Blob Storage. The directory is present in a container. I tried using the CloudBlobDirectory API, but it doesn't have any method for downloading a virtual directory.

for (ListBlobItem blobItem : container.listBlobs()) { 
    // If the item is a blob, a virtual directory. 
    if (blobItem instanceof CloudBlobDirectory) { 
        CloudBlobDirectory blob1 = (CloudBlobDirectory) blobItem; 
        System.out.println("\n Blob prefix:" + blob1.getPrefix()); 
    }
}

Answer:

As @talex said, you can download blob file from directory one by one like this:

public static String destFilePath = "/path/to/directory/";

public static void main(String[] args)
        throws InvalidKeyException, URISyntaxException, StorageException, IOException {

    CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
    CloudBlobClient serviceClient = account.createCloudBlobClient();
    CloudBlobContainer container = serviceClient.getContainerReference("mycontainer");

    for (ListBlobItem blobItem : container.listBlobs()) {

        // If the item is a blob, a virtual directory.
        if (blobItem instanceof CloudBlobDirectory) {

            CloudBlobDirectory blobDir = (CloudBlobDirectory) blobItem;
            downloadDirectory(blobDir);
        }
    }
}

public static void downloadDirectory(CloudBlobDirectory blobDir)
        throws IOException, StorageException, URISyntaxException {

    Files.createDirectories(Paths.get(destFilePath + blobDir.getPrefix()));

    for (ListBlobItem blobInDir : blobDir.listBlobs()) {

        if (blobInDir instanceof CloudBlockBlob) {
            CloudBlockBlob blob = (CloudBlockBlob) blobInDir;
            blob.downloadToFile(destFilePath + blob.getName());
        } else {
            downloadDirectory((CloudBlobDirectory) blobInDir);
        }
    }

}
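One caveat: the destFilePath + blob.getName() concatenation above assumes '/' path separators. If portability matters, a small hypothetical helper built on java.nio.file.Paths may be safer (the LocalPaths name and method are my own, not part of the SDK):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class LocalPaths {
    // Map a blob name like "aDirectory/sub/file.txt" onto a local destination
    // root, letting Paths insert the platform's separator for each segment.
    static Path localPathFor(String destRoot, String blobName) {
        return Paths.get(destRoot, blobName.split("/"));
    }

    public static void main(String[] args) {
        System.out.println(localPathFor("/path/to/directory", "aDirectory/sub/file.txt"));
    }
}
```

You could then call Files.createDirectories(localPathFor(...).getParent()) before downloading each blob.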

Question:

I'm trying to upload data to the same Block Blob in parallel, from multiple java processes. The processes have no way of talking to each other, so they won't know when the last block has been uploaded.

I'd like to commit each block immediately after it has been uploaded by a process. I thought I could use commitBlockList to do this, but its behavior seems to wipe out uncommitted blocks. I tried synchronizing access to the blob via a lease, so that each process would acquire a lease on the blob and pull down the blocklist via downloadBlockList, and update the blocks each process has uploaded, but this still has unexpected behavior (blocks keep disappearing from the list).

Is there an operation in the Java Azure storage client that allows me to update the committed status of a SINGLE block in a block blob? Alternatively, is there some parameter I can pass to uploadBlock that would make it immediately set the block's status to COMMITTED after it's finished uploading?


Answer:

Ok, I figured out my misunderstanding, from the docs:

"Any block commitment overwrites the blob’s existing properties and metadata, and discards all uncommitted blocks."

I was only acquiring the lease when I wanted to update the block list, not realizing that when I committed the block list, I would be destroying any uncommitted blocks not included in the list. I didn't acquire the lease when writing my blocks, since that would destroy any parallelism I'd get from having multiple different processes uploading at the same time.

I'll just have to try something else.
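For anyone attempting something similar: whichever process eventually calls commitBlockList must supply the complete list of block IDs, and the service additionally requires all block IDs within a blob to be Base64-encoded strings of equal length. A minimal sketch of generating such IDs (the blockId name and the 8-digit width are my own choices, not mandated by the SDK):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BlockIds {
    // Azure requires every block ID in a blob to be Base64-encoded and the
    // same length, so zero-pad a sequence number to a fixed width first.
    static String blockId(int sequenceNumber) {
        String padded = String.format("%08d", sequenceNumber);
        return Base64.getEncoder().encodeToString(padded.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        System.out.println(blockId(1));
        System.out.println(blockId(42));
    }
}
```

With uniform IDs like these, a single coordinating process can collect every process's IDs and issue one final commitBlockList containing them all.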

Question:

The problem is that I want to apply Azure server-side encryption to a file that is already present in an Azure blob container, but I cannot find any way to set server-side encryption on it.


Answer:

Quoting from documentation (emphasis mine):

Existing Data - SSE only encrypts newly created data after the encryption is enabled. If for example you create a new Resource Manager storage account but don't turn on encryption, and then you upload blobs or archived VHDs to that storage account and then turn on SSE, those blobs will not be encrypted unless they are rewritten or copied.

So only data written after you enable SSE gets encrypted. You will need to do as they say and rewrite/copy the files.

Question:

I am new to implementing Azure Mobile Services. I have referred to the ToDoItem demo provided by Azure.

In the same manner I have made a User class for my own app. When I insert the data into the MobileServiceTable, it gives me an error like the one below:

{"message":"The operation failed with the following error: 'A null store-generated value was returned for a non-nullable member 'CreatedAt' of type 'CrazyLabApp.Models.User'.'."}

I have not created any field like this, as it is not created in the ToDoItem demo either. I have seen that there are 4 fields created by default by the MobileServiceTable, and createdAt is one of them.

I wonder what I am doing wrong.

Check my User class below:

   public class User {
    @com.google.gson.annotations.SerializedName("id")
    private String ServiceUserId;

    @com.google.gson.annotations.SerializedName("email")
    private String Email;

    @com.google.gson.annotations.SerializedName("firstname")
    private String FirstName;

    @com.google.gson.annotations.SerializedName("lastname")
    private String LastName;

    @com.google.gson.annotations.SerializedName("profilepic")
    private String ProfilePic;

    @com.google.gson.annotations.SerializedName("introduction")
    private String Introduction;

    @com.google.gson.annotations.SerializedName("website")
    private String Website;

    @com.google.gson.annotations.SerializedName("title")
    private String Title;

    @com.google.gson.annotations.SerializedName("_createdAt")
    private Date CreatedAt;

    @com.google.gson.annotations.SerializedName("coverimage")
    private ArrayList<CoverImage> CoverImages;

    /*public Date getCreatedAt() {
        return CreatedAt;
    }

    public void setCreatedAt(Date createdAt) {
        CreatedAt = createdAt;
    }*/

    @com.google.gson.annotations.SerializedName("followers")
    private ArrayList<User> Followers;

    @com.google.gson.annotations.SerializedName("likes")
    private ArrayList<Likes> Likes;

    @com.google.gson.annotations.SerializedName("collections")
    private ArrayList<Collections> Collections;

    @com.google.gson.annotations.SerializedName("comments")
    private ArrayList<Comments> Comments;

    @com.google.gson.annotations.SerializedName("stories")
    private ArrayList<Story> Stories ;




    //-------------- Methods
    public ArrayList<Story> getStories() {
        return Stories;
    }

    public void setStories(ArrayList<Story> stories) {
        Stories = stories;
    }

    public ArrayList<com.promact.crazylab.model.Comments> getComments() {
        return Comments;
    }

    public void setComments(ArrayList<com.promact.crazylab.model.Comments> comments) {
        Comments = comments;
    }

    public ArrayList<com.promact.crazylab.model.Collections> getCollections() {
        return Collections;
    }

    public void setCollections(ArrayList<com.promact.crazylab.model.Collections> collections) {
        Collections = collections;
    }

    public ArrayList<com.promact.crazylab.model.Likes> getLikes() {
        return Likes;
    }

    public void setLikes(ArrayList<com.promact.crazylab.model.Likes> likes) {
        Likes = likes;
    }

    public ArrayList<User> getFollowers() {
        return Followers;
    }

    public void setFollowers(ArrayList<User> followers) {
        Followers = followers;
    }

    public ArrayList<CoverImage> getCoverImages() {
        return CoverImages;
    }

    public void setCoverImages(ArrayList<CoverImage> coverImages) {
        CoverImages = coverImages;
    }

    public String getTitle() {
        return Title;
    }

    public void setTitle(String title) {
        Title = title;
    }

    public String getWebsite() {
        return Website;
    }

    public void setWebsite(String website) {
        Website = website;
    }

    public String getIntroduction() {
        return Introduction;
    }

    public void setIntroduction(String introduction) {
        Introduction = introduction;
    }

    public String getLastName() {
        return LastName;
    }

    public void setLastName(String lastName) {
        LastName = lastName;
    }

    public String getProfilePic() {
        return ProfilePic;
    }

    public void setProfilePic(String profilePic) {
        ProfilePic = profilePic;
    }

    public String getEmail() {
        return Email;
    }

    public void setEmail(String email) {
        Email = email;
    }

    public String getFirstName() {
        return FirstName;
    }

    public void setFirstName(String firstName) {
        FirstName = firstName;
    }

    public String getServiceUserId() {
        return ServiceUserId;
    }

    public void setServiceUserId(String serviceUserId) {
        ServiceUserId = serviceUserId;
    }

    @Override
    public boolean equals(Object o) {
        // Compare IDs with equals(), not ==, since they are Strings
        return o instanceof User && java.util.Objects.equals(((User) o).ServiceUserId, ServiceUserId);
    }

}

Also check below the code for the way I am inserting it:

final User u = new User();
u.setFirstName(mName);
u.setEmail(mEmail);
u.setProfilePic(mUrl);

mUserTable = mClient.getTable(User.class);

// Insert the new item
new AsyncTask<Void, Void, Void>() {
    @Override
    protected Void doInBackground(Void... params) {
        try {
            final User entity = mUserTable.insert(u).get();
        } catch (Exception e) {
            //createAndShowDialog(e, "Error");
            System.out.println("Error: " + e.toString());
        }
        return null;
    }
}.execute();

Please help me in this.


Answer:

The "_createdat" column will be populated automatically by Azure Mobile Services so there is no need to include it in your model. Delete this property from the User class. Its presence is probably overwriting the auto-populated value with a null.

Question:

I am generating a URL for a public Azure blob by:

    String connectStr = "connection string";
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectStr).buildClient();

    BlobContainerClient containerClient = blobServiceClient.getBlobContainerClient("container name");

    BlobClient blobClient=containerClient.getBlobClient("blob name");

    String newstring = blobClient.getBlobUrl();
    System.out.println(newstring);

but the problem is that this generates a URL that only works for a public blob. How can I get a usable URL for a private blob?


Answer:

Check this Java Storage V12 changelog; there is a description of the SAS situation:

It was not discoverable how to do something as fundamental as create a SAS token because there was no generateSAS method, and figuring how to attach a SAS to a URL was yet another problem.

So for now, if you want to use the v12 SDK, I have to say there is no way to implement it. If you accept another SDK version, you could refer to the code below, which uses the v8 SDK.

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.StorageException;
import com.microsoft.azure.storage.blob.*;
import java.net.URISyntaxException;
import java.security.InvalidKeyException;
import java.util.*;

public class App 
{
    public static void main( String[] args ) throws URISyntaxException, InvalidKeyException, StorageException {

        String storageConnectionString ="connection string";

        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("test");

        CloudBlockBlob blob = container.getBlockBlobReference("test.txt");

        SharedAccessBlobPolicy sasPolicy = new SharedAccessBlobPolicy();

        // Create a UTC Gregorian calendar value.
        GregorianCalendar calendar = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
        // Use the start time delta one hour as the end time for the shared
        // access signature.
        calendar.add(Calendar.HOUR, 10);
        sasPolicy.setSharedAccessExpiryTime(calendar.getTime());

        sasPolicy.setPermissions(EnumSet.of(SharedAccessBlobPermissions.READ, SharedAccessBlobPermissions.WRITE,
                SharedAccessBlobPermissions.LIST));
        String sas = blob.generateSharedAccessSignature(sasPolicy,null);

        String sasurl=blob.getUri()+"?"+sas;

        System.out.println(sasurl);


    }
}

My dependency:

<dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.microsoft.azure</groupId>
      <artifactId>azure-storage</artifactId>
      <version>8.4.0</version>
    </dependency>

  </dependencies>

Question:

I am looking at the following code example at https://github.com/Azure/azure-functions-java-worker

public class MyClass {
    @FunctionName("copy")
    @StorageAccount("AzureWebJobsStorage")
    @BlobOutput(name = "$return", path = "samples-output-java/{name}")
    public static String copy(@BlobTrigger(name = "blob", path = "samples-input-java/{name}") String content) {
        return content;
    }
}

In @BlobOutput we are using the {name} parameter, because it was provided to us by the @BlobTrigger. How can I dynamically generate that name in my function?

I want my blob name to be files/E36567AB1B93F7D9798 where the E36567AB1B93F7D9798 part is a hash generated from blob content. I want to generate it inside the function and return the hash as output. Sort of like GitHub creates unique IDs for files.


Answer:

If you just want a unique, dynamic value for the blob name, I recommend using {rand-guid}. Here's the document.

@BlobOutput(name = "$return", path = "samples-output-java/files/{rand-guid}").

You will get a blob named like 85546257-97f8-43ea-961e-a8bbe70e009d in the virtual directory files.

If you have to use the hash value of your file content, for now that's not supported in bindings; you can use the Azure Storage SDK inside the function to specify your blob name.

Here's the related GitHub issue for reference.
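If you go the SDK route, the content hash itself is plain Java. A git-style sketch that derives a blob name from a SHA-256 digest (the files/ prefix, the ContentHash name, and the hex formatting are illustration only, not anything the bindings require):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ContentHash {
    // Derive a blob name from the SHA-256 of the content, so identical
    // content always maps to the same blob path.
    static String hashName(byte[] content) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(content);
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02X", b)); // hex, uppercase
            }
            return "files/" + sb;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 is always available", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(hashName("hello".getBytes(StandardCharsets.UTF_8)));
    }
}
```

Inside the function you would compute this from the trigger content, then write the blob via the storage SDK under that name and return the hash as the function's output.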

Question:

Below is my code to upload a file to the Azure Blob Store using the

com.microsoft.azure.storage

library

public class BlobUploader {
    private CloudBlobContainer blobContainer;
    private static Logger LOGGER = LoggerFactory.getLogger(BlobUploader.class);

    /**
     * Constructor of the BlobUploader
     * 
     * @param storageAccountName The storage account name where the files will be uploaded to.
     * @param storageAccountKey The storage account key of the storage account
     * @param containerName The container name where the files will be uploaded to.
     */
    public BlobUploader( String storageAccountName, String storageAccountKey, String containerName ) {

        String storageConnectionString = "DefaultEndpointsProtocol=http;AccountName=" + storageAccountName + ";AccountKey=" + storageAccountKey;

        CloudStorageAccount storageAccount;
        try {
            storageAccount = CloudStorageAccount.parse( storageConnectionString );
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
            // Retrieve reference to a previously created container.
            this.blobContainer = blobClient.getContainerReference( containerName );
        } catch ( Exception e ) {
            LOGGER.error( "failed to construct blobUploader", e );
        }
    }

    public void upload( String filePath ) throws Exception {

        // Define the path to blob in the container
        String blobPath = "uploads/"; // trailing slash so the file name isn't fused onto "uploads"
        File fileToBeUploaded = new File( filePath );
        String fileName = fileToBeUploaded.getName();

        String blobName = blobPath + fileName;

        // Create or overwrite the blob with contents from a local file.
        CloudBlockBlob blob = blobContainer.getBlockBlobReference( blobName );

        System.out.println( "start uploading file " + filePath + " to blob " + blobName );

        blob.upload( new FileInputStream( fileToBeUploaded ), fileToBeUploaded.length() );
        System.out.println( "upload succeeded." );
    }
}

I am looking for an API that, given the path of a file uploaded to the Azure Blob Store, can return the properties of that file, specifically the date and time it was uploaded.

Is there an API in Java that supports this?


Answer:

I am looking for an API that, given the path of a file uploaded to the Azure Blob Store, can return the properties of that file, specifically the date and time it was uploaded.

The method you're looking for is downloadAttributes(), which populates the blob's properties and metadata. The properties are of type BlobProperties and contain information about the blob; the method you would want there is getLastModified().

However this will return the date/time when the blob was last updated. So if you create a blob and make no changes to it, this property can be used to find out when it was uploaded. However if you make any changes to the blob after it has been created (like setting properties/metadata etc.), then the value returned is the date/time when it was last changed.

If you're interested in finding about when a blob was created, you may want to store this information as custom metadata along with the blob.

You can get detailed information about the SDK here: http://azure.github.io/azure-storage-java/.
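The custom-metadata idea can be sketched without the SDK at all. In real code you would write into the map returned by blob.getMetadata() and then call blob.uploadMetadata(); here a plain map stands in for the blob's metadata, and the uploadedAt key is my own naming:

```java
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class UploadTimeMetadata {
    // Record the upload time in blob metadata. Using putIfAbsent means later
    // metadata edits (which bump last-modified) won't clobber the original stamp.
    static void stampUploadTime(Map<String, String> metadata, Instant now) {
        metadata.putIfAbsent("uploadedAt", now.toString());
    }

    public static void main(String[] args) {
        Map<String, String> metadata = new HashMap<>();
        stampUploadTime(metadata, Instant.parse("2016-07-13T18:18:09Z"));
        System.out.println(metadata.get("uploadedAt"));
    }
}
```

Reading it back later is then just a map lookup after downloadAttributes(), independent of how many times the blob's other properties have changed.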

Question:

I've been trying to integrate Azure blob storage with my Android app, but I've been having trouble. I'm trying to upload a text file to the server, but I get an IOException and the following log:

09-10 14:40:46.322 16945-16945/com.tmacstudios.topmeme E/TopMeme: successful connection to server
09-10 14:40:46.322 16945-16945/com.tmacstudios.topmeme E/TopMeme: upload beginning for clueful_log.txt @ /storage/emulated/0/clueful_log.txt
09-10 14:40:46.332 16945-16945/com.tmacstudios.topmeme E/TopMeme: File size: 111394
09-10 14:40:46.332 16945-16945/com.tmacstudios.topmeme E/TopMeme: available: 111394
09-10 14:40:46.342 16945-16995/com.tmacstudios.topmeme I/System.out: (HTTPLog)-Static: isSBSettingEnabled false
09-10 14:40:46.342 16945-16995/com.tmacstudios.topmeme I/System.out: (HTTPLog)-Static: isSBSettingEnabled false
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme I/System.out: (HTTPLog)-Static: isSBSettingEnabled false
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme I/System.out: (HTTPLog)-Static: isSBSettingEnabled false
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err: java.io.IOException
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:592)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.BlobOutputStream.close(BlobOutputStream.java:309)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.CloudBlockBlob.upload(CloudBlockBlob.java:679)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.CloudBlockBlob.upload(CloudBlockBlob.java:595)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.tmacstudios.topmeme.MainActivity.uploadFile(MainActivity.java:163)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.tmacstudios.topmeme.MainActivity.onCreate(MainActivity.java:84)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.Activity.performCreate(Activity.java:6876)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1135)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3207)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3350)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.ActivityThread.access$1100(ActivityThread.java:222)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1795)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:102)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.os.Looper.loop(Looper.java:158)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:7229)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at java.lang.reflect.Method.invoke(Native Method)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120)
09-10 14:40:46.472 16945-16945/com.tmacstudios.topmeme W/System.err: Caused by: com.microsoft.azure.storage.StorageException: Network operations may not be performed on the main thread.
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:188)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.CloudBlockBlob.commitBlockList(CloudBlockBlob.java:324)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.BlobOutputStream.commit(BlobOutputStream.java:339)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.blob.BlobOutputStream.close(BlobOutputStream.java:306)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:    ... 16 more
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err: Caused by: android.os.NetworkOnMainThreadException
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at android.os.StrictMode$AndroidBlockGuardPolicy.onNetwork(StrictMode.java:1273)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at libcore.io.BlockGuardOs.recvfrom(BlockGuardOs.java:249)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at libcore.io.IoBridge.recvfrom(IoBridge.java:549)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at java.net.PlainSocketImpl.read(PlainSocketImpl.java:481)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at java.net.PlainSocketImpl.access$000(PlainSocketImpl.java:37)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at java.net.PlainSocketImpl$PlainSocketInputStream.read(PlainSocketImpl.java:237)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.okio.Okio$2.read(Okio.java:140)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.okio.AsyncTimeout$2.read(AsyncTimeout.java:211)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.okio.RealBufferedSource.exhausted(RealBufferedSource.java:70)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.http.HttpConnection.isReadable(HttpConnection.java:165)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.Connection.isReadable(Connection.java:1524)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.OkHttpClient$1.isReadable(OkHttpClient.java:91)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.http.HttpEngine.createNextConnection(HttpEngine.java:475)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.http.HttpEngine.nextConnection(HttpEngine.java:465)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.http.HttpEngine.connect(HttpEngine.java:447)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.http.HttpEngine.sendRequest(HttpEngine.java:353)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.huc.HttpURLConnectionImpl.execute(HttpURLConnectionImpl.java:468)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.huc.HttpURLConnectionImpl.connect(HttpURLConnectionImpl.java:118)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.android.okhttp.internal.huc.HttpURLConnectionImpl.getOutputStream(HttpURLConnectionImpl.java:249)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:     at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:103)
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme W/System.err:    ... 19 more
09-10 14:40:46.482 16945-16945/com.tmacstudios.topmeme E/TopMeme: upload failed: java.io.IOException

Here is my function that's calling that. I can confirm that the file path exists:

void uploadFile(String filePath, String name){
        Log.e("TopMeme","upload beginning for "+name+" @ "+filePath);
        try
        {
            // Retrieve storage account from connection-string.
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

            // Create the blob client.
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

            // Retrieve reference to a previously created container.
            CloudBlobContainer container = blobClient.getContainerReference("memecontainer");

            // Create or overwrite the "myimage.jpg" blob with contents from a local file.
            CloudBlockBlob blob = container.getBlockBlobReference(name);
            File source = new File(filePath);
            Log.e("TopMeme","File size: "+source.length());
            FileInputStream fileInputStream = new FileInputStream(source);
            Log.e("TopMeme","available: "+fileInputStream.available());
            blob.upload(fileInputStream, source.length());
            Log.e("TopMeme","upload function completed");
        }
        catch (Exception e)
        {
            // Output the stack trace.
            e.printStackTrace();
            Log.e("TopMeme","upload failed: "+e);
        }
    }

Answer:

So it turns out that on Android you can't do any networking on the main thread, so you have to do the upload on another thread. Here is my new function:

void uploadFile(final String filePath,final String name){
        Log.e("TopMeme","upload beginning for "+name+" @ "+filePath);

        //cant perform network tasks on main thread
        AsyncTask<Void,Void,Void> task = new AsyncTask<Void,Void,Void>(){
            @Override
            protected Void doInBackground(Void... params) {
                try
                {
                    // Retrieve storage account from connection-string.
                    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

                    // Create the blob client.
                    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

                    // Retrieve reference to a previously created container.
                    CloudBlobContainer container = blobClient.getContainerReference("memecontainer");

                    /*
                    //create blob if it doesn't exist - hopefully resolves bugs
                    container.createIfNotExists();

                    // Create a permissions object.
                    BlobContainerPermissions containerPermissions = new BlobContainerPermissions();

                    // Include public access in the permissions object.
                    containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);

                    // Set the permissions on the container.
                    container.uploadPermissions(containerPermissions);
                    */

                    // Create or overwrite the "myimage.jpg" blob with contents from a local file.
                    CloudBlockBlob blob = container.getBlockBlobReference(name);
                    File source = new File(filePath);
                    Log.e("TopMeme","File size: "+source.length());
                    FileInputStream fileInputStream = new FileInputStream(source);
                    Log.e("TopMeme","available: "+fileInputStream.available());
                    blob.upload(fileInputStream, source.length());
                    Log.e("TopMeme","upload function completed");
                }
                catch (Exception e)
                {
                    // Output the stack trace.
                    e.printStackTrace();
                    Log.e("TopMeme","upload failed: "+e);
                }
                return null;
            }
        };

        task.execute();
    }
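AsyncTask has since been deprecated; the same off-main-thread pattern can be expressed with a plain ExecutorService from the JDK. The sketch below is a minimal stand-in, not part of any SDK: the Runnable is where the blob.upload(...) call from the function above would go, and the class and method names are illustrative.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundUpload {
    // A single background worker for storage calls, so they never run on
    // the main (UI) thread.
    private static final ExecutorService STORAGE_POOL = Executors.newSingleThreadExecutor();

    // Submits the upload work and returns a Future the caller can poll or await.
    static Future<?> uploadAsync(Runnable uploadWork) {
        return STORAGE_POOL.submit(uploadWork);
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the real Azure call: the body of this Runnable is
        // where blob.upload(fileInputStream, source.length()) would go.
        Future<?> done = uploadAsync(() -> System.out.println("uploading off the main thread"));
        done.get();            // block only in this demo; a UI app would not
        STORAGE_POOL.shutdown();
    }
}
```

Unlike AsyncTask, the executor is not tied to the Activity lifecycle, so a long upload survives configuration changes; the caller just has to be careful not to touch UI views from the worker thread.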

Question:

The user selects images from the gallery, and I am uploading those selected images to Windows Azure Blob Storage. But while uploading I am getting a NullPointerException.

I couldn't find any solution on the internet. The ArrayList<String> 'selected' consists of the paths of the selected images. The paths are displayed in this format in the logcat: mnt/sdcard/Pictures/image1.jpg

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
        Bundle savedInstanceState) {

    selected = new ArrayList<String>();

    try {
        // Retrieve storage account from connection-string.
        storageAccount = CloudStorageAccount.parse(storageConnectionString);

        // Create the blob client.
        blobClient = storageAccount.createCloudBlobClient();

        // Get a reference to a container.
        // The container name must be lower case
        blobContainer = blobClient.getContainerReference("mycontainer");

        // Create the container if it does not exist.
        // Create a blob container using the createIfNotExist method that
        // checks whether a container exists with the same name. The method
        // creates the blob container only if a container with the same name
        // does not exist. Otherwise, no operation is performed. 
        blobContainer.createIfNotExists();

        // Create a permissions object.
        containerPermissions = new BlobContainerPermissions();

        // Include public access in the permissions object.
        containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);

        // Set the permissions on the container.
        blobContainer.uploadPermissions(containerPermissions);

    } catch (InvalidKeyException e1) {
        e1.printStackTrace();
    } catch (Exception e1) {
        e1.printStackTrace();
    }             
}

public void onCreateOptionsMenu(Menu menu,MenuInflater inflater) {
        super.onCreateOptionsMenu(menu, inflater);

    // Inflate the menu items for use in the action bar

    inflater.inflate(R.menu.mymenu, menu);

    // Here we get the action view we defined
    myActionMenuItem = menu.findItem(R.id.my_action);
    View actionView = myActionMenuItem.getActionView();

    // We then get the button view that is part of the action view
    if(actionView != null) {
        myActionButton = (Button) actionView.findViewById(R.id.action_btn);
        myActionButton.setText(R.string.txt_submit);
        if(myActionButton != null) {
            // We set a listener that will be called when the return/enter key is pressed
            myActionButton.setOnClickListener(new OnClickListener() {                       

                @Override
                public void onClick(View v) {

                    myActionButton.setEnabled(false);

                    myActionButton.setText("Submitting..");                         

                    try {

                        for(int i = 0; i <selected.size();i++){
                            String filePath = selected.get(i);

                            File source = new File(filePath);
                            String absoluteFilePath = source.getAbsolutePath();
                            Log.d("personal", absoluteFilePath);
                            CloudBlockBlob blob = blobContainer.getBlockBlobReference(source.getName());
                            Log.d("personal", source.getName());
                            //Log.d("personal", imageName.get(i));
                            blob.upload(new FileInputStream(absoluteFilePath), source.length());
                            //blob.uploadFromFile(filePath);
                            Log.d("personal", "Image Uploaded");
                        }

                    } catch (URISyntaxException e) {
                        e.printStackTrace();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }

                }
            });

            }
        }
    }

StackTrace:

10-09 15:50:27.168: W/System.err(1451): java.lang.NullPointerException
10-09 15:50:27.168: W/System.err(1451):     at libcore.net.http.HttpEngine.readResponse(HttpEngine.java:784)
10-09 15:50:27.168: W/System.err(1451):     at libcore.net.http.HttpURLConnectionImpl.getResponse(HttpURLConnectionImpl.java:274)
10-09 15:50:27.168: W/System.err(1451):     at libcore.net.http.HttpURLConnectionImpl.getResponseCode(HttpURLConnectionImpl.java:479)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:145)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:252)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.blob.CloudBlockBlob.commitBlockList(CloudBlockBlob.java:242)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.blob.BlobOutputStream.commit(BlobOutputStream.java:321)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.blob.BlobOutputStream.close(BlobOutputStream.java:285)
10-09 15:50:27.178: W/System.err(1451):     at com.microsoft.azure.storage.blob.CloudBlockBlob.upload(CloudBlockBlob.java:582)
10-09 15:50:27.199: W/System.err(1451):     at com.microsoft.azure.storage.blob.CloudBlockBlob.upload(CloudBlockBlob.java:499)
10-09 15:50:27.199: W/System.err(1451):     at com.jbandroid.fragment.PersonalInfoFragment$2.onCompleted(PersonalInfoFragment.java:273)
10-09 15:50:27.199: W/System.err(1451):     at com.jbandroid.fragment.PersonalInfoFragment$2.onCompleted(PersonalInfoFragment.java:1)
10-09 15:50:27.199: W/System.err(1451):     at com.microsoft.windowsazure.mobileservices.MobileServiceTable$ParseResultOperationCallback.onCompleted(MobileServiceTable.java:103)
10-09 15:50:27.199: W/System.err(1451):     at com.microsoft.windowsazure.mobileservices.MobileServiceJsonTable$2.onCompleted(MobileServiceJsonTable.java:249)
10-09 15:50:27.199: W/System.err(1451):     at com.microsoft.windowsazure.mobileservices.MobileServiceJsonTable$4.onPostExecute(MobileServiceJsonTable.java:389)
10-09 15:50:27.199: W/System.err(1451):     at com.microsoft.windowsazure.mobileservices.MobileServiceJsonTable$4.onPostExecute(MobileServiceJsonTable.java:1)
10-09 15:50:27.199: W/System.err(1451):     at android.os.AsyncTask.finish(AsyncTask.java:602)
10-09 15:50:27.199: W/System.err(1451):     at android.os.AsyncTask.access$600(AsyncTask.java:156)
10-09 15:50:27.199: W/System.err(1451):     at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:615)
10-09 15:50:27.199: W/System.err(1451):     at android.os.Handler.dispatchMessage(Handler.java:99)
10-09 15:50:27.208: W/System.err(1451):     at android.os.Looper.loop(Looper.java:137)
10-09 15:50:27.208: W/System.err(1451):     at android.app.ActivityThread.main(ActivityThread.java:4340)
10-09 15:50:27.208: W/System.err(1451):     at java.lang.reflect.Method.invokeNative(Native Method)
10-09 15:50:27.208: W/System.err(1451):     at java.lang.reflect.Method.invoke(Method.java:511)
10-09 15:50:27.208: W/System.err(1451):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
10-09 15:50:27.208: W/System.err(1451):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
10-09 15:50:27.208: W/System.err(1451):     at dalvik.system.NativeStart.main(Native Method)

Answer:

This may be because Android does not allow network connections on the main thread. The storage library currently provides the incorrect error message in this case - we're working on fixing this. Take a look at this other Stack Overflow post for more info.

If this is not the issue, if you could provide the version of the Android library you're working with and the version of Android you're running on that would be helpful in reproducing the problem.

Question:

How to change the blob access policy?

At the moment I am able to create some test blobs using the azure quick start. This works great but the blobs have a public access level of private by default in my case. I want to set public access level from Private (no anonymous access) to Blob (anonymous read access for blobs only).

I did some trial and error and found that setAccessPolicy could help me out. I want to implement this but I don't fully understand the .setPermissions("permissionString") part. Do I need to change it to something like "anonymous read access for blobs only", or am I missing something?

My trial-and-error code:

public static void main( String[] args ) throws IOException
    {
        BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectionString).buildClient();
        String containerName = "testblobs"+ UUID.randomUUID();

        BlobSignedIdentifier identifier = new BlobSignedIdentifier()
                .setId("name")
                .setAccessPolicy(new BlobAccessPolicy()
                .setStartsOn(OffsetDateTime.now())
                        .setExpiresOn(OffsetDateTime.now().plusDays(1))
                .setPermissions("permissionString")); //what should I put here?

        BlobContainerClient containerClient = blobServiceClient.createBlobContainer(containerName);

        try {
            containerClient.setAccessPolicy(PublicAccessType.CONTAINER, Collections.singletonList(identifier));
            System.out.println("Set Access Policy to 'Public read access for blobs only'.");
        } catch (UnsupportedOperationException err) {
            System.out.printf("Set Access Policy failed because: %s\n", err);
        }

        String localPath = "pathtofile";
        String fileName = "myfile.bpmn";
        File localFile = new File(localPath+fileName);

        BlobClient blobClient = containerClient.getBlobClient(fileName);

        System.out.println("\nUploading to Blob storage as blob:\n\t" + blobClient.getBlobUrl());

        blobClient.uploadFromFile(localPath + fileName);

        System.out.println("\nListing da blobs...");

        for (BlobItem blobItem : containerClient.listBlobs()) {
            System.out.println();
            System.out.println("\t" + blobItem.getName());
        }
    }

When I run this piece of code it throws the following exception, and I think it's because .setPermissions("permissionString") is not set correctly.

Exception stacktrace:

Exception in thread "main" com.azure.storage.blob.models.BlobStorageException: Status code 400, "InvalidXmlDocumentXML specified is not syntactically valid. RequestId:74403433-a01e-0086-17c1-1727cd000000 Time:2020-04-21T09:43:11.5943935Z00" at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at com.azure.core.http.rest.RestProxy.instantiateUnexpectedException(RestProxy.java:357) at com.azure.core.http.rest.RestProxy.lambda$ensureExpectedStatus$3(RestProxy.java:398) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:118) at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1705) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.signalCached(MonoCacheTime.java:320) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onNext(MonoCacheTime.java:337) at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2267) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onSubscribe(MonoCacheTime.java:276) at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:191) at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53) at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:48) at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) at reactor.core.publisher.MonoCacheTime.subscribeOrReturn(MonoCacheTime.java:132) at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:48) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:150) at 
reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:123) at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:112) at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:213) at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:123) at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:178) at reactor.core.publisher.FluxContextStart$ContextStartSubscriber.onNext(FluxContextStart.java:103) at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1705) at reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:121) at reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:252) at reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:136) at reactor.netty.channel.FluxReceive.terminateReceiver(FluxReceive.java:419) at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:209) at reactor.netty.channel.FluxReceive.onInboundComplete(FluxReceive.java:367) at reactor.netty.channel.ChannelOperations.onInboundComplete(ChannelOperations.java:363) at reactor.netty.channel.ChannelOperations.terminate(ChannelOperations.java:412) at reactor.netty.http.client.HttpClientOperations.onInboundNext(HttpClientOperations.java:585) at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:90) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:355) at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436) at 
io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295) at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:355) at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1470) at io.netty.handler.ssl.SslHandler.decodeNonJdkCompatible(SslHandler.java:1231) at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1268) at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:498) at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:437) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:355) at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) at 
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) at java.base/java.lang.Thread.run(Thread.java:834) Suppressed: java.lang.Exception: #block terminated with an error at reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:99) at reactor.core.publisher.Mono.block(Mono.java:1664) at com.azure.storage.common.implementation.StorageImplUtils.blockWithOptionalTimeout(StorageImplUtils.java:99) at com.azure.storage.blob.BlobContainerClient.setAccessPolicyWithResponse(BlobContainerClient.java:416) at com.azure.storage.blob.BlobContainerClient.setAccessPolicy(BlobContainerClient.java:386) at Main.main(Main.java:33)

Any help on explaining the .setPermissions("permissionString") to me with some code examples how to set public access for blobs would be great.


Answer:

Permissions that are applicable for a blob container are defined here.

Depending on the permissions you wish to include in your access policy, you can choose from one or more of the following permissions: Read (r), Add (a), Create (c), Write (w), Delete (d) and List (l).

Please note that the ordering of these permissions is important. They must appear in the following order: racwdl.

Also note that specifying permissions in a shared access policy is optional, so you can leave the permission string empty as well.
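As a sketch of that ordering rule, here is a small hypothetical helper (not part of the Azure SDK) that reorders whatever permissions you pick into the required racwdl order, so the result can be passed to setPermissions:

```java
import java.util.LinkedHashSet;

public class PermissionOrder {
    // Azure shared access policy permissions must appear in this fixed order.
    private static final String CANONICAL = "racwdl";

    // Reorders a permission string such as "wr" into the canonical "rw"
    // form, rejecting any character that is not a known permission.
    static String canonicalize(String perms) {
        LinkedHashSet<Character> wanted = new LinkedHashSet<>();
        for (char c : perms.toCharArray()) {
            if (CANONICAL.indexOf(c) < 0) {
                throw new IllegalArgumentException("Unknown permission: " + c);
            }
            wanted.add(c);
        }
        StringBuilder sb = new StringBuilder();
        for (char c : CANONICAL.toCharArray()) {
            if (wanted.contains(c)) sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // "wr" must be sent as "rw" for the service to accept the policy.
        System.out.println(canonicalize("wr")); // prints "rw"
    }
}
```

In the question's code, `.setPermissions(canonicalize("wr"))` would then send "rw"; for a read-only policy, plain `.setPermissions("r")` is already in canonical order.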

Question:

Can someone help with this please? I am following the Java SDK samples; there are lots of examples of how to manage containers and blobs, however, nothing saying how to move a blob from one storage container to another.

E.g. I have a blob at StorageOne/ContainerOne/BlobName that needs to be moved to Storage2/ContainerTwo/BlobName.

I have been looking at this site https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/storage/azure-storage-blob/README.md, however no luck.

I have managed to connect via the connection string and create and download blobs fine, however I can't figure out how to move one.

Any suggestion would be helpful. I have also tried to create a Function App in Azure to do it, but my PowerShell skills are not good.

Thank you


Answer:

If you want to copy a blob from one storage container to another, you could use the beginCopy method: first get the source blob URL with the getBlobUrl method, then pass it in.

If you want a sample you could refer to this GitHub sample: BlobAsyncClientBaseJavaDocCodeSnippets.

And if you want to move a blob, i.e. copy it to another container so that it no longer exists in the source container, there is for now no direct method. You could copy the blob first and, after the copy completes, delete the source blob with the delete method.

Actually, from all these method links you can see that they all provide GitHub samples; just follow the project structure.

Update: if you want a sample, you could refer to my code below. I have tested it and it works.

        String connectStr = "storage account connection string";

        // Create a BlobServiceClient object which will be used to create a container client
        BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectStr).buildClient();

        BlobContainerClient containerClient = blobServiceClient.getBlobContainerClient("test");

        BlobContainerClient destcontainer=blobServiceClient.getBlobContainerClient("testcontainer");

        PagedIterable<BlobItem> blobs= containerClient.listBlobs();
        for (BlobItem blobItem : blobs) {

            System.out.println("This is the blob name: " + blobItem.getName());
            BlobClient blobClient = containerClient.getBlobClient(blobItem.getName());
            BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(OffsetDateTime.now().plusHours(1),
                    BlobContainerSasPermission.parse("r"));
            String sasToken = blobClient.generateSas(sas);

            BlobClient destblobclient = destcontainer.getBlobClient(blobItem.getName());
            destblobclient.beginCopy(blobClient.getBlobUrl() + "?" + sasToken, null);

        }

Update:

        String connectStr = "source storage account connection string";

        String destconnectStr="destination storage account connection string";



        // Create a BlobServiceClient object which will be used to create a container client
        BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectStr).buildClient();

        BlobServiceClient destblobServiceClient = new BlobServiceClientBuilder().connectionString(destconnectStr).buildClient();

        BlobContainerClient containerClient = blobServiceClient.getBlobContainerClient("test");

        BlobContainerClient destcontainer=destblobServiceClient.getBlobContainerClient("destcontainer");

        PagedIterable<BlobItem> blobs= containerClient.listBlobs();
        for (BlobItem blobItem : blobs) {

            System.out.println("This is the blob name: " + blobItem.getName());
            BlobClient blobClient=containerClient.getBlobClient(blobItem.getName());
            BlobClient destblobclient=destcontainer.getBlobClient(blobItem.getName());
            destblobclient.beginCopy(blobClient.getBlobUrl(),null);

        }
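The delete step of the "move" is not shown in the samples above. The sketch below fills that gap: the SDK calls are kept in comments because they need real credentials, and copySource is a hypothetical helper that appends the SAS token to the source URL the way the first sample does.

```java
public class MoveBlob {
    // Appends a SAS token to a blob URL so it can be used as a beginCopy
    // source; leaves the URL unchanged when no token is supplied.
    static String copySource(String blobUrl, String sasToken) {
        if (sasToken == null || sasToken.isEmpty()) {
            return blobUrl;
        }
        return blobUrl + (blobUrl.contains("?") ? "&" : "?") + sasToken;
    }

    public static void main(String[] args) {
        // Sketch of the move (copy-then-delete) flow with the v12 SDK:
        //
        //   SyncPoller<BlobCopyInfo, Void> poller =
        //           destblobclient.beginCopy(copySource(blobClient.getBlobUrl(), sasToken), null);
        //   poller.waitForCompletion();   // don't delete before the copy finishes
        //   blobClient.delete();          // removing the source completes the "move"

        System.out.println(copySource("https://acct.blob.core.windows.net/c/b", "sv=2019"));
    }
}
```

Waiting for the poller before deleting matters: beginCopy only starts a server-side copy, so deleting the source immediately can abort an in-flight copy.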

Question:

I have uploaded files to an Azure blob container, but how can I get the URL of an uploaded file using Java? I have the connection string from the Azure portal.

Can anyone help me?


Answer:

Here I use Azure Blob Storage SDK v12; you could refer to the code below. For more information about this SDK, check the source code: Azure Storage Blob client library for Java.

        String connectStr = "storage account connection";

        // Create a BlobServiceClient object which will be used to create a container client
        BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectStr).buildClient();

        BlobContainerClient containerClient = blobServiceClient.getBlobContainerClient("container name");

        BlobClient blobClient=containerClient.getBlobClient("blob name");

        System.out.println(blobClient.getBlobUrl());

Question:

I am not able to run my Azure function (written in Java) locally; it should be triggered by a BlobTrigger.

I am facing the following error:

 A host error has occurred
 Microsoft.WindowsAzure.Storage: No connection could be made because the target machine actively refused it. System.Net.Http: No connection could be made because the target machine actively refused it. System.Private.CoreLib: No connection could be made because the target machine actively refused it.

Here is my code:

public class Function {

    @FunctionName("BlobTrigger")
    @StorageAccount("reseaudiag")
    public void blobTrigger(
            @BlobTrigger(name = "content", path = "filer/{fileName}", dataType = "binary",
            connection = "AzureWebJobsDashboard"
                    ) byte[] content,
            @BindingName("fileName") String fileName,
            final ExecutionContext context
            ) {
        context.getLogger().info("Java Blob trigger function processed a blob.\n Name: " + fileName + "\n Size: " + content.length + " Bytes");

    }
}

Only once this initial run works can I start implementing the logic, but I am blocked on running the basic step itself.

Here is my local.settings.json

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=XXX;AccountKey=confidential;EndpointSuffix=core.windows.net",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "DataConnectionString": "UseDevelopmentStorage=true",
    "ContainerName": "filer"
  },
  "ConnectionStrings": {
    "PlantaoEntities": {
      "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=confidential;EndpointSuffix=core.windows.net",
      "ProviderName": "System.Data.EntityClient"
    }
  }
}

Thank you.


Answer:

Your code is almost correct. You need to specify a correct connection in your blob trigger.

Here is my successful sample:

package com.function;

import com.microsoft.azure.functions.annotation.*;
import com.microsoft.azure.functions.*;

public class Function {

    @FunctionName("BlobTrigger")
    public void run(
            @BlobTrigger(name = "trigger", path = "test/{fileName}", dataType = "binary", connection = "AzureWebJobsStorage") byte[] content,
            @BindingName("fileName") String fileName,
            final ExecutionContext context) {
                context.getLogger().info("Blob: " + fileName + " -> Length: " + content.length);
    }
}

I use the "AzureWebJobsStorage" connection in my code, so I need to set a connection string in the local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=storagetest789;AccountKey=*******w==;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "java"
  }
}

Then, after running the function locally and uploading a file to the storage account, I get output like the following:


Note:

  1. When you publish your app to Azure Function App, the settings in your local setting file will not be updated to the cloud. You need to manually update them.

  2. Please ensure that you have made your storage account accessible. If you enable firewall for your storage account, you need to add your client IP for local test, and allow trusted Microsoft Services to access your storage.


Update

  1. If you develop with VS Code, you can upload local settings from the command palette: F1 -> "Upload Local Settings"

  2. You can also set application settings from the Azure Portal

And then you can modify the settings there.

Question:

I am using Java SDK for connection to Azure Blob Storage:

@Bean
@SneakyThrows
public CloudBlobContainer sourceContainer(CloudStorageAccount cloudStorageAccount) {
    return cloudStorageAccount
            .createCloudBlobClient()
            .getContainerReference(sourceContainerName);
}

During the download process, I list the blobs with listBlobs and take the necessary CloudBlockBlob.

It exists in the list of blobs. Then I try to download it:

blob.downloadToFile(path);
blob.delete();

And it fails with error:

Method threw 'com.microsoft.azure.storage.StorageException' exception.
The specified blob does not exist.

The interesting fact is that when I rename the blob to remove the French accented letters it works as expected. But I can't resolve it from the server side: I can't copy to a blob with a filename without French accented letters, since every operation on the CloudBlockBlob fails with a 404 HTTP code.


Answer:

I tested with azure-storage 5.0.0 and it could download a file named associé.txt. Maybe you could try my code, or provide more information so that I can reproduce the issue.

    final String storageConnectionString ="connectionstring";

    CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);

    CloudBlobClient serviceClient = account.createCloudBlobClient();

    CloudBlobContainer container = serviceClient.getContainerReference("test");
    container.createIfNotExists();

    File file = new File("E:\\Test");
    for(ListBlobItem item : container.listBlobs()){
        CloudBlockBlob cloudBlob = (CloudBlockBlob) item;
        File f = new File(file.getAbsolutePath() + "\\" +cloudBlob.getName() );
        cloudBlob.downloadToFile(f.toString());
        System.out.println(cloudBlob.getName()+" success download");
    }

Question:

The problem I am facing in the title is very similar to this question previously raised here (Azure storage: Uploaded files with size zero bytes), but that one was for .NET. The context for my Java scenario is that I am uploading small CSV files on a daily basis (less than about 5 KB per file). In addition, my code uses the latest version of the Azure API, in contrast to the 2010 version used in the other question.

I couldn't figure out what I have missed. An alternative would be to do it in File Storage, but of course the blob approach was recommended by a few of my peers.

So far, I have mostly based my code for uploading a file as a block blob on the sample shown on the Azure Samples GitHub page (https://github.com/Azure-Samples/storage-blob-java-getting-started/blob/master/src/BlobBasics.java). I have already done the container setup and file renaming steps, which aren't a problem, but after uploading, the size of the file in the blob storage container on my Azure domain shows 0 bytes.

I've also tried converting the file into a FileInputStream and uploading it as a stream, but it produces the same result.

fileName=event.getFilename(); //fileName is e.g eod1234.csv
String tempdir = System.getProperty("java.io.tmpdir");
file= new File(tempdir+File.separator+fileName); //
try {
    PipedOutputStream pos = new PipedOutputStream();
    stream= new PipedInputStream(pos);
    buffer = new byte[stream.available()];
    stream.read(buffer);
    FileInputStream fils = new FileInputStream(file);
    int content = 0;
    while((content = fils.read()) != -1){
        System.out.println((char)content);
    }
    //Outputstream was written as a test previously but didn't work
    OutputStream outStream = new FileOutputStream(file);
    outStream.write(buffer);
    outStream.close();

    // container name is "testing1"            
    CloudBlockBlob blob = container.getBlockBlobReference(fileName);
    if(fileName.length() > 0){
       blob.upload(fils,file.length()); //this is testing with fileInputStream
       blob.uploadFromFile(fileName); //preferred, just upload from file
    }
}            

No error messages are shown; the file reaches blob storage but shows a size of 0 bytes. It's a one-way process that only uploads CSV-format files. In the blob container, each of those uploaded files should show a size of 1-5 KB.


Answer:

Instead of blob.uploadFromFile(fileName); you should use blob.uploadFromFile(file.getAbsolutePath());, because the uploadFromFile method requires an absolute path. And you don't need the blob.upload(fils, file.length()); call.

Refer to Microsoft Docs: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-java#upload-blobs-to-the-container
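The 0-byte result also follows directly from the stream handling in the question: the debug while-loop reads fils to EOF, so a later blob.upload(fils, file.length()) has nothing left to read. The pitfall, sketched in pure Java with a small in-memory stream standing in for the file:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class StreamExhaustion {
    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream("a,b\n1,2\n".getBytes("UTF-8"));

        // First pass: read the stream to the end (what the debug while-loop does).
        int count = 0;
        while (in.read() != -1) count++;
        System.out.println(count);      // 8 -- the whole content was consumed

        // Second pass: anything that now reads from the same stream gets nothing,
        // which is how an upload from an already-consumed stream writes 0 bytes.
        System.out.println(in.read());  // -1
    }
}
```

Either remove the debug loop or open a fresh FileInputStream immediately before uploading.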

Question:

Getting the below error while making a call to Create Container.

Response Code : 403 Response Message : Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

String stringToSign = "PUT\n\n\n\n\n\n\n\n\n\n\n\nx-ms-date:" + date + "\nx-ms-version:" + "2018-03-28\nx-ms-lease-action:acquire\nx-ms-lease-duration:1\nx-ms-proposed-lease-id:1f812371-a41d-49e6-b123-f4b542e851c5\n" + "/" + storageAccount + "/"+ "container-lease-test"+"\ncomp:lease";

Java code snippet

HttpURLConnection connection = (HttpURLConnection)new URL(url).openConnection();
connection.setRequestMethod(vMethod);
connection.addRequestProperty("Authorization", authHeader);
connection.addRequestProperty("x-ms-date", date);
connection.addRequestProperty("x-ms-version", "2018-03-28");
connection.setDoOutput(true);
connection.setFixedLengthStreamingMode(0);

//Create Lease
connection.addRequestProperty("x-ms-lease-action", "acquire");
connection.addRequestProperty("x-ms-lease-duration","1");
connection.addRequestProperty("x-ms-proposed-lease-id","1f812371-a41d-49e6-b123-f4b542e851c5");

Answer:

You need to sort the x-ms-* headers lexicographically by header name, in ascending order. You also missed restype:container at the end of the canonicalized resource.

String stringToSign = "PUT\n\n\n\n\n\n\n\n\n\n\n\nx-ms-date:" + date + "\nx-ms-lease-action:acquire\nx-ms-lease-duration:15\nx-ms-proposed-lease-id:1f812371-a41d-49e6-b123-f4b542e851c5\nx-ms-version:2018-03-28\n/" + storageAccount + "/container-lease-test\ncomp:lease\nrestype:container";

Also, x-ms-lease-duration must be between 15 and 60 seconds, or -1 (infinite).

I recommend following the docs and using Fiddler to capture the traffic: when you get a 403, the error response includes the string-to-sign the service expected, which makes debugging much quicker.
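Given a correctly ordered string-to-sign, the Authorization header value is "SharedKey &lt;account&gt;:&lt;sig&gt;", where &lt;sig&gt; is an HMAC-SHA256 of the string keyed with the Base64-decoded account key. A hedged sketch of just the signing step (the key and string-to-sign below are dummies, not real credentials):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class SharedKeySign {
    // HMAC-SHA256 over the string-to-sign, keyed with the Base64-decoded
    // account key, then Base64-encoded back -- the core of Shared Key auth.
    static String sign(String stringToSign, String base64AccountKey) throws Exception {
        Mac hmac = Mac.getInstance("HmacSHA256");
        hmac.init(new SecretKeySpec(Base64.getDecoder().decode(base64AccountKey), "HmacSHA256"));
        byte[] digest = hmac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(digest);
    }

    public static void main(String[] args) throws Exception {
        // Dummy key, Base64-encoded the way a storage account key would be.
        String key = Base64.getEncoder().encodeToString("dummy-key".getBytes(StandardCharsets.UTF_8));
        String sig = sign("PUT\n...\ncomp:lease\nrestype:container", key);
        // The signature is deterministic for a given key and string-to-sign.
        System.out.println(sig.equals(sign("PUT\n...\ncomp:lease\nrestype:container", key)));
        System.out.println(sig.length() > 0);
    }
}
```

If this value differs from what Fiddler shows the service computed, the mismatch is in the string-to-sign, not the HMAC step.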

Question:

I have an http triggered azure function (written in Java) in which I want to access Blob storage. The code compiles under maven, but when I run it locally and send a post from CURL, the runtime crashes due to a ClassNotFound exception caused by missing com.microsoft.azure.storage.CloudStorageAccount. azure-storage (version 6.0.0) is listed as a dependency in the POM file. Where should the related .jar files be so that they are seen by the function?

Any insights regarding Java azure functions would be appreciated.


Answer:

For your scenario, I suggest following this official tutorial to create, run, and deploy your Java Azure Function.

Function Class:

package com.fabrikam.functions;

import com.microsoft.azure.serverless.functions.annotation.*;
import com.microsoft.azure.serverless.functions.ExecutionContext;

import com.microsoft.azure.storage.*;
import com.microsoft.azure.storage.blob.*;

/**
 * Hello function with HTTP Trigger.
 */
public class Function {

    // Configure the connection-string with your values
    public static final String storageConnectionString =
            "DefaultEndpointsProtocol=http;" +
                    "AccountName=***;" +
                    "AccountKey=***";

    @FunctionName("hello")
    public String hello(@HttpTrigger(name = "req", methods = {"get", "post"}, authLevel = AuthorizationLevel.ANONYMOUS) String req,
                        ExecutionContext context) {

        try {
            // Retrieve storage account from connection-string.
            CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

            // Create the blob client.
            CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

            // Get a reference to a container.
            // The container name must be lower case
            CloudBlobContainer container = blobClient.getContainerReference(req);

            // Create the container if it does not exist.
            container.createIfNotExists();

            return String.format("Hello, I get container name : %s!", container.getName());

        } catch (Exception e) {
            // Output the stack trace.
            e.printStackTrace();
            return "Access Error!";
        }
    }
}

Pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.fabrikam.functions</groupId>
    <artifactId>fabrikam-functions</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>Azure Java Functions</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <functionAppName>fabrikam-functions-20171017112209094</functionAppName>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.microsoft.azure</groupId>
            <artifactId>azure-functions-java-core</artifactId>
            <version>1.0.0-beta-1</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.microsoft.azure/azure-storage -->
        <dependency>
            <groupId>com.microsoft.azure</groupId>
            <artifactId>azure-storage</artifactId>
            <version>6.0.0</version>
        </dependency>

        <!-- Test -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <artifactId>maven-resources-plugin</artifactId>
                    <version>3.0.2</version>
                </plugin>
                <plugin>
                    <groupId>com.microsoft.azure</groupId>
                    <artifactId>azure-functions-maven-plugin</artifactId>
                    <version>0.1.4</version>
                </plugin>
            </plugins>
        </pluginManagement>

        <plugins>
            <plugin>
                <groupId>com.microsoft.azure</groupId>
                <artifactId>azure-functions-maven-plugin</artifactId>
                <configuration>
                    <resourceGroup>java-functions-group</resourceGroup>
                    <appName>${functionAppName}</appName>
                    <region>westus2</region>
                    <appSettings>
                        <property>
                            <name>FUNCTIONS_EXTENSION_VERSION</name>
                            <value>beta</value>
                        </property>
                    </appSettings>
                </configuration>
                <executions>
                    <execution>
                        <id>package-functions</id>
                        <goals>
                            <goal>package</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-resources-plugin</artifactId>
                <executions>
                    <execution>
                        <id>copy-resources</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-resources</goal>
                        </goals>
                        <configuration>
                            <overwrite>true</overwrite>
                            <outputDirectory>${project.build.directory}/azure-functions/${functionAppName}
                            </outputDirectory>
                            <resources>
                                <resource>
                                    <directory>${project.basedir}</directory>
                                    <includes>
                                        <include>host.json</include>
                                        <include>local.settings.json</include>
                                    </includes>
                                </resource>
                            </resources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>

    </build>

</project>

Then run mvn clean package to package your Maven project into a jar.

Run mvn azure-functions:run to run your Azure Function locally.


Update Answer:

I ran my Azure Function and reproduced the same exception you described.

java.lang.NoClassDefFoundError: com/microsoft/azure/storage/CloudStorageAccount

Exception:
Stack: java.lang.reflect.InvocationTargetException
[10/25/2017 2:48:44 AM]         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[10/25/2017 2:48:44 AM]         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[10/25/2017 2:48:44 AM]         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[10/25/2017 2:48:44 AM]         at java.lang.reflect.Method.invoke(Method.java:498)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.broker.JavaMethodInvokeInfo.invoke(JavaMethodInvokeInfo.java:19)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.broker.JavaMethodExecutor.execute(JavaMethodExecutor.java:34)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.broker.JavaFunctionBroker.invokeMethod(JavaFunctionBroker.java:40)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:33)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:10)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.handler.MessageHandler.handle(MessageHandler.java:41)
[10/25/2017 2:48:44 AM]         at com.microsoft.azure.webjobs.script.JavaWorkerClient$StreamingMessagePeer.lambda$onNext$0(JavaWorkerClient.java:84)
[10/25/2017 2:48:44 AM]         at java.util.concurrent.ForkJoinTask$AdaptedRunnableAction.exec(ForkJoinTask.java:1386)
[10/25/2017 2:48:44 AM]         at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
[10/25/2017 2:48:44 AM]         at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
[10/25/2017 2:48:44 AM]         at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
[10/25/2017 2:48:44 AM]         at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
[10/25/2017 2:48:44 AM] Caused by: java.lang.NoClassDefFoundError: com/microsoft/azure/storage/CloudStorageAccount
[10/25/2017 2:48:44 AM]         at com.fabrikam.functions.Function.hello(Function.java:26)
[10/25/2017 2:48:44 AM]         ... 16 more
[10/25/2017 2:48:44 AM] Caused by: java.lang.ClassNotFoundException: com.microsoft.azure.storage.CloudStorageAccount
[10/25/2017 2:48:44 AM]         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
[10/25/2017 2:48:44 AM]         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[10/25/2017 2:48:44 AM]         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[10/25/2017 2:48:44 AM]         ... 17 more
[10/25/2017 2:48:44 AM] .
[10/25/2017 2:48:44 AM]   Function had errors. See Azure WebJobs SDK dashboard for details. Instance ID is '3450abda-99a0-4d75-add2-a7bc48a0cb51'
[10/25/2017 2:48:44 AM] System.Private.CoreLib: Exception while executing function: Functions.hello. System.Private.CoreLib: Result:

After some research, I found that the jar had been packaged without its dependent jars.

So I added the following configuration snippet to my pom.xml:

<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>Your main class path</mainClass>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Then run mvn clean package and you will see two jar files generated.

One does not contain the dependent jars; the second (the jar-with-dependencies one) bundles them.

Move the fabrikam-functions-1.0-SNAPSHOT-jar-with-dependencies jar into the path ${project.basedir}/target/azure-functions/${function-app-name}/.

For me, that is E:\TestAzureFunction\fabrikam-functions\target\azure-functions\fabrikam-functions-20171017112209094.

Don't forget to rename the jar to fabrikam-functions-1.0-SNAPSHOT.jar.

Finally, I ran the Azure Function successfully and got the output via the URL http://localhost:7071/api/hello.

In addition, you can refer to this GitHub doc for more configuration details about the azure-functions-maven-plugin.

Hope it helps you.

Question:

So the existing code base where I work uses a regular Java File("a/directory/path") object for a massive amount of logic. Now my team wants me to use a file stored in the Azure Blob instead. I can get the file from the blob using the CloudBlobItem() java api. But this object is different than a regular java File() object. And I would have to change a bunch of stuff in the logic. Is there any blob item which can be casted to a regular File() object?


Answer:

Short answer: No.

You're comparing two completely different things. Azure blobs are not files. You'd need to stream them down to where your code is running. Maybe to a file stream. Maybe write to disk. And then work with the file. You cannot just use an Azure blob like any other file I/O.

Note: If you're using Azure File Storage (which is an SMB share), then you can treat everything in that file store like local storage. But it sounds like you're just using normal block blobs for your storage.
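If changing the existing File-based logic is not an option, the usual pattern is to download the blob to a temporary local file first and hand that File to the existing code. A minimal pure-Java sketch (in the real code the InputStream would come from something like the storage SDK's CloudBlockBlob.openInputStream(); here it is simulated with an in-memory stream):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class BlobToFile {
    // Copy any InputStream (e.g. a blob download stream) to a local temp file
    // and return it as a java.io.File for the existing File-based logic.
    static File materialize(InputStream in, String suffix) throws IOException {
        Path tmp = Files.createTempFile("blob-", suffix);
        Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        return tmp.toFile();
    }

    public static void main(String[] args) throws IOException {
        // Simulated blob content; in real use this would be the blob's stream.
        InputStream fake = new ByteArrayInputStream("col1,col2\n1,2\n".getBytes("UTF-8"));
        File f = materialize(fake, ".csv");
        System.out.println(f.length()); // 14 bytes, same as the source content
        f.delete();
    }
}
```

The existing logic then works unchanged on the returned File; just remember to delete the temp file when done.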

Question:

We have application wherein many (~1000) consumers try to fetch files from blob storage. There is no concurrent access on blob files, but they share single storage account. I see files available on the blob storage, but we are constantly seeing below exception

Caused by: com.microsoft.azure.storage.StorageException: The specified blob does not exist.
at com.microsoft.azure.storage.StorageException.translateFromHttpStatus(StorageException.java:207)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]
at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:172)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]
at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:306)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:177)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]
at com.microsoft.azure.storage.blob.CloudBlob.downloadAttributes(CloudBlob.java:1268)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]
at com.microsoft.azure.storage.blob.CloudBlob.downloadAttributes(CloudBlob.java:1235)[3:org.ops4j.pax.logging.pax-logging-service:1.6.9]

We are using

Azure-storage-api 1.1.0

Is this a known bug or limitation? What are the scenarios in which we will get this exception?

We download blobs using following code

String storageConnectionString = "DefaultEndpointsProtocol=http;AccountName="+ storageAccount + ";AccountKey=" + primaryAccessKey;
CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
CloudBlobClient blobClient = account.createCloudBlobClient();
CloudBlobContainer container = blobClient.getContainerReference(containerName.toLowerCase());
CloudBlockBlob blockBlob = container.getBlockBlobReference(fileName);
blockBlob.downloadAttributes();
//http://stackoverflow.com/questions/1071858/java-creating-byte-array-whose-size-is-represented-by-a-long
int size = (int)blockBlob.getProperties().getLength();
out = new byte[size];
blockBlob.downloadToByteArray(out, 0);

Answer:

What is constantly? Is it always, or is it when more than X consumers are trying to fetch the blob?

On the Scalability Targets for Azure Storage page you can learn more about the scalability targets. One of them is the target throughput for a single blob:

Target throughput for single blob Up to 60 MB per second, or up to 500 requests per second

With your 1000 consumers, there is no doubt you hit that limit when they query the same blob. The question is: do you really need to fetch from the blob so intensively? Can you cache somewhere (an intermediate facade), or can you use a CDN (it also works with SASs)?

If the 1000 consumers are hitting 1000 different blobs, there are other limits, such as:

Total Request Rate (assuming 1KB object size) per storage account Up to 20,000 IOPS, entities per second, or messages per second

Which, for 1000 consumers, works out to about 20 requests per second each; depending on the number of blocks in your files, you may well be hitting that limit too.

Either way, you should profile your application and discover which limit you are hitting.
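One way to take pressure off a hot blob, as suggested above, is a small in-memory cache in front of the download call. A hedged sketch (the loader lambda below stands in for the real downloadToByteArray call; eviction and freshness policies are omitted):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class BlobCache {
    private final Map<String, byte[]> cache = new ConcurrentHashMap<>();
    private final Function<String, byte[]> loader; // e.g. a real blob download

    BlobCache(Function<String, byte[]> loader) { this.loader = loader; }

    byte[] get(String blobName) {
        // computeIfAbsent ensures each blob is fetched at most once per process
        return cache.computeIfAbsent(blobName, loader);
    }

    public static void main(String[] args) {
        final int[] downloads = {0};
        BlobCache cache = new BlobCache(name -> {
            downloads[0]++;                 // stands in for downloadToByteArray
            return ("content of " + name).getBytes();
        });
        cache.get("eod1234.csv");
        cache.get("eod1234.csv");           // served from cache, no second fetch
        System.out.println(downloads[0]);   // 1
    }
}
```

With 1000 consumers sharing one process (or one facade service), this alone can turn 1000 requests per blob into one.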

Question:

I took the Library source code directly from https://github.com/azure/azure-storage-java for android

Added the code to my existing library project. Everything seems fine except that it gives errors in files like OperationContext.java asking me to import org.slf4j.Logger.

From the documentation it is clear that the library is optional. Why then should I use it? Is there any way to build this library without the SLF4J library?

Do i have to modify the source code to build it ?


Answer:

If you're looking for the Azure Storage Android Client Library, it is located at https://github.com/Azure/azure-storage-android and not the location you linked.

Adding the Android source code to your existing library project should build without any modifications.

Question:

I was able to create a Container in Storage Account and upload a blob to it through the Client Side Code.

I was able to make the blob available for public access as well, such that when I hit the following URL from my browser, I am able to see the image I uploaded.

https://MYACCOUNT.blob.core.windows.net/MYCONTAINER/MYBLOB

I now have a requirement to use the rest service to retrieve the contents of the blob. I wrote down the following java code.

package main;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class GetBlob {

    public static void main(String[] args) {
        String url = "https://MYACCOUNT.blob.core.windows.net/MYCONTAINER/MYBLOB";

        try {
            System.out.println("RUNNIGN");
            HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
            connection.setRequestProperty("Authorization", createQuery());
            connection.setRequestProperty("x-ms-version", "2009-09-19");

            InputStream response = connection.getInputStream();
            System.out.println("SUCCESSS");
            String line;
            BufferedReader reader = new BufferedReader(new InputStreamReader(response));
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static String createQuery() {
        String dateFormat = "EEE, dd MMM yyyy hh:mm:ss zzz";
        SimpleDateFormat dateFormatGmt = new SimpleDateFormat(dateFormat);
        dateFormatGmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        String date = dateFormatGmt.format(new Date());

        String Signature = "GET\n\n\n\n\n\n\n\n\n\n\n\n" +
                "x-ms-date:" + date +
                "\nx-ms-version:2009-09-19";

        // I do not know CANOCALIZED RESOURCE
        // WHAT ARE THEY??
        // + "\n/myaccount/myaccount/mycontainer\ncomp:metadata\nrestype:container\ntimeout:20";

        String SharedKey = "SharedKey";
        String AccountName = "MYACCOUNT";

        String encryptedSignature = encrypt(Signature);

        String auth = "" + SharedKey + " " + AccountName + ":" + encryptedSignature;

        return auth;
    }

    public static String encrypt(String clearTextPassword) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(clearTextPassword.getBytes());
            return new sun.misc.BASE64Encoder().encode(md.digest());
        } catch (NoSuchAlgorithmException e) {
        }
        return "";
    }
}

However, I get the following error when I run this main class:

RUNNIGN
java.io.IOException: Server returned HTTP response code: 403 for URL:     https://klabs.blob.core.windows.net/delete/Blob_1
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(Unknown Source)
at main.MainClass.main(MainClass.java:61)

Question1: Why this error, did I miss any header/parameter?

Question2: Do I need to add headers in the first place, because I am able to hit the request from the browser without any issues.

Question3: Can it be an SSL issue? What is the concept of certificates, and how and where to add them? Do I really need them? Will I need them later, when I do bigger operations on my blob storage(I want to manage a thousand blobs)?

I'd be thankful for any references as well, within Azure or otherwise, that could help me understand better. :D



AFTER A FEW DAYS



Below is my new code for PutBlob in Azure. I believe I have resolved all header and parameter issues and my request is correct. However, I am still getting the same 403 and I do not know what the issue is. Azure is proving to be pretty difficult.

A thing to note is that the container's name is delete, and I want to create a blob inside it, say newBlob. I tried initializing urlPath in the code below with both "delete" and "delete/newBlob". Neither works.

package main;

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.net.HttpURLConnection;
import java.net.URISyntaxException;
import java.net.URL;
import java.security.InvalidKeyException;
import java.security.NoSuchAlgorithmException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

import com.sun.org.apache.xml.internal.security.exceptions.Base64DecodingException;
import com.sun.org.apache.xml.internal.security.utils.Base64;

public class Internet {

    static String key = "password";
    static String account = "klabs";
    private static Base64 base64;

    private static String createAuthorizationHeader(String canonicalizedString) throws InvalidKeyException, Base64DecodingException, NoSuchAlgorithmException, IllegalStateException, UnsupportedEncodingException {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(base64.decode(key), "HmacSHA256"));
        String authKey = new String(base64.encode(mac.doFinal(canonicalizedString.getBytes("UTF-8"))));
        String authStr = "SharedKey " + account + ":" + authKey;
        return authStr;
    }

    public static void main(String[] args) {
        System.out.println("INTERNET");
        String key = "password";
        String account = "klabs";
        long blobLength = "Dipanshu Verma wrote this".getBytes().length;
        File f = new File("C:\\Users\\Dipanshu\\Desktop\\abc.txt");
        String requestMethod = "PUT";
        String urlPath = "delete";
        String storageServiceVersion = "2009-09-19";

        SimpleDateFormat fmt = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:sss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        String date = fmt.format(Calendar.getInstance().getTime()) + " UTC";
        String blobType = "BlockBlob";
        String canonicalizedHeaders = "x-ms-blob-type:" + blobType + "\nx-ms-date:" + date + "\nx-ms-version:" + storageServiceVersion;
        String canonicalizedResource = "/" + account + "/" + urlPath;

        String stringToSign = requestMethod + "\n\n\n" + blobLength + "\n\n\n\n\n\n\n\n\n" + canonicalizedHeaders + "\n" + canonicalizedResource;

        try {
            String authorizationHeader = createAuthorizationHeader(stringToSign);
            URL myUrl = new URL("https://klabs.blob.core.windows.net/" + urlPath);

            HttpURLConnection connection = (HttpURLConnection) myUrl.openConnection();
            connection.setRequestProperty("x-ms-blob-type", blobType);
            connection.setRequestProperty("Content-Length", String.valueOf(blobLength));
            connection.setRequestProperty("x-ms-date", date);
            connection.setRequestProperty("x-ms-version", storageServiceVersion);
            connection.setRequestProperty("Authorization", authorizationHeader);
            connection.setDoOutput(true);
            connection.setRequestMethod("POST");
            System.out.println(String.valueOf(blobLength));
            System.out.println(date);
            System.out.println(storageServiceVersion);
            System.out.println(stringToSign);
            System.out.println(authorizationHeader);
            System.out.println(connection.getDoOutput());

            DataOutputStream outStream = new DataOutputStream(connection.getOutputStream());

            // Send request
            outStream.writeBytes("Dipanshu Verma wrote this");
            outStream.flush();
            outStream.close();
            DataInputStream inStream = new DataInputStream(connection.getInputStream());
            System.out.println("BULLA");

            String buffer;
            while ((buffer = inStream.readLine()) != null) {
                System.out.println(buffer);
            }

            // Close I/O streams
            inStream.close();
            outStream.close();
        } catch (InvalidKeyException | Base64DecodingException | NoSuchAlgorithmException | IllegalStateException | UnsupportedEncodingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

I know only a proper code review might catch this; please help if you can. Thanks.


Answer:

Question1: Why this error, did I miss any header/parameter?

Most likely you're getting this error because of an incorrect signature. Please refer to the MSDN documentation for creating the correct signature: http://msdn.microsoft.com/en-us/library/azure/dd179428.aspx. Unless your signature is correct, you'll not be able to perform operations using the REST API.

Question2: Do I need to add headers in the first place, because I am able to hit the request from the browser without any issues.

In your current scenario, because you can access the blob directly (which means the container in which the blob exists has a Public or Blob ACL), you don't really need to use the REST API. You can simply make an HTTP request using Java and read the response stream, which will contain the blob contents. You would need to go down this route if the container ACL were Private, because in that case your requests need to be authenticated, and the code above creates an authenticated request.

Question3: Can it be an SSL issue? What is the concept of certificates, and how and where to add them? Do I really need them? Will I need them later, when I do bigger operations on my blob storage(I want to manage a thousand blobs)?

No, it is not an SSL issue. It's an issue with an incorrect signature.
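One concrete defect worth calling out in both snippets above is the date: the x-ms-date header must be an RFC 1123 date in GMT. The question's patterns use hh (12-hour clock), sss (an extra digit in the seconds slot), and a literal " UTC" suffix, none of which will match what the service recomputes when validating the signature. A sketch of a formatter that produces the expected shape:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class MsDate {
    // Build an RFC 1123 date in GMT, the format the x-ms-date header
    // must carry for Shared Key authentication.
    static String xMsDate(Date when) {
        SimpleDateFormat fmt = new SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss 'GMT'", Locale.US);
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        return fmt.format(when);
    }

    public static void main(String[] args) {
        String d = xMsDate(new Date(0L)); // the epoch, for a reproducible example
        System.out.println(d); // Thu, 01 Jan 1970 00:00:00 GMT
    }
}
```

Note Locale.US: without it, day and month names come out in the JVM's default locale and the signature check fails on non-English machines.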

Question:

I have a use case, where I need to download the file from Azure blob location to an IoT Device which is registered with IoT Hub.

In this case, I will be sending an instruction to the IoT Device (through IoT Hub) to download the file from the Azure blob location to a specific destination on the machine where the IoT Device app is running.

These instructions will be sent through IoT Hub using a backend (customised) application.

Just wanted to know whether Azure IoT Hub currently supports this capability, which I can use directly in the IoT Device app (via the Azure IoT API).

Any reference will be helpful.

Thanks,

Avinash Deshmukh


Answer:

Yes, sure. As far as I know, you can generate a URL for an Azure Blob file with its SAS token and send it as a cloud-to-device message from IoT Hub; the IoT device then receives the message and downloads the file via the SAS URL directly.

As references, you can refer to these official documents:

  1. To generate a blob url with SAS token, please refer to Create an account SAS with .NET. If you are using Java, I think it's very simple for you to write the Java one with Azure Storage SDK for Java.
  2. To send a cloud-to-device message from IoTHub, please refer to Send cloud-to-device messages with IoT Hub (Java).
  3. To receive the c2d message on a device, the REST API for all languages is Device - Receive Device Bound Notification. Or you can refer to the part of sample code SendReceive.java by searching the key word D2C to know how to retrieve messages from IoT Hub.
  4. Once you have the blob URL with the SAS token on the device, downloading the file is no harder than downloading from any public link.

Hope it helps.
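On the device side, step 4 reduces to an ordinary HTTP download once the SAS URL has been received. A minimal plain-Java sketch (for a self-contained demo the URL is a local file:// URI; on a real device it would be the https blob URL with the ?sv=...&sig=... SAS query string received in the cloud-to-device message):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class SasDownload {
    // Download whatever the URL points at (e.g. a blob URL with a SAS token
    // appended as a query string) to a local destination on the device.
    static void download(String url, Path dest) throws IOException {
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, dest, StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulated payload; on a real device the URL string would arrive
        // in the C2D message and point at blob storage over https.
        Path src = Files.createTempFile("payload-", ".bin");
        Files.write(src, "firmware-v2".getBytes("UTF-8"));
        Path dest = Files.createTempFile("downloaded-", ".bin");
        download(src.toUri().toString(), dest);
        System.out.println(new String(Files.readAllBytes(dest), "UTF-8"));
    }
}
```

Because the SAS token travels inside the URL, the device needs no storage account key at all, which is exactly the point of this pattern.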

Question:

So I have created an Azure blob trigger, and it works fine: as soon as I put a file or create a directory on the blob, the trigger fires.

Question: I cannot figure out how to get the content of the file that caused the blob trigger.

I can get files using the Azure storage library, but I am going to upload lots of files to the blob and want to do some processing on the file that has just been written.

Thanks in advance


Answer:

It looks pretty straightforward from the example documentation - https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#trigger---java-example

@FunctionName("blobprocessor")
public void run(
    @BlobTrigger(name = "file",
                 dataType = "binary",
                 path = "myblob/{name}",
                 connection = "MyStorageAccountAppSetting") byte[] content,
    @BindingName("name") String filename,
    final ExecutionContext context
) {
    context.getLogger().info("Name: " + filename + " Size: " + content.length + " bytes");
}

The content gets passed in as a byte array.
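Inside the function body, the byte[] content parameter already holds the full file, so no extra storage-library call is needed. A hedged pure-Java sketch of processing it (the CSV layout below is an assumption for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

public class BlobContentDemo {
    // What a function body might do with the byte[] the trigger hands over:
    // decode it and split into records. No bindings or SDK calls required.
    static List<String> toLines(byte[] content) {
        return Arrays.asList(new String(content, StandardCharsets.UTF_8).split("\\r?\\n"));
    }

    public static void main(String[] args) {
        byte[] content = "id,price\n1,9.99\n2,4.50".getBytes(StandardCharsets.UTF_8);
        List<String> lines = toLines(content);
        System.out.println(lines.size());   // 3
        System.out.println(lines.get(1));   // 1,9.99
    }
}
```

For very large blobs you may prefer dataType = "binary" with a stream-oriented approach, but for typical trigger payloads the byte array is the simplest route.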

Question:

I have the Azure blob sample code which I'm trying to modify. However, the uploadFile function only works when it is in the while loop in a switch case. If I take it out of the loop, it creates the container but fails to upload the file.

I've tried taking it out and calling the function from different places in the code, but none of them work.

uploadFile function:

static void uploadFile(BlockBlobURL blob, File sourceFile) throws IOException {

    AsynchronousFileChannel fileChannel = AsynchronousFileChannel.open(sourceFile.toPath());

    // Uploading a file to the blobURL using the high-level methods available in TransferManager class
    // Alternatively call the PutBlob/PutBlock low-level methods from BlockBlobURL type
    TransferManager.uploadFileToBlockBlob(fileChannel, blob, 8 * 1024 * 1024, null, null)
            .subscribe(response -> {
                System.out.println("Completed upload request.");
                TimeUnit.SECONDS.sleep(5);
                System.out.println(response.response().statusCode());
            });
}

Relevant part of main

            // Listening for commands from the console
            //THIS IS THE PART THAT ONLY MAKES THE CONTAINER
            /*
            System.out.println("Uploading the sample file into the container: " + containerURL );
            uploadFile(blobURL, sampleFile);
            System.out.println("File Uploaded");
            */
            //TRYING TO CALL FUNCTION FROM OUTSIDE WHILE, BUT IT ONLY WORKS HERE
            System.out.println("Enter a command");
            System.out.println("(P)utBlob | (L)istBlobs | (G)etBlob");
            BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
            while (true) {
                System.out.println("# Enter a command : ");
                String input = reader.readLine();
                switch(input){
                    case "P":
                        System.out.println("Uploading the sample file into the container: " + containerURL );
                        uploadFile(blobURL, sampleFile);
                        break;

When uploadFile is called outside the while loop, it creates the container but doesn't actually upload the file to the blob; when it is called from the switch case inside the while loop, the upload works.


Answer:

The blob isn't uploaded because the upload is asynchronous: the main thread finishes before the upload request completes. So I just add a Thread.sleep() to the main thread to keep the process alive. As a reminder, don't set the value too short or the upload will still fail; in my test 2000 milliseconds was enough.

public static void main(String[] args) throws java.lang.Exception{


        ContainerURL containerURL;

        // Creating a sample file to use in the sample
        File sampleFile = null;

        try {
            sampleFile = File.createTempFile("downloadedFile", ".txt");

            // Retrieve the credentials and initialize SharedKeyCredentials
            String accountName = "xxxxxx";
            String accountKey = "xxxxxxx";

            // Create a ServiceURL to call the Blob service. We will also use this to construct the ContainerURL
            SharedKeyCredentials creds = new SharedKeyCredentials(accountName, accountKey);
            // We are using a default pipeline here, you can learn more about it at https://github.com/Azure/azure-storage-java/wiki/Azure-Storage-Java-V10-Overview
            final ServiceURL serviceURL = new ServiceURL(new URL("https://" + accountName + ".blob.core.windows.net"), StorageURL.createPipeline(creds, new PipelineOptions()));

            // Let's create a container using a blocking call to Azure Storage
            // If container exists, we'll catch and continue
            containerURL = serviceURL.createContainerURL("quickstart");

            try {
                ContainerCreateResponse response = containerURL.create(null, null, null).blockingGet();
                System.out.println("Container Create Response was " + response.statusCode());
            } catch (RestException e){
                if (e.response().statusCode() != 409) {
                    throw e;
                } else {
                    System.out.println("quickstart container already exists, resuming...");
                }
            }

            // Create a BlockBlobURL to run operations on Blobs
            final BlockBlobURL blobURL = containerURL.createBlockBlobURL("SampleBlob.txt");

            System.out.println("Uploading the sample file into the container: " + containerURL );
            AsynchronousFileChannel fileChannel = AsynchronousFileChannel.open(sampleFile.toPath());


            // Uploading a file to the blobURL using the high-level methods available in TransferManager class
            // Alternatively call the PutBlob/PutBlock low-level methods from BlockBlobURL type
            TransferManager.uploadFileToBlockBlob(fileChannel, blobURL, 8*1024*1024, null, null)
                    .subscribe(response-> {
                        System.out.println("Completed upload request.");
                        System.out.println(response.response().statusCode());
                    });

            Thread.sleep(2000);

        } catch (InvalidKeyException e) {
            System.out.println("Invalid Storage account name/key provided");
        } catch (MalformedURLException e) {
            System.out.println("Invalid URI provided");
        } catch (RestException e){
            System.out.println("Service error returned: " + e.response().statusCode() );
        } catch (IOException e) {
            e.printStackTrace();
            System.exit(-1);
        }
    }
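
Guessing a sleep duration is fragile; a more deterministic approach is to block on a signal that the completion callback releases. Below is a minimal, self-contained sketch of that pattern in plain Java, using CompletableFuture to stand in for the SDK's asynchronous upload (the delay and the 201 status code are made up for illustration):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class AsyncWait {

    // Stand-in for the SDK's asynchronous upload; completes after a short delay.
    static CompletableFuture<Integer> fakeUpload() {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(200);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return 201; // pretend "Created" status code
        });
    }

    // Wait on a latch released by the completion callback instead of
    // guessing a sleep duration in the main thread.
    static int uploadAndWait() {
        CountDownLatch done = new CountDownLatch(1);
        int[] status = new int[1];
        fakeUpload().thenAccept(code -> {
            status[0] = code;
            done.countDown();
        });
        try {
            done.await(5, TimeUnit.SECONDS); // bounded wait, never hangs forever
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return status[0];
    }

    public static void main(String[] args) {
        System.out.println("Upload status: " + uploadAndWait());
    }
}
```

With a latch, the main thread waits exactly as long as the upload takes (up to a timeout), instead of relying on a hard-coded delay that may be too short on a slow connection.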

Question:

I'm trying to delete some blobs in an Azure Storage container using the Java Azure Storage Library 4.0.0, as explained here. Seems like this should be an easy thing to do, so I assume I'm doing something wrong, as the code below doesn't delete anything. There are 4 blobs in the container.

String connectionString = String.format(
        "DefaultEndpointsProtocol=https;" +
        "AccountName=%s;" +
        "AccountKey=%s", accountName, accountKey);
CloudStorageAccount account =
        CloudStorageAccount.parse(connectionString);
CloudBlobClient client = account.createCloudBlobClient();
CloudBlobContainer container =
        client.getContainerReference("myContainer");

// This loop iterates 4 times, as expected
for (ListBlobItem item : container.listBlobs("prefix/", true)) {
    CloudBlockBlob blob = container.
            getBlockBlobReference(item.getUri().toString());
    if (blob.deleteIfExists()) {
        // never hits
    }
}

No exceptions are thrown, but the blobs remain. When I call delete() instead of deleteIfExists(), I get a StorageException: "the specified blob does not exist."


Answer:

If you take a look at the API docs for getBlockBlobReference, you'll see it takes the name of the blob (a string, not a URI). So what you're doing here is trying to delete blobs whose name is the full URI of your blob; those, of course, don't exist.

What you want to do instead is simply check the type of each listed item and cast it to a blob. You can then perform whatever operations you want on it.

if (item instanceof CloudBlob) {
    CloudBlob blob = (CloudBlob) item;
    blob.deleteIfExists();
}
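
If you only have the blob's URI (as in the original loop), you would first need to recover the blob name from it before calling getBlockBlobReference. A hypothetical helper using java.net.URI is sketched below; the account, container name, and path are made-up examples:

```java
import java.net.URI;

public class BlobName {

    // Extract the blob name (the path relative to the container) from a full blob URI.
    // getBlockBlobReference expects this name, not the URI itself.
    static String blobNameFromUri(URI blobUri, String containerName) {
        String path = blobUri.getPath(); // e.g. "/myContainer/prefix/file.txt"
        String marker = "/" + containerName + "/";
        int idx = path.indexOf(marker);
        if (idx < 0) {
            throw new IllegalArgumentException(
                    "URI does not contain container '" + containerName + "'");
        }
        return path.substring(idx + marker.length());
    }

    public static void main(String[] args) {
        URI uri = URI.create(
                "https://myaccount.blob.core.windows.net/myContainer/prefix/file.txt");
        // Prints "prefix/file.txt"
        System.out.println(blobNameFromUri(uri, "myContainer"));
    }
}
```

In most cases, though, the cast shown above is simpler: the listed item already is the blob, so there is no need to round-trip through its URI at all.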

Question:

While the following link details the way you can compute the storage size using C#, I don't see similar methods in Java. Appreciate if someone can post a sample code for Java please. Azure Storage container size


Answer:

Here is my sample code. For more details, please refer to the javadocs of the Azure Storage SDK for Java.

String accountName = "<your-storage-account-name>";
String accountKey = "<your-storage-account-key>";
String storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=%s;AccountKey=%s";
String connectionString = String.format(storageConnectionString, accountName, accountKey);
CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
CloudBlobClient client = account.createCloudBlobClient();
String containerName = "mycontainer";
CloudBlobContainer container = client.getContainerReference(containerName);
long size = 0L;
Iterable<ListBlobItem> blobItems = container.listBlobs();
for (ListBlobItem blobItem : blobItems) {
    if (blobItem instanceof CloudBlob) {
        CloudBlob blob = (CloudBlob) blobItem;
        size += blob.getProperties().getLength();
    }
}

If you need to count the size of a container including snapshots, use the code below to get the blob list.

// If count blob size for a container include snapshots
String prefix = null;
boolean useFlatBlobListing = true;
EnumSet<BlobListingDetails> listingDetails = EnumSet.of(BlobListingDetails.SNAPSHOTS);
BlobRequestOptions options = null;
OperationContext opContext = null;
Iterable<ListBlobItem> blobItems = container.listBlobs(prefix, useFlatBlobListing, listingDetails, options, opContext);

If you only want to count the size of the snapshots in a container, use the check below to determine whether a blob is a snapshot.

if (blob.isSnapshot()) {
    size += blob.getProperties().getLength();
}
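
The size accumulated above is in raw bytes. If you want to display it, a small formatting helper (plain Java, not part of the storage SDK) can convert it to a human-readable string; the unit names and rounding here are just one possible choice:

```java
import java.util.Locale;

public class ByteFormat {

    // Convert a byte count into a human-readable string using binary units.
    static String humanReadable(long bytes) {
        if (bytes < 1024) {
            return bytes + " B";
        }
        String[] units = {"KiB", "MiB", "GiB", "TiB"};
        double value = bytes;
        int unit = -1;
        while (value >= 1024 && unit < units.length - 1) {
            value /= 1024;
            unit++;
        }
        return String.format(Locale.ROOT, "%.1f %s", value, units[unit]);
    }

    public static void main(String[] args) {
        System.out.println(humanReadable(512L));        // 512 B
        System.out.println(humanReadable(1536L));       // 1.5 KiB
        System.out.println(humanReadable(10_485_760L)); // 10.0 MiB
    }
}
```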

Question:

I am trying to decode files after downloading via Azure Storage SDK for Java.

Here is my code:

try {
    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
    CloudBlobContainer container = blobClient.getContainerReference("mycontainer");
    CloudBlobDirectory blobDirectory = container.getDirectoryReference("shi");
    for (ListBlobItem blobItem : blobDirectory.listBlobs()) {
        if (blobItem instanceof CloudBlob) {
            CloudBlob blob = (CloudBlob) blobItem;
            blob.download(new FileOutputStream("/Users/shi/Downloads/" + blob.getName()));
        }
    }
}

This code downloads all the blob files from mycontainer, but they are encoded. How can I download them in decoded form?


Answer:

I tried to reproduce your issue.

My sample code :

CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
CloudBlobContainer container = blobClient.getContainerReference("jay");
for (ListBlobItem blobItem : container.listBlobs()) {
    if (blobItem instanceof CloudBlob) {
        CloudBlob blob = (CloudBlob) blobItem;
        blob.download(new FileOutputStream("E://AzureFile/" + blob.getName()));
    }
}

The download succeeded in my test.

You can first check in Storage Explorer whether the blob's Content Type is text/plain; the blob-specific Content Type is not displayed on the portal.

In addition, the problem may be that the encoding you use when parsing the blob content is inconsistent with the encoding your colleague used when the blob was uploaded.

This issue is encountered when downloading text; binary files are not affected.

I suggest you find out the encoding used when the file was uploaded and set that character set when parsing the blob content, using the following snippet:

if (blobItem instanceof CloudBlob) {
      CloudBlob blob = (CloudBlob) blobItem;
      InputStream input =  blob.openInputStream();
      InputStreamReader inr = new InputStreamReader(input, "UTF-8");
      String utf8str = org.apache.commons.io.IOUtils.toString(inr);
      System.out.println(utf8str);
}
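
The effect of a charset mismatch is easy to reproduce without touching storage at all: bytes written as UTF-8 turn into mojibake when decoded with a single-byte charset. The self-contained sketch below (plain Java, no Azure dependency) illustrates why the charset passed to InputStreamReader matters:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public class CharsetDemo {

    // Decode a byte array through an InputStreamReader with the given charset,
    // mirroring how the blob content is read in the snippet above.
    static String readAll(byte[] data, String charsetName) {
        try (InputStreamReader in =
                     new InputStreamReader(new ByteArrayInputStream(data), charsetName)) {
            StringBuilder sb = new StringBuilder();
            int c;
            while ((c = in.read()) != -1) {
                sb.append((char) c);
            }
            return sb.toString();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] utf8 = "h\u00e9llo".getBytes(StandardCharsets.UTF_8); // "héllo" as UTF-8 (6 bytes)
        System.out.println(readAll(utf8, "UTF-8"));      // round-trips correctly (5 chars)
        System.out.println(readAll(utf8, "ISO-8859-1")); // mojibake: 6 chars, the é is mangled
    }
}
```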

Hope it helps you.

Question:

I try to upload image to Azure Blob storage from android. I can do it from Java by this way

CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);

// Create the blob client.
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();

CloudBlobContainer container = blobClient.getContainerReference("mycontainer");

final String filePath = "C:\\Users\\icon.jpg";

CloudBlockBlob blob = container.getBlockBlobReference("1.jpg");
File source = new File(filePath);
blob.upload(new FileInputStream(source), source.length());

But if I change the filepath to "content://media/external/images/media/12", in Android I get a FileNotFoundException. How can I upload images from Android?


Answer:

final String filePath = "C:\Users\icon.jpg"

I pretty much doubt this points to an existing file on your device.

EDIT

But if I change filepath to content://media/external/images/media/12

This is NOT a file path. A content:// Uri requires using a ContentResolver and methods like openInputStream() and openOutputStream().