Sunday, 8 September 2013

Unable to renew Windows Azure SAS key in StorageException handler


I'm able to capture the AuthenticationFailed error code (in my catch block) from the StorageException when the SAS key expires. I then make another call (within the catch block) to a SAS generation method to create a new SAS key with a later expiration. With the new SAS key, I construct another CloudBlockBlob object and use it to write the block that failed to the Azure blob. However, I keep getting the same 403 Forbidden authentication error that I get when the SAS key has expired. Why is this happening when I'm creating a new CloudBlockBlob object with a different SAS key (with a later expiration)? It has nothing to do with clock skew, since I don't even specify a StartTime in the SAS key generation. Any ideas? Is there a better practice for handling SAS renewal? I would appreciate the help! Below is my code. NOTE: I'm using JavaScript to chunk the upload, so each call to the WriteBlockToBlob method is a brand-new state:
    /// <summary>
    /// Sends the file chunk to the Azure blob.
    /// </summary>
    /// <param name="BlockId">The id for the current block</param>
    public void WriteBlockToBlob(string BlockId)
    {
        try
        {
            Blob = new CloudBlockBlob(new Uri(BlobSasUri));
            Blob.PutBlock(BlockId, File.InputStream, null, null, uploadBlobOptions, null);

            /*******************************************************************/
            /* REST API approach */
            //string queryString = (new Uri(abm.BlobContainerSasUri)).Query;
            //abm.BlobContainer = abm.BlobContainerSasUri.Substring(0,
            //    abm.BlobContainerSasUri.Length - queryString.Length);
            //string requestUri = string.Format(
            //    System.Globalization.CultureInfo.InvariantCulture,
            //    "{0}/{1}{2}&comp=block&blockid={3}",
            //    abm.BlobContainer, abm.FileName, queryString,
            //    Convert.ToBase64String(Encoding.UTF8.GetBytes(abm.BlockId)));
            //HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
            //request.Method = "PUT";
            //request.ContentLength = abm.File.InputStream.Length;
            //using (Stream requestStream = request.GetRequestStream())
            //{
            //    inputStream.CopyTo(requestStream);
            //}
            //using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
            //{
            //}
            /*******************************************************************/
        }
        catch (StorageException stex)
        {
            if (stex.RequestInformation.ExtendedErrorInformation.ErrorCode.Equals("AuthenticationFailed"))
            {
                AzureStorageTester ast = new AzureStorageTester();
                BlobSasUri = ast.getStorageLibrarySas(File.FileName);

                // Retry writing the block to the blob with the renewed SAS
                Blob = new CloudBlockBlob(new Uri(BlobSasUri));
                try
                {
                    Blob.PutBlock(BlockId, File.InputStream, null, null, uploadBlobOptions, null);
                }
                catch (StorageException ex)
                {
                    var test = stex.RequestInformation.ExtendedErrorInformation.ErrorCode;
                    throw ex;
                }
            }
            //string errMsg = ComposeAndLogStorageException(ExceptionFlag.UploadLargeFileException, stex);
            //BigFileExceptionFlag = "on";
            //throw new ApplicationException(errMsg);
        }
        catch (SystemException se)
        {
            string errMsg = ComposeAndLogSystemException(ExceptionFlag.UploadLargeFileException, se);
            BigFileExceptionFlag = "on";
            throw new ApplicationException(errMsg);
        }
        catch (Exception ex)
        {
            string errMsg = ComposeAndLogException(ExceptionFlag.UploadLargeFileException, ex);
            BigFileExceptionFlag = "on";
            throw new ApplicationException(errMsg);
        }
        finally
        {
            File.InputStream.Close();
        }
    }
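One thing worth checking in the retry path above: the first failed PutBlock call may already have read from File.InputStream, so the retry can send an empty or partial body unless the stream position is rewound. As a sketch of a renewal-and-retry pattern under that assumption (the names GenerateBlobSas, RetryPutBlock, and storageBlob are my own, hypothetical; the SAS calls are from the WindowsAzure.Storage client library):

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

// Hypothetical sketch, not the asker's actual method.
// "storageBlob" is assumed to be a CloudBlockBlob created server-side
// with full account credentials, so it can mint a new SAS.
public static class SasRenewal
{
    // Issue a fresh SAS that expires a few minutes from now.
    public static string GenerateBlobSas(CloudBlockBlob storageBlob)
    {
        var policy = new SharedAccessBlobPolicy
        {
            // No SharedAccessStartTime, which sidesteps clock-skew rejections.
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
            Permissions = SharedAccessBlobPermissions.Write
        };
        // Blob URI plus the "?sv=...&sig=..." query string.
        return storageBlob.Uri + storageBlob.GetSharedAccessSignature(policy);
    }

    // Retry a block write with a renewed SAS, rewinding the stream first.
    public static void RetryPutBlock(string blockId, Stream input, CloudBlockBlob storageBlob)
    {
        string freshSasUri = GenerateBlobSas(storageBlob);
        var blob = new CloudBlockBlob(new Uri(freshSasUri));

        // The failed attempt may have advanced the stream; without this
        // reset the retry sends the wrong data (or none at all).
        if (input.CanSeek)
            input.Position = 0;

        blob.PutBlock(blockId, input, null, null, null, null);
    }
}
```

This is only a sketch of the general shape; whether it resolves the 403 depends on how getStorageLibrarySas builds its SAS.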
