Passing object messages in Azure Queue Storage

I'm trying to find a way to pass objects to an Azure Queue, but I couldn't find one.

As far as I can see, I can only pass a string or a byte array, which is not very convenient for passing objects.

Is there any way to pass custom objects to the Queue?

Thanks!

You can use the following classes as an example:

[Serializable]
public abstract class BaseMessage
{
    public byte[] ToBinary()
    {
        BinaryFormatter bf = new BinaryFormatter();
        byte[] output = null;
        using (MemoryStream ms = new MemoryStream())
        {
            bf.Serialize(ms, this);
            // ToArray() returns only the bytes actually written;
            // GetBuffer() can include unused trailing capacity.
            output = ms.ToArray();
        }
        return output;
    }

    public static T FromMessage<T>(CloudQueueMessage m)
    {
        byte[] buffer = m.AsBytes;
        T returnValue = default(T);
        using (MemoryStream ms = new MemoryStream(buffer))
        {
            BinaryFormatter bf = new BinaryFormatter();
            returnValue = (T)bf.Deserialize(ms);
        }
        return returnValue;
    }
}

Then a StdQueue (a strongly typed queue):

public class StdQueue<T> where T : BaseMessage, new()
{
    protected CloudQueue queue;

    public StdQueue(CloudQueue queue)
    {
        this.queue = queue;
    }

    public void AddMessage(T message)
    {
        CloudQueueMessage msg =
            new CloudQueueMessage(message.ToBinary());
        queue.AddMessage(msg);
    }

    public void DeleteMessage(CloudQueueMessage msg)
    {
        queue.DeleteMessage(msg);
    }

    public CloudQueueMessage GetMessage()
    {
        // 120-second visibility timeout
        return queue.GetMessage(TimeSpan.FromSeconds(120));
    }
}

Then, all you have to do is inherit from BaseMessage:

[Serializable]
public class ParseTaskMessage : BaseMessage
{
    public Guid TaskId { get; set; }

    public string BlobReferenceString { get; set; }

    public DateTime TimeRequested { get; set; }
}

And make a queue that works with that message:

CloudStorageAccount acc;
if (!CloudStorageAccount.TryParse(connectionString, out acc))
{
    throw new ArgumentOutOfRangeException("connectionString", "Invalid connection string was introduced!");
}
CloudQueueClient clnt = acc.CreateCloudQueueClient();
CloudQueue queue = clnt.GetQueueReference(processQueue);
queue.CreateIfNotExist();
this._queue = new StdQueue<ParseTaskMessage>(queue);

Hope this helps!

Extension method that uses Newtonsoft.Json and async

    public static async Task AddMessageAsJsonAsync<T>(this CloudQueue cloudQueue, T objectToAdd)
    {
        var messageAsJson = JsonConvert.SerializeObject(objectToAdd);
        var cloudQueueMessage = new CloudQueueMessage(messageAsJson);
        await cloudQueue.AddMessageAsync(cloudQueueMessage);
    }
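
A read-side counterpart can be sketched along the same lines (GetMessageAsJsonAsync is a name I made up, not part of the SDK):

```csharp
public static async Task<T> GetMessageAsJsonAsync<T>(this CloudQueue cloudQueue)
{
    // GetMessageAsync returns null when the queue is empty.
    var cloudQueueMessage = await cloudQueue.GetMessageAsync();
    if (cloudQueueMessage == null)
        return default(T);

    return JsonConvert.DeserializeObject<T>(cloudQueueMessage.AsString);
}
```

Note that this sketch discards the CloudQueueMessage itself, so the caller can no longer delete the message afterwards; in practice you would probably return both the deserialized object and the original message.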

I like this generalization approach, but I don't like having to put the Serializable attribute on every class I might want to put in a message and derive them all from a base class (I might already have a base class), so I used...

using System;
using System.Text;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

namespace Example.Queue
{
    public static class CloudQueueMessageExtensions
    {
        public static CloudQueueMessage Serialize(Object o)
        {
            var stringBuilder = new StringBuilder();
            stringBuilder.Append(o.GetType().FullName);
            stringBuilder.Append(':');
            stringBuilder.Append(JsonConvert.SerializeObject(o));
            return new CloudQueueMessage(stringBuilder.ToString());
        }

        public static T Deserialize<T>(this CloudQueueMessage m)
        {
            string content = m.AsString;
            int indexOf = content.IndexOf(':');

            if (indexOf <= 0)
                throw new Exception(string.Format("Cannot deserialize into object of type {0}",
                    typeof(T).FullName));

            string typeName = content.Substring(0, indexOf);
            string json = content.Substring(indexOf + 1);

            if (typeName != typeof(T).FullName)
            {
                throw new Exception(string.Format("Cannot deserialize object of type {0} into one of type {1}",
                    typeName,
                    typeof(T).FullName));
            }

            return JsonConvert.DeserializeObject<T>(json);
        }
    }
}

e.g.

var myobject = new MyObject();
_queue.AddMessage( CloudQueueMessageExtensions.Serialize(myobject));

var myobject = _queue.GetMessage().Deserialize<MyObject>();
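
Note that GetMessage() returns null when the queue is empty, so the chained call above can throw; and, as a comment below points out, you still need the original CloudQueueMessage to delete it after processing. A more defensive sketch:

```csharp
// Keep the original CloudQueueMessage so it can be deleted after
// successful processing; GetMessage() returns null on an empty queue.
CloudQueueMessage raw = _queue.GetMessage();
if (raw != null)
{
    var myobject = raw.Deserialize<MyObject>();
    // ... process myobject ...
    _queue.DeleteMessage(raw); // deletion uses the MessageId and PopReceipt
}
```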

That is not the right way to do it. Queues are not meant for storing objects; you need to put the object in a blob or a table (serialized). I believe the queue message body has a 64 KB size limit with SDK 1.5 and 8 KB with lower versions. The message body is meant only to carry the crucial data for the workers that pick it up.
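
Given the 64 KB limit mentioned here (and since Base64 encoding inflates a binary payload by 4/3, leaving roughly 48 KB of raw data), a small pre-flight check can decide whether an object fits in a queue message or should go to blob/table storage instead. The class and method names below are my own:

```csharp
using System;
using System.Text;

public static class QueueMessageGuard
{
    // Maximum size of a queue message body; Base64 encoding (used for
    // binary payloads) inflates the content by a factor of 4/3.
    private const int MaxMessageBytes = 64 * 1024;

    public static bool FitsInQueueMessage(string json)
    {
        int rawBytes = Encoding.UTF8.GetByteCount(json);
        int base64Bytes = ((rawBytes + 2) / 3) * 4; // size after Base64 encoding
        return base64Bytes <= MaxMessageBytes;
    }
}
```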

The unit of transfer on Azure Queue storage is the CloudQueueMessage. Cloud messages carry the payload of the message (i.e. your object or entity graph) either as a serialized string (e.g. XML or JSON) or as a serialized binary representation (byte[]). You have serialization options such as JSON or XML.
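
As a sketch of the XML option, using only the BCL's DataContractSerializer (no extra packages); ParseTaskMessage here mirrors the type from earlier in the thread:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Xml;

[DataContract]
public class ParseTaskMessage
{
    [DataMember] public Guid TaskId { get; set; }
}

public static class XmlMessageSerializer
{
    // Serialize to an XML string; this string would become the
    // CloudQueueMessage body.
    public static string ToXml<T>(T value)
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var sw = new StringWriter())
        {
            using (var xw = XmlWriter.Create(sw))
            {
                serializer.WriteObject(xw, value);
            }
            return sw.ToString();
        }
    }

    // Deserialize the XML string read back from the queue.
    public static T FromXml<T>(string xml)
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var sr = new StringReader(xml))
        using (var xr = XmlReader.Create(sr))
        {
            return (T)serializer.ReadObject(xr);
        }
    }
}
```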

Comments
  • Seems it's the solution most go with :) Thanks!
  • I know :) I use it in production ;)
  • Pretty neat solution. But I'd say this violates Single Responsibility Principle: Serialisable POCO objects now take dependency on Azure library. I would not have messages inherit from BaseMessage, but rather have ToBinary() and FromMessage<T>() be private inside of the StdQueue<T> class. Objects should not really be responsible for serialisation/deserialisation of themselves.
  • Hey astaykov, how would you store generic types with this? What if you had ParseTaskMessage<U>, if you had to deserialize it, where would you get the type from?
  • newer versions of Microsoft.WindowsAzure.Storage.Queue changed the ctor new CloudQueueMessage(byte[] content) to a static method CloudQueueMessage.CreateCloudQueueMessageFromByteArray(byte[] content)
  • I like this approach, it is compact :) If using the code above, a reminder that you may still need a reference to the original CloudQueueMessage in order to delete it from the queue after reading.
  • you actually only need the MessageId and PopReceipt properties of the original message.
  • And why should I put my small enough object in a Blob or Table when the queue message can transfer it? You want me to increase my storage transactions by 50% (from 2 storage transactions - 1 to read the message, 1 to delete it - to 3: one to read the message, 1 to delete it, plus 1 additional to read the table entity or blob)?? And blobs IMHO are for storing files, not serialized objects. Plus you also have the 64k limit for a Table entity, where the byte[] property may be just up to 64k - msdn.microsoft.com/en-us/library/windowsazure/dd179338.aspx!
  • Well, actually storage transactions will be increased at least twice (by 100%), because I have to first write what I want into the blob/table, then send the message, then read the message, then read from the blob/table, then delete the message. And if I can avoid this - I do.
  • Sorry for a late response - but I'd say that blobs are not just for storing files but for storing "long term - nonsearchable - data". Storing serialized (big) objects in blobs is a common practice on Azure. Metadata (and more) goes to tables. Queues have a somewhat dual nature, I do agree with that, but they are certainly not meant as storage. If you know your object will not exceed a certain size limit and these objects are not required in general to be accessible at any time - sure, put them in the queue. But one needs to be aware that a queue message can be visible or not, dequeued forever, etc.
  • there is no way to be exact about when a queue message will be available to get the data you need. That is why I'm saying that data stored in a queue should only be data that is bound to that message operation and nothing else. But, as you have pointed out, in some cases it would probably be safe and better (transaction-wise) to store the data in the queue message itself.
  • Queues are made for passing data and instructions and what is being shown here is an excellent way of doing that. I didn't see anywhere that they were considering this for long term storage. I use a version of this using small instruction objects serialized to JSON to let processing systems know that there are new data files ready for processing. By using objects on both ends I maintain type control on a system that could fail with corrupt data. It also allows me to process the next set of files as soon as the system has some free capacity.