A few years ago, I wrote about storing Sitecore binaries in an external blob storage service instead of keeping them in the database. You can read more about it here, and the code is available on GitHub. It has several benefits and it works great! I’ve used this implementation in production on large-scale solutions for many years.
In Sitecore 9.3, Sitecore introduced its own Azure Blob Storage module that uses the same principles. Sitecore also slightly changed how databases are configured, so my old module currently works as-is only up to 9.2.
Since Sitecore now supports its own module, it makes sense to use that one instead of running a custom one. However, in true Sitecore spirit, the module was released without testing, so beware of the findings below before using it:
No more caching
When Sitecore processes an image, for example when resizing it, it stores a cached copy of the resulting image in the /App_Data/MediaCache folder. That way, an image doesn’t have to be resized every time it is requested.
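For reference, the cache location and on/off switch are ordinary Sitecore settings. The values below are the defaults I would expect to find in Sitecore.config; verify them against your own version:

```xml
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <!-- Default values: the media cache lives on local disk. -->
      <setting name="Media.CachingEnabled" value="true" />
      <setting name="Media.CacheFolder" value="/App_Data/MediaCache" />
    </settings>
  </sitecore>
</configuration>
```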
However, as soon as the Sitecore Azure Blob Storage module is enabled, Sitecore stops using the media cache. Instead, it downloads the original file from the Azure Blob Storage tier and resizes it on every request. This has a huge negative performance impact.
At the time of writing this, I’m not aware of any workaround for this issue. I can’t believe the module was released without this being found in testing.
Lower upload size limit
The Azure Blob Storage module comes with a strange problem where it sometimes fails when uploading large binaries. We have tested a couple of files, and it seems that when a source image is about 5 MB or larger, the upload fails: Sitecore creates the new item but fails to attach the binary.
Looking in the log files, the following error is shown:
ERROR Could not save posted file: [uploaded-filename.ext]
Message: Parameter is not valid.
at System.Drawing.Image.FromStream(Stream stream, Boolean useEmbeddedColorManagement, Boolean validateImageData)
at Sitecore.Resources.Media.ImageMedia.UpdateImageMetaData(MediaStream mediaStream)
at Sitecore.Resources.Media.MediaCreator.AttachStreamToMediaItem(Stream stream, String itemPath, String fileName, MediaCreatorOptions options)
at Sitecore.Resources.Media.MediaCreator.CreateFromStream(Stream stream, String filePath, MediaCreatorOptions options)
at Sitecore.Resources.Media.MediaUploader.UploadToDatabase(List`1 list)
at Sitecore.Pipelines.Upload.Save.Process(UploadArgs args)
Attaching such a large binary from code works just fine. Performing the same upload with the module disabled also works just fine.
Note that there is a configured limit on how large files can be uploaded to the application, but that limit is set way higher than 5 MB.
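For comparison, these are the usual knobs that govern upload size in a Sitecore solution. The values below are illustrative examples, not recommendations:

```xml
<!-- web.config: ASP.NET/IIS request limits (example values) -->
<system.web>
  <!-- maxRequestLength is in kilobytes: 512000 KB = 500 MB -->
  <httpRuntime maxRequestLength="512000" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes -->
      <requestLimits maxAllowedContentLength="524288000" />
    </requestFiltering>
  </security>
</system.webServer>

<!-- Sitecore setting limiting how large media can be stored -->
<setting name="Media.MaxSizeInDatabase" value="500MB" />
```

All of these are far above the roughly 5 MB threshold where the module starts failing, so they are not the cause.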
At the time of writing this, I’m not aware of any workaround for this issue.
Don’t comment your blob config
The solution I have been working with needed a fourth database to use as a preview publishing target, besides the common web databases. So, I just added the fourth database with a regular config patch, essentially as a copy of the web database.
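To illustrate, a patch along these lines is what I mean by “a copy of the web database” — the id `preview` and the omitted child elements are placeholders, not the actual solution’s configuration:

```xml
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <databases>
      <!-- "preview" is a placeholder name; the child elements are
           copied from the <database id="web"> definition. -->
      <database id="preview" singleInstance="true"
                type="Sitecore.Data.DefaultDatabase, Sitecore.Kernel"
                patch:after="database[@id='web']">
        <!-- ...same children as the web database... -->
      </database>
    </databases>
  </sitecore>
</configuration>
```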
But a few things didn’t work as expected. I soon realized that the blob configuration didn’t load properly for my new database, but I couldn’t figure out why. I extracted the config sections and did a line-by-line comparison with the web database, and I couldn’t find anything wrong. It wasn’t until I attached a debugger and stepped through the Sitecore code that I found the configured blob provider was null. Strange…
It turned out that the Sitecore.Data.DefaultDatabase.GetBlobProviders() method loops over the config node’s ChildNodes of type XmlNode and just assumes they are all XML elements. Well, a comment in XML is also an XmlNode, so Sitecore essentially tried to read my comment in the configuration as a database configuration. That obviously failed. Below is the configuration structure that caused parsing to fail:
<BlobStorage hint="raw:AddBlobStorage">
  <providers default="azure">
    <provider name="classic" type="...ClassicSqlBlobProvider, ...">
      <param desc="databaseName">$(id)</param>
    </provider>
    <!-- Adding a comment here causes provider loading to fail and the
         default provider becomes null. It works as expected by just
         removing this line. -->
    <provider name="azure" type="...AzureStorageBlobProvider, ...">
      <lots of parameters... />
    </provider>
  </providers>
</BlobStorage>
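The DOM behavior that trips Sitecore up here is not .NET-specific: in the W3C DOM model, comments are child nodes too. This can be demonstrated with a minimal Python sketch using the stdlib’s minidom, which follows the same DOM model as System.Xml — this is an illustration of the pattern, not Sitecore’s actual code:

```python
from xml.dom import minidom

# Hypothetical fragment mirroring the <providers> node above.
xml = (
    '<providers default="azure">'
    '<provider name="classic"/>'
    '<!-- a comment between providers -->'
    '<provider name="azure"/>'
    '</providers>'
)

root = minidom.parseString(xml).documentElement

names = []
for node in root.childNodes:
    # The guard Sitecore's loop is missing: only treat real
    # elements as provider configurations.
    if node.nodeType == node.ELEMENT_NODE:
        names.append(node.getAttribute("name"))
    else:
        # The XML comment shows up here as a COMMENT_NODE child.
        assert node.nodeType == node.COMMENT_NODE

print(names)  # ['classic', 'azure']
```

Code that iterates ChildNodes should filter on the node type (or iterate only element children) before reading attributes; skipping that check is exactly what makes a harmless comment break the provider loading.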
At the time of writing this, I’m not aware of any workaround for this issue other than simply removing such comments.