Custom Blob Storage Provider

Our current CMS is version 11.20.7.0

Does anyone have an example of a custom blob storage provider?  We're looking at putting blobs in Cloudflare R2 buckets and I can't seem to find an example of overriding BlobProvider.  

My specific issue is: how do I get access to the FileStream of the uploaded file so that I can store it in the R2 bucket? When overriding CreateBlob, I only see an id Uri and an extension. I must be understanding this incorrectly.

public override Blob CreateBlob(Uri id, string extension)

I have been through this doc, but it doesn't seem to give any details on overriding BlobProvider:
https://docs.developers.optimizely.com/content-management-system/v11.0.0-cms/docs/blob-storage-and-providers

#320013
Apr 04, 2024 17:52

Hi Todd,

You can take a look at the SqlBlobProvider project; tag v1.5.2 is the last version for CMS 11:
https://github.com/BVNetwork/SqlBlobProvider/tree/v1.5.2

#320014
Apr 04, 2024 18:15

You will need to override the Blob class, which has OpenRead, OpenWrite, and some other methods and properties. This is where you need to return streams.

The BlobProvider also needs to be overridden. Here you will interact with the blob container to create, delete, and get blobs (returning your own blob class).
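
To give an idea of the overall shape, something along these lines (only a sketch, not tested against R2; the R2Blob/R2BlobProvider names, the key mapping, and the S3 client wiring are placeholders, since R2 speaks the S3 API):

using System;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;
using EPiServer.Framework.Blobs;

public class R2Blob : Blob
{
    private readonly IAmazonS3 _client;
    private readonly string _bucket;

    public R2Blob(Uri id, IAmazonS3 client, string bucket) : base(id)
    {
        _client = client;
        _bucket = bucket;
    }

    // Called when the CMS needs the binary data back, e.g. when serving the asset.
    public override Stream OpenRead()
    {
        var response = _client.GetObject(new GetObjectRequest
        {
            BucketName = _bucket,
            Key = ID.AbsolutePath.TrimStart('/')
        });
        return response.ResponseStream;
    }

    // OpenWrite also needs an override (omitted here): the CMS writes the uploaded
    // file into the stream it returns, so that is where the bytes reach the bucket.
}

public class R2BlobProvider : BlobProvider
{
    // Placeholder wiring; for R2 the ServiceURL would point at the account endpoint.
    private static readonly IAmazonS3 Client = new AmazonS3Client(
        "ACCESS_KEY", "SECRET_KEY",
        new AmazonS3Config { ServiceURL = "https://<account-id>.r2.cloudflarestorage.com" });

    private const string Bucket = "my-bucket";

    // No file data exists yet at this point; the CMS only needs a Blob instance
    // whose OpenWrite it can call later to stream the upload into.
    public override Blob CreateBlob(Uri id, string extension)
    {
        return GetBlob(Blob.NewBlobIdentifier(id, extension));
    }

    public override Blob GetBlob(Uri id)
    {
        return new R2Blob(id, Client, Bucket);
    }

    // Note: as far as I recall, the id here can also be a container id covering all
    // blobs of a content item; this sketch only handles a single blob key.
    public override void Delete(Uri id)
    {
        Client.DeleteObject(new DeleteObjectRequest
        {
            BucketName = Bucket,
            Key = id.AbsolutePath.TrimStart('/')
        });
    }
}

The provider still has to be registered as a blob provider in the episerver.framework configuration (or from code), just like any other provider.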

#320050
Apr 04, 2024 23:00

Thank you both very much, this is very helpful and seems pretty straightforward. Working through it now.

#320059
Apr 05, 2024 13:36
Stefan Holm Olsen - Apr 05, 2024 14:56
Happy to hear that.
Stefan Holm Olsen - Apr 05, 2024 14:57
Would be interesting to hear your experiences with Cloudflare R2 afterwards. If you are allowed to.

So I've finally gotten back to working through this. It's pretty straightforward, but I have a fundamental question: I'm still struggling to see where I get a reference to the byte array of the file that has just been uploaded from the client. Our storage of these files will ultimately be in S3 buckets, but upon upload the file must come to the server in some temp directory, and from there I'm expecting to have the file stream to upload to S3. In SqlBlobProvider I see the following for CreateBlob()... on my end, the Save() implementation would be swapped out for an S3 bucket upsert. I'm failing to see where SqlBlobProvider actually gets the file stream of the blob and stores it in SQL.

public override Blob CreateBlob(Uri id, string extension)
{
    var sqlBlobModel = new SqlBlobModel
    {
        BlobId = Blob.NewBlobIdentifier(id, extension)
    };
    SqlBlobModelRepository.Save(sqlBlobModel);
    return GetBlob(sqlBlobModel.BlobId);
}

public static void Save(SqlBlobModel blob)
{
    SqlBlobStore.Save(blob, blob.Id);
}
#320332
Apr 11, 2024 18:51

I'm definitely missing something here, but I'd expect something like the following

public override Blob CreateBlob(Uri id, string extension)
{
    var blobId = Blob.NewBlobIdentifier(id, extension);
    var blob = GetBlob(blobId);
    var customBlobModel = new CustomBlobModel
    {
        BlobId = blobId,
        Blob = blob.OpenWrite()?.ReadAllBytes()
    }; 

    CustomBlobModelRepository.Save(customBlobModel);
    return blob;
}

public void Save(CustomBlobModel blob)
{
    if (blob == null)
        return;

    var fileStream = new MemoryStream();
    fileStream.Write(blob.Blob, 0, blob.Blob.Length);

    var putRequest = new PutObjectRequest
    {
        Key = DetermineS3Key(blob.BlobId.Segments),
        BucketName = _s3BucketName,
        InputStream = fileStream,
        DisablePayloadSigning = true
    };

    _s3Client.PutObject(putRequest);
}
#320333
Apr 11, 2024 19:36

It looks like the initial stream in OpenWrite was the folder; that's why the byte array length was 0. I was able to get read/write working.
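
For anyone hitting the same thing: the pattern is essentially to buffer whatever the CMS writes into the stream returned by OpenWrite and push it to the bucket when that stream is closed. A rough sketch (BufferedUploadStream is a made-up helper and the S3 wiring is illustrative, same made-up R2Blob shape as the sketch earlier in the thread):

using System;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;
using EPiServer.Framework.Blobs;

// Made-up helper: a MemoryStream that hands its buffered contents to a callback when
// it is closed, which is when the CMS has finished writing the upload. It buffers the
// whole file in memory, which is fine for a sketch.
public class BufferedUploadStream : MemoryStream
{
    private readonly Action<byte[]> _onClose;
    private bool _flushed;

    public BufferedUploadStream(Action<byte[]> onClose)
    {
        _onClose = onClose;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing && !_flushed)
        {
            _flushed = true;
            _onClose(ToArray());
        }
        base.Dispose(disposing);
    }
}

public class R2Blob : Blob
{
    private readonly IAmazonS3 _client;
    private readonly string _bucket;

    public R2Blob(Uri id, IAmazonS3 client, string bucket) : base(id)
    {
        _client = client;
        _bucket = bucket;
    }

    // The CMS writes the uploaded file into this stream; once it disposes the stream,
    // the buffered bytes are pushed to the bucket.
    public override Stream OpenWrite()
    {
        return new BufferedUploadStream(bytes =>
            _client.PutObject(new PutObjectRequest
            {
                BucketName = _bucket,
                Key = ID.AbsolutePath.TrimStart('/'),
                InputStream = new MemoryStream(bytes)
            }));
    }
}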

Now my next issue is figuring out why Delete is not firing. After moving an asset to the trash and then removing it from the trash, the following override does not seem to fire:

public override void Delete(Uri id)

However, I am able to bind to the DeletingContent event like so (handler shape sketched below). Will this suffice?

var events = ServiceLocator.Current.GetInstance<IContentEvents>();
events.DeletingContent += DeleteSqlBlobProviderFiles;
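
For reference, the handler would be shaped something like this (sketch only, the actual cleanup is omitted):

private void DeleteSqlBlobProviderFiles(object sender, ContentEventArgs e)
{
    // Only media items carry blobs.
    if (e.Content is MediaData)
    {
        // ((MediaData)e.Content).BinaryData.ID identifies the blob to remove from the bucket.
    }
}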
#320524
Apr 15, 2024 14:03

Hi Todd,

Actually, the Delete method in the blob provider is fired when the "Remove Abandoned BLOBs" scheduled job runs. So you only need to run that job manually, or let it run on its schedule, to clean up the binary data for deleted media content.

#320570
Apr 16, 2024 4:23

Thank you Binh, I would not have found that.  Once an asset is deleted from the trash bin, the Delete() override method in my custom blob provider fires when running the Remove Abandoned BLOBs job.

#320734
Apr 19, 2024 17:35