
Jonas Bergqvist
Jan 31, 2011

New features in Dynamic Data Store (DDS)

Working on the “LinqToDDS” project is the most fun I’ve ever had at work. Therefore, it’s a pleasure to announce new LINQ support in the R2 wave.

Earlier, we only supported the deferred execution methods "ToList", "ToArray", "ToDictionary", "ToLookup", and "Count". In the R2 wave, we have added support for "First", "FirstOrDefault", "Single", "SingleOrDefault", "Last", and "LastOrDefault". We have also added support for multiple group by, "Contains" on .NET collections, and predicate support on "Count".

Single object support

When developers used the LINQ support to query DDS for a single object, they had to write a query like ".ToList().First()". This isn't only ugly, it's also inefficient: such a query retrieves all identities matching the query and puts them into memory, then takes the first item and queries the database for the inline properties of that item (if the object is in the cache, it is taken from the cache instead).

Now, we’ve added support for “First”, “FirstOrDefault”, “Single”, “SingleOrDefault”, “Last”, and “LastOrDefault”. All of those both with and without predicates.

First and Last

When making a query directly against "First" (the same goes for "Last"), the database only returns the inline properties of the first/last object matching the query, which is everything needed to recreate the object for you.
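
A minimal sketch of what this looks like (the "Jonas" filter value is made up for the example):

// Without a predicate: retrieves only the first object in the store
var first = GetStore().Items<MyClass>().First();

// With a predicate: retrieves only the last object matching the condition
var last = GetStore().Items<MyClass>().Last(p => p.FirstName == "Jonas");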

Single

When making a query against "Single", we do the same as for "First" and "Last", but we try to retrieve the first two objects matching the query instead of one. If we get two objects back, we throw an exception, because "Single" should only be used when you expect exactly one match.
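
For example (a sketch that assumes "FirstName" is unique in the store):

// Throws if more than one object matches the predicate
var single = GetStore().Items<MyClass>().Single(p => p.FirstName == "Jonas");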

Defaults

When using "FirstOrDefault", "SingleOrDefault", or "LastOrDefault", we return default(T) if the query didn't return any result from the database. For "First", "Single", and "Last", an exception is thrown if no result was found.
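
A small sketch of the difference, assuming MyClass is a reference type so that default(T) is null (the "Unknown" value is made up):

// Returns null instead of throwing when nothing matches
var maybe = GetStore().Items<MyClass>().FirstOrDefault(p => p.LastName == "Unknown");
if (maybe == null)
{
    // handle the missing object here instead of catching an exception
}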

Multiple group by support

Earlier, we only supported grouping on a single property. This could be a problem if you, for example, want to sum the number of visits on a page for a specific group. Then both page and group need to be grouped, so that you get the sum of visits for the rows where page and group are the same.

Now, this is possible. A query could look like:

var groupedQuery = GetStore().Items<MyClass>()
    .GroupBy(myClass => new { Page = myClass.PageId, Group = myClass.AccessGroup })
    .Select(grouped => new { Page = grouped.Key.Page, Group = grouped.Key.Group, Hits = grouped.Sum(s => s.Visits) });

Contains on .NET collections

If you have a .NET list and want to check whether a property in the DDS has the same value as any of the items in the list, we now have support for that.

var myList = new List<string>();
myList.Add("Jonas");
myList.Add("Erik");
myList.Add("Per");
var query = GetStore().Items<MyClass>()
    .Where(p => myList.Contains(p.FirstName));

Count with predicate

In earlier versions, it wasn't possible to use a predicate in the "Count" deferrer. This is now fixed.

var query = GetStore().Items<MyClass>().Count(p => p.LastName == "Bergqvist");

Comments

Magnus Rahl Jan 31, 2011 09:18 AM

Super! Keep up the good work!

Jan 31, 2011 09:50 AM

That's good news, thank you :)

Anders Hattestad Jan 31, 2011 09:57 AM

Do you still do 100 database calls when I do LoadAll().ToList() and there are 100 rows in the store?

Jan 31, 2011 10:10 AM

Anders => That's still the behavior, yes.

As you probably know, when you are using the LINQ functionality, the cache mechanism will prevent this, if you have loaded the object with the specific identity before.

What you can do now, to always receive everything directly, is to use the "Select" method in your query. We will then only retrieve the properties you want, and you can then create your object using those properties. (...Select(p => new { p.Id, p.Name, p.Other }).ToList();)

smithsson68@gmail.com Jan 31, 2011 10:28 AM

Anders =>

Give me a good reason why you would want to do something like this in a production website.

For those of you who don't know what Anders is getting at, it works like this:
- LINQ, Find and LoadAll calls to the DDS only actually load the Id's of the matching items.
- The IEnumerable object returned lazy loads each full object into memory on demand

This is a trade-off between low memory footprint and speed. As items get cached, you then benefit from both speed and a manageable memory footprint (the ASP.NET cache is used).

Of course we could "pre-load" an arbitrary number of objects into memory instead of just the Id's, but what is the correct level: 10, 100, 1000?

Anybody else want to give us some feedback on this?

Anders Hattestad Jan 31, 2011 10:51 AM

:)
I guess if you, for instance, have a store with 100-200 rows, and you need to load them all to do stuff with them, the LoadAll method sounds like it does what it should... Load All...

You could make a method named ActuallyLoadAll that loads all the elements from the store... one overload could take the number of elements to load...

If I have a list of words in the DDS and want to check a text against those words, 100 database calls sound wrong...

This is the same "problem" we have with EPiServer's PageData behavior. There it is also one database call for each page.
This sounds wrong in my head... Why do 100 (or actually 101) round trips to the database to retrieve 100 objects?

smithsson68@gmail.com Jan 31, 2011 11:25 AM

Anders =>

In that case I would design my object as a container and collection:

class Words
{
    public List<string> Items { get; set; }
}

Then, when you load your singleton Words object, you will get all its constituent items in two database calls (one for the Words object and one for the Items collection).

Anders Hattestad Jan 31, 2011 11:35 AM

I guess we can agree to disagree on this one :)

patrik.akselsson@valtech.se Jan 31, 2011 03:39 PM

NHibernate has had batching for quite some time and it seems to be a great performance boost (http://knol.google.com/k/fabio-maulo/nhibernate-chapter-16-improving/1nr4enxv3dpeq/19#16(2E)1(2E)5(2E)(C2)(A0)Using_batch_fetching).

As for the use case you are talking about, I think it makes sense to load all objects in one select, or possibly two (one for the ids and one IN-clause for all objects not in the cache). When you iterate over LoadAll, you expect to use all items in the collection. If you wanted a subset, you would use Skip, Take, First or any of the other methods that limit the scope.
