Performance Improvement Tips for ForEach loop in C#?

I need to optimize the foreach loop below. The loop is taking too much time to get the unique items.

  1. Can FilterItems be converted into a list collection instead? If so, how do I do it? Then I can take the unique items from it easily.

  2. The problem arises when I have 500,000 items in FilterItems.

Please suggest some ways to optimize the below code:

int i = 0; 
List<object> order = new List<object>();
List<object> unique = new List<object>();
// FilterItems is a collection of records. Can this be converted to a list
// collection directly, so that I can take the unique items from it?
foreach (Record rec in FilterItems)
{
  string text = rec.GetValue("Column Name");
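  // BinarySearch returns the bitwise complement of the insertion index
  // when the value is not found, so a negative result means "not present".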
  int position = order.BinarySearch(text);
  if (position < 0)
  {
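    // ~position (== -position - 1) is the index where the value would be
    // inserted to keep the list sorted.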
    order.Insert(-position - 1, text);
    unique.Add(text);
  }
  i++;
}
Jon Skeet

It's unclear what you mean by "converting FilterItems into a list" when we don't know anything about it, but you could definitely consider sorting after you've got all the items, rather than as you go:

var strings = FilterItems.Select(record => record.GetValue("Column Name"))
                         .Distinct()
                         .OrderBy(x => x)
                         .ToList();

The use of Distinct() here will avoid sorting lots of equal items - it looks like you only want distinct items anyway. It also avoids the main cost of the original loop: List.Insert has to shift every element after the insertion point, so inserting as you go is O(n²) overall, while sorting once at the end is O(n log n).

If you want unique to be in the original order but order to be the same items, just sorted, you could use:

var unique = FilterItems.Select(record => record.GetValue("Column Name"))
                        .Distinct()
                        .ToList();
var order = unique.OrderBy(x => x).ToList();

Now Distinct() isn't guaranteed to preserve order - but it does so in the current implementation, and that's the most natural implementation, too.
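
If you'd rather not rely on that implementation detail, you can make the first-seen ordering explicit with a HashSet<string> - a minimal sketch along the same lines, reusing the Record, FilterItems and GetValue names from the question:

var seen = new HashSet<string>();
var unique = new List<string>();
foreach (Record rec in FilterItems)
{
    string text = rec.GetValue("Column Name");
    // Add returns false when the value is already present, so each check
    // is O(1) on average rather than the O(n) Insert in the original loop.
    if (seen.Add(text))
    {
        unique.Add(text);
    }
}
var order = unique.OrderBy(x => x).ToList();

This keeps unique in encounter order by construction and sorts the deduplicated values only once at the end.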


See more on this question at Stack Overflow