
Friday, June 3, 2011

Entity Framework Add Function Import - Get Column Information returns nothing


I needed to update a SQL stored procedure in our entity data model. In the Model Browser I clicked "Update Model from Database" and then double-clicked the SP under Function Imports to open the "Edit Function Import" wizard, so I could create the complex return type for the SP. To my surprise, the wizard told me that "The selected stored procedure returns no columns". I knew for a fact that the SP returned several columns.

After a little Google searching I found a solution in a Microsoft forum post from "Brian":

Quote: "

it seems Entity Framework will try to get the columns by executing your Stored Procedure, passing NULL for every argument. You can see this by putting a trace on your SQL Server.

So - first make sure your SP will return something under these circumstances. Note it may have been smarter for Entity Framework to execute the stored proc with default values for arguments, as opposed to NULLs. Never mind - nothing we can do about that!

However - before trying to run the stored procedure, EF does this:

SET FMTONLY ON

This will break your stored procedure in various circumstances, in particular, if it uses a temp table.

So, add to the start of your Stored Procedure:

SET FMTONLY OFF;

This worked for me - hope it works for you too.

brian

"
Another workaround, or maybe a better solution, is to not use temp tables at all - that is, if you don't need to index those tables.

If you use table variables instead, the SET FMTONLY ON option will not break your stored procedure. Performance-wise they seem about the same, apart from the indexing options you get on temp tables.

Thursday, June 11, 2009

How to get the values of an enum?


Sometimes you need to list the values of your enums. Typically, I have to present them in drop-down boxes on a web page. Here's my solution:

public static ListItem[] GetEnumValues(Type t)
{
    List<ListItem> lc = new List<ListItem>();
    lc.Add(new ListItem() { Text = "Choose an item", Value = "-1" });

    if (t.UnderlyingSystemType.BaseType == typeof(System.Enum))
    {
        foreach (int value in Enum.GetValues(t))
        {
            lc.Add(new ListItem() { Text = Enum.GetName(t, value), Value = value.ToString() });
        }
    }

    return lc.ToArray();
}


Then, to use it in a control, use the Items.AddRange() function:

_dropdown.Items.AddRange(SomeClass.GetEnumValues(typeof(myEnumType)));

Plain and easy, but as far as I know, not available directly from .NET.
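
If you happen to be on a newer compiler (C# 7.3 or later allows an Enum generic constraint - not available when this was written), a generic variant of the same idea is possible. This is just a sketch along the lines of the method above:

public static ListItem[] GetEnumValues<TEnum>() where TEnum : struct, Enum
{
    List<ListItem> lc = new List<ListItem>();
    lc.Add(new ListItem() { Text = "Choose an item", Value = "-1" });

    foreach (TEnum value in Enum.GetValues(typeof(TEnum)))
    {
        // Enum.GetName gives the member name, Convert.ToInt32 the underlying value.
        lc.Add(new ListItem() { Text = Enum.GetName(typeof(TEnum), value), Value = Convert.ToInt32(value).ToString() });
    }

    return lc.ToArray();
}

Usage then becomes _dropdown.Items.AddRange(SomeClass.GetEnumValues<myEnumType>());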

Friday, May 29, 2009

Performance testing of Dictionary, List and HashSet


Update June 4, 2009
The original post favored Dictionary as the fastest data structure for both Add and Contains. After some debate and some re-runs of my test code, I found that result to be wrong: HashSet is faster. The results were probably affected by the workload on my computer - I ran one List test, then a Dictionary test and finally a HashSet test, when I should have run them multiple times under the same workload (i.e. with no other programs or processes running). Anyway, on to the article.

Still working with high-volume real-time data feeds, I'm struggling to understand where the bottleneck in my code is. It's not the database, it's not the network - it's somewhere in the code. Now that doesn't help a lot.

So I decided to have a look at the different data structures that might suit my needs. Right now I'm using a List to keep track of my orders, and each time I get a new order, I check that list for the given ID.

That got me wondering whether there is a performance issue with the List data structure, so I made a small test with the following code:

static void TestList()
{
    var watch = new Stopwatch();
    var theList = new List<int>();

    watch.Start();

    //Fill it
    for (int i = 0; i < noOfIterations; i++)
    {
        theList.Add(i);
    }

    watch.Stop();
    Console.WriteLine("Avg add time for List: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));

    watch.Reset();
    watch.Start();

    //Test Contains
    for (int j = 0; j < noOfIterations; j++)
    {
        theList.Contains(j);
    }

    watch.Stop();
    Console.WriteLine("Avg Contains lookup time for List: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));
}

I created similar test code for Dictionary and HashSet. In all of the tests I used a loop where the given key always existed in the data structure. I used Add instead of Insert, because tests I've seen online show that it is much faster. For example, this one.

For the first run, I used 1,000 for the noOfIterations variable and ran it three times.

The horizontal scale here is in milliseconds, so things are reasonably fast. A value of 0.2 means 0.2 milliseconds per add - roughly 5,000 adds per second. As you probably can see without checking the numbers, Dictionary is faster. List is slower for lookup, but HashSet surprises a little with its slow Add. So what happens when we go to 100,000 items?

OK, List is still a lot slower for lookup. Add seems to compare OK though. Let's see how it compares if we remove the lookup time from the chart: (This is wrong, see "The aftermath".)

Now that's a result! It turns out List is your winner if fast adding is all you care about. If you want to look up your values later though, Dictionary is your winner. Hope this will save some time for other developers in the same situation. Also, feel free to comment if you find different results, or a problem with my test!

By popular request: The rest of the code!

static void TestDictionary()
{
    var watch = new Stopwatch();

    //Create Dictionary
    var theDict = new Dictionary<int, int>();

    watch.Start();

    //Fill it
    for (int i = 0; i < noOfIterations; i++)
    {
        theDict.Add(i, i);
    }

    watch.Stop();
    Console.WriteLine("Avg add time for Dictionary: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));

    watch.Reset();

    //Test ContainsKey
    watch.Start();
    for (int j = 0; j < noOfIterations; j++)
    {
        theDict.ContainsKey(j);
    }

    watch.Stop();
    Console.WriteLine("Avg Contains lookup time for Dictionary: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));
}

static void TestHashSet()
{
    var watch = new Stopwatch();

    //Create HashSet
    var hashSet = new HashSet<int>();

    //Fill it
    watch.Start();
    for (int i = 0; i < noOfIterations; i++)
    {
        hashSet.Add(i);
    }

    watch.Stop();
    Console.WriteLine("Avg add time for HashSet: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));

    watch.Reset();

    //Test Contains
    watch.Start();
    for (int j = 0; j < noOfIterations; j++)
    {
        hashSet.Contains(j);
    }

    watch.Stop();
    Console.WriteLine("Avg Contains lookup time for HashSet: {0}", (watch.Elapsed.TotalMilliseconds / noOfIterations));
}
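
The three test methods share a noOfIterations field that isn't declared in the snippets. For reference, a minimal harness for running them might look like this - the field name comes from the code above, everything else (class name, run count) is just an assumption:

using System;
using System.Collections.Generic;
using System.Diagnostics;

class PerformanceTests
{
    // Shared by the three test methods above; the post used 1,000 and 100,000.
    static int noOfIterations = 100000;

    static void Main()
    {
        // Run each test several times under the same workload,
        // as suggested in the update note above.
        for (int run = 0; run < 3; run++)
        {
            TestList();
            TestDictionary();
            TestHashSet();
        }
    }

    // TestList(), TestDictionary() and TestHashSet() from the post go here.
}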

The aftermath
Because of the controversy around HashSet being slower than Dictionary - despite storing only a value, while the Dictionary stores both a key and a value - I re-ran the test on my computer and on a colleague's. Instead of running the tests one by one, I ran the HashSet test three times and the Dictionary test three times, and took the averages. The result you can see below: HashSet is faster than Dictionary for both the Add and Contains methods.
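
Given those numbers, the order-tracking scenario from the introduction would probably be better served by a HashSet of seen IDs than by a List. A rough sketch - the Order type and its Id property are made up purely for illustration:

private readonly HashSet<int> _seenOrderIds = new HashSet<int>();

private void OnNewOrder(Order order) // Order is a hypothetical type
{
    // HashSet<T>.Add returns false if the ID is already present,
    // so the duplicate check and the insert are a single O(1) operation.
    if (!_seenOrderIds.Add(order.Id))
    {
        return; // duplicate order, ignore it
    }

    // ...process the new order...
}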


Tuesday, February 12, 2008

Illegal Cross Thread Operation?


I see many forum posts solving this problem by simply adding Control.CheckForIllegalCrossThreadCalls = false;

While this makes the error go away, it is highly thread-unsafe and should only be used for debugging purposes. To update form components from another thread, you need to use delegates and Invoke. Sounds scary? Not really.

A delegate is simply a "blueprint" that describes how a particular function or method should look. In the C++ world, delegates are similar to function pointers, but since we don't work with pointers in C#, this is the closest we get. Still, "function pointer" is not quite the right definition, as the delegate won't point to a function unless you make it do so. Here's an example:

The function


private void UpdateStatusText(string statusText)
{
    statusLabel.Text = statusText;
}


Would have the following delegate:


public delegate void UpdateStatusTextHandler(string statusText);


Both the delegate name and the parameter name(s) are arbitrary; it is the return type and parameter types that matter.

Let's say we have an event handler thread_OnStatusChanged(string newStatus) that needs to update the statusLabel.Text property. If it is raised on a different thread from the one the statusLabel was created on, you will get the illegal cross-thread operation error if you simply try to update it.

The first thing we want to do is check whether the call comes from a different thread.
Inside the thread_OnStatusChanged event handler we do the following:



if (statusLabel.InvokeRequired)
{
    statusLabel.Invoke(new UpdateStatusTextHandler(UpdateStatusText), newStatus);
}
else
{
    UpdateStatusText(newStatus); // already on the UI thread, call it directly
}



The Invoke method takes a Delegate as its first argument, plus an optional params object[] so you can pass the required arguments to your method.

So by writing new UpdateStatusTextHandler(UpdateStatusText) you are simply creating a pointer to the method UpdateStatusText(string statusText).
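
Putting the pieces together, here is a minimal sketch of the whole pattern. It assumes a WinForms form with a statusLabel control and some worker that raises a status event from a background thread - the form and its wiring are hypothetical, only the names from the snippets above are reused:

using System.Windows.Forms;

public partial class MainForm : Form
{
    // The delegate describing the method we want to invoke on the UI thread.
    public delegate void UpdateStatusTextHandler(string statusText);

    // Event handler that may be called from a background thread.
    private void thread_OnStatusChanged(string newStatus)
    {
        if (statusLabel.InvokeRequired)
        {
            // Marshal the call over to the thread that owns statusLabel.
            statusLabel.Invoke(new UpdateStatusTextHandler(UpdateStatusText), newStatus);
        }
        else
        {
            // Already on the UI thread, call the method directly.
            UpdateStatusText(newStatus);
        }
    }

    private void UpdateStatusText(string statusText)
    {
        statusLabel.Text = statusText;
    }
}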



Hopefully this will make delegates and invoking a little more understandable.

Good luck with your threads!

Tuesday, October 16, 2007

Using using


When writing code that uses a lot of unmanaged resources, all those objects should be disposed of before you exit the method. The using keyword is often forgotten, and is also sometimes seen as messy. Example:

using (Bitmap bitmap1 = new Bitmap(100, 100))
{
    Console.WriteLine("Width:{0}, Height:{1}", bitmap1.Width, bitmap1.Height);
}

should basically be the same as

Bitmap bitmap1 = new Bitmap(100, 100);
Console.WriteLine("Width:{0}, Height:{1}", bitmap1.Width, bitmap1.Height);
bitmap1.Dispose();

shouldn't it?
No, not really... The using-clause does a little more than this. Actually, my using-example is equivalent to the following block of code.

Bitmap bitmap1 = new Bitmap(100, 100);
try
{
    Console.WriteLine("Width:{0}, Height:{1}", bitmap1.Width, bitmap1.Height);
}
finally
{
    if (bitmap1 != null)
    {
        bitmap1.Dispose();
    }
}


Imagine writing nested using-clauses, and try writing the expanded code for that :)
Using using really cleans up your code, and your memory. So use it!
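
For what it's worth, nested using statements don't have to be deeply indented either - they can be stacked, and the compiler still expands each one into its own try/finally. A small example with the Bitmap from above:

using (var bitmap1 = new Bitmap(100, 100))
using (var graphics = Graphics.FromImage(bitmap1))
{
    // Both graphics and bitmap1 are disposed when this block exits,
    // in reverse order of creation.
    graphics.Clear(Color.White);
}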


Edit: Oh, as Stian commented, using should be used for all objects that implement the IDisposable interface. But remember - if you plan on implementing this interface in your own classes, you need to make sure they fully implement all the necessary cleanup. Otherwise it will all go to waste :)
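
If you do implement IDisposable yourself, the usual dispose pattern looks roughly like this - a sketch with a made-up class name; fill the managed/unmanaged parts with whatever your class actually holds:

public class ResourceHolder : IDisposable // hypothetical class for illustration
{
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        // No need for the finalizer to run once we've cleaned up.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed)
        {
            return;
        }
        if (disposing)
        {
            // Dispose managed resources here.
        }
        // Free unmanaged resources (handles, pointers, etc.) here.
        _disposed = true;
    }

    // Finalizer as a safety net if Dispose is never called.
    ~ResourceHolder()
    {
        Dispose(false);
    }
}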