Unless you’re writing a small academic application that no one will ever use, if you target SQL Server, make sure to use SQL Server Profiler to avoid running into performance problems.
In 99% of all performance problems, the real cause is badly written code. Especially when using abstraction layers like Linq2SQL, Entity Framework and even pure ADO.NET, the developer rarely knows what actual SQL queries are generated, and most of the time doesn’t really know when those queries are executed.
The most useful method for performance optimization is therefore understanding which queries are generated. So here are some tips:
Additional tip: if multiple applications are hitting the same SQL Server, set the application name in your connection string so that you can filter the queries in SQL Server Profiler by the application name, in web.config:
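A connection string with an application name set might look like this in web.config (the server, database and application names here are placeholders, not the ones from the original post):

```xml
<connectionStrings>
  <!-- "Application Name" shows up in the ApplicationName column in SQL Server Profiler,
       so you can filter traces down to this one application -->
  <add name="MainDb"
       connectionString="Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True;Application Name=MyWebApp"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```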
or built through code:
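A minimal C# sketch using SqlConnectionStringBuilder (server and database names are placeholders):

```csharp
using System.Data.SqlClient;

var builder = new SqlConnectionStringBuilder
{
    DataSource = ".",                // placeholder server
    InitialCatalog = "MyDatabase",   // placeholder database
    IntegratedSecurity = true,
    ApplicationName = "MyWebApp"     // appears in Profiler's ApplicationName column
};

using (var connection = new SqlConnection(builder.ConnectionString))
{
    connection.Open();
    // ... run queries; Profiler can now filter on "MyWebApp"
}
```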
During an optimization session for an ASP.NET application based on Entity Framework (and Linq to Entities), I discovered a simple optimization method based on rewriting the Linq to Entities queries for many-to-many relationships.
In a simple scenario, we have CareTypes, Establishments and a many-to-many table between them: EstablishmentCareTypes. If we want to fetch all CareTypes for a given Establishment, we would write the following TSQL query:
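A sketch of such a query (the column names are assumptions, since the actual schema isn’t shown):

```sql
SELECT ct.*
FROM CareTypes ct
INNER JOIN EstablishmentCareTypes ect ON ect.CareTypeId = ct.Id
WHERE ect.EstablishmentId = @establishmentId
```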
Using SQL Server Management Studio you can see the estimated execution plan, which is rather simple:
In Linq to Entities, the query was written like this:
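It might have looked roughly like this, navigating from the establishment down to its care types (entity and property names are assumptions based on the scenario above):

```csharp
// Hypothetical EF model: Establishment has a CareTypes navigation property
var careTypes = (from e in context.Establishments
                 where e.Id == establishmentId
                 from ct in e.CareTypes
                 select ct).ToList();
```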
This query does the job, but has the following generated TSQL:
This has a rather complicated execution plan:
Obviously not what we would expect, given the execution plan of our manually written TSQL!
After some Internet research, I wrote the Linq to Entities query like this:
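The rewritten query approaches the relationship from the CareTypes side instead (again, a sketch with assumed entity and property names):

```csharp
// Query the CareTypes set directly, filtering on the many-to-many
// navigation property; this tends to translate to a simpler EXISTS-style query
var careTypes = context.CareTypes
    .Where(ct => ct.Establishments.Any(e => e.Id == establishmentId))
    .ToList();
```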
This returns the exact same objects, but has the execution plan we would expect:
The generated TSQL is also much simpler (and shorter):
The SQL Server Profiler also shows a big improvement over the previous query:
The difference in the effort SQL Server needs to process the two queries is astonishing. If the two referenced tables were very large, a lot of users accessing the ASP.NET application would eventually push SQL Server to its limits.
I also noticed that some queries were executed twice for the same request; introducing caching brought a further considerable performance gain.
An important aspect of this type of improvement is that it’s practically risk free: rewriting these queries is very easy, without having to worry about introducing bugs.
No idea why, but today I was unable to give permissions to a domain user on a specific project. Normally, all it takes is adding the user to the [Project]\Contributors group from within Visual Studio. I did that, but the user could not see the sources in Source Control Explorer. He received this error:
Either source control has not been configured for this team project or you do not have permission to access it. Would you like to create the source control folder, $/ProjectName ?
After some more experiments, I ran the following command line to check his permissions for the project:
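The command was probably of this form, using tf.exe’s permission command, which lists the effective source control permissions for an item (the project name is a placeholder; run it from a mapped workspace or add a server option):

```
tf permission $/ProjectName
```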
He did not show up like the rest of the users, so I tried to grant him permissions through the command line:
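A sketch of such a command, granting explicit source control permissions with tf.exe (the domain, user name and permission list are placeholder assumptions):

```
tf permission /allow:Read,PendChange,Checkin /user:DOMAIN\username $/ProjectName
```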
After this he showed up using the first command line. Then I had to add him again to the [Project]\Contributors group and everything was fine after that.
The only problem I have right now is that I don’t know what caused all this trouble.
Using Live Writer for blog posting, WordPress as the actual blog host and Tweet It plugin for Live Writer.
Are there better options for doing blogs and tweets at the same time for WordPress?
Edit: actually the plugin’s name is Twitter Update
Yesterday I tried to migrate from a Toshiba Satellite L300 to an L500. I made some plans, did some research and realized it would take a lot of time to reinstall and migrate the settings of all the applications I have installed (Add/Remove Programs lists 350 items!). So I searched for a tool to automate this process and found PCmover. I installed it on both notebooks (both running Windows 7), started migrating everything over the network and left it running while continuing to work normally on the old notebook. After 12 hours it had finished migrating everything (50 GB, compressed to about 20 GB), and I was surprised by how much it managed to do on its own. There were problems with the settings of some programs (like Total Commander Ultima Prime, Firefox and Visual Studio), but I was very happy with the results. Note, however, that nothing I did after starting the migration process was carried over (like chat history and settings I changed while the migration was in progress).
I also found a good tip: make sure there are no programs installed on the new computer. PCmover did not manage (or intentionally chose not) to migrate the settings of already installed apps.
In the end I erased everything from the new notebook and stuck with the old one, because it has a taller screen, a nicer keyboard (without the num pad) and because… well, I’m too used to the old one, and the performance improvement I would have gained was not worth the trouble.
But nonetheless, PCmover is something I would recommend!
Just migrated to WordPress from Live Spaces. It’s interesting that Microsoft admits that others are better, e.g. that WordPress offers a better blogging experience. This also shows (as do so many other sites) that you can be even better than Microsoft in your specific area.