Sunday, November 29, 2009

Embed Silverlight in your blog using Windows Live Writer

 

This is amazing. You can embed a Silverlight app in your blog using Windows Live Writer and this plug-in. For example, here is the Silverlight showcase application from the home page of http://silverlight.net.


Tuesday, October 13, 2009

Death to Visual Source Safe! ... TFS 2010 Basic

The biggest announcement about TFS 2010:

"Price – We’re not quite ready to announce the pricing and licensing for 2010 yet but I can tell you that it will be at least as easy and cost effective to get as SourceSafe has been.  Stay tuned for more info on this." great post.

That means that instead of roughly $20,000 in licensing (the current cost of TFS for a small development team), I am speculating that it might be included with a Visual Studio/MSDN Universal license.

Visual SourceSafe was the most popular source control system in the world in the '90s. Now it is the most hated source control system in the world (for good reasons). I hate it when VSS asks me: "Would you like to run the analyze utility to make sure your files are not corrupt?" But the one thing VSS had (and has) going for it is that it is easy to set up and use. TFS 2010 Basic aims to make setup just as easy (no SharePoint dependency).

Screenshot of  TFS Basic Configuration Wizard (easy easy):

image

TFS is awesome. I fell in love with it while using it for free on CodePlex, building SharePoint SUSHI.

Tuesday, October 06, 2009

jQuery popularity

 

Some interesting statistics about the popularity of jQuery from the jQuery Conference.

- Microsoft.com uses jQuery!

- 21% of all sites on the Internet use jQuery!

 

(Several charts from the conference presentation followed here; the usage research was done by Google.)

Saturday, September 26, 2009

DataGeneral: Lightweight ADO.NET wrapper replacing Enterprise Library Data Access Application Block

 

The DataGeneral class wraps the most often used features of ADO.NET in a single, simple-to-use class, boosting developer productivity.

DataGeneral is lightweight for maximum performance, employs Microsoft best practices, and provides the best features of the Enterprise Library Data Access Application Block. In my experience, the Enterprise Library application blocks are great for learning which features of the .NET Framework are available and which coding techniques are considered best practice (hence the name "patterns and practices"). But I don't like the weight of the Enterprise Library, or the random errors I get when versions are out of sync across environments. That is why I wrote DataGeneral, which is just a single wrapper class. With DataGeneral you will have fewer lines of code to debug. DataGeneral is specific to Microsoft SQL Server.

For example, one of the features of the Data Access Block is stated as: "By changing the settings in the configuration file, developers can use their applications with different database configurations without recompiling their code." This makes me laugh: what application can have its most major component ripped out and replaced without recompiling the code? I would argue that this feature is useless, as are many of the other features in the Data Access Block. However, the Enterprise Library has many good features, like saving you lines of code when building SqlCommands with SqlParameters. That's why I wrote DataGeneral: all the good parts without the useless parts.

DataGeneral has been used in several enterprise applications currently in production. On each project it has been further refined and perfected.

 

Example 1) DataGeneral Source Code:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

namespace DataAccess
{
    public class DataGeneral
    {
        private static SqlConnection createConnection()
        {
            string sqlCn = System.Configuration.ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString;
            SqlConnection cn = new SqlConnection(sqlCn);
            cn.Open();
            return cn;
        }

        /// <summary>
        /// Wraps the SqlCommand.ExecuteReader() method.
        /// </summary>
        /// <param name="procedureNameOrSql">Stored procedure name or a Sql statement.</param>
        /// <param name="parameters">List of SqlParameter. Set to null if no parameters.</param>
        /// <param name="isStoredProcedure">True if the procedureNameOrSql is a stored procedure, false if it is a SQL statement.</param>
        /// <returns>Returns a SqlDataReader which MUST be wrapped in a using statement so that its SqlConnection is closed as soon as the SqlDataReader is disposed.</returns>
        public static SqlDataReader ExecuteReader(string procedureNameOrSql, List<SqlParameter> parameters, bool isStoredProcedure)
        {
            //IMPORTANT: make sure you wrap the returned SqlDataReader in a using statement so that it is closed. (You do not need to close the SqlConnection object.)
            SqlConnection cn = createConnection();

            SqlCommand cmd = new SqlCommand(procedureNameOrSql, cn);

            if (isStoredProcedure)
                cmd.CommandType = CommandType.StoredProcedure;
            if (parameters != null)
                cmd.Parameters.AddRange(parameters.ToArray());
            return cmd.ExecuteReader(CommandBehavior.CloseConnection);
        }

        /// <summary>
        /// Wraps the SqlCommand.ExecuteScalar() method.
        /// </summary>
        /// <param name="procedureNameOrSql">Stored procedure name or a Sql statement.</param>
        /// <param name="parameters">List of SqlParameter. Set to null if no parameters.</param>
        /// <param name="isStoredProcedure">True if the procedureNameOrSql is a stored procedure, false if it is a SQL statement.</param>
        /// <returns>Returns the first value of the first row of the Sql Statement.</returns>
        public static object ExecuteScalar(string procedureNameOrSql, List<SqlParameter> parameters, bool isStoredProcedure)
        {
            object scalarValue;
            using (SqlConnection cn = createConnection())
            {
                SqlCommand cmd = new SqlCommand(procedureNameOrSql, cn);
                if (isStoredProcedure)
                    cmd.CommandType = CommandType.StoredProcedure;
                if (parameters != null)
                    cmd.Parameters.AddRange(parameters.ToArray());
                scalarValue = cmd.ExecuteScalar();
            }
            return scalarValue;
        }

        /// <summary>
        /// Wraps the SqlCommand.ExecuteNonQuery() method.
        /// </summary>
        /// <param name="procedureNameOrSql">Stored procedure name or a Sql statement.</param>
        /// <param name="parameters">List of SqlParameter. Set to null if no parameters.</param>
        /// <param name="isStoredProcedure">True if the procedureNameOrSql is a stored procedure, false if it is a SQL statement.</param>
        /// <returns>Returns the number of rows affected by ExecuteNonQuery().</returns>
        public static int ExecuteNonQuery(string procedureNameOrSql, List<SqlParameter> parameters, bool isStoredProcedure)
        {
            int rowsAffected;
            using (SqlConnection cn = createConnection())
            {
                SqlCommand cmd = new SqlCommand(procedureNameOrSql, cn);
                if (parameters != null)
                    cmd.Parameters.AddRange(parameters.ToArray());
                if (isStoredProcedure)
                    cmd.CommandType = CommandType.StoredProcedure;

                rowsAffected = cmd.ExecuteNonQuery();
            }
            return rowsAffected;
        }

        public class ParamBuilder
        {
            private readonly List<SqlParameter> _parameters = new List<SqlParameter>();
            public List<SqlParameter> Parameters
            {
                get
                {
                    return _parameters;
                }
            }

            public void AddParam(SqlDbType sqlDbType, string paramName, object paramVal)
            {
                SqlParameter p = new SqlParameter(paramName, sqlDbType);
                p.Value = paramVal ?? DBNull.Value;
                _parameters.Add(p);
            }

            public SqlParameter AddOutputParam(SqlDbType sqlDbType, string paramName)
            {
                SqlParameter p = new SqlParameter(paramName, sqlDbType);
                p.Direction = ParameterDirection.Output;
                _parameters.Add(p);
                return p;
            }
        }
    }
}


Example 2) Sample Usage of ExecuteReader()



Note that you should wrap the SqlDataReader returned by ExecuteReader() in a using statement.



public static Address GetAddressByID(int addressID)
{
    Address address = null;

    DataGeneral.ParamBuilder paramBuilder = new DataGeneral.ParamBuilder();
    paramBuilder.AddParam(SqlDbType.Int, "@AddressID", addressID);

    using (SqlDataReader sqlDataReader = DataGeneral.ExecuteReader("Location_Address_GetByID", paramBuilder.Parameters, true))
    {
        if (sqlDataReader.Read())
        {
            address = PopulateAddressFromReader(sqlDataReader);
        }
    }

    return address;
}

private static Address PopulateAddressFromReader(SqlDataReader sqlDataReader)
{
    Address address = new Address();

    address.AddressID = (int)sqlDataReader["AddressID"];
    address.AddressLine1 = sqlDataReader["AddressLine1"] as string;
    address.AddressLine2 = sqlDataReader["AddressLine2"] as string;
    address.AddressLine3 = sqlDataReader["AddressLine3"] as string;
    address.City = sqlDataReader["City"] as string;
    address.Zipcode = sqlDataReader["ZipCode"] as string;
    address.CountryRefID = (int)sqlDataReader["CountryRefID"];
    address.CityLatitude = sqlDataReader["CityLatitude"] as float?;
    address.CityLongitude = sqlDataReader["CityLongitude"] as float?;
    address.ZipcodeLatitude = sqlDataReader["ZipcodeLatitude"] as float?;
    address.ZipcodeLongitude = sqlDataReader["ZipcodeLongitude"] as float?;

    return address;
}
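
For completeness, here is a minimal sketch of calling ExecuteScalar() through DataGeneral. The stored procedure name and its parameter are hypothetical; the point is that the first column of the first row comes back as an object that you convert yourself.

public static int GetAddressCountByCountry(int countryRefId)
{
    // Hypothetical stored procedure that returns a single COUNT(*) value.
    DataGeneral.ParamBuilder paramBuilder = new DataGeneral.ParamBuilder();
    paramBuilder.AddParam(SqlDbType.Int, "@CountryRefID", countryRefId);

    // ExecuteScalar returns the first column of the first row.
    object result = DataGeneral.ExecuteScalar("Location_Address_CountByCountry", paramBuilder.Parameters, true);
    return Convert.ToInt32(result);
}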


Example 3) Sample usage of ExecuteNonQuery()



Note that both inline SQL and stored procedures are supported. This example also demonstrates output parameters, which are useful when, for example, you are inserting into a table with an identity column and want to return the newly created identity value.



This example also demonstrates wrapping a call in a TransactionScope, which is another great feature of the .NET Framework (it lives in the System.Transactions namespace).



public static void AddAddress(Address address)
{
    DataGeneral.ParamBuilder paramBuilder = new DataGeneral.ParamBuilder();
    SqlParameter paramNewAddressId = paramBuilder.AddOutputParam(SqlDbType.Int, "@NewAddressID");
    paramBuilder.AddParam(SqlDbType.VarChar, "@AddressLine1", address.AddressLine1);
    paramBuilder.AddParam(SqlDbType.VarChar, "@AddressLine2", address.AddressLine2);
    paramBuilder.AddParam(SqlDbType.VarChar, "@AddressLine3", address.AddressLine3);
    paramBuilder.AddParam(SqlDbType.VarChar, "@City", address.City);
    paramBuilder.AddParam(SqlDbType.VarChar, "@StateProvinceRefID", address.StateProvinceRefID);
    paramBuilder.AddParam(SqlDbType.VarChar, "@ZipCode", address.Zipcode);
    paramBuilder.AddParam(SqlDbType.VarChar, "@CountryRefID", address.CountryRefID);
    paramBuilder.AddParam(SqlDbType.Float, "@CityLatitude", address.CityLatitude);
    paramBuilder.AddParam(SqlDbType.Float, "@CityLongitude", address.CityLongitude);
    paramBuilder.AddParam(SqlDbType.Float, "@ZipcodeLatitude", address.ZipcodeLatitude);
    paramBuilder.AddParam(SqlDbType.Float, "@ZipcodeLongitude", address.ZipcodeLongitude);

    // The column list matches the VALUES list; columns such as CreatedDate and UpdatedBy
    // are assumed to be handled by defaults in the table definition.
    const string sql =
        @"INSERT Address
            (AddressLine1, AddressLine2, AddressLine3, City, StateProvinceRefID, ZipCode, CountryRefID,
             CityLatitude, CityLongitude, ZipcodeLatitude, ZipcodeLongitude)
          VALUES
            (@AddressLine1, @AddressLine2, @AddressLine3, @City, @StateProvinceRefID, @ZipCode, @CountryRefID,
             @CityLatitude, @CityLongitude, @ZipcodeLatitude, @ZipcodeLongitude);

          SET @NewAddressID = SCOPE_IDENTITY();";

    using (TransactionScope transactionScope = new TransactionScope())
    {
        DataGeneral.ExecuteNonQuery(sql, paramBuilder.Parameters, false);
        transactionScope.Complete();
    }
    address.AddressID = (int)paramNewAddressId.Value;
}

Wednesday, August 05, 2009

Sticky Notes in Windows 7

It’s the little things that are big. I found a new feature in Windows 7 that I really like: Sticky Notes. They look and behave a lot like real sticky notes. I used to use real sticky notes like crazy: at the beginning of each day I would write all my tasks down on them, helping me, as Stephen Covey would say, keep first things first. Lately I've been using OneNote and/or Notepad. OneNote is a great app, but like any good thing, too much of it is bad. I'll still keep using OneNote every day, but for task lists and quick notes, Windows 7 Sticky Notes is definitely my new friend.

image

 

The leap to Windows 7

A month ago I took a risk and installed Windows 7 as my primary OS. Installing a beta version of an OS on the laptop that is critical to my job is not something I usually do, but fortunately it has paid off. My favorite Windows 7 features:

  • The performance is good (even though my laptop hardware is seriously lacking for a developer).
  • Reboot time is fantastic.
  • Stability has been good. (I got a video driver failure yesterday that didn't crash my machine; instead, the screen flickered and then Windows 7 told me the video driver had restarted. Hold the applause please.)

clip_image002

  • The Windows Key+P shortcut for switching from one to two monitors.

image

  • Accessible volume and network connections in the bottom right-hand corner. I open and close VPN connections constantly, so having this quickly accessible will delay my carpal tunnel.

image

image

 

  • Dragging windows from one monitor to another without having to double-click to resize and then re-maximize. We all have two monitors now, so this saves a lot of click-click-click, once again delaying my carpal tunnel.

image

Thursday, July 30, 2009

Catapult Kids Day

I work for a consulting company that is big on providing a great place to work. Catapult Systems was ranked the #2 best place to work in all of Texas last year and received the same impressive rank in 2006. For example, last weekend we had a Catapult Kids Day, when all the employees got to bring their kids into work. My 4-year-old son had a blast. Granted, his impression now is that I play mini-golf, watch movies, and shoot Nerf guns all day.

clip_image001

My son is the one in front making the funny face. I'm so proud of my little goofball :).

Wednesday, July 08, 2009

DIVs not Tables

My brother-in-law is an HTML expert extraordinaire and has been telling me for years that DIVs are just better than tables. I never believed him. I thought this was just inherited open-source-community hatred towards Microsoft’s server controls that spit out tables everywhere. But I have been convinced! There is actually a simple, demonstrable, measurable benefit to using DIVs over tables. A common scenario is displaying two columns of information with the label in the left column and the value in the right column. See the code below: the DIV option is definitely cleaner and requires 7 tags while the table version requires 11. The <label> tag is also more descriptive than a <td> tag.

Granted, in order to make your DIVs line up correctly, you have to use styles, while tables do this by default.

Div version: (7 tags required)

<div>
    <div>
        <label>scrollTop:</label>
        <input type="text" />
    </div>
    <div>
        <label>scrollLeft:</label>
        <input type="text" />
    </div>
    <div id="targetContainer">
        <label>Apply to:</label>
        <input type="radio" /> window
        <input type="radio" /> document
        <input type="radio" /> test subject
    </div>
</div>


Table version: (11 tags required)


<table>
    <tbody>
        <tr>
            <td>scrollTop:</td>
            <td><input type="text" /></td>
        </tr>
        <tr>
            <td>scrollLeft:</td>
            <td><input type="text" /></td>
        </tr>
        <tr>
            <td>Apply to:</td>
            <td>
                <input type="radio" /> window
                <input type="radio" /> document
                <input type="radio" /> test subject
            </td>
        </tr>
    </tbody>
</table>



 

Saturday, July 04, 2009

Enable jQuery IntelliSense in Visual Studio 2008

Steps to enable jQuery IntelliSense in VS 2008

Step 1: Install VS 2008 SP1
Link: http://msdn.microsoft.com/en-us/vstudio/cc533448.aspx
Step 2: Install VS 2008 Patch KB958502 to Support "-vsdoc.js" Intellisense Files
Link: http://code.msdn.microsoft.com/KB958502/Release/ProjectReleases.aspx?ReleaseId=1736
Step 3: Download the jQuery-vsdoc.js file
Link: http://docs.jquery.com/Downloading_jQuery#Download_jQuery

Full story with details can be found here:
Scott Guthrie announcement

Sunday, June 28, 2009

CodePlex changes its policy based on my suggestion

This is pretty cool. This week CodePlex changed its Start a Project page and added a section which I had suggested. I guess they are listening to me. :) You can read the suggestion I made back in December 2008 in the discussion forums. You can also watch a Channel 9 video describing the feature, starting at the 5-minute mark. The suggestion was simply to allow donation links on CodePlex project sites.

My project SharePoint SUSHI has been on CodePlex since November 2007 and was one of the early projects posted there. It has been downloaded 17,000 times. I love the comments on the latest version; they are awesome:

Probably the most useful free SharePoint tool out there! Saved me hours of work many many times. Some really innovative features (e.g. copy list view). Great stuff! Please keep it up. Thanks for sharing the results of your hard work!. Greg
by Greg_O on Mar 9 at 7:22 AM

Some great functions. Definitely going to use that tool often in the future.
by Dublette on Mar 2 at 2:19 AM

God bless you, Joseph. You may just get that Wikipedia page after all. :)
by panoone on Feb 4 at 8:24 PM

 

It is good to see all those hours spent building SUSHI having some positive benefit for the community. I know that I have benefited enormously from blog posts and free utilities so it is good to give a little something back.

Thursday, June 25, 2009

Float vs. Decimal data types in Sql Server

 

This is an excellent article describing when to use float and decimal. Float stores an approximate value and decimal stores an exact value.

In summary, exact values like money should use decimal, and approximate values like scientific measurements should use float.

Here is an interesting example that shows that both float and decimal are capable of losing precision. When you add a number that is not an integer and then subtract that same number, float loses precision while decimal does not:

DECLARE @Float1 float, @Float2 float, @Float3 float, @Float4 float;
SET @Float1 = 54;
SET @Float2 = 3.1;
SET @Float3 = 0 + @Float1 + @Float2;
SELECT @Float3 - @Float1 - @Float2 AS "Should be 0";

Should be 0
----------------------
1.13797860024079E-15

 

When you multiply by a non-integer and then divide by that same number, decimal loses precision while float does not:

DECLARE @Fixed1 decimal(8,4), @Fixed2 decimal(8,4), @Fixed3 decimal(8,4);
SET @Fixed1 = 54;
SET @Fixed2 = 0.03;
SET @Fixed3 = 1 * @Fixed1 / @Fixed2;
SELECT @Fixed3 / @Fixed1 * @Fixed2 AS "Should be 1";

Should be 1
---------------------------------------
0.99999999999999900
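
The same distinction exists in .NET: double (like SQL Server's float) is an approximate binary type, while decimal is exact for this kind of base-10 arithmetic. Here is a minimal C# sketch of the first example above (standard library only):

using System;

class FloatVsDecimal
{
    static void Main()
    {
        // double is binary floating point, so 3.1 cannot be represented exactly
        // and the round trip leaves a tiny non-zero remainder (on the order of 1E-15).
        double d = 0d + 54d + 3.1d;
        Console.WriteLine(d - 54d - 3.1d);

        // decimal is a base-10 type, so the same round trip comes back to exactly zero.
        decimal m = 0m + 54m + 3.1m;
        Console.WriteLine(m - 54m - 3.1m);
    }
}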

Tuesday, June 16, 2009

Transcender vs. MeasureUp – Tips for Microsoft Certification Tests

 

  • I've been going certification crazy this year: I've passed 5 certification tests so far. Along the way I've picked up a few tips that might be of interest to someone looking to study for and take a Microsoft certification exam.
  • Transcender is a WinForms app installed locally, so you don't need to be connected to the Internet.
  • MeasureUp is a web app.
  • Both MeasureUp and Transcender are seriously out-of-date applications. It is embarrassing that they test your .NET skills and yet the skills of the programmers who wrote these tools are seriously lacking/out of date.
  • There is no copy and paste in Transcender, which makes it hard to try out the source code contained in the practice questions, and its explanations are more generic than MeasureUp's. For each question, MeasureUp tells you why the right answer is right and the wrong answers are wrong, which helps you understand how the question writer is thinking and what they are trying to teach or assess.
  • Both MeasureUp and Transcender tests are a little harder than the real tests. On a MeasureUp test, the answer choices are usually at least not nonsensical; they represent valid .NET names, classes, etc. On the actual test, you can often eliminate several wrong answers just by being familiar with correct syntax.
  • MeasureUp allows you to take a short test and see the answer after each question. This is my favorite way to study for a test: in small chunks. I take a short 10-question test whenever I get a chance to study and dig into any questions I didn't get right. With Transcender you have to take a 30-question test and can't see the answers until the end.
  • Note that in my experience there are not very many interactive-format questions (questions with drag-and-drop functionality). Almost all the questions are multiple choice in which you choose one answer; a few questions have two or three right answers.
  • Overall I like MeasureUp better. It is easier to study on the fly, and it can be used from any machine without going through an install program most likely written in 1994 using VB6. :)

MCP Testing Tips:

  • I didn't use any practice tests that are actually just copies of the real tests posted on the Internet (brain dumps); that is just plain cheating, and Microsoft has clearly said so. MeasureUp and Transcender tests, however, are sanctioned by Microsoft and are linked from their certification website. MeasureUp and Transcender use similar questions without using the actual questions from the real test. They are good preparation materials because they are a fast way to get a feel for which concepts will be tested and the types of questions that will be asked, and they get you used to the testing format. I think that in addition to testing your .NET skills, the MCP exams test your pattern recognition and reading comprehension skills, so practice tests help you sharpen those skills as well. The other thing I like about these practice tests is that a given technology (WCF, ADO.NET, Win Forms, etc.) has a huge number of features, and a practice test is a good way to get a feel for which features of that technology are most important. You may not know every detail about a feature after the practice test, but at least down the road when you have a business need for it you know where to start looking. I would borrow a phrase from a favorite childhood cartoon, GI-JOE.

    If GI-JOE were a .NET programmer, he would have said: "Knowing where to look in the .NET Framework is half the battle." MCP certification tests are a great way to quickly learn where to look.

Friday, June 05, 2009

Stored Procedures versus Ad-hoc SQL

Have you ever worked on a system where someone decreed that all database calls must be stored procedures and ad-hoc SQL is strictly forbidden? I have, and it leads to incredible development pain.

 

Let me first say that I have written many awesome 100+ line stored procedures. Stored procedures are definitely a good thing when writing complex queries; T-SQL and stored procedures are fantastic. However, I think the optimal decision is to use a mix of ad-hoc SQL and stored procedures rather than stored procedures alone.

 

Reasons that are given for the above decree and why they are no longer true:

  • Security:
    • SQL injection: Resolved by using parameterized SQL, which eliminates the possibility of SQL injection.
    • Granular security: If the app pool account is db_owner, then there is no additional security gained by applying security to each stored procedure (execute-only privileges).

 

  • Performance:  (see performance testing section below for test results)
    • Execution plan caching: In SQL Server 2005 and later, SQL Server caches the execution plan based on tokenized versions of queries, so the performance of parameterized ad-hoc SQL is close to that of stored procedures. (See John Lam's comment below.)

 

  • Maintenance
    • Changing stored procedures doesn’t require recompiling code: With agile development and continuous integration, code is easy to change and deploy anyway. The advantage of ad-hoc SQL is fewer lines of code overall, because you don’t have to declare a stored procedure signature; fewer lines of code means the risk of errors goes down and the maintenance cost goes down. It is also easier to refactor code with inline SQL because you can use "find and replace" in Visual Studio when renaming a column or changing a data type. Finally, using a mix of stored procedures and ad-hoc SQL keeps the database cleaner; when using all stored procedures you quickly have hundreds to keep track of, stored procedures inevitably get lost, and a significant effort must be devoted to maintaining and auditing them all.
    • Faster coding: you don’t have to switch between SQL Server and Visual Studio. (Jeff Atwood also makes this point.)
    • Better documentation: When the T-SQL code is in the data layer, you can look at it all in one place. (Jeff Atwood also makes this point.)
    • Debugging is faster and easier; it is cumbersome to set breakpoints in stored procedures.

 

  • Transactions
    • It used to be that transactions were only practical in stored procedures and not in .NET code. But now, with the System.Transactions namespace, it is easy. This is really great stuff; if you haven't used it, it is awesome and clean (see the sketch below).
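
To illustrate the System.Transactions point above, here is a minimal sketch. The connection string name, table, and SQL are hypothetical; the idea is that both commands enlist in the same ambient transaction and either commit together or roll back together.

using System.Data.SqlClient;
using System.Transactions;

public static class TransferExample
{
    public static void MoveFunds(int fromAccountId, int toAccountId, decimal amount)
    {
        // If Complete() is never called (for example, an exception is thrown),
        // everything inside the scope rolls back.
        using (TransactionScope scope = new TransactionScope())
        {
            using (SqlConnection cn = new SqlConnection(
                System.Configuration.ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString))
            {
                cn.Open();

                SqlCommand debit = new SqlCommand(
                    "UPDATE Account SET Balance = Balance - @Amount WHERE AccountID = @AccountID", cn);
                debit.Parameters.AddWithValue("@Amount", amount);
                debit.Parameters.AddWithValue("@AccountID", fromAccountId);
                debit.ExecuteNonQuery();

                SqlCommand credit = new SqlCommand(
                    "UPDATE Account SET Balance = Balance + @Amount WHERE AccountID = @AccountID", cn);
                credit.Parameters.AddWithValue("@Amount", amount);
                credit.Parameters.AddWithValue("@AccountID", toAccountId);
                credit.ExecuteNonQuery();
            }

            scope.Complete();
        }
    }
}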

 

Comments:

I am not anti-stored procedure. I have used them quite a bit and they are incredibly powerful. T-SQL is just gorgeous when it comes to performance. When you have complex queries, they become good candidates for stored procedures because you can leverage temporary tables, common table expressions, etc.

Note that ORMs like Entity Framework (the future of data access) use parameterized SQL, so developers who are willing to use those frameworks are already using parameterized SQL. For some reason developers are willing to use those frameworks but not inline SQL, even though they talk to the database in the same way.
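
For reference, this is what parameterized ad-hoc SQL looks like with plain ADO.NET. The table, columns, and variable names are hypothetical; the point is that user input travels as a parameter value, never as part of the SQL text, and the query text stays constant so the cached execution plan is reused.

public static void PrintContactsByLastName(string connectionString, string lastNameFromUser)
{
    using (SqlConnection cn = new SqlConnection(connectionString))
    {
        cn.Open();

        // A malicious value like "'; DROP TABLE Contact;--" is treated as plain data.
        SqlCommand cmd = new SqlCommand(
            "SELECT ContactID, FirstName, LastName FROM Contact WHERE LastName = @lastName", cn);
        cmd.Parameters.AddWithValue("@lastName", lastNameFromUser);

        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine(reader["LastName"]);
            }
        }
    }
}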

 

 

What others are saying:

Jeff Atwood puts it eloquently in his blog post: who needs stored procedures anyway?

 

And who can argue with the great John Lam? He says:

As a guy who dabbles in low-level bit twiddling stuff from time-to-time, the performance claims are quite interesting to me. The new (as of SQL Server 7.0) cached execution plan optimization in SQL Server looks to me a lot like JIT compilation. If this is, in fact, the case it seems to me that the only overhead that would be associated with dynamic SQL would be:

  1. The amount of bandwidth + time it takes to transmit the dynamic SQL text to the database.
  2. The amount of time it takes to calculate the hash of the dynamic SQL text to look up the cached execution plan.

I can imagine quite a few scenarios where the above overhead would disappear into the noise of the network roundtrip. What upsets me are the folks who spout forth anecdotal arguments that claim stored procedures have "much better" performance than dynamic SQL.

 

Conclusion

I cringe when I see an architectural decision that tries to make things easy for the server, which is a $4,000 resource over two years, at the cost of making things harder for the developer, which is a $400,000 resource over the same two years.

A question from personal experience: I recently saw a 10-million-dollar software project go over budget and lose a ton of money. Why did the project fail? Using ad-hoc SQL may not have saved the project, but making similar productivity-enhancing decisions, like allowing ad-hoc SQL when it makes sense and is not a security or performance compromise, might just have tipped the project from red to black.

 

Performance Testing

I always have to test performance claims for myself, so I wrote a small console application to put execution plan caching to the test. A method using a stored procedure was called 1000 times, and a method using the same SQL as parameterized ad-hoc SQL was called 1000 times. The times were about equal; actually, they were slightly faster for the ad-hoc SQL. :)

Results (milliseconds):

(Screenshots of the timing output followed here.)

static void Main(string[] args)
{
    //--warm up
    SelectContactStoredProc();
    SelectContactParamaterizedSql();

    //--
    Stopwatch sw = new Stopwatch();
    sw.Start();
    for (int i = 1; i <= 1000; i++)
        SelectContactStoredProc();
    sw.Stop();
    Console.WriteLine("Elapsed StoredProc:" + sw.ElapsedMilliseconds);

    //--
    sw.Reset();
    sw.Start();
    for (int i = 1; i <= 1000; i++)
        SelectContactParamaterizedSql();
    sw.Stop();
    Console.WriteLine("Elapsed ParmSQL:" + sw.ElapsedMilliseconds);

    Console.ReadLine();
}

public static void SelectContactParamaterizedSql()
{
    DataGeneral.ParamBuilder pb = new DataGeneral.ParamBuilder();
    pb.AddParam(SqlDbType.Int, "@contactGroupId", 11);

    string sql = @"select ContactID,ContactGroupID,ContactTypeRefCode,FirstName,LastName,Prefix,Suffix,Email,Phone,PhoneExt,AltPhone,AltPhoneExt,MobilePhone,FaxPhone
                   from Contact where contactGroupId = @contactGroupId";

    object ret = DataGeneral.GetScalar(sql, pb.Parameters, false);
}

public static void SelectContactStoredProc()
{
    DataGeneral.ParamBuilder pb = new DataGeneral.ParamBuilder();
    pb.AddParam(SqlDbType.Int, "@contactGroupId", 11);

    object ret = DataGeneral.GetScalar("General_Contact_GetByContactGroupID", pb.Parameters, true);
}

Wednesday, April 08, 2009

Rename a SharePoint Server Machine Name

I often need to rename a machine running SharePoint. I was afraid this would be terribly difficult because there are several places in SharePoint where the machine name is stored, but it was actually pretty easy.

Here is a good article which worked for me: rename SharePoint server

  1. Rename the SharePoint server using the STSADM command (-o renameserver).
  2. Add an alternate access mapping for Central Admin to the new name.
  3. Rename the machine to match.
  4. Reboot.
  5. If you get a “Some or all identity references could not be translated” error when navigating to Central Admin, run this command: stsadm.exe -o updatefarmcredentials -userlogin <login> -password <pass>. Detailed instructions.
  6. Post-configuration: alternate access mapping changes, user ID changes, etc.

Friday, March 27, 2009

The fluckiger.org guestbook is back, new and improved!

After a 5-year outage, the fluckiger.org guestbook is back! Please feel free to leave your thoughts, hellos, ideas, comments, poems, rants, etc.

Check this out: I just used the Internet Archive to retrieve messages left on the old fluckiger.org guestbook, which was up between 2002 and 2004. It is fun to see the comments; back then, having a guestbook was a really cool, impressive thing!

I know a guestbook is an old-fashioned thing (old-fashioned in Internet years, which means anything over 5 years old). In 2002 I built a guestbook using ASP.NET 1.1 and a Microsoft Access database. My original guestbook at fluckiger.org was pretty popular. This was before Facebook and blog comments, which make having your own guestbook seem kind of pointless.

But this guestbook gave me a chance to flex my Silverlight muscles, and it is just the mustard seed. There is a lot of potential in Silverlight to bring richness back to human-computer interaction. We've been suffering through painful, primitive UIs and slow response times for quite a while now. Silverlight has the potential to bend the computer to the needs of people, rather than what we've been doing over the past decade, which is bending people to HTML in the browser just because that was the easiest and most secure way to deploy applications.

My guestbook uses some pretty cool technology. It is written in Silverlight 2.0 and delivered from IIS 7 on a virtual Windows Server 2008 instance, which is hosted on a 64-bit Windows Server 2008 Hypervisor. The physical machine is a box I got off Craigslist for only $1,500. It has 8 cores, 22 GB of memory, and a 300 GB VelociRaptor hard drive, which gives me plenty of room to run multiple virtual machines. I am currently running a Project Server farm, an Ubuntu Linux box, a Windows 7 box, and my SUSHI development environment, which also hosts this guestbook. Sweet!

Friday, March 20, 2009

Snapshots in Windows Server 2008 Hypervisor

I recently installed Windows Server 2008 with Hypervisor. I was a little confused by the Snapshot options. But after reading this post, and a little trial and error, I figured it out…

snapshots

Apply… results in losing your changes since the snapshot. This applies the state stored in the snapshot to the current state of the machine.

Delete Snapshot… results in keeping your changes since the snapshot. This causes the AVHD (differencing disk) files to be merged into the parent VHD or AVHD files. Note that if the virtual machine is running the merge will not happen until the guest machine is shut down.

Revert does the same thing as Apply for the most recent snapshot.

 

Additional notes from my experience: When you create multiple snapshots, you’ll see that Hypervisor creates a hierarchy of snapshot nodes. This hierarchy is important, and so is the green arrow. I made the mistake at first of thinking that the green arrow and the snapshot were the same thing, but the green arrow represents the current state and all changes that have happened since its parent snapshot. Once I understood this, it was much clearer to me why Apply resulted in losing my current changes and Delete resulted in keeping them.

Thursday, March 19, 2009

Johnny Lee on Channel9

Scott Hanselman interviews Johnny Lee.

Excellent topics in this video: Why can’t computers see? Why does the academic world reward writing papers instead of producing useful technology? (Johnny Lee inspired millions of people with his revolutionary use of the Wiimote, and yet got no academic credit from Carnegie Mellon, which he was attending at the time.)

-Joseph

Sunday, March 15, 2009

SUSHI has 5 star rating on Codeplex

Writing free software for the community often comes with little reward, but it is great to get positive feedback. I can’t help but smile at the feedback that SUSHI has gotten:

 

Probably the most useful free SharePoint tool out there! Saved me hours of work many many times. Some really innovative features (e.g. copy list view). Great stuff! Please keep it up. Thanks for sharing the results of your hard work!. Greg

by Greg_O on Mar 9 at 7:22 AM

 

Some great functions. Definitely going to use that tool often in the future.

by Dublette on Mar 2 at 2:19 AM

 

God bless you, Joseph. You may just get that Wikipedia page after all. :)
by panoone on Feb 4 at 8:24 PM

 

 

The last comment is the funniest of all. I hope he is right! SUSHI has had 13,000 downloads to date and is in the top 25 downloads in the SharePoint category over the last 7 days.

Tuesday, March 10, 2009

Super fast drive

 

This is hilarious, and awesome at the same time.

Here is a drive that gets 2-gigabit-per-second transfer rates. I bought a 300 GB Raptor for $250 at Fry’s Electronics and I love it. It is very fast, but my office is now a sauna and it is only spring. I noticed that a 64 GB solid state disk costs $199; I can’t wait to get one of those. The future of computing, starting in 2009? SSD, SSD, SSD. Performance and low power consumption.

Tuesday, March 03, 2009

Tech Tip: Set up Remote Desktop on 443 to Get Through a Restrictive Firewall

 

I often connect to the Internet through a guest wireless account. These guest wireless networks are typically very restrictive and block all ports except 80 (HTTP) and 443 (HTTPS). This is a big pain when I need to remote desktop into a remote machine, or perform other tasks that might also be restricted by the firewall, like accessing my online email, searching the Internet for helpful blog posts, etc.

Here is an awesome tip to get to the resources you need: set up a remote machine and change the Remote Desktop port from the default of 3389 to 443. To do this, just change a registry key and then reboot the machine. See this Microsoft KB article for details. You will also need to configure your router to forward inbound traffic on 443 to that machine. Instructional videos on port forwarding with a Linksys router.
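
The KB article walks through the change in regedit; for reference, here is a minimal C# sketch of the same registry edit under the standard RDP-Tcp key (run it elevated on the remote machine, and note that a reboot is still required for the new port to take effect):

using Microsoft.Win32;

class ChangeRdpPort
{
    static void Main()
    {
        // PortNumber lives under the RDP-Tcp WinStation key; the default value is 3389.
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp", true))
        {
            key.SetValue("PortNumber", 443, RegistryValueKind.DWord);
        }
    }
}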

Another benefit of this approach is that your traffic is not exposed to the guest network, which is usually a low-security, public network vulnerable to packet sniffing. The guest network only sees RDP packets being passed between your laptop and your remote machine.

Sunday, March 01, 2009

SharePoint and Project Server Download Links

 

Common download links for MOSS and Project Server 2007:

  • Patches
    • Infrastructure
      • (The infrastructure update is now included in the December CU; there is no need to install it separately.)
    • Help: Version numbers lookup link
    • Help: Deploy Project Server 2007 Tips link
    • Help: Deploy WSS 3.0 Tips link
    • Help: Master blog post on SharePoint service packs and updates. link

I always lose these links, so I’ve decided to maintain a blog post with these common links.

Saturday, February 21, 2009

RAM Disks do not speed up Visual Studio

 

The limiting factor for Visual Studio is disk I/O. I got a tip from Channel 9 to speed up Visual Studio by creating a RAM disk, which sounded like a great idea. However, when I ran a thorough set of tests, I found that the performance difference between the RAM disk and the hard disk was not appreciable. This was a big surprise, since RAM is 240,000 times faster than disk (see my previous blog post), but the reason is that Visual Studio and Vista do a lot of caching. So compile times for the same project on the RAM disk and on the hard disk were pretty similar. I also tested the time it took to search the entire solution for a word, and the time to open a solution. There was no discernible difference!

 

If you still want to try it out and create your own RAM disk, you can download a simple RAMDISK.EXE utility and have one set up in just a few minutes. What is a RAM disk? It is a virtual drive created in RAM.

 

Performance Analysis

Creating files was on average 1.5 times faster on the RAM disk. Reading files was pretty much the same between the two disks. Copying a file was actually slightly slower on the RAM disk. All three Visual Studio tests were pretty much the same between the two.

 

What is the downside of a RAM disk?

Running extra drivers always gives you a chance of a blue screen, and I did have one blue screen when I was first setting up the RAM disk. (My guess is that it was because I chose a non-default RAM disk size at first: I typed in 200 instead of selecting 256 MB.) So save your stuff before you run the setup.

If your battery falls out or runs out of juice, you will lose everything on your RAM disk, but this rarely happens to my laptop. I always sleep my laptop instead of shutting it down, so my data was always there when I woke it up.

 

 

Performance Tests

So how much faster is a RAM disk than a standard 5,400 RPM laptop hard disk? I used a simple utility, DiskBench, to measure disk performance. The RAM disk was 1.5 times faster at creating files. But very surprisingly, for all other tests, the RAM disk and the hard disk were pretty much the same.

 

Conclusion

Sorry, a RAM disk isn’t going to help you out much; it doesn’t beat the natural caching of Visual Studio and Vista. Stick to the hard disk. Besides, having to constantly copy a VS solution from the hard disk to the RAM disk is definitely more pain than it is worth. Perhaps moving your temporary IE files folder to the RAM disk would be a good application, since it creates a lot of files.

 

Raw performance data.

Note: R: is the RAM disk and C: is the hard disk. During my tests I tried various file sizes ranging from roughly 50 MB to 200 MB. I performed file creation, file reading, and file copying tests, plus a few Visual Studio tests. I ran each test against my C: drive and against my R: drive (the RAM disk), and I shut down all other programs that could potentially compete for I/O.

Create File (milliseconds)        R:       C:
create file 60 MB                 272      425
create file 60 MB                 273      412
create file 60 MB                 206      419
average:                          250.3    418.7    1.7 times faster
create file 180 MB                978      1345
create file 180 MB                870      1283
create file 180 MB                945      1344
average:                          931.0    1324.0   1.4 times faster
create file 192 MB                1034     1484
create file 192 MB                1002     1427
create file 192 MB                904      1334
average:                          980.0    1415.0   1.4 times faster
create overall average:                             1.5 times faster

Read File (milliseconds)          R:       C:
read file 201 MB                  255      272
read file 201 MB                  269      249
read file 201 MB                  268      419
read overall average:             264.0    313.3    1.2 times faster

Copy File (milliseconds)          R:       C:
copy 50 MB to same dir            144      143
copy 50 MB to same dir            150      119
copy 50 MB to same dir            169      135
copy 50 MB to same dir            118      121
copy 50 MB to same dir            129      125
copy 50 MB to same dir            117      121
copy 50 MB to same dir            138      121
copy overall average:             138      126.4    0.92 times faster (slower)

Visual Studio Tests (seconds)     R:       C:
open solution with 100 files      5        3
open solution with 100 files      5        6
average:                          5        4.5      (pretty much the same)

find word in VS                   4        4
find word in VS                   5        5
average:                          4.5      4.5      (pretty much the same)

run VS project                    5        6
run VS project                    7        5
average:                          6        5.5      (pretty much the same)

Thursday, February 12, 2009

Highpoint Games

 

My friend has just started his own board game company, and has released his first game “Greedy Greedy”. www.highpointgames.com

Check it out; this is a game we had a lot of fun playing many times while in Florida.

Friday, January 30, 2009

SUSHI Version 3.4 Released

 

SUSHI Version 3.4 includes the following improvements:

  • Important Improvement: Security Report: The logic for looking up Active Directory Groups that a User is a member of has been improved so that users with read-only privileges to Active Directory can successfully query group membership.
  • Delete old documents: This feature is now asynchronous and includes a cancel so that the action can be canceled after it has started but before it is finished. This is useful when archiving a large number of documents.
  • Bulk List Creation: Tips for bulk deleting lists and renaming list URLs added.

Check for Memory Leaks in your SharePoint API Code

 

Microsoft has just released a memory leak checker for SharePoint code. Not sure if all your SPWeb and SPSite objects have been properly disposed? Use this tool to discover which lines of your code are not properly disposing of those objects.

 

Each SPWeb and SPSite object takes 1-2 MB of unmanaged memory, so not properly disposing of them causes major memory leaks. However (this is important), if you dispose of an SPSite object that you get from SPContext.Current, you will crash your SharePoint site. So having a tool that you can run to definitively determine whether your production code is good is a big help.
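
As a reminder of the pattern the tool checks for, here is a minimal sketch (the site URL is hypothetical): objects you create yourself get a using block, while objects handed to you by SPContext must not be disposed.

using System;
using Microsoft.SharePoint;

public static class DisposalExamples
{
    public static void CorrectDisposal()
    {
        // You created these objects, so you dispose them.
        using (SPSite site = new SPSite("http://sharepoint.example.com"))
        using (SPWeb web = site.OpenWeb())
        {
            Console.WriteLine(web.Title);
        }
    }

    public static void DoNotDisposeContextObjects()
    {
        // SPContext owns these objects; disposing them can break the current request and site.
        SPWeb web = SPContext.Current.Web;
        Console.WriteLine(web.Title);
    }
}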

 

-Joseph

Saturday, January 24, 2009

SharePoint SUSHI Version 3.3 Released

I am pleased to announce the release of SUSHI, version 3.3.  About SharePoint SUSHI.

 

sushi_homepage4.png

 

New Features, Improvements and Bug Fixes

  • Delete old documents: Copy documents older than a specified date to an archive folder and then delete them from SharePoint.
  • Security Report: The user can type a name or select from the list; as the user types, the matching SharePoint user is found.
  • Awesome new screenshot on the Codeplex SUSHI home page. :)

 

I would like to thank the community for all the great feedback. I am working to incorporate your ideas as quickly as possible. Knowing which features you are using helps me decide which features to prioritize.

 

Thank you to all who have donated to SUSHI. This is always a very big help. You are welcome to make a small donation to SUSHI.

 

-Joseph Fluckiger

Scripted SharePoint Install

Doing a scripted install of MOSS can save a lot of time, and can also ensure that your installs are consistent.

Below is a sample scripted install of MOSS. (I have changed sensitive information like passwords, emails, and URLs.)

I have found that it works best to execute these commands one at a time rather than as a single script. (If you are unlike me and never make mistakes, feel free to run them all at once.) An added benefit is that the script provides very nice documentation of your install for posterity. And for someone who understands Windows scripts, it teaches quite effectively what it takes to install MOSS, surely much more effectively than the beastly 30-page install instructions on MSDN.

One of my favorite benefits of a scripted install is that you can choose a name for the Central Admin content database instead of getting stuck with the ugly default name that includes a GUID. Some of the setproperty statements at the end are optional, but they give you an idea of how you can customize this script to make sure the same policies are applied across environments.

Credit for these scripts goes to Ben Curry.

------------------------------

@echo off
REM //////////////////////////////////////////////////
REM // script farm -- creating dbs and setting sites /
REM //  sharepoint farm.                             /
REM //////////////////////////////////////////////////

REM //////////////////////////////////////////////////
REM // applications
REM //////////////////////////////////////////////////
set s="C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe"
set ps="c:\program files\common files\microsoft shared\web server extensions\12\bin\psconfig.exe"

REM //////////////////////////////////////////////////
REM // your enterprise SQL server
REM //////////////////////////////////////////////////
set sql=TOPDEV

REM //////////////////////////////////////////////////
REM // account vars
REM //////////////////////////////////////////////////
set mossfarm=dev\mossfarm
set mosscrawler=dev\mosscrawler
set sspapid=dev\mosswebapp
set myapid=dev\mosswebapp
set sspservice=dev\mossfarm
set portalapid=dev\mosswebapp
set mossservice=dev\mossfarm

REM //////////////////////////////////////////////////
REM // password
REM //   NOTE:if using a % sign in password you must
REM //        escape it with a % sign 'iuyOP%%$#@!11'
REM //        is interpreted as 'iuyOP%$#@!11'
REM //       
REM //////////////////////////////////////////////////
set p=MyPasswordIsSomething

REM //////////////////////////////////////////////////
REM // additional settings
REM //////////////////////////////////////////////////
set sspportalurl=http://ssp.catapultdemo.com:81
set mysiteurl=http://my.catapultdemo.com:81
set portalurl=http://catapultdemo.com:81
set owneremail="joseph at email.com"
set ownername="Joseph Fluckiger"
set ownerlogin="dev\josephf"

REM //////////////////////////////////////////////////
REM // start work here
REM //////////////////////////////////////////////////

GOTO CURRENTSPOT
:CURRENTSPOT

Echo ===============================
Echo == Creating Farm
Echo ===============================
:: Creating Farm via populating the ConfigDB. Set SQL Servername, configDB name, Central Admin ContentDB, and Farm Account.
@pause
%ps% -cmd configdb -create -server %sql% -database SharePoint_Config_TOPDEV -user %mossfarm% -password %p% -admincontentdatabase SharePoint_Content_CentralAdmin
Echo ======================
Echo Provision Central Admin
Echo ======================
::  Provision Central Admin Application on this server. Uses configDB and ContentDB above. Set port number to suit your requirements.
@pause
%ps%  -cmd adminvs -provision -port 5000 -windowsauthprovider onlyusentlm
Echo ======================
Echo Install all Services
Echo ======================
::  Install all services on machine
@pause
%ps% -cmd services install
Echo ======================
Echo Securing File System and Registry Keys
Echo ======================
:: Set Security on File System and Registry Keys
@pause
%ps% -cmd secureresources
Echo ======================
Echo Starting MOSS Search
Echo ======================
::  Start SharePoint Server Search Service.Verify database and services names. Change role to Index, Query, or IndexQuery, depending on your farm topology.
@pause
%s% -o osearch -action start -role Indexquery -farmcontactemail %owneremail% -farmperformancelevel PartlyReduced -farmserviceaccount %mossservice% -farmservicepassword %p%
Echo ======================
Echo Starting WSS Search
Echo ======================
::  Start WSS Search. Verify database and service names.
@pause
%s% -o spsearch -action start -farmserviceaccount %mossservice% -farmservicepassword %p% -farmcontentaccessaccount %mosscrawler% -farmcontentaccesspassword %p% -databaseserver %sql% -databasename SharePoint_WSS_Search
Echo ======================
Echo Installing all Features
Echo ======================
::  Install all features on machine
@pause
%ps% -cmd installfeatures
Echo ======================
Echo Creating My Sites Web
Echo ======================
::  Create My Site Web application. Verify database name and administrator's names.
@pause
%s% -o extendvs -url %mysiteurl% -ownerlogin "%mossfarm%" -owneremail %owneremail% -exclusivelyusentlm -ownername "mossAdmin" -databaseserver %sql% -databasename SharePoint_Content_MySite -sitetemplate spsmsitehost -description "My Site Host" -sethostheader -apidname MySiteAppPool -apidtype configurableid -apidlogin %myapid% -apidpwd %p%
iisreset
Echo ======================
Echo Enabling Self Service Site Management for %mysiteurl%
Echo ======================
::  Enable Self Service Site Management (Creation) on %mysiteurl%
@pause
%s% -o enablessc -url %mysiteurl%
Echo ======================
Echo Creating SSP Web
Echo ======================
::  Create SSP Web application. Verify database and apid names. (APID = Application Pool Identity)
@pause
%s% -o extendvs -url %sspportalurl% -exclusivelyusentlm -databaseserver %sql% -databasename SharePoint_Content_SSP -donotcreatesite -description "SSP Admin Host" -sethostheader -apidname "SSP1" -apidtype configurableid -apidlogin %sspapid% -apidpwd %p%
::  We must reset IIS before building the SSP. If you are local on the box, you can check all services are created before creating SSP.
iisreset
Echo ======================
Echo Creating SSP
Echo ======================
::  Create SSP. Verify all names and URLs.
@pause
%s% -o createssp -title "SSP1" -url %sspportalurl% -mysiteurl %mysiteurl% -ssplogin %sspservice% -indexserver topdev -indexlocation "C:\Program Files\Microsoft Office Servers\12.0\Indexes" -ssppassword %p% -sspdatabaseserver %sql% -sspdatabasename SharePoint_SSP1_Config -searchdatabaseserver %sql% -searchdatabasename SharePoint_SSP1_Search -ssl no
Echo ======================
Echo Creating Portal
Echo ======================
::  Creating Portal.
@pause
%s% -o extendvs -url %portalurl% -ownerlogin %ownerlogin% -owneremail %owneremail% -ownername %ownername% -exclusivelyusentlm -databaseserver %sql% -databasename SharePoint_Content_catapultdemo -sitetemplate STS#1 -description "Catapult Demo Portal" -sethostheader -apidname "MossWebAppPool" -apidtype configurableid -apidlogin %portalapid% -apidpwd %p%
REM //////////////////////////////////////////////////
REM // POST BUILD
REM //////////////////////////////////////////////////
Echo ========================
Echo Modifying Logging level
Echo and Outbound Smtp
Echo ================calling setlogs.cmd====see premium content for xlsx source file=======
REM  or just set the logging levels in this script
@pause
%s% -o setlogginglevel -category general -tracelevel unexpected -windowslogginglevel error
%s% -o email -outsmtpserver topdev -fromaddress %owneremail% -replytoaddress %owneremail% -codepage 65001

Echo ========================
Echo Setting Application Settings
Echo Setting Recycle Bin Settings
Echo ========================
@pause
%s% -o setproperty -pn recycle-bin-enabled -pv yes -url %portalurl%
%s% -o setproperty -pn recycle-bin-enabled -pv yes -url %mysiteurl%
%s% -o setproperty -pn recycle-bin-retention-period -pv 180 -url %portalurl%
%s% -o setproperty -pn recycle-bin-retention-period -pv 180 -url %mysiteurl%
%s% -o setproperty -pn second-stage-recycle-bin-quota -pv 20 -url %portalurl%
%s% -o setproperty -pn second-stage-recycle-bin-quota -pv 20 -url %mysiteurl%

Echo ========================
Echo Setting Maximum Upload
Echo File Size
Echo ========================
@pause
%s% -o setproperty -pn max-file-post-size -pv 200 -url %portalurl%
%s% -o setproperty -pn max-file-post-size -pv 200 -url %mysiteurl%