Posted by: Dennis | August 15, 2010

Avoid initialization of proxy object during compare

Let's assume you have defined an Equals method on your entity class, e.g. something like ReSharper's default implementation:

public override bool Equals(object obj)
{
 if (ReferenceEquals(null, obj)) return false;
 if (ReferenceEquals(this, obj)) return true;
 if (obj.GetType() != typeof (Category)) return false;
 return ((Category)obj).Id == Id;
}

Now, the problem occurs when you want to use this on an NHibernate proxy object. First of all, the GetType comparison will fail, because a proxy is a runtime-generated subclass, not the actual entity class.

Secondly, if one of the two objects is indeed a proxy, we may never have loaded it from the database. And if we never loaded it, then the call to .Id will cause NHibernate to load the object. And that was NOT what we wanted.

The solution is to make a rather more involved Equals method that takes the proxies into account:

public override bool Equals(object obj)
{
 if (ReferenceEquals(null, obj)) return false;
 if (ReferenceEquals(this, obj)) return true;

The first two checks stay the same. Then comes the proxy handling:

 INHibernateProxy thisproxy = this as INHibernateProxy;
 int thisId;
 Type thisType = GetType();
 if (ReferenceEquals(null, thisproxy))
 {
   thisId = Id;
 }
 else
 {
   thisType = thisType.BaseType;
   if (thisproxy.HibernateLazyInitializer.IsUninitialized)
     thisId = (int) thisproxy.HibernateLazyInitializer.Identifier;
   else
     thisId = Id;
 }

This is where it becomes complicated. We cast this to INHibernateProxy, and if we find that it is a proxy object that has not been initialized, then we use the lazy initializer's notion of an Id to compare with, instead of triggering a load.

So the full Equals method ends up as:

public override bool Equals(object obj)
{
 if (ReferenceEquals(null, obj)) return false;
 if (ReferenceEquals(this, obj)) return true;
// ReSharper disable SuspiciousTypeConversion.Global
 INHibernateProxy thisproxy = this as INHibernateProxy;
 int thisId;
 Type thisType = GetType();
 if (ReferenceEquals(null, thisproxy))
 {
   thisId = Id;
 }
 else
 {
   thisType = thisType.BaseType;
   if (thisproxy.HibernateLazyInitializer.IsUninitialized)
     thisId = (int) thisproxy.HibernateLazyInitializer.Identifier;
   else
     thisId = Id;
 }
 INHibernateProxy otherproxy = obj as INHibernateProxy;
 int otherId;
 Type otherType = obj.GetType();
 if (ReferenceEquals(null, otherproxy))
 {
   otherId = ((EntityBase) obj).Id;
 }
 else
 {
   otherType = otherType.BaseType;
   if (otherproxy.HibernateLazyInitializer.IsUninitialized)
     otherId = (int)otherproxy.HibernateLazyInitializer.Identifier;
   else
     otherId = ((EntityBase)obj).Id;
 }
 if (otherType != thisType) return false;
// ReSharper restore SuspiciousTypeConversion.Global
 return otherId == thisId;
}

It would have been even better if the proxy were slightly more intelligent and didn't initialize itself when fetching the value that is its key.
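One thing the post leaves out: once Equals is overridden, GetHashCode must be overridden too, and on the same key. A minimal companion sketch (my addition, not from the original post):

```csharp
public override int GetHashCode()
{
    // Assumes Id is assigned before the entity is ever put in a hashed
    // collection. Note: calling this on an uninitialized proxy goes through
    // the virtual Id property and will trigger a load, the same pitfall
    // the Equals method above works around.
    return Id.GetHashCode();
}
```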

Posted by: Dennis | August 15, 2010

Avoid mocking for Request.IsAjaxRequest

A small nauseating thing about Request.IsAjaxRequest is that there is no really nice way to test methods that use it.

The normal approach to test methods that use it is to mock the ControllerContext:

public void MakeRequestAjax(ControllerBase controller)
{
      var controllerContext = MockRepository.GenerateMock<ControllerContext>();
      controllerContext.Expect(c => c.HttpContext.Request["X-Requested-With"]).Return("XMLHttpRequest");
      controller.ControllerContext = controllerContext;
}

But seriously, that's like shooting birds with a cannon.


The much simpler approach is to add a small wrapper to your controller (or your controller base class):

private bool? _isAjaxRequest;
internal bool IsAjaxRequest
{
  get { return _isAjaxRequest ?? Request.IsAjaxRequest(); }
  set { _isAjaxRequest = value; }
}

Then in your test code you explicitly set it to true or false. If you don’t set it, then it will use the standard method of looking up the result.
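A test then reads something like this (NUnit here; MyController and its Index action are hypothetical, and since the property is internal the test assembly needs InternalsVisibleTo):

```csharp
[Test]
public void Index_returns_partial_view_for_ajax_requests()
{
    var controller = new MyController();  // hypothetical controller under test
    controller.IsAjaxRequest = true;      // no ControllerContext mocking needed

    var result = controller.Index();

    Assert.IsInstanceOf<PartialViewResult>(result);
}
```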

I had a problem with Silverlight and dynamically loading XAP files… It simply didn't work with the “Application Library Cache” feature, which meant that you had to put ALL of your dependencies in the “shell”.

This is of course very much against the whole point of the “Application Library Cache”: we end up with a behemoth of an initial download. The point is that we want the application to start instantly, so the initial download should be as small as possible.

Let me illustrate the point with an example:

We have a shell XAP (myshell.xap), which just loads the start page and login page. When it starts up, it is supposed to load the second XAP (hopefully before the user ever notices it was needed).

Then we have another XAP (myapp.xap), which is a full-featured RIA application using MEF.

And then we have a third XAP (myreport.xap), which is also a full-featured RIA application using MEF, AND it also needs a third-party reporting library.

You can now choose your poison in how to set this up using the standard methods. We can make myshell.xap include the RIA, MEF, and reporting libraries, but then the size of your initial download explodes (probably 1 MB instead of 10 kB).

The way people usually solve this is to make a shared XAP, which gets loaded before the myapp and myreport XAPs and includes all of the assemblies above. The problem with this is that when we need to use myapp only, we still have to pay for downloading the reporting library.

Microsoft thought it would be great in Silverlight 3 to include a feature called “Application Library Cache” which is supposed to solve this problem. Which it does… As long as you only have 1 xap in your application (but possibly several applications).

So if we use the “Application Library Cache” feature in the above scenario, we end up with three XAP files and a ZIP file for each of the dependencies. The idea was that when I loaded myapp.xap, the dependencies listed in the XAP would automatically be fetched too. The problem is just that Microsoft implemented this in the plugin's loader, which means the functionality is not available from inside Silverlight.

So I set out to solve that problem :) If you don’t care about all the details, then just skip to the bottom and copy the code…

First, go and read these two: How to use Library Cache in the first place and How to dynamically load a xap file.

A good read? You might have noticed that in the first article there was a new element in AppManifest.xaml called ExtensionPart, and that the second article didn't do anything about it.

So the first thing to do was to extend the reading of the manifest XML to also pick up the ExtensionPart elements:

                List<Uri> newdependencies = new List<Uri>();
                reader.ReadToFollowing("ExtensionPart");
                do
                {
                    var uri = new Uri(reader.GetAttribute("Source"), UriKind.RelativeOrAbsolute);
                    lock (_dependencies)
                        if (_dependencies.Add(uri))
                            newdependencies.Add(uri);
                } while (reader.ReadToNextSibling("ExtensionPart"));

And with those two things in place, the rest of the code is rather trivial.

First of all, we keep track of the dependencies we have already loaded from other XAP files; there is no need to download things we already know are loaded. Unfortunately I cannot see whether a particular assembly is loaded, simply because that information is not in the XML, only the link to its ZIP file. The second big problem is that once we actually download these ZIP files, there is no standard way of loading their contents in Silverlight. We can only load files from a ZIP file where we ALREADY know the name. But these ZIP files were not packaged by us, and thus we cannot know what files are in the ZIP.

To get around this, we implement the absolutely minimal parsing of ZIP files, to read out the filenames. See GetFileName in the code.

/*
Copyright (c) 2009, Dennis Haney 
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright
      notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright
      notice, this list of conditions and the following disclaimer in the
      documentation and/or other materials provided with the distribution.
    * Neither the name of the  nor the
      names of its contributors may be used to endorse or promote products
      derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY Dennis Haney ''AS IS'' AND ANY
EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL Dennis Haney BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.IO;
using System.Linq;
using System.Net;
using System.Reflection;
using System.Text;
using System.Threading;
using System.Windows;
using System.Windows.Resources;
using System.Xml;

namespace Utilities
{
    /// <summary>
    /// A downloader of XAP files, that is capable of also 
    /// fetching the "Reduce XAP size by using application library cache" dependencies.
    /// </summary>
    public class XapLoader
    {
        private readonly List<WebClient> _webClients = new List<WebClient>();
        private readonly List<KeyValuePair<long, long>> _webClientsProgress = new List<KeyValuePair<long, long>>();
        private readonly List<KeyValuePair<AssemblyPart, Stream>> _assemblies = new List<KeyValuePair<AssemblyPart, Stream>>();
        private int _downloadcount;
        private Exception _firstError;
        private volatile bool _wasCancelled;
        private readonly HashSet<Uri> _dependencies = new HashSet<Uri>();

        /// <summary>
        /// Occurs when the download of all files and dependencies is completed
        /// </summary>
        public event Action<AsyncCompletedEventArgs, IEnumerable<Assembly>> DownloadCompleted;

        /// <summary>
        /// Occurs when the amount downloaded changes. 
        /// A little quirky, since it cannot know the full size until we have actually downloaded everything.
        /// </summary>
        public event Action<object, DownloadProgressChangedEventArgs> DownloadProgressChanged;

        /// <summary>
        /// Initiate downloads of the given xap files, and all of their dependencies.
        /// Can be called multiple times, but might get multiple callbacks if downloads completed between the calls.
        /// </summary>
        public void StartDownloads(params Uri[] uris)
        {
            StartDownloads(uris.AsEnumerable());
        }


        /// <summary>
        /// Initiate downloads of the given xap files, and all of their dependencies.
        /// Can be called multiple times, but might get multiple callbacks if downloads completed between the calls.
        /// </summary>
        public void StartDownloads(IEnumerable<Uri> uris)
        {
            _wasCancelled = false;
            foreach (var uri in uris)
            {
                WebClient wc = new WebClient();
                wc.OpenReadCompleted += FetchCompleted;
                wc.DownloadProgressChanged += ProgressChanged;
                Interlocked.Increment(ref _downloadcount);
                lock (_webClients)
                    _webClients.Add(wc);
                wc.OpenReadAsync(uri);
            }
        }

        private void ProgressChanged(object sender, System.Net.DownloadProgressChangedEventArgs e)
        {
            Action evt = DownloadProgressChanged;
            if (evt == null)
                return;
            long bytesReceived = 0;
            long totalBytesToReceive = 0;
            lock (_webClientsProgress)
            {
                while (_webClientsProgress.Count < _webClients.Count)
                    _webClientsProgress.Add(new KeyValuePair<long, long>(0, 0));
                int idx = _webClients.IndexOf((WebClient) sender);
                _webClientsProgress[idx] = new KeyValuePair<long, long>(e.BytesReceived, e.TotalBytesToReceive);
                foreach (var pair in _webClientsProgress)
                {
                    bytesReceived += pair.Key;
                    totalBytesToReceive += pair.Value;
                }
            }
            evt(this, new DownloadProgressChangedEventArgs(bytesReceived, totalBytesToReceive));
            
        }

        public class DownloadProgressChangedEventArgs : ProgressChangedEventArgs
        {
            public DownloadProgressChangedEventArgs(long bytesReceived, long totalBytesToReceive)
                : base(totalBytesToReceive == 0 ? 0 : (int) (bytesReceived * 100 / totalBytesToReceive), null)
            {
                BytesReceived = bytesReceived;
                TotalBytesToReceive = totalBytesToReceive;
            }

            public long BytesReceived { get; private set; }
            public long TotalBytesToReceive { get; private set; }
        }

        /// <summary>
        /// Called when one of the downloads is done
        /// </summary>
        private void FetchCompleted(object sender, OpenReadCompletedEventArgs e)
        {
            if (_wasCancelled) return;
            if (e.Error != null || e.Cancelled)
            {
                _firstError = e.Error;
                CancelAsync();
                return;
            }
            if (!_wasCancelled)
                LoadPackagedAssemblies(e.Result);

            int left = Interlocked.Decrement(ref _downloadcount);
            if (left > 0)
                return;
            //Since we added these as we got them, the dependencies are actually last, 
            //so reverse the order, so that dependencies are loaded first
            _assemblies.Reverse(); 
            OnDownloadComplete(_wasCancelled);
        }

        private void OnDownloadComplete(bool wasCancelled)
        {
            IEnumerable<Assembly> assms = null;
            if (!wasCancelled)
                assms = from kvp in _assemblies select kvp.Key.Load(kvp.Value);
            _assemblies.Clear();
            _webClients.Clear();
            _webClientsProgress.Clear();
            Action<AsyncCompletedEventArgs, IEnumerable<Assembly>> evt = DownloadCompleted;
            if (evt != null)
                evt(new AsyncCompletedEventArgs(_firstError, wasCancelled, null), assms);
        }

        /// <summary>
        /// Cancel all pending downloads. Not thread-safe with respect to calls to StartDownloads
        /// </summary>
        public void CancelAsync()
        {
            _wasCancelled = true;
            lock (_webClients)
            {
                foreach (var wc in _webClients.Where(wc => wc.IsBusy))
                    wc.CancelAsync();
            }
            OnDownloadComplete(true);
        }

        /// <summary>
        /// Load all DLLs from ZIP files and XAP files; for the XAP files, also initiate the download of any non-included dependencies
        /// </summary>
        private void LoadPackagedAssemblies(Stream packageStream)
        {
            StreamResourceInfo packageStreamInfo = new StreamResourceInfo(packageStream, null);
            StreamResourceInfo manifestStreamInfo = Application.GetResourceStream(packageStreamInfo, new Uri("AppManifest.xaml", UriKind.Relative));
            if (manifestStreamInfo == null) //Zip file with DLLs only
            {
                foreach (var filename in GetFileNames(packageStream))
                    Add(packageStreamInfo, filename);
                return;
            }

            using (XmlReader reader = XmlReader.Create(manifestStreamInfo.Stream))
            {
                reader.ReadToFollowing("AssemblyPart");
                do
                {
                    string source = reader.GetAttribute("Source");
                    Add(packageStreamInfo, source);
                } while (reader.ReadToNextSibling("AssemblyPart"));

                //Unfortunately the way MS did this, they didn't bother writing which assemblies are actually behind those links,
                //so we are forced to fetch the files even if they turn out to already be loaded
                List<Uri> newdependencies = new List<Uri>();
                reader.ReadToFollowing("ExtensionPart");
                do
                {
                    var uri = new Uri(reader.GetAttribute("Source"), UriKind.RelativeOrAbsolute);
                    lock (_dependencies)
                        if (_dependencies.Add(uri))
                            newdependencies.Add(uri);
                } while (reader.ReadToNextSibling("ExtensionPart"));

                if (!_wasCancelled)
                    StartDownloads(newdependencies);
            }
        }

        private void Add(StreamResourceInfo packageStreamInfo, string source)
        {
            Stream stream = Application.GetResourceStream(packageStreamInfo, new Uri(source, UriKind.Relative)).Stream;
            var assemblyPart = new AssemblyPart { Source = source };
            lock (_assemblies) //We don't load them here, so that we can load them in the right order at the end
                _assemblies.Add(new KeyValuePair<AssemblyPart, Stream>(assemblyPart, stream));
        }

        /// <summary>
        /// This really ought to be in the Silverlight library
        /// </summary>
        private static IEnumerable<string> GetFileNames(Stream stream)
        {
            stream.Seek(0, SeekOrigin.Begin); //rewind
            var ret = new List<string>();
            var archiveStream = new BinaryReader(stream);
            while (true)
            {
                string file = GetFileName(archiveStream);
                if (file == null) break;
                ret.Add(file);
            }
            stream.Seek(0, SeekOrigin.Begin); //rewind
            return ret;
        }

        private static string GetFileName(BinaryReader reader)
        {
            // http://www.pkware.com/documents/casestudies/APPNOTE.TXT
            var headerSignature = reader.ReadInt32();  // local file header signature     4 bytes  (0x04034b50)
            if (headerSignature != 0x04034b50)
                return null; // Not a zip file
            reader.ReadInt16();                        // version needed to extract       2 bytes
            reader.ReadInt16();                        // general purpose bit flag        2 bytes
            reader.ReadInt16();                        // compression method              2 bytes
            reader.ReadInt16();                        // last mod file time              2 bytes
            reader.ReadInt16();                        // last mod file date              2 bytes
            reader.ReadInt32();                        // crc-32                          4 bytes 
            int compressedsize = reader.ReadInt32();   // compressed size                 4 bytes
            reader.ReadInt32();                        // uncompressed size               4 bytes
            short filenamelength = reader.ReadInt16();   // file name length                2 bytes
            short extrafieldlength = reader.ReadInt16(); // extra field length              2 bytes
            byte[] fn = reader.ReadBytes(filenamelength); // file name                    (variable size)
            string filename = Encoding.UTF8.GetString(fn, 0, filenamelength);
            //And then make sure to skip the actual data, so that we can loop over it
            reader.BaseStream.Seek(compressedsize + extrafieldlength, SeekOrigin.Current);

            return filename;
        }

    }
}
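Typical usage from the shell looks something like this (my sketch, not from the original code; progressBar and the MEF hand-off are hypothetical placeholders):

```csharp
var loader = new XapLoader();
loader.DownloadCompleted += (args, assemblies) =>
{
    if (args.Error != null || args.Cancelled)
        return;
    // assemblies now contains myapp.xap's assemblies plus all of its
    // library-cache dependencies; e.g. hand them to MEF from here.
};
loader.DownloadProgressChanged += (sender, e) =>
    progressBar.Value = e.ProgressPercentage; // hypothetical progress UI
loader.StartDownloads(new Uri("myapp.xap", UriKind.Relative));
```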
Posted by: Dennis | May 18, 2009

How to do parallel work with PageMethods

Let's take a trivial example. Here we make four asynchronous calls to the server, to some page method called DoWork.

function pageLoad() {
   PageMethods.DoWork(OnSucceeded, OnFailed);
   PageMethods.DoWork(OnSucceeded, OnFailed);
   PageMethods.DoWork(OnSucceeded, OnFailed);
   PageMethods.DoWork(OnSucceeded, OnFailed);
}

The DoWork method returns the time it started and the time it finished:

[WebMethod]
public static string DoWork()
{
  DateTime dt = DateTime.Now;
  Thread.Sleep(2000);
  return "Success @ " + dt.ToString(DateTimeFormatInfo.InvariantInfo) + " - " +
    DateTime.Now.ToString(DateTimeFormatInfo.InvariantInfo);
}

If you create a brand new project in Visual Studio and run the above, you will get something similar to:

Success @ 05/18/2009 20:04:52 – 05/18/2009 20:04:54
Success @ 05/18/2009 20:04:52 – 05/18/2009 20:04:54
Success @ 05/18/2009 20:04:52 – 05/18/2009 20:04:54
Success @ 05/18/2009 20:04:52 – 05/18/2009 20:04:54

And then 10 minutes later, when you try to do the exact same thing in your production environment and test it, you get:

Success @ 05/18/2009 20:18:09 – 05/18/2009 20:18:11
Success @ 05/18/2009 20:18:11 – 05/18/2009 20:18:13
Success @ 05/18/2009 20:18:13 – 05/18/2009 20:18:15
Success @ 05/18/2009 20:18:15 – 05/18/2009 20:18:17

This was probably not what you were expecting. The reason is rather obscure: if you simply have a Global.asax, then ASP.NET will take an exclusive lock on the session object on every call to the server for a given session.

This means that in practice, your users can only make one call to the server at a time.

So how do we fix it?

One might think that disabling session access for the WebMethod is enough, like this:

[WebMethod(false)] <------- HERE
public static string DoWork()

But that would be incorrect. In fact that does absolutely nothing.

Instead you have to mark your whole PAGE as sessionless, using the setting in the aspx page:

EnableSessionState="False"
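That is, in the @ Page directive at the top of the aspx; the other attribute values here are just placeholders for whatever your page already has:

```
<%@ Page Language="C#" CodeBehind="Default.aspx.cs" Inherits="MyApp._Default" EnableSessionState="False" %>
```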

That will again return dates that are the same for each call. Well, almost at least. If we increase the number of calls to the server to e.g. 8, what we get back is:

Success @ 05/18/2009 20:24:47 – 05/18/2009 20:24:49
Success @ 05/18/2009 20:24:47 – 05/18/2009 20:24:49
Success @ 05/18/2009 20:24:47 – 05/18/2009 20:24:49
Success @ 05/18/2009 20:24:47 – 05/18/2009 20:24:49
Success @ 05/18/2009 20:24:48 – 05/18/2009 20:24:50
Success @ 05/18/2009 20:24:48 – 05/18/2009 20:24:50
Success @ 05/18/2009 20:24:49 – 05/18/2009 20:24:51
Success @ 05/18/2009 20:24:49 – 05/18/2009 20:24:51

The reason is that the browser restricts how many connections it will open to the same host at the same time. So even though it looks like all 8 methods are called at once on the client side (the JavaScript call returns, FireBug will show nice graphs saying they are all called at the same time, etc.), in reality the browser waits until a connection is available.

But what do you do when you actually need to use Session somewhere in your page or you have a nice 4+ CPU machine you actually want to utilize? Well, that gets a little tricky.

First we have to change the client side. Instead of making a single webmethod call, we need to first make a call to a “Begin” method and then a call to an “End” method to get the result.

function DoSomeWork() {
   PageMethods.BeginDoWork(OnSucceededBegin, OnFailed);
}

function OnSucceededBegin(result) {
   PageMethods.EndDoWork(result, OnSucceeded, OnFailed);
}

On the server side it gets even more tricky.

   1: [WebMethod]
   2: public static string BeginDoWork()
   3: {
   4:    string g = Guid.NewGuid().ToString();
   5:    Func<string> f = () => DoWork();
   6:    IAsyncResult call = f.BeginInvoke(null, f);
   7:    lock (OnGoingWork)
   8:       OnGoingWork[g] = call;
   9:    return g;
  10: }
  11: private static readonly Dictionary<string, IAsyncResult> OnGoingWork = new Dictionary<string, IAsyncResult>();

In line 4 we create a token. In line 5 we make a lambda function that calls our original method. In line 6 we start executing it on another thread; notice that the 2nd parameter is the lambda function itself. Lines 7 to 9 store the IAsyncResult in a static dictionary, using the token as the identifier.

   1: [WebMethod]
   2: public static string EndDoWork(string guid)
   3: {
   4:    IAsyncResult call;
   5:    lock (OnGoingWork)
   6:    {
   7:       call = OnGoingWork[guid];
   8:       OnGoingWork.Remove(guid);
   9:    }
  10:    Func<string> f = (Func<string>) call.AsyncState;
  11:    call.AsyncWaitHandle.WaitOne();
  12:    return f.EndInvoke(call);
  13: }

In the “End” function we then need to get the result of the asynchronous call. Lines 5 to 9 fetch the IAsyncResult out of the static dictionary and clean it up. Line 10 gets the lambda function out of the IAsyncResult, using the small hack from line 6 in the “Begin” function. Line 11 waits for the call to actually finish, and then in line 12 we get the result.

And voila:

Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:48 – 05/18/2009 20:28:50
Success @ 05/18/2009 20:28:49 – 05/18/2009 20:28:51

Posted by: Dennis | April 22, 2009

Aspnet_Compiler compilation speed (Part 2)

In a previous post I discussed some of the things we can do to speed up aspnet_compiler. Today I shall talk about some of the more exotic things to mess with.

Fixed Names

FixedNames is the setting in the GUI called “Create a separate assembly for each page and control output”. When you use it, each individual file is compiled on its own, and ASP.NET is super slow at doing this.

Unfortunately it is sometimes necessary. With some source trees you may get errors at runtime that a certain DLL could not be found. I haven't discovered the exact circumstances that cause it, but I suspect it has something to do with circular references between pages and controls.

You can check whether you will run into this by compiling both with the default merge into one assembly and in fixed-names mode with merging enabled anyway (the latter can only be configured directly in the wdproj file) and verifying that the results are identical.

Avoid Copy

Based on feedback from multiple users we have enhanced WDP so that it does not wipe out the precompiled web from previous WDP run until the current WDP does not build successfully… This is specifically useful if you have created IIS Virtual directory pointing to the output location…  Now even if your current WDP build does not succeed your web site in IIS will still continue to function… At the same time this also implies that it will be important to take a note of the WDP output in the VS output window before grabbing the deployed output for any further processing as your output folder might have the output from the previous build…

This is from the VS 2008 version of Web Deployment Project (WDP). If you noticed a severe drop in performance between 2005 and 2008, this is probably the reason.

What it actually does is make a temporary directory, copy all of the files there, complete the compile as normal, delete the old deployment dir, copy the tempdir over to the deployment dir, and then delete the tempdir… Now, why don't they just MOVE the tempdir to the deployment dir and save the double copy? Perhaps it is something to do with permissions on the deployment folder.

No matter the reason, it seems like a good idea to fix it. The easiest is obviously to disable the temporary copy, but if you actually need the original to be usable even if the build fails, then obviously you cannot do that.

The other thing you can do is use that ramdrive you already set up for the tempdir in part 1. In order to do this, you need to edit the MSBuild source of WDP, specifically lines 89-91:

<CreateProperty Value=".\TempBuildDir\">
  <Output TaskParameter="Value" PropertyName="TempBuildDir" />
</CreateProperty>

Just change the value to point to somewhere on your ramdrive.
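For example, something like this, where R:\ is just a stand-in for wherever your ramdrive is mounted:

```xml
<CreateProperty Value="R:\WDPTempBuildDir\">
  <Output TaskParameter="Value" PropertyName="TempBuildDir" />
</CreateProperty>
```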

Posted by: Dennis | April 16, 2009

Aspnet_Compiler compilation speed (part 1)

The aspnet_compiler is a really useful util for precompiling websites, so that the startup time in production environments is minimized and none of the actual aspx pages needs to be part of the deployment.

However, it is also a real piece of crap from a performance perspective, reflecting the truly horrible performance that ASP.NET itself has on start-up. Which is no surprise, since they use the exact same way of compiling sites.

Therefore we can also use various techniques for speeding up asp.net to speed up the aspnet_compiler.

First of all, you should not be running the aspnet_compiler as part of your normal build. That is just a waste of time. Instead let your auto builder do it after you commit your code (Because you are running a source repository and you do have a build server, right? Otherwise, implement that as the very first thing).

The second thing is to set up defragmentation of the drives on your build server; they become extremely fragmented if you have lots of people committing code. Obviously, if you have an SSD you don't need to do this part (at least until there is an SSD-compatible defragger).

The next step depends on how much memory you have in your build server. If you have "enough", then make a ram drive for the following; otherwise simply use a different drive from the one where your source code is located.

The third step is to move the temporary location that ASP.NET uses to a faster one. You do this by editing either the web.config in %systemroot%\Microsoft.NET\Framework\v2.0.50727\CONFIG if your build server is dedicated to building, or the individual web.config files (just remember not to commit them). The former is highly recommended.

What you want to edit is the tempDirectory of the compilation tag. Point it to a folder on the drive you selected above. Remember that this folder must have similar permissions as the %systemroot%\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files folder, which is the default.
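In the config it ends up looking something like this (the path is a placeholder for whatever folder you chose above):

```xml
<system.web>
  <compilation tempDirectory="R:\aspnet_temp" />
</system.web>
```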

I have a few more tricks up my sleeve; until next time.

Posted by: Dennis | April 2, 2009

What to do when the known world goes broke?

The USA is on the brink of bankruptcy, and it seems most European banks have also been lying about their books, using AIG.

The result is the outlook to a depression greater than the one seen in the 1930s. The question then is: what will the world look like when the true depression hits? And what will it look like afterwards?

Posted by: Dennis | March 31, 2009

A web shop (Part 1)

I really tried. I did. I started up the Entity Framework and wasted hours upon hours trying to make it do even the simplest little thing.

The thing I was trying to do was create 3 super simple tables: a Product table, a Product_Language table containing translations of any text associated with the product, and finally a Language table to hold stuff like the ISO code.

Problem 1: The brain-dead EDMX editor does not actually allow you to create tables. Uh no, that has to happen somewhere else.

Problem 2: That somewhere else inside of Visual Studio is the Data Connections Manager. I am not sure I would recommend this to even my worst enemy. In order to make a product table with 1 (yes 1) column, you have to click on the following:

  • Menu bar –> View –> Server Explorer
  • Navigate your way into Data Connections –> <your connection> –> Tables
  • Right click Tables and click “New table”
  • Write Id <tab>
  • Now I don’t know exactly how many firing neurons the guy who chose the default Datatype had. But I am sure a lot of people hate him for choosing “nchar(10)” as the default. Perhaps it was chosen because no one in their right mind would ever pick it for any real column, thus forcing the user to think. Hint, Microsoft: when I put “Id” at the end of a name, the datatype I want is probably int. Anyway, type “int” <tab>
  • Press space to deselect “allow nulls”. Again the guy with the lack of firing neurons chose the default. Joel has a thing about people not figuring out how recursion and pointers work (he is right btw.); I, on the other hand, have noticed that people’s eyes go similarly blank when you begin to talk about nullability. People just don’t understand nullability. If you don’t know what it is, don’t use it. About the only place where you really NEED it is when you have foreign keys you want to be optional. Otherwise, leave it to people who understand nullability.
  • Now use your mouse to expand the “Column properties” pane, so that you can actually find the thing you are looking for. Assuming, of course, that you are reading this, so that you know you are looking for something down there.
  • Expand the little plus next to “Identity Specification”, then double-click the “No” in the “(Is Identity)” row to change it into a “Yes” (because double-clicking the exact same “No” in the “Identity Specification” row does exactly nothing)

Congratulations, you have now spent 10 minutes on something that could be written in 10 seconds: “CREATE TABLE Product (Id int IDENTITY NOT NULL PRIMARY KEY)”
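For reference, the whole three-table schema described above can be written as plain T-SQL in well under a minute. This is a sketch under my own assumptions: the table and column names (IsoCode, Name, and so on) are guesses at what the post describes, not the actual schema used.

```sql
-- Product: just an auto-incrementing key for now
CREATE TABLE Product (
    Id int IDENTITY NOT NULL PRIMARY KEY
);

-- Language: holds the ISO code and similar metadata
CREATE TABLE Language (
    Id int IDENTITY NOT NULL PRIMARY KEY,
    IsoCode nchar(2) NOT NULL
);

-- Product_Language: per-language texts for a product,
-- with the multi-column primary key the EDMX editor choked on
CREATE TABLE Product_Language (
    ProductId int NOT NULL REFERENCES Product (Id),
    LanguageId int NOT NULL REFERENCES Language (Id),
    Name nvarchar(255) NOT NULL,
    PRIMARY KEY (ProductId, LanguageId)
);
```

Note that in T-SQL the IDENTITY property goes right after the data type, before any NULL or key constraints.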

Problem 3: Every time I closed the EDMX editor to see the weird errors it would spew, it could not be reopened again. Restarting VS was the only option.

Problem 4: For some weird reason it would not accept my multi-column key on the product language table. This was solved by simply deleting the EDMX and starting over again…

Problem 5: I do not know how many hundreds of mouse clicks I needed in that editor to figure out even the tiniest things.

It actually works now.

But I will not try the EDMX editor again for a very, very long time. So next time… creating the same thing with NHibernate.

Posted by: Dennis | March 31, 2009

Pixel Qi screens coming to an eBook reader near you

It seems that Pixel Qi has finally put their screens into production, and someone has put them in an eBook reader.

Pixel Qi are the people who made the screen for the OLPC, the super cute little green laptop for children in developing countries (oddly enough, the USA is not registered as a developing country).

Looking forward to being able to put my reading on an eBook reader, instead of wasting paper and money printing it.

Posted by: Dennis | March 29, 2009

Forgive me, for I have sinned

I forgot Earth Hour. There, I admitted it… I was installing VS and forgot all about it.

That reminded me of a cafe on the way to work which was advertising with “We will turn off the lights for one hour, and instead burn candles”. This illustrates just why humans are unfit for inhabiting earth. Except me, my friends, my family, everyone involved in products I use and produce, and of course their family and friends and everyone involved in products they use and produce, etc.

Let us just examine exactly what it means to replace 1 hour of electrical light with 1 candle. 1 hour of light from an old, energy-hungry 60W light bulb uses a massive 0.06 kWh. This site rates that we emit approx. 0.37 kg of CO2 per kWh, so we would have emitted about 22 grams of CO2 (0.06 kWh × 0.37 kg/kWh ≈ 0.022 kg). Money-wise, it probably cost around 0.5 cents at current VN prices.

A candle, however, is harder to estimate. Let us consider just what it takes to even be allowed to burn it in the first place. First we need to bring it to the cafe from wherever it was bought. Someone had put it on the shelves there, presumably using some machinery and some resources to actually process the sale of the item. To even get it to the supermarket, someone else had to drive it there from the packaging central, which again had it transported from somewhere. At the end of all this transportation, we will find a factory.

At this factory, they used resources to package the candles, and they used resources to form the wax that was melted in a big melting pot. And I don’t even know where the wax or the packaging materials came from. We do know that, money-wise, the candle is about 1 USD.

So this cafe replaced a highly optimized system for delivering electricity to an electric light with an item that requires hundreds of people and thousands of kilometers of physical transportation. Not to mention it cost the cafe 200 times more.

I do like candles, but they should definitely not be used as a REPLACEMENT for electrical light for the purpose of saving the environment.
