
Fear and Loathing

Gonzo blogging from the Annie Leibovitz of the software development world.

  • Conditional Mandatory Fields and Inter-form Filtering

    No, I'm not having that big neurotic breakdown you've all been expecting (although from the title one has to wonder). I'm on a quest. A quest for knowledge. A quest for an answer. Before I head off to dig into some Microsofties' gray matter, I thought I would throw these questions out there for anyone to pick at.

    SharePoint forms are pretty simple things. You define columns in a list and the ListFormWebPart will spit out a data entry form for all the fields, complete with the contract you created for each field (mandatory, HTML enabled, drop-down list vs. checkboxes, etc.). This is great but it does have its limitations, namely around two things you can do fairly easily with regular ASP.NET forms but not here.

    The first is the idea of a conditional mandatory field (or mandatory conditional if you prefer). Suppose you want the user to enter a Company name but only if they check a checkbox on the form first. This is handled by a simple event handler (CheckedChanged on the checkbox) in the Web Form. However there's no interface in SharePoint to hook into the ListFormWebPart this way (well, none that I know of), so you either define the field as mandatory or not. The alternative is that you build yourself an ASP.NET form in a custom web part (or use SmartPart with a User Control), do the coding yourself, and then write the results out to your SharePoint list, but now you're in development land rather than configuration land and it's expensive.
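
    To make that concrete, here's a rough sketch (hedged: it assumes a SmartPart-hosted User Control, and the control names are hypothetical placeholders wired up in the .ascx) of the event handler approach in plain ASP.NET:

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // Hypothetical User Control: the Company field is only mandatory when the checkbox is checked.
    public class ConditionalCompanyControl : UserControl
    {
        protected CheckBox hasCompany;                     // AutoPostBack="true" in the .ascx
        protected TextBox companyName;
        protected RequiredFieldValidator companyValidator; // ControlToValidate="companyName"

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            hasCompany.CheckedChanged += new EventHandler(HasCompany_CheckedChanged);
        }

        private void HasCompany_CheckedChanged(object sender, EventArgs e)
        {
            // Toggle the mandatory contract based on the checkbox state
            companyValidator.Enabled = hasCompany.Checked;
            companyName.Enabled = hasCompany.Checked;
        }
    }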

    The second is filtering interdependent lists. There have been a few posts I've seen on this. Probably the best one I've seen is this one by Patrick Tisseghem in his blog. Suppose you have one list (Country) that needs to filter another (Province/State) which needs to filter yet another (City). Typical scenario. You don't want to show the list of every city to someone to pick from. You might also have a business condition where you don't want to present lists of information to some users based on some condition that's outside the scope of your SharePoint environment (say a corporate directory of who reports to whom). Anyways, again it's a fairly simple thing to do with regular ASP.NET by hooking into a change event on the list and rebinding the data to the lookup. Patrick's blog is a great tip; however, it does demand that you know the Guid for the list and runtime things like that to make it work. If you're trying to bake a solution into something like a SCHEMA.XML for a list you usually end up running into dead ends (see my frustration with not being able to define Lookup fields in the list definition for more on this).
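
    For reference, the ASP.NET version of that change-event-and-rebind dance looks something like this rough sketch (the site URL, drop-downs, and list/field names are all hypothetical; assume countryDropDown and provinceDropDown are DropDownList members of your web part or user control):

    using System;
    using System.Web.UI.WebControls;
    using Microsoft.SharePoint;

    // When the Country selection changes, re-query the Province list for items whose
    // Country lookup matches the selection and rebind the Province drop-down.
    private void CountryDropDown_SelectedIndexChanged(object sender, EventArgs e)
    {
        SPSite site = new SPSite("http://localhost/sites/demo");
        SPWeb web = site.OpenWeb();

        SPQuery query = new SPQuery();
        query.Query = "<Where><Eq><FieldRef Name='Country'/>" +
                      "<Value Type='Lookup'>" + countryDropDown.SelectedItem.Text + "</Value></Eq></Where>";

        provinceDropDown.Items.Clear();
        foreach (SPListItem item in web.Lists["Province"].GetItems(query))
        {
            provinceDropDown.Items.Add(new ListItem((string)item["Title"], item.ID.ToString()));
        }

        web.Dispose();
        site.Dispose();
    }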

    Guids are a big part of SharePoint. Every site, every web, and every list has a guid to identify it. The trouble with guids is that they're only defined at runtime, meaning you either need to suck them out using CAML (which I think I've seen done in some posts by Ian Morrish before) or drop an albeit small, but custom-written-none-the-less, web part onto a page to get what you need. And to tie in with the "don't touch the system files in SharePoint or you'll be unsupported" thing that has reared its head recently, you won't be able to do this very well with stock lists or admin pages.
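
    If you do go the custom-written route, the code itself is tiny; a throwaway sketch like this (a console app against a site URL of your choosing) will cough up the guids the UI won't show you:

    using System;
    using Microsoft.SharePoint;

    // Dump the guids for a site, its root web, and every list in it so they can
    // be pasted into whatever configuration or query needs them.
    public class GuidDumper
    {
        public static void Main(string[] args)
        {
            SPSite site = new SPSite("http://localhost");
            SPWeb web = site.OpenWeb();

            Console.WriteLine("Site: {0}", site.ID);
            Console.WriteLine("Web:  {0}", web.ID);
            foreach (SPList list in web.Lists)
            {
                Console.WriteLine("List '{0}': {1}", list.Title, list.ID);
            }

            web.Dispose();
            site.Dispose();
        }
    }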

    Looking at the ListFormWebPart (and the other classes in the Microsoft.SharePoint.WebPartPages namespace) it may be possible to override the GetData and RenderWebPart methods to do something funky before it hits the page. Anyways, if anyone has some interesting ideas for solving these it would be good to hear about them. It may be that custom solutions (Web Parts) are what you need here, but that's sometimes overkill. I'll post whatever results come out of discussions with others as we come up with them. Cheers!

  • This wasn't the excuse note you were looking for

    Dear Employer,

    Please excuse Bil Simser from work on Thursday, May 19. He is not feeling well. Bil is at home in bed for the entire day, nursing what appears to be a serious hamster attack. Bil’s illness is in no way, shape or form related to the premiere of the final installment of the greatest story ever, which, coincidentally, premieres on the same date.

    While I can neither confirm nor deny that Bil has called my company, Geek Squad, asking to be set up with wireless access “in case of a space opera-related sick day,” know that if you do receive an e-mail from your prized employee today, it is most likely because he was wise enough to plan ahead in the event of illness.

    But as I mentioned before, Bil is at home, safely in bed, but reachable (in dire emergencies) by e-mail or cell.

    One more thing. Beginning at 12:00PM MST, Bil Simser will be unreachable for about two hours, thirteen minutes and eleven seconds. He will be feeling really bad at this time.

    This wasn’t the excuse note you were looking for,
    Robert Stephens
    Geek Squad — Chief Inspector

  • Someone changed my ONET.XML!

    Okay, so there's some buzz going around based on a recent Knowledge Base article (KB# 898631) from Microsoft on supported and unsupported scenarios with custom site definitions. Some people are upset at the scenarios and say that the reason we're using sitedefs (vs. .stp files) is that we can apply the changes against existing sites to "automagically" update a site. Heather Solomon (new to my SharePoint blogroll) has a great reference blog on what directories contain what files and where they end up when a site/area gets created. Nice stuff and very handy.

    Anyways, here's some scenarios that I ran into a few times with some guidelines on how far you can push the sitedef envelope (even if it is unsupported).

    Modifying the default setups. This is completely unsupported, and while I do agree (after all, I wouldn't want someone to create a new portal and have things missing that should be there), it does create an age-old problem we've had with SharePoint: you can't create a new portal with custom areas. Sure you can create new areas by cloning existing ones, but since you can't modify the default setups, all portals will always start exactly the same for everyone. The real crux is for organizations that want to customize the My Site so all new My Sites will get something more "corporate" on top of the standard stuff. After all, I can create a new default team site by copying the STS directory and messing with all the options (say I don't want people to be able to create discussion areas, no problem). However I can't touch My Site because it's part of the base system and thus unsupported.

    There's another kicker to modifying default setups. Like I said with the Portal, these are default setups. I don't recall seeing any hardcoded values anywhere that depend on these things being there, but if I was supporting a product I sure wouldn't want someone calling me if they messed with my control pages. That's like screwing with a .NET assembly, removing bits and pieces of it, and recompiling it back in the hope it still works. The only difference is that you don't need to decompile the schema files that make up a SharePoint site because they're all there in front of you, just a flick of Notepad away from becoming your worst enemy. So okay, I'll buy not modifying the default setups, but for the love of all that is holy, can we at least modify My Site and have a choice of which portal definition we use when creating a new portal?

    The other non-supported scenario is updating a sitedef once sites have been created. I'm on the fence with this one because of the potential to totally fubar an existing site (which I've done many times; it's not pretty). I can understand the need to put a stake in the ground and say this is unsupported. Like I said, I've messed up sites by doing a redeploy over top of existing ones and boy did it hurt. There were points where I *had* to open the thing up in FrontPage just to delete it. Modifying sites in-place (via an updated sitedef) is a bit of a sore spot because, as Serge van den Oever put it in his blog, the reason we use sitedefs is so we can have more flexibility with the creation of a site (sans programming) and update it easily.

    Here's the scenario I ran into with doing this. Create a list and define a field to be a DateTime field (don't get me started on Lookup fields again). Now sometime later you decide to change the DateTime field to a Text field. Boom. You try to revisit a list that uses that field and edit it and you'll be in a world of hurt. I don't have the exact DON'T DO THIS list of what fields can and cannot be transformed from and to (anyone game to put one together?), but basically changing types can be bad. Sometimes they can work. For example, going from a numeric field (as long as you don't specify decimals, etc.) to text is sometimes okay. Going from Text to, say, Choice sometimes works. Again, there are some scenarios that work and some that just plain put you in reboot-the-server land; you are screwed, do not pass Go. This is due to the fact that behind the scenes it's not a 1:1 mapping of the column you create to a column in the database, and all that metadata in all those tables sometimes just gets really, really confused.

    Adding new fields (or adding anything). This isn't so much of a problem. Adding new fields to SCHEMA.XML generally works with no ill effects (and I have yet to create a situation where it caused any). After all, you are just adding more metadata about the lists, so when a list renders its add/edit form it's just another field to display. The only drawback here is if you make a new field mandatory you can end up in a situation where your list's metadata isn't valid, because the new field will just be blank on any existing items.

    Of course, any of these scenarios, good, bad, or otherwise, are officially unsupported according to the KB article so buyer beware. What's odd is they support modifying sites via FrontPage. Now while I can only mess up one site at a time with FP, I can really mess it up with a simple removal of some DHTML or Web Part Zone tags (read: make the site unrenderable to the point where you can't fix it). So that doesn't make sense to me.

    P.S. I also stumbled across a nasty gotcha recently around missing files and the execution of ONET.XML. In ONET.XML you can specify files to copy to your new site when it gets created (again, referring back to the body of this post, this section of ONET.XML only gets executed once on site creation, hence why you can't update it and expect your sites to automatically be populated with new files that were not there before). Anyways, in the Modules section of ONET you can specify files to copy and where to copy them to. This is great, but you'll get a nasty message as ONET is being executed when it hits a file it can't find. Yes, it's a file-not-found message and the new SharePoint Blue Screen of Death, which has practically no information at all. I came across this while we were creating new sites with dozens of custom lists, dozens more lists being generated, and even more dozens of files being copied. Which file was it that was not found? The FileIOException class that gets thrown if you're doing this programmatically contains a property called, what do you know, FileName. So why can't the IIS log, the SharePoint log, or even the SharePoint page itself tell me what file it can't find? I had to hunt and peck through my entire system until I found the culprit. In any case, I closed the ticket with Microsoft and asked them to kindly add more informative error messages, especially when they know the context of the message. It's akin to you buying something and your bank or the retailer saying "Not enough funds". So how much more do I need? I'm just looking for a little break here.

  • WikiSharePoint!

    Came into the office this weekend (yeah, I need a life) and found a blog posting from last Thursday by Mart Muller. He's put together a set of Web Parts that provide a Wikipedia-like experience but hosted on your SharePoint site.

    First there's the Search Results Web Part. This displays any hits on the term you search for using the Wiki Search Web Part. It has links to the content of the search results and will also automatically hyperlink phrases that match other Wiki entries. There's also a Tree View of the Wiki entries so you can navigate through preferred and variant terms. The whole thing is hosted in one custom list (a thesaurus) with logic behind it to parse out the terms and serve up the information all linked together with like terms. Adding an entry is easy and just like filling out any other SharePoint list.

    I installed this quickly and easily in our dev environment to give it a whirl. A couple of minutes later I created a site to host the web parts (you can also create an area on a portal and it works the same) and added a few terms. Simple to implement and workable. Even though Mart says this is beta, it seems to be pretty solid and usable and can only get better. Combine this with Jim Duncan's cBlog Templates and you've got yourself a pretty powerful set of services all hosted using SharePoint.

    There have been requests before about creating KBs or other tools of that type with SharePoint, and we've all told people it's doable but just needs a little work to bring things together. I can see this as a great start at a knowledge base in your organization. Just create an area on your Portal called Knowledge Base, drop the Web Parts on, and let everyone start adding terms. The great thing is that there are lots of ways to cross-link the information, and terms, once they're entered, will automatically hyperlink to other terms in their bodies, so it saves you from having to manually create links. So check out the Web Parts here and let Mart know what you think.

  • The pain of creating lookup fields

    The SharePoint UI is great for end users. With very little training they can go off and create new custom lists and through the magic of a Lookup field they can create lookups into other lists and life is grand. For the developer though life is a rotten bag of apples when it comes to Lookup fields.

    There are two ways to create new fields in SharePoint sites. You can define them through Xml or create them programmatically. With the Xml definitions, it's a matter of copying the CUSTLIST definition (which is just a simple empty list with a single field, Title) to your new definition and adding fields. Here's the definition for a new text field:

    <Fields>
       <Field Name="MyField" DisplayName="My Special Field" Type="Text" />
    </Fields>

    Simple and easy. The <Field> tag is defined in SCHEMA.XML for your custom list and supports all the attributes that are in the SDK documentation. Well, almost. The Lookup type is there, so if you wanted to define it you would think you'd do something like this:

    <Fields>
       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" />
    </Fields>

    The documentation says that the List and ShowField attributes can be used with this type. The List attribute just says it's the internal name of the list (which we would assume is the list we want to look up values from). The ShowField attribute is the name of the field to display, used to override the default (Title) and display another field from the external list. There's also another attribute called FieldRef, which is the name of another field to which the field refers, such as for a Lookup field. All in all, it's very confusing but you would think you could do this:

    <Fields>
       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" List="MyLookupList" />
    </Fields>

    And if you don't want the lookup to use the Title Field in MyLookupList then you can use:

    <Fields>
       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" List="MyLookupList" FieldRef="LookupFieldName" />
    </Fields>

    So let's put this to the test with some real data. Let's create two custom lists called Employee and Department. Each entry in the Employee list has a Lookup field that points to the Name field in the Department list. Here's the Lookup field definition in our Employee list:

    <Fields>
       <Field Name="Department" DisplayName="Department Name" Type="Lookup" List="Department" FieldRef="Name" />
    </Fields>

    However if you create your lists you'll notice two things. First, if you add an item to your Employee list (the one with the Lookup field in it) you'll see there are no choices available for Department (assuming you added values to that list first). Second, if you try to modify the Lookup field through the UI you get this nasty message:

    Guid should contain 32 digits with 4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx).

    So what gives? Simple. The List attribute, while the documentation says it's Text, is not the name of the list. It's the Guid (in the form listed above). The problem, of course, is that Guids are unique and only known after they're generated. There's nothing in an Xml file (no matter how great the Xml file might be) that can dynamically retrieve the Guid. So Lookup fields, IMHO, can't be used in SCHEMA.XML because they have to reference the Guid of the list, and that's not known until the list is created first (feel free to jump in and correct me if I'm wrong).

    Okay, if we can't use SCHEMA.XML to do this, we can write code. Yes, beautiful glorious code. If you have a Lookup field and you retrieve the raw text from it, it looks like this:

    42;#Information Services

    The 42 refers to (besides the answer to life, the universe, and everything) the ID of the item in whatever list you're looking up. When you retrieve the lookup value you get "42;#Information Services" which you're going to have to transform with a simple little RegEx call if you want to show it to a user.
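
    That RegEx call is about as small as they come; here's a little sketch (the helper name is mine) that pulls the display text back out of the raw value:

    using System.Text.RegularExpressions;

    // Split the raw "ID;#Value" lookup string and return just the display text.
    private string GetLookupText(string rawValue)
    {
        // "42;#Information Services" -> "Information Services"
        Match match = Regex.Match(rawValue, @"^(\d+);#(.*)$");
        return match.Success ? match.Groups[2].Value : rawValue;
    }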

    So now you're thinking if I can retrieve it and get "42;#Information Services" I should set it the same way right? Nope. What you need to do is set the Lookup field with the ID of the value it's looking up in the other list. Internally when you set that, SharePoint will do a join and retrieve the textual representation of the lookup information and save it for you.

    Okay, some code to explain all this. This assumes that the site is created with both an Employee and Department list. This snippet will:

    • Add a new Lookup field called Department to our Employee List
    • Fill in some imaginary Department Names
    • Fill in some imaginary Employees that report to various Departments

    private void CreateLookup()
    {
        SPSite site = new SPSite("http://localhost/sites/employee");
        SPWeb web = site.OpenWeb();

        // Get the Department List from the web for lookups
        SPList departmentList = web.Lists["Department"];

        // Get the Employee List from the web
        SPList employeeList = web.Lists["Employees"];

        // Add a new lookup field to the Employee list called Department
        // that will use the Department list for its values
        employeeList.Fields.AddLookup("Department", departmentList.ID, false);

        // Create 2 new departments in the Department list for lookups
        AddDepartment(departmentList, "Information Services");
        AddDepartment(departmentList, "Finance");

        // Now create 5 employees with lookups into each Department
        AddEmployee(employeeList, "Mickey Mouse", departmentList, "Information Services");
        AddEmployee(employeeList, "Goofy", departmentList, "Finance");
        AddEmployee(employeeList, "Donald Duck", departmentList, "Information Services");
        AddEmployee(employeeList, "Daisy Duck", departmentList, "Information Services");
        AddEmployee(employeeList, "Minnie Mouse", departmentList, "Information Services");

        // Cleanup and dispose of the web and site
        web.Dispose();
        site.Dispose();
    }

    private void AddDepartment(SPList list, string name)
    {
        SPListItem newDepartmentItem = list.Items.Add();
        newDepartmentItem["Title"] = name;
        newDepartmentItem.Update();
    }

    private void AddEmployee(SPList list, string name, SPList deptList, string deptName)
    {
        SPListItem newEmployeeItem = list.Items.Add();
        newEmployeeItem["Title"] = name;
        // Set the lookup field to the ID of the matching Department item
        newEmployeeItem["Department"] = FindDepartmentByName(deptList, deptName);
        newEmployeeItem.Update();
    }

    private int FindDepartmentByName(SPList list, string name)
    {
        int itemId = 0;
        SPQuery query = new SPQuery();
        query.Query = "<Where><Eq><FieldRef Name='Title'/><Value Type='Text'>" + name + "</Value></Eq></Where>";
        SPListItemCollection items = list.GetItems(query);
        if (items.Count == 1)
            itemId = items[0].ID;
        return itemId;
    }

    The trick here is that you need to retrieve the ID of the item in the Lookup list based on the name, then use that ID to set the value in the other list. This is done by a simple call to the GetItems method on the list we're looking values up in. There are other ways to do this; for example, if you have a small list you can load it up into a Hashtable and use the name as the key and the ID as the value (see the sketch below). Whatever works for you; the query call can be expensive, so you wouldn't want to do this for everything, but if you just need it for a report or some data loading it's not too bad. Now when you look at your Employee record you'll see it's got a hyperlink to the Department Name field in the Department list. 15 seconds of work for a user in the UI, a couple of hours for you in Visual Studio. Enjoy.
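
    And here's a rough sketch of that Hashtable alternative (same hypothetical lists as above): load the Department list once up front and look IDs up by name instead of issuing a CAML query per employee:

    using System.Collections;
    using Microsoft.SharePoint;

    // Load every Department item once, keyed by Title, so lookups become a
    // dictionary hit instead of a query per employee.
    private Hashtable LoadDepartmentIds(SPList departmentList)
    {
        Hashtable ids = new Hashtable();
        foreach (SPListItem item in departmentList.Items)
        {
            ids[(string)item["Title"]] = item.ID;
        }
        return ids;
    }

    // Usage: newEmployeeItem["Department"] = (int)departmentIds["Information Services"];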

  • Get the GAT!

    If you start a lot of projects you might follow a structure or pattern to how your solution is put together. Mike Roberts has a great blog here on how ThoughtWorks generally sets up their project development tree. They also put together a simple tool to do this (Tree Surgeon). We follow a similar structure, but it's usually all done manually (usually by the lead developer or Solution Architect) at the start of the project. There are some variations on where things go, and some things are driven by the demands of the application. For example, if you're not accessing a database directly (like in a SharePoint application where the database is the SharePoint Object Model) then you don't need a Database project with SQL/Oracle/etc. connections and scripts. Stuff like that sometimes makes it cumbersome to set things up; it usually takes a few hours to get everything just right, and it will vary from person to person and group to group based on preferences for how things are organized.

    If you follow the Patterns & Practices group over at Microsoft, you'll know they've been working on a lot of cool things. They consider themselves in the guidance business and not creating cool tools (this is the same group responsible for Enterprise Library, reference architectures and patterns and other neat stuff). Now imagine if there were a pattern you could follow based on policies and architecture your organization wants/needs/desires to adhere to. Imagine if you could just fill out a single Wizard page and have that entire solution generated for you in minutes. And that every project at your organization followed the same pattern no matter what the specifics of the project were so if you moved from project to project (or worked on multiple projects at the same time) you would immediately know where everything was and where it was supposed to go. The GAT will help you get there (and more).

    Tom Hollander has announced that the Guidance Automation Toolkit (GAT) is now available for download. This download is for Visual Studio 2005 Beta 2 and you'll also need the Guidance Automation Extensions installed (which are available on the same site). Think of the GAT as the old Visual Studio Enterprise Templates on steroids (and beyond). Anyone who built VS 2003 templates knows some of the pain and suffering involved. The GAT makes this easy(er), but it's more than just templates as you can implement policies and patterns, and the whole thing fits into the bigger Software Factories approach to things (although we're not sure how yet, I can guess a few things that could be done in this space). Anyways, if you're into it, Get the GAT!

  • CDI Technology Briefing in Calgary

    I'm off this morning for the better part of the day to hook up with Eli Robilard and CDI on their Technology Briefing Tour, which hits Calgary today. Eli got in last night and we'll be chatting up on SharePointy type stuff. If you're registered for the class, hope to see you at the Westin Hotel today. Cheers!

    PS Wow, did I blabber on Monday about WOW and 64bit stuff. Okay, blackmail material if you ever want to embarrass me at the next presentation I give.

  • Drinking the 64bit Kool-Aid and knowing how it's made

    I had a painful but interesting experience over the last couple of weeks. To set the stage, I currently run 6 working systems at home, all networked and all with various purposes (Linux firewall, programming box, graphics rendering workstation, file server, games machine, etc.). My programming box (XP Pro) was feeling a little under the weather and there was a sale on at Memory Express for a 64bit Athlon CPU. With the 64bit version of XP released I thought it was a good enough reason to upgrade and try out life in the fast lane. Warning: techie-geek speak ahead.

    So off I went and purchased an Athlon 3000+ with a Gigabyte GA-K8NS Ultra-939 Pro board. I grabbed an image of XP 64bit off of MSDN, burnt a copy, and proceeded to format a new drive for the OS. The install went surprisingly well and all I had to do was install some beta drivers for the sound card. Everything else worked perfectly (including the dual LAN and SATA controllers on the mobo) and the system was flying. My intent was to try out the whole 64bit development thingy, so after a lengthy install of Visual Studio 2003 (and 2005) I went ahead and tried some builds of larger systems I had, targeted specifically at x64. Again, everything went well although profiling is hosed in 64bit. It seems the perf tools don't work on x64: the monitor was unable to start the kernel mode driver (VSPerfDrv.sys) and during sampling it would throw an error saying "Profiling WOW64 processes is not supported by this version of the profiling tools". I was also hoping to get Virtual PC (32bit) running on XP 64 to see how it performed and if I could do some silly things like run a 64bit OS (like Windows 2003 Server) in a 64bit-hosted VM using the 32bit application (say that 3 times fast).

    However the biggest problem with x64 is the drivers. All drivers have to be 64bit (read: have to be, not should be) and while this is true for most of the big companies (ATI, etc.), others only have beta versions out or no support at all (the on-board sound driver was beta). There's a CD that comes with the Gigabyte board with some goodies on it (flash BIOS, CPU tuner, etc.) but it wouldn't run at all on a 64bit OS. Why would they package this with a 64bit motherboard? Surely they know that someone would run it on a 64bit OS, so wouldn't they have tested it? Basically it came down to having to use a very limited set of hardware until the drivers are available, and any performance gain was hardly noticeable anyway. I think if I was running Windows 2003 Server 64bit with SQL Server (a notorious CPU hog) that probably would have shown better results, but for a desktop OS I just don't see the value.

    In any case, I wasn't going to continue with the x64 version so I flipped back to using my trusty 32bit Pro edition. Windows has this wonderful feature where after enough significant change in the hardware occurs, it wants you to re-activate itself. A change of motherboard did that. Another problem is that with my MSDN subscription, I only have 5 activations available. Like Bill Gates and his wonderful "640K ought to be enough for anybody" statement, I'm sure the MSDN Licensing guys have the same attitude: nobody will ever need more than 5 activations on an OS. Probably true, when you're running 1 machine. Run 5 and change out your hardware every month because you have nothing better to do with your life and all bets are off. So now I was SOL with no more activations on XP installs left. A few calls, some begging and whining, some digging, and I finally resorted to dropping back to XP Home since I hadn't used any licenses for it in my mini-NORAD setup. Of course there was no way I'm going to develop on an XP Home Edition box, so more shuffling, imaging, backups, ghosting, lots of coffee, and I finally have a setup that works. Took about 2 weeks in total but we're back and running all 32bit OSes again.

    Now to get to the "how it's made" part. Rory Blyth had an interesting blog about whether or not viewing the Windows source would actually help anyone. Being an MVP we have (after giving up our first born and the location of the secret Kennedy CIA files) the ability to participate in the Shared Source Initiative from Microsoft, which would give us access to the codebase. However I have a hard time thinking it actually would be beneficial to anyone (except *maybe* someone building embedded systems). One of the big arguments you get from the open source community is that Linux source code is available and Windows isn't, and if a company was in trouble by using Linux they could just crack open the code to see what was going on and fix it. Okay, sorry, but I really can't buy that. I've been in the Linux codebase and while it's somewhat organized, it's far from something you can just crack open to fix. It's like anyone who might know something about cars being asked to rebuild an intake manifold (including all the extra gotchas like getting the timing of the system right, etc.). Yes, I really don't know much about cars, but I know more about operating systems and code and I'm sure not going to start messing with kernels (Linux, Windows, or otherwise) to see how something works. It's not like tweaking an algorithm in a business application to see what number appears on the screen next. I think it's one of those myths. "Oh if only I had the source code I could be so much more productive" or "I wish I knew what was really going on".

    Sure, I'd be the first guy to say I'd be curious to see the source code behind SharePoint, and with a copy of Reflector you can see how some things are done (just blur your eyes on the obfuscated parts), but it's really just curiosity over anything. Take a peek at the Windows CE source code that Microsoft did release and tell me that you're a better person and it would have saved you all kinds of headaches "if you only knew". If anything I would see it as potentially a learning thing (and given some of the code I have seen it might be a learning platform of how not to write code), but other than curiosity or a "how did they do it, because it works, so why not reuse it" attitude I can't see a need for it. Give me a component that works and has sufficient documentation any day over ripping into the code to see how things work. I'd rather be a consumer of a serviceable part than know how the guy that wires the low-level stuff together did it.

    Intellectual property? Any corporation that produces a software product has invested oodles of cash into building it, but do you think for one minute if a new "revolutionary" feature showed up in, say, Tiger (say a Networking Wizard, surprisingly similar to what Windows does) that people wouldn't start looking and lawyering up to see the code that produced this miracle? There was a statement that Microsoft would lose its competitive edge if it released its source code. I don't know about you, but I can't imagine that happening with the release of source code. Microsoft's "competitive edge" comes from the sheer quantity of installed systems. You get that with 90% of the desktop market, and just because source code is out doesn't mean others would start building better or more competitive systems. It doesn't take a genius to figure out how to build a feature just by looking at how someone else did it. That's how the masters have been creating works of art for thousands of years.

    Yes, there have been many times I've been asked at work about how something works and the answer was "I don't know, I guess that's how Microsoft did it". Had we been able to see what was under the hood, would we have been able to know what's going on? Not likely. It takes some serious skill just to know what's really working in your own application, let alone having hundreds of thousands of lines of source code that you didn't create to wade through. I have yet to see a Linux developer (again, unless they're building embedded systems) do end-to-end debugging into the heart of the lion because something wasn't working quite right. I doubt any Windows application developer would do the same, and I don't buy the argument that the world would be a better place if the source code was available. I just don't see enough data points to convince me, but your mileage may vary.

    Okay, enough of my soapbox. Back to SharePoint development this week.

    UPDATE: After waking up I realize that having access to the Windows source has absolutely nothing to do with XP 64bit Edition. I'm not sure why I thought there was any connection there at all. Chalk this up to "insane man blogging" syndrome.

  • VS2005 Class Designer and Exporting Images

    One of my favorite features in the Beta 2 update to Visual Studio 2005 is the ability to export your Class Diagrams as images. Yes, it's simple, but you wouldn't believe how useful it's been lately: sucking code in from a 2003 project, automagically creating the Class Diagrams (thank you Microsoft!), and exporting them to communicate with the development team.

    Dmitriy Vasyura of the Visual Studio Class Designer Team has an excellent post on the ClassDesigner's WebLog that goes into great depth on preparing the diagram, making use of comment shapes to annotate the diagram, documentation scenarios, and of course exporting the images. Check out the post here for more info.

  • What is an Application in SOA?

    A colleague of mine recently turned me onto Clemens Vasters' blog (Thanks Rob!). Clemens' latest post was entitled "SOA" doesn't really exist, does it? It led to some interesting thoughts, but I'm not convinced that any architecture really exists in the true sense of the word when it comes to "SOA". I did agree with most of what he had to say in his blog and that most people use "SOA" as a label for the engineering components of "SO", which really isn't what Service Oriented Architecture is all about.

    Another perspective to overlay on this discussion is the notion of an "application" in SOA. In SOA you have a federation of co-operating services that are orchestrated to achieve a well defined goal or objective. This is a completely different architectural model from what we have traditionally considered an "application" to be.
     
    Traditional application models grew from the notions of task automation that originated in the late 60s and 70s. They were narrow and deep in their scope, and typically bounded by an organization unit's accountabilities. This led to a vertically organized monolith of function all the way from the UI down to the database calls. This has turned out to be somewhat limiting from the current perspective of how businesses want to operate; companies are generally larger, more diverse, and re-organize more frequently than was common at the time these models were developed.
     
    When applying SOA it makes more sense to bundle services by affinity of capability rather than by "set of tasks" or "org unit accountabilities". In some ways this is similar to the data concept of organizing data by "subject area" rather than by application use. For example, if you think about the original implementation of SAP, it was focused on the needs of the Finance Department with "transaction extensions" for other organization units. Some failings of this model were that SAP was an "intrusive burden" to org units other than Finance, and it couldn't support decentralized business models (just ask any of the decentralized companies that tried to implement it!). Today SAP is re-focusing and exposing its core services to be consumed by other applications.
     
    But what if the architects for SAP had applied SOA from the get-go? The actual architecture would have been defined as several aggregations of financial capabilities (like accounts, money movements, and so on) with appropriate service contracts. It would also have included a collection of capabilities needed by the finance functions of a typical organization, again with appropriate service contracts. It would have included some UI capabilities (Web Parts if they were really on the ball) that could consume these services (and indeed other services from different providers) and have the "appearance" of a traditional "application". However, as architects we would recognize that what the user thinks is the "application" is really just a thin facade over the orchestrations of functions they need. An important additional capability is to easily orchestrate other services that are not part of SAP (for example, a service with AMEX to get your credit card statements that can augment expense reports).
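
    Purely as an illustration (every name here is invented on the spot, not anything SAP or anyone else actually ships), a capability-oriented contract for the "money movements" aggregation might look more like this than like a screen- or task-shaped API:

    using System;

    // Illustrative sketch only: a narrow service contract organized around a
    // capability, independent of any one department's screens or tasks.
    public interface IMoneyMovementService
    {
        // Move funds between two accounts and return a confirmation id
        Guid TransferFunds(string fromAccount, string toAccount, decimal amount, DateTime valueDate);

        // Return the posted movements for an account over a period
        MoneyMovement[] GetMovements(string account, DateTime from, DateTime to);
    }

    // Simple data contract returned by the service
    public class MoneyMovement
    {
        public DateTime PostedOn;
        public string Account;
        public decimal Amount;
        public string Description;
    }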
     
    So the idea of SOA is to design in this set of patterns and paradigms from the initiation of the concept. Thoughts? Feedback? Pizza?