Fear and Loathing
Gonzo blogging from the Annie Leibovitz of the software development world.
-
Life, the Universe, and Everything
I’m about a week late posting this, but we finished taping last night’s episode of Plumbers @ Work. Here are the show notes and links from the last show. Both Dan and I were absent from that one, so it was John and James all the way.
The next show, in a couple of weeks, should have all four plumbers back, and the show we taped last night should be online soon (I keep saying “taping” but it’s really recording, isn’t it? I’m so dated, eh?).
- Introduction
- News Bytes: Renaming of Office "12" to Office 2007
- News Bytes: Release Date for Team Foundation Server (TFS)
- News Bytes: WSCF 0.6
- Developer Destination: HanselMinutes
- Discussion about Reflector
- Discussion about SysInternals
- Developer Destination: .NET Kicks
- Developer Destination: DNR TV
- Discussion about Screencasting
- Calgary .NET User Group
- Site Navigation in ASP.NET 2.0
- WebParts in ASP.NET 2.0
- Upcoming Speakers for Calgary .NET User Group
- Discussion about AJAX and ASP.NET "Atlas"
- Test Driven Development (TDD) in AJAX
- Dan Sellers' WebCast Series - Security on the Brain
- Canadian Developers Blog
- Discussion about WinFX
- Overview of Windows Communication Foundation (WCF)
- Overview of Windows Presentation Foundation (WPF)
- Overview of Extensible Application Markup Language (XAML)
- Overview of Windows Workflow Foundation (WF)
- Discussion about Workflows and Activities
- Windows WorkFlow Foundation (WF) versus BizTalk Server (BTS)
- Overview of the Windows Shell (AKA, "Monad")
- Don Box's Weblog Post on SOAP versus REST in WCF
- Overview of SOAP and REST
- Multi-Core CPUs and the Future with Concurrency
Running time: 56:34
Here’s the link to the episode page (with links to all the references and the recording itself). We’ve also upgraded the Plumbers @ Work site to Community Server 2.0 and it looks great.
-
Calling all non-admin SharePoint developers... uh, help?
I try to be a good citizen, I really try. I tried to take the plunge today and create a non-admin user account to do SharePoint development. Yes, I’m the evil MVP who, for the last few years, has been running as, gasp, Administrator. I found that when I do a run-as command and launch Internet Exploder, some sites don’t render properly. They’re fine if I log on as the user, but not if I do a run-as.
Anyways, I figured it was about time I stopped using the God account and tried the least-privilege approach on my Windows 2003 Server (I’m sure my place on the respecto-meter will go up with Dan on this). So I created a new regular Joe User account and started tweaking things to make sure I wasn’t able to become uber-admin guy accidentally.
All is fine and well and I can build Web Parts, deploy, etc. but I’m screwed for debugging. I’ve added the account to the following groups: Debugger Users, IIS_WPG, and VS Developers. This is the standard thing everyone tells you to do (IIS_WPG is for SharePoint, otherwise the other two will work for regular VS development). When I launch the debugger in VS2003, I get the infamous dialog:
What’s a girl to do? I’ve tried every suggestion I’ve seen out there, but the dialog is pretty uninformative. Are there any tools I can use to see what right the system is trying to exercise so I can grant it? Note that the dialog says “Access is denied” but I have no idea what access it’s referring to. Kinda sucks, so I’m temporarily flipping back to my Administrator account until I can get this to work.
Like I said, everything works fine except for debugging (which if I had complete coverage on my tests I probably wouldn’t need to debug, but that’s another blog).
Note: I’ve tried the following pages and suggestions so please don’t tell me to visit these as they don’t work in my case (for SharePoint development, even Angus’ post)
- Unable to Start Debugging on the Web Server
- Cannot Debug ASP.NET Web Application
- Adding http://localhost to your Trusted Sites
- Re-register oleaut32.dll
- Using the Visual Studio .NET 2003 Debugger with ASP.NET Applications
- How to configure your development environment to develop with least privilege
Note: My favorite tool, SharePoint Explorer, seems to be hard coded to only run for a local admin. Any ideas on this issue as well? There are some solutions out there (MakeMeAdmin is one) but it seems SharePoint Explorer really needs the user to be the local admin and not just an elevated user. Pity.
-
Finding NUnit Examples
It's sometimes hard to learn a new skill when you have nothing to go on. Software development is like that. You learn the mechanics, get the lingo down, and master the tools, yet you're still looking for that elusive way to get going. Each person learns differently, and those of us who learn by example are sometimes left out to pasture when it comes to learning things like unit testing.
Unit testing is an art as much as it is a science, so those of us looking for good examples keep turning up the same AddBalanceToBankAccount test (something like the sketch below). There are great examples out there, but they're written for the purpose of providing an example rather than being real tests in the context of a real system. So where are the real projects with real tests?
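For the record, that contrived sort of test looks something like this. It's a made-up sketch (BankAccount and its members are hypothetical; the syntax is plain NUnit 2.x), not something taken from any of the projects below:
using NUnit.Framework;

[TestFixture]
public class BankAccountTests
{
    [Test]
    public void AddBalanceToBankAccount()
    {
        // BankAccount is a hypothetical class; the point is the shape of the
        // test, not the banking logic.
        BankAccount account = new BankAccount();
        account.Deposit(100.00m);
        Assert.AreEqual(100.00m, account.Balance);
    }
}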
I've put a list together of projects that have unit tests available with them. They're all open source, so you're welcome to just look at them for the learning value of seeing what a set of unit tests in a real system looks like: what makes a good test, and how it relates to the problems people thought needed testing. There are some very good examples here (and some projects have oodles of tests, like Mono and Microsoft's Enterprise Library).
You can find the list on the NUnit.com wiki here. While it’s a page on the NUnit.com site (which is .NET based), it does contain Java and other language references, which I figured was a good thing for completeness (XPlanner has a great set of tests to look at). After all, reading Java and C# is like the difference between color and colour.
Enjoy!
-
Going to TechEd? Vote for me, vote for me, vote for me
Are you going to TechEd 2006 this June? What better chance to spend some time kicking back and talking about the next version of SharePoint from a developer’s perspective? Want to mellow out with me and a room of screaming geeks for an hour or so? Want to be the ruler of your own nation?
I’ve submitted a Birds-of-a-Feather session called Getting Ready for the next generation of SharePoint that I would love to see happen. Birds-of-a-Feather sessions are one-hour open, moderated discussions on topics of great interest to TechEd conference attendees. They are not presentations or panel discussions. There are no speakers and no slides. A microphone and whiteboard will be available, but there will be no projection equipment (BYOBA[1]). Here’s the blurb for my submission:
The next version of SharePoint has a new name, a new face, and a plethora of new features that are just itching to be scratched by the development community. Come share your questions, concerns, and ideas with others heading down the same path. It's casual, and we'll have an informal developer rap about the potential to unlock some of the great new features of the system. Discuss with others who are passionate about the new technology how you can get ahead of the curve in developing your own solutions, big, small, or otherwise, with the next version of Microsoft Office SharePoint Server 2007.
So if you’re heading to TechEd and want to see the session happen, please vote for it. If you’re not going to TechEd, you might as well vote for it anyway, since I can blog afterwards about all the cool stuff you missed out on because your company was too cheap to send you to Boston for a conference. Oh hell, just vote for it. It’ll be better for everyone in the end.
[1] BYOBA: Bring your own balloon animals
-
Ask the Doofus
Yup, today you can drop by the TechNet BUILD '06 show at Paramount Chinook here in Calgary. I’m there most of the day hanging out, answering questions, signing autographs, sneaking into movies. You know, geek stuff.
There’s an Ask The Experts booth (although there’s only one of me so you can hardly call it “Experts”) so drop by and say hi. We’ll make balloon animals and connect USB ports to them.
-
Know the difference between SPWeb.Users and SPWeb.SiteUsers
Getting lists of users is easy, when you know how. There are some subtle differences that might trip you up when you’re looking to retrieve a list of users from a SharePoint Site or Portal Area. The SPWeb class has two properties, Users and SiteUsers. Here’s the documentation on each:
- SPWeb.Users: Gets the collection of user objects belonging to the Web site.
- SPWeb.SiteUsers: Gets the collection of all users belonging to the site collection.
There’s just a *slight* difference in wording here. After all, SPWeb.Users gives me the user objects of the web site, right? Not really.
As this was something that was bugging me in a problem I had, I decided to do a small spike to prove (or disprove) what was going on. When you have a design problem or a technical complexity that has to be addressed (before starting your work, to reduce the risk), you create a spike: a small side project apart from your current work. Spike solutions are developed to solve or explore critical problems, and ideally they get thrown away once every developer has a clear idea about the problem.
So we’ll create the SSWP[1] with the following RenderWebPart method defined:
protected override void RenderWebPart(HtmlTextWriter output)
{
    SPSite site = SPControl.GetContextSite(Context);
    SPWeb web = site.OpenWeb();

    // Some summary information: how many users and who the current user is
    output.Write(string.Format("<strong>Information</strong><br>User Count={0}, Current User ID={1}</p>",
        web.Users.Count, web.CurrentUser.ID));

    // Dump the SPWeb.Users collection into a table
    StringBuilder sb = new StringBuilder();
    sb.Append("<table border=1>");
    sb.Append("<tr><td colspan=3><strong>SPWeb.Users</strong></td></tr>");
    sb.Append("<tr><td>ID</td><td>LoginName</td><td>Name</td></tr>");
    foreach (SPUser user in web.Users)
    {
        sb.AppendFormat("<tr><td>{0}</td><td>{1}</td><td>{2}</td></tr>",
            user.ID,
            user.LoginName,
            user.Name);
    }
    sb.Append("</table>");
    output.Write(sb.ToString());

    // Do the same for the SPWeb.SiteUsers collection
    sb = new StringBuilder();
    sb.Append("<table border=1>");
    sb.Append("<tr><td colspan=3><strong>SPWeb.SiteUsers</strong></td></tr>");
    sb.Append("<tr><td>ID</td><td>LoginName</td><td>Name</td></tr>");
    foreach (SPUser user in web.SiteUsers)
    {
        sb.AppendFormat("<tr><td>{0}</td><td>{1}</td><td>{2}</td></tr>",
            user.ID,
            user.LoginName,
            user.Name);
    }
    sb.Append("</table>");
    output.Write(sb.ToString());

    // The interesting part: look up the current user by ID via SiteUsers
    try
    {
        SPUser currentUser = web.SiteUsers.GetByID(web.CurrentUser.ID);
        output.Write(string.Format("</p>Current user is {0} (ID={1})",
            currentUser.LoginName, currentUser.ID));
    }
    catch (SPException e)
    {
        output.Write(string.Format("</p>Error getting user ({0})", e.Message));
    }
}
So what’s this thing doing? Not much but basically:
- Get a reference to the SPSite object using the GetContextSite method of SPControl
- Open the SPWeb object using the OpenWeb method of the SPSite
- Write out some information about the number of users and current user id
- Create a small table with the id, login name, and display name of each user in the SPWeb.Users collection
- Do the same thing but with the SPWeb.SiteUsers collection
- Retrieve the SPUser object from the Web using the GetByID method and display information about that user
It’s the last part (wrapped in a try/catch block) that you should pay attention to.
Here’s the output of the Web Part logging in as LOCALMACHINE\Administrator when displayed in a SharePoint Portal Server Area:
And here’s the same Web Part when displayed in a Windows SharePoint Server site:
When you create an area or site, several users are automatically added behind the scenes. In the case of an area on the portal, you’ll see that SiteUsers contains more names than what’s in the WSS site. It adds in a “BUILTIN\Users” domain group (ID 4) and adds the “NT AUTHORITY\Authenticated Users” at the end (ID 6) in an area. In a site, “NT AUTHORITY\Authenticated Users” takes up the #4 slot, and the “BUILTIN\Users” is nowhere to be found.
Also, when you create a site, it adds the creator (in this case “Administrator”) to the list of users (go into Site Settings | Manage Users to see the list). In an area, the creator of that area shows up in the SPWeb.Users list, but isn’t present on the “Manage Security” page.
Here’s a rundown on the user ID slots that are filled in automatically:
- ID 1: User who created the site or area
- ID 2: Not sure on this one? Sorry.
- ID 3: Application Pool Id, crawler account.
- ID 4: BUILTIN\Users on Portal, NT AUTHORITY\Authenticated Users on WSS.
- ID 5: Additional individual users you add to the site, or any new users who happen along.
- ID 6: NT AUTHORITY\Authenticated Users on Portal only, otherwise the next user you add or who visits the area.
Note: these values are from most setups I’ve done; the way you configure your portal (turning on anonymous for example) might change these values somewhat, so use them as a guideline, not a set of stone tablets that someone hands down to you after taking a siesta in the mountains.
You might say, “But Bil, I really only want the current user, so SPWeb.CurrentUser should be good enough, right?” Maybe. What if you want to retrieve a user by an ID that might be stored somewhere else but corresponds to the same ID # in the site? Or you want to display all the users in the site? This will become important when I blog later this week about impersonation (everyone’s favorite topic).
So here’s the same Web Part, but being rendered by a regular Joe user (spsuser) who has read access to the site. He’s been added manually to the WSS site, but he’s a member of the Readers group at the Portal and doesn’t show up in the SPWeb.Users list. First, the Portal Area rendering:
Now the WSS site rendering:
Remember when I said to pay attention to that try/catch block? If you had used a bit of code like SPWeb.Users.GetByID (instead of using the SiteUsers property), you would have gotten an exception thrown while looking for a user with an ID of 5. As you can see in the SPWeb.Users list, it doesn’t exist there, because that collection only contains domain groups and individual users who have been manually added to the site (and Administrator got added automatically when the site was created).
Another thing that’s interesting is that if you did add “spsuser” to the site manually he would show up in the SPWeb.Users list above. However if you removed him, he would vanish from SPWeb.Users but he would still show up in the SPWeb.SiteUsers list. His ID # is reserved so the next user you added manually would have an ID of #6 (in the case of the site) and if you re-added “spsuser” at a later date, he would re-appear in both lists with an ID of #5.
So in a nutshell… if you’re going after a list of all users or want to retrieve a user from the list but he’s part of a group, then use the SiteUsers property instead of Users. Otherwise, he might not be there.
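Here’s a minimal sketch of that rule of thumb; the userId value is assumed to come from wherever you stashed it, and the error handling is up to you:
SPWeb web = SPControl.GetContextSite(Context).OpenWeb();
int userId = 5;        // pretend this came from wherever you stored it
SPUser user = null;
try
{
    // SiteUsers sees individual users *and* users who only have access through a group
    user = web.SiteUsers.GetByID(userId);
}
catch (SPException)
{
    // Not in the site collection at all; handle however makes sense for you
}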
Okay, this post might have been a little confusing as we’re flipping around between Users and SiteUsers and all that but hopefully the code and images describe it for you. Let me know if you’re totally confused, otherwise… enjoy!
[1] Super-Simple-Web-Part: Just a web part where we hard code the values in the RenderWebPart method. Nothing fancy here. No Domain Objects. No DTOs. No Layers. Just write out some code.
-
Size does matter
Invirus sent me a note about a product they have to optimize virtual machine images. I finally got around to trying out the product and it’s pretty slick. It basically rips through your VMs and shrinks them down without affecting the OS inside. VM Optimizer 2.0 works with both VMware images and Virtual PC/Virtual Server VHD files (although in my test I tried shrinking an identical image of both types and, for whatever reason, the Virtual PC file showed better compression).
What was useful for me was that I had a large (7GB) VM that I wanted to move to an external drive that was FAT32 formatted (most USB drives are, so they’re compatible with Macs and Windoze). A drive in this format doesn’t support individual files larger than 4GB, so this was a problem. Not anymore. I ran the optimizer, it shrunk the VM down to 3.3GB, and I was able to transfer it. The reduced size also decreased the load time of the VM by a bit, so there are some savings there if you’re starting and stopping these things all the time.
There’s a 30-day trial available, which is the full version, so you can try out all the features and see if it works for you. Check it out; it’s a cool product and doesn’t seem to have any adverse effects on VMs, other than making them smaller. And remember, size matters.
-
Goodbye ReSharper, hello Refactor! Pro
I was a little split on tools tonight. Like most of us, I spend most of my waking day inside an IDE writing and designing, so I want to get the best out of the tools I use. I’m always looking for ways to better my codebase, and refactoring is a technique that I employ all the time. Little tweaks make things easier to read and generally make my code more maintainable. However, it’s always hard to tell what the right refactoring is and when it should be applied.
I use ReSharper from JetBrains, but lately they’ve been a little lax with updates. Their VS2005 product isn’t final yet; many people (including myself) have tried it and then uninstalled it when their stable IDE became unstable. So I thought about looking at other options. Scott Hanselman is a big fan of CodeRush and the DevExpress products, and they have a refactoring tool called Refactor! Pro (and hey, if Scott likes it then it must be good, right?).
Refactor! Pro has a nice set of features that take it above what ReSharper offers. It’s at the same price point ($99 USD), so anything it offers above ReSharper is just icing on the cake. Here are a few features I like that gave it the swing vote.
There’s a nice little animation displayed when you hover over a return statement that shows where it returns to (it makes the code you’re about to skip over that much more visible).
Here’s one that I’m mixed on using, Inline Temp (or mixed on, using Inline Temp if you prefer). The idea is that instead of doing this:
int cultureId = Thread.CurrentThread.CurrentUICulture.LCID;
if (string.Empty != LayoutsDir)
{
    if (string.Empty != StyleSheet)
    {
        // Both LayoutsDir and StyleSheet specified
        styleLink = string.Format(
            "<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/{0}/styles/{1}/{2}\">",
            cultureId.ToString(), LayoutsDir, StyleSheet);
    }
    else
    {
        // LayoutsDir but no stylesheet (kind of useless though)
        styleLink = string.Format(
            "<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/{0}/styles/{1}\">",
            cultureId.ToString(), LayoutsDir);
    }
}
You remove the introduction of the cultureId variable and do this:
if (string.Empty != LayoutsDir)
{
    if (string.Empty != StyleSheet)
    {
        // Both LayoutsDir and StyleSheet specified
        styleLink = string.Format(
            "<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/{0}/styles/{1}/{2}\">",
            Thread.CurrentThread.CurrentUICulture.LCID.ToString(), LayoutsDir, StyleSheet);
    }
    else
    {
        // LayoutsDir but no stylesheet (kind of useless though)
        styleLink = string.Format(
            "<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/{0}/styles/{1}\">",
            Thread.CurrentThread.CurrentUICulture.LCID.ToString(), LayoutsDir);
    }
}
Personally I think creating a variable once and using it (as long as the variable has a good name) works better, and is it just me, or isn’t calling a method chain repeatedly rather than referencing a variable more expensive? Okay, we’re talking about milliseconds here, but sometimes every line of code counts.
When you select a line there are different options. Here it’s asking me if I want to do an Extract Method refactoring, using the HtmlWriter parameter and a local variable as parameters to the new method. The arrows show where it’s getting all the info from (and there are additional popups that didn’t get captured like the tooltip explaining the refactoring).
When you hover over strings, you can invoke the Introduce Constant refactoring, replacing the literal with a reference to a named constant (roughly like the sketch below). I always find this a pain, as I just naturally type in quoted strings rather than stop, go create a variable, then come back to the point I left; it interrupts the natural flow of things. Some would argue that going back and doing it after the fact takes more time, but again, if it’s automated like it is here, I think it’s faster in the long run.
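Roughly what you end up with is something like this sketch; the constant name and the stylesheet path are my own examples, not necessarily what the tool generates:
// Before: the literal is typed inline where it's used
protected override void RenderWebPart(HtmlTextWriter output)
{
    output.Write("<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/styles/ows.css\">");
}

// After Introduce Constant: the literal becomes a named constant and the
// original spot references it instead
private const string StyleSheetLink =
    "<link rel=\"stylesheet\" type=\"text/css\" href=\"/_layouts/styles/ows.css\">";

protected override void RenderWebPart(HtmlTextWriter output)
{
    output.Write(StyleSheetLink);
}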
Finally, as you hover over the three little dots Refactor! Pro puts under your variables, methods, and such, it creates a pulldown menu (sometimes there are multiple refactorings you can do on a variable) along with a small tooltip explaining the refactoring. Nice if you don’t know exactly what it will do, and it also helps newbies who have never used a certain refactoring before.
A really powerful thing is that you can create your own refactorings and apply them. While the base product supports the well-known refactorings out there, this makes the product’s shelf life that much longer as new ideas are discovered, and if there’s something you do on a regular basis, you can build your own refactoring for it. Yes, it’s like a glorified version of the macro tool that’s already built into Visual Studio, but it has some smarts and looks like it could be powerful (especially if there’s a way to get new refactorings from the community at large).
I think the key thing is that it supports both VS2003 and VS2005 with the same product. This is key for me: with the dozens of tools I buy, I have to be a little picky about what I purchase. With ReSharper I have to purchase two separate SKUs and pay for each. Refactor! Pro gives both of my IDEs the same tool for the same price. Kudos to them, as I like this in any product (and probably because I’m miffed that I now have to buy an Xbox 360 version of Burnout Revenge because my Xbox version of the same game isn’t compatible with the 360).
There are other advantages and disadvantages. I don’t think Refactor! Pro highlights errors like ReSharper does, but it’s hard to tell (especially when you have both products installed at the same time). Refactor! Pro doesn’t show all refactorings with the inapplicable ones greyed out; the menu is smart and only shows you what’s relevant, so I’m not sure of the full set of refactorings it supports, but I’m sure they’re all there (in my testing, it was “Good Enough”). The only big thing that has me looking at buying CodeRush as well as Refactor! Pro is the code templates. ReSharper has them built into its product, so you get the refactorings but you can also type “tcf”, press TAB, and get a try/catch/finally block. Refactor! Pro is just about refactoring and doesn’t seem to have this capability built in, but if you go the extra step you can buy CodeRush and get that feature (along with the other 10,000 things CodeRush does). Still, it was good enough for me to switch, but YMMV.
Also, as a note, refactoring is not about tools; it’s about understanding when to refactor and how to apply the right refactorings. Tools can help you and make things quicker, but you need to know why you’re doing it. A skilled craftsman can get the job done twice as fast with a plane instead of a chisel, but he needs to know how to do it the hard way first in order to be called a craftsman. Developers are the same, and I try to teach people manual refactorings first so that a) they know the pain and suffering you go through to refactor manually (it can be tedious sometimes) and b) they know why they’re refactoring, and it’s not just a click of a tool to move code around. Only then do I talk to them about tools and how they can get the job done quicker. So learn why you refactor first, and the raw mechanics of it, then look at automation to help out.
P.S. Why must people put Pro or Advanced or something after their product names? There is no Refactor! Standard product, just Refactor! Pro. Is it a selling gimmick? Maybe I should start creating products with Pro in the name (but not offer a standard version). Must be me, but then I’m also still looking for a copy of Microsoft Windows Amateur Edition. I know it’s out there.
There, I think I’ve said the word refactor so much that this blog post will hit the top of Google in a day or two now.
-
Creating Multilingual Web Parts
Ahalan. Parev. Goddag. Saluton. Bonjour. Guten Tag. Aloha. Hola. Shalom. Hello.
We live in a multi-lingual world. Everyone speaks a different language (unless you’re a slacker like me who barely comprehends the English one) and when we build an uber-cool Web Part (say a new Discussion Forum) that we want the world to use, it needs to support whatever language is out there (yes, including Esperanto).
A lot of SharePoint documentation (including the SDK) talks about implementing the LoadResource method and using the ResourcesAttribute to mark your properties or enum values in order to localize your Web Part properties. This is great if you want to create a property for every string you’re going to display, but what if you don’t want to do that? In some Web Parts, that’s a heck of a lot of properties. Maurice Prather posted a simple HOWTO about using this method here, but it doesn’t really cover multiple languages for, say, strings that you would display on your Web Part (and don’t want to create properties for).
While it's certainly possible to develop a multi-lingual application with the tools provided with ASP.NET, there are a number of limitations which make the task less than a happy-happy-joy-joy experience for anyone. Some of the key problems are:
- Resource files are embedded into assemblies
- Resource files can't return strongly-typed objects
- Web controls aren't easily hooked up with resource files
While this may seem trivial, the above three issues can be quite serious - with the first being the worst. For example, since resource files are embedded into assemblies, it's very difficult to ship a product which gives the client the flexibility to change simple things like text on the screen. Do you really want to recompile your entire assembly when the translation department wants to change some text, stop the IIS process, and copy the .dll into the bin folder? (Granted, you might be able to get away with copying the file and not stopping IIS or the AppPool, but still… hey, walk with me on this.)
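For contrast, here’s roughly what the stock embedded-resource route looks like. This is only a sketch (the resource base name and token are assumptions), and the point is that any change to a string means recompiling the assembly the .resx was baked into:
// Uses System.Resources, System.Reflection and System.Globalization.
// Strings.resx gets compiled into the assembly at build time.
ResourceManager resources = new ResourceManager(
    "MyWebParts.Strings", Assembly.GetExecutingAssembly());

// Look up a string for a specific culture (1033 = English)
string hello = resources.GetString("HelloMessage", new CultureInfo(1033));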
A Simple Approach
This is a fairly basic approach, but it accomplishes a few things. First, resources are externalized from your Web Part assembly(ies) and you can update them anytime you need to without recompiling (say, for spelling mistakes, or when you feel a strong desire wash over you to change “Exit” to “Leave”). Second, with some small changes we make here, we can support any language automagically (and fall back to a default one if we can’t find the one we want).
One note is that you can accomplish multi-lingual Web Parts using a resx file, resgen, blah, blah, blah, blah so this article is an alternative to that. Personally I prefer this, but YMMV.
Create a Web Part
Yes, you know how to do this. Create a new Web Part Library Project using the Visual Studio templates. Let’s give it a name like ItsRainingMen.
Okay, fine. Let’s not so feel free to call it whatever you want. You’ll probably call it something boring like WebPartLibrary1 right? Whatever. Breathe. Move on.
Fields You Need
First we’re going to create a few fields in our Web Part that will help us manage the language and resources. We can always retrieve the current language setting (1033 for English) in our Web Part through the SPWeb.Language property, but we’re going to put the language value into a property of our own and let the user override and change it (hint for developers: this is a good thing, as you don’t have to do something crazy like change your server settings in order to test other languages; just change a property at run-time and refresh your page). We’ll also create an XmlDocument to hold the contents of the resource file (which is going to be XML, as it’s easiest to implement) from which we’ll retrieve the language strings.
private const string defaultLanguage = "1033";
private string language = defaultLanguage;
private XmlDocument resourceFile;
So create the constant plus two fields: an XmlDocument, and a string for the language (defaulting to “1033”). If you want to get fancy, this could be a drop-down list of all the languages SharePoint supports and you could let the user pick it in a custom tool part. Again, that’s more work than what I want to do here, so we’ll make the users enter it manually through a property (a rough sketch of one follows). Why use 1033? We’ll get to that later.
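Something like this would do; the property name, attribute set, and storage choice here are mine, so adjust to taste:
// Lets the user override the LCID from the tool pane instead of relying on
// the site's language setting. Name and attributes are illustrative only.
[Browsable(true),
 Category("Localization"),
 WebPartStorage(Storage.Shared),
 Description("LCID of the language file to load, e.g. 1033 for English.")]
public string ResourceLanguage
{
    get { return language; }
    set { language = value; }
}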
OnInit
We’ll override the OnInit method in the Web Part in order to load our resource file. This may sound crazy, loading a resource file every time the Web Part is loaded, but it’s a small file and takes less than a tenth of a second to load, so get over it. If you really have your knickers in an uproar over it, you can always take a different approach: load the file once, toss it into a Cache, and create a cache dependency on the file so that if the file ever gets updated, the contents get reloaded. I’ll leave that mostly as an exercise for the reader (there’s a rough sketch of the idea after the OnInit walkthrough below).
protected override void OnInit(EventArgs e)
{
    try
    {
        SPWeb web = SPControl.GetContextWeb(this.Context);

        // Look through the Web Part's resource directory for *.lng files and
        // see if one matches the site's language (e.g. 1033.lng for English)
        DirectoryInfo directoryInfo = new DirectoryInfo(this.Page.Server.MapPath("/wpresources/ItsRainingMen"));
        FileInfo[] languageFileInfoArray = directoryInfo.GetFiles("*.lng");
        for (int n = 0; n < languageFileInfoArray.Length; n++)
        {
            FileInfo fileInfo = languageFileInfoArray[n];
            if (fileInfo.Name == (web.Language.ToString() + ".lng"))
            {
                this.language = web.Language.ToString();
            }
        }
        if (this.language == "")
        {
            // Fall back to English if nothing was set
            this.language = defaultLanguage;
        }

        // Load the matching language file into our XmlDocument
        this.resourceFile = new XmlDocument();
        XmlTextReader reader = new XmlTextReader(this.Page.Server.MapPath("/wpresources/ItsRainingMen/" + this.language + ".lng"));
        this.resourceFile.Load(reader);
        reader.Close();
    }
    catch (Exception exception)
    {
        // Do something meaningful with the exception here
    }
}
Our OnInit does a few things here. It creates a DirectoryInfo object for our Web Part’s resource directory and enumerates all the *.LNG files there (you can use any name you want here, but just in case you’re using something common like *.XML, I didn’t want to include those files). The SPWeb.Language value for English is 1033, so naming your .LNG file 1033.LNG means we can use this format for any language and just add new files as we create them. For a complete list of the language codes, search the SharePoint SDK for LCID. 1033. See. Not just a hat rack.
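And since I mentioned it earlier: if loading the file on every request bugs you, here’s a rough sketch of the cache-with-file-dependency idea. The helper name is mine and it isn’t wired into the Web Part above; it uses System.Web.Caching:
// Caches the parsed XmlDocument and drops it automatically when the .lng file
// changes on disk, so edits still show up without recycling the AppPool.
private XmlDocument GetResourceFile(string path)
{
    XmlDocument doc = (XmlDocument)HttpRuntime.Cache[path];
    if (doc == null)
    {
        doc = new XmlDocument();
        doc.Load(path);
        HttpRuntime.Cache.Insert(path, doc, new CacheDependency(path));
    }
    return doc;
}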
The LNG files are just simple XML files that contain the strings you want to translate. You can make your language files as complex as you like, but really they just need a way to a) store a token to retrieve later [using an XML attribute] and b) store the value to be translated. Keeping your files simple will make them easier to edit later (if you’re really adventurous you could build a graphical editor to do translations).
First, here’s the sample language file (1033.LNG) that represents the English strings.
<?xml version="1.0" encoding="utf-8" ?>
<!-- xml resource file for English translations -->
<strings>
  <string id="HelloMessage">Hello</string>
</strings>
And here’s the same file, copied and renamed to 1036.LNG which contains the French translations.
<?xml version="1.0" encoding="utf-8" ?>
<!-- xml resource file for French translations -->
<strings>
  <string id="HelloMessage">Bonjour</string>
</strings>
Then OnInit reads in the appropriate XML document using the XmlTextReader class into our private XmlDocument member variable called resourceFile. Now any time we need to access any string, it’ll be available in an XmlDocument so let’s load a string using XPath.
LoadResource
This is the method we’re going to override in order to a) determine what language we should load and b) use XPath to load the string from the language file so whenever we reference a string we’ll have the correct one.
public override string LoadResource(string id)
{
    string translatedResource;

    try
    {
        // Pull the string with the matching id attribute out of the language file
        translatedResource = this.resourceFile.DocumentElement.SelectSingleNode("/strings/string[@id='" + id + "']").InnerText;
    }
    catch
    {
        translatedResource = string.Format("Error reading language ID={0}", id);
    }

    return translatedResource;
}
Again, simple stuff here and you’ll want to handle the exceptions accordingly.
RenderWebPart
Finally, as the simplest example we’ll just print “Hello” to the Web Part. Since we’re using translated strings, we want to display “Hello” in the appropriate language, based on the language from the SPWeb.Language setting or the value we set in the Web Part.
protected override void RenderWebPart(HtmlTextWriter output)
{
    output.Write(SPEncode.HtmlEncode(LoadResource("HelloMessage")));
}
Cool huh? Anytime you want to reference a string that should be translated, use your string token (in our example it’s “HelloMessage”) and the LoadResource method. It might be a good idea to do something like split up your messages using a token that makes sense like:
- MainForm.FirstNameLabel
- MainForm.FirstNameErrorMessage
- SecondaryForm.GenericErrorMessage
Or whatever naming convention works for you. Make it readable and make it something that makes sense in your code as well as your language file.
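In code, the tokens then read just like before. Assuming a 1033.LNG entry of <string id="MainForm.FirstNameLabel">First Name</string> (my example, not from the file above), the call is simply:
output.Write(SPEncode.HtmlEncode(LoadResource("MainForm.FirstNameLabel")));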
Now that you have the framework built into your system, you can just hand out the Xml file with all the tokens and the values needed for translation and have people do your work for you (that’s the part I like). The LNG file can just be copied into the wpresources directory for deployment and a switch of a property in the Web Part will result in your entire Web Part/Application/etc. translated to whatever language you want (I personally really want the Swedish Chef language file for my Forums Web Part done first).
Now you can a) support multiple languages in your Web Parts b) add a new language just by creating a file and c) update any changes or perform translations on strings whenever you want without taking down your portal, restarting your AppPool or some other silly thing. Not a lot of code to implement it either.
P.S. I have to thank Steen Molberg and his BlogParts Web Part for the idea about the language files.
-
Colligo making the SharePoint rounds
Colligo Networks, makers of Colligo for SharePoint, have released beta 2 of the product. If you’re not familiar with it, Colligo for SharePoint is a rich offline client that emulates the functions of SharePoint team spaces. The client enables mobile users to download, create, organize, view, edit, and save content on their laptops much like they can on a SharePoint server.
A lot of new features and fixes have gone into Beta 2, including the following highlights:
- Support for views with grouping
- Support for events lists and recurring appointments
- Support for issues lists
- Can now reuse Windows login credentials to authenticate with site
- Can now modify credentials and URL for a site through the Workspace > Manage Workspaces menu item.
- Synchronization progress bar now reflects the actual progress of the sync
- Switched to .NET Framework Version 2.0
Some key improvements still to come include:
- Ability to sync multiple sites at once (rather than syncing each site individually)
- Improved error handling and reporting
They have a blog online now at http://www.offlinesharepoint.com and you can sign up (free) for the beta here.
Enjoy!