Monday, November 29, 2010

SharePoint 2010 - Replace missing Publishing Site Columns and Content Types

While working on a new content type that inherited from Article Page, I somehow lost a lot of Site Columns. When I tried to view the Article Page content type, it had no columns listed!

This ended up being a simple fix. Run the following command to reinstall the Site Columns and Content Types that are used by Publishing:

stsadm -o activatefeature -name PublishingResources -url http://MySharePointSite -force
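
If you prefer PowerShell over stsadm, the equivalent from the SharePoint 2010 Management Shell should be (a sketch, assuming the same site collection URL):

Enable-SPFeature -Identity PublishingResources -Url http://MySharePointSite -Force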

Once that operation completes, the missing columns should now be back. Don't forget to fix the feature that caused the problem!

Wednesday, November 24, 2010

SharePoint 2010 - Content Deployment Woes Part 2: Custom Features

If you are reading this post, please read through my previous post on this topic to get some background on tricks to solving the "Could not find Feature" issue with built-in SharePoint features.

If you are still reading, then you have probably built and deployed custom features to your farm, such as a web part. You are then running into problems during Content Deployment jobs between farms that have different SharePoint versions on them, such as deploying from Enterprise to Standard.

The solution that worked for the built-in features was appropriate because the features that were causing the problems didn't need to be installed on your destination farm. But with custom features, that is not the case. You built them precisely because you need to be able to run them on the destination farm.

I encountered this same problem and it took me a little while to figure out why it was unhappy. Here are the steps I'd suggest you follow:

  • Check the 14-hive on the destination Web Front End to verify your feature is there.
  • Verify that your feature is installed in the destination site. For instance, check the Site Features page for features scoped to the Site level.
  • Try enabling the feature and make sure it doesn't throw any errors (a PowerShell sketch of these checks follows below).
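
For the second and third checks, a quick PowerShell sketch (the feature name and URL below are placeholders for your own) might look like:

# Is the feature definition installed in the farm?
Get-SPFeature -Limit All | Where-Object { $_.DisplayName -eq "MyCustomFeature" }

# Is it activated on the destination site collection?
Get-SPFeature -Site http://DestinationSite | Where-Object { $_.DisplayName -eq "MyCustomFeature" }

# Try activating it and watch for errors
Enable-SPFeature -Identity MyCustomFeature -Url http://DestinationSite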

Maybe you've done all those things and the feature looks good, but it's still not working? At this point, you should be confused. I know I was. I decided to check the settings on my package in Visual Studio. The Deployment Server Type was set to WebFrontEnd, which was correct. I then checked the 14-hive on the destination App Server, which is where Content Deployment jobs are run. No feature folder there.

Now I know what you're thinking - isn't that what we want? Solutions that are targeted as WebFrontEnd are marked that way for a reason - they don't need to be on the App Server because it's not serving up those web parts. Well, when it comes to Content Deployment, apparently they DO need to be there. I assume this is because the Content Deployment job is going through Central Administration, which is living on your App Server.

The solution I came up with was to enable the Microsoft SharePoint Foundation Web Application service on my App Server, effectively making it a Web Front End as well. This ensures that when I add solutions to my farm that the features get deployed to my App Server as well, effectively squashing the Content Deployment job errors I was receiving.
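
In PowerShell terms, a sketch of starting that service instance on the App Server (the server name is a placeholder) looks like this; the same thing can be done from Central Administration under Services on Server:

$instance = Get-SPServiceInstance -Server "APPSERVER01" | Where-Object { $_.TypeName -like "*Foundation Web Application*" }
Start-SPServiceInstance -Identity $instance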

My App Server is already segmented off from the Web Front Ends by a firewall to keep it from being accessible from the internet. However, to ensure that my App Server is never used as a Web Front End, I am also making sure it is never listed in the load balancer and I am blocking traffic to port 80 on that box. This means there should be no real impact to the machine and my Content Deployment jobs can now run successfully.

SharePoint 2010 - Content Deployment Woes

Lately I've been working a lot with Publishing sites, which means I've been using Content Deployment jobs to move content between my farms. Unfortunately, I've learned the hard way that this part of SharePoint is rather particular about farm setups.

I have an Enterprise development farm that I use for consulting work. I built some pages on it and then wanted to deploy them to my customer's test farm, which runs a Standard license. This broke on me every single time. Specifically, I kept getting errors that features didn't exist on my destination farm.

One message I got was "Could not find feature IPFSSiteFeatures", and my deployment report was full of similar errors.

These are all the features that existed on my Enterprise development farm but not on my Standard test farm:
  • IPFSSiteFeatures
  • WACustomReports
  • PPSWorkspaceCtype
  • PPSMonDatasourceCtype
  • PPSWebParts
  • PPSSiteCollectionMaster
To get my content deployment working again, I enabled them temporarily on the destination farm, performed the deployment, and then disabled them again. You might also be able to disable/uninstall those features on your Enterprise farm before doing the content deployment job and achieve success, but I didn't test that option.  
Note: Make sure you disable these features on your destination farm after deployment, or you may be in violation of your SharePoint license.
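
In sketch form, that enable/deploy/disable dance looks something like this (the destination URL is a placeholder, and the features are assumed to be site collection scoped):

$features = @("IPFSSiteFeatures", "WACustomReports", "PPSWorkspaceCtype", "PPSMonDatasourceCtype", "PPSWebParts", "PPSSiteCollectionMaster")
$url = "http://DestinationSite"
$features | ForEach-Object { Enable-SPFeature -Identity $_ -Url $url -Force }
# ...run the Content Deployment job, then...
$features | ForEach-Object { Disable-SPFeature -Identity $_ -Url $url -Confirm:$false }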

If you made the mistake of installing Excel Services and PerformancePoint Services on your Enterprise environment, your problems are more difficult. If you run Get-SPFeature, you will see more items in the list that will cause your content deployment to fail, such as BizAppsListTemplates.
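
A quick way to see what you're dealing with is to list the installed feature definitions on each farm and compare:

Get-SPFeature -Limit All | Sort-Object DisplayName | Select-Object DisplayName, Scope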

You might be tempted to disable them on your source farm. Unfortunately, when I tried, I learned you cannot disable them with PowerShell because they are reported as still in use, even though those features aren't activated on any of my sites. Since I couldn't disable them on the source farm, I didn't test enabling them on the destination farm either; I wasn't sure I'd be able to get rid of them afterward, and I didn't want to rebuild the entire farm and lose my content.

I believe these features might be activated inside the service application itself, but I have not found a way to confirm this theory. I tried deleting the Service Application, but the features still couldn't be removed. If you figure this out, let me know.

Summary

In short, here are the things to keep in mind if you want to do a Content Deployment job between two farms:
  • Source and Destination farms should be on the same version of SharePoint
  • You must deploy solutions that exist on your source farm to your destination farm prior to running the job.
  • Features that are enabled on your source farm will automatically get enabled on your destination farm, but if you encounter an error with a feature, verify that the feature is installed on the destination.
There are some additional quirks if you receive the "Could not find feature X" error for custom features that you have developed. I'll cover those in a follow-up post.

    Wednesday, October 6, 2010

    SharePoint 2010 - Dynamic JavaScript problems with Inline Editing

    SharePoint 2010 allows inline editing of your HTML on Publishing pages, which is a nice boost to productivity. However, there are some quirks with this feature that can really trip you up if you aren't ready for them.

    Here's a scenario I ran into this week. I have a simple Publishing Page with a Content Editor Web Part on it. Inside this web part, there is an HTML element that I target with some jQuery to display tooltips. The JavaScript here is really quite simple and just dynamically adds a relative positioned div to the page above the target element.

    The code for this is tested and works fine outside of SharePoint. But when you edit a page, you'll find an interesting side effect. Specifically, when editing the page, the JavaScript that adds the dynamic elements to the DOM still runs. When the page is saved, even if you didn't edit that particular Content Editor Web Part, the updated DOM elements get saved - including the tooltip changes!

    I have tested this with a lot of scenarios, but it's pretty consistent across any SharePoint components that provide inline editing.

    So, how do you allow JavaScript to dynamically update content on publishing pages? This is really going to depend on the nature of your content. Here are a few different scenarios I've employed in my site:
    • Extract the dynamic content and make it a web part.

      This works well if your content doesn't change that often and has few properties. A great example here would be having a web part that uses swfobject.js to load a flash file rather than placing the script call inside a Content Editor
    • Skip the JavaScript calls that update your page when in edit or design mode.


      This works really well for scenarios like my tooltip. You can do this either in JavaScript or C#, depending on how your page works.
    Here is an example of checking your page mode with JavaScript, using jQuery:
    if ($('#MSOSPWebPartManager_DisplayModeName').val() == 'Design' || $('#MSOSPWebPartManager_DisplayModeName').val() == 'Edit') {
        alert("Don't do anything");
    } else {
        alert('Safe to run JavaScript');
    }
    

    Saturday, July 31, 2010

    SP2010: Unable to publish workbook to Excel Services

    While trying to publish a workbook to SharePoint 2010 to use with Excel Services, I ran into some problems. Specifically, it appeared that Excel wasn't able to access my Document library.

    To publish my workbook, I followed instructions you might find anywhere on the web, using the Office 2010 Backstage to save to SharePoint. However, none of my libraries showed up by default. Not a problem! I can just type the URL into the address bar, right? Apparently Excel says that it "can't open this location using this program".

    Generally problems like this are permissions related. Well, I was logged into the machine as my farm administrator, who has full rights to the Documents library. I was also able to click on files from SharePoint and have them open Excel automatically.

    I decided to take a new approach. I opened the demo Excel workbook that came with the Business Intelligence Center template. I then made a small update and saved it back to SharePoint. Now when I opened Backstage to save to SharePoint, the Documents library showed up in the Recent Locations section.


    I was hoping by doing this, I could trick Excel into finding the location. So I then tried to save my new workbook by just clicking on that recent location. However, Excel was still not able to do it, and told me it was unable to open my site:


    After looking around on the web, I found a forum post that seemed promising, as I'm using a Windows Server 2008 R2 machine. I opened Server Manager and installed the Desktop Experience, which required me to install the Ink and Handwriting Services as a prerequisite. I was then prompted to reboot upon completion. After rebooting, I attempted to save my workbook to SharePoint again and it worked.

    SP2010: Excel Services Error - Unable to Process the Request

    I've been playing a lot with Excel Services for the last week, and while it is nice, it is also temperamental. Most of this can be chalked up to inexperience on my part as I work out the closest thing to a least-privileges setup you can get for a SharePoint 2010 Excel Services Service Application, in light of the ever popular error, "The workbook cannot be opened". However, I'm also convinced it's a little more particular than the previous version.

    By default when I installed Excel Services using a PowerShell script, it had an entry for a Trusted File Location at "http://". After playing with the Business Intelligence Center template for a bit, I created my own workbook that had a PivotTable and a PivotChart that talked to my SSAS installation. Naturally I decided to update my Trusted File Location settings. So I deleted the default entry and created a more specific one that pointed directly to the Documents library that came with the template. Then I checked on the Excel example that came with the template and found that Excel Services was no longer working because it was "Unable to process the request".
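
    For reference, a trusted file location like the one I created can also be added from PowerShell; a sketch (the service application name and library URL are placeholders) would be:

     New-SPExcelFileLocation -ExcelServiceApplication "Excel Services Application" -Address "http://MySharePointSite/BICenter/Documents" -LocationType SharePoint -IncludeChildren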


    Event Viewer showed nonstop critical errors and ULS had some gems in there pinning the blame on the Secure Store Service: "Request for security token failed with exception". I tried refreshing the key figuring that my WFE and App server were out of sync, but that didn't fix the problem. Lacking a better idea, I did an IIS reset on the WFE - and it started working again.

    I'm not sure why it was unhappy, but at least it was only a 10 min troubleshooting span and a simple fix!

    Monday, July 26, 2010

    SharePoint 2010 - Error Removing Managed Account

    Recently, while trying to experiment with Excel Services in SharePoint 2010, I decided to remove the service and do a reinstall with a PowerShell script. Since I like my scripts to create managed accounts as well, I removed the account that was running my service. Apparently, SharePoint didn't like this and I received an error stating that my SPManagedAccount could not be deleted because other objects depend on it when I loaded the Managed Accounts page in Central Administration. Well, not being able to load the page sort of limits my options for correcting the problem, doesn't it, Microsoft?


    I double checked and the service application had been removed from SharePoint, the Excel Calculation Service had been stopped on the App Server, and the ApplicationPool had even been removed from IIS. Looking up the CorrelationID in the logs also didn't tell me very much.

    I decided to pop open PowerShell and see what I could pull off. Turns out it was a rather simple fix. First, I looked up my Managed Account. I then tried to remove it and found out the dependency was an application pool. After removing the dependency I was able to remove the Managed Account! The PowerShell commands to do this were:
    • Get-SPManagedAccount
    • Get-SPServiceApplicationPool
    • Remove-SPServiceApplicationPool
    • Remove-SPManagedAccount
    Here is a screenshot of the PowerShell at work:
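
    In sketch form (the application pool name and account below are placeholders for whatever Get-SPManagedAccount and Get-SPServiceApplicationPool report in your farm), the sequence was:

     Get-SPManagedAccount
     Get-SPServiceApplicationPool
     Remove-SPServiceApplicationPool -Identity "Excel Services Application Pool"
     Remove-SPManagedAccount -Identity "DOMAIN\sp_excelservices"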

    Thursday, June 17, 2010

    SP2010 Dispose Patterns and Broken WebParts

    Today I was building a web part for SharePoint 2010 and ran across an interesting problem. The web part was working fine for authenticated users, but was failing for anonymous users. My first thought was that Lockdown might have been the problem, but it turned out after looking through SharePoint log files that I was running into a dispose problem:

    Trying to use an SPWeb object that has been closed or disposed and is no longer valid.

    It's an interesting error message, considering that I was disposing of my object, but only after I'd finished with it. I believe the problem wasn't that my code was using a disposed SPWeb object, but rather that I had closed an SPWeb that SharePoint was relying upon.

    Here was the original code, which was disposing objects obtained from SPContext, objects that the Best Practices guidance says do not need to be (and should not be) closed:

    using (SPSite site = SPContext.Current.Site)
    {
        using (SPWeb web = site.RootWeb)
        {
            SPList myList = web.Lists["myList"];
            //code here to find a list item
        }
    }

    And the fix was as simple as changing it to this:

    SPSite site = SPContext.Current.Site;
    SPWeb web = site.RootWeb;
    SPList myList = web.Lists["myList"];
    

    I'm not so sure this would have broken a SharePoint 2007 site, but it's a good reminder to make sure you are following Best Practices for disposing objects!

    Thursday, June 3, 2010

    SharePoint 2010 Custom Error Messages for Public Facing Deployments

    It's a fairly common requirement when building a public facing SharePoint site to make sure all pages share branding elements. This includes the SharePoint error pages. While working on a publishing site, you've probably encountered an article that touches on one of the error pages, but for some reason I haven't seen one that tries to cover all of them. So hopefully this will do the job!

    Error Pages

    If you are just looking to control the basic error pages, then look no further than the SPWebApplication.SPCustomPage enumeration, which provides a list of commonly updated pages. You use the SPWebApplication.UpdateMappedPage() method to set which pages SharePoint should serve up for each of the values within the enumeration. For a good example of this, see this post.
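
    For example, a PowerShell sketch of mapping a custom access denied page (the web application URL and page path are placeholders) would be:

     $webApp = Get-SPWebApplication "http://MySharePointSite"
     $webApp.UpdateMappedPage([Microsoft.SharePoint.Administration.SPWebApplication+SPCustomPage]::AccessDenied, "/_layouts/CustomErrors/AccessDenied.aspx")
     $webApp.Update()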

    HTTP 404
    But what about other HTTP status codes? Well, the easiest would be the 404. SharePoint already has support for a custom 404 page, as long as it is pure HTML. The default 404 is located in 14\TEMPLATE\LAYOUTS\1033\sps404.html. The quickest way to override this is to use a feature to deploy your own HTML page into the same folder. A feature receiver can then update the SPWebApplication.FileNotFoundPage property with the name of your file (a quick sketch of that follows the list below). There are just a couple of points to keep in mind when you do this:
    1. Your custom page should be greater than 512 bytes. This is because the default feature in Internet Explorer to show "friendly" error messages will sometimes ignore pages smaller than this. You can read more about this problem at Microsoft Support.
    2. Your custom page, if encoded as UTF-8, should not have a BOM. Even if you save your file as UTF-8 without BOM, editing it later in Visual Studio will add the BOM back, so be careful. You can read more about that problem here.
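
    As referenced above, here is a quick sketch of what the feature receiver does, shown in PowerShell form for brevity (the file name is a placeholder; the property appears to take a name relative to the LAYOUTS language folder):

     $webApp = Get-SPWebApplication "http://MySharePointSite"
     $webApp.FileNotFoundPage = "MyCustom404.html"
     $webApp.Update()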

    If you'd like to have a custom 404 that performs server side code, such as to include web parts, you'll need to get a little more creative. There are two common ways to handle this. The first would be to have a static html page that performs a client side redirect with either a META tag or javascript. If you'd like to see that approach in action, here is an example.

    A more robust solution would be to use an HttpModule. With the HttpModule, you have two more possibilities for how you serve your page. You can use a Response.Redirect, which is what the linked article suggests. The only downside to this is that the URL changes, which may or may not be desired. Alternatively, to keep the URL, you would need to use Response.Write.

    HTTP 401
    Now you have custom error pages for the SPCustomPage-enumerated pages as well as a 404 page. But what about a 401? The simplest way you might think to do this would be to edit the CustomErrors element in web.config. In any normal ASP.NET application, that would be sufficient. Unfortunately, it doesn't work in SharePoint.

    Your next instinct might be to try IIS, as it provides the ability to set custom error pages. If you edit the property for the 401.2 status code and point it to a custom HTML page on your site, it may or may not work. In testing, it turns out that having Anonymous Access enabled in SharePoint (which then sets it in IIS) prevents a custom 401.2 page from being used. However, if Anonymous Access is disabled, such as with an Intranet site, then the custom page will show just fine. Since we are talking about a public facing deployment you are probably using the Publishing template, so you're going to need a different approach.

    The trick for a custom 401 message when you have Anonymous Access enabled is using a custom http module, much like you could have done for the 404.

    Here is an example:

    using System;
    using System.IO;
    using System.Web;

    public class CustomPageMappingsHttpModule : IHttpModule
    {
        private const string Custom401File = "/_layouts/1033/Custom401_CSS.htm";

        private HttpApplication _application;

        public void Dispose()
        {
        }

        public void Init(HttpApplication context)
        {
            this._application = context;
            this._application.PreSendRequestHeaders += new EventHandler(_application_PreSendRequestHeaders);
        }

        protected void _application_PreSendRequestHeaders(object sender, EventArgs e)
        {
            HttpResponse response = this._application.Response;

            if (response.ContentType.Equals("text/html", StringComparison.CurrentCultureIgnoreCase))
            {
                string message = string.Empty;

                switch (response.StatusCode)
                {
                    case 401:
                        // Replace the stock 401 body with the custom HTML page.
                        message = File.ReadAllText(this._application.Server.MapPath(Custom401File));
                        this._application.Response.Clear();
                        this._application.Response.Write(message);
                        break;
                }
            }
        }
    }
    

    Once you hook that up in your web.config, go ahead and try to load http://YourSite/_layouts/settings.aspx. You should get the same authentication prompt you'd expect for an anonymous user, but if you hit cancel you'll now see your custom 401 message. Note that the way you register an HttpModule in 2010 is slightly different. If you add the module to the httpModules element in web.config, it may not work. In testing, I was only able to get my module to work by adding it to the modules element with the other SharePoint HttpModules.

    As a final level of customization, you could consider having no prompt at all on your public site when users request restricted content. The way to do this is to extend the site and have a private site as well. Since we are just talking about two different web sites in IIS (with corresponding settings and web.config files), you can control their authentication and error settings independently while still serving up the same content. The public site could then have Anonymous Access enabled and Windows Authentication disabled. This removes the login prompt when a 401 occurs on the public site. The private site would have Anonymous Access disabled but Windows Authentication enabled, allowing users to log in and manage content as needed.
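
    A sketch of creating that second zone with PowerShell (the names, zone, and URL are placeholders) might look like:

     $webApp = Get-SPWebApplication "http://PublicSite"
     New-SPWebApplicationExtension -Identity $webApp -Name "Private Authoring" -Zone Intranet -URL "http://authoring.internal"
     # Anonymous Access and Windows Authentication are then configured independently for each zone.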

    Quirks
    Since you are probably using the Publishing template, it's worth mentioning the Lockdown feature. This feature stops users from being able to see your Form pages. It also locks down lists and libraries, such as the Style Library. So make sure if you use CSS in your custom error pages that you know whether you have lockdown enabled and plan locations for your assets appropriately.

    Wednesday, May 26, 2010

    SharePoint 2010 Content Deployment Jobs Missing Deployment Options Section

    I've been working with Content Deployment jobs lately, and noticed that in 2010, the section on the Create Jobs page where you can specify whether to perform an incremental or full content deployment was missing. Just to make sure I wasn't crazy, I looked up the old screen from 2007, which looks like this:


    As you can see, there is a section there for setting the types of content that are deployed. The first option, "Deploy only new, changed, or deleted content" is the incremental deployment. The second option, "Deploy all content, including content that has been deployed before", is the full deployment.

    Now compare that screen to the 2010 equivalent:

    For some reason, the Deployment Options section is missing. This is not gonna work. If you recall from 2007, doing a full deployment after the initial deployment can cause problems. See this article for more information about why.


    So, how do we create an incremental job? The answer, courtesy of Becky Bertram, MVP:
    You can't do it through Central Administration, but you can specify whether you want to do a full or incremental deployment using PowerShell. Go to your SharePoint PowerShell prompt and type get-help new-spcontentdeploymentjob -detailed and then take a look at the IncrementalEnabled parameter.

    Here is a simple script to create an incremental content deployment job, assuming you have already defined a content deployment path of "Authoring to Production":

    New-SPContentDeploymentJob -Name "Authoring to Production - Incremental" 
    -SPContentDeploymentPath "Authoring to Production" -IncrementalEnabled:$true
    

    Looking in Central Administration, you'll see your created job, but you still can't see if it's incremental or full. To do that, run Get-SPContentDeploymentJob "Authoring to Production - Incremental". You should notice the following line in your output:
    • ExportMethodType             : ExportChanges
    That sounds awfully like an incremental doesn't it? Let's try setting the  IncrementalEnabled parameter to false and creating another job:
    New-SPContentDeploymentJob -Name "Authoring to Production - Full" 
    -SPContentDeploymentPath "Authoring to Production" -IncrementalEnabled:$false 
    
    Now run Get-SPContentDeploymentJob "Authoring to Production - Full" and you should see a slightly different export method type:
    • ExportMethodType             : ExportAll
    So, this means that your Content Deployment job is going to be incremental by default. The only way to get a full job is to create it via PowerShell. Given the limitations and caveats with full jobs, this seems like a good change.

    Thursday, May 6, 2010

    SP 2010 - Web Parts and Embedded Resources Not Working

    I ran into a problem recently trying to use embedded resources with SharePoint 2010 web parts. I was using code I had used directly in non-SharePoint ASP.NET server controls which, if you aren't familiar with it, is pretty simple. I'll briefly describe how you'd use an embedded resource, then describe my problem and how I fixed it.

    First, you need to just add your resource, which in my case was a stylesheet, to your project. Then, using the properties pane, set the Build Action to Embedded Resource. Once you've done that, you'll need to edit the AssemblyInfo.cs file within your project and enable the embedded resource.

    [assembly: WebResource("MyAssembly.styles.MyStyleSheet.css", "text/css")]
    

    Now you have a resource that will be embedded in your DLL. The path to that resource is specified in the WebResourceAttribute above. I usually compile and open my DLL in reflector at this point to confirm that the path I've picked is correct. Once it is, you can include your stylesheet like this:

    string styleSheetUrl = Page.ClientScript.GetWebResourceUrl(this.GetType(), "MyAssembly.styles.MyStyleSheet.css");
    // The original post could not embed the HTML; a standard stylesheet link tag is assumed here.
    LiteralControl styleSheetLink = new LiteralControl(string.Format("<link rel=\"stylesheet\" type=\"text/css\" href=\"{0}\" />", styleSheetUrl));
    Page.Header.Controls.Add(styleSheetLink);
    

    In theory, that should work, right? It works in server controls, and it's worked for me in older web parts for MOSS 2007. But, when I built my 2010 web part, it wasn't working.

    I viewed source on my page and could see that the embedded resource was added to the markup. But when I tried to load that URL myself, I got an error, implying my resource wasn't there at all! I checked and rechecked my path inside the DLL. Everything was right.

    But then I noticed something different about 2010 web parts. They use a LoadControl mechanism. I was accessing my embedded resources, using the above code, directly inside the webpart.ascx.cs, not the webpart.cs file. On a hunch, I moved the code into the webpart.cs file. Success!

    This got me curious, so I looked at the source for the generated HTML and noticed that the generated URL for the embedded resource had changed. Apparently, the path is based somewhat on that first parameter to GetWebResourceUrl(), which is a type. Tinkering a little more, I learned that I could actually leverage the embedded resource from the webpart.ascx.cs by making a minor adjustment.

    string styleSheetUrl = Page.ClientScript.GetWebResourceUrl(this.Parent.GetType(), "MyAssembly.styles.MyStyleSheet.css");
    

    Notice that I'm now passing the type of the parent, which for our webpart.ascx.cs is the wrapper class that calls LoadControl.

    Tuesday, May 4, 2010

    SP 2010 - LINQ versus CAML Joins and the Nuances of Projected Fields

    When working with relational lists in SharePoint 2010, you have the option to use LINQ to SharePoint or CAML to join those lists to pull data out. While LINQ is easier to use and will leverage CAML under the covers, it is not always capable of performing queries that CAML can directly.

    For instance, let's say you have a list called Parent. This list has a single valued lookup column, PrimaryChild, that refers to the Children list. LINQ can very easily perform a query with a where clause based on PrimaryChild:

    var parents = from p in context.Parents
           where p.PrimaryChild.Title == "My First Child"
           select p;
    

    However, if you were to create a second lookup column, OtherChildren, that allowed multiple values, LINQ would run into difficulties because the lookup is now represented as an EntitySet. With a single valued lookup field, you would instead have just a strongly typed object, with direct access to the fields within that list item.

    var parents = from p in context.Parents
           where p.OtherChildren.Any(c => c.Title == "My Other Child")
           select p;
    

    Running this query will throw an exception that semi-efficient queries are not allowed. If you recall, this is because LINQ leverages Two-Stage Queries. So, you can continue with LINQ and perform the query in two stages, or you can change the query to use CAML.

    Here is the same query in CAML, but this time, the query will not cause an exception.

    // Note: the CAML strings in the original post were stripped by the blog platform. The markup
    // below is a plausible reconstruction based on the list and field names discussed in this post
    // (the Parents list, the OtherChildren lookup, and the child fields Title and Nickname).
    using (SPSite site = new SPSite(SPContext.Current.Site.ID))
    {
     SPWeb web = site.RootWeb;
     if (web != null)
     {
      SPList list = web.Lists["Parents"];
      if (list != null)
      {
       SPQuery query = new SPQuery();
       StringBuilder sbQuery = new StringBuilder();
       sbQuery.Append("<Where><Eq>");
       sbQuery.Append("<FieldRef Name='OtherChildrenTitle' /><Value Type='Text'>");
       sbQuery.Append("My Other Child");
       sbQuery.Append("</Value></Eq></Where>");
       query.Query = sbQuery.ToString();
    
       StringBuilder sbJoins = new StringBuilder();
       sbJoins.Append("<Join Type='INNER' ListAlias='OtherChildren'>");
       sbJoins.Append("<Eq>");
       sbJoins.Append("<FieldRef Name='OtherChildren' RefType='ID' />");
       sbJoins.Append("<FieldRef List='OtherChildren' Name='ID' />");
       sbJoins.Append("</Eq>");
       sbJoins.Append("</Join>");
       query.Joins = sbJoins.ToString();
    
       StringBuilder sbProj = new StringBuilder();
       sbProj.Append("<Field Name='OtherChildrenTitle' Type='Lookup' List='OtherChildren' ShowField='Title' /><Field Name='OtherChildrenNickname' Type='Lookup' List='OtherChildren' ShowField='Nickname' />");
       query.ProjectedFields = sbProj.ToString();
    
       StringBuilder sbView = new StringBuilder();
       sbView.Append("<FieldRef Name='Title' />");
       sbView.Append("<FieldRef Name='OtherChildrenNickname' />");
       query.ViewFields = sbView.ToString();
    
       if (!string.IsNullOrEmpty(query.Query))
       {
        SPListItemCollection matches = list.GetItems(query);
        foreach (SPListItem match in matches)
        {
         Console.WriteLine(match["Title"]);
    
         string rawNickname = (string)match["OtherChildrenNickname"];
         if (!string.IsNullOrEmpty(rawNickname))
         {
          SPFieldLookupValue nickname = new SPFieldLookupValue(rawNickname);
          Console.WriteLine(nickname.LookupValue);
         }
        }
       }
      }
     }
    }
    

    You'll notice that in the example above I'm performing a CAML join and also leveraging Projected Fields. Here are some guidelines/rules to keep in mind:
    • The CAML has several attributes that ask for the list name or list alias. This is NOT the actual name of the list. Rather, it is the internal name of the lookup field within your list. So in our example, the list was named Children, and the field was OtherChildren. We used OtherChildren to build the join and projected fields.
    • Projected Fields used in your query do not have to match up to the Projected Fields you have specified in the Child list. Those are a UI convenience and not used by your CAML.
    • If you want to display a value from the Child list, you need to make sure you have a Projected Field in your CAML. Only those fields which are projected are eligible to be View fields.
    • All Projected Fields will become SPFieldLookupValue objects (or perhaps SPFieldLookupValueCollection, though I haven't yet had one of my lists do this). Those lookups will always contain the ID of the child list item paired with the value of the projected field from that child item.

    Thursday, April 29, 2010

    SP 2010 - Phonetic Search Only for People Scope?

    I have a custom scope I've created that I am querying against with the FullTextSqlQuery object. Recently I was asked if we could allow a phonetic search. With all the hype around this new feature in SharePoint 2010, I thought it would be very easy.

    I looked, and sure enough, there is an EnablePhonetic property right on FullTextSqlQuery. I set it to true, ran my search and got back...nothing. I figured maybe it didn't like my query, which had a LIKES keyword. That wasn't it. I tried looking on the web and it seems all the mentions of phonetic search seem to be closer to press releases than coding snippets.

    Finally I found this nugget on MSDN:
    For FAST Search Server 2010 for SharePoint, this property is only applicable for People Search.
    Well, I'm not using FAST, but I am using a custom search scope. It would appear that might be the limiting factor here. While I am searching for people, they are stored in a custom list, since they are not members of my SharePoint site. Therefore I was using a custom scope to get to them.

    It would seem phonetic search is not available yet to customize in this way.

    Wednesday, April 28, 2010

    SP 2010 - Configure and Use a TaxonomyWebTaggingControl

    If you are using the Managed Metadata Service and developing a custom web part, you may be interested in using that nice term picker that you see when you create a new list item that contains a Managed Metadata Column.

    First, let's cover the bare minimum code you'll need to get the term picker to show up in your web part. First you'll want to add a reference to Microsoft.SharePoint.Taxonomy to your project. Then you'll need a Register directive in your ascx, like this:

    <%@ Register Tagprefix="Taxonomy" Namespace="Microsoft.SharePoint.Taxonomy" Assembly="Microsoft.SharePoint.Taxonomy, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> 
    

    Now you can add the picker to your web part just like any other control. The control declaration was stripped from the original post; a minimal version (the control ID matches the code-behind used later in this post) would look like:

     <%-- Minimal declaration; the required properties are set in code-behind below --%>
     <Taxonomy:TaxonomyWebTaggingControl ID="twtcSpecialty" runat="server" />

    You'll notice that I didn't set any properties on the TaxonomyWebTaggingControl. Well, there are three properties you'll need to set, but we'll do it programmatically. Those properties are:
    • SSPList - this property parses your string and builds a list used to set the SspId property
    • GroupId - a Guid
    • TermSetList - this property parses your string and builds a list used to set the TermSetId property

    Now, you can set these properties declaratively if you wish, but it won't be a very portable web part, as the Guids aren't part of the import CSV format for the Managed Metadata Service, nor do any of the Taxonomy Create methods allow setting of Ids. Because of this, it's best to have your web part set these properties on the TaxonomyWebTaggingControl programmatically. If you just want to see how you would go about setting these properties declaratively, scroll to the bottom of this post, as I've included it for reference.

    In order to make your web part portable, you're going to need to query the Managed Metadata Service and get these values yourself. Moreover, you're going to have to do it on every load of your web part since these properties are not stored anywhere between page loads.

    Note: If you only set these properties on page load, this will work as long as there are no scenarios where you need to redraw the control again, such as having a custom server validator. In that instance, your picker would work before the postback, but the look-ahead feature would be broken after the postback.

    In order to make loading our properties a little easier, here is an extension method for TaxonomyWebTaggingControl that allows you to quickly set the properties. The path part of this was inspired by PHolpar. Also, remember that extension methods must be defined in static classes.

    public static bool Configure(this TaxonomyWebTaggingControl twtc, string termSetPath)
    {
     bool configured = false;
    
     TaxonomySession session = new TaxonomySession(SPContext.Current.Site);
    
     string[] parts = termSetPath.Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
     if (parts.Length > 0)
     {
      TermStore termStore = GetTermStore(session, parts[0]);
      if (termStore != null && parts.Length > 1)
      {
       twtc.SSPList = termStore.Id.ToString();
    
       Group group = GetGroup(termStore, parts[1]);
       if (group != null && parts.Length > 2)
       {
        twtc.GroupId = group.Id;
    
        TermSet termSet = GetTermSet(group, parts[2]);
        if (termSet != null)
        {
         twtc.TermSetList = termSet.Id.ToString();
         configured = true;
        }
       }
      }
     }
    
     return configured;
    }
    

    In order to load MyTermSet, which lives within MyGroup, which resides within MyTermStore, you will use this extension method similar to the following:

    protected void Page_Load(object sender, EventArgs e)
    {
     bool configured = twtcSpecialty.Configure("MyTermStore/MyGroup/MyTermSet");
     if (!configured)
     {
      throw new ApplicationException("Unable to find target TermSet");
     }
    }
    

    The result looks just like the picker from other parts of SharePoint, with functioning look-ahead.


    As promised, here is the markup if you want to configure your TaxonomyWebTaggingControl declaratively. The control declaration itself was stripped by the blog platform; only the two GUIDs survived (0d8382cf-2d63-4421-a37a-b9386d0be5c8 and 7adeced1-b73e-47dc-b695-82bb28e2f48f), which would be assigned to the SSPList/GroupId/TermSetList properties on the TaxonomyWebTaggingControl element.


    Tuesday, April 27, 2010

    SP 2010 - Randomize the order of your search results from FullTextSqlQuery

    I had a need recently to perform a query with FullTextSqlQuery but to limit the results and then display them in a random order. The FullTextSqlQuery class does support a RowLimit property, though this would not work in my case because I didn't want to get the exact same answers every time.

    Remembering that LINQ to SharePoint often requires Two-Stage Queries, I decided to employ a similar approach here: pull back a limited result set first and then perform additional culling on it in code.

    DataTable searchResults = PerformSearch(query, queryRowLimit);
    if (searchResults != null && searchResults.Rows.Count > 0)
    {
     int limit = 10; //todo: this should be a webpart property
     var results = (from row in searchResults.AsEnumerable()
           orderby Guid.NewGuid()
           select row).Take(limit);
    }
    

    The PerformSearch method is just a standard setup for using FullTextSqlQuery to return a DataTable. Note that I limit my resultset as much as I can with the FullTextSqlQuery.QueryText property to try to avoid hitting a throttling exception.

    With the limited results returned, I then want to randomize the order of the records. LINQ allows us to do this the same way we would in SQL, and I just order by a random Guid. Once the results are randomized, we just grab the number of records we need with the Take() extension method.

    Friday, April 23, 2010

    SP 2010 Managed Metadata TermSet.CreateTerm Throws Error

    I've been working on importing a set of Terms into my Managed Metadata Term Store from a 3rd party database. However, I ran into a snag. When I execute the following code, I get an error, "There is already a term with the same default label and parent term."

    public static void CreateTermIfNotExists(TermSet termSet, string termName)
    {
     if (termSet != null && !string.IsNullOrEmpty(termName))
     {
      Term term = null;
    
      //This throws an exception if the Term doesn't exist
      try
      {
       term = termSet.Terms[termName];
      }
      catch { }
    
      if (term == null)
      {
       Term t = termSet.CreateTerm(termName, 1033);
       termSet.TermStore.CommitAll();
      }
     }
    }
    

    I attached my debugger and found out that even though I was checking for my term, the code would say it wasn't there, even though it was! The problem was that the specific term causing me trouble had an ampersand. The term name I supplied was "Foo & Bar", but the value put into the TermStore actually contained the wide (unicode) version of the ampersand.

    Looking through the documentation, I found this relevant comment:

    The name value will be normailized to trim consecutive spaces into one and replace the & character with the wide character version of the character (\uFF06). The leading and trailing spaces will be trimmed. It must be non-empty and cannot exceed 255 characters, and cannot contain any of the following characters ; "<>|&tab.

    That was indeed the behavior I was seeing. Thinking I had just found a limitation of the TermSet.Terms collection, I changed my code to this:

    public static void CreateTermIfNotExists(TermSet termSet, string termName)
    {
     if (termSet != null && !string.IsNullOrEmpty(termName))
     {
      Term term = null;
    
      TermCollection tc = termSet.GetTerms(termName, 1033, true, StringMatchOption.ExactMatch, 1, false);
      if (tc != null && tc.Count > 0)
      {
       term = tc[0];
      }
    
      if (term == null)
      {
       Term t = termSet.CreateTerm(termName, 1033);
       termSet.TermStore.CommitAll();
      }
     }
    }
    

    I ran again, and this time instead of blowing up on just that one case, my code went crazy trying to insert a bunch of different terms that were working fine before and throwing way more exceptions. A little research on this method led me to this post, which suggests the TermSet.GetTerms method does not work in Beta 2, which seems to be what I just discovered as well.

    I decided to explore the reference to normalizing term names from the MSDN link. My final pass at the code became:

    public static void CreateTermIfNotExists(TermSet termSet, string termName)
    {
        if (termSet != null && !string.IsNullOrEmpty(termName))
        {
            Term term = null;
    
            try
            { 
                string normalizedTermName = TermSet.NormalizeName(termName);
                term = termSet.Terms[normalizedTermName];
            }
            catch { }
    
            if (term == null)
            {
                Term t = termSet.CreateTerm(termName, 1033);
                termSet.TermStore.CommitAll();
            }
        }
    } 
    

    Finally, success! Moral of the story - when you query your TermSet for a specific Term you need to normalize your name first, because that is how it will be stored internally.

    Monday, April 19, 2010

    SP 2010 - Using LINQ to SharePoint to Find List Items with Specific Managed Metadata Terms

    Now that I can effectively use LINQ to access my Managed Metadata Columns, I'd like to only pull back those columns that contain values I need. For single valued Managed Metadata Columns, this is very straightforward:

    var examples = from d in context.Examples
          where d.Specialty is TaxonomyFieldValue && ((TaxonomyFieldValue)d.Specialty).Label == "Value One"
                   select d;
    

    For multi valued Managed Metadata Columns, my first attempt was a bust. I tried the following expression, but received a compiler error, "An expression tree may not contain an anonymous method expression".

    var examples = from d in context.Examples
                   where ((TaxonomyFieldValueCollection)d.Specialty).Exists(delegate(TaxonomyFieldValue tfv){
                 return tfv.Label == "Value One";
                   }) != null
          select d;
    

    The hint here from the compiler is that I can do this as long as I don't use an anonymous method. So I created a method to test for the term I wanted and changed my expression to call this method.

    private static bool ContainsMetadataTerm(object o, string termLabel)
    {
     bool exists = false;
    
     if (o is TaxonomyFieldValueCollection)
     {
      TaxonomyFieldValueCollection tfvc = (TaxonomyFieldValueCollection)o;
      exists = tfvc.Exists(delegate(TaxonomyFieldValue tfv)
      {
       return tfv.Label == termLabel;
      });
     }
    
     return exists;
    }
    
    var examples = from d in context.Examples
          where d.Specialty is TaxonomyFieldValueCollection && ContainsMetadataTerm(d.Specialty, "Value One")
          select d;
    

    Now I can query specifically for list items that use specific terms. Keep in mind that this filtering does not happen until after the list items have been pulled down, so you may want to do some additional filtering to narrow the results so you don't run afoul of the new Throttling feature of SharePoint 2010.

    Here was the CAML generated by the last LINQ query, which shows that the additional filtering for Term was done after the records were pulled. (The CAML itself was stripped by the blog platform; the surviving fragments show only a ContentTypeId filter beginning with 0x0100 and a row limit of 2147483647, with no Term filter present in the query.)


    SP 2010 - Managed Metadata Columns ARE Supported in LINQ to SharePoint

    I apparently spoke too soon! You can get to Managed Metadata Columns in LINQ to SharePoint with SPMetal, just not directly off the command line. You need to supply a parameters option.

    First you'll want to create your parameters file. I've found two different ways to get the Managed Metadata Column to show up. For the first attempt, I used the following XML in my parameters file. (The XML was stripped from the original post; this is a reconstruction, and the list name "Examples" plus the use of IncludeHiddenColumns are assumptions based on the data context and the hidden fields that show up below.)

     <?xml version="1.0" encoding="utf-8"?>
     <!-- Reconstruction: pull in the hidden columns that back the Managed Metadata Column -->
     <Web xmlns="http://schemas.microsoft.com/SharePoint/2009/spmetal">
       <List Name="Examples">
         <ContentType Name="Item">
           <IncludeHiddenColumns />
         </ContentType>
       </List>
     </Web>


    I saved this file to metaloptions.xml and then ran the following command.

    SPMetal.exe /web:http://mysite /code:SPMySite.cs /namespace:SPMySite /parameters:metaloptions.xml

    The output of that command will include a warning, but it seems to have no impact on LINQ working. I think it's purely informational and not relevant to what we are working with.

    Warning: All content types for list Form Templates were excluded.

    Now when I query with LINQ I see the hidden fields that power my Managed Metadata Column, which look familiar.

    On my typed list item object, I now had three new fields:
    • Specialty_0 - String
    • TaxonomyCatchAllColumnCatchAllData - IList
    • TaxonomyCatchAllColumnId - IList

    Those fields hold values that look like this, respectively. Unfortunately, this isn't terribly useful.
    • Value One|e203149a-6852-46fb-9d8e-9c21d350068d
    • z4KDDWMtIUSjerk4bQvlyA==|0c7eej633Ee2lYK7KOL0jw==|mhQD4lJo+0adjpwh01AGjQ==
    • 4

    I tried another pass on my parameters file. This time I used the following XML. (Again, the XML was stripped from the original post; this reconstruction assumes the list is named Examples and explicitly includes the Specialty column.)

     <?xml version="1.0" encoding="utf-8"?>
     <!-- Reconstruction: surface the Managed Metadata Column itself as a typed member -->
     <Web xmlns="http://schemas.microsoft.com/SharePoint/2009/spmetal">
       <List Name="Examples">
         <ContentType Name="Item">
           <Column Name="Specialty" />
         </ContentType>
       </List>
     </Web>


    On my typed list item object, I now have the one extra Column that I specified in my parameters file. LINQ will create a property for this field that is just an object. However, if you access this property, you can cast it to a TaxonomyFieldValueCollection or TaxonomyFieldValue, as appropriate. This provides exactly the information we wanted:

    using (SPMySiteDataContext context = new SPMySiteDataContext("http://mysite"))
    {
     var examples = from d in context.examples
          select d;
    
     foreach (var example in examples)
     {
      if (example.Specialty is TaxonomyFieldValueCollection)
      {
       foreach (TaxonomyFieldValue tfv in (TaxonomyFieldValueCollection)example.Specialty)
       {
        Console.WriteLine(tfv.Label);
       }
      }
      else if (example.Specialty is TaxonomyFieldValue)
      {
       TaxonomyFieldValue tfv = (TaxonomyFieldValue)example.Specialty;
       Console.WriteLine(tfv.Label);
      }
     }
    }
    

    SP 2010 - Managed Metadata Columns Not Supported in LINQ to SharePoint?

    Today I am playing with LINQ to SharePoint. I created a simple list and added a Managed Metadata Column to my list. I then used SPMetal to generate a DataContext class. My Managed Metadata Column is nowhere to be found. I looked through options for SPMetal to see if maybe it just needed a switch to capture those Managed Metadata Columns, but I don't see one.

    So it appears that Managed Metadata Columns have no support in LINQ to SharePoint. They are also not eligible to be Projected Fields. It would seem Microsoft didn't really flesh out all the ways the new Managed Metadata Service might be used.

    Update: I figured out how to do this.

    Thursday, April 15, 2010

    SP 2010 - Creating ListItems with Managed Metadata Columns

    Today I needed to create some list items programmatically for a list that contained Managed Metadata columns. Since Managed Metadata columns are specialized SPLookups under the covers, it's a little more difficult to set your field value than just assigning the text value of your Term.

    Here is a method that will set the value for you, and an example of how to use it. Note that in my example the Managed Metadata column is multi-value, even though the example method is only set up to add a single Term. If you have a single-value Managed Metadata column, you can leave out the TaxonomyFieldValueCollection.

    class Program
    {
     static void Main(string[] args)
     {
      using (SPSite site = new SPSite("http://mysite"))
      {
       SPList pl = site.RootWeb.Lists["MyList"];
       if (pl != null)
       {
        TaxonomySession session = new TaxonomySession(site);
        TermStore termStore = session.TermStores["Managed Metadata Service"];
        Group group = termStore.Groups["MyGroup"];
        TermSet termSet = group.TermSets["MyTermSet"];
    
        SPListItem li = pl.AddItem();
        li[SPBuiltInFieldId.Title] = "my new item";
    
        Term term = termSet.Terms["MyTerm"];
        UpdateListItemTermColumn(li, "MyMetadataField", term);
    
        li.Update();
       }
      }
     }
    
     static void UpdateListItemTermColumn(SPListItem li, string fieldName, Term term)
     {
      SPField termField = li.Fields[fieldName];
    
      TaxonomyFieldValueCollection tfvc = new TaxonomyFieldValueCollection(termField);
      tfvc.Add( new TaxonomyFieldValue(termField));
      li[fieldName] = tfvc;
    
      TaxonomyField taxonomyField = termField as TaxonomyField;
      if (taxonomyField != null)
      {
       taxonomyField.SetFieldValue(li, term);
      }
     }
    }
    

    SP 2010 - Empty Logs Versus Least Permissions Install

    Earlier this week I was having problems with my SharePoint logs being empty. At the time, the only fix I could discover was giving my AppPool identity Administrator rights on the machine. Obviously the reason the account lacked such permissions was because I had given it no rights when I created it - the same way I would have when I install MOSS 2007.

    Well giving Administrator rights to your AppPool identities in SharePoint 2010 is also a bad thing, as shown by the new Health Monitoring:


    However, at this time, I can't seem to find what permission set is needed for ULS to be accessible by my AppPool account. Right now my choices are to upset Health Monitoring or to have logs. Having both is apparently a luxury.

    Wednesday, April 14, 2010

    SharePoint 2010 - Can't Open Crawled Properties

    I have been working on building some custom Search components, which will leverage my own Managed Properties. However, early on in the process I hit a snag.

    I created some sample content and then ran a full index of my farm. SharePoint was able to discover some new Crawled Properties during this process, which I was hoping to turn into Managed Properties. However, when clicking on any of the new Crawled Properties, I get an error "Unable to cast object of type 'System.DBNull' to type 'System.String'":



    Using my newly working SharePoint logs, I found the following error message:

    SchemaDatabase.GetSamples:Error occurred when reading [SampleUrl] System.InvalidCastException: Unable to cast object of type 'System.DBNull' to type 'System.String'.     at Microsoft.Office.Server.Search.Administration.SchemaDatabase.GetSamples(CrawledProperty crawledProperty, Int32 sampleCount)

    I then opened up Reflector to find what was going on in that method. It turns out to be a very simple method that just calls a stored procedure. I fired up SQL Server Profiler and tracked down this call, which ultimately was breaking the page:

    exec dbo.proc_MSS_GetCrawledPropertySamplesByPropertyID @CrawledPropertyId=334,@SampleCount=5

    So, as it turns out, this SProc does handle NULLs, just not in the strVal column where my problem was! Here is the SProc, which I found in the Search_Service_Application_PropertyStoreDB_{GUID} database:

    CREATE PROCEDURE dbo.proc_MSS_GetCrawledPropertySamplesByPropertyID
    @CrawledPropertyId  int,        
    @SampleCount        int
    AS
            set RowCount @SampleCount
         SELECT
               ( DP.strVal + ISNULL(cast(DP.strVal2 AS nvarchar(2000)), '') ) as 'SampleURL'
         FROM 
                dbo.MSSDocProps as DP         
            INNER JOIN 
                dbo.MSSCrawledPropSamples as CPS
                on CPS.DocId = DP.DocId
         WHERE
             CPS.CrawledPropertyId = @CrawledPropertyId
                AND DP.Pid          = 7
            ORDER BY DP.strVal
            set RowCount 0
    

    Normally I wouldn't dream of altering a SharePoint SProc, but seeing as we are still in beta and I'm approaching a deadline, I decided to make a slight adjustment:

    alter PROCEDURE dbo.proc_MSS_GetCrawledPropertySamplesByPropertyID
    @CrawledPropertyId  int,        
    @SampleCount        int
    AS
            set RowCount @SampleCount
         SELECT
               ( ISNULL(DP.strVal,'') + ISNULL(cast(DP.strVal2 AS nvarchar(2000)), '') ) as 'SampleURL'
         FROM 
                dbo.MSSDocProps as DP         
            INNER JOIN 
                dbo.MSSCrawledPropSamples as CPS
                on CPS.DocId = DP.DocId
         WHERE
             CPS.CrawledPropertyId = @CrawledPropertyId
                AND DP.Pid          = 7
            ORDER BY DP.strVal
            set RowCount 0 

    Now, when clicking on my Crawled Property, I get a proper page:

    SharePoint 2010 ULS Problems - Logs are Empty!

    I've been noticing since my install that SharePoint 2010 seems to not be logging much. It was logging some, but most of the logs were very small. Contrast this with SharePoint 2007, where you could easily generate several hundred kilobytes of logging within a few minutes, and you can tell something is wrong.

    I tried a lot of things to get to the root of my problem. The question I was trying to answer was why sometimes I'd get logs, and other times not. And why were none of my Correlation IDs found? I tried a lot of different things, including:
    • Disabling the Windows Firewall
    • Stopping and restarting the Windows SharePoint Services Tracing V4 service.
    • Running commands from stsadm or Powershell to see if they could log.
    While doing this, I started reading the log entries I was receiving and noticed something. Each of the entries listed the process that was adding the entry. I saw entries from the following processes:
    • STSADM.EXE
    • PowerShell.exe
    • wsstracing.exe
    • vssphost4.exe
    • psconfigui.exe
    Surprisingly, I didn't see a single entry for w3wp.exe. I decided to check permissions for the LOGS folder. The user who runs my AppPool, sp_Farm, looks like it has almost every permission there is and should be able to do almost anything to the log files. It didn't make sense.
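
    A quick way to inspect the ACL on the trace log folder (the path assumes the default 14-hive location) is:

     Get-Acl "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS" | Select-Object -ExpandProperty Access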

    In frustration I put all my AppPool accounts in the Administrators group and rebooted the machine. Suddenly my logs are going wild.

    I'm not entirely sure why my AppPool accounts couldn't write to the logs when they are already in WSS_ADMIN_WPG, but there is clearly some missing permission somewhere. I'll have to break out procmon when I have some free time to figure out which permission was missing.

    Tuesday, April 13, 2010

    How Not to Run SharePoint 2010 Successfully

    I'm still waiting on some new RAM to arrive in the mail, so currently my development machine for SharePoint 2010 is a Core2Duo 2Ghz laptop running a VM with 2Gigs of RAM. Needless to say, I've gotten used to looking at this:

    SP 2010 - Adding Metadata Columns to Content Type Programmatically

    I have a feature that provisions a few Site Columns and a Content Type that will use those Site Columns. This was easy enough to achieve with some XML packaged into a Feature. However, my Content Type also needed to have a Managed Metadata column, which doesn't work with XML.

    The reasons you can't do a Managed Metadata Site Column through XML are the same reasons Lookup columns don't work - the GUIDs that refer to the underlying objects change between environments, so you can't hardcode them into the XML. If you wanted to write the XML by hand to do this and already knew the GUIDs, it would of course work, but that's more effort than it's worth for such a limited solution.

    My first thought came from having exported a Content Type that I created through the GUI: the Managed Metadata Site Column it provisioned ended up being several different SPFields under the covers. So on my first pass, I tried to create all of those fields myself. That ended up breaking my site because I was unable to remove all of the SPFields/SPContentTypes after the fact!

    Well, it turns out the answer isn't so hard once you've tried all the wrong ways first. I ended up keeping the Feature that provisions the base Site Columns and Content Type via XML, and then added a Feature Receiver that creates the Managed Metadata Site Column and adds it to the Content Type. Here is the code:

    using System;
    using System.Runtime.InteropServices;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Taxonomy;

    public enum Operation
    {
     Add,
     Remove
    }
    
    [Guid("367cb65f-cd86-440a-8f39-1bfa2a9ab1f6")]
    public class MyContentTypeEventReceiver : SPFeatureReceiver
    {
     /*
      * On feature activation, we are going to provision a site column that points to the Managed MetaData Service used by this site.
      * We have to do this because managed metadata columns work like lookups do under the covers and are keyed to the store they were
      * created with.
      * 
      * After creating the site column, we will then update our content type to include the new site column.
      */
     public override void FeatureActivated(SPFeatureReceiverProperties properties)
     {
      ProvisionMetadataSiteColumn("Managed Metadata Service", "MyGroup", "MyTermSet", "MyField", "MyFieldGroup", true, true);
      UpdateContentType("MyContentType", "MyField", Operation.Add);
     }
    
     public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
     {
      UpdateContentType("MyContentType", "MyField", Operation.Remove);
      RemoveSiteColumn("MyField");
     }
    
     #region Helper Methods
    
     private bool ProvisionMetadataSiteColumn(string termStoreName, string termGroupName, string termSetName, string fieldName, string fieldGroupName, bool isRequired, bool allowMultipleValues)
     {
      bool added = false;
      // Note: the SPSite from SPContext is owned by the context - don't wrap it in a using block or dispose it here
      SPSite site = SPContext.Current.Site;
      {
       if(GetSiteColumn(site.RootWeb, fieldName) == null)
       {
        //this can time out which would cause it to be returned with no term stores; might be a fluke or very situational
        TaxonomySession session = new TaxonomySession(site);
    
        TermStore termStore = GetTermStore(session, termStoreName);
        if (termStore != null)
        {
         Group group = GetTermGroup(termStore, termGroupName);
         if (group != null)
         {
          TermSet termSet = GetTermSet(group, termSetName);
          if (termSet != null)
          {
           string fieldType = (allowMultipleValues ? "TaxonomyFieldTypeMulti" : "TaxonomyFieldType");
    
           TaxonomyField field = (TaxonomyField)site.RootWeb.Fields.CreateNewField(fieldType, fieldName);
           field.SspId = termStore.Id;
           field.TermSetId = termSet.Id;
           field.AllowMultipleValues = allowMultipleValues;
           field.Group = fieldGroupName;
           field.Required = isRequired;
    
           site.RootWeb.Fields.Add(field);
           site.RootWeb.Update();
    
           added = true;
          }
         }
        }
       }
      }
    
      return added;
     }
    
     private void UpdateContentType(string contentTypeName, string fieldName, Operation operation)
     {
      // As above, don't dispose the SPSite that comes from SPContext
      SPSite site = SPContext.Current.Site;
      {
       SPContentType contentType = GetContentType(site.RootWeb, contentTypeName);
       if (contentType != null)
       {
        SPField field = GetSiteColumn(site.RootWeb, fieldName);
        if (field != null)
        {
         bool hasFieldLink = HasFieldLink(contentType, field.Id);
    
         if (operation == Operation.Add && !hasFieldLink)
         {
          SPFieldLink link = new SPFieldLink(field);
          contentType.FieldLinks.Add(link);
          contentType.Update();
         }
         else if (operation == Operation.Remove && hasFieldLink)
         {
          contentType.FieldLinks.Delete(field.Id);
          contentType.Update();
         }
        }
       }
      }
     }
    
     private bool RemoveSiteColumn(string fieldName)
     {
      bool deleted = false;
    
      // As above, don't dispose the SPSite that comes from SPContext
      SPSite site = SPContext.Current.Site;
      {
       SPField field = GetSiteColumn(site.RootWeb, fieldName);
       if (field != null)
       {
        try
        {
         field.Delete();
         deleted = true;
        }
        catch { }
       }
      }
    
      return deleted;
     }
    
     #endregion
    
     #region Helper Methods to allow NULL checks with SharePoint
    
     private SPField GetSiteColumn(SPWeb web, Guid fieldId)
     {
      SPField field = null;
    
      try
      {
       field = web.Fields[fieldId];
      }
      catch { }
    
      return field;
     }
    
     private SPField GetSiteColumn(SPWeb web, string fieldName)
     {
      SPField field = null;
    
      try
      {
       field = web.Fields[fieldName];
      }
      catch { }
    
      return field;
     }
    
     private SPContentType GetContentType(SPWeb web, string contentTypeName)
     {
      SPContentType contentType = null;
    
      try
      {
       contentType = web.ContentTypes[contentTypeName];
      }
      catch { }
    
      return contentType;
     }
    
     private bool HasFieldLink(SPContentType contentType, Guid fieldId)
     {
      bool found = false;
      foreach (SPFieldLink fl in contentType.FieldLinks)
      {
       if (fl.Id == fieldId)
       {
        found = true;
        break;
       }
      }
    
      return found;
     }
    
     private TermStore GetTermStore(TaxonomySession session, string termStoreName)
     {
      TermStore termStore = null;
    
      try
      {
       termStore = session.TermStores[termStoreName];
      }
      catch { }
    
      return termStore;
     }
    
     private Group GetTermGroup(TermStore termStore, string termGroupName)
     {
      Group group = null;
    
      try
      {
       group = termStore.Groups[termGroupName];
      }
      catch { }
    
      return group;
     }
    
     private TermSet GetTermSet(Group group, string termSetName)
     {
      TermSet termSet = null;
    
      try
      {
       termSet = group.TermSets[termSetName];
      }
      catch { }
    
      return termSet;
     }
    
     #endregion
    }
    

    TaxonomySession doesn't show my Term Stores

    I ran into an issue yesterday that I haven't completely figured out just yet. Specifically, I had a feature receiver that was querying a Term Store to create a Site Column. Unfortunately, it kept failing. When I attached a debugger to the process it eventually showed me that TaxonomySession.TermStores was empty.

    After playing with it a bit, I finally got it to work by opening up Central Administration and going to the Managed Metadata Service page to "warm up" the Taxonomy service. I'm not sure if it was a web service timeout issue or not, but that seems likely. I believe the worker process powering the Taxonomy web service was taking too long to spin up on my resource-starved laptop, which was causing TaxonomySession to give up on loading any Term Stores.

    Unfortunately, ULS shut itself down for some reason during this period, and I was unable to find any logging event that would give me more insight.

    If I find out any more, I'll post about it.
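
    In the meantime, if you run into the same thing, one defensive option is to retry instead of failing outright while the service spins up. Below is a minimal sketch of that idea; it's my own workaround rather than a confirmed fix, and the helper name is just illustrative.

    using System;
    using System.Threading;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Taxonomy;

    static class TaxonomyHelper
    {
        // Keeps building a fresh TaxonomySession until TermStores comes back non-empty, or gives up
        public static TermStore GetTermStoreWithRetry(SPSite site, string termStoreName, int attempts, int delayMilliseconds)
        {
            for (int i = 0; i < attempts; i++)
            {
                // Create a new session on each pass; one built while the service was still cold may stay empty
                TaxonomySession session = new TaxonomySession(site);
                if (session.TermStores.Count > 0)
                {
                    return session.TermStores[termStoreName];
                }

                Thread.Sleep(delayMilliseconds);
            }

            return null;
        }
    }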

    Monday, April 12, 2010

    SharePoint 2010 Site Columns for Managed Metadata

    I'm still not 100% sure how references to the Managed Metadata Term Stores work. But the Content Type I recently created used one and I'm hoping to get a chance to digest this further soon.

    For instance, if I make a feature that pushes out this Site Column, will the GUIDs line up with the external Term Store as long as I use the export/import feature on the Managed Metadata Service?

    While I ponder those things, here is the XML for a site column that refers to the Term Store. As you can see, it's split across several SPFields.

    SharePoint 2010 Content Type Features

    Today I decided to create a Content Type for SharePoint 2010. As far as I can tell, while a lot of effort has gone into taxonomy improvements, such as tagging, the Content Type is still alive and well and works the same.

    The easiest way to create a Content Type feature in MOSS 2007 was to create it in the UI first and then extract it. Normally I use a tool for that, but this time I decided to give it a shot and write the code myself. In the process, I could see if the API had changed in this area. It appears to work the same, so MOSS 2007 experience will apply directly.

    Here is the code if you want to try it yourself. The XML written out at the end is the contents of your Elements.xml file.

    using System;
    using System.Text;
    using System.Xml;
    using Microsoft.SharePoint;

    class Program
        {
            static void Main(string[] args)
            {
                GetContentType();
            }
    
            private static void GetContentType()
            {
                //inside SharePoint we'd pull the site from SPContext.Current.Site, so we wouldn't need to create a new SPSite here
                using (SPSite site = new SPSite("http://sharepoint"))
                {
                    string ctype = "0x0100420995776E88234093C72FC85E59E4A8"; //from querystring. you could also loop through the RootWeb.ContentTypes collection and find by name
                    SPContentTypeId id = new SPContentTypeId(ctype);
    
                    bool includeSiteColumns = true;
    
                    XmlDocument xdElements = new XmlDocument();
                    XmlElement xeElementRoot = xdElements.CreateElement("Elements");
    
                    XmlAttribute xaNamespace = xdElements.CreateAttribute("xmlns");
                    xaNamespace.Value = "http://schemas.microsoft.com/sharepoint/";
                    xeElementRoot.Attributes.Append(xaNamespace);
    
                    xdElements.AppendChild(xeElementRoot);
    
                    SPWeb web = site.RootWeb;
                    SPContentType ctMatch = web.ContentTypes[id];
                    if (ctMatch != null)
                    {
                        XmlDocument xdSchema = new XmlDocument();
                        xdSchema.LoadXml(ctMatch.SchemaXml);
    
                        //move the field nodes - otherwise we won't pass the validation of an Elements node for our feature
                        //these will become Site Columns
                        XmlNode xnFields = xdSchema.SelectSingleNode("//Fields");
                        if (xnFields != null)
                        {
                            xnFields.ParentNode.RemoveChild(xnFields);
    
                            if (includeSiteColumns)
                            {
                                XmlDocumentFragment xdfFields = xdElements.CreateDocumentFragment();
                                xdfFields.InnerXml = xnFields.InnerXml;
                                xeElementRoot.AppendChild(xdfFields);
                            }
                        }
    
                        XmlDocumentFragment xdfContentType = xdElements.CreateDocumentFragment();
                        xdfContentType.InnerXml = xdSchema.FirstChild.OuterXml;
    
                    //inject FieldRefs to refer to the fields above
                        XmlNode xnFieldRefs = xdElements.CreateElement("FieldRefs");
    
                        StringBuilder sbFieldRefs = new StringBuilder();
                        foreach (SPFieldLink fr in ctMatch.FieldLinks)
                        {
                            sbFieldRefs.Append(fr.SchemaXml);
                        }
    
                        XmlDocumentFragment xdfFieldReferences = xdElements.CreateDocumentFragment();
                        xdfFieldReferences.InnerXml = sbFieldRefs.ToString();
                        xnFieldRefs.AppendChild(xdfFieldReferences);
    
                        xdfContentType.FirstChild.AppendChild(xnFieldRefs);
    
                        xeElementRoot.AppendChild(xdfContentType);
    
                    //this string is the XML for your Elements.xml file
                    string s = xeElementRoot.OuterXml;
                    Console.WriteLine(s);
                    }
                }
            }
        }
    

    Thursday, April 8, 2010

    SharePoint 2010 Random Publishing Error

    Just got this lovely error when clicking the Edit button on a page that was pending approval on my publishing site. The joy of beta software!

    SharePoint 2010 Publishing Workflows Need the State Service!

    In learning SharePoint 2010, I'm trying to take a minimalist approach, and only install services that I need. The problem is I don't yet know what those services are!

    So far, on my new installation I have a Managed Metadata Service and a Search Service Application. Wanting to test Search, I edited one of the default pages, tried to publish it, and got the following message:


    I suppose that the State Service needs to go on my list of required services. So, I went to install it from the Manage Service Applications page. For some reason, however, it was not listed!


    Of course, given my problems with the Managed Metadata Service, I now know that the true magic in SharePoint comes from the Farm Configuration Wizard. I opened it up and was not disappointed when I found this guy:


    That leaves me with the following now running in my environment, and publishing workflows that display properly.
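
    If you'd rather confirm from code what the farm actually has than eyeball the Manage Service Applications page, a quick console listing works too. This is just a convenience sketch of mine, using the same SPFarm.Local.Services collection that shows up in the snippets further down the page.

    using System;
    using Microsoft.SharePoint.Administration;

    class ListFarmServices
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Services registered in the local farm:");
            foreach (SPService service in SPFarm.Local.Services)
            {
                // DisplayName is empty for some services, so fall back to the type name
                string name = string.IsNullOrEmpty(service.DisplayName) ? service.TypeName : service.DisplayName;
                Console.WriteLine("  " + name);
            }
        }
    }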

    SharePoint 2010 Pages Used to Create a New Managed Service

    Here is the list of pages that are displayed inside the popup windows when you choose to create a new managed service. See this post to see how I got this information.

    Application Type                     Create Application Link URL
    Access Database Service              /_admin/AccessServerCreateApplication.aspx
    Business Data Connectivity           /_admin/bdc/managebdcserviceapp.aspx?scenarioid=bdcservice
    Excel Calculation Services           /_admin/ExcelServerCreateApplication.aspx?scenarioid=ExcelServicesCreateApplication
    Managed Metadata Web Service         /_admin/ManageMetadataService.aspx
    PerformancePoint Service             /_admin/CreatePpsMaServiceAppData.aspx?scenarioId=CreatePpsMaApp
    SearchQueryAndSiteSettingsService    /_admin/search/TopologyAppSettings.aspx
    Secure Store Service                 /_admin/sssvc/createsssvcapplication.aspx?scenarioid=createsssvcapp
    User Profile Service                 /_admin/NewProfileServiceApplicationSettings.aspx?scenarioid=CreateNewProfileServiceApplication
    Visio Graphics Service               /_admin/VisioServiceApplications.aspx
    Web Analytics Web Service            /_admin/WebAnalyticsConfigWizardNewAppPage.aspx
    Word Automation Services             /_admin/WordServerCreate.aspx

    SharePoint 2010 Create New Managed Service Popup Windows

    While exploring SharePoint, I often read through Microsoft's code. With 2010, they added a lot of popup windows, and it's not always obvious which page is actually building their content.

    In this particular case, I wanted to know what the page was when I was trying to create a new Service Application. If you click the New button in the Ribbon, you'll get a nice flyout menu with a bunch of services to choose from. But where do they go?

    Here is a snippet that will get that answer for you:
    using System;
    using System.Collections.Generic;
    using Microsoft.SharePoint.Administration;

    class Program
        {
            static void Main(string[] args)
            {
                ShortDisplay();
            }
    
            internal class HeaderColumn
            {
                public string Title { get; set; }
                public int Width { get; set; }
            }
    
            static HeaderColumn[] headers = new HeaderColumn[] { 
                 new HeaderColumn{ Title="Application Type", Width=35 }, 
                 new HeaderColumn{ Title="Create Application Link URL", Width=45 }
            };
    
            static string headerString = string.Empty;
            static string lineFormatString = string.Empty;
    
            static List<SPService> administrationServices = new List<SPService>();
    
            private static void FormatHeader()
            {
                for (int i = 0; i < headers.Length; i++)
                {
                    headerString += string.Format("{0,-" + headers[i].Width.ToString() + "}", headers[i].Title);
                    lineFormatString += "{" + i.ToString() + ",-" + headers[i].Width.ToString() + "}";
                }
            }
    
            private static void FindAdministrationServices()
            {
                foreach (SPService service in SPFarm.Local.Services)
                {
                    IServiceAdministration administration = service as IServiceAdministration;
                    if (administration != null)
                    {
                        if (!administrationServices.Contains(service))
                        {
                            administrationServices.Add(service);
                        }
                    }
                }
            }
    
            private static List<SPAdministrationLink> GetAdministrationLinks(IServiceAdministration administration)
            {
                List<SPAdministrationLink> administrationLinks = new List<SPAdministrationLink>();
    
                foreach (Type type in administration.GetApplicationTypes())
                {
                    SPPersistedTypeDescription applicationTypeDescription = administration.GetApplicationTypeDescription(type);
                    if (applicationTypeDescription != null)
                    {
                        SPAdministrationLink createApplicationLink = administration.GetCreateApplicationLink(type);
                        if ((createApplicationLink != null) && !string.IsNullOrEmpty(createApplicationLink.Url))
                        {
                            administrationLinks.Add(createApplicationLink);
                        }
                    }
                }
    
                return administrationLinks;
            }
    
            public static void ShortDisplay()
            {
                FormatHeader();
                FindAdministrationServices();
                
                Console.WriteLine(headerString);
                foreach (SPService service in administrationServices)
                {
                    string displayName = (string.IsNullOrEmpty(service.DisplayName) ? service.TypeName : service.DisplayName);
                    //not very reusable, but good enough for now
                    if (displayName.Length > 35)
                    {
                        int lastDot = displayName.LastIndexOf('.');
                        if (lastDot >= 0 && displayName.Length > lastDot)
                        {
                            displayName = displayName.Substring(lastDot + 1, displayName.Length - lastDot - 1);
                        }
                    }
    
                    IServiceAdministration administration = service as IServiceAdministration;
                    List<SPAdministrationLink> administrationLinks = GetAdministrationLinks(administration);
                    foreach(SPAdministrationLink createApplicationLink in administrationLinks)
                    {
                        Console.WriteLine(lineFormatString, displayName, createApplicationLink.Url);
                    }
                }
            }
        }
    

    And here is the output:

    SharePoint 2010 AppPool DropDownList

    While playing around trying to figure out how SharePoint 2010 populates that wonderful drop-down list of AppPools, as seen in the screenshot just below, I noticed that all the classes it uses are marked internal sealed.


    Since I wanted to learn more about how this control worked, I decided to use Reflector to poke around. Here is the small amount of info I discovered about the items in the list:


    And here is the code that produces that output.

    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Reflection;
    using Microsoft.SharePoint.Administration;

    public class Program
        {
            static void Main(string[] args)
            {
                DiscoverAppPools();
            }
    
            public class AppPoolInfo
            {
                public string Name { get; set; }
                public string IISObjectName { get; set; }
                public IdentityType IdentityType { get; set; }
                public string Username { get; set; }
            }
    
            //Queries the farm to find the list of AppPools to display in an IisWebServiceApplicationPoolSection control
            public static void DiscoverAppPools()
            {
                List<AppPoolInfo> discoveredAppPools = new List<AppPoolInfo>();
    
                Assembly assembly = Assembly.Load("Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c");
                Type typeSPIisWebServiceSettings = assembly.GetType("Microsoft.SharePoint.Administration.SPIisWebServiceSettings");
    
                /*
                 * There are a few other interesting properties that we aren't pulling, but might be handy to peek at.
                 * oSettings.IisSiteName
                 * oSettings.HttpPort
                 * oSettings.HttpsPort
                 * oAppPool.Id
                 * oAppPool.Status
                 * oAppPool.ProcessAccount.SecurityIdentifier.Value
                 */
                object oSettings = SPFarm.Local.GetObject("SharePoint Web Services", SPFarm.Local.Id, typeSPIisWebServiceSettings);
                if (oSettings != null)
                {
                    Type typeSPIisWebServiceApplicationPoolCollection = assembly.GetType("Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPoolCollection");
                    ConstructorInfo constructor = typeSPIisWebServiceApplicationPoolCollection.GetConstructor(
                        BindingFlags.NonPublic | BindingFlags.Instance, null, new Type[] { typeSPIisWebServiceSettings }, null);
    
                    object oAppPools = null;                
                    if (constructor != null)
                    {
                        oAppPools = constructor.Invoke(new object[] {oSettings});
                        if (oAppPools != null)
                        {
                            Type typeSPIisWebServiceApplicationPool = assembly.GetType("Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPool");
    
                            IEnumerable appPools = (IEnumerable)oAppPools;
                            if (appPools != null)
                            {
                                IEnumerator enumerator = appPools.GetEnumerator();
                                while (enumerator.MoveNext())
                                {
                                    AppPoolInfo api = new AppPoolInfo();
    
                                    object oAppPool = enumerator.Current;
                                    PropertyInfo piName = typeSPIisWebServiceApplicationPool.GetProperty("Name");
                                    if (piName != null)
                                    {
                                        api.Name = (string)piName.GetValue(oAppPool, null);
                                    }
    
                                    PropertyInfo piIisObjectName = typeSPIisWebServiceApplicationPool.GetProperty("IisObjectName", BindingFlags.Instance | BindingFlags.NonPublic);
                                    if (piIisObjectName != null)
                                    {
                                        api.IISObjectName = (string)piIisObjectName.GetValue(oAppPool, null);
                                    }
    
                                    //not terribly useful; really only gives a SID. Maybe good if you wanted their credentials as a SecureString
                                    //PropertyInfo piProcessAccount = typeSPIisWebServiceApplicationPool.GetProperty("ProcessAccount");
                                    //if (piProcessAccount != null)
                                    //{
                                    //    object o = piProcessAccount.GetValue(oAppPool, null);
                                    //}
    
                                    PropertyInfo piCurrentIdentityType = typeSPIisWebServiceApplicationPool.GetProperty("CurrentIdentityType", BindingFlags.Instance | BindingFlags.NonPublic);
                                    if (piCurrentIdentityType != null)
                                    {
                                        api.IdentityType = (IdentityType)piCurrentIdentityType.GetValue(oAppPool, null);
                                    }
    
                                    PropertyInfo piManagedAccount = typeSPIisWebServiceApplicationPool.GetProperty("ManagedAccount", BindingFlags.Instance | BindingFlags.NonPublic);
                                    if (piManagedAccount != null)
                                    {
                                        object oManagedAccount = piManagedAccount.GetValue(oAppPool, null);
                                        if (oManagedAccount != null)
                                        {
                                            PropertyInfo piUsername = oManagedAccount.GetType().GetProperty("Username");
                                            if (piUsername != null)
                                            {
                                                api.Username = (string)piUsername.GetValue(oManagedAccount, null);
                                            }
                                        }
                                    }
    
                                    discoveredAppPools.Add(api);
                                }
                            }
                        }
                    }
                }
    
                Console.WriteLine("SharePoint knows about the following AppPools");
                foreach(AppPoolInfo api in discoveredAppPools)
                {
                    Console.WriteLine("\nAppPool: {0}", api.Name);
                    Console.WriteLine("  IIS Object Name: {0}", api.IISObjectName);
                    Console.WriteLine("  Identity Type: {0}", api.IdentityType);
                    Console.WriteLine("  Username: {0}", api.Username );
                }
            }
        }