SharePoint 2010: Faster, Better Decisions – Self-Service Business Reporting will be one of my topics for the executive track at the SharePoint Conference on June 26, 2012, in Atlanta.

Microsoft SharePoint 2010 is a powerful, feature-rich platform for Business Intelligence. SharePoint 2010 empowers business users to easily consume data using the various Microsoft BI tools. With the recent release of SQL Server 2012, Microsoft dramatically enhanced its BI platform. During this session, I will explore the improvements SQL Server 2012 brings to self-service reporting with SharePoint 2010 and talk about creating blended value from your BI platform.

My other session will cover upgrading SharePoint 2003/2007 to 2010. More information at:

http://www.gr8technologyconferences.com/Pages/Conference/?ConfId=2&Content=Speakers

Upgrading from SharePoint 2003/2007 to SharePoint 2010 will be my session topic for the SharePoint Conference on April 5, 2012, in Philadelphia.

Check out the conference details at:

http://www.gr8technologyconferences.com/Pages/philadelphia-20120405-Sessions/

I will share “Gotchas” based on numerous upgrade projects that I have been involved with. See you there!


Here are the steps to crawl an RSS feed using the FAST Search Web Crawler:

1) Locate the RSS feed URL

2) Configure FAST Web Crawler

3) Search your RSS Content

Configure FAST Web Crawler:

Locate the XML template file at C:\FASTSearch\etc\crawlerconfigtemplate-rss.xml. Make a copy of it and save it as C:\FASTSearch\bin\rss.xml.
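If you prefer to script this step, here is a minimal one-liner (assuming the default C:\FASTSearch installation path):

PS C:\FASTSearch\bin> Copy-Item C:\FASTSearch\etc\crawlerconfigtemplate-rss.xml C:\FASTSearch\bin\rss.xml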

Before making the following changes, check that the correct collection name is specified for the DomainSpecification (for example: sp):

    <section name="rss">
        <!-- List of start (seed) URIs pointing to RSS feeds. -->
        <attrib name="start_uris" type="list-string">
            <member> http://yourRSSfeedURL </member>
            <!-- <member> http://www.contoso.com/feed.rss </member> -->
        </attrib>

        <!-- Delay in seconds between requests to a single site. You can specify 5 or 10 seconds. -->
        <attrib name="delay" type="real"> 5 </attrib>

        <!-- Length of the crawl cycle, expressed in minutes. -->
        <attrib name="refresh" type="real"> 30 </attrib>

    <!-- Include your domain so the web crawler can download the content. -->
    <section name="include_domains">
        <attrib name="exact" type="list-string">
            <member> sathishtk.com </member>
        </attrib>
    </section>

    <!-- Authenticate the FAST Search web crawler so it can access SharePoint content. -->
    <section name="passwd">
        <attrib name="http://www.sathishtk.com" type="string"> username:password:sathishtk:auto </attrib>
    </section>

Save your changes, then apply the new configuration by running the following command from the FAST Search shell on the FAST server:

PS C:\FASTSearch\bin> crawleradmin -f rss.xml

 

Open the QR Server interface page (for example, http://localhost:13280) and test a few search terms that appear in your RSS feed. If the steps above are configured correctly, you should see search results.

 

 


I was facing a minor issue with a site upgraded from 2007 to 2010: the top-level entry of the Navigate Up button was pointing to the wrong place. The screenshot below shows two masked entries; one entry showed the right link, while the top-level entry pointed to the old 2007 site.

 

This is not the same as having issues with the whole Navigate Up button, for which you would remove the “PlaceHolderTitleBreadcrumb” content placeholder; this fix is much simpler than that.

To fix it, navigate to Portal Site Connection under Site Settings – Site Collection Administration (thanks to my colleague Samer, who pointed me to the link). Change the setting on the page to “Do not connect to portal site”. That should take care of the issue. A simple step, but a handy solution.
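If you have more than one site collection to fix, the same change can also be scripted. Here is a minimal sketch using the SPSite PortalUrl/PortalName properties (the site collection URL below is a placeholder):

# Clear the portal site connection for a site collection (placeholder URL)
$site = Get-SPSite "http://yourSiteCollectionUrl"
$site.PortalName = ""
$site.PortalUrl = ""    # equivalent to "Do not connect to portal site"
$site.Dispose()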

 

 


Recently, I had to troubleshoot a custom .NET web application within SharePoint that was not loading because it hit the following error (the web site was deployed to a folder within _layouts):

 

Cannot create a file when that file already exists. (Exception from HRESULT: 0x800700B7)

 

It turns out the culprit was the web.config of the custom web application: some entries in it conflicted with the SharePoint web.config, which caused the error above.

My suggestion is to remove the configSections and controls entries and try again. If you still encounter errors, trim the web.config down to minimal entries such as customErrors, connectionStrings, appSettings, and assemblies. That should fix the error.
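To spot which declarations collide with what SharePoint already registers in its root web.config, a quick sketch like the following can help (the _layouts path and application folder name are placeholders):

# List the configuration sections declared in the custom application's web.config
$path = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\MyCustomApp\web.config"
[xml]$config = Get-Content $path
$config.configuration.configSections.section | Select-Object name, type
$config.configuration.configSections.sectionGroup | Select-Object name

Any name that also appears in the SharePoint web.config is a candidate for removal.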


My SharePoint 2010 farm has two WFEs and one APP server. The APP server was hosting the service applications, with the index and crawl components split between the web front ends.

I encountered the following error:

catalog Main: failing to copy index files from crawl component 4 for <x> minutes. Access is denied

Digging further did not turn up any suitable fixes, so I created a new query component folder, mapped the new folder to the query component, and removed the old folder reference.

Navigate to Search Administration (Manage service applications – select the Search Service Application), choose Modify Topology, pick the query component causing the issue, and edit its properties.

Try the above approach first; if it still does not work, do an index reset (watch out: if you are indexing a lot of content, it may take a long time to index everything back again) along with the new folder mapping. I did not have to delete the query component and re-create it.
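If you do end up needing the index reset, it can also be triggered from PowerShell. A minimal sketch (the service application name is a placeholder; the two arguments disable alerts and ignore unreachable servers):

# Reset the search index (placeholder service application name)
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$ssa.Reset($true, $true)    # Reset(disableAlerts, ignoreUnreachableServer)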

Happy Searching!

 


Here is how to convert an existing web application configured with Classic Mode Authentication to Claims Based Authentication. This can be done using PowerShell.

Note: Once converted from classic to claims, you will not be able to revert the change.

Screenshot showing the web application in Classic Mode

 

Now, run the PowerShell commands:
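Here is a minimal sketch of the conversion using the standard SharePoint 2010 object model; the web application URL and the administrator account below are placeholders:

# Placeholders: adjust the web application URL and admin account for your farm
$webAppUrl = "http://yourWebAppUrl"
$account   = "yourDomain\yourAdminAccount"

# Switch the web application to claims-based authentication
$wa = Get-SPWebApplication $webAppUrl
$wa.UseClaimsAuthentication = $true
$wa.Update()

# Grant the admin account full control so it can still manage sites after the conversion
$claim = (New-SPClaimsPrincipal -Identity $account -IdentityType WindowsSamAccountName).ToEncodedString()
$policy = $wa.Policies.Add($claim, "Claims Admin")
$fullControl = $wa.PolicyRoles.GetSpecialRole([Microsoft.SharePoint.Administration.SPPolicyRoleType]::FullControl)
$policy.PolicyRoleBindings.Add($fullControl)
$wa.Update()

# Migrate existing user accounts to their claims identities and re-provision the web application
$wa.MigrateUsers($true)
$wa.ProvisionGlobally()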

 

You can see the web application is now converted to Claims Based Authentication: