We here at the Infusion blogger community are doing a wee bit of a renovation. In the meantime, check us out at our beta blog.
And specifically my blog at:
Check out how easy it is to customize the Document Information Panel inside of Office 2007 using InfoPath. This article walks you through a roughly three-step process that should get custom Document Information Panels up and running.
Two things I've noticed:
1. It's even easier than I previously thought. If you don't want to customize the panel beyond the XML data being stored, it's quite simple. You can simply edit the Content Type, and the default Document Information Panel form will automatically pick up the changes (including specialized drop-down fields).
2. This walkthrough doesn't seem to work on my version of Beta 2 TR. I'm not sure if that's a bug across all copies, but I have seen it work on the RTM. Something to watch out for.
Some screens and maybe a video walk through are in order, I think. Hopefully, I can post them at the end of the week.
Yesterday, I was giving a technical interview for Infusion. This is a fairly normal occurrence for me, as I give about 3 tech interviews a week (Infusion is going through a period of record growth and is looking for candidates *all the time* - more about this to come soon).
But the candidate said something that made me laugh. When I asked him a BizTalk related question, he said "Oh, yeah, that's easy, like you said on your blog ..."
and suddenly it hit me that people might actually be reading this thing. I'll have to stop taking my tech interview questions directly from my blog.
For the record, the guy lost points for knowing the answer, but gained more for reading my blog. Kudos to you, blog-reader-candidate.
This tip comes from Marty Waz (described yesterday in my post about Publishing as Web Services). As you'll remember, I'm at the Virtual TS Boot Camp, where we did a simple BizTalk 101 project - Marty Waz style. Basically, it's a Hello World web-services project, but using all best practices and done in 20 minutes.
Ok. Not a problem.
But it turned out to be quite a problem. The class is made up of mostly Virtual TS’s and actual Microsoft TS’s so it’s not like it’s a slacker group (although we’ve all run into the occasional bonehead TS). Probably one of the best groups to program BizTalk outside of the actual BizTalk Server team at MS. Anyway, enough praises – cause we sucked big time. Not a single person was able to do the assignment even with extra time following all of Marty’s best practices. Personally, I forgot to isolate the external schemas from the internal, used maps inside my orchestrations, and used Schemas directly as Messages.
Well, that last part is the topic of this post. Marty showed us how to save an extreme amount of time by loosely coupling our messages inside our orchestrations. No more messages of type schema. It’s a little odd, so let’s start from the beginning.
Here’s the non-leet method: Create your orchestration. Add a receive shape and a logical port. Well, now I need a message. Add a new Message in the Orchestration Explorer. Set the Message Type to the schema you want. Hook up your receive to that logical port. Nothing out of the ordinary so far. And let’s do something that should be pretty easy: change our schema type. So we go to the Message, select the Message Type and select a different Schema. We get the following error:
Property value is not valid: One or more Send or Receive actions are connected to Ports and are using this Message. Please disconnect the actions before changing the Message Type.
Now this is a fairly common thing. I mean, sometimes we change our schemas - during development this happens all the time. The object-oriented equivalent is changing the variable "int foo" to "double foo" in code. What if you couldn't do that without removing all the references to foo? That would suck.
Here's Marty's super hot way around this: Create your orchestration. Add a receive shape. Well, now I need a message. ADD A NEW MULTI-PART MESSAGE TYPE. Give it a name that doesn't include the schema - remember, you are using this as an abstraction over your schemas so you can change between types (object-oriented analogue: changing "int foo" to "double foo"). Change "MessagePart_1" to "Body" (capital B - it's important). Now set the Type for this part to the schema you were going to use before. NOW add a normal Message, name it, but instead of choosing a Schema for the Message Type, choose your Multi-part Message Type. Use this message like you normally would.
Go ahead, try changing the underlying schema type. Instead of doing it at the Message Level, you can do it at the Multi-part Message Type. Your links are still valid because the ports are bound to the wrapper around the schema (multipart message), not the schema itself. You have successfully changed from “int foo” to “double foo”.
Sounds pretty simple, but no one in the room knew it. So I figured I would blog it so we don’t see any more of this in the field. So, there you go, no more schema type messages, only multi-part messages.
[Editors note: this text is a little confusing without sitting down with an orchestration and trying it yourself. I may post a video shortly that walks you through the changes.]
"But I swear my Two Way Ports are Public scoped"
Publishing an Orchestration as a Web Service: it's a common thing. You create an orchestration; create a send-receive logical port; and try to expose via web-service. Ah, but you forgot to make it Public scoped - by default it's Internal - no biggie. Intuitively, per the error message you get ["There are no orchestrations with public receive ports in this BizTalk assembly. Click back and specify a BizTalk assembly containing orchestrations with public receive ports."], you change the scope to Public and re-try the Publish Orchestration as Web Service.
But you still get the message that it's not public.
I have run into this a number of times, and I should be ashamed because I have listened to erroneous blog posts that tell you to re-write your project. Normally, my projects are small enough that it doesn't matter.
But now, here's the canonical word on the subject - it comes via Marty Wazsnicky, Regional Program Manager at Microsoft and head of the Virtual TS program (which I am a part of). Mix equal parts caffeine and BizTalk and you have Marty. It's a shame he doesn't blog more often, because he's full of good information.
Anyway, to solve the problem, restart BTSNTSVC.exe. As Marty explains it, BizTalk 2006 uses .NET 2.0 caching when inspecting the assembly that contains your newly public ports. Thus, it never really inspects your new assembly and just relies on the old one (with Internal port scopes). This dependency on the .NET 2.0 CLR is why this is a problem with BizTalk 2006 and not BizTalk 2004.
To sum up: Restart the service, the caching goes away and you can now create your service.
Seems like there's not a lot of activity here - but I assure you there is... just not on the publish side. I've gotten about 20 emails (and a related number of comments) from people who are running into problems with SharePoint Forms Authentication. It proves at least that my setup isn't unusually bad ;) In fact, my post on the "File Not Found" issue was translated into Japanese!
Right now I am outside of Boston (Waltham) getting training in BizTalk Server 2006 from Marty Waz of Microsoft. I'm already both certified and a Virtual TS in BizTalk, but this is the code-until-you-drop-14-hour-a-day program that makes the VTS team worth its salt. Hopefully I'll be able to blog a few lessons.
Speaking of blogging a few lessons, I have run into a number of SharePoint issues that I uncovered from our internal implementation and from our clients. I'll be blogging them this week now that I have a "manageable" 14 hour a day schedule. *Gulps Coffee* Hopefully I'll get a post out tonight - thinking SharePoint Forms Authentication and how it breaks MySites.
(This is the second post in a series on Forms Authentication in SharePoint 2007. I announced this series a little while ago over here and continued it here)
This is an error that, until today, signaled the end of using Forms Authentication for a web application. Every once in a while, after setting up an application to use Forms Auth, I would get "File Not Found" when I was redirected to the login.aspx page. Not a 404, mind you, but simply a web page that said "File Not Found" (just the text, no SharePoint chrome). Basically, whenever I tried logging in, I wouldn't even be given the chance - and since I needed to be logged in, it would effectively bar me from the site completely.
I didn't really have a good workaround for this until I started digging through the SharePoint code to find out what was causing it. I am not sure how widespread this problem is, but I've seen it so far on all of my installations, so I thought I would post the fix below. It's not supported by Microsoft, and when I brought it to the internal SharePoint DL, they didn't really have a solution. (Take note: before implementing the solution below, you should make a backup copy of login.aspx and authenticate.aspx.)
Basically, the error is caused because the Microsoft.SharePoint.ApplicationPages.dll assembly is not accessible to login.aspx or authenticate.aspx. I noticed this by removing the following tag from login.aspx (under \program files\common files\microsoft shared\web server extensions\12\template\layouts): <%@ Assembly Name="Microsoft.SharePoint.ApplicationPages, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>. This got me a compilation error, but the file actually loaded. So, throwing that tag back in, I simply needed to make sure that the ApplicationPages assembly was available.
My solution was to toss the ApplicationPages assembly into the GAC (again, this is my workaround hack - this "shouldn't have to be done" according to MS, but they gave no reason as to why these files wouldn't be able to access the ApplicationPages dll in the first place). You'll find the ApplicationPages dll under \program files\common files\Microsoft Shared\web server extensions\12\CONFIG\BIN.
Next, you should be able to navigate to /_layouts/login.aspx and get your normal login page. However, when you click the "Sign In" link from SharePoint, it will take you to /_layouts/authenticate.aspx, which will now give a similar "File Not Found" - but this time with the SharePoint chrome. Again, the ApplicationPages assembly tag is the culprit, so I replaced the stock <%@ Assembly Name="Microsoft.SharePoint.ApplicationPages" %> with the fully qualified <%@ Assembly Name="Microsoft.SharePoint.ApplicationPages, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> from login.aspx. After I did this, everything worked fine.
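To recap, the directive that has to both appear and resolve in login.aspx and authenticate.aspx is the fully qualified one (12.0.0.0 being the standard SharePoint 2007 assembly version - if your build reports something different, check the version on the dll itself):

```aspx
<%@ Assembly Name="Microsoft.SharePoint.ApplicationPages, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
```

with the ApplicationPages assembly sitting in the GAC (or somewhere else those pages can resolve it from).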
Again, this is something that you'll have to do on each server that is affected by this horrible problem. It doesn't seem ideal, and yes it is probably a bug in SharePoint, but if you want to get Forms Authentication to work in Beta 2, this is the kind of hack you might have to do. After I made this change, I never ran into the dreaded "File Not Found" exception again.
Do me a favor and shoot me an email () if you have run into this problem. I'd like to see how many people actually come across it.
(This is the first post in a series on Forms Authentication in SharePoint 2007. I announced this series a little while ago over here)
One of the main problems I was having with SharePoint Forms Authentication was getting it to recognize and list users from my new authentication provider. Once you switch authentication providers, you have to manually add the first authenticated user to SharePoint from Central Administration. The reason for this is quite simple: the normal administrator credentials you used before won't be recognized under the new provider - so you have no way to log into the site at all.
But what I found after going to "Policy for Web Application" (the page where you manually add the first authenticated user) is that Central Administration can't list or recognize your new user names. Being new to the ASP.NET provider model, I assumed I simply had the database set up incorrectly - not so. I had just missed one essential step: you must add the provider settings to BOTH the Web Application web.config and the Central Administration web.config. I can't stress this last part enough. If you don't add the provider settings to the Central Administration web.config, you won't be able to access the new credential store, and thus won't be able to add users.
It sounds pretty intuitive once you think about it (how else would Central Administration know about the new provider?), and indeed SharePoint gives a warning about this (albeit a small one) on the left-hand side of the Authentication Providers page when setting the provider:
The membership provider must be correctly configured in the web.config file for the IIS Web site that hosts SharePoint content on each Web server. It must also be added to the web.config file for IIS site that hosts Central Administration. [Emphasis mine]
But to first-time users of Forms Authentication in SharePoint (which we all are), this is a little counter-intuitive. We kinda assume that adding the provider details to the individual Web Application is enough. And indeed, I think the central admin web.config will become bloated with providers from every site it's managing. All in all, I think it's a poor choice to rely on the central admin web.config rather than the web.config of the individual sites, but I am sure there are specific reasons why MS chose this model.
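For reference, here's a sketch of the kind of provider block that has to appear in both web.config files. The server, database, and connection-string names below are illustrative (only the SqlMembershipProvider type and its public key token are the stock ASP.NET 2.0 values), and for the Central Administration web.config you would typically register the provider without making it the default:

```xml
<connectionStrings>
  <add name="SqlProviderConnection"
       connectionString="Server=MYSQLSERVER;Database=aspnetdb;Integrated Security=SSPI" />
</connectionStrings>
<system.web>
  <membership defaultProvider="AspNetSqlMembershipProvider">
    <providers>
      <add name="AspNetSqlMembershipProvider"
           connectionStringName="SqlProviderConnection"
           type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    </providers>
  </membership>
</system.web>
```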
I think I'll file this under "things you'll probably run into if you're doing SharePoint Forms Auth." Next up in this series, the infamous "File Not Found" exception you will probably run into if you do Forms Authentication enough.
I'm currently working on a Federal Disaster Recovery Collaboration demo that heavily leverages InfoPath and Forms Services from Microsoft Office SharePoint Server 2007. If you're not familiar, MOSS 2007 and Forms Services allow InfoPath forms to be displayed directly in the browser. Normally, Forms Services works out of the box, but the other day I ran into a bit of an odd situation.
Here's the error that I got:
This form template is browser-compatible, but it cannot be browser-enabled on the selected site. This may be caused by one of the following reasons:
- The server is not running InfoPath Forms Services
- The necessary features are not available on the site collection
- The policy setting on the server does not allow users to browser-enable templates.
The solution seemed pretty straightforward: enable Forms Services on the site. Looking at the suggestions from InfoPath, perhaps I didn't have the SharePoint feature that allows Forms Services on my site. So, I went to the SharePoint Features page under Site Settings and activated the Office SharePoint Server Enterprise feature. Rerunning the publishing wizard, I expected this to fix my problem ... but it didn't.
This turned out to be a bit of a noodle scratcher until I realized that although I had enabled this feature on the subsite I was going to publish to, I hadn't enabled it on my Site Collection. So, the solution is to enable Forms Services on the Site Collection (Site Settings -> Site Collection Settings -> Features). Since I was initially trying to publish to the site collection, it rejected the form (even though the subsite could have accepted it).
Now why did this work out of the box on my previous installations? My previous installations used the Corporate Intranet Publishing Site as the template for the site collection; the broken one used the Internet Presence template. The difference is that the Intranet Publishing Site activates these features by default - Internet Presence does not.
So the lesson here is: if you're going to use Forms Services on an Internet Presence site, activate the Office SharePoint Server Enterprise feature on the Site Collection first.
SharePoint 2003 had alerting functionality, whereby a user could subscribe to a list and would get an email alert if anyone changed it. Basically, it was useful if you had a document library and you wanted to know when someone posted to it. I personally found this feature to be annoying at best, but apparently everyone else found it to be super useful.
Well, now alerts form the basis of Workflow tasks in SharePoint 2007. And if it's not abundantly clear, I absolutely *love* using workflow in SharePoint. I thought that Workflow would handle its own emails, but it turns out that by default it uses two different types of emailing systems:
- Code emails: these emails are generated directly in the workflow. An example is the email the user gets when a workflow is kicked off.
- Task alert emails: these emails are generated via the alerting system. An example is the alert the user gets when a Task is waiting for him.
To my dismay, there was a distinct difference between these emails: code emails were fast; task emails were dead slow. It turns out the problem is not confined to Workflow. By default, alert emails in SharePoint are throttled to being sent every 5 minutes. For most people, this is apparently acceptable - but for real-time demos it's a deal breaker. I can't wait 5 minutes for SharePoint to send me an alert.
Apparently, there's a way to change this. Originally, this help came from David Mann () in the betanews workflow group. He clued me in that there's a setting in SharePoint that throttles the emails. He didn't remember the command to fix it, but I found the tip here. Basically, run this stsadm command:
stsadm.exe -o setproperty -pn job-immediate-notification -pv [minutes]
I haven't tried setting this lower than 1 minute, so I am not sure whether zero is a valid option, but 1 minute is fast enough for me. Now both types of email are pretty much instantaneous. Certainly a worthwhile setting for anyone expecting SharePoint's instantaneous alerts to actually be instantaneous.
Since Ajax and Web 2.0 are getting really hot, I figure I should put this up here. It comes via Syd's blog, so mad props to his post. Originally, it's from Scott Guthrie, and it shows how you can create a full ATLAS application using zero C# code. For the uninitiated (which was me a short while ago), ATLAS is the Microsoft framework that brings AJAX functionality to ASP.NET.
So check out this video. You're not going to be sorry.
Hopefully, I'll be able to add some of my ATLAS experiments up here soon. ATLAS in SharePoint - that's the dream: just need to find the funding.
I have been monopolizing the Virtual Server (a sweet 8GB dual-processor machine) at Infusion lately. I've been running some heavyweight VPCs (MOSS 2007 / Exchange 2007 / LCS 2005 and a few others), which each require about 2GB to run nicely. The problem lies in the way I've been backing up our development work. Usually, when I work on my machine locally, I turn on undo disks and simply merge changes when I know I am in a good state. If I screw up, well, I can always just discard the undo disk.
Now this is nothing Earth-shattering. It's almost a standard best practice for those who use VPC. But what do undo disks buy you when you have virtually unlimited storage? Nothing but headaches. The whole point is keeping a safe copy of the code - and, well, if you have the storage space, why not simply make a full copy of the virtual hard drives themselves?
And after the error below, that's exactly what I'm going to do from now on. When I do personal development on my laptop, I am going to use undo disks, but on the server I am going to leave them off. Undo disks just slow performance and create headaches when they don't merge. So now on to the error.
"Virtual Server was unable to commit the changes made during the current session of "[VPC NAME]". The cause of this problem is that the parent virtual hard disk is a part of a saved state. Your changes will be kept for the next time you start this virtual machine."
If you search Google for this, you come up with almost nothing. But there's a critical post right here about it. It spells out exactly what you need to do, but foolishly I ignored it and kept looking for a better solution, since I didn't understand the concept of "inspecting" a virtual disk.
The basic steps are:
- Rename the undo disks to *.vhd. This turns them into virtual drives in the eyes of Virtual Server.
- Click on the Inspect option in the Virtual Disk section of Virtual Server. There should be an option to then merge the disk.
- Select a new disk to merge into. If you try merging into the original parent, Virtual Server will just give you the above error again.
- Take your newly merged virtual disk and swap it into the virtual machine (i.e. replace the parent that couldn't be merged into before).
That's it. Pretty simple, but since I didn't know this option was there (it's very rare to have to merge into a completely new disk), it completely eluded me. Well, I won't have to worry about this with Virtual Server anyway, since I am going cold turkey off undo disks. They are only for smaller environments.
I ran into a bit of a problem trying to import a Business Scorecard file today. After importing the cube file, I successfully opened the Business Scorecard Workspace (.bsw) file. However, when I tried to Publish All I got the following error:
"The database connection failed. Please contact the administrator."
Being the administrator, I figured I should probably fix this myself. I assumed the problem was with the defined data source not being able to hit my server. Since I imported the database myself, I figured I simply had to update the name. Not so: I found that the name under Data Sources was the same, but when I tried to set the name again, Business Scorecard Manager gave me the following error:
"The Business Scorecard Manager server could not connect to the data source. Verify that all of the required data has been entered in the boxes and that the connection information is correct."
Well, the key here is that the Business Scorecard Manager SERVER could not connect to the database. After checking the credentials of the BSM web service app pool, I found it was using Network Service, which did not have access to the SQL Server I was using.
Solution: Change the app pool identity to be a (domain) account that has access to the SQL Server.
That basically did it.
Props to this conversation which pointed me in the right direction: http://www.eggheadcafe.com/aspnet_answers/officebusinessscorecardmanager/Jun2006/post26979401.asp
Recently I was trying to add both an Office Report View webpart and an Office Scorecard View webpart, when I ran across the following error:
"One of the properties of the Web Part has an incorrect format. Windows SharePoint Services cannot deserialize the Web Part. Check the format of the properties and try again"
I tried re-installing the Business Scorecard Manager server components by re-running server.msi, but no luck. I came across the answer here:
What you need to do is add the following line to your web.config file under appSettings: <add key="Bpm.ConnectionString" value="Integrated Security=SSPI;Initial Catalog=ScorecardServer;Data Source=govbase"/>
Without this AppSetting, the Scorecard web parts apparently won't serialize correctly. Also, make sure that this AppSetting matches your connection string exactly. I had renamed my scorecard server database, and without the corrected AppSetting entry, the web parts wouldn't serialize.
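In context, that entry sits inside the appSettings element of the BSM web service's web.config (the catalog and data source here are the values from my environment - substitute your own):

```xml
<appSettings>
  <add key="Bpm.ConnectionString"
       value="Integrated Security=SSPI;Initial Catalog=ScorecardServer;Data Source=govbase" />
</appSettings>
```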
Just to add to the mass hysteria - and in case you didn't know - Scoble just left Microsoft. And if you don't know who Robert Scoble is, well shame on you, he's the man behind blogging at Microsoft. One of the blog visionaries, without whom, I probably wouldn't be writing a corporate blog.
Slashdot is already flipping out about this with wild speculation (and they fault Dvorak for drawing wild conclusions).
I wish the best of luck to Scoble. I read some of his book (which was otherwise propping up the boss's window) and it grabbed my attention. In fact, I am going to add him to my RSS list - something I should have done a while ago.
As a part of the BizTalk Virtual Technical Specialist program at Microsoft, I am required to pass the 74-135 exam: Developing E-Business Solutions Using Microsoft BizTalk Server 2004. It's something that I should have gotten around to a while ago, but for a variety of reasons I just haven't.
Primarily, I have just been a little too busy. Maybe I'll blog a little later about my projects, but suffice it to say, I have a full deck (those who know me know that I don't declare this until I am working constant 90+ hour weeks). But the exam would only take an hour or two, fair enough. However, I felt that I should put a concerted effort into studying for it. After hearing stories that Scott Woodgate himself failed the test the first time, I was a bit intimidated. After all, BizTalk is a very large group of technologies to master: in addition to the bread and butter orchestrations, maps and rules, there are huge areas like BAM, BAS, HWS. I don't have the most experience with all of these parts, so I deferred a little. In the end, I just decided to take it when I had a moment, without a lot of studying.
Well... the test *is* somewhat difficult; there are many questions that are based on real-world experience. You just wouldn't know the answers without having run into these problems yourself. In addition, a lot of the questions are based on things that I would normally just Google. I don't like having to remember the exact interfaces I have to implement for a custom pipeline component, though I have written them before. All in all, the test is fair.
I am going to encourage some of my fellow Infusionites to take the exam. We have a lot of in-house BizTalk knowledge; it's just that no one has "proven" it on paper - though we've proven it with plenty of projects. In addition, the BizTalk 2006 exam is coming out soon - it's in beta now, and I've had the pleasure of taking it. A much fairer assessment of BizTalk than the 2004 exam, I would say. Certainly a lot more centered around BAM. It's still under NDA, so I can't really say a lot more, but I encourage anyone to take it when it comes out of beta.
I am not sure if this qualifies me as a Microsoft BizTalk Server 2004 MCP, but either way, I am relieved that I am finally "proven on paper" to be an expert.
Today, I spent an inordinate amount of time getting Microsoft Virtual PC 2004 to play nice with a VPC that was sent from Ken Mallit (Microsoft). Now I know, Ken Mallit doesn't use undo disks very often for performance reasons, but for me, they are an utter necessity. Development without undo disks to me is like flying without a parachute. I need the ability to throw away changes and start fresh - often this involves more than just source control. And when working with server products like SharePoint, this is especially true, since you are actually editing config files that aren't a part of your source control.
Anyway, I just wanted to have my undo disks on, which is usually pretty simple. However, this VPC has two hard drives due to space constraints (it has basically *every* server product from Microsoft on there), so I was a little leery about having two VHDs and two undo disks. And add in the little fact that this was all being run from an external, non-powered (5400 RPM) drive.
Well, that last point makes all the difference.
Apparently - and I haven't confirmed this with anyone - that's just too much IO for that little disk to handle. So what happened when running the VPC? It totally tanked. First, the load time was incredibly slow - fair enough. But immediately after logging in, the VPC reset itself. No BSOD, no warning, just a hard reset - basically a virtual power cycle. Well, that I couldn't believe. Slow IO or not, it shouldn't do that. And this behavior was reproducible across host machines and across VPCs of this type, so I decided that this is a bug.
The solution: I left the VHDs on the external drive and moved the undos to save to my internal hard drive. Not only did it stop the crashing, but it vastly improved the performance of the VPC. So, make sure to run undo disks locally when you have more than one VHD.
If I'm a nice guy, maybe I'll go bother the good people at the VPC guy's blog. But for now, I'll just complain about it here.
Just as an FYI for those not in the Office 2007 Beta program: FrontPage - that lovable little HTML editor purchased by Microsoft in the mid-90's and integrated into the Office suite - is soon to be no more. As of Office 2007, FrontPage will be broken into two separate products: Microsoft Office SharePoint Designer 2007 and Microsoft Expression Web Designer.
From the press release:
"After we fully release SharePoint Designer 2007 and Expression Web Designer, FrontPage will be discontinued gradually. [...] In the meantime, Microsoft will continue to provide current FrontPage customers with full product support through June 2008, as well as clear guidance on how they can smoothly migrate to SharePoint Designer 2007 or Expression Web Designer, depending on their roles and needs" -- John Richards
I welcome this change, not because I didn't particularly like using FrontPage before, but because it was just so hackish for editing SharePoint websites. It always led to problems with ghosting and couldn't really access the SharePoint web parts too easily. After some light work with the SharePoint Designer 2007 beta, it seems that this product is really tuned for editing SharePoint websites. This should make the occasional client request of "can I get rid of this SharePoint XYZ stuff" all the easier.
While I haven't directly worked much with the Expression Web Designer, I have seen it used and I know that it generates some pretty spiffy WPF enabled websites. They require Internet Explorer 7 now, but it's really neat stuff.
Anyway, I guess this is only sad news for the Vermeer Technologies founders, Charles Ferguson and Randy Forgaard, who originally created FrontPage. I only mention this because it is one of the few case studies I remember from my Cornell entrepreneurship course.
Ever since Microsoft announced that SharePoint 2007 would include the ability to use pluggable authentication providers, I have been psyched about upgrading to 2007. For each project that I do at Infusion, there is nearly always the request of setting up a collaboration site, usually in SharePoint. In 2003, we were forced to use Active Directory authentication, so I would have to pollute our CORP domain with the outsider authentications.
Yes, I know there were probably better ways of doing this. If you're a network admin, you've likely come up with at least 3 different workarounds, but the key is that none are out-of-the-box and easy to set up. The attraction of SharePoint 2007 is that it promised out-of-the-box support for the AspNetSqlMembershipProvider - a new feature from ASP.NET 2.0. It has its own database structure for managing your logins: just run some install scripts and you're done. Managing the users in the database is a little annoying, but it's definitely a manageable thing.
From my experience, SharePoint 2007 does provide the functionality out-of-the-box. But it isn't easy to set up.
In each version of SharePoint that I've worked with (DogFood 4/Alpha 1, Beta 1, Beta 1 TR, and Beta 2), Forms Authentication was a *nightmare* to implement. And I'm not trying to exaggerate - I've never been able to get it working in less than 4-5 hours of tinkering. That's including the latest version, Beta 2.
This week I will be writing a series of articles that deal with the majority of the problems I've encountered while using Beta 2 Forms Authentication. I am pretty sure that if you use Forms Auth, then you've run into them. After fixing these problems, Forms Authentication works. And it's wonderful - it's one of those features that once you have it working, it's tough to live without.
A great feature it will be, once this code finally makes it to production quality. But until then, it's at least a four-hour headache - fair warning. Hopefully these posts will be able to save you a little time.
Recently I was on a consulting gig with Syd Millett, one of my co-workers from Infusion, that took us both down to the lovely state of South Carolina. It was there at a large bank that we did a fair bit of BizTalk customization: writing both a few custom functoids and an adapter.
Syd goes into more detail about having to write a custom functoid that returns a blob of XML. Considering people are already linking to his post, I figured I would get in on the action and describe how to attach our custom functoid to a Scripting functoid to copy over a blob of XML. Basically, we re-implemented the Mass Copy functoid as a custom functoid, but with one important difference: Mass Copy can only accept input from the input schema, whereas our functoid can be attached to any output.
This bears repeating, because (besides Syd's post and mine) it's posted nowhere on the web: Mass Copy is limited to connecting to the input schema. You have to create a custom functoid to accept the output from a previous functoid.
If this helps even one person, then it'll be worth it. Thanks, Syd.