Wednesday, October 22, 2008

MOSS 2007

MOSS Alerts!!!



UNDER THE HOOD:

Alerts work with four tables:

Event Cache
Event Log
Immed Subscription
Sched Subscription

Subscription Tables: These determine the type of alert, i.e. whether it is immediate or scheduled. Entries are made here accordingly when an entry is made in the Event Cache table.

Event Cache: This table collectively holds the entries made for alerts.
Once an alert is requested, events are added to the Event Data column and the ACL column. These fields are either NULL or not NULL; if they are NULL, we need to find an entry in the Event Log table.

Event Log:
This holds the entries for Sched Subscription events, copied over from the Event Cache table. Remember: scheduled alerts only.


Last, but not least, is the Timer Service:
Someone has to drive all the above tasks; an engine cannot run without a driver. The Timer Service is the driver that initializes and runs the whole process above, so always check that it is present and running, otherwise the whole process will not work at all.
With this background, let us now proceed with troubleshooting alerts.
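The table flow above can be pictured as a small simulation. This is purely illustrative Python with made-up row shapes and column names taken from the description above; the real tables live in the content database and are driven by the actual Timer Service, not by code like this.

```python
# Hypothetical sketch of how the Timer Service might drive immediate alerts.
# Row shapes ("ListId", "EventData", "ACL", "Email") are assumptions for
# illustration, not the real SharePoint schema.

def process_immediate_alerts(event_cache, immed_subscriptions, send_mail):
    """Walk the Event Cache; for rows whose EventData/ACL are populated,
    deliver mail to matching immediate subscribers, then NULL the columns."""
    sent = []
    for row in event_cache:
        if row["EventData"] is None or row["ACL"] is None:
            continue  # already processed, nothing to deliver
        for sub in immed_subscriptions:
            if sub["ListId"] == row["ListId"]:
                send_mail(sub["Email"], row["EventData"])
                sent.append(sub["Email"])
        # Mark the row processed: EventData and ACL become NULL,
        # which is exactly what we look for later when troubleshooting.
        row["EventData"] = None
        row["ACL"] = None
    return sent
```
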


ISSUE ISOLATION OUTLINE:

Web application Level
Alert switch check
Database Check
Entry in the Subscription Table
Timer Job Definitions
Security Check
Synchronize the credentials
File-mon – To check if the timer service picks up the alert
Verbose Logs – To check for message “Alert has been sent”
Check Event Cache table for the processing of the alert (NULL/Not NULL)
Last step is reviewing the File Mon logs

NOTE: Run file-mon from step 7 through step 8 and from step 9 through step 10.

EXPLAINED:

Create a new web application to check if the alert works there.
If Answer == Yes
{
Detach the content DB from the old web application and attach it to the new one
}
If Answer == No
{
Go to Step 2
}

Enable and disable the Alert functionality:

Open the command prompt, go to the 12\Bin folder, and run this command to see whether alerts are enabled for the web application.

Command: stsadm.exe -o getproperty -url http://problemsite -pn alerts-enabled

Check the output and confirm that alerts are enabled.
Else
{
Run the following command to change the value.

Command: stsadm.exe -o setproperty -pn alerts-enabled -pv "true" -url http://problemsite
}

If the property is already Yes and alerts are still not sent, toggle the property from Yes to No and back to Yes. Note that this may delete all the existing alerts.

Check if the requested alert is making an entry in the Subscription Table.
Note: Since this is an email alert and it is IMMEDIATE, an entry is made in the Immed Subscription table, so we need to check that table for a successful entry.
{
Command: stsadm.exe -o getproperty -url http://ProblemSite -pn job-immediate-alerts

If the expected schedule is not returned, run the following command to change the value.

Command: stsadm.exe -o setproperty -pn job-immediate-alerts -pv "every 5 minutes between 0 and 59" -url http://ProblemSite
}

Check if the Alert Job is running:
Confirm the above step through the UI: Central Administration >> Operations >> Timer Job Definitions, and ensure that a job named Immediate Alerts is present for the web application. This should work!!!

Now configure an alert on a list for a user by entering his/her email address.
If this works as expected, then add the same user using the people picker and check if it works.

Status:
If Answer == Yes
{Check with different user and confirm the resolution}
If Answer == No
{Go to step 6}

To check the Subscription table for the presence of a new alert record:
Check in the table whether a new record has been added with the email field populated.
If not, check whether the email address is present in the user's profile through the SSP admin page.

Security Check:
A new alert is not security trimmed, so when an alert is created, it sends an email to the user irrespective of security. If this does not work, check whether the user or security group has at least Read permission on the list.
Read permission is the basic requirement.
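As a toy illustration of this check (the permission levels and their ordering below are assumptions for the sketch, not the real SharePoint permission model):

```python
# Illustrative "at least Read" check. The level names and their ranking
# are simplified assumptions, not SharePoint's actual permission levels.

LEVELS = ["None", "Read", "Contribute", "Design", "Full Control"]

def can_receive_alert(user_level):
    """An alert is delivered only if the user holds Read or higher on the list."""
    return LEVELS.index(user_level) >= LEVELS.index("Read")
```
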

Synch the credentials:
The issue might also be a password change that was not synchronized properly; hence we would need to re-sync the password.
Refer: http://support.microsoft.com/kb/934838

9. Now we are going back to the pavilion here:

Create a new alert on a list for a user and check if the user gets the success mail.
Upload a document.
Start running file-mon and analyze the logs.



10. To check if the Timer Service picks up the alert template:
Run file-mon on the MOSS server that is responsible for the timer service and check whether the Timer Service picks up the alert template during the whole process.

Also take verbose logs to check for the message “Alert has been sent”.
Check the Event Cache table for processing:

Query: SELECT * FROM EventCache ORDER BY EventTime DESC

Need: To check that the latest entry is the one corresponding to your uploaded document. Make sure the Event Data and ACL columns are not NULL.
After 5 or more minutes, check the Event Cache table again to see if the Event Data and ACL columns have been NULLed.

If so, stop file-mon after the Event Data and ACL columns are NULLed, and review the log.
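The re-check in steps 9-10 amounts to a small polling loop. The sketch below is a hypothetical simulation with a made-up row shape mirroring the query above, not real SharePoint code:

```python
# Poll the latest Event Cache row until EventData and ACL are both NULL,
# which indicates the alert was processed. "fetch_latest_row" stands in
# for re-running the SELECT above; the row shape is an assumption.

def wait_until_processed(fetch_latest_row, attempts=5):
    """Returns True once EventData and ACL are both NULL (processed),
    False if they are still populated after all attempts."""
    for _ in range(attempts):
        row = fetch_latest_row()
        if row["EventData"] is None and row["ACL"] is None:
            return True   # processed: stop file-mon and review its log
    return False          # still pending: alert was never picked up
```
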

Tuesday, October 21, 2008

MOSS 2007


MOSS Search!!!


Hi All,
This is my first blog post, and it is on MOSS Search.
I hope you guys will enjoy it!!!
OK! Now, first let us understand what search is. We search in engines like search.microsoft.com, google.com [my favourite], and yahoo.com to find details on what we need. In MOSS, we search corporate data and, if necessary, external data too. Search in MOSS can be extended a long way, and the algorithm used aims to provide exactly the results needed.

Note: This is just for a basic understanding of it.

Ok now to dive in to the Architecture;
Searching and indexing are among the most important parts of MOSS, and they are interconnected.
Because of this, a SharePoint server can host the search and index roles together, or they can run on two different servers.
Some technical terms used in Search:
Index, Content Source, Crawl, Propagation, Property Store. There are many more, which we will come to understand later in this blog.
Index:
An index is just like the index of a book: it is a location on the file system which contains the exact reference to each word.

Content Source:
A content source is a collection of start addresses representing content that should be crawled by the search index component. A content source also specifies settings that define the crawl behavior and the schedule on which the content will be crawled.
Enterprise Search provides several types of content sources by default, so it is easy to configure crawls to different types of data, both internal and external.
Architecture and How Stuff Works!!!
Search Role:
Phase 1:
+ Gatherer: Gathers the search query and sends it to the word breaker for processing
+ Word breakers: Break compound words and phrases into individual words or tokens; the query engine uses them
+ Stemmers: Reduce words to their stems by stripping suffixes such as "ing" and "ed"
+ Noise words are removed [is, a, and, etc.]
The words are then sent to the next phase.
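A toy sketch of this query-side pipeline is below. The stemming rule and noise-word list are simplified stand-ins for the real word breakers and stemmers, purely for illustration:

```python
# Simplified query pipeline: word breaking, crude suffix stemming,
# and noise-word removal. Not the real MOSS components.

NOISE_WORDS = {"is", "a", "and", "the"}

def break_words(query):
    """Word breaker: split the query into lowercase tokens."""
    return query.lower().split()

def stem(token):
    """Stemmer: strip common suffixes such as 'ing' and 'ed'."""
    for suffix in ("ing", "ed"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def prepare_query(query):
    """Full pipeline: break, drop noise words, stem the rest."""
    return [stem(t) for t in break_words(query) if t not in NOISE_WORDS]
```
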
Indexer Role:
Phase1:
THE FILTER DAEMON:
******* Components: IFilter + Protocol Handler *******
+ Filter Daemon: Calls the exact protocol, by making use of the protocol handler, to open the documents and contents in the content source
+ Protocol Handler: Opens the documents and contents in the content source in their native format and exposes them to the IFilter
+ IFilter: Filters out chunks of text and properties from the content source opened by the protocol handler and hands them to the index engine.

Phase2:
+ Index Engine: Processes the text and properties of the content source and puts them in the Content Index and the Property Store
+ Content Index: Holds each word's location in the content source
+ Property Store: A table in the SQL DB that stores the filtered properties:
=========================================================
Properties | Associated Values | Security | Map to the word in the Content Index
=========================================================
So references are created at two locations:
• At the file system level – the Content Index
• At the DB level – the Property Store
The text and the properties are mapped...
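A minimal sketch of these two reference stores is below. Real MOSS keeps the Content Index on the file system and the Property Store in SQL; here both are plain dicts, and the document/property shapes are invented for illustration:

```python
# Toy index engine: records each word's location in a content index
# (word -> list of (doc, position)) and the document's properties in a
# property store (doc -> properties dict). Purely illustrative.

def index_document(doc_id, text, properties, content_index, property_store):
    """Process one document's text and properties into both stores."""
    for position, word in enumerate(text.lower().split()):
        content_index.setdefault(word, []).append((doc_id, position))
    property_store[doc_id] = properties
```
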

To do:
This crawled property has to be mapped to a managed property for it to be included in the search.
Logic:
Crawled properties are basically mapped according to the content sources:
SharePoint content
Web content
File share content
Exchange folder content
Business data content

The managed property corresponds to the search made by the user.
Hence both have to be mapped.
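The mapping can be pictured as a simple lookup: several crawled properties, one per content-source schema, feed a single managed property that queries actually use. The property names below are invented examples for the sketch:

```python
# Toy managed-to-crawled property mapping. The names are illustrative
# examples, not a real MOSS schema.

MAPPING = {
    # managed property -> crawled properties from different content sources
    "Author": ["ows_Author", "mail:from", "file:owner"],
}

def resolve_managed(query_property, mapping):
    """Return the crawled properties a managed property resolves to,
    or an empty list if no mapping exists (property is not searchable)."""
    return mapping.get(query_property, [])
```
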

Please refer to the flow diagram to get a bird's-eye view of the architecture.
Comments are Welcome!!!